21 Jul, 2025
In the world of technology, the lines between curiosity, ethics, and fierce competition get blurry, especially when you're young and eager to "make your mark." Today I want to share a story with you from my university days. At the time, it felt like an epic victory, but over the years it has become one of the biggest lessons I've ever learned about responsibility and ethics in software development.
Context: Carpooling, University, and Fierce Competition
A few friends and I were determined to create the next big app. Our project was a carpooling solution designed to connect students at our university, optimizing routes and saving costs. We had the passion, the skills, and a functional version.
Just as we were about to finish, another group of students, mainly from the business department if I remember correctly, launched their own carpooling app. And they did it fast. Overnight, they had an active user base. The competition had materialized, and it was real.
"Eureka": A Catastrophic Security Flaw
I've always had an interest in "breaking" systems. Many of my classmates from that time can probably remember the times I accessed university information and systems (I'll publish those stories someday). Back then, my motivation was simple: curiosity and, I admit, a strong competitive instinct.
I turned my attention to the rival application. It didn't take me long to discover a vulnerability of catastrophic severity, one that gave me an adrenaline rush (I can still remember the feeling of discovering it, sitting on the floor of a hallway in Building G). They were using Firebase insecurely, with a configuration so permissive that anyone with basic knowledge of the API could access the entire user database.
And there it was, all in plaintext: the institutional email addresses and, worse still, the passwords of every student who had registered. It was a "Eureka!" moment in every sense of the word. From a purely technical perspective, it was an exciting find. From a competitive point of view, I felt like I had found the ultimate weapon.
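This class of Firebase flaw is well known: if the Realtime Database security rules allow public reads, the whole database is downloadable through Firebase's plain HTTPS REST interface, no exploit code required. Here is a minimal sketch of what that access looks like (the project id, the `/users` path, and the data layout are hypothetical illustrations, not details of the actual app):

```python
import json
import urllib.request

def database_export_url(project_id: str, path: str = "") -> str:
    """Build the Realtime Database REST endpoint for `path`.

    Firebase exposes the database over plain HTTPS: appending ".json"
    to any path returns that subtree serialized as JSON.
    """
    return f"https://{project_id}.firebaseio.com/{path}.json"

def dump_users(project_id: str) -> dict:
    """Fetch every record under /users with a single unauthenticated GET.

    This only works against a misconfigured project whose rules allow
    public reads (e.g. {"rules": {".read": true}}); a properly
    locked-down database rejects the request with "Permission denied".
    """
    with urllib.request.urlopen(database_export_url(project_id, "users")) as resp:
        return json.load(resp)
```

One unauthenticated GET against an open database is all it takes, which is why this misconfiguration is considered catastrophic rather than merely sloppy.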
What I Did Next: The Impulse of a Young Competitor
My past self, driven by the thrill of discovery and the heat of competition, made an impulsive and arrogant decision. I wrote a mass email to the entire list of users I had obtained, informing them that their personal information was compromised and warning them to stop using my rivals' application.
I didn't stop there. In an act of immaturity, I also emailed the application's founders. It wasn't a friendly warning or an offer of help. It was mockery, a message to rub their mistake, and my "victory," in their faces. At the time it felt like a masterstroke, a brilliant (and aggressive) campaign that would benefit our own application.
What I Would Do Today
Today, when I remember that episode, I feel a mix of shame and gratitude. Shame for my lack of professionalism and empathy, and gratitude for the profound lesson it taught me. If I were faced with the same situation now, my approach would be radically different.
This is what I've learned and what I would do differently:
Ethics comes first. The end doesn't justify the means. The first and most important lesson is that user security and privacy are sacred. Exploiting a vulnerability to gain a competitive advantage, even under the pretext of exposing a flaw, crosses a dangerous ethical line. My main goal today would be to protect the people affected, not to sink the competition.
Responsible disclosure is the way. Instead of a mass email and mockery, my first step would be to contact the founders privately and discreetly. I would explain the vulnerability I found in detail, the potential impact, and offer my help to fix it. The goal is to fix the problem before malicious actors can exploit it.
If after contacting them and offering help, they showed no intention of fixing the flaw, then it would be necessary to take other measures. But the first option should always be collaboration.
The user is the priority, not a weapon. If the developers refused to act, my next step wouldn't be blackmail. It would be to escalate the problem, perhaps through a trusted professor or a university authority, if such a channel existed.
And yes, ultimately, if there were no other option to protect the users, a broader communication could be considered, but never as a first resort and always with an informative and helpful tone, not one of alarm or mockery.
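For what it's worth, the fix I would have offered in that private report is small. Firebase Realtime Database security rules can deny access by default and let each signed-in user read only their own record (a sketch; the `users` path is illustrative, matching nothing from the actual app). And credentials belong in Firebase Authentication, not in the database, so no password is ever stored in plaintext:

```
{
  "rules": {
    ".read": false,
    ".write": false,
    "users": {
      "$uid": {
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}
```

A few lines of rules would have closed the hole; the harder part, then and now, is choosing to report it kindly instead of weaponizing it.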
Conclusion: Emotion is a Bad Counselor
In the end, our application never took off, and theirs also shut down after a while.
That experience didn't define me, but it did shape me. It taught me that emotion is a bad counselor and that technical knowledge comes with great responsibility. Today, I still "break" systems, but with a different purpose: to build, not to destroy. And that, perhaps, is the most important lesson any developer can learn.
So... What would you do?
Thanks for reading!
This post was written with the help of Gemini.