Vibe Coding? Straight to Jail



What are the consequences of vibe coding? And I’m not talking about the developer here — I’m talking about the consumer. The end user. Let me explain.

Today, there’s a culture of “ship fast, die young” entrepreneurs building internet products. If you check out the #buildinpublic movement, you’ll see thousands of apps popping up like mushrooms seemingly every day. People promote building a SaaS in a weekend and shipping it immediately. Get users, get money, iterate, fail, try again. With the recent popularity of LLMs and their advancement as coding assistants, vibe coding is becoming more and more prevalent. The benefits are clear as day — they genuinely free up development time. But what do they sacrifice in return? Security.

It’s relatively easy to prompt your AI agent of choice these days and iterate on something with code. They even seem competent. Want to skip creating that boring CRUD boilerplate and glue code you’ve written hundreds of times? Just have your robot coding buddy do it. But here’s the problem: when you’re too bored to write that code, you’ll be too bored to review it properly. That’s where vulnerabilities come knocking.

We’ve all seen this play out on X. A founder promotes their new shiny app, people sign up, it gains some traction, and then… boom! The whole database gets leaked by some anime avatar account with 12 followers. They even post a thread with a writeup of the exploit: “1. Go to /users/1, then just go to /users/2 and you’ll see the other user’s info! EZ.” This is exaggerated, of course, but the majority of the time, it really is that basic.
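That "/users/1, /users/2" exploit is a textbook broken access control bug (often called an IDOR, insecure direct object reference). Here's a minimal sketch of how it happens and how to close it, using a plain in-memory store and hypothetical handler names rather than any specific framework:

```python
# Hypothetical in-memory user store. In a real app this would be a database.
USERS = {
    1: {"name": "Alice", "email": "alice@example.com"},
    2: {"name": "Bob", "email": "bob@example.com"},
}

def get_user_vulnerable(requester_id: int, target_id: int) -> dict:
    # BUG: the handler trusts the ID in the URL and returns anyone's record.
    # Any logged-in user can enumerate /users/1, /users/2, ... and dump the DB.
    return USERS[target_id]

def get_user_fixed(requester_id: int, target_id: int) -> dict:
    # FIX: check that the requester actually owns the record they're asking for
    # (or run a real authorization check) before returning anything.
    if requester_id != target_id:
        raise PermissionError("forbidden")
    return USERS[target_id]
```

The fix is one `if` statement, which is exactly the kind of unglamorous check that gets lost when nobody reads the generated glue code.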

So what are the consequences? The founder gets their reputation ruined and the app is deemed unsafe, but what about the user? What about the user whose name is now online, or their phone number, their email? The developer gets a slap on the wrist — the law usually goes after people with money, and this app doesn’t make much of it. Meanwhile, the user’s information stays online forever.

This can happen with manually written apps too; copy-pasting code from random internet forums is nothing new. But with vibe coding, it's even more likely to happen, and at a much larger scale. There should be greater consequences for blindly accepting whatever a bot spits out. We need to be more vigilant and review generated code meticulously before we even think of pushing it to our codebase. Or maybe some other AI will handle the pen testing for us?
