The $1B Rorschach Test
I have three sons, but only two of them are conversational (so far). The two chatty ones approach life a little differently. Dax is a rule follower, always looking for the right answer. Jed, on the other hand? His life is lived through a lens that none of us can see. Imagination station is the only station his train stops at.
When it comes to coloring, Dax is a sort of neoclassicist. How is it supposed to be done? Jed, though, is an impressionist. Painting with all the colors of the wind. And regardless of the shape of his drawing, he shows me the drawing, locks eyes with me, and expectantly asks, “what do you see, Dad?” His tone of voice indicates there is clearly a right answer.
Wavy lines? “It’s a dragon.” Large blob? “Dragon.” An entire page covered in purple? “Also Dragon.”
Like his own homemade Rorschach test. Though, the thing about a Rorschach test is that, unlike Jed’s drawings, there isn’t a right answer, there’s only your answer.
This week, an equally insightful Rorschach test broke out online when the New York Times published a piece about Matthew Gallagher and his company, MEDVi, that supposedly scaled to $1.8B in revenue in 14 months. People’s reactions, like any good dragon drawing, say more about them than they do about the reality of the situation.
The Test
Here’s the story.
Gallagher launched MEDVi in September 2024 from his house in LA. He used ChatGPT, Claude, Grok, Midjourney, Runway; the whole AI stack. Those tools built the website, generated marketing assets, handled customer service, and wired together the backend. All in, he spent ~$20K to get the thing off the ground.
A lot of the praise online made it seem like this vibe coding wizard had called forth the elements from the four corners of the earth and crafted a Fortune 500 from whole cloth. There’s a lot to unpack in all of that.
My first immediate takeaway was this: MEDVi is not, in any meaningful sense, a healthcare company. It doesn’t employ doctors. It doesn’t run a pharmacy. It doesn’t manufacture or distribute drugs. It doesn’t do clinical oversight. It doesn’t manage patient care. All of that, literally all of the actual healthcare, is outsourced to two platforms: CareValidate and OpenLoop Health. CareValidate connects patients with doctors and pharmacies. OpenLoop handles the clinical infrastructure. MEDVi is, functionally, a marketing layer on top of rented telehealth infrastructure.
That’s not an insult. I think we’ve all matured past the point of criticizing “wrapper companies.” Lots of businesses are marketing layers. In fact, many are saying distribution is all that matters. That “marketing layer” did $400M in its first year across 250K customers, supposedly throwing off a 16.2% net margin. Now they’re (and by they’re I mean he’s?) on track for $1.8B.
Across dozens of reactions I saw just about every version of a dragon people could have seen. So I wanted to unpack each of them.
Dragon No. 1: “This Is an AI Story”

The framing of the story from the jump is “AI built this billion dollar company.” Everyone who has been talking about “one person billion dollar companies” since ChatGPT launched saw in this headline the appearance of their promised messiah. The AI story of the century!
But here’s the thing. It’s not. Now, I’m not here to pass judgement (at least… not in this particular paragraph). But whatever the NYT wants to scream, and whatever honors Tech Twitter is lining up to bestow, framing this as an AI story misses the mark.
Here is the actual stack of preconditions that had to be true for MEDVi to exist:
First, telehealth infrastructure had to exist at a level where a random guy in LA could plug into a licensed physician network and a compliant pharmacy supply chain with essentially no healthcare expertise. That infrastructure has been years in the making, accelerated by COVID, and is what CareValidate and OpenLoop built (with ~700 employees, btw). Not him.
Second, the regulatory environment around telehealth had to be in a state of, let’s call it, permissive ambiguity. During COVID, the bumpers on the healthcare industry got thrown out the window. Restrictions on telehealth prescribing relaxed dramatically. A lot of the oversight mechanisms that existed to prevent exactly this kind of thing just… got quiet. And many of them haven’t come back.
Third, GLP-1 demand had to be absolutely insatiable. And it is. The demand for weight-loss drugs in America right now is, to use a technical term, bonkers. Wegovy, Ozempic, compounded semaglutide — people want it, they want it now, and they don’t want to go sit in a doctor’s office to get it.
All three of those things had to be true. Independent of AI. Gallagher could have done this seven years ago if those three conditions had existed seven years ago. The AI parts, whether it’s the website generation, the ad creative, or the chatbot, are all certainly nice, and I’m sure they were faster and cheaper than they would have been seven years ago. Efficient, no doubt. But it’s the wrapping paper, not the gift.
As one person on Twitter put it: calling this an AI company is like calling a Shopify dropshipper a “manufacturing company” because they have a nice storefront. The storefront isn’t the business. The supply chain is the business. And the supply chain here is 100% human, 100% regulated, and 0% built by ChatGPT.
Dragon No. 2: “This Is Genius Execution”
Some people looked at the story and saw something more refined: a visionary growth hacker. A generational marketing mind. A guy who cracked the code on distribution in the AI age.
The “congrats” and “total Chad” and “get it, bro” sentiment was… robust. My favorite genre was the “they’re just mad cause they didn’t think of it” crowd. Classic.
Again, I’m not here to pass judgement (that paragraph comes later). Getting to $400M in revenue in your first year is an extraordinary marketing achievement. The conversion funnel, the Facebook ad optimization, the sheer volume of customer acquisition all take real skill. Anybody who tries to deny that has never tried to do anything similar themselves. In telehealth, you can think of a particular concept we’ll call “prescription friction,” which is basically the drop-off rate between someone clicking an ad, landing on a site, and then actually completing a prescription. MEDVi clearly reduced that friction to near zero.
But here’s the thing about reducing friction to near zero in healthcare: some of that friction exists for a reason.
The Quiet Part Out Loud
If you’ve been reading my writing for a while, you know that one of my most consistent themes is the importance of recognizing when people are playing different games. And I recently wrote about how every once in a while, someone says the quiet part out loud. They tell you exactly who they are and what game they’re playing, and most people just… don’t listen.
Gallagher is shouting the quiet part from the rooftops. A lot of people pointed out he should maybe be shouting less loudly (and in mediums that aren’t admissible as evidence in a court of law). And what he’s screaming is: I built a marketing machine on top of rented healthcare infrastructure, reduced every possible safeguard to minimum viable compliance, and rode the greatest demand wave in pharmaceutical history to a billion dollars in revenue.
But immediately, upon hearing the congratulatory shouting, a good swath of good ol’ tech twitter felt their eyebrows start to rise. Here’s what our little army of online sleuths pulled together.
MEDVi’s website featured fabricated doctors. People found physician profiles with clearly fake names like the “you’re telling me that name’s not real” Dr. Tuckr Carlzyn, attached to stock photos that have no corresponding medical professional anywhere on the internet. If you put fake doctors on your website to sell drugs, that isn’t technically “growth hacking.” In fact, the technical term is checks notes fraud.

On top of that, the site’s intake process has essentially no meaningful clinical guardrails. Multiple people tested it by entering absurd parameters. One person said they wanted to go from their current weight to 60 pounds. Approved. Another entered a birthdate of February 31st. Accepted. The system told a 7’11”, 350-pound person they had a 94% chance of hitting their goal weight. It even included the helpful note that they wouldn’t even need to change their diet.
Pretty rough.
It’s not just consumers noticing stuff. Even before the New York Times piece came out, MEDVi had received an FDA warning letter for false and misleading information violations. The violations weren’t just at the “marketing layer,” they were at the infrastructure layer too! MEDVi’s clinician network partner, OpenLoop, suffered a data breach in January 2026 that exposed 1.6M patient records. That, plus AI-generated fake before-and-after pictures, a class action lawsuit in Delaware… and ALL of this was BEFORE the New York Times got wind of the story and thought “this is about AI!”

The biggest flaw here lies in the question of the “one person billion dollar company.” Billion dollars of what? Revenue? Here’s the thing. When your margins have 3-5 different exorbitant “takers” then, at some point, it’s more of a vanity metric than an actual worthwhile number. If you’re (1) paying two other companies for your entire physical infrastructure, (2) paying ungodly customer acquisition costs in an insanely competitive and completely commoditized market, (3) constantly losing customers to whoever has the latest “sign up now” discount, and (4) it’s only a matter of time before “lawsuit costs” becomes a consistent part of your P&L, then I’m pretty hesitant to use “revenue” as an anchor for any company’s value.
Hims & Hers did $2.4B in GLP-1 revenue with 2.4K employees and generated 5.5% net margins. If people anchor to this guy doing ~16% margins with one person (two if you include his younger brother), then it’s a glorious tale of AI’s efficiency gains. But, in my mind, that margin difference is a function of what other legitimate players are paying to adhere to compliance standards. If I don’t have to actually worry about the patient’s care experience, then sure… that saves some nickels.
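To make that concrete, here’s a quick back-of-envelope using only the revenue and margin figures quoted above (the per-employee framing is my own, and the headcounts are the approximate ones from the article):

```python
# Back-of-envelope margin math using the figures quoted in the article.
# Revenue and net-margin numbers come from the piece; everything else is framing.

def net_profit(revenue, net_margin):
    """Net profit implied by a revenue figure and a net margin."""
    return revenue * net_margin

# Hims & Hers: $2.4B GLP-1 revenue at a 5.5% net margin, ~2,400 employees.
hims_profit = net_profit(2.4e9, 0.055)            # $132M
hims_profit_per_employee = hims_profit / 2400     # $55K per employee

# MEDVi: $400M first-year revenue at a claimed 16.2% net margin, 1-2 people.
medvi_profit = net_profit(400e6, 0.162)           # $64.8M

print(f"Hims & Hers implied net profit: ${hims_profit / 1e6:.0f}M")
print(f"MEDVi implied net profit: ${medvi_profit / 1e6:.1f}M")
```

Even at roughly a third of the margin percentage, the compliant player throws off about twice the absolute profit, which is part of why anchoring on a margin percentage alone is misleading.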
Crime Pays, Actually
When someone says they (he) “vibe coded a billion-dollar business,” I instinctively start running the numbers. And the numbers here tell a very specific story: you can make a lot of money selling drugs if you don’t care about the health of the people you sell them to.
Now, importantly, that has always been true. Snake oil, peddled by the quintessential “guy selling nothing while promising everything,” wasn’t a motor oil or a skincare oil. It was OG D2C healthcare. And, shockingly, I’m still not here to judge. Cause that will (I promise) come later. But I can’t stand by and nod to the dragons people are seeing in this odd Rorschach test. It’s not about AI. It’s not about “distribution is all that matters in a meme economy.” It’s not just “the golden age of the idea guy.”
The main takeaway is this: If you remove most (all) of the regulatory guardrails around selling a supposed miracle drug, all while using fabricated medical testimonials and sourcing grey market, if not black market, products that are divvied out with little (no) clinical oversight, and ride that demand wave for all it’s worth, then sure. “Crime” pays. As someone very wisely tweeted about this particular case: at some point, “move fast and break things” just became “move fast and break laws.”
The Disappointment
Now, finally, we arrive at the moment of truth. Here is where I have come to judge.
But I’m not judging Matthew Gallagher and MEDVi. I’m not even judging telehealth regulators or growth hackers or even, apparently, snake oil salesmen. COVID was tough, and the paradigm shift alone is a lot to live through, never mind the miracle drug that is flooding the market.
No… the people I’m disappointed with are the journalists, founders, operators, and pundits who are happy to lift this up as a textbook case of AI opportunity when it clearly is not.
I am deeply optimistic about the power of technology, AI, private markets, innovation, and all that jazz. But this is NOT the poster child of that. And the fact that people are so desperate for the “one person billion dollar company” made possible by AI, that they’re willing to overextend this analogy to check that box?
This is why technology has a public relations crisis. Not just in AI, but across crypto, fintech, longevity, and everything in between. One person on Twitter said it best:
“I hate what a lot of VCs are doing to builders and AI startups. They’re hyping up a lot of what’s either blatant fraud or not real revenue numbers and attributing it to AI. I get it, you want your AI acceleration and 2-person 1B company narrative to stick. But in your rush to prove your investment theory correct, you’re hurting the credibility of AI companies… We don’t need to turn AI into yet another one of those pump-and-dump cycles. Do your due diligence, stop pushing false narratives. We want public to trust startups coming out of valley and your false Twitter narratives are painting a target on the back of the companies.”
What Do You See?
When you look at the MEDVi story, you see yourself. You see your incentives. You see the game you’re playing.
If you’re playing to make AI the default tool the world over, you see validation. The prophecy fulfilled. The one-person unicorn.
But if you’re playing to survive for the long-haul? If you’re trying to actually build something durable, something real, something that helps people and lasts? You might see something much simpler.
You see a guy who made a crappy website, slapped it on top of someone else’s healthcare infrastructure, pointed Facebook ads at people desperate for GLP-1s, and didn’t bother to check whether the fake doctors on his site would pass a Google search or if losing 85% of your body weight would checks notes kill you. And you see an entire industry falling over itself to call that genius.
The problem is that nobody thinks there’s a right answer. And when I look at Jed’s drawings? That’s true. They can all be dragons, why not? But in the world of business building, customer needs (and potential death), margin pressure, and durability? There are right and wrong answers. And I don’t think this is the right answer.