Reality Check: Validation for Marketers

Most businesses fail because they scale before validating. Validation isn't a phase you complete and move past. It's a discipline you maintain at every stage. Test cheap, learn fast, scale only what works.

The difference between a business that compounds and one that collapses is usually just this: one founder kept asking "does this actually work?" The other assumed it did.

Why Most Businesses Scale Too Early

Excitement replaces evidence. You built something. It feels good. You want to grow it. But feeling good isn't data.

The gap between "I think this will work" and "I know this works" is where most money gets wasted. That gap feels small when you're inside it. From the outside it's a canyon.

You've seen your mum use the product. Your co-founder loves it. You showed it to five mates and they all said it was brilliant. This is not validation. This is selection bias and politeness.

Real validation means strangers with the actual problem will exchange money for the solution. Not friends. Not family. People who have no reason to be kind to you.

Most founders skip this step because it feels risky. What if people don't want it? What if the answer is no? Better to stay in the land of assumptions where the answer is always yes.

But that's backwards. It's riskier to build for six months and launch to silence. It's riskier to hire before you know the model works. It's riskier to spend months on positioning before you've proven people will buy.

The founders who move fast are the ones asking uncomfortable questions early.

The $50 Test

Before you build anything substantial, before you worry about positioning or messaging or which social platform to focus on, spend $50 on ads pointing to a landing page.

The goal is simple: can you get a real stranger with the actual problem to show genuine interest? Not hypothetical interest. Real interest demonstrated by their behaviour.

Fifty dollars. Two days. More information than six months of planning.

Most founders skip this because it feels too simple. Too crude. But simplicity is the point. You're not running a campaign. You're running an experiment.

The $50 Test: Step by Step

Step 1: Build a Landing Page

You need a landing page. Nothing fancy. A simple one-pager that lives somewhere online.

Use Carrd (€15 one-time, zero learning curve). Use Wix (zero cost to start). Use Webflow. Use Notion embedded as a public page. The tool doesn't matter. The page matters.

Your landing page needs five elements:

A headline that describes your offer in one sentence. Not "Buy My Course." Rather: "Learn the skills that make freelance writers £5,000+ per month." Specific, benefit-focused, one sentence.

A problem statement. Two or three sentences about the problem people have. "Most freelance writers charge too little because they don't know how to position themselves." Something your target audience will nod at.

A description of your solution. What does the offer do. Does it give them skills. Confidence. A community. Access to something exclusive. A year of support.

A clear call-to-action button. "Get Early Access" or "Join Waitlist" or "Pre-Order" or "I Want In." Not "Submit." Not "Click Here." Something that signals what they'll actually get.

An email capture form. You need to know who was interested. One field minimum: email address. That's it. You might want a name field as well.

Optional: add a price. If you know what it will cost, show it. If you don't, you can say "Early price: £97, regular price £297" (or whatever). Knowing you'll charge something filters for people serious about the problem.

The page should be 200 words max. Every word should be doing work. No fluff. No corporate speak. No promises you're not sure you can keep.

Step 2: Write Your Headline

This is the most important part. Your headline is why someone clicks the ad.

Bad headline: "Freelance Writing Course." They see it and scroll.

Better headline: "How a £100-a-day freelance writer started charging £1,000+ per article." Someone scrolls. This is interesting.

Even better headline: "The positioning framework that got me from £100 to £1,000 per article in six months." This is specific, time-bound, and result-focused.

Your headline should answer: why would someone care about clicking this ad? What problem does it acknowledge. What result does it hint at.

Test one headline for the first £25. If it's not working (clicks are expensive or non-existent), pause it and try a new one. Don't overthink it. You're learning which angle resonates.

Step 3: Create Your Call-to-Action

Your button text needs to be clear about what happens next.

Weak: "Learn More" or "Click Here." These are generic and don't signal value.

Better: "Get Early Access" (suggests limited slots). "Join Waitlist" (clear next step). "Pre-Order Now" (they pay upfront). "I Want This" (casual, direct).

The best CTA matches your offer. If you're doing early access with a discount, use "Claim Early Discount." If it's a waitlist, use "Join the Waitlist." If they can pre-order, use "Pre-Order."

On your form, you can ask one question after they submit their email. Not required, but useful: "What's your biggest challenge with [the thing you're solving]?"

You'll get a handful of responses. Read them. These responses are gold. They tell you how people describe the problem. How they think about it. Whether your framing matches theirs.

Step 4: Run Your Ads

Two platforms work well for a first test: Google Search and Facebook.

Google Search: People are searching for something related to your space. Run search ads on keywords like "learn freelance writing" or "freelance writing rates" or "freelance writing community." You only pay when someone actually clicks. Budget: £25. Daily budget: £12.50.

Facebook: You can target people by interest and behaviour. Target interests like "freelance writing" or "side hustles" or "solopreneurs." Budget: £25. Daily budget: £12.50.

Run for two days minimum. That window lets you see patterns without the noise of a single day. After two days, pause and look at the data.

Write your ad copy in the same tone as your headline. Short sentences. Benefit focused. Clear about what they get.

Step 5: Measure and Learn

Two days later, look at four metrics.

Click-through rate (CTR): how many people saw your ad and clicked it. Anything above 1.5% is solid for a first test. Anything below 0.5% means your headline or audience might be off.

Cost per click (CPC): how much you paid per click. Your £25 divided by your clicks. If you got 50 clicks, that's £0.50 per click. This matters because it tells you whether your audience is efficiently reachable on this platform.

Conversion rate: what percentage of people who clicked actually gave you their email. If 50 people clicked and 10 gave their email, that's a 20% conversion rate. Twenty percent is solid. Ten percent means the form is confusing or your landing page isn't clear. Five percent means your headline attracted the wrong people or the offer is unclear.

Total conversions: the raw number. Did anyone actually fill in the form. That's your signal. One conversion is signal. Zero conversions is a data point: either traffic problem or offer problem.
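The four metrics above are simple arithmetic. A minimal sketch, using made-up numbers in place of whatever your ad dashboard reports:

```python
# Hypothetical figures for illustration -- substitute your own from the ad platform.
spend = 50.0        # total ad spend (£)
impressions = 4000  # people who saw the ad
clicks = 60         # people who clicked through
conversions = 9     # people who gave an email

ctr = clicks / impressions       # click-through rate
cpc = spend / clicks             # cost per click
conversion_rate = conversions / clicks

print(f"CTR: {ctr:.1%}")                          # above ~1.5% is solid for a first test
print(f"CPC: £{cpc:.2f}")                         # how efficiently this platform reaches your audience
print(f"Conversion rate: {conversion_rate:.1%}")  # ~20% is solid; below ~10% points at the page
```

None of these numbers are benchmarks from the platforms themselves; the thresholds are the rough ones stated in this chapter.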

Read through the responses to your "biggest challenge" question. Look for patterns. Not one person saying one thing. Patterns. Multiple people saying the same problem in different words.

That's your feedback loop. Fifty dollars. Two days. Real data.

What you'll know after the test:

Nobody clicked: traffic problem. Your headline or audience targeting isn't resonating. Fix the headline first. Try a different angle. Or your audience is too broad (targeting "marketing" instead of "B2B SaaS marketing").

People clicked but nobody converted: offer problem. Your landing page isn't clear about what the offer actually is. Or your headline attracted the wrong people. Or the offer is genuinely not interesting to the people who clicked. Go back to the form responses. Read them. Do you understand the problem as they see it. Adjust your positioning.

People converted: you have signal. Your headline and offer resonate with the audience you're targeting. This is worth testing further. Run another £50. Dial in your best-performing ad. See if signal repeats. Once you've confirmed it's repeatable, move to the Sales Test.

Three Checkpoints

Every business passes through three validation gates.

Checkpoint 1: Interest Test

Do people want this? Not your mum. Not your inner circle. Strangers on the internet with the problem.

This is the $50 test. Landing page. Ads. Email capture. Pre-order.

What you're measuring: did people click? Did they give you their email? Did they pre-order?

If interest is there, move to checkpoint 2.

If interest isn't there, you have a traffic or messaging problem. Fix it before moving forward.

Checkpoint 2: Sales Test

Will people actually pay. Not "would you buy this?" which means nothing. Not a survey about interest. Real humans with the real problem, giving you real money.

This is where most founders stall. You got fifty pre-orders and now you're terrified to follow up because what if they change their minds.

But that's exactly where validation lives. In the moment someone decides the problem is painful enough to pay.

The Sales Test is a conversation with people who've shown interest. Not selling them. Talking to them. Learning whether their problem is real enough to spend money on.

Your questions matter. They need to go deeper than "do you like my idea."

Start with: "What made you interested." Listen for the answer. Not the polite answer. The real answer. "I'm frustrated that my clients don't value my work" is a real answer. "It sounds cool" is noise.

Then: "How are you solving this problem right now." This matters. If they say "I'm not, it's just an annoyance," that's different from "I pay someone £500 a month to help with this." The second one signals urgency.

Then: "What would change if you had my solution." Think carefully about their answer. Would it make their job easier. Would it make them more money. Would it reduce stress. Or would it just be nice to have.

Then, if they're still engaged: "Would you pay for this right now." Not "would you buy this eventually." Right now. The specificity matters. Someone might say "yeah, eventually" to almost anything. "Right now" filters for urgency.

Then if they say yes: "What would you pay." Listen to the number. Don't anchor them. Don't suggest a price first. Let them propose one. Their number tells you what the problem is worth to them. If they say £50, they don't think it's urgent. If they say £500, they're serious.

Take detailed notes. Three to five conversations like this will show you patterns. Don't just ask one person. Ask five. Ten if you can.

Look for these patterns:

Are they describing the same problem you think they have. Sometimes what you think is the main problem isn't. They might describe different worries.

Do they all agree on how urgent it is. If person one says "I'd pay £200 for this" and person five says "I don't think I'd pay anything," you have a consistency problem. Maybe person five doesn't have the problem. Maybe you're targeting wrong.

Are they comparing you to a competitor. If they mention what they're currently paying someone else, that's your benchmark. If they say "I spend £300 a month on [competitor solution]," they're signalling they're willing to spend money on this category.

The temptation is to have these conversations only with people who said yes to the initial test. But talk to people who showed interest but didn't convert too. "I filled in the form but didn't pre-order because..." Their objections are valuable. Are they worried about the price. The timeline. Whether it actually works. Whether you're credible.

These are the real objections. Not hypothetical ones. Real ones from real people.

If people pay, move to checkpoint 3. But "people pay" doesn't mean everyone. It means a consistent pattern. Three out of five say they'd pay £200 right now. That's signal.

If they don't, you don't have a product problem. You have an offer problem. Maybe the price is wrong. Maybe the positioning is off. Maybe the delivery timeline doesn't match their urgency. Maybe you're talking to the wrong people. Take the feedback and adjust.

Checkpoint 3: Math Test

Does the business make sense. Not the product. The business. The unit economics.

Lifetime Value versus Customer Acquisition Cost. LTV versus CAC.

LTV is how much profit a customer generates over their relationship with you. If you charge £500 a year, the average customer stays for two years, and you spend £50 to serve each one, your LTV is roughly £950.

CAC is what it costs to acquire one customer. Ads, content, community management, sales conversations. All of it.

The basic rule: LTV needs to be at least three times CAC. If you're spending £100 to acquire a customer and they generate £100 in lifetime value, the business model doesn't work. When you factor in overhead and mistakes, you're underwater.
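The LTV versus CAC check can be written out as a few lines of arithmetic. A sketch using the illustrative figures from above (the 3:1 threshold is the rule of thumb stated in the text, not a universal constant):

```python
# Illustrative figures only -- replace with your own numbers.
price_per_year = 500.0   # what a customer pays you annually (£)
years = 2                # average customer lifetime
cost_to_serve = 50.0     # total delivery cost over that lifetime (£)
cac = 100.0              # cost to acquire one customer (£)

ltv = price_per_year * years - cost_to_serve  # lifetime profit per customer
ratio = ltv / cac

print(f"LTV: £{ltv:.0f}, CAC: £{cac:.0f}, ratio: {ratio:.1f}:1")
print("Scale." if ratio >= 3 else "Fix the economics before scaling.")
```

At a 3:1 ratio or better the overhead and mistakes have room to land; below it, more volume just loses money faster.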

If the math doesn't work, no amount of scaling makes it work. You can't buy your way out of broken unit economics.

If the math works, you have signal at all three checkpoints. Now you can scale.

The Sunk Cost Trap

This is the hardest part of validation.

You've spent six months building. You've invested your own money. You've told people you believe in this. Your ego is on the line. You've sacrificed other opportunities to focus on this.

Then validation says: this might not work.

The temptation is overwhelming. Keep going. Just a bit more. Maybe the next version will resonate. Maybe I've just been talking to the wrong people. Maybe I need to wait for the right market moment.

That's the sunk cost trap. You're justifying continuing investment because of past investment. That's how good money follows bad.

Here's what you need to know: the six months you've already spent are gone. Completely irrelevant to the decision you're making right now.

The only decision that matters now is: if you had no history, no sunk cost, no ego involved, would you invest the next £500 in this idea based on the evidence in front of you.

If the answer is no, stop. Not because the idea is bad. But because the evidence doesn't support continued investment right now.

You can revisit it later. With different positioning. With a different audience. With a better understanding of the problem. But continuing to invest when validation isn't there is expensive.

This isn't giving up. This is being honest about what the market is telling you.

The founders who win are the ones who can kill ideas ruthlessly. Who can look at validation signals and say: this isn't working yet. I'm moving on. Or I'm pivoting. Or I'm trying a different angle.

The founders who lose are the ones who ignore validation signals because they've already invested.

Validation requires honesty. Even when honesty is expensive.

What Good Validation Looks Like

Here's what you need to understand: you're not trying to prove the business will work. You can't. Nobody can.

You're trying to prove it's worth the next bet.

Good validation isn't 100% certainty. You never get that. Good validation is: "enough signal to justify the next £500 of investment."

Let's be concrete about what that looks like at each checkpoint.

At Checkpoint 1 (Interest Test):

You ran £50 in ads. You got fifty clicks. Three people filled in the form. That's signal. Three conversions from a £50 test is solid (at around £1 per click, that's roughly a 6% conversion rate).

This doesn't prove your business will work. It proves that the problem and solution you're describing resonate with some people. You're worth testing further.

Good validation at this stage means: "enough people are curious enough to give you their email." Not many. Just some. And importantly, it doesn't mean they'll pay. It just means they're interested.

At Checkpoint 2 (Sales Test):

You talked to ten people who showed interest. Seven said they'd pay for this right now. They suggested prices ranging from £150 to £500. Five of them are specific enough that you believe they're serious.

That's good validation. Not perfect. Not proof. But signal that a segment of your audience will actually trade money for your solution. That they're not just interested, they're motivated.

Bad validation at this stage looks like: you talk to ten people. One person says she'd maybe pay something. The other nine say "sounds interesting but I'm not sure I'd actually use it."

Good validation means: most people you talk to describe a real problem. Most people you talk to would actually pay. Most people you talk to can articulate why they'd pay.

At Checkpoint 3 (Math Test):

You need three numbers.

Customer acquisition cost: you spent £50 and got five paying customers. That's £10 per customer so far. It's a small sample. But it's your starting number.

Lifetime value: you're charging £200 upfront. Your service costs you £20 to deliver. That's £180 profit per customer. If the average customer stays for one year and you deliver the service multiple times (or sell them something else), your LTV might be £400 or £500. Not £180, because you're learning that customers buy more than once.

The ratio: if your LTV is £400 and your CAC is £10, you have a ratio of 40:1. That's incredible. That's more than the 3:1 baseline. That's signal to scale.

But here's the thing: those numbers are fragile. Your CAC might go up as you buy more ads. Your LTV might go down if people don't buy as much as you hoped. So you don't scale hard yet. You scale gradually. You hit £100 in spending and see if the ratio holds.

Good validation at this stage means: "the math works on the data I have, even if it's limited, and I believe it will hold when I test further."

Bad validation means: "the math only works if everything goes perfectly."

The Validation Cycle

Here's what good validation looks like over time.

Month 1: £50 test. You get interested people. You confirm there's a signal.

Month 2: £200 in ads. You talk to people who convert. You're learning about their pain points.

Month 3: You've talked to thirty people. Twenty of them would pay. You've done three pilot deals. The math works if you acquire customers at current cost.

Month 4: You've refined the offer based on what you learned. You run another £200 test. Conversion is better than month 1.

Month 5: You've done ten paid deals. Revenue is happening. The model is repeating.

At each stage, you're answering the question: is this worth the next bet. Not: will this be a £10m business. Just: is this worth another £500.

Yes? Run the next test. No? Stop or pivot. The discipline is asking the question at every stage.

This is how you avoid the sunk cost trap. At every stage, you're making a small bet. £50. £200. £500. If it doesn't work, you've spent £500. That's not a failure. That's information.

The Permission This Gives You

Validation is permission to stop wasting time on ideas that don't have signal.

Most businesses don't fail because the founder didn't work hard enough. They fail because the founder worked hard on the wrong idea, for the wrong people, at the wrong time.

Validation isn't a phase you complete in month one. It's a discipline you maintain for the life of the business.

The businesses that compound are the ones that test, learn, adjust, and repeat. Every single quarter. Every new market. Every new offer.

The sunk cost trap is thinking that past investment guarantees future success. It doesn't. Past investment is gone. Future success depends on whether the next market signal supports continued investment.

This is how you separate successful founders from busy founders. Successful founders test and scale what works. Busy founders keep pushing what doesn't work because they've already invested.

The Bullshit Police

Every metric needs one test: does this connect to actual business results?

Page views don't connect unless they lead to something. Followers don't connect unless they convert. Engagement doesn't connect unless it drives revenue. Comments and shares feel like progress. They're often noise.

If you can't draw a clear line from a metric to a business outcome, that metric is noise. Platform metrics are designed to keep you on the platform. They're not designed to tell you whether your business is working.

This is where most businesses get lost. They optimise for the wrong things. More posts because that drives engagement. Bigger list because that feels like progress. Better analytics dashboard because it looks more professional.

Meanwhile the actual levers sit untouched. Traffic stays flat because nobody's running real ads or building real distribution. Conversion stays flat because the offer hasn't been tested.

The bullshit police is simple. Ask: does this metric predict revenue?

If yes, track it.

If no, ignore it.

This cuts through most of the noise.

MER: The One Number That Matters

There's one metric that ties everything together. Marketing Efficiency Ratio. Total revenue divided by total marketing spend.

If you spent £5,000 on marketing and generated £20,000 in revenue, your MER is 4. You made four pounds for every pound you spent.

Not attribution. Not multi-touch models that let you take credit for every touchpoint. Just: how much did we spend, and how much did we make?

Simple. Hard to game. Cuts through the attribution debates.

A MER of 3 is breakeven once you factor in overhead. Anything above 3 is profit. Anything below 3 means you're spending more than you're making.
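Because MER is one division, it's hard to argue with. A sketch with the example figures from above (the breakeven point of 3 is this chapter's rule of thumb, which depends on your overhead):

```python
# Hypothetical month of figures.
revenue = 20000.0         # total revenue (£)
marketing_spend = 5000.0  # total marketing spend, all channels (£)

mer = revenue / marketing_spend
print(f"MER: {mer:.1f}")  # pounds back per pound spent

if mer < 3:
    # Only two levers: revenue per customer up, or acquisition cost down.
    print("Below breakeven once overhead is counted.")
else:
    print("Profitable at this spend level.")
```

No attribution model, no per-channel debate: total in, total out.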

Most businesses don't calculate this because it's scary. What if it's terrible. Better not to know.

But knowing is how you fix it. If your MER is 2, you need to either increase revenue per customer or decrease customer acquisition cost. Those are your only two levers. Knowing the number tells you which problem to solve.

Businesses that track MER obsessively outrun businesses that don't. Because they know exactly whether they're working.

Validation as Ongoing Discipline

You don't validate once and move on.

Every new channel needs validation before you invest heavily. The channel that worked last quarter might be saturated this quarter. The price that sold well might face resistance at scale. The offer that resonated with your first cohort might not work for the next one.

The discipline is asking "does this actually work?" at every stage, not just at the start.

This is counterintuitive. It feels like you should build momentum. Keep accelerating. But businesses that survive are the ones that stay paranoid. That test before they scale.

It's the difference between a founder who got lucky once and a founder who built a business. One got an arbitrage window and rode it. The other keeps validating at every stage.

Validation is how you separate signal from noise. It's how you know which lever to pull next.

Run the $50 test. Pass the three checkpoints. Track your MER. Then repeat at the next stage.

That's the discipline. That's how you scale.


Further reading: The Equation, The Diagnostic, The Traps