How to Turn Facebook Ads Into A Market Research Lab

From Phil V.

Insight Source: Verifiably spent $25M+ on Meta ads, generated $55M+ in revenue for D2C brands.

When Phil Vilk started CreativeLaunch, he followed the playbook every creative agency follows: produce more volume, test more concepts, deliver more ads.

For months, he built an entire production system around one core belief: creative volume and diversity were the path to scaling Facebook ads. He could recreate any video from Foreplay, any ad from Facebook Ad Library, any concept from Instagram. His team turned existing content into infinite variations.

And it worked. To a point.

Clients would see winners. Ads would rip. Sales would come in. But then clients would ask the question Phil couldn’t answer:

“Why did it work?”

Phil would explain: “We modeled a good video. We followed the format. The hook was proven.”

“Okay, but how do I get more of it to work?”

Phil realized he had no answer with high conviction. He’d built a system that could model someone who probably copied someone who probably copied someone else.

And then came the bigger problem: “When something stops working, you don’t know why. When something works, you cannot recreate it.”

This realization led Phil to a complete paradigm shift — one that changed his entire business model, and the fundamental way he thinks about Facebook advertising.

This is the story of how Phil transformed from a creative production agency to a market intelligence company, built a repeatable process for discovering customer insights through advertising experiments, and developed a philosophy that challenges everything most media buyers believe about testing.

What Becomes Expensive When Everything Is Free?

Phil’s transformation started with a simple observation about AI.

“When Claude first got released, maybe 10% of things I would output would go to production. I’m at like 80 to 90% of things that I have a conversation with actually goes live and is customer facing.”

That’s not incremental improvement. That’s a fundamental shift in what’s possible.

And it led Phil to a stark realization: “It’s only inevitable until somebody cracks instant video generation.”

If AI can already produce 80-90% production-ready work for copy, strategy, and analysis, video generation is next. And when that happens, the cost of producing content goes to effectively zero.

“When everything becomes free and cheap, what becomes expensive?”

Phil’s answer came from a famous story:

“A consultant comes into a nuclear plant and the thing’s melting down. They pay $50,000 to solve the problem. On the line item: the cost to solve the problem, $49,999, and the Sharpie to circle where the actual problem was, $1.”

Knowing what to fix is the expensive thing.

This applies directly to media buying and creative production:

When AI can generate infinite video variations for free, when anyone can produce 100 ads per week, when creative production becomes commoditized, the only moat left is knowing what to produce.

Not how to produce it. Not how fast you can produce it. What specific ad to make for what specific audience.

“The actual expensive thing is knowing what thing to actually do.”

This realization fundamentally changed how Phil thought about his agency. He wasn’t in the content production business. He was in the knowledge business — using Facebook ads as a research tool to discover what actually works and why.

The Ultimate Survey Tool You’re Already Paying For

Most businesses use Facebook for one purpose: getting sales.

Phil realized they should be using it for two: getting sales and understanding customers.

“The actual value of a business or of the ads is less about getting sales directly and more about understanding the customer — using it as almost like market research where Facebook is the ultimate surveying tool.”

Think about what Facebook offers as a research platform:

  • You can reach billions of people at any given point
  • You can test hypotheses in 3-7 days
  • You get real behavioral data, not survey responses
  • When someone buys, they’re voting with their wallet, not giving opinions

“If somebody actually buys off of a Facebook ad through a random funnel you built, that says a lot about the product, the positioning, and the piece of content that convinced them to purchase. That’s huge signal.”

Compare this to traditional market research:

Focus Groups: You pay people to give opinions about whether they’d buy. They’re influenced by the group dynamic and trying to be helpful.

Surveys: You pay people to say what they think they’d do. But stated preference is not revealed preference.

Facebook Ads: You’re not influencing them beyond the ad itself. If they bought, they actually were convinced to buy the product. That’s validated signal.

The key insight: “Most businesses spend $200,000, $300,000, $400,000 a month on ads — $5 million a year. If you’re purely using that just to get sales, you’re missing so much opportunity.”

You’re already paying for the world’s best market research tool. Why only use it for one thing?

The Inverted Pyramid: Where Everyone Starts Wrong

This led Phil to develop what he calls the “inverted pyramid” framework — and it explains why most media buyers waste money testing.

“Too many people are focused on the format. What am I supposed to make? Who can I copy? And they’re starting on the foundation where it’s unstable and the thing breaks.”

Here’s how most media buyers approach testing:

  • Start with format: What creative should we make? Talking head? UGC? Testimonial?
  • Look for inspiration: What’s working for competitors? What’s in the swipe file?
  • Test variations: Make 20 different videos and see what sticks
  • Hope something works: When it does, you don’t know why. When it doesn’t, you don’t know why.

Phil’s framework flips this completely:

  1. WHO — Who are the distinct audiences buying this product?
  2. WHY — Why does the winning audience buy? (The angle)
  3. WHAT — What’s the best way to show it? (The format)

“Once you fundamentally know the audience, it’s like a stair step, it unlocks the next step.”

The Dual Monitor Case Study

Phil uses the example of a product that attaches dual monitors to your laptop — those USB-powered side screens for working on the go.

Most media buyers would start by asking: “What creative concepts should we test?”

Phil starts by asking: “Who are the types of people that are buying this product?”

Audience Hypothesis:

  • Work-from-home people who want a better setup (5 days/week remote)
  • Office workers who don’t have monitors at their desk
  • Digital nomads traveling country to country with just a backpack

Notice the criteria for distinct audiences: “Would somebody working from home be traveling in different countries? Probably not. Would somebody working at an office be traveling? Probably not.”

If audiences overlap completely, they’re not distinct. You need genuinely different customer segments.

Once you have three distinct audiences, you test them in isolation:

  • Same product
  • Same website
  • Same budget
  • Same targeting (broad)
  • Different messaging specific to each audience

You run this for 7-14 days and see which audience responds best.

“Generally those three individual ads inside of that ad set are specific to that ad set. I can’t be confident in scaling this yet, but interesting, people working as nomads is outperforming all these other things by 30%. We should explore this more.”

Now you have knowledge. You know nomads are the best audience.
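The comparison behind that call can be sketched in a few lines. This is not Phil’s tooling — just a minimal illustration of ranking audience ad sets by cost per acquisition; the audience names and spend/purchase numbers are hypothetical.

```python
# Minimal sketch: compare cost-per-acquisition across audience ad sets.
# Audience names and numbers are hypothetical illustrations, not data
# from Phil's actual test.

def rank_audiences(results):
    """Return ad sets sorted by CPA (spend / purchases), cheapest first."""
    ranked = sorted(results, key=lambda r: r["spend"] / r["purchases"])
    for r in ranked:
        r["cpa"] = round(r["spend"] / r["purchases"], 2)
    return ranked

results = [
    {"audience": "work-from-home", "spend": 700.0, "purchases": 14},
    {"audience": "office-worker",  "spend": 700.0, "purchases": 13},
    {"audience": "digital-nomad",  "spend": 700.0, "purchases": 20},
]

ranked = rank_audiences(results)
winner, runner_up = ranked[0], ranked[1]
lift = (runner_up["cpa"] - winner["cpa"]) / runner_up["cpa"]
print(f"Winner: {winner['audience']} at ${winner['cpa']} CPA "
      f"({lift:.0%} cheaper than next best)")
```

Equal budgets per ad set are what make the CPA comparison meaningful — if one audience got more spend, you’d be measuring the budget, not the audience.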

Next question: Why do nomads buy? (The Angle)

  • Productivity: Get more done with more screen real estate
  • Professionalism: Look more professional on video calls
  • Portability: Maintain your setup anywhere in the world

Test these three angles with nomad-specific messaging. Same format, same targeting, different “why.”

When you identify the winning angle (let’s say “portability”), now you ask:

What’s the best way to show this? (The Format)

  • Talking head video of a nomad explaining their setup?
  • Timelapse of someone setting up in different countries?
  • Before/after of laptop-only vs dual monitors?

“Once you build on the audience and angle, you know. You’re not starting at the format where it’s unstable.”

The Problem With Testing Everything At Once

Phil gives a stark assessment of what most agencies and media buyers do wrong:

“With all these AI editing tools, you just combine so many variables together that when something works, when a creator hits, when a video rips on Facebook, you don’t know why.”

“Was it the environment that changed in the background? Was it the creator? Was it what the creator said? Was it what they showed? Was it an overlay of text on top of it? You just don’t know why something worked.”

This is the fundamental problem with “test more creative” as a strategy:

When you change 5 variables at once, a winner tells you nothing. You don’t know which variable drove performance. You can’t replicate it. You can’t build on it.

“But the way to know is not by testing a bunch of variables. It’s isolating as many as you can until you have conviction on this thing, answer this question, and then you add a variable on top.”

Phil directly compares this to CRO (Conversion Rate Optimization):

“It’s what A/B testing should have been. If anyone is in CRO, it’s the same principles. Change one variable on the website, run it for a certain period of time, and see of the two which one does better.”
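The one-variable principle Phil borrows from CRO can be made concrete with a standard two-proportion z-test — a common (hypothetical here, not something Phil prescribes) way to check whether the single changed variable actually moved conversion rate, rather than noise.

```python
# Sketch of one-variable A/B testing: two variants, everything held
# constant except the change under test. Conversion counts below are
# hypothetical; the z-test itself is the standard two-proportion test.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: original page. Variant B: one element changed.
z = two_proportion_z(conv_a=120, n_a=4000, conv_b=156, n_b=4000)
print(f"z = {z:.2f}  (|z| > 1.96 is roughly 95% confidence)")
```

The point isn’t the statistics — it’s that the test is only interpretable because exactly one variable differs between A and B.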

The question everyone asks: “What variables do I change?”

Phil’s answer: “The audience. Who is the ad for? Number one.”

“Most people, that’s where everyone goes wrong. People focus on how. How do I reach my person? What do I show the person? What format? What ad do I copy? Who has the best ads?”

“But the real question for your business is: Who on the other end? Who are my potential customers?”

The Data Goldmine You’re Ignoring

Once Phil realized Facebook was a research tool, he had to answer: How do you discover audiences to test?

Most agencies do this wrong. They use AI to generate “customer avatars.” They copy what AG1 or other successful brands are doing. They make educated guesses.

Phil’s approach: Mine your own data.

“Everyone, this is the thing that’s mind-blowing, is everyone will use AI to give me a bunch of audience ideas or they’ll copy a top-performing brand. But really you’re sitting on so much gold that nobody looks at their customer data.”

Where the gold is hidden:

1. Reviews (Across All Platforms)

  • Google reviews
  • Amazon reviews
  • Facebook reviews
  • Trustpilot

“What are the patterns in all the reviews? What are the names of the people? What are the genders? What are they saying the use cases are? What are they complaining about?”

2. Customer Support Tickets

“Huge data mine. You can look at somebody’s post-purchase surveys. Call transcripts if your business takes phone calls.”

3. Your Website & Ad Comments

“You can scrape an entire product page. What are your top ad comments saying? People are tagging other people that would be good fits for the product.”

“There’s so much gold there. People are begging you to understand them. They’re telling you why they bought, how they use it, who else should buy it.”

The difference between this and AI-generated personas:

  • AI personas: Invented characteristics based on similar businesses
  • Your customer data: Real people telling you exactly why they bought

Why “Off-Seasons” Don’t Exist

One of Phil’s most contrarian insights challenges how most brands think about seasonality.

“A lot of businesses are driven by seasonal demand. Valentine’s Day, Mother’s Day, Father’s Day, Christmas. Those are the spikes that we all get.”

“But what happens in between those peaks? Most people have what they call an off season. So they pull back spend, lose momentum, and everything that they built.”

Phil’s perspective: There is no off-season. You just don’t know who’s buying.

“Is your Shopify store getting zero dollars in sales? Probably not. You’re probably getting some sales. Who is buying? Why are they buying?”

The opportunity: Unlock the “off-season” by understanding the cluster.

“If you can unlock a season, you know, like perfect example is Fourth of July. It is one day, but it’s an entire season.”

For some businesses, Fourth of July is their worst period (gifting products, business services). For others, it’s their biggest month (party supplies, patriotic products, outdoor gear).

The difference: The winning businesses figured out WHO buys during that period and WHY.

“When you understand the ‘off-season’ buyer, you unlock an entire new season.”

Phil gives the gifting example:

Most brands try: “I’m gonna find people that have a birthday on March 17th and sell my product.”

“What are the odds of somebody seeing the ad and it’s not the receiver, it’s the person gifting it? They’re thinking of someone to give them a gift. I have to buy with my credit card. I have to think about what size. I have to make sure it arrives on time. I have to think about where to ship it. Do I have to package it?”

That’s why birthday targeting doesn’t work. The window is too tight. Too many decisions.

But what if you identified: “Parents buy our product in August to send to college kids for back-to-school.”

That’s a season. August 1-31. A full month. The parent is buying for themselves (to send to their kid), so they control timing and logistics. It’s a specific use case you can message to.

“That’s the real value of the insight.”

Win Or Learn, Never Lose

Phil has a unique perspective on “failed” tests that more media buyers should adopt.

“You either win or you learn. That’s the position you should be in.”

“Oftentimes people are like, ‘My ad didn’t scale, I lost.’ No. You won or you learned versus ‘I won and I lost.'”

The insight: Build a repository of what works and what doesn’t work.

When you run isolated experiments:

  • Audience tests tell you which audiences respond (winners) and which don’t (learnings)
  • Angle tests tell you which messages resonate (winners) and which don’t (learnings)
  • Format tests tell you which execution styles work (winners) and which don’t (learnings)

“When you onboard a new media buyer or a new agency or a new team member, how much time is lost in that initial period of them understanding how the business operates?”

Usually 1-2 months. Because they have to:

  • Figure out who the customer is
  • Learn what messaging works
  • Understand what’s been tested before
  • Discover what doesn’t work

“But if you had a playbook, if you had an understanding of ‘okay, we’re in October and this is how our business will function, and then here’s what happens after that period,’ people aren’t asking questions. They go straight into it.”

This is like having your health metrics tracked over time. You know what levers to pull to change what outcomes.

“Most people have no understanding about their business. Which is funny because they’re somehow scaling it really fast and they’re growing. But the more they grow, the less they know.”

How To Apply This To Your Business

Phil’s approach to using Facebook as a market research tool reveals a framework any media buyer or founder can apply:

  1. Start With The Question, Not The Tactic
    “The first step is define the question. Always start with the question and work backwards from it.” Don’t start with “let’s test 20 new creatives.” Start with “what do we need to learn?” Examples: Who are the people buying today? Why did people buy during our “off-season”? What use cases are customers mentioning?
  2. Mine Your Existing Data First
    Before using AI or copying competitors: export reviews from all platforms (Google, Amazon, Facebook, Trustpilot), analyze support tickets for patterns, read post-purchase surveys, review call transcripts, and check ad comments. “Look at somebody’s post-purchase surveys. Call transcripts. Scrape an entire product page. What are your top ad comments saying?”
  3. Build Hypotheses From Data Patterns
    Don’t invent audiences. Discover them from your data. Look for patterns: What use cases do customers mention repeatedly? What problems do they say it solves? Who are they tagging or recommending it to? What situations do they describe?
  4. Build Knowledge That Compounds
    After each test: document what you learned, update your playbook, and share insights across the business (not just media buying). “How do you use Facebook to teach a business about its own business, about its own customers?”
  5. Apply Insights Beyond Facebook
    The real value isn’t just better Facebook ads. When you discover nomads are your best audience: create dedicated landing pages for nomads, send email campaigns with nomad messaging, write blog content for nomad use cases, find Reddit/forums where nomads discuss productivity, and build partnerships with nomad-focused brands. “What happens when you make specific funnels for that specific audience? Your conversion is probably gonna go up.”

Phil’s entire philosophy comes down to one principle:

“Media buying amplifies strong businesses but cannot save weak ones.”

When content production becomes free, when anyone can generate 100 ad variations per day, when AI can make infinite creative, the only sustainable advantage is knowing what to make.

“The fundamental moat in any business now is do you understand your customer? Any business that understands their customer is going to win because most people don’t understand their customer.”

Because understanding compounds:

  • Better audience targeting
  • Better angle messaging
  • Better format selection
  • Better landing pages
  • Better email sequences
  • Better product development
  • Better business decisions

“Facebook can validate this in three days. You just put spend behind it. Do we get sales or do we not get sales? If we get sales, what did I do? I did this one thing. Let me go deeper.”

Versus focus groups, surveys, and traditional research that take months and give you opinions instead of revealed preferences.

If you’re running Facebook ads for e-commerce or lead generation:

Mine your existing customer data. Export reviews from all platforms, analyze support tickets, read ad comments. Look for patterns in who’s buying and why. Document 3 distinct audience hypotheses.

Then run your first isolated experiment. Pick one variable (audience, angle, or format), create 3 variations, run for 7-14 days with equal budgets. Document what you learn regardless of whether it “wins.”

The agencies and founders who win in the next 5 years won’t be the ones who produce content fastest. They’ll be the ones who understand their customers deepest.

For more, watch the full Q&A interview with Phil here.

Need Additional Help?

  1. Access and hire vetted top 1% media buyers for ad management or coaching
  2. Access the creative platform that these top 1% media buyers are using to scale

Watch the Full Interview Behind This Insight

Full Q&A inside Catalyst, our private community led by vetted top 1% media buyers.

Scale With Top 1% Media Buyers

Proven frameworks, weekly coaching, and direct access to vetted top 1% media buyers.

Hire Top Media Buyers

Discover and hire from a verified network of top 1% media buyers with proven results.

Want More Insights?

Get 1 actionable insight weekly from Top 1% Media Buyers. Backed by vetted results. 3-minute reads. Zero fluff.