Marketers struggle to see the ROI they want from LinkedIn. They follow generic best practices, trust platform defaults, and wonder why results suck.
Success doesn’t come from copying others. It comes from testing what works for your specific business.
Rob Muldoon, who taught this LinkedIn experimentation framework at CXL, puts it bluntly: “The advertisers who do things slightly different or have used experimentation to get the edge on LinkedIn are the ones that are the most successful.”
This guide will show you how to build a systematic testing approach that delivers actual business results, not just vanity metrics.
Why most LinkedIn advertisers fail
LinkedIn’s platform offers plenty of targeting options and campaign types. That flexibility is a trap for the unprepared.
Let’s be clear: what works for one company will fail miserably for another. Every LinkedIn account is different. The audience behaves differently than on other platforms. The professional context changes everything.
B2B audiences on LinkedIn represent wildly different:
- Company sizes and structures
- Internal metrics and goals
- Professional pressures
- Career aspirations
- Buying motivations
- Stakeholder relationships
In smaller companies, your target might talk directly to the CEO. In large enterprises, they’re navigating layers of bureaucracy. These differences matter.
To succeed, you need to:
- Be skeptical of conventional wisdom
- Challenge platform defaults
- Run real experiments
- Measure what matters
- Kill what doesn’t work
- Scale what does
If that sounds like work, it is. But it’s the only way to win.
Building your experimentation framework
Don’t just run random tests. You need a structured approach that builds knowledge over time.
Muldoon emphasizes that a proper framework “allows you to save time, keep your experiments meaningful, keep them consistent, give them the best chance of being conclusive, and encourages you to keep a record.”
Here’s how to build yours:
1. Start with high-impact goals
Don’t waste time on trivial improvements. Choose goals that would significantly move your business forward if achieved.
Examples:
- Increase webinar registrations by 50% (if webinar attendees drive sales)
- Cut cost per qualified lead by 30% (if acquisition cost limits growth)
- Boost conversion rate from sponsored content by 25% (if conversions drive revenue)
- Shorten sales cycles by 20% (if long cycles kill momentum)
Your goals need to be SMART: Specific, Measurable, Attainable, Relevant, Time-based. “Improve performance” isn’t a goal – it’s a wish.
Ask yourself: “If this test succeeds, will anyone actually care?”
2. Create testable hypotheses
A good hypothesis states exactly what you’re changing and what you expect to happen. It focuses on one variable to ensure clear causation.
Examples:
- “Website Conversions objective will generate more webinar registrations than Brand Awareness objective for our target audience.”
- “Targeting by skills will yield higher-quality leads than targeting by job titles.”
- “Video ads with customer testimonials will outperform static images showing product benefits.”
Document your reasoning. When tests fail (and many will), understanding why you thought they’d work helps build knowledge.
3. Define your success metrics
Pick metrics that directly measure success and provide early signals:
Primary KPI: The direct measurement of your goal
Secondary KPIs: Leading indicators that show if you’re on track
For webinar campaigns:
- Primary: Cost per registration
- Secondary: Click rate, landing page conversion, form completion
For lead generation:
- Primary: Cost per qualified lead
- Secondary: Click rate, form completion, lead quality score
Secondary metrics help diagnose why tests succeed or fail. They’re your diagnostic tools.
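To make this concrete, here is a minimal sketch in Python of how those KPIs fall out of raw campaign figures for a webinar test. Every number is a hypothetical placeholder, not a benchmark:

```python
# Minimal sketch: primary and secondary KPIs for a webinar campaign.
# Every figure below is a hypothetical placeholder, not a benchmark.
spend = 2_500.00         # total spend for the test period
impressions = 80_000
clicks = 960
landing_page_views = 900
registrations = 120      # webinar registrations (the primary goal)

cost_per_registration = spend / registrations                  # primary KPI
click_rate = clicks / impressions                              # secondary KPI
landing_page_conversion = registrations / landing_page_views   # secondary KPI

print(f"Cost per registration: {cost_per_registration:.2f}")
print(f"Click rate: {click_rate:.2%}")
print(f"Landing page conversion: {landing_page_conversion:.2%}")
```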
4. Set precise test parameters
Establish clear boundaries for reliable results:
Duration: Plan on at least two weeks per test so each variant collects enough data to be conclusive (a rough duration estimator follows at the end of this step). Consider:
- Audience size (smaller audiences need more time)
- Budget (higher budgets gather data faster)
- Expected conversion rates (lower rates need more time)
Test type:
- A/B tests: Two variations, one changed element
- Multivariate: Multiple variables changed and measured simultaneously
- Control/exposed: An exposed test group measured against a holdout baseline
Variables and constants:
- Document what’s changing and what isn’t
- Change only one element at a time in A/B tests
- Control for external factors like seasonality or news events
Budget allocation:
- Split budget evenly between test groups
- On LinkedIn specifically, separate campaigns give cleaner results than within-campaign tests
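Before launching, it is worth estimating how long each variant will take to reach a workable number of conversions at your budget. A rough sketch, using hypothetical values for budget, cost per click, and conversion rate that you would replace with your own account history:

```python
import math

# Rough pre-launch duration estimate per variant.
# All inputs are assumptions; replace them with your account's history.
daily_budget_per_variant = 150.00   # spend per day, per variant
expected_cpc = 8.00                 # expected cost per click
expected_cvr = 0.05                 # expected click-to-conversion rate
min_conversions = 30                # minimum conversions per variant for analysis

clicks_per_day = daily_budget_per_variant / expected_cpc
conversions_per_day = clicks_per_day * expected_cvr
days_needed = math.ceil(min_conversions / conversions_per_day)

# Never plan below two weeks, so weekly patterns show up at least twice.
print(f"Estimated test length: {max(days_needed, 14)} days")
```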
5. Document everything
Knowledge fades without documentation. Create a system to track:
- Goals and hypotheses
- Test parameters
- Audience definitions
- Creative variations
- Results and metrics
- Analysis and insights
- Next steps
Use a structured database (Airtable works well) to build institutional knowledge over time.
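Whatever tool you use, the record structure matters more than the software. Here is a minimal sketch of what one experiment record could look like; the field names are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class Experiment:
    """One row in the experiment log; field names are illustrative."""
    name: str
    goal: str                   # the high-impact goal this test serves
    hypothesis: str             # what changes and what you expect to happen
    audience: str               # audience definition used in the test
    variable: str               # the single element being changed
    constants: List[str] = field(default_factory=list)
    primary_kpi: str = ""
    secondary_kpis: List[str] = field(default_factory=list)
    start: Optional[date] = None
    end: Optional[date] = None
    result: str = ""            # what happened, with the numbers
    insight: str = ""           # why you think it happened
    next_step: str = ""         # what you'll test or roll out next

log = [
    Experiment(
        name="Webinar objective test",
        goal="Increase webinar registrations by 50%",
        hypothesis="Website Conversions will beat Brand Awareness on cost per registration",
        audience="Marketing managers at 1,000-5,000 employee companies",
        variable="Campaign objective",
        primary_kpi="Cost per registration",
        secondary_kpis=["Click rate", "Landing page conversion"],
    )
]
print(f"{len(log)} experiment(s) logged: {log[0].name}")
```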
Four key testing areas on LinkedIn
LinkedIn offers four main categories for experimentation, in order of importance:
1. Audience targeting: get this right or nothing else matters
Muldoon is adamant: “Audience is the most important factor to think about for your LinkedIn campaigns.”
The targeting options include:
Company-level targeting:
- Company size ranges
- Industry classification
- Revenue
- Growth rate
- Company connections
Employee-level targeting:
- Job title
- Job function
- Seniority
- Skills
- Groups
- Interests
- Years of experience
- Education
The most common targeting mistake:
Putting multiple audience segments in one campaign with “OR” targeting.
This is lazy and ineffective. LinkedIn’s algorithm will favor segments that initially perform better, creating a self-fulfilling prophecy. Early performers get more budget, while other segments never get a fair shot.
Instead, create separate campaigns for each segment:
- Campaign 1: Marketing Managers at 500-1,000 employee companies
- Campaign 2: Marketing Managers at 1,000-5,000 employee companies
- Campaign 3: Marketing Managers with B2B marketing skills
- Campaign 4: Marketing Managers in B2B groups
This gives each segment equal budget and shows you definitively which performs best.
Audience size reality check:
There’s endless debate about ideal audience size on LinkedIn. “It needs to be 50,000!” “No, 100,000!” “Actually, 500,000!”
It’s all bullshit. Muldoon puts it plainly: “Your audience is your audience size.”
The only technical requirement is that it exceeds 300 members. In practice:
- Under 5,000 may struggle to spend budget
- 10,000-20,000 works well for testing
- Larger audiences give the algorithm more options
Don’t artificially inflate your audience just to hit some arbitrary number. Precision targeting of 8,000 perfect prospects beats 500,000 loosely matched ones every time.
Testing approach:
Start with company targeting, then layer on employee attributes:
- Test company size ranges against each other
- Test different industries
- Test job functions within target companies
- Test skills vs. groups vs. interests
2. Ad formats and creative: stand out or get ignored
Once you’ve found your audience, you need to capture their attention. LinkedIn’s feed is crowded. Mediocre creative gets scrolled past.
Format testing options:
- Single image ads
- Carousel ads
- Video ads
- Document ads
- Conversation ads
- Text ads
- Dynamic ads
Different messages work better in different formats. Test them.
Five creative strategies that actually work:
- Use stats and numbers relevant to your audience
LinkedIn members love statistics that apply to their professional challenges.
Examples:
- “34% higher win rates with [Product]” (Gong)
- “Top 30 tools for CFOs” (Spend Desk)
Quantify your value proposition. Generic claims get ignored; specific numbers get attention.
- Address your audience directly
If you’ve done the work to target precisely, make it obvious in your creative.
Examples:
- “3 ways sales leaders can improve win rates” (Outreach)
- “Financial services communication playbook” (Slack)
Don’t just mention roles or industries – connect to specific challenges they face.
- Show real people, not stock photos
B2B marketing is evolving from boring product specs to human-centered approaches. People buy from people.
Successful tactics:
- Feature actual team members explaining products
- Showcase thought leaders from your company
- Use customer testimonials in their own words
- Show behind-the-scenes content
Muldoon notes: “I am starting to bring people, the employees, to the forefront of the brand. And it always sees an increase in performance.”
- Offer value worth paying for
LinkedIn has a strong value exchange culture. Members won’t share contact info unless they get something valuable in return.
Your lead magnet should be good enough that you “probably should be charging for it.” Examples:
- Strike’s ebook (presented like a physical book so it feels substantial)
- Drift’s “7 Secrets from CMOs” (exclusive insights)
- Templates or frameworks that solve immediate problems
If your offer isn’t compelling enough to trade an email for, it’s not good enough.
- Visualize the before/after
B2B solutions can be abstract. Before/after visuals make benefits immediately clear.
Examples:
- Walnut shows website visitor intent graphs before/after implementation
- Cognism uses simple speech bubbles showing problem/solution
Complex value propositions need simple visualizations. Show the transformation.
3. Objectives and bidding: don’t trust the defaults
LinkedIn’s campaign objectives aren’t just administrative settings – they fundamentally change how the platform serves your ads.
Campaign objectives:
- Brand awareness
- Website visits
- Engagement
- Video views
- Lead generation
- Website conversions
- Job applicants
Test different objectives even when your end goal stays the same. If you want webinar registrations, try:
- Website Conversions (optimizing for registration completions)
- Lead Generation (optimizing for in-platform form fills)
- Website Visits (optimizing for landing page traffic)
Each objective reaches different audience segments with varying conversion potential.
Bidding strategies:
LinkedIn offers multiple bidding options that significantly impact performance:
- Maximum delivery: LinkedIn controls bidding automatically. Convenient but often wasteful.
- Target cost: You set a target cost per result. Better control while using automation.
- Manual bidding: You set exact maximum bids. Most control but requires active management.
One B2B company cut cost per lead by 25% by switching from Maximum Delivery to Manual bidding after finding their sweet spot.
Budget pacing:
Test how your budget distributes:
- Daily vs. lifetime budgets
- Even vs. accelerated delivery
- Budget allocation between campaigns
For events or launches, accelerated delivery often works better. For ongoing lead gen, even pacing typically wins.
4. Retargeting: your highest-converting audiences
As campaigns run, you’ll build retargeting options. These typically convert better than cold audiences, though with limited scale.
Website visitor retargeting:
- Test different lookback windows (30, 60, 90 days)
- Segment by specific pages visited
- Create sequences based on site behavior
Video viewer retargeting:
- Test completion percentage thresholds (25%, 50%, 75%, 95%)
- Compare different video types
- Build progressive sequences
Lead form engagement:
- Retarget form openers who didn’t submit
- Create nurture campaigns for partial completers
- Test messaging for abandoned forms
Company page engagement:
- Target people who engaged with your updates
- Create campaigns for consistent engagers
- Test messaging for different engagement levels
Exclusion targeting:
Prevent audience overlap and message fatigue:
- Exclude current customers from acquisition campaigns
- Exclude recent converters from lead generation
- Create “cooling off” periods for non-engagers
Evaluating what actually works
Don’t just look at surface metrics. Dig deeper to understand the full impact of your tests.
1. Statistical significance
Make sure your data is reliable:
- Sample size requirements:
  - At least 1,000 impressions per variant
  - At least 100 clicks per variant
  - At least 30 conversions per variant
- Confidence level: Aim for 95% confidence (no more than a 5% chance of seeing a difference this large if the variants actually perform the same)
- Duration: Run tests for at least 7-14 days, even if early results look conclusive, to account for weekly patterns
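As a rough sanity check on significance, you can run a two-proportion z-test on conversion rates with nothing but the Python standard library. A minimal sketch with hypothetical variant results:

```python
from statistics import NormalDist
import math

# Hypothetical results from a two-variant test.
clicks_a, conversions_a = 1_200, 48    # variant A
clicks_b, conversions_b = 1_150, 69    # variant B

p_a = conversions_a / clicks_a
p_b = conversions_b / clicks_b
p_pool = (conversions_a + conversions_b) / (clicks_a + clicks_b)

# Standard error of the difference under the null hypothesis (no real difference).
se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided

print(f"Variant A: {p_a:.2%}, Variant B: {p_b:.2%}, p-value: {p_value:.3f}")
print("Significant at 95% confidence" if p_value < 0.05 else "Not significant yet; keep testing")
```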
2. Performance analysis
Look beyond basic metrics:
- Primary KPI: What was the percentage change between variants?
- Secondary metrics: What do they tell you about user behavior?
- Segment performance: Did certain audience segments respond differently?
- Trends over time: Did performance improve or degrade during the test?
3. Business impact assessment
Translate results into business outcomes:
- ROI calculation: What’s the financial impact of implementing the winning variant at scale?
- Volume potential: How does this improvement affect lead gen capacity?
- Cost implications: How does this change your unit economics?
- Competitive advantage: Does this finding give you an edge?
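For the ROI calculation, even a back-of-the-envelope projection beats reporting a raw percentage lift. A minimal sketch with hypothetical figures, projecting what a lower cost per lead is worth at planned spend:

```python
# Hypothetical figures: replace with your own account's data.
monthly_spend = 20_000.00
baseline_cpl = 250.00      # cost per qualified lead before the test
winning_cpl = 187.50       # cost per qualified lead of the winning variant
lead_to_deal_rate = 0.10   # share of qualified leads that become customers
avg_deal_value = 15_000.00

leads_before = monthly_spend / baseline_cpl
leads_after = monthly_spend / winning_cpl
extra_leads = leads_after - leads_before
projected_extra_revenue = extra_leads * lead_to_deal_rate * avg_deal_value

print(f"Leads per month: {leads_before:.0f} -> {leads_after:.0f} (+{extra_leads:.0f})")
print(f"Projected extra monthly revenue: {projected_extra_revenue:,.0f}")
```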
4. Scalability evaluation
Determine if winning approaches can grow:
- Audience limitations: Is the total addressable audience large enough?
- Budget sensitivity: At what point do returns diminish?
- Frequency issues: How quickly will the audience tire of seeing ads?
- Creative fatigue: How often will you need fresh creative?
Recognizing diminishing returns
Every campaign eventually hits limitations. Know the warning signs:
Audience saturation signals
LinkedIn’s professional audience is finite, especially in niche B2B segments. Watch for:
- Rising frequency metrics (same users seeing ads repeatedly)
- Declining click-through rates despite fresh creative
- Increasing costs per action
- Difficulty spending budget
Solutions:
- Expand targeting carefully without sacrificing quality
- Refresh creative more frequently
- Implement frequency caps
- Develop sequential messaging
- Introduce pause periods for fatigued segments
Budget efficiency thresholds
Every campaign has a point where more money yields proportionally smaller results:
- Track marginal cost per acquisition as spend increases
- Monitor performance when budget changes exceed 20%
- Analyze time-of-day performance
- Split budget across more specific campaigns instead of increasing single campaign budgets
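Tracking marginal cost per acquisition, not just the average, is what makes diminishing returns visible early. A minimal sketch with hypothetical weekly figures:

```python
# Hypothetical weekly figures as budget scales up: (spend, conversions).
weeks = [(2_000, 20), (3_000, 28), (4_500, 36), (6_000, 41)]

prev_spend, prev_conv = 0, 0
for spend, conv in weeks:
    avg_cpa = spend / conv
    marginal_cpa = (spend - prev_spend) / max(conv - prev_conv, 1)
    print(f"Spend {spend:>6}: average CPA {avg_cpa:6.0f}, marginal CPA {marginal_cpa:6.0f}")
    prev_spend, prev_conv = spend, conv
```

When the marginal CPA climbs well above the average, additional budget in that campaign is buying progressively more expensive results.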
Creative fatigue timeline
LinkedIn audiences experience creative fatigue faster than audiences on other platforms:
- Performance typically degrades after 3-6 weeks
- Higher-frequency campaigns burn out faster
- Specific targeting accelerates fatigue
Countermeasures:
- Prepare creative refreshes every 3 weeks
- Test new variations before existing ones fully degrade
- Implement creative rotation systems
- Use performance triggers to prompt changes
Case study: finding the perfect audience
A B2B software company selling marketing automation tools was targeting marketing directors at companies with 1,000+ employees across all industries, an audience of over 500,000 people.
Their systematic testing approach:
Phase 1: Industry testing
They created separate campaigns for marketing directors in:
- Technology companies
- Financial services
- Healthcare
- Manufacturing
- Professional services
Results: Technology companies delivered leads at 40% lower cost than average, while manufacturing cost 60% more per lead.
Phase 2: Company size testing
Within technology, they tested:
- 1,000-5,000 employees
- 5,000+ employees
Results: The 1,000-5,000 segment converted 2.5x better than larger companies.
Phase 3: Targeting method testing
For technology companies (1,000-5,000 employees), they tested:
- Job title targeting
- Job function + seniority
- Skill-based targeting
- Group-based targeting
Results: Skill-based targeting cut cost per lead by 35% compared to job titles.
By combining these findings, they identified their optimal audience: marketing directors in technology companies with 1,000-5,000 employees who had specific marketing automation skills.
This audience performed 3x better than their original broad targeting, transforming LinkedIn from an expensive, inconsistent channel to their highest-performing digital acquisition source.
Case study: creative that actually converts
A digital transformation consultancy struggled with engagement despite reaching IT leaders.
Their testing program:
Phase 1: Format testing
They tested identical messaging across:
- Single image posts
- Carousel ads
- Document ads
- Video ads
Results: Video generated 2.3x higher engagement than static images.
Phase 2: Content approach testing
Within video, they tested:
- Product features
- Thought leadership
- Customer stories
- How-to content
Results: Customer stories drove 3x more engagement and 2x higher conversions.
Phase 3: Message testing
They tested different customer story approaches:
- Problem-focused
- Results-focused
- Process-focused
Results: Results-focused messaging that highlighted specific outcomes (e.g., “How Company X reduced IT costs by 37% in 6 months”) performed strongest.
By implementing these findings, they boosted engagement by 215% and cut cost per qualified lead by 42%.
Stop following the crowd. Start testing.
Successful LinkedIn advertising doesn’t come from copying others. It comes from discovering what works for your specific situation through systematic testing.
The platform’s complexity demands a methodical approach. Generic best practices fail because they weren’t developed for your unique audience, industry, and objectives.
Start with audience definition – it’s the foundation everything else builds upon. Be rigorous in your testing methodology. Let data guide your decisions.
Muldoon summarizes it perfectly: “Every LinkedIn account is different. You need to forge your own path and maybe go against the grain of the path that the LinkedIn campaign manager tool lays out for you.”
The best B2B marketers on LinkedIn aren’t the ones with the biggest budgets. They’re the ones who test, learn, and optimize with discipline.
If this approach seems like work, that’s because it is. But that’s also why most of your competitors won’t do it – and why you’ll have the advantage when you do.