
How to A/B test your campaign

Written by Jaclyn Curtis
Updated over 2 months ago


A/B testing (also known as split testing) helps you identify what works best in your outreach — whether it’s your subject line, opening message, tone, or follow-up sequence. By experimenting with small variations, you can make data-driven decisions that consistently improve reply and conversion rates.


Why A/B Testing Matters

Instead of guessing which version of your message will perform better, A/B testing gives you clear, measurable insights. It helps you:

  • Understand which tone or style resonates most with your audience.

  • Improve reply rates and appointment conversions over time.

  • Validate hypotheses about messaging, targeting, or offer positioning.

  • Eliminate guesswork and base future campaigns on proven data.


How to A/B Test Inside Alsona

There are two main ways to run A/B tests in Alsona:

Option 1: Create Message Variants in a Single Workflow

You can easily add multiple versions of a message inside your campaign workflow.

  1. Go to your campaign’s Workflow tab.

  2. Click Add Variant under any message step.

  3. Write alternative versions of the message — for example:

    • Variant A: Short, direct opener

    • Variant B: Personal, story-based approach

  4. Alsona will automatically distribute messages evenly to your audience and track results.

You can view performance for each variant by checking the campaign analytics. Metrics like response rate, positive replies, and appointments booked will help you identify the winning message.
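If you're curious what an even split looks like in practice, here is a minimal sketch of random, round-robin variant assignment with per-variant tallies. The variant labels, lead list, and logic are illustrative assumptions, not Alsona's actual implementation:

```python
import random
from collections import Counter

# Hypothetical example: evenly assign leads to message variants,
# the way a campaign tool might split an audience for an A/B test.
variants = ["A: short, direct opener", "B: personal, story-based approach"]

def assign_variants(leads, variants):
    """Shuffle the leads, then deal them out round-robin so each
    variant receives an (almost) equal share of the audience."""
    shuffled = random.sample(leads, k=len(leads))
    return {lead: variants[i % len(variants)] for i, lead in enumerate(shuffled)}

leads = [f"lead_{n}" for n in range(100)]
assignment = assign_variants(leads, variants)

# Tally how many leads each variant received; expect a 50/50 split.
print(Counter(assignment.values()))
```

The shuffle matters: assigning variants in whatever order leads were imported could accidentally correlate a variant with a lead source, which would bias the test.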


Option 2: Compare Separate Campaigns

If you want to test bigger differences — such as targeting or channel type — you can create separate campaigns and compare their performance.

For example:

  • Campaign A: LinkedIn Search Source + AI Assistant messages

  • Campaign B: CSV import + Manual follow-up replies

After both campaigns have been running long enough, compare key metrics under Campaign Analytics, such as:

  • Connection/acceptance rate

  • Reply rate

  • Open rate

  • Meeting conversion rate

This gives you a clear side-by-side picture of what’s working best across strategies.
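As a rough illustration of that side-by-side comparison, the sketch below derives those rates from raw counts. The campaign names and numbers are invented for the example; in Alsona you would read these figures from Campaign Analytics rather than compute them yourself:

```python
# Hypothetical counts for two campaigns being compared head-to-head.
campaigns = {
    "Campaign A (LinkedIn Search + AI Assistant)": {
        "invites_sent": 400, "accepted": 120, "replies": 42, "meetings": 9,
    },
    "Campaign B (CSV import + manual follow-ups)": {
        "invites_sent": 400, "accepted": 104, "replies": 30, "meetings": 5,
    },
}

for name, c in campaigns.items():
    acceptance = c["accepted"] / c["invites_sent"]
    reply_rate = c["replies"] / c["accepted"]    # replies per accepted connection
    meeting_rate = c["meetings"] / c["replies"]  # meetings per reply
    print(f"{name}: acceptance {acceptance:.0%}, "
          f"reply rate {reply_rate:.0%}, meeting conversion {meeting_rate:.0%}")
```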


A/B Testing Best Practices

Keep your tests simple and consistent:

  • Test one variable at a time. For example, change only your first line or CTA — not both.

  • Set a control group. Keep one version unchanged so you have a reliable benchmark.

  • Wait for a meaningful sample size. Don’t judge results too early; gather enough sends and responses for accuracy (see the sketch after this list).

  • Avoid testing too many things at once. The more you test simultaneously, the harder it becomes to know what caused the result.

  • Document your learnings. Record which variants win and apply insights to future campaigns.
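
One way to put a number on "meaningful sample size" is the textbook two-proportion power formula. This sketch assumes common defaults of 95% confidence and 80% power; the baseline and target reply rates are placeholders to swap for your own:

```python
import math

def sends_per_variant(p_baseline, p_target, z_alpha=1.96, z_power=0.84):
    """Rough sends needed per variant to reliably detect a lift from
    p_baseline to p_target (defaults: 95% confidence, 80% power)."""
    variance = p_baseline * (1 - p_baseline) + p_target * (1 - p_target)
    effect = (p_target - p_baseline) ** 2
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect)

# e.g. to detect a reply rate moving from 10% to 15%:
print(sends_per_variant(0.10, 0.15))  # about 683 sends per variant
```

The takeaway: smaller expected lifts require far more sends, so size your test before declaring a winner.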


📊 Example of an A/B Test

Variant | Message Style               | Response Rate | Outcome
A       | Direct and concise          | 14%           | Baseline
B       | Personal and question-based | 23%           | Winner – Higher engagement
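
To judge whether a gap like the one above is a real difference rather than noise, a standard two-proportion z-test is a reasonable check. The send counts below (200 per variant) are assumed, since the table only reports rates:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two response rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Assumed 200 sends per variant: 28/200 = 14% vs 46/200 = 23%.
z, p = two_proportion_z(28, 200, 46, 200)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 here, so the lift looks real
```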


💡 Pro Tip

Combine A/B testing with Alsona’s AI Appointment Setting to scale results. Once you identify the winning message structure, your AI agent can automatically use the best-performing tone and language to handle future replies.
