Before you start
- A/B testing is available on email campaigns in Connect
- You can test up to five variations per campaign
- You need enough recipients for the test to mean something — small lists produce noisy results. As a rough guide, aim for at least 1,000 contacts in your audience for a meaningful test (see the sketch below for why)
- Decide what you're testing before you build the campaign — going in with a clear hypothesis ("does free shipping beat 20% off?") gives you a usable answer at the end
- Test one variable at a time — if you change the subject line, the offer, and the button colour all at once, you'll never know which change moved the needle
💡 Tip — A/B tests that don't change behaviour are just as valuable as ones that do. A "no significant difference" result tells you to focus your effort on something else.
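To make the 1,000-contact guide above concrete, here's a minimal sketch in plain Python (nothing Connect-specific; the 20% open rate and two-way split are assumptions) showing how sampling noise shrinks as the audience grows:

```python
# Rough illustration of why small lists give noisy A/B results:
# the standard error of an observed open rate shrinks only with
# the square root of the group size.
import math

def open_rate_standard_error(true_rate: float, group_size: int) -> float:
    """Standard error of the observed open rate for one test group."""
    return math.sqrt(true_rate * (1 - true_rate) / group_size)

for audience in (200, 1_000, 10_000):
    group = audience // 2                       # two variations, split equally
    se = open_rate_standard_error(0.20, group)  # assume a 20% true open rate
    # Gaps smaller than ~2 standard errors are hard to distinguish from noise.
    print(f"{audience:>6} contacts: noise band of roughly ±{2 * se:.1%} per group")
```

At 200 contacts the noise band is about ±8 points per group, so even a real difference can be swamped; at 1,000 it tightens to roughly ±3.6 points, which is why that's a sensible floor.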
Topics
- What you can test
- Choose your test type
- Plan your test
- Run a subject line test
- Run a content test
- Configure your test settings
- Read your results
- Tips for better A/B tests
- Troubleshooting
- FAQ
What you can test
In Connect, you can run an A/B test on one of two things in an email campaign:
- Subject lines — test up to five different subject lines to see which one drives the highest open rate. Best for understanding what hooks your audience (offer-led versus curiosity-led, short versus long, emoji versus no emoji)
- Content — test up to five different versions of the email body to see which one drives the highest click rate or conversions. Best for understanding what resonates inside the email (offer wording, image versus no image, button colour or position, single CTA versus multiple)
You choose between the two when you create your campaign — you can't test subject lines and content together in the same campaign.
Choose your test type
Connect supports two types of A/B test. Pick the one that matches your goal.
| Test type | What it does | Best for |
|---|---|---|
| Send equally | Splits your audience into equal random groups and sends each group a different variation at the same time | Learning — you want to see how every variation performed against the others, on equal-sized groups |
| Test first, then send the best | Sends each variation to a small subset of your audience first, picks a winner based on a metric you choose, then sends that winner to everyone else | Performance — you care more about getting the best possible result for this send than about pure learning |
ℹ️ Note — Test first, then send the best introduces a delay between when the test starts and when the winning variation goes out to the bulk of your list. Plan your send time accordingly so the final email doesn't land too late in the day.
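If it helps to picture what Send equally does, here's a conceptual sketch. Connect performs the split for you; the contact addresses and group count below are hypothetical:

```python
# Conceptual model of a "Send equally" split: shuffle the audience,
# then deal contacts round-robin into one group per variation.
import random

def split_equally(contacts: list[str], variations: int, seed: int = 42) -> list[list[str]]:
    """Shuffle contacts and deal them into equal-sized random groups."""
    rng = random.Random(seed)  # fixed seed so the sketch is repeatable
    shuffled = contacts[:]
    rng.shuffle(shuffled)
    return [shuffled[i::variations] for i in range(variations)]

audience = [f"contact_{n}@example.com" for n in range(9)]
for number, group in enumerate(split_equally(audience, variations=3), start=1):
    print(f"Variation {number}: {group}")
```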
Plan your test
Spend a few minutes here — this is where most A/B tests succeed or fail.
- Pick one variable to test. Subject line wording, sender name, offer phrasing, hero image, button text, button colour, CTA position. Just one
- Write a hypothesis. "I think a shorter subject line will drive a higher open rate than a longer one." Then test it
- Make the variations meaningfully different. Testing "Save 20%" against "Save 20% today" is unlikely to produce a clear result. Testing "Save 20%" against "Free shipping" probably will
- Decide your winning metric upfront. Open rate for subject line tests, click rate (or conversions) for content tests
- Plan your sample size. For "Test first, then send the best", we recommend sending the test to 20% to 30% of your audience to give the result enough weight without burning too much of the list
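As a gut check on that 20% to 30% recommendation, the sketch below (illustrative arithmetic only, using a hypothetical 10,000-contact audience) shows how the per-variation test group shrinks as you add variations, which is why more variations demand a bigger list:

```python
# Back-of-envelope sizing for "Test first, then send the best".
def test_group_sizes(audience: int, test_fraction: float, variations: int) -> tuple[int, int]:
    """Return (contacts per variation in the test, contacts held back for the winner)."""
    test_total = int(audience * test_fraction)
    per_variation = test_total // variations
    held_back = audience - per_variation * variations
    return per_variation, held_back

for variations in (2, 3, 5):
    per, rest = test_group_sizes(audience=10_000, test_fraction=0.25, variations=variations)
    print(f"{variations} variations: {per} contacts each in the test, {rest} receive the winner")
```

With five variations, each test group drops to 500 contacts, and the noise problem described under Before you start starts to bite.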
Run a subject line test
- In Connect, go to Campaigns and click + Create new campaign
- Select Email as the channel and pick or create a campaign folder
- On the email setup screen, select Test subject lines
- Enter your first two subject lines
- To add more, click Test another subject line — you can have up to five
- Once all subject lines are entered, click Configure A/B test to set up how the test runs (see Configure your test settings)
- Continue through the rest of the campaign flow — recipients, schedule, confirm — and click Send or Schedule
Run a content test
- In Connect, go to Campaigns and click + Create new campaign
- Select Email as the channel and pick or create a campaign folder
- On the email setup screen, select Test content
- The platform will prompt for two content groups by default — click New content group to add up to five
- For each content group, hover over it and click Select to build the email content for that variation. You can pick an existing email, use a template, or build from scratch
- Once all content groups are set up, click Configure A/B test at the top of the page (see Configure your test settings)
- Continue through recipients, schedule, and confirm, then click Send or Schedule
Configure your test settings
After entering your variations, click Configure A/B test to set how the test should run.
For Send equally tests
- Confirm the variations you want to include
- Click Save — your audience will be split into equal random groups and each group will receive one variation at the scheduled send time
For Test first, then send the best tests
- Click Test first, then send the best
- Set the percentage of recipients to include in the test (we recommend 20% to 30%)
- Set the wait time — how long the platform should wait before declaring a winner and sending to the rest of your list. Longer windows give a more reliable result, but delay your final send
- Choose the winning metric — best open rate (for subject line tests) or best click-through rate (for content tests)
- Click Save
⚠️ Warning — Don't send a "Test first, then send the best" campaign too late in the day. A test that starts at 3pm with a 7-hour wait window means most recipients won't see the email until around 10pm. Schedule the campaign earlier so the final send still lands at a useful time.
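To tie these settings together, here's a conceptual sketch of the whole flow. Connect runs all of this for you; the subject lines, open counts, start time, and wait window below are hypothetical:

```python
# Conceptual model of "Test first, then send the best": send each variation
# to a small test group, wait, pick the best performer on the chosen metric,
# then send the winner to everyone else.
from datetime import datetime, timedelta

test_group_size = 1_000
test_opens = {                       # hypothetical opens per test group
    "Save 20% today": 212,
    "Free shipping all week": 247,
}

winner = max(test_opens, key=lambda subject: test_opens[subject] / test_group_size)
print(f"Winner by open rate: {winner!r}")

# The warning above as arithmetic: the bulk send only goes out after the wait window.
test_start = datetime(2025, 6, 3, 15, 0)   # 3pm start (hypothetical)
wait_window = timedelta(hours=7)
print(f"Bulk send lands at {test_start + wait_window:%I%p}")  # 10PM, likely too late
```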
Read your results
Once your A/B test campaign has been delivered, you can review the results to see which variation performed best.
- In Connect, go to Campaigns
- Click the Sent tab and find your A/B test campaign
- Open the campaign report — you'll see a breakdown of each variation alongside an aggregate for the campaign as a whole
What to look at:
- Open rate for subject line tests — which subject line drove the most opens?
- Click-through rate for content tests — which version of the email drove the most clicks?
- Unsubscribe rate for both — a variation that wins on opens but spikes unsubscribes isn't actually a winner
- Conversion rate if you're tracking it — opens and clicks are leading indicators, but real revenue is the final word
💡 Tip — If two variations come back within a percentage point of each other, treat the result as inconclusive. The "winner" is probably noise. Pick a bigger variable to test next time.
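If you want a firmer check than the one-point rule of thumb, a standard two-proportion z-test on opens (or clicks) works. This is ordinary statistics, not a Connect feature; the counts below are hypothetical:

```python
# Two-proportion z-test: is the gap between two open rates bigger than
# you'd expect from random chance alone?
import math

def z_score(opens_a: int, sent_a: int, opens_b: int, sent_b: int) -> float:
    """Z-score for the difference between two observed open rates."""
    rate_a, rate_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (rate_a - rate_b) / std_err

# 21.0% vs 20.2% opens on 2,000 sends each: a 0.8-point gap.
z = z_score(opens_a=420, sent_a=2_000, opens_b=404, sent_b=2_000)
print(f"z = {z:.2f}")  # |z| below ~1.96 means the gap could easily be noise
```

Here z comes out around 0.63, well under 1.96, so a 0.8-point gap on this audience is exactly the kind of result to treat as inconclusive.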
Tips for better A/B tests
- Test things that could change behaviour, not things that probably won't. A bigger discount, a different hero image, a different CTA — meaningful changes give you meaningful answers
- Keep one variable per test. It's tempting to change four things at once. Resist
- Run tests on a regular cadence. One test every few campaigns builds a library of insights about your audience over time
- Apply what you learn. A test only matters if you actually use the result on the next campaign
- Be honest about noise. Small lists, small differences, and short windows produce results you can't trust. When in doubt, run it again
Troubleshooting
My A/B test results look almost identical across variations
- Cause: The variations were too similar to produce a clear winner, or the audience was too small for the difference to register.
- Fix: Test bigger differences next time (different offer wording rather than different word order). Run the test on a larger audience, or pick a bigger sample percentage.
The "Test first, then send the best" campaign sent too late at night
- Cause: The wait window pushed the final send beyond a useful time of day.
- Fix: Either start the test earlier in the day, or shorten the wait window. As a guide, schedule so the final send lands during normal waking hours for your audience.
I can't see the option to A/B test on my campaign
- Cause: A/B testing is set during email campaign creation. If you've already moved past the setup step, you'll need to start a new campaign.
- Fix: Go to Campaigns, click + Create new campaign, select Email, and choose Test subject lines or Test content at the setup step.
My winning variation didn't perform as well in the bulk send as it did in the test
- Cause: This is normal. Test groups can produce slightly inflated results due to sampling variability. The directional read is still useful, just don't expect the bulk send to match the test exactly.
- Fix: No fix needed. Look at the directional learning ("offer A beat offer B") rather than expecting the absolute number to repeat.
My A/B test report only shows aggregate numbers, not a per-variation breakdown
- Cause: You may be looking at the campaign-level summary rather than the detailed report.
- Fix: Open the campaign and look for the per-variation breakdown in the campaign report. Each variation should have its own line.
FAQ
How many variations can I test at once? Up to five. Two is typical, three is fine, four or more usually need a much larger audience to produce a clear winner.
Can I A/B test SMS campaigns? A/B testing in Connect is available for email campaigns. For SMS, you can run smaller "manual" tests by creating two campaigns and splitting your audience using segments, but there isn't a built-in A/B test option for SMS today.
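If you do run that manual SMS split, a deterministic hash keeps the assignment stable and roughly 50/50. Here's a sketch of the idea (illustrative only, not a Connect API; the phone numbers are made up):

```python
# Stable 50/50 assignment: hash each contact identifier and use the
# parity of the first byte. The same contact always lands in the same group.
import hashlib

def assign_group(contact_id: str) -> str:
    digest = hashlib.sha256(contact_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

for phone in ("+61400000001", "+61400000002", "+61400000003"):
    print(phone, "-> group", assign_group(phone))
```

Load the A contacts into one segment and the B contacts into another, then send one campaign per segment.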
How big does my audience need to be? There's no hard minimum, but small lists produce unreliable results. As a rough guide, aim for at least 1,000 contacts. Below that, treat the result as a directional hint rather than a verdict.
Should I use Send equally or Test first, then send the best? Use Send equally when you want to learn — you'll see how every variation performed on equal groups. Use Test first, then send the best when you care most about getting the strongest possible result for this send.
What metric should I use to pick a winner? Open rate for subject line tests. Click-through rate for content tests. If you're tracking conversions or revenue, that's the real signal — opens and clicks are leading indicators of it.
Can I run an A/B test inside an automation? A/B testing is set up at the campaign level. Automated SMS or email sends inside Flows don't currently support the same A/B test wizard.
Related articles
- How to send an email campaign
- Campaigns Overview
- Campaign Reports and Analytics in Connect
- How to create a segment
You can now run A/B tests on subject lines or content, choose between Send equally and Test first, then send the best, and read the results to make better decisions on every campaign that follows.
Any questions? Reach out to us via email at support@meandu.com