A/B Testing

A/B testing (and multivariate testing) lets you compare different versions of your email steps to find what performs best. Test subject lines, body content, CTAs, and send times to continuously improve your sequence performance.

How It Works

  1. You create 2-4 variants of an email step
  2. CronDB randomly distributes contacts across variants
  3. Each variant tracks opens, clicks, and replies independently
  4. After reaching statistical significance, you can pick a winner and apply it to all future contacts
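Step 2 can be sketched as stable hash-based bucketing, a common way to implement random-but-repeatable assignment. The `VARIANTS` list and `assign_variant` helper below are illustrative, not CronDB's actual implementation:

```python
import hashlib

VARIANTS = ["A", "B"]  # hypothetical: two equal-weight variants

def assign_variant(contact_id: str) -> str:
    """Deterministically bucket a contact into a variant.

    Hashing the contact ID keeps the assignment stable if the
    same contact is ever evaluated twice.
    """
    digest = hashlib.sha256(contact_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("contact_123"))  # always the same variant for this id
```

Because the bucket is derived from the contact ID rather than a fresh random draw, re-running the assignment never flips a contact between variants mid-test.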

Setting Up an A/B Test

  1. Open a sequence and select an Email step
  2. Click A/B Test in the step toolbar
  3. Click + Add Variant to create additional versions (up to 4 total)
  4. Edit each variant's subject line and/or body
  5. Set the distribution (default is equal split)
  6. Click Save

Example: Testing Subject Lines

Variant A (50%):

Subject: Quick question about {{company}}

Variant B (50%):

Subject: {{domain}} — growth opportunity?

Both variants share the same email body. This isolates the subject line as the variable being tested.
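At send time, merge tags like `{{company}}` are filled in per contact. As a rough sketch of that substitution (the `render` helper is hypothetical; CronDB's real templating engine may behave differently):

```python
import re

def render(template: str, contact: dict) -> str:
    """Replace {{tag}} placeholders with values from the contact record.

    Unknown tags render as an empty string in this sketch.
    """
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(contact.get(m.group(1), "")),
                  template)

subject_a = render("Quick question about {{company}}", {"company": "Acme"})
print(subject_a)  # Quick question about Acme
```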

Example: Testing Body Content

Variant A (50%):

Short, direct email — 3 sentences with a calendar link CTA

Variant B (50%):

Longer email — includes a case study reference and open-ended question CTA

Both variants share the same subject line.

Multivariate Testing

Test up to 4 variants simultaneously for more complex experiments:

| Variant | Subject | Body | Distribution |
| --- | --- | --- | --- |
| A | Question-based | Short + calendar CTA | 25% |
| B | Question-based | Long + case study | 25% |
| C | Statement-based | Short + calendar CTA | 25% |
| D | Statement-based | Long + case study | 25% |

Sample Size

Multivariate tests need more contacts to reach statistical significance. With 4 variants, plan for at least 400 contacts (100 per variant) to get meaningful results.
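The 100-per-variant figure is a rule of thumb; a standard two-proportion power calculation gives a more precise minimum for the lift you want to detect. A sketch using only the standard library (the function name is illustrative):

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p1: float, p2: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Contacts needed per variant to detect a rate change from p1 to p2
    with a two-sided two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_b = NormalDist().inv_cdf(power)           # power threshold
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting an open-rate lift from 25% to 32%:
print(sample_size_per_variant(0.25, 0.32))
```

Small expected lifts require far larger samples than the 100-per-variant baseline, which is why subtle copy changes often need many more contacts than structural ones.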

Distribution Options

Equal Split (Default)

Each variant gets an equal share of contacts. Best for straightforward A/B tests.

Weighted Distribution

Assign custom percentages to each variant:

  • 70/30 — Test a new approach while keeping your proven version as the primary
  • 60/20/20 — Give your best guess the majority while testing two alternatives
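A weighted split like 60/20/20 amounts to weighted random assignment, which can be sketched with the standard library (variant names and weights here are illustrative):

```python
import random

variants = ["A", "B", "C"]
weights = [60, 20, 20]  # hypothetical 60/20/20 split

def assign_variant() -> str:
    """Draw one variant according to the configured weights."""
    return random.choices(variants, weights=weights, k=1)[0]

# Over many contacts, the counts converge toward the configured split:
counts = {v: 0 for v in variants}
for _ in range(10_000):
    counts[assign_variant()] += 1
print(counts)  # roughly {'A': 6000, 'B': 2000, 'C': 2000}
```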

Champion/Challenger

Reserve a percentage for the "champion" (current best) and test challengers:

  • Champion: 80% — Your proven version
  • Challenger: 20% — The new approach being tested

Metrics Tracked

Each variant tracks:

| Metric | Description |
| --- | --- |
| Sent | Number of emails sent for this variant |
| Opened | Unique opens (percentage) |
| Clicked | Unique link clicks (percentage) |
| Replied | Unique replies (percentage) |
| Bounced | Hard and soft bounces |
| Unsubscribed | Opt-outs from this variant |

Picking a Winner

Automatic

Enable auto-winner to let CronDB pick the best variant automatically:

  1. Set the winning metric: open rate, click rate, or reply rate
  2. Set the minimum sample: how many contacts each variant needs before evaluation
  3. Set the confidence level: 90%, 95%, or 99%

Once the criteria are met, CronDB stops the test and routes all future contacts to the winner.
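One standard way such a check works is a two-proportion z-test on the winning metric. This sketch assumes open rate as the metric; `is_significant` is an illustrative helper, not CronDB's API:

```python
from math import sqrt
from statistics import NormalDist

def is_significant(opens_a: int, sent_a: int,
                   opens_b: int, sent_b: int,
                   confidence: float = 0.95) -> bool:
    """Two-sided two-proportion z-test on open rates."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = abs(p_a - p_b) / se
    threshold = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return z > threshold

print(is_significant(80, 200, 110, 200))  # True: 40% vs 55% is a real gap here
```

Raising the confidence level raises the z-threshold (1.64 at 90%, 1.96 at 95%, 2.58 at 99%), so higher confidence settings need larger gaps or larger samples before a winner is declared.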

Manual

Review the results yourself and click Pick Winner on the variant you want to keep. All future contacts will receive that variant.

Best Practices

  1. Test one variable at a time — Change only the subject OR the body, not both, unless running a multivariate test
  2. Wait for significance — Do not pick a winner too early; let each variant accumulate at least 50-100 sends
  3. Test continuously — After picking a winner, create a new test with the winner vs. a new challenger
  4. Document your learnings — Keep notes on what works for your audience

A/B Testing Limits

| Plan | Max Variants per Step | Max Active Tests |
| --- | --- | --- |
| Starter | 2 | 2 |
| Pro | 4 | 10 |
| Enterprise | 4 | Unlimited |
