MailToolFinder

How We Test Deliverability

Most deliverability claims are self-reported. Ours are independently measured. Here's exactly how we do it.

Testing Protocol

Every month, we send an identical campaign from each of the 30 email tools we track. The campaign goes to our seed panel — a fixed set of 100 real email addresses distributed across four major providers.

We then check each mailbox to see where the email landed: primary inbox, spam/junk folder, or not delivered at all. The inbox rate is the percentage that reached the primary inbox.

Seed Panel Composition

Provider                 Addresses  Notes
Gmail                    25         Personal accounts, aged 1+ years
Outlook / Microsoft 365  25         Mix of personal and business accounts
Yahoo                    25         Personal accounts
Apple Mail (iCloud)      25         iCloud Mail accounts

The same addresses are used every month. Seed addresses are not publicly disclosed to prevent tools from optimizing for our tests.
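The panel composition above can be written down as a simple mapping. This is a sketch for illustration (the structure and names are ours, not MailToolFinder's internal tooling); it just confirms the four providers sum to 100 addresses:

```python
# Seed panel distribution as described in the table above (illustrative structure).
SEED_PANEL = {
    "Gmail": 25,
    "Outlook / Microsoft 365": 25,
    "Yahoo": 25,
    "Apple Mail (iCloud)": 25,
}

def panel_size(panel: dict) -> int:
    """Total number of seed addresses across all providers."""
    return sum(panel.values())

assert panel_size(SEED_PANEL) == 100  # four providers, 25 addresses each
```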

Campaign Details

Content

Every campaign contains identical plain-text and HTML versions: newsletter-style content with a single text link, and no aggressive CTAs, images, or spam trigger words.

Sending setup

Each tool uses its default shared sending infrastructure. We configure SPF, DKIM, and DMARC on all sending domains. No dedicated IPs unless they're the default.
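For reference, SPF and DMARC are published as DNS TXT records on the sending domain, and DKIM as a TXT record under a selector. The values below are generic examples of the record shapes, not our actual records; the domain, selector, and key are placeholders:

```
example.com.                        TXT  "v=spf1 include:_spf.esp-provider.example ~all"
selector1._domainkey.example.com.   TXT  "v=DKIM1; k=rsa; p=MIGfMA0G..."
_dmarc.example.com.                 TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```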

Timing

All campaigns are sent on the same day each month, within a 4-hour window. This minimizes timing-based variability.

Measurement

Mailboxes are checked 24 hours after sending. Emails in the primary inbox count as delivered. Spam folder and missing emails are tracked separately.
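In code, the 24-hour check reduces to classifying each seed mailbox into one of three outcomes. A minimal sketch (the function name and inputs are our own illustration of the rule described above):

```python
def classify_placement(in_inbox: bool, in_spam: bool) -> str:
    """Classify one seed mailbox 24 hours after the send.

    in_inbox: the test email was found in the primary inbox.
    in_spam:  the test email was found in the spam/junk folder.
    An email found in neither folder counts as missing (blocked upstream).
    """
    if in_inbox:
        return "inbox"
    if in_spam:
        return "spam"
    return "missing"

# One classification per seed address, e.g. from IMAP searches of the
# inbox and junk folders for the test campaign's subject line.
assert classify_placement(True, False) == "inbox"
assert classify_placement(False, True) == "spam"
assert classify_placement(False, False) == "missing"
```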

Metrics We Track

Inbox Rate

Primary metric

Percentage of test emails that reach the primary inbox. This is the number shown in our rankings.

Spam Rate

Percentage that land in spam/junk folders. A tool can have a high delivery rate but poor inbox placement if most emails go to spam.

Missing Rate

Percentage that never arrive. These emails were blocked before reaching the mailbox — typically by provider-level filtering.
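Taken together, the three rates partition the seed results: every test email is counted exactly once as inbox, spam, or missing. A small sketch of the arithmetic (the counts are hypothetical):

```python
def deliverability_rates(inbox: int, spam: int, missing: int) -> dict:
    """Convert raw placement counts into the three percentages reported."""
    total = inbox + spam + missing
    if total == 0:
        raise ValueError("no results to score")
    return {
        "inbox_rate": 100 * inbox / total,
        "spam_rate": 100 * spam / total,
        "missing_rate": 100 * missing / total,
    }

# Hypothetical result for one tool: 88 inbox, 9 spam, 3 missing out of 100 seeds.
rates = deliverability_rates(inbox=88, spam=9, missing=3)
assert rates["inbox_rate"] == 88.0
# The three rates always sum to 100%.
assert abs(rates["inbox_rate"] + rates["spam_rate"] + rates["missing_rate"] - 100) < 1e-9
```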

Authentication

We verify SPF, DKIM, and DMARC pass rates for each tool. Proper authentication is a prerequisite for good deliverability.
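Per-mechanism verdicts are visible in the Authentication-Results header that receiving providers add to each message. A rough parser sketch (the header string is a made-up example, not from our tests):

```python
import re

def auth_results(header: str) -> dict:
    """Extract spf/dkim/dmarc verdicts from an Authentication-Results header."""
    results = {}
    for mech in ("spf", "dkim", "dmarc"):
        m = re.search(rf"\b{mech}=(\w+)", header)
        if m:
            results[mech] = m.group(1)
    return results

# Made-up example header for illustration.
header = ("Authentication-Results: mx.example.com; "
          "spf=pass smtp.mailfrom=news.example.org; "
          "dkim=pass header.d=example.org; dmarc=pass header.from=example.org")
assert auth_results(header) == {"spf": "pass", "dkim": "pass", "dmarc": "pass"}
```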

Limitations

Shared infrastructure. We test on each tool's default sending setup. Dedicated IP users may see different results.

Content matters. Deliverability depends on what you send, not just who sends it. Aggressive sales copy or spammy content will perform differently from our neutral test emails.

Sample size. 100 seed addresses is enough to detect meaningful differences between tools, but not enough to guarantee your exact experience.
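To make the sample-size caveat concrete: with 100 seeds, an observed inbox rate carries a noticeable margin of error. A sketch using the Wilson score interval (our own illustration; the rankings themselves do not publish confidence intervals):

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple:
    """Approximate 95% Wilson score interval for a proportion (z = 1.96)."""
    p = successes / n
    denom = 1 + z * z / n
    center = p + z * z / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return ((center - margin) / denom, (center + margin) / denom)

# 90 of 100 seeds in the inbox: the true rate plausibly lies between
# about 83% and 94%, so small ranking gaps should be read with care.
low, high = wilson_interval(90, 100)
assert 0.82 < low < 0.83 and 0.94 < high < 0.95
```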

Frequently Asked Questions

How do you measure inbox rate?
We count how many of our 100 seed addresses receive the email in the primary inbox (not spam or promotions). The inbox rate is the percentage that land in the primary inbox.
How often do you test?
Every tool is tested once per month, on the same day, with the same content. Consistency is critical for valid comparisons.
Can tools game these results?
Our seed addresses are not disclosed. Tools cannot distinguish our test emails from regular subscriber email, so the results reflect real-world inbox placement.

See how each tool performs in our latest tests.

View Deliverability Rankings