General
March 31, 2022

A/B testing email content - To avoid spam folders (10 test examples included)

A/B testing email content the right way will help you avoid the spam folder and increase engagement.

Introduction - Why A/B test email content before sending?

Have you ever wondered, while looking at your email engagement stats, why certain subject lines and email templates have significantly higher open rates than others? There must be other reasons beyond recipients manually opening certain emails more often, mustn't there? Well, you'd be correct. Inbox providers (like Google and Microsoft) use AI to compare what you send against other content that has received low engagement in the past (or been marked as dangerous or spam) and make automatic decisions about how to filter your emails.

Based on your content, some of the places outside the primary inbox where you could be landing are:

  • Spam

  • Updates

  • Promotions

  • Other, outside the Focused inbox (if your recipient is a Microsoft user)

Therefore, beyond using intuition or consulting lists of supposedly ‘good’ or ‘bad’ keywords, you can be left guessing which aspect of your email content determines where your message ranks within the inbox.

Ideally, we want to achieve primary inbox placement, and in some cases even have an important tag applied to your message (if your content is performing very well). Throughout this article, we’ll discuss how the ranking/filtering process works and how you can iterate content by A/B testing over time to achieve a better placement inside the inbox.

Table of contents:


Introduction - Why A/B test email content before sending?

What content in emails contributes to inbox placement

Why lists of the most spammy email keywords won’t help

How to optimise email content for inbox placement

The criteria of a good A/B content test

Examples of 10 content changes to A/B Test

Summary

What content in emails contributes to inbox placement

The key aspects of an email which contribute towards inbox placement can be summarised across the following points:

  • Both the subject line and the body of text: Inside your email template, the content decisions you make in each of these areas contribute equally towards inbox placement. Because the contribution is a 50/50 split, don’t underestimate the subject line's importance, even though it’s much shorter than your body of text (hopefully...).

  • The links you include: The quantity and type of links you include inside an email can impact your inbox placement. Ask yourself ‘do we really need that link?’, and ensure the websites you’re linking to ALWAYS have an SSL certificate. We’d recommend a maximum of two hyperlinks in the first email you send to a prospect or a new list sign-up, to establish the best inbox placement as a starting point.

  • The size of files attached to the email: Especially with corporate email accounts, many inbox providers will automatically flag emails with large files. Keep attached files under 2MB and consider hosting content online and linking to it rather than attaching large files to emails (the sketch after this list shows one way to check these limits automatically before sending).
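
To make these checks repeatable before every send, you can run a draft through a short script. The sketch below is a minimal illustration of the thresholds mentioned above (at most two links, SSL-only targets, attachments under 2MB); the function name, limits and draft content are assumptions for the example, not part of any particular sending tool.

```python
import re

# Illustrative thresholds taken from the guidance above -- adjust to taste.
MAX_LINKS = 2                      # suggested cap for a first-touch email
MAX_ATTACHMENT_BYTES = 2_000_000   # roughly 2MB

def check_email_draft(body_html: str, attachment_sizes: list[int]) -> list[str]:
    """Return a list of warnings for an email draft."""
    warnings = []

    # Count hyperlinks and make sure every target uses SSL (https).
    links = re.findall(r'href="([^"]+)"', body_html)
    if len(links) > MAX_LINKS:
        warnings.append(f"{len(links)} links found; consider trimming to {MAX_LINKS} or fewer.")
    for url in links:
        if not url.startswith("https://"):
            warnings.append(f"Link without SSL: {url}")

    # Flag large attachments that could be hosted online instead.
    for size in attachment_sizes:
        if size > MAX_ATTACHMENT_BYTES:
            warnings.append(f"Attachment of {size / 1_000_000:.1f}MB; consider hosting it online and linking to it.")

    return warnings

if __name__ == "__main__":
    draft = '<p>Hi there,</p><p><a href="https://example.com/case-study">Case study</a></p>'
    for warning in check_email_draft(draft, attachment_sizes=[3_500_000]):
        print("-", warning)
```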

Why lists of the most spammy email keywords won’t help

You’ll immediately want to know which keywords inside your emails are detracting from your performance and eliminate them. This draws most marketers to search for lists of the most ‘spammy’ email keywords (e.g. HubSpot’s list of 394 email spam keywords).

Unfortunately, as HubSpot states in their article, spam filters have become far more sophisticated in recent years and look at the general context around your keyword selection rather than just the density of specific keywords. We’ve also found most marketers in our customer base already use little to none of the keywords from lists like the above.

Therefore, although lists of trigger keywords can help to add context, you’ll still be left without data on the following:

  • The impact of alternative keywords: There is no definitive list of trigger keywords, and every list on the internet combined comes nowhere near matching Google and Microsoft's content filtering algorithms. Blindly swapping out one keyword for another therefore leaves you at risk of going from bad to worse, or of needlessly using a less explicit keyword that creates less engaging content for your customers and prospects.

  • There is no reference point to score content: Static keyword lists provide no consistent way to score your content against prior templates or your domain’s general sender reputation. That baseline is vital for deciding how to prioritise different templates and how far you can push the envelope on keyword choice to get the best overall marketing results.

  • Analysis of the entire email is not provided: Because of the complexity of today’s filtering algorithms, the context around specific keywords is just as important as the keywords you select. It’s therefore an overly simplified viewpoint to consider some keywords ‘bad’ and others ‘good’; instead, you’ll need to analyse the entirety of your content in a testing environment to make data-driven decisions.

How to optimise email content for inbox placement

To analyse content for inbox placement accurately, you need to see exactly where emails are landing in B2B inboxes. This means you can’t use a small segment of your prospect list, because you won’t be able to see and compare where your emails landed inside the inbox.

Our solution to this at Allegrow allows you to send two variations of your email, each to approximately 100 unique B2B email inboxes from our network. As this process is carried out, we automatically search those mailboxes and report what percentage of each email variation is landing in the primary inbox versus the spam or promotions folders.

This allows you to compare, at a high level, the inbox placement of your A and B versions of email content and draw conclusions about optimal email structure and, specifically, the best way to express your value proposition over email to land in the primary inbox.

Importantly, the data set we’re using is from live B2B email accounts inside the Allegrow network, which means you’re always able to test new iterations and stay ahead of the curve as the filtering algorithms Google and Microsoft use update over time.
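
At its simplest, the output of such a test is a primary-inbox count per variation out of the roughly 100 seed mailboxes each was sent to. The sketch below shows one generic way to judge whether the gap between two variations is likely to be real rather than sampling noise, using a standard two-proportion z-test; it is an illustration under those assumptions, not how Allegrow scores results internally, and the example counts are hypothetical.

```python
from math import sqrt, erf

def compare_placement(primary_a: int, sent_a: int, primary_b: int, sent_b: int) -> None:
    """Compare primary inbox placement rates for two email variations."""
    rate_a = primary_a / sent_a
    rate_b = primary_b / sent_b

    # Two-proportion z-test: is the difference in placement rates significant?
    pooled = (primary_a + primary_b) / (sent_a + sent_b)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / standard_error
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal approximation

    print(f"Variant A primary inbox rate: {rate_a:.0%}")
    print(f"Variant B primary inbox rate: {rate_b:.0%}")
    verdict = "likely a real difference" if p_value < 0.05 else "could easily be noise"
    print(f"p-value: {p_value:.3f} ({verdict})")

# Hypothetical counts: A reached the primary inbox in 82 of 100 seed mailboxes, B in 64 of 100.
compare_placement(82, 100, 64, 100)
```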

The criteria of a good A/B content test

When you’re creating a content test you’ll want to ensure the data you get back is going to be actionable and provide maximum value to your go-to-market. So, I’ve summarised our key guidance on the criteria of a great content test:

  • Impacts revenue or pipeline: Prioritise testing the email content that is most attributable to pipeline and revenue. Common examples are the first email in your prospecting sequence, the initial email a list sign-up receives, or the email in your sequences that currently drives the most purchases.

  • A/B test variations of the same email: When you’re creating a content test, you’re generally looking to test an email against a different variation that fulfils the same purpose. One of the emails is the version currently being used in real life (the control) and the other is the iterated version.

  • Test evergreen content: To get the benefits of running content tests in the long term, make sure you’re testing email templates that new contacts consistently receive over time, not content that gets sent once to a static list and then never used again. Prospecting sequences or any kind of lifecycle email are good examples of evergreen content.

  • Ensure you’re happy to use both versions on live prospects: Reverse engineering the filtering algorithm to reach a version with significantly better inbox placement is great, but you need to be happy to put the iterated versions you’re testing live (so be careful not to oversimplify the variation you’re testing to the point where a human no longer finds it emotive and intriguing). Your goal is to strike the right balance between inbox optimisation and being engaging for the reader.

Examples of 10 content changes to A/B Test

If you’d like inspiration for aspects of your email templates that you could iterate when testing the B version, here’s a list of 10 example changes you could make across content tests to get you started.

Subject line

Iterating on the subject line is low-hanging fruit when it comes to getting better inbox placement and engagement. Aim for a subject line that is short (around three words or fewer for prospecting), relevant to your content, and similar to the emails your target prospect opens on a regular basis.

Volume of Words

Less is usually more when it comes to email. Experiment with being more concise and see how it impacts your inbox placement. We usually advise emails in a prospecting sequence to be between one and four lines of text; any more than that, and you’re probably over-selling. Try cutting your content down to 50% of the words and see if you get a lift in engagement and inbox placement.

Keyword selection

Your choice of keywords and phrases in the body of text can, of course, considerably impact your inbox placement and email sentiment. You can test swapping specific ‘buzzwords’ from your value proposition for alternatives, or re-phrase content altogether. When testing this area, focus on finding a balance between language that reads well and language that is optimised for inbox placement.

Links

Since overuse of links can be associated with spammy content in outbound email, we advise a maximum of two links in your opening sequence email (including links in the footer). With content testing, you can quantify the impact of different types and quantities of links on your spam rate (e.g. a Calendly link alongside a link to blog content).

Call to Action

The primary call to action you use in your go-to-market emails can be one of the main similarities between your content and spammy content if you don’t test and iterate the email’s ‘ask’ correctly. As Gong’s research shows, in cold outreach you should use the CTA to confirm a prospect’s interest rather than asking for time, as this is 2X as effective as the average outreach email.

Signature / Footer details

Given that your footer/signature structure is added to every single automated email, it’s worth paying attention to and optimising. The following details are typically included to build credibility and context with prospects: full name, job title, company name, website, phone number, and HQ or relevant regional address.

Bold words / Font choices

Many businesses are experimenting with using bold words or different font types in prospecting emails to stand out. You can test content to establish the implications these changes may have on the natural inbox placement of your emails. 

Structural changes 

You’ll typically want to separate content into single lines so your emails don’t look like ‘too much hard work’ to read when a prospect first opens them. This inevitably means writing your content with what may seem like an unusual number of paragraphs. As you edit this structure, you can test the impact of different changes on how often each version of your content reaches the primary inbox.

Naming / Sender Profile 

Both the job title and name of the sender used for cold emails can influence prospect engagement. Some businesses choose to repurpose the profile of the company's leadership across all their sending accounts for this reason (e.g. the CRO, CEO etc.), whereas other businesses prefer the continuity of each SDR having their own name used for outreach to the contacts assigned to them. Since this is a relatively new area of demand generation set-up to experiment with, you may want to test your content going out from different mailbox names to see how much of the difference is due to contact sentiment versus natural inbox placement.

Personalization tags

Having the prospect's first name and company included in the email isn’t exactly pushing the boat out when it comes to personalization these days. However, the placement and quantity of the personalisation tags you use can influence inbox placement. Generally, personalisation is considered most effective at the beginning and end of a prospecting email - but don’t shy away from testing these general assumptions for yourself.

Summary

To be clear: the content and structure of your email affect which folder inside the inbox email providers automatically place your message in, and this directly correlates with the engagement rates you see on campaigns. If you’re still only optimising your emails based on how the end recipient would read them, that’s the equivalent of creating a blog post and disregarding any technical SEO considerations.

To conclude, using an environment of real inboxes that are unique and not connected to your internal domains allows you to automatically monitor where different versions of your emails land, so you can start to take a more methodical, data-driven approach to email content.

If you're curious to see how this works inside Allegrow, or would like content testing added to your existing Allegrow plan, you can book a demo with us here.