# 5 Simple A/B Tests to Help Increase Conversions
Having trouble figuring out what the best practices are for your site? A/B testing reduces uncertainty and takes the guesswork out of optimizing your site. By making data-driven decisions about your web design, you can increase conversions many times over and achieve impressive results.
## What is A/B Testing?
A/B testing, also known as bucket testing or split testing, is a core research methodology used to determine which of two versions of a product (typically a web page or app) is better at achieving stakeholder goals.
As the name suggests, the test involves two different product designs — Design A and Design B, which are shown to customers at random. Half of your product’s users should see Design A while the other half should see Design B. After a predetermined amount of time, collect the metrics and data from the two designs, and see which one performs better.
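The random split described above can be sketched in a few lines of Python. This is a minimal illustration, assuming users are identified by a stable ID; the function name and experiment label are hypothetical, not from any particular testing library. Hashing the user ID, rather than flipping a coin on every request, guarantees each user always sees the same design for the duration of the test.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-test") -> str:
    """Deterministically bucket a user into Design A or Design B.

    The hash of (experiment, user_id) is stable, so a returning user
    always lands in the same bucket, and buckets split roughly 50/50
    across the whole population.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same user always gets the same design:
print(assign_variant("user-1234") == assign_variant("user-1234"))  # True
```

Including the experiment name in the hash also means a user can land in bucket A for one test and bucket B for another, keeping separate experiments independent.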
While it may seem like the designer’s job to determine the optimal solution, a truly great design process interweaves testing with design. Solid design principles and previous experience are excellent foundations, but the properties of a clean UI and a satisfying user experience are difficult to pin down, and sometimes even counterintuitive. Asking users what they’re looking for and which designs appeal to them is useful, but it’s often far more effective to make data-driven decisions by field-testing designs directly.

What people say often contradicts how they act, but running A/B tests lets stakeholders see exactly how users respond to each variation by comparing the metrics from the test. Which metrics to use depends entirely on what problems your alternative designs are trying to solve, but most often the goal is to maximize conversions.
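Before declaring a winner, it's worth checking that the difference in conversion rates isn't just noise. The sketch below is one common way to do that, a two-proportion z-test, using only Python's standard library; the function name and the example numbers are illustrative, not data from the source.

```python
from math import erf, sqrt

def lift_is_significant(conv_a: int, n_a: int,
                        conv_b: int, n_b: int,
                        alpha: float = 0.05) -> bool:
    """Two-proportion z-test: is the gap between Design A's and
    Design B's conversion rates statistically significant?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value < alpha

# 120/2000 conversions for A vs. 165/2000 for B: real lift, or luck?
print(lift_is_significant(120, 2000, 165, 2000))  # True
```

A design that "wins" by a hair over a small sample can easily be a statistical fluke; running the test until the result clears a significance threshold protects you from shipping noise.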
The difficulty in A/B testing often lies not in implementing the test but in deciding what exactly to test. In a perfect world, every aspect of a design would be tested individually, but doing so would drain both time and money, given the staff needed to run each test. The question remains — which design aspects and variables are worth testing? Here are 5 A/B testing ideas that will increase conversion rates, improve your user experience, and optimize your site.
## Test #1: Call To Action
Every website has a purpose: whether the goal is to get the user to buy a product, click a link, download an application, or otherwise generate a lead, your call-to-action (CTA) is usually the single most important feature of your site. Nearly every CTA is designed to grab attention, but that's about where the similarities end.
The many properties of the CTA can be changed and experimented with during A/B tests to optimize conversions. For example, the size of the CTA is critical; oftentimes business owners assume that a bigger CTA or one with more information is better, but this isn’t always the case.
Humana, an American health insurance company, found that when they reduced the text in their CTA, conversions went up by 192 percent. Humana’s example also shows why A/B testing is so important. It would be natural to assume that customers shopping for health insurance want to know about the benefits, cost-saving features, and available plans, but in this case that assumption would be entirely wrong. Seemingly small changes can lead to huge upswings in conversions, but they are often counterintuitive and require extensive testing to uncover.
Electronic Arts also had a similar result from their A/B testing for Sim City. When the gaming company removed details about a “20% off for preordering” deal from their Sim City store page, the purchases for the game increased by 40%. Conventional wisdom would tell us that consumers would love to hear about additional deals, but it turns out that veteran Sim City players simply weren’t motivated by these incentives. In fact, they were a big distraction.
One easy A/B test for your CTA is to change the color of the button or banner. Create a version of your website where the only change is the color scheme of the CTA, and serve it to every other user. Common CTA colors to test are red, green, and blue.
## Test #2: Typography
While your font choice may seem trivial, typography is one of the most essential elements of a readable and welcoming web design. With hundreds of fonts in use across various companies and thousands more available online, it doesn’t make much sense to try every font in an A/B test. Instead, here are specific aspects of typography worth testing to maximize readability.
**Serif or Sans Serif**
Serif fonts have additional flourishes or embellishments at the ends of letters; sans serif fonts, as the name suggests, do not. Sans serif fonts are generally recommended to new web designers; however, Georgia (a serif font) is among the most popular fonts used across the web. While not every company has A/B tested its font, the popularity of a serif font online means that at least experimenting with serifs could be worth your time.
**Font Size**
According to the Nielsen Norman Group, small fonts and poor contrast between words are the number-one complaint about reading online. Font size is tricky, since the optimal size depends on the typeface you’re using. Michael Bernard, a usability researcher, concluded that Tahoma was most readable at 10 points, Courier at 12 points, and Arial at 14 points. After settling on the appropriate typeface for your website, an A/B test of different font sizes could result in a much more readable site.
**Color and Contrast**
In web advertising, it's well established that bold colors catch the most attention and, in turn, attract the most clicks in a session. While dark text on a light background is the industry standard for readability (and what you should use for the bulk of your text), certain areas of web page real estate are better served with contrasting colors that draw more attention.
## Test #3: Images
Humans are visual creatures, so it should be no surprise that images are one of the most effective methods of improving your website’s conversion rate. While sticking in a generic stock photo can increase the traffic that a website receives, choosing the optimal generic photo can take some testing.
Highrise features a bland portrait of a person near their CTA. During their A/B tests for images, they had two different subjects — “Michael” and “Jocelyn.” Both were simple portraits of the subjects standing upright and smiling, but “Michael” generated about 5% more conversions than “Jocelyn.” Despite these images being the same size, having the exact same placement, and containing the same number of people, “Michael” would go on to generate thousands of dollars of additional revenue every year.
While basic stock images might suffice, you might want to go one step further and test different image designs and themes to drive more activity on your website. WallMonkeys, a company specializing in wall decor, sticker sets, and custom wallpaper patterns, used a generic headline overlaid on a stock image of home furniture. When they ran an A/B test using images of their more fantastical designs and special wall decors, they found that the version featuring their unique products had a 27% higher conversion rate than the control. In a later A/B test, they tried a web layout that made their content easier to explore and added an algorithm that showed consumers items they were more likely to be interested in; this version delivered an astonishing conversion rate increase of over 500%.
## Test #4: Pricing Schemes
If your website has a digital product or software, it’s likely you have some form of freemium model available for your customers. While many people throw in a free trial and call it done, there can be extensive testing done on both the type of freemium model you employ and the types of pricing schemes available to your users. Models typically fall under three different types:
1. **Free Trial**: Free trials give users the complete product, but only for a limited period of time, after which the user must decide whether or not to continue paying for the product.
2. **Freemium**: Freemium models give users unlimited use of the product, but usually with limited features. They offer a taste of the final product but withhold advanced features, so the scope of the product is reduced.
3. **Money-back guarantee**: Users are given no “free” version of the software at all, but are allowed a full refund if the product isn’t satisfactory.
Testing between these models can generate a huge amount of profit for your business. Acuity Scheduling, software for booking appointments online, initially used a freemium model, but after introducing free trials the company found that paid signups increased by over 268%. Testing free trials against a freemium model could produce similar gains for your business as well.
Money-back guarantees are considered the least effective of the three at generating conversions, so while it’s always best practice to test every variation of features using A/B tests, this is the one to skip if time or resources are an issue.
## Test #5: Site Security Symbols
Internet users are increasingly on guard against phishing scams and other sites designed to steal their personal information. Trust symbols can increase sales dramatically, and they’re one of the “safer” options to test since they very rarely decrease conversions. Your website should already have strong encryption and security measures to protect user data, so a security symbol that showcases those safety measures can help build user trust.
Blue Fountain Media, a digital marketing agency, initially used a security prompt that was “homemade” — a tab with a generic lock icon and assurances that user data would not be shared with any third parties. The agency opted to test a VeriSign seal, a third-party symbol certifying that user data is protected on the site. The seal stated only that the website was VeriSign trusted, with no mention of how the site handled user data. Nonetheless, Blue Fountain Media saw a 42% increase in sales during testing, demonstrating how much consumers trust recognized security symbols.