Conversion rate optimization (CRO) feels more like a sport than marketing.
If you’ve ever boosted sign-ups, sales, or PPC click-through rates (CTRs) by changing an image, for instance, you know how easy it is to get hooked.
All of a sudden, you’re reading WhichTestWon at bedtime, skipping out on nights out with friends to A/B test your retargeting ads, and creating secret Pinterest boards with the coolest landing page designs that you’ve ever come across. Of course, you tell nobody about your after-hours guilty pleasure.
You come across awesome tests, try out new ideas, and aim to become your company’s resident revenue magician.
The challenge, however, is that your ideas aren’t working the way you thought they would. You unleash your frustrations on GrowthHackers.com, compulsively re-read your favorite Unbounce articles, and hop on Clarity to figure out what’s up.
After all this floundering, you come to realize that your ‘best practices’ around conversion optimization, A/B testing, and building processes for experimentation are constricting your ability to tailor decisions to your business.
What you need to do is take a step back, breathe, and read this blog post.
Sledgehammer in hand (figuratively speaking), give yourself the freedom to break the following rules:
Rule #1: CRO = A/B Testing
Not so much.
A/B testing is one of many techniques that can help companies improve conversion rates.
Other strategies include qualitative interviews, secondary data analysis, competitor research, and surveys. If you think of CRO as just a handful of A/B tests, you’ll limit your opportunity to build a comprehensive testing program.
Website optimization platform Optimizely uses the expression “optimization culture” to describe the difference between A/B tests and CRO.
You can think of CRO, for instance, as a process that ties together multiple A/B tests, qualitative research, and user feedback.
For inspiration, take a look at this case study that describes Hotwire’s process for running 120 experiments per year.
The experiments aren’t random—they’re based on an organizational objective to ‘connect the dots’ into a cohesive user experience.
While it’s true that Hotwire is a bigger company with many more resources than a small or mid-sized business, the same principle holds true.
Every A/B test must be grounded in a process of research and connect to an underlying organizational objective.
How to break rule #1:
Stop thinking of A/B testing as CRO, and start building systems instead.
Every standalone decision must tell a greater organizational story. Look beyond what’s in front of you—focus on the relationship between the tests you’re running now and the decisions that you’ll be making as a result of them.
Make sure to focus on ROI rather than the shiny object in front of you. Always ask why.
Rule #2: Just Start Testing
I’m pretty sure that, in my past writing, I’ve jumped on the CRO bandwagon and told marketers to ‘just start testing,’ with the rationale that it’s better to execute on initiatives than get stuck in analysis paralysis.
I would like to redact that statement and kill that rule. Here’s why:
As the last section points out, CRO needs more than a few one-off tests.
Rather than testing random word choices, images, and button colors, you need to take a step back to understand what your customers care about.
Rather than testing something because ‘it worked for someone else,’ think about what your customers want and how you can support them in their decision-making processes.
Not to mention, experimentation takes time out of your day—and to some extent, redirects your focus away from product-building and sales. While it’s important to branch away from your company’s status quo, you could be chasing down dead ends.
How to break rule #2:
Conduct regular, qualitative research with your target customer base through phone and in-person interviews, in addition to live chat software.
Pay attention to the word choices that they’re using when they speak to you, how they describe their pain points, and where they’re going to find the information that they need.
Rely on these conversations to come up with ideas to test.
You should look for priorities, gaps in positioning, points of friction, and areas of opportunity that you may have previously missed.
Incorporate these ideas into your marketing strategy, and then start testing.
Rule #3: Aim For Higher Conversions
Well, yes and no.
While you want to aim for higher conversions and conversion rates, you’ll also want to see how these results fit into your core business.
Rather than dive into a lengthy explanation, let’s jump into an example: this talk from the 2014 Lean Startup Conference.
Appfolio, a SaaS company for property managers, decided to test an idea for a new product—a rental application tool called ‘RentApp.’
Driving traffic through paid search campaigns, Appfolio generated sky-high conversion rates.
By all accounts, the product seemed to deliver a strong value proposition. People were signing up.
The problem, however, was the funnel.
Compared to the rest of Appfolio’s products, the monetization potential just wasn’t there.
Even with success on the CRO front, Appfolio decided to kill RentApp as a product. Conversion rates and ROI were high, but not high enough compared to the rest of the business, so the company refocused on more profitable areas.
How to break rule #3:
Always look at the context driving your CRO strategy—don’t limit your decisions to self-contained ecosystems.
You may have high sign-ups, for instance, but low retention down the funnel.
To break this rule, go back to rules #1 and #2. Whenever you interpret (or share) the results of your tests, include analysis that describes the impact on your company as a whole.
How do conversion rates and ROI measure up against potential opportunity costs?
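To make that trade-off concrete, here is a back-of-the-envelope sketch. The formula, the product labels, and every number below are hypothetical, not taken from the Appfolio talk; the point is only that a high conversion rate can still lose to a lower-converting funnel with better monetization.

```python
def revenue_per_visitor(conversion_rate, retention_rate, monthly_revenue, months):
    """Rough expected revenue per visitor over a customer lifetime.

    conversion_rate: share of visitors who sign up
    retention_rate:  share of sign-ups who stick around
    monthly_revenue: average revenue per retained customer per month
    months:          time horizon for the comparison
    """
    return conversion_rate * retention_rate * monthly_revenue * months

# Hypothetical new product: great conversion, weak monetization.
new_product = revenue_per_visitor(0.10, 0.50, 5.00, 12)    # ≈ 3.00 per visitor

# Hypothetical core product: lower conversion, much higher revenue.
core_product = revenue_per_visitor(0.02, 0.80, 50.00, 12)  # ≈ 9.60 per visitor
```

In this made-up scenario, the new product converts five times better yet earns roughly a third as much per visitor, which is exactly the kind of context a standalone conversion rate hides.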
Rule #4: Explainer Videos > Everything Else
Explainer videos are awesome in the sense that they’ve helped many, many brands boost their conversion rates. Despite this industry-wide success, however, they’re not always the best CRO tool.
It’s easy to make the blanket claim that video converts better than text, for instance. But let’s take a moment to think about this idea.
On mobile, where audiences have limited data plans, viewers may be unwilling to spend unnecessary bandwidth. For reference, take a look at the following Facebook ad experiment that AdParlor ran last year.
In one gaming app install campaign, the ad operations specialist found that video-playback rates were higher on WiFi-enabled devices than on devices relying on data plans alone.
At first glance, the reason seems to be that people want to conserve bandwidth.
It could also be that some unexplained variable links device usage to video playback rates, such as the demographics (age, perhaps) of users who aren’t clicking on video or enabling WiFi on their devices.
Video isn’t always the highest converting option. It’s all about context.
In another example, mobile data collection platform Device Magic was worried that their homepage video was too technical—so they tested a variation page with an image slider instead. As a case study for conversion optimization, Visual Website Optimizer points out:
“They wanted to see if a video or series of rotating jQuery slides would work best for driving people to sign up.”
The image slider variation increased homepage-to-signup-page conversions by 35%, and total subsequent signups rose by 31%.
The image slider was easier to understand than the video, which was highly technical.
How to break rule #4:
Many factors explain why one medium converts better than another. What matters more than the medium you use to communicate is how you pique interest and generate engagement.
Rather than focusing on what type of multimedia you have, figure out how to tell your story as clearly, concisely, and compellingly as possible.
Rule #5: Always Be Testing
Like rule #2, this is a statement that I’ve often written in my previous work.
After becoming more deeply entrenched in the CRO space, I’d like to invalidate it.
The fact is that you shouldn’t ‘always be testing.’
You should run clear, focused experiments that generate clear business impact.
While it’s important to keep your eyes peeled for potential tests to run, you shouldn’t feel pressured to be running A/B tests for the sake of A/B testing.
You’ll end up collecting a bunch of meaningless data that you don’t have time to analyze. What’s the point?
How to break rule #5:
Pick a handful of experiments that make sense to run over a period of weeks or months. This number should vary based on the size of your organization and traffic volumes.
Focus on them. Bring some structure to them. Think them through, and give your team (or yourself) the space you need to run a really awesome analysis.
Use your best judgment.
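One concrete way to bring structure to those experiments is to check whether an observed lift is distinguishable from noise before acting on it. This is a minimal sketch, not something from the original post; the visitor and conversion counts below are made up, and it uses a standard two-proportion z-test with only Python’s standard library.

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_a, conv_b: number of conversions in each variant
    n_a, n_b:       number of visitors in each variant
    Returns (z_score, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 200/4000 conversions on A vs. 250/4000 on B.
z, p = ab_test_significance(200, 4000, 250, 4000)  # significant at the 5% level
```

If the p-value comes back high, the honest conclusion is usually “not enough evidence yet,” which is exactly why a few well-powered tests beat a pile of underpowered ones.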
When it comes to CRO, there are no hard and fast rules.
Every company is different, and every customer will have different needs.
Focus on what’s right for you, challenge ‘best’ practices, and don’t let A/B test case studies convince you that you need to do something.
The best way to learn is through experience and experimentation.