You’ve already started your first Conversion Optimization Test using the TrustBox Optimizer. Now, it’s time to monitor and interpret your test results.
Let’s begin by examining how the TrustBox Optimizer calculates the results. Then, we’ll help you analyze them and decide on your next steps. If you are not using TrustBox Optimizer yet, check out this guide on how to start a Conversion Optimization Test.
The Statistics Engine behind the TrustBox Optimizer
Statistics are at the core of every A/B testing tool. They are crucial to making solid business decisions.
When you start a Conversion Optimization Test using the TrustBox Optimizer, we record the total number of visitors who see each of your TrustBox variations (or no TrustBox). We also keep track of the conversion rate. Using Bayesian statistics, we compute the statistical probability of each variation beating the other variations long-term. This is expressed as a percentage between 0% and 100% and answers the all-important business question: "Will this change benefit my business?"
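As an illustration, a "chance to win" of this kind can be estimated with a Beta-Binomial model and Monte Carlo sampling. This is a minimal sketch of the general technique, not Trustpilot's actual implementation, and the visitor and conversion counts are made up:

```python
import random

def chance_to_win(conv_a, visitors_a, conv_b, visitors_b, draws=100_000, seed=42):
    """Estimate the probability that variation B's true conversion rate beats A's."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Sample plausible true rates from each Beta posterior (uniform prior).
        rate_a = rng.betavariate(1 + conv_a, 1 + visitors_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + visitors_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Hypothetical data: variation A converts 40 of 1,000 visitors (4.0%),
# variation B converts 55 of 1,000 visitors (5.5%).
probability = chance_to_win(40, 1000, 55, 1000)
print(f"Chance to win for B: {probability:.0%}")
```

With more visitors at the same underlying rates, the estimate moves toward 0% or 100%, which is why the report becomes more decisive as data accumulates.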
During the first week of your test, or if the level of traffic is very low (under 200 visits), we will not show a detailed report. It’s too early to draw any conclusions from such a small data sample. After that period, you will see a daily detailed report of your test results. The longer you let the test run, the more data we can collect and the more accurate your results will be. We recommend a test period of eight weeks.
TIP: As soon as you correctly implement the code on your site, we’ll start collecting data which will go into your report. If you have spent time testing the TrustBox code in your dev environment, or forgot to implement the conversion tracking code when you first released, restart the test once the code is working as expected on your live site. This way, you can flush out all irrelevant test data that might pollute your test results. You can restart the test from the TrustBox Optimizer Test Settings page by clicking the Save and restart button.
Interpreting the detailed report to make a business decision
The detailed report is generated daily. We compute the probability of a variation beating another one over time and express that as a number between 0% and 100%. The report is a snapshot of the data we’ve collected so far. The longer you run the test and the more data we collect, the more accurate the results will be.
There are four key data points to consider:
- Chance to win
- Potential loss
- Conversion range
- Estimated revenue
Depending on these key data points, you can implement the winning variation or continue testing to find a better spot to show reviews.
Chance to win
With Chance to win, there are three types of test outcomes:
- Close to 50-60% means the tested variations are very similar: either variation could work equally well for your business.
- Above 70% means that there is a difference between the variations and one has a higher chance to outperform other variations in the test over time.
- Above 90% means that there is a high probability that one variation will outperform the other test variations. This variation can be declared a winner.
| Chance to win | What does it mean? | Risk assessment |
| --- | --- | --- |
| 50-60% | The tested variations are very similar. | Neither win, nor lose |
| Above 70% | There is a high chance that this variation will outperform other variations in the test over time. | |
| Above 90% | There is a very high chance that this TrustBox variation will outperform other variations in the test over time. This variation can be declared a winner. | |
Potential loss
Potential loss shows the conversion rate you could lose when you choose one variation over another. The lower the percentage, the safer your choice.
Use Potential loss in combination with Chance to win to decide when to end the test. When the Chance to win of a variation is high, and its Potential loss is low enough to be within your comfort zone, choose that variation as the winner.
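To sketch the idea behind such a metric, a Bayesian "expected loss" can be computed from posterior samples: it averages the conversion rate you would give up in the scenarios where your chosen variation turns out to be the worse one. This is an illustration under a Beta-Binomial model, not the Optimizer's actual formula:

```python
import random

def potential_loss(conv_a, visitors_a, conv_b, visitors_b, draws=100_000, seed=7):
    """Average conversion rate given up if we pick B but A is actually better."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        rate_a = rng.betavariate(1 + conv_a, 1 + visitors_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + visitors_b - conv_b)
        total += max(rate_a - rate_b, 0.0)  # counts only the scenarios where A wins
    return total / draws

# Hypothetical data: picking B (55/1,000) over A (40/1,000).
loss = potential_loss(40, 1000, 55, 1000)
print(f"Potential loss: {loss:.3%}")
```

A decision rule then reads: stop the test when the leading variation's chance to win is high and its potential loss is below your comfort threshold (for example, a tenth of a percentage point).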
Conversion range
Conversion range shows the range within which your actual conversion rate lies. A very wide conversion rate range, or a large overlap between the ranges, may indicate that we need more data to detect a difference between the variations in the test. The report’s graph section can be switched to show the conversion rate range and the overlap between the variations.
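Such a range can be thought of as a credible interval around the measured rate. A minimal sketch of one way to compute it, again as a Beta-Binomial illustration rather than the report's exact method:

```python
import random
from statistics import quantiles

def conversion_range(conversions, visitors, draws=100_000, seed=3):
    """Approximate a 95% credible interval for the true conversion rate."""
    rng = random.Random(seed)
    samples = [rng.betavariate(1 + conversions, 1 + visitors - conversions)
               for _ in range(draws)]
    cuts = quantiles(samples, n=40)  # cut points every 2.5%
    return cuts[0], cuts[-1]         # 2.5th and 97.5th percentiles

# Hypothetical data: 40 conversions from 1,000 visitors.
low, high = conversion_range(40, 1000)
print(f"Conversion range: {low:.1%} to {high:.1%}")
```

The interval narrows as visitors accumulate, which is exactly why overlapping ranges are a signal to keep the test running.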
Estimated revenue
If you’re also tracking revenue in the conversion script, you will find Estimated revenue in the detailed report. This number shows what your monthly revenue could be if you ran each variation at 100% visibility. The projected revenue numbers are derived from all the revenue data we have collected, combined with each variation’s relative increase in conversion. Use this as an additional indicator when you decide which TrustBox variation to keep implemented on your site.
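As a rough illustration of what such a projection involves (the exact formula isn't published here, and the average order value and traffic figures below are hypothetical):

```python
def projected_monthly_revenue(revenue_per_conversion, monthly_visitors, conversion_rate):
    """Project monthly revenue if one variation ran at 100% visibility."""
    return revenue_per_conversion * monthly_visitors * conversion_rate

# Hypothetical: $80 average revenue per conversion, 30,000 visitors per month.
revenue_a = projected_monthly_revenue(80, 30_000, 0.040)  # variation A at 4.0%
revenue_b = projected_monthly_revenue(80, 30_000, 0.055)  # variation B at 5.5%
print(f"A: ${revenue_a:,.0f}/month, B: ${revenue_b:,.0f}/month")
```

Comparing the two projections puts a dollar figure on the difference in conversion rate, which is often easier to act on than the percentages alone.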
Best Practices and Tips for Optimizing the Optimizer
The TrustBox Optimizer shows you how to improve your use of TrustBoxes and reviews. Every test is an opportunity to learn about what works or doesn’t work for your business and your visitors. There are no bad test results. It’s all about optimizing continuously until you find the best, most effective solution.
We’ve compiled some tips to help you avoid pitfalls and quickly obtain the most valuable results. Check out example cases to see the Optimizer’s results in real scenarios, plus recommended ways to proceed.
Test duration and accuracy
- Aim to have 1,000 visitors per variation before you take action. The longer you run the test and the more data we collect, the more accurate the result will be.
- Don’t stop the test too early. It’s good practice to run the test for at least 2-3 full weeks.
- Don’t run the test too long. Running a test for longer than eight weeks introduces too much seasonality. Make a decision based on the data collected and plan a follow-up test.
- Avoid introducing changes on your site or changing how you drive traffic to your site while you are running a split test. This may affect the test results.
Do you have other TrustBoxes on your site?
Remember that you’re testing the effect of a specific TrustBox on your page. If you have other TrustBoxes on the same page or elsewhere on your site, they will also influence visitors throughout their customer journey.
Did your customers actually see the TrustBox?
Sometimes TrustBoxes are placed in a position where they don’t attract any attention. Showing your reviews directly on the page can have a much bigger impact than a small TrustBox in the footer.
If TrustBoxes directly compete with a key CTA or conversion driver, they might end up reducing conversion rather than improving it. This can happen, for example, if you replace a key page module like “recently viewed” with reviews.
TIP: Aim to position your TrustBox so that it’s visible and complements (rather than distracts from) a CTA or conversion driver.
Are you testing at a key decision point?
Sometimes visitors don't go to the page where the TrustBox is placed, or they are already convinced to buy or use your services. Showing your reviews at key decision points will have a much greater effect on conversion than hiding them away on a testimonials page.
TIP: Combine conversion insights data with user research to get a deeper understanding of your customers’ behavior and better optimize your online business.
New vs. returning visitors
Returning visitors already know you and don’t need to be convinced to shop or use your services again in the same way as new customers. Showing reviews on key landing pages where you send most of your new traffic gives new visitors a good reason to continue their customer journey, rather than drop off.
Returning customers can be your brand advocates! Try to get them to review your company (and reassure others that they can trust you) by using one of our TrustBoxes that collect reviews.
Page speed influence
It’s always good to aim for a fast page load time. In addition to helping your SEO and user experience, it prevents potential complications with your Conversion Optimization Test. By default, our TrustBoxes wait for your webpage to load before they are shown. On a slow-loading webpage, visitors may make a decision before they ever see the TrustBox, in which case it has no impact on conversion.
This is especially important when the page where you are testing the TrustBox requires only a simple user action (e.g. Sign In) before visitors leave it. We recommend confirming that the page loads fast enough for users to actually see the TrustBox before they leave.