TrustBox Optimizer: Understanding Your Conversion Optimization Test Results

You’ve already started your first Conversion Optimization Test using the TrustBox Optimizer; now it’s time to monitor and interpret your test results.

In this article, we’ll take you through how the TrustBox Optimizer calculates the results, and help you analyze them and decide on next steps.

How does the Statistics Engine behind the TrustBox Optimizer work?

How do you interpret the detailed report to make a business decision?

Best Practices and Tips for Getting the Most out of the Optimizer

Example Scenarios

If you are not using TrustBox Optimizer yet, check out this guide on how to start a Conversion Optimization Test.

How does the Statistics Engine behind the TrustBox Optimizer work?

Statistics are at the core of every A/B testing tool and are crucial when it comes to interpreting the results and making business decisions based on a test.

When you start a Conversion Optimization Test using the TrustBox Optimizer, we record the total number of visitors who see each of your TrustBox variations (or no TrustBox), as well as how many of them convert. Using a method called Bayesian statistics, we compute the probability that each variation will beat the other variations in the test over time, expressed as a percentage between 0% and 100%. This gives a simple and accurate answer to the important business question: "Will this change be beneficial for my business?"
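Trustpilot doesn't publish the exact statistical model behind the Optimizer, but the "chance to win" idea can be sketched with a standard Beta-Binomial simulation. The uniform prior and the `chance_to_win` helper below are illustrative assumptions, not Trustpilot's implementation:

```python
import random

def chance_to_win(visitors_a, conversions_a, visitors_b, conversions_b, draws=100_000):
    """Estimate the probability that variation A's true conversion rate beats B's."""
    wins = 0
    for _ in range(draws):
        # Draw a plausible conversion rate for each variation from its
        # Beta posterior (uniform prior: Beta(1 + conversions, 1 + misses)).
        rate_a = random.betavariate(1 + conversions_a, 1 + visitors_a - conversions_a)
        rate_b = random.betavariate(1 + conversions_b, 1 + visitors_b - conversions_b)
        if rate_a > rate_b:
            wins += 1
    return wins / draws
```

With 60 conversions out of 1,000 visitors against 40 out of 1,000, this yields a chance to win well above 90% for variation A; with identical data for both variations it hovers around 50%, the "variations are very similar" outcome.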

During the first week of your test, or while traffic is very low (under 200 visits), we will not show the detailed report, as it's too early to draw conclusions from such a small data sample. After that, you will begin to see a detailed report with your test results based on the data collected so far. The report is generated once a day and covers all data collected since the beginning of the test. The longer you let the test run, the more data we can collect and the more accurate your results will be. However, we recommend that you do not extend the test period beyond eight weeks.

TIP: As soon as you correctly implement the code on your site, we'll start collecting data for your report. If you have spent some time testing the TrustBox code in your dev environment, or if you forgot to implement the conversion tracking code when you first went live, we recommend restarting the test once the code is working as expected on your live site. This flushes any irrelevant data that might pollute your test results. You can restart the test from the TrustBox Optimizer Test Settings page by clicking the Save and restart button.

How do you interpret the detailed report to make a business decision?

As the detailed report is generated daily, we compute each variation's probability of being the best over time and express it as a number between 0% and 100%. The report is a snapshot of the data we've collected so far - the longer you run the test and the more data we collect, the more accurate the result will be.

There are four key data points to look for in the report:

  • Chance to win
  • Potential loss
  • Conversion range
  • Estimated revenue

Depending on these key data points, you can choose to implement the winning variation or run more tests to find a better spot to show your reviews.

Chance to win

Depending on how high the Chance to win is, there are three types of test outcomes:

  • Around 50-60% means that the tested variations are very similar. Either variation could work equally well for your business.
  • Above 70% means that there is a difference between the variations, and one has a higher chance to outperform the others in the test over time.
  • Above 90% means that there is a high probability that one variation will outperform the other test variations, and it can be declared a winner.
Chance to win | What does it mean? | Risk assessment
ca. 50-60% | The tested variations are very similar. | Neither win, nor lose
> 70% | There is a high chance that this variation will outperform other variations in the test over time. | Safe
> 90% | There is a very high chance that this TrustBox variation will outperform other variations in the test over time. This variation can be declared a winner. | Very Safe

Potential loss

Potential loss quantifies the risk of your decision: it is the conversion rate you can expect to lose if you pick one variation as the winner when another variation would have been the better choice. The lower the percentage, the safer your choice.

Use Potential loss in combination with Chance to win to decide when to end the test. For example, when the Chance to win of a variation is high, and its Potential loss is small enough to be within your comfort zone, we recommend that you choose that variation as the winner.
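In Bayesian A/B testing this quantity is commonly called expected loss. A minimal sketch assuming a Beta-Binomial model with uniform priors (an illustrative assumption; this is not Trustpilot's actual computation):

```python
import random

def potential_loss(visitors_a, conversions_a, visitors_b, conversions_b, draws=100_000):
    """Expected conversion rate given up by shipping A when B might in fact be better."""
    total = 0.0
    for _ in range(draws):
        # Sample plausible conversion rates from each variation's Beta posterior.
        rate_a = random.betavariate(1 + conversions_a, 1 + visitors_a - conversions_a)
        rate_b = random.betavariate(1 + conversions_b, 1 + visitors_b - conversions_b)
        total += max(rate_b - rate_a, 0.0)  # we only lose when B actually beats A
    return total / draws
```

Note how this complements Chance to win: when A is the clear leader, the potential loss of choosing A is tiny, whereas in a dead heat the loss is larger because either pick could sacrifice some conversion rate.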

Conversion range

The conversion rate is expressed as a range within which your actual conversion rate is likely to fall. A very wide range, or a big overlap between the ranges, may indicate that more data is needed to detect a difference between the variations. The graph section of the report can be switched to show each variation's conversion rate range and the overlap between them.
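Such a range corresponds to a credible interval around the observed conversion rate. A sketch under the assumption of a Beta posterior with a uniform prior (illustrative, not Trustpilot's published method):

```python
import random

def conversion_range(visitors, conversions, draws=100_000, coverage=0.95):
    """Central credible interval for the true conversion rate (Beta posterior)."""
    samples = sorted(
        random.betavariate(1 + conversions, 1 + visitors - conversions)
        for _ in range(draws)
    )
    # Take the central `coverage` share of the sampled rates.
    lower = samples[int(draws * (1 - coverage) / 2)]
    upper = samples[int(draws * (1 + coverage) / 2) - 1]
    return lower, upper
```

For 50 conversions out of 1,000 visitors the 95% range is roughly 4% to 6.5%; with ten times the traffic the range narrows considerably, which is why more data makes overlapping variations easier to separate.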

Estimated Revenue

If you’re also tracking revenue in the conversion script, you will find Estimated Revenue in the detailed report. This number shows what your monthly revenue could be if you ran each variation at 100% visibility. The projection is derived from all revenue data collected during the test period, with the variation's relative increase in conversion applied. Use it as an additional indicator when deciding which TrustBox variation to keep implemented on your site.
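Trustpilot doesn't publish the exact projection formula, but the description above (revenue collected during the test, scaled by the relative increase in conversion) suggests arithmetic along these lines; the function and its parameters are illustrative assumptions:

```python
def projected_monthly_revenue(revenue_collected, test_days, visibility_share, conversion_lift):
    """Project a variation's monthly revenue at 100% visibility.

    visibility_share: fraction of test visitors who saw this variation.
    conversion_lift: the variation's relative increase in conversion, e.g. 0.10 for +10%.
    """
    # Scale observed revenue up to full traffic, then to a 30-day month,
    # then apply the variation's relative conversion lift.
    daily_at_full_visibility = revenue_collected / test_days / visibility_share
    return daily_at_full_visibility * 30 * (1 + conversion_lift)
```

For example, $7,000 collected over a 14-day test in which the variation was shown to half the visitors, with a +10% conversion lift, projects to about $33,000 per month at full visibility.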

Best Practices and Tips for Getting the Most out of the Optimizer

The TrustBox Optimizer is designed to help you learn and improve how you use TrustBoxes and reviews on your website. Every test you run is an opportunity to learn what works or doesn’t work for your business and your visitors - in that sense, there are no bad test results. It’s all about optimizing continuously until you find the most impactful solution.

Below we’ve compiled a list of tips to help you avoid pitfalls and get the most useful results as fast as possible. Also check out the example scenarios to see the Optimizer’s results in realistic situations, along with recommended ways to proceed.

Test duration and accuracy

  1. Aim to have 1,000 visitors per variation before you take action. The longer you run the test and the more data we collect, the more accurate the result will be.
  2. Don’t stop the test too early. It’s good practice to run the test for at least 2-3 full weeks.
  3. Don’t run the test too long. Running a test for longer than eight weeks introduces too much seasonality - make a decision based on the data so far and plan a follow-up test.
  4. Avoid introducing changes on your site or changing how you drive traffic to your site while you are running a split test as this may affect the test results.
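As a rough illustration of tip 1, you can estimate how long a test needs to run to collect 1,000 visitors per variation; the even traffic split between variations is an assumption:

```python
import math

def days_to_reach_target(daily_visitors, num_variations, target_per_variation=1000):
    """Days until each variation has been shown to the target number of visitors,
    assuming traffic is split evenly between the variations."""
    visitors_per_variation_per_day = daily_visitors / num_variations
    return math.ceil(target_per_variation / visitors_per_variation_per_day)
```

A two-variation test (e.g. a TrustBox vs. OFF) on a page with 500 daily visitors needs about 4 days of data; at 100 daily visitors it needs about 20 days, close to the recommended 2-3 week minimum.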

Do you have other TrustBoxes on your site?

Remember that you’re testing the effect of a specific TrustBox on your page - if you have other TrustBoxes on the same page or elsewhere on your site, they will also influence visitors throughout their customer journey.

Did your customers actually see the TrustBox?

Sometimes TrustBoxes are placed in a position where they don’t attract any attention. For example, showing your reviews directly on the page can have a much bigger impact than a small TrustBox in the footer.

Or, if TrustBoxes directly compete with a key CTA or conversion driver, they might end up reducing conversion, rather than improving it. For example, if you replace a key page module like “recently viewed” with reviews.

PRO TIP: Aim to position your TrustBox so that it’s visible and complements (rather than distracts from) a CTA or conversion driver.

Are you testing at a key decision point?

We've seen cases where visitors never reach the page where the TrustBox is placed, or where, at that particular point of the customer journey, people are already convinced to buy or use your services. Showing your reviews at key decision points will have a much bigger effect on conversion than hiding them away on a testimonials page.

PRO TIP: Combine conversion insights data with user research to get a deeper understanding of your customers’ behavior and better optimize your online business.

New vs. returning visitors

Returning visitors already know you and don’t need to be convinced to shop or use your services again in the same way as new customers would. For example, showing reviews on key landing pages where you send most of your new traffic can give new visitors a good reason to continue their customer journey rather than drop off.

Returning customers can be your brand advocates - encourage them to review your company using one of our review-collecting TrustBoxes, reassuring others that they can trust you.

Page speed influence

As a rule of thumb, it’s always good to aim to have a fast page load time. In addition to helping your SEO and user experience, it will also prevent potential complications around your Conversion Optimization Test. Since our TrustBoxes by default wait for your webpage to load before they are shown, a slow loading webpage can result in visitors not seeing the TrustBox before making a decision. Such a scenario would mean the TrustBox would not have an impact on conversion.

This is especially important if the page where you are testing the TrustBox requires a simple action from the user (e.g. Sign In) before they leave the page. In that case, it’s recommended to make sure that the page loads fast enough so the user can actually see the TrustBox before leaving the page.

Example Scenarios

Below we’ve listed a few scenarios that you could experience when running a Conversion Optimization Test with one or more TrustBoxes.

When the Chance to win is ca. 50-60%

Example test: TrustBox Mini vs. Not showing a TrustBox (OFF) in the footer of every page

Result: Chance to win is ca. 50-60% for one of the test variations. This usually means either variation could work equally well for your business.

Suggested next steps: Implement the variation you like the most and start planning a new Conversion Optimization Test.

Ideas for your next test: Reviews are a crucial part of the decision process when shopping or researching online, so it’s just a question of finding the place and/or TrustBox where they can have the biggest impact for your business.

  • Test another TrustBox or placement on the same page, e.g. placing the TrustBox closer to a call to action or testing with a TrustBox that shows reviews.
  • Try a similar test on a different page. Pick a key decision point where you drive most of your traffic to, and you see the biggest drop-offs.

When the Chance to win is above 70% for Variation OFF

Example test: TrustBox Carousel vs. Not showing a TrustBox (OFF) on product or category pages.

Result: Chance to win is above 70% for Variation OFF - this means you’ve just acquired valuable knowledge about your site’s visitors! Now you know that the specific place where you are using the TrustBox is not working as well as you probably thought it would - it may be that your site’s visitors don’t need additional social proof at this stage of their journey.

Suggested next step: Depending on when you feel comfortable making a decision, let the test run a little longer to reach a higher Chance to win, or stop the test and start planning a new one.

Ideas for your next test: Find the sweet spot in the customer journey where the TrustBox can have a bigger impact, e.g. a critical decision point to showcase your reviews where people need them the most. You can test another TrustBox or a different placement on the same page, or test at a different point in the customer journey:

  • If you’re having issues with getting visitors to add to basket - continue optimizing on your product or category pages.
  • If your customers are already adding to basket, but you have bigger issues with getting them to actually check out, then test on the checkout page.
  • Remember that on product pages your customers are mainly looking for product reviews, so run a test with one of our Product Review TrustBoxes.

When the Chance to win is above 70% for Variation A (or B)

Example test: TrustBox Drop-Down vs. Horizontal, and/or vs. not showing a TrustBox (OFF), in an area close to a CTA or decision point.

Result: Chance to win above 70% means that keeping the better performing TrustBox is likely to have a positive effect on conversion. It seems the specific place where you are using the TrustBox is working well, and your reviews are impacting your conversion rate positively.

Suggested next step: Keep the better performing TrustBox, and try optimizing further to see if you can push the conversion rate even higher.

Ideas for your next test: Test if showing reviews works better than just showing your overall TrustScore and stars. You could even push it further by trying to test reviews on a specific topic (e.g. delivery).
