Want to ensure that Googlebot crawls and indexes your service review content correctly? Follow the steps in this article to troubleshoot Rich Snippets for service reviews.
Ensure the TrustBox code is correct
Check that the TrustBox script and TrustBox widget code snippets are implemented correctly. The TrustBox must load normally on the page.
The SEO TrustBox must contain the SEO attribute.
Here’s a step-by-step video showing how to implement the SEO attribute correctly:
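The presence check can be automated. Below is a minimal offline sketch that confirms an HTML page contains both TrustBox pieces. The class name `trustpilot-widget` and the `widget.trustpilot.com` script URL follow Trustpilot's standard embed code, but treat them as assumptions and verify against the exact snippet you copied from your Trustpilot Business account.

```python
# Sketch: sanity-check that a page's HTML contains both TrustBox snippets.
# The sample markup below is illustrative; substitute your page's real HTML.

SAMPLE_HTML = """
<script type="text/javascript"
        src="//widget.trustpilot.com/bootstrap/v5/tp.widget.bootstrap.min.js"
        async></script>
<div class="trustpilot-widget" data-locale="en-US"
     data-template-id="TEMPLATE_ID" data-businessunit-id="BUSINESS_UNIT_ID">
</div>
"""

def has_trustbox(html: str) -> bool:
    """Return True only if both the TrustBox script and the widget div appear."""
    has_script = "widget.trustpilot.com" in html
    has_widget = 'class="trustpilot-widget"' in html
    return has_script and has_widget

print(has_trustbox(SAMPLE_HTML))  # True when both snippets are in place
```

A check like this catches the common mistake of pasting the widget div but omitting the bootstrap script (or vice versa), in which case the TrustBox won't load at all.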
Is your structured data error-free?
- Open Google’s Structured Data Testing Tool
- Paste the page link where the SEO TrustBox is implemented
- Click RUN TEST
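Before pasting the URL into Google's tool, you can run a quick offline pre-check that the page's JSON-LD at least parses and contains an `AggregateRating`. The sample markup below is illustrative only, not the exact JSON-LD a TrustBox emits.

```python
# Sketch: extract <script type="application/ld+json"> blocks from HTML and
# verify they parse and include an aggregateRating. Illustrative sample data.
import json
import re

SAMPLE_HTML = """
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization",
 "name": "Example Co",
 "aggregateRating": {"@type": "AggregateRating",
                     "ratingValue": "4.7", "reviewCount": "512"}}
</script>
"""

def jsonld_blocks(html: str) -> list:
    """Parse every JSON-LD script block; raises ValueError on malformed JSON."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return [json.loads(m) for m in re.findall(pattern, html, re.DOTALL)]

def has_aggregate_rating(blocks: list) -> bool:
    """True if any JSON-LD block carries an aggregateRating property."""
    return any("aggregateRating" in block for block in blocks)

blocks = jsonld_blocks(SAMPLE_HTML)
print(has_aggregate_rating(blocks))  # True
```

A parse failure here means Googlebot will also see broken structured data, so fix it before requesting a crawl. Google's tool remains the authoritative validator; this only catches the most basic errors early.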
Here’s a step-by-step video showing how to test the URL in Google’s Structured Data Testing Tool:
If Googlebot finds an error, it may stop crawling the page, so it's vital that your page is error-free.
Fetch and Render and Request Indexing are equally important
- Open Google Search Console
- Go to Crawl > Fetch as Google
- Perform both actions: Fetch and Render, and Request Indexing
Here’s a step-by-step video showing how to do Fetch and Render and Request Indexing:
Ensure the TrustBox is displayed identically to Googlebot and visitors
- To see how Google renders the TrustBox in the Google Search Console, click on a row to view the details of a fetch attempt
- Both pages must look identical
- If Googlebot can’t see the TrustBox, the page might be too slow to load or something is blocking Googlebot from crawling the content
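The comparison above can be reduced to one question: is the TrustBox element present in both versions of the page? Here is a minimal offline sketch, where the two strings stand in for the HTML your server sends a visitor and the rendered HTML shown for Googlebot in Search Console.

```python
# Sketch: compare the visitor-facing HTML with the HTML rendered for
# Googlebot and flag the case where only visitors can see the TrustBox.
# Both strings are placeholders for the real page sources.

visitor_html = '<div class="trustpilot-widget">reviews here</div>'
googlebot_html = '<div class="trustpilot-widget">reviews here</div>'

def trustbox_visible(html: str) -> bool:
    """True if the TrustBox widget container is present in the HTML."""
    return 'class="trustpilot-widget"' in html

def diagnose(visitor: str, googlebot: str) -> str:
    if trustbox_visible(visitor) and not trustbox_visible(googlebot):
        return "Googlebot can't see the TrustBox: check page speed and blocking rules"
    return "TrustBox visible to both"

print(diagnose(visitor_html, googlebot_html))  # TrustBox visible to both
```

If the diagnosis is the failure case, the usual culprits are the ones listed above: slow page load (the widget script never runs before rendering times out) or a rule blocking Googlebot from the widget's resources.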
Below is the list of resources that Googlebot couldn't retrieve.
Here's a step-by-step video showing how to view the rendered page:
Rich Snippet Stars appear shortly after the Indexing Request and then disappear until Google’s Normal Indexing method picks up the page.
According to Google’s Webmaster Trends Analyst John Mueller, there are essentially two ways to get indexed:
“...one is the normal indexing method where when pages get indexed they are more likely to stay in the index for longer periods of time. The second [is] "fast track" indexing where Google quickly puts it into the index, maybe because you used the submit to index feature in the fetch as Google in Google Search Console. Those pages may not stay in the index for long periods of time, not at least until the normal indexing method picks up the page.”
For more details, read Google Fast Track Indexing vs. Normal Indexing.
What blocks Googlebot from crawling and indexing?
- The page speed is low
- The page code is bloated
- The sitemap is too complex
- The TrustBox is placed too low on the page
- The TrustBox appears on too many pages (Rich Snippets abuse)
- There's duplicate content
- The URL parameters are restricted from indexing to avoid duplicate content
- The robots.txt file is blocking Googlebot
- The .htaccess file is incorrectly configured
- There are broken links, 404 errors, or incorrect redirects
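One blocker on this list, a robots.txt rule shutting out Googlebot, can be checked offline with the Python standard library. The robots.txt content below is an example; substitute your site's actual file, and use a URL where your TrustBox page actually lives.

```python
# Sketch: test whether a robots.txt rule blocks Googlebot from a given URL,
# using only the standard library. Example rules and URLs; replace with yours.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /reviews/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A TrustBox page under /reviews/ would never be crawled with this rule:
print(parser.can_fetch("Googlebot", "https://example.com/reviews/service"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))            # True
```

If `can_fetch` returns `False` for the page carrying your SEO TrustBox, no amount of re-requesting indexing will help until the disallow rule is removed or narrowed.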
“Recrawling is not immediate or guaranteed. It typically takes several days for a successful request to be granted. Also, understand that we can't guarantee that Google will index all your changes, as Google relies on a complex algorithm to update indexed materials.” - From Ask Google to recrawl your URLs
“Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages.” - From How Google Search Works
“Markup should not be used to hide content not visible to users in any form, since it might create a misleading or deceptive search experience.” - From Structured data quality guidelines