Why this issue is happening
One or more of your products uses the link [link] attribute to specify a landing page that Google cannot crawl, because your robots.txt file forbids Google's crawler from downloading the landing page when crawling with a mobile user agent. This prevents Google from performing the automated quality and policy checks on product landing pages that ensure a seamless user experience. Affected products will stop showing in Shopping ads and free product listings on desktop and mobile devices until Google is able to crawl the landing page.
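For example, a robots.txt file containing rules like the following would block Google's crawler from fetching any page on the site; a narrower rule such as "Disallow: /products/" would block only landing pages under that path. Both are hypothetical configurations shown for illustration, not your actual file:
User-agent: *
Disallow: /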
How to fix your issue
To resolve this issue, update the robots.txt file on your web server to allow Google's crawler to fetch the provided landing pages. The robots.txt file can usually be found in the root directory of the web server (for example, http://www.example.com/robots.txt). In order for us to access your whole site, ensure that your robots.txt file allows both the "Googlebot" user agent (used for landing pages) and the "Googlebot-image" user agent (used for images) to crawl your site. Add the following lines to your robots.txt file:
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
An empty Disallow directive means that nothing is disallowed for that user agent, so Google's crawler can fetch every page. Learn more about how to configure robots.txt. You can test your current configuration with the URL Inspection tool; make sure to select "Smartphone" as the type of fetch.
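If you prefer to check programmatically, one option is Python's standard-library urllib.robotparser module. The sketch below uses the example.com URLs from this article as placeholders and confirms whether both user agents are allowed to fetch a landing page. Note that Google's own parser may interpret some rules (such as wildcards) differently, so the URL Inspection tool remains the authoritative test:
from urllib import robotparser
# Placeholders from this article; substitute your own robots.txt and landing page URLs.
ROBOTS_URL = "http://www.example.com/robots.txt"
LANDING_PAGE = "http://www.example.com/products/example-product"
parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # downloads and parses the robots.txt file
# Check the two user agents mentioned in this article.
for user_agent in ("Googlebot", "Googlebot-image"):
    allowed = parser.can_fetch(user_agent, LANDING_PAGE)
    print(f"{user_agent}: {'allowed' if allowed else 'blocked'} for {LANDING_PAGE}")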
If you fix these issues and update your items via a new feed upload or the Content API, the errors you see should disappear within a couple of days. This time allows us to verify that we can access the landing pages that you provided, after which the items will start showing in Shopping ads and free listings again. If you want to speed up the process, you can increase Google's crawl rate.
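If you update items through the Content API for Shopping, resubmitting a product after the robots.txt fix might look like the sketch below, which uses the Google API Python client. The merchant ID, credentials file, and product attributes are placeholders; your own feed or integration may structure the call differently:
from google.oauth2 import service_account
from googleapiclient.discovery import build
# Placeholder credentials and merchant ID; replace with your own.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/content"],
)
service = build("content", "v2.1", credentials=credentials)
product = {
    "offerId": "example-sku-123",
    "title": "Example product",
    "link": "http://www.example.com/products/example-product",  # the now-crawlable landing page
    "contentLanguage": "en",
    "targetCountry": "US",
    "channel": "online",
    # ...price, availability, imageLink, and other required attributes
}
# Re-inserting the product prompts a fresh review of the landing page.
response = service.products().insert(merchantId=123456789, body=product).execute()
print(response["id"])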
Next steps
After making the requested changes, check that you’ve fixed the issue by making sure it’s no longer listed on the “Needs attention” page.
Keep in mind: It may take some time for your change to be reflected on the “Needs attention” page.