What's the problem?
Some of your items specify a landing page (via the 'link' attribute) that cannot be crawled by Google because robots.txt forbids Google's crawler from downloading the landing page when it crawls with a mobile user agent. These items will stop showing in Shopping ads on desktop and mobile devices until we are able to crawl the landing page.
Why should you fix this?
Shopping ads users expect the information on your landing pages to match what is shown in Shopping ads. To ensure this seamless user experience, we perform automated quality and policy checks on product landing pages. These checks require us to download the landing pages with Google's crawling system.
How can you fix this?
Update the robots.txt file on your web server to allow Google's crawler to fetch the provided landing pages. The robots.txt file is usually located in the root directory of the web server (e.g. http://www.example.com/robots.txt). To give us access to your whole site, make sure your robots.txt file allows both user agents 'Googlebot' (used for landing pages) and 'Googlebot-image' (used for images) to crawl your site. You can do this by changing your robots.txt file as follows:
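For example, the following directives permit both crawlers (an empty 'Disallow:' line allows everything for that user agent; keep any rules for paths you intentionally block, and adjust to your site's needs):

```
User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:
```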
If you have fixed these issues and updated your items via a new feed upload or the Content API, the errors should disappear within a couple of days. This time allows us to verify that we can access the provided landing pages, after which the items will start showing in Shopping ads again. To speed up the process, you can increase Google's crawl rate.
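Before re-uploading your feed, you can check locally whether your robots.txt rules actually allow Google's crawlers, using Python's standard urllib.robotparser module. The robots.txt content and URLs below are hypothetical placeholders; substitute your own file and landing-page URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with your site's actual file.
robots_txt = """
User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Landing pages must be fetchable by 'Googlebot',
# product images by 'Googlebot-image'.
print(parser.can_fetch("Googlebot", "http://www.example.com/products/item1"))
print(parser.can_fetch("Googlebot-image", "http://www.example.com/images/item1.jpg"))
```

If either call prints False, the corresponding crawler is still blocked and the items will keep failing the landing-page check.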