Some of your products specify a landing page (via the link [link] attribute) that cannot be crawled by Google because your robots.txt file prevents Google's crawler from downloading it. These products will remain disapproved and will not show in your Shopping ads and free product listings until we are able to crawl the landing page.
Update the robots.txt file on your web server to allow Google's crawler to fetch the provided landing pages. The robots.txt file can usually be found in the root directory of the web server (for example, http://www.example.com/robots.txt).
To give us access to your whole site, ensure that your robots.txt file allows both user agents 'Googlebot' (used for landing pages) and 'Googlebot-image' (used for images) to crawl your full site.
You can allow a full-site crawl by changing your robots.txt file as follows:
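For example, the following robots.txt rules allow both crawlers to fetch every page; an empty Disallow line means nothing is blocked for that user agent (this is a minimal sketch — keep any other rules your site needs):

User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:

If your file already contains a broad rule such as "Disallow: /" under "User-agent: *", these more specific user-agent groups take precedence for Google's crawlers.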
If you have fixed these issues and updated your products via a new feed upload or the Content API, the errors shown here should disappear within a couple of days. This time allows us to verify that we can crawl the provided landing pages, after which the products will start showing in your Shopping ads and free listings again. To speed up the process, you can increase Google's crawl rate.