How to fix: Robots.txt error

Update your robots.txt file to allow the user agent "Googlebot" to crawl the pages on your site that contain your 3D assets

Google is unable to crawl your website and gather information about your 3D assets. To give Google access to your content, make sure that your robots.txt file allows the user agent "Googlebot" to crawl your site.

You can do this by adding the following lines to your robots.txt file:

User-agent: Googlebot
Disallow:

An empty Disallow line means nothing is blocked, so Googlebot may crawl the entire site.

Learn more about the Robots Exclusion Protocol

Once you upload and test your robots.txt file, Google's crawlers will find and start using it automatically; you don't have to do anything else. If you updated your robots.txt file and need Google's cached copy refreshed as soon as possible, learn how to submit an updated robots.txt file.
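Before uploading, you can sanity-check the rule locally with Python's standard urllib.robotparser module. This is a minimal sketch, not a Google tool; the file content and example URL below are assumptions for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content using the rule recommended above:
# an empty Disallow line permits Googlebot to crawl everything.
robots_txt = """\
User-agent: Googlebot
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch returns True when the named user agent may crawl the URL.
# The URL here is a made-up example of a page hosting a 3D asset.
print(parser.can_fetch("Googlebot", "https://example.com/models/chair.glb"))
```

In a live check you could instead call parser.set_url("https://example.com/robots.txt") followed by parser.read() to fetch and test the deployed file.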

Google may use information obtained from these or other user agents to improve its Google Ads systems.
