Google is unable to crawl your website and gather information about your 3D assets. To give Google access to your content, make sure that your robots.txt file allows the user agent "Googlebot" to crawl your site.
You can do this by adding the following lines to your robots.txt file:
User-agent: Googlebot
Disallow:
Learn more about the Robots Exclusion Protocol.
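One quick way to confirm that these directives behave as intended is to check them with Python's standard-library robots.txt parser. This is a minimal sketch: the `robots_txt` content and the example URL are placeholders, not part of any real site.

```python
from urllib.robotparser import RobotFileParser

# The two directive lines suggested above: an empty Disallow for
# Googlebot means Googlebot may crawl every path on the site.
robots_txt = """User-agent: Googlebot
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical asset URL, used only to illustrate the check.
allowed = parser.can_fetch("Googlebot", "https://example.com/assets/model.glb")
print(allowed)
```

Running this prints `True`, confirming that a blank `Disallow:` line under `User-agent: Googlebot` permits crawling of any URL on the site.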
Once you upload and test your robots.txt file, Google's crawlers will find it and start using it automatically; you don't have to do anything else. If you update your robots.txt file and need Google's cached copy refreshed as soon as possible, learn how to submit an updated robots.txt file.
Google may use information obtained from this and other user agents to improve our Google Ads systems.