URL timeout errors

We received a timeout when we tried to access the page. Make sure the page is accessible. The errors you may see include:

DNS lookup timeout: We received a timeout during the DNS lookup.

URL timeout: We received a timeout while connecting to your web server or during the request.

robots.txt timeout: The server timed out when we tried to access your robots.txt file. Before crawling the pages of your site, we check your robots.txt file to ensure we don't crawl any pages that you have blocked. However, we received a timeout when we tried to access the file. To make sure we didn't crawl any pages listed in it, we postponed the crawl. When this happens, we return to your site later and crawl it once we can reach your robots.txt file. Note that this is different from a 404 response when looking for a robots.txt file: if we receive a 404, we assume that no robots.txt file exists and we continue the crawl.
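The robots.txt decision described above can be sketched as a small function. This is an illustration of the documented behavior, not the crawler's actual code; the function and return values are hypothetical names chosen for clarity.

```python
def robots_fetch_outcome(result):
    """Decide the crawl action from the result of fetching robots.txt.

    `result` is either the string "timeout" or an HTTP status code.
    Mirrors the behavior described above (names are illustrative):
      - timeout -> postpone the crawl and retry later
      - 404     -> assume no robots.txt exists and continue crawling
      - 200     -> parse the rules and crawl accordingly
    """
    if result == "timeout":
        return "postpone"          # can't verify the rules, so don't crawl yet
    if result == 404:
        return "crawl"             # no robots.txt; nothing is blocked
    if result == 200:
        return "crawl-with-rules"  # obey the rules in the fetched file
    return "postpone"              # other failures: stay conservative
```

The key asymmetry is that a timeout is ambiguous (the file may exist but be unreachable), so the crawl is postponed, while a 404 is a definitive answer that no rules exist.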