Oct 10, 2020

Indexing Request results in Quota Exceeded

For many years I have been manually submitting new URLs (URL Inspection > Request Indexing) without incident. My understanding is that I am allowed about 50 a day, and when I hit that limit and see the "Quota exceeded, please try submitting this again tomorrow" message, I must wait some time (maybe 24 hours) before I can submit more. I understand that and have been doing it for a long time, so I know how it works.

It seems that in the last week or so I can only submit a few URLs (maybe 5-10) before seeing the message, even if I have waited more than 24 hours. Sometimes only 2-3 can be submitted before I see it.

I am wondering if something has changed lately with this process, and what I can do to get back to being able to submit about 50 or so URLs a day.
Locked
This question is locked and replying has been disabled.
Recommended Answer
Oct 10, 2020
Hi
The documentation doesn't give a number; it just says that there is a limit:
"There is a per-property daily limit of live inspections."
The URL Inspection tool is made for troubleshooting, not for manually indexing a bunch of URLs.

It is fine if you need to index a few URLs.
If you have more URLs, then it is recommended to use / update sitemaps.
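For reference, a minimal sitemap file looks roughly like this (the URL and date below are just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-page</loc>
    <lastmod>2020-10-10</lastmod>
  </url>
</urlset>

Keeping <lastmod> accurate when a page is added or changed helps signal that a recrawl is worthwhile.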
Last edited Oct 10, 2020
Diamond Product Expert Travler. recommended this
All Replies (4)
Oct 10, 2020
I add about 100-150 new URLs a week to this site, and for a very long time I was able to submit the URLs one by one from beginning to end without incident.

It takes a while, so put on a good podcast or something, get yourself dialed in, sit back, and submit them, and you will get very good results from Google search, often within minutes after submitting.

This is important since my competitors are getting the same data feeds I am and I want to show up in the search ahead of them.

Then, months ago, something changed and I could only do 50 a day, so I had to spread the manual submits over 2 days, which was not too bad, but I got it done.

Now I can only do a few a day.

If you want new URLs to show up quickly and favorably, submit them and request indexing one by one by hand.

If you want to wait days/weeks (and often forever) for URLs to show up in Google search while your competitors show up ahead of you, wait for Google to digest your sitemap.

My sitemap has 50,000+ URLs for Google to look at and I can't wait for that to happen.
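(A set that size already runs into the sitemap protocol's cap of 50,000 URLs per file, so it has to be split across a sitemap index, along these lines; the file names and dates are just placeholders:)

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-part1.xml</loc>
    <lastmod>2020-10-10</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-part2.xml</loc>
    <lastmod>2020-10-10</lastmod>
  </sitemap>
</sitemapindex>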

If the tool was for troubleshooting, why is there a message that says "Page changed? Request Indexing"?
The Help says "Request indexing for a URL: You can request that an URL be crawled (or recrawled) by Google."
It says nothing about troubleshooting.

I know what works and I will keep doing what works. I may have found a solution/workaround, but since I'm not supposed to be doing that, I'll keep that idea to myself.

I now see other topics in the community about this new issue of not being able to do more than a few a day lately.

I don't care if it is recommended or not. I should do it another way, like a sitemap? Fat chance of that working in minutes like I'm used to.
I've seen things in the sitemap never show up in a Google search, but if I manually submit them they show up almost instantly.

I go with what works.
Oct 14, 2020
The Google "submit for indexing" tool has been temporarily disabled:

"We have disabled the "Request Indexing" feature of the URL Inspection Tool, in order to make some infrastructure changes. We expect it will return in the coming weeks. We continue to find & index content through our regular methods."

Google crawls pages on its own; there is no need to request crawling manually. The submit-to-index tool is designed for occasional use, not for every new page.
Oct 15, 2020
So it seems we now have an "answer", or an explanation for what has changed, which is what I wanted to know. In fact, now we know that something did change.

Let me assure you, with 100% certainty and after 10+ years of experience, that if you want things like new URLs and changes to existing URLs to show up in a Google search in a relatively short amount of time, the way to do it is Inspect and Request Indexing.

If you want to wait days/weeks/months (or forever) for these things to show up, use only a sitemap and you will be very disappointed in the results.

Of course I have sitemaps that I submit regularly too, but after many years of experiments and days/weeks/months of waiting for new things and changes from a sitemap to show up in a Google search, a sitemap is not as effective as Inspect and Request Indexing.

I have URLs in a sitemap that have never shown up in a Google search, and when I spot them and use Inspect and Request Indexing, they will (or used to) show up favorably within minutes.

From the link:

C. Submit new and updated URLs to Google
While you can merely host your sitemap on your site for our systems to discover, you can also provide notification about new URLs or existing URLs that have changed content.

Nothing is mentioned that says that feature is supposed to be used for troubleshooting.

That's what works, that's what we want to do, and we can't wait for them to complete their infrastructure changes and for the feature to return "in the coming weeks".
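(For completeness, the "notification" that section C describes comes down to either resubmitting the sitemap in Search Console or sending Google a ping with the sitemap address; a rough sketch of the ping, using a placeholder sitemap URL and the ping endpoint as documented at the time:)

import urllib.parse
import urllib.request

# Placeholder sitemap address; swap in the real one.
sitemap_url = "https://www.example.com/sitemap.xml"

# Google's sitemap ping endpoint (as documented at the time).
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as resp:
    # A 200 response only means the ping was received, not that anything was crawled or indexed.
    print(resp.status)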
Last edited Oct 15, 2020