Recently, I've been frustrated that the almighty Googlebot wasn't indexing the pages of my startup's blog fast enough.
I checked that all the sitemaps were submitted via Google Webmaster Tools, verified robots.txt, and looked at Google's crawl rate.
Everything looked fine on my end. Weirder still, the crawl rate was healthy—Google was crawling my site daily.
I figured that because the site is new (less than six months old), I would just have to wait until it had more maturity and authority—surely then the indexing would sort itself out.
At the time I didn't resubmit the sitemap, because conventional wisdom says that resubmitting the same sitemap doesn't do anything—Google checks for updates automatically. In addition, I have WordPress set up to automatically notify Google, Bing, Yahoo, and Ask.com whenever a new post is added.
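For the curious, those automatic notifications boil down to a simple HTTP "ping" with the sitemap URL as a query parameter. Here's a minimal Python sketch of how that works—the sitemap URL is a placeholder, and the endpoints shown are the Google/Bing ping URLs as they worked at the time (not an exact reproduction of what any particular WordPress plugin does):

```python
from urllib.parse import quote

# Placeholder sitemap URL -- substitute your own.
SITEMAP_URL = "https://example.com/sitemap.xml"

# Ping endpoints take the sitemap URL, percent-encoded, as a query parameter.
PING_ENDPOINTS = {
    "google": "https://www.google.com/ping?sitemap={}",
    "bing": "https://www.bing.com/ping?sitemap={}",
}

def build_ping_urls(sitemap_url):
    """Return the full ping URL for each search engine."""
    encoded = quote(sitemap_url, safe="")
    return {name: tpl.format(encoded) for name, tpl in PING_ENDPOINTS.items()}

if __name__ == "__main__":
    for name, url in build_ping_urls(SITEMAP_URL).items():
        print(name, url)
        # To actually send the notification, you'd issue an HTTP GET, e.g.:
        # urllib.request.urlopen(url)
```

Note that a ping only tells the crawler the sitemap changed; as my experience below suggests, it doesn't guarantee the pages actually get indexed.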
Then yesterday I resubmitted the "posts" sitemap via Google Webmaster Tools.
Lo and behold, today I found that all 18 of the missing posts had gotten indexed overnight.
Here is a time series graph of the number of pages indexed:
I was quite surprised, since I hadn't expected that simply resubmitting the sitemap would magically get all the new posts indexed.
I'm not sure why the new posts stopped getting indexed, nor why resubmitting the sitemap fixed it. It may be a bug in Google's crawler, or some undocumented "feature" I was running into.
I feel pretty stupid that it took me this long to get this fixed, given how simple the solution turned out to be.
But if you are having issues with some of your new posts or pages not getting indexed, why not try resubmitting the sitemap?