SEO - Search Engine Optimization

XML Sitemaps: The Most Misunderstood Tool in the SEO’s Toolbox

Last Updated: April 12, 2017

Posted by MichaelC

In all my years of SEO consulting, I've seen many clients with wild misconceptions about XML sitemaps. They're a powerful tool, for sure; but like any power tool, a little training and background on how all the bits work goes a long way.

Indexation

Probably the most common misconception is that the XML sitemap helps get your pages indexed. The first thing we've got to get straight is this: Google does not index your pages just because you asked nicely. Google indexes pages because (a) they found them and crawled them, and (b) they consider them good enough quality to be worth indexing. Pointing Google at a page and asking them to index it doesn't really factor into it.

Having said that, it is important to note that by submitting an XML sitemap to Google Search Console, you're giving Google a clue that you consider the pages in the XML sitemap to be good-quality search landing pages, worthy of indexation. But it's just a clue that the pages are important, much like linking to a page from your main menu is.

Consistency

One of the most common mistakes I see clients make is a lack of consistency in their messaging to Google about a given page. If you block a page in robots.txt and then include it in an XML sitemap, you're being a tease. "Here, Google: a nice, juicy page you really ought to index," your sitemap says. But then your robots.txt takes it away. The same thing goes for meta robots: don't include a page in an XML sitemap and then set meta robots to "noindex,follow."

While I'm at it, let me rant briefly about meta robots: "noindex" means don't index the page. "Nofollow" means nothing about that page; it means "don't follow the links outbound from that page," i.e. go ahead and flush all that link juice down the toilet. There's probably some obscure reason out there for setting meta robots to "noindex,nofollow," but it's beyond me what that might be.
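To make the robots.txt tease concrete, here is a sketch of the kind of contradictory configuration described above (the domain and URL path are made up for illustration):

```
# robots.txt (hypothetical example.com) -- tells Google it may not crawl the page:
User-agent: *
Disallow: /category/widgets/

<!-- ...while the same site's sitemap.xml nominates the blocked page for indexing: -->
<url>
  <loc>https://www.example.com/category/widgets/</loc>
</url>
```

Google can't crawl a page it's disallowed from fetching, so it never sees the content you just told it was worth indexing. Pick one message and stick to it.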
If you want Google to not index a page, set meta robots to "noindex,follow." OK, rant over.

In general, then, you want every page on your site to fall into one of two buckets:

1. Utility pages: useful to users, but not anything you'd expect to be a search landing page
2. Yummy, high-quality search landing pages

Everything in bucket #1 should either be blocked by robots.txt or blocked via meta robots "noindex,follow," and should not be in an XML sitemap. Everything in bucket #2 should not be blocked in robots.txt, should not have meta robots "noindex," and probably should be in an XML sitemap.

(Bucket image, prior to my decorating them, courtesy of Minnesota Historical Society on Flickr.)

Overall site quality

It would appear that Google…
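As a sketch, the two-bucket rule translates into markup roughly like this (the URLs and page names are hypothetical, chosen only to illustrate the pattern):

```
<!-- Bucket #1: a utility page (e.g. a login page) -- keep it out of the
     XML sitemap, and tell Google not to index it while still following
     its outbound links: -->
<meta name="robots" content="noindex,follow">

<!-- Bucket #2: a search landing page -- not blocked anywhere, no "noindex",
     and listed in the XML sitemap: -->
<url>
  <loc>https://www.example.com/blue-widgets/</loc>
</url>
```

The point is that each page sends one consistent signal: either "don't index me" everywhere, or "index me" everywhere.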

Source: XML Sitemaps: The Most Misunderstood Tool in the SEO’s Toolbox

About the author

S K Routray

S K Routray is a computer science graduate and co-founder at Gracioustech.com. He has worked as an online marketing lead at several multinational companies. He has a passion for writing on SEO techniques, social media marketing, and digital marketing. If he weren't an online marketer, he'd take his love for food and become a great chef and hotel entrepreneur. Join the NAS Writers team to write for NAS.
