Yahoo’s Robots-Nocontent Tag
Most bloggers make use of SEO techniques like robots.txt, nofollow links, and AdSense section targeting to properly direct (or block) the attention of search engines. Earlier this month, Yahoo unveiled the newest addition to this list with the introduction of robots-nocontent.
This new class value allows bloggers and webmasters to deemphasize areas of a page as irrelevant content. All you do is add class="robots-nocontent" to your HTML like so: <div class="robots-nocontent">Navigation, menus, boilerplate text, ads, etc.</div>
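To make the markup concrete, here is a sketch of how a typical blog template might apply the class. The element names and page structure below are illustrative, not taken from Yahoo's documentation; the only real ingredient is the robots-nocontent class itself, which Yahoo's announcement says can be placed on ordinary HTML elements such as div or span:

```html
<body>
  <!-- Marked as boilerplate: Yahoo's crawler deemphasizes this block -->
  <div class="robots-nocontent">
    <ul id="nav">
      <li><a href="/">Home</a></li>
      <li><a href="/archives">Archives</a></li>
    </ul>
  </div>

  <!-- No special class here, so the post is indexed normally -->
  <div id="post">
    <h1>Post title</h1>
    <p>Unique, relevant content goes here.</p>
  </div>

  <!-- The class can be combined with existing classes on the same element -->
  <div class="sidebar robots-nocontent">Ad blocks, blogrolls, etc.</div>
</body>
```

Because class names are space-separated in HTML, adding robots-nocontent alongside an existing class (as in the sidebar above) does not disturb your stylesheet.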
Putting this in place gives Yahoo the signal that the marked sections should not be factored in for ranking purposes. Instead, only the unique, relevant content on your pages is counted in the algorithm. Yahoo gets a more polished index and uses less bandwidth; you get higher and more relevant rankings. Everyone wins.
Keep in mind, though, that robots-nocontent has yet to be publicly adopted by the other search engines, particularly Google, so it won’t do anything to help your non-Yahoo rankings. It’s doubtful that using it would do any harm, however, and Yahoo is still the internet’s #2 search engine, so using robots-nocontent is still a good idea from an SEO standpoint.
6 Responses to “Yahoo’s Robots-Nocontent Tag”
Stephen, thanks for sharing that info. I wonder how long it will take for Google to adopt this. Once that happens, this feature will be a must for all websites.
By the way, guys, Stephen is one of the guest bloggers who will be participating on DBT.
Glad to be here, Daniel. To answer your question, there's no saying when Google might adopt robots-nocontent, if at all. Google has a strong tendency to be the search industry leader. For example, Google invented the sitemaps protocol, which MSN and Yahoo then had to start supporting because of its popularity. Google may decide to ignore robots-nocontent completely, or even produce its own alternative rather than yield its coveted thought leadership to Yahoo. Only time will tell.
If different search engines start having different rules, wouldn't it be harder for publishers like us to create content?
It would be hard if another search engine robot, like Microsoft's, came up with its own rules.
Keith, that is true. In fact, sometimes it is a good thing that Google has such a giant market share of the search industry. It allows them to create standards for the whole industry.
To answer your question, Keith: yes, it would make things harder. Since the robots-nocontent directive is relatively new, however, and hasn't been widely adopted (primarily because Google doesn't support it), Google might decide it's better to create its own standard.
On that note, if a competing directive is going to pop up, I predict it will pop up sooner rather than later. The longer Google and MSN let Yahoo have the stage, the less likely they’ll be able to compete with Yahoo’s mindshare.
How do I edit robots.txt on Blogspot?
Comments are closed.