Big Three Search Engines Back Single, Open Protocol
A trio of search titans is joining forces to make it easier for people to find information on the Web. Google, Yahoo and Microsoft on Thursday agreed to support a single open protocol to make Web sites more visible to search engines.
Sitemaps 0.90 is freely available to webmasters. The protocol is designed to make it easier for them to notify search engines about their sites and to have their pages indexed more comprehensively and efficiently. The result: better representation in search indices for webmasters and more relevant results for searchers, according to the companies.
"Sitemaps address the challenges of a growing and dynamic Web by letting webmasters and search engines talk to each other, enabling a better Web crawl and better results," said Narayanan Shivakumar, distinguished entrepreneur with Google.
"Our initial efforts have provided webmasters with useful information about their sites, and the information we've received in turn has improved the quality of Google's search," Shivakumar added.
Anatomy of Sitemaps
A Sitemap is an XML file, placed on a Web site, that tells search engines which pages are available for crawling. It aims to help webmasters make their sites more search engine-friendly by allowing them to list all of their URLs along with optional metadata, such as the last time each page changed, to improve how search engines crawl and index their Web sites.
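As a sketch, a minimal Sitemap file under the 0.90 protocol might look like the following (the URLs and dates are hypothetical; `lastmod`, `changefreq` and `priority` are optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-16</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/catalog?item=42</loc>
    <lastmod>2006-11-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry names one page via `<loc>`; the optional tags give crawlers hints about freshness and relative importance within the site.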
With Sitemaps 0.90, webmasters can now universally submit their content in a uniform manner. Any webmaster can submit their Sitemap to any search engine that has adopted the protocol. The Sitemaps protocol used by Google has been widely adopted by many Web properties, including sites from the Wikimedia Foundation.
"At industry conferences, webmasters have asked for open standards just like this," said Danny Sullivan, editor-in-chief of Search Engine Watch. "This is a great development for the whole community and addresses a real need of webmasters in a very convenient fashion. I believe it will lead to greater collaboration in the industry for common standards, including those based around robots.txt, a file that gives Web crawlers direction when they visit a Web site."
Any company that manages dynamic content and a lot of Web pages can benefit from Sitemaps. Take, for example, a company that utilizes a content management system to deliver custom Web content -- such as pricing, availability and promotional offers -- to thousands of URLs. If it places a Sitemap file on its Web servers, search engine crawlers will be able to discover what pages are present, determine which have recently changed, and crawl them accordingly.
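A minimal sketch of how such a company might generate its Sitemap from a content management system. The page list and URLs here are hypothetical stand-ins for whatever the CMS actually tracks:

```python
from datetime import date
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Render a list of (url, last_modified) pairs as Sitemap XML."""
    entries = []
    for url, lastmod in pages:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<urlset xmlns="{SITEMAP_NS}">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

# Hypothetical pages pulled from a content management system.
pages = [
    ("http://www.example.com/products/1", date(2006, 11, 10)),
    ("http://www.example.com/products/2", date(2006, 11, 15)),
]
print(build_sitemap(pages))
```

Regenerating the file whenever the CMS publishes a change keeps the `lastmod` dates current, which is what lets crawlers prioritize recently updated pages.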
Sitemaps allow new links to reach search engine users more rapidly by informing "spiders" and helping them to crawl more pages and discover new content faster, the companies said. This could also drive online traffic and make search engine marketing more effective by delivering better results to users.
"The launch of Sitemaps is significant, because it allows for a single, easy way for Web sites to provide content and metadata to search engines," said Tim Mayer, senior director of product management, Yahoo Search. "Sitemaps helps webmasters surface content that is typically difficult for crawlers to discover, leading to a more comprehensive search experience for users."
Worth the Hype?
The protocol will be available at sitemaps.org, and the companies plan to have Yahoo Small Business host the site. Any site owner can create and upload an XML Sitemap and submit the URL of the file to participating search engines.
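Beyond uploading the file, the protocol also describes notifying a participating engine by fetching a simple HTTP "ping" URL that carries the Sitemap's location as an escaped parameter. A sketch in Python, with a hypothetical endpoint (consult each engine's own documentation for its actual submission interface):

```python
from urllib.parse import quote

def ping_url(search_engine_ping, sitemap_url):
    """Build the HTTP GET URL used to tell a search engine where a
    Sitemap lives; the Sitemap URL itself must be percent-escaped."""
    return search_engine_ping + quote(sitemap_url, safe="")

# Hypothetical ping endpoint and Sitemap location.
url = ping_url("http://www.example-engine.com/ping?sitemap=",
               "http://www.example.com/sitemap.xml")
print(url)
```

Fetching the resulting URL (for example with any HTTP client) is all a site owner's software needs to do to signal that the Sitemap has been updated.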
However, Jason Dowdell, who operates MarketingShift, a blog focused on media research and technology, is skeptical, calling the announcement much ado about nothing. Dowdell doesn't deny that it will make it easier for businesses to get their Web sites indexed, but he told TechNewsWorld that it won't benefit end users much and isn't as comprehensive on the webmaster end as it should be.
"You would think that by signing up for this Sitemaps.org program you could access all the information about what the search engine bots are doing on your site, and other statistics, on a single platform," Dowdell noted. "But you still have to log into your Google or Yahoo accounts to do that. So this really doesn't save webmasters much time on the reporting side."