
April 11, 2007

Sitemaps Autodiscovery

Today, Ask.com, Google, Microsoft Live Search and Yahoo! together are announcing support of “autodiscovery” of Sitemaps. The new open-format autodiscovery allows webmasters to specify the location of their Sitemaps within their robots.txt file, eliminating the need to submit sitemaps to each search engine separately.
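As a minimal illustration (example.com is a placeholder domain), the only change a webmaster needs is a single Sitemap line anywhere in the robots.txt file at the root of the site:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap directive is independent of the user-agent sections, so one line covers all crawlers that support the protocol.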

Comprehensiveness and freshness are key initiatives for every search engine, and with autodiscovery of sitemaps, everyone wins:

· Webmasters save time by submitting their content to all participating search engines at once, and benefit from reduced unnecessary crawler traffic
· The search engines learn which pages to index, along with metadata indicating which pages have recently been updated and which are most important
· Searchers benefit from an improved search experience through better comprehensiveness and freshness

In addition, Ask.com is now supporting submission of Sitemaps via http://submissions.ask.com/ping?sitemap=SitemapUrl.
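The ping endpoint above takes the Sitemap's own URL as a query parameter, so that URL must itself be percent-encoded. A minimal sketch in modern Python (the example.com address is a placeholder):

```python
from urllib.parse import quote

ASK_PING_ENDPOINT = "http://submissions.ask.com/ping?sitemap="

def ask_ping_url(sitemap_url):
    """Build the Ask.com ping URL for a given Sitemap location.

    The Sitemap URL is percent-encoded (including ':' and '/') so it
    survives as a single query-string parameter.
    """
    return ASK_PING_ENDPOINT + quote(sitemap_url, safe="")

print(ask_ping_url("http://www.example.com/sitemap.xml"))
# → http://submissions.ask.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Fetching the resulting URL with any HTTP client (a browser, curl, or urllib) is all that is needed to notify the crawler.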

Of course, neither autodiscovery nor manual submission guarantees that pages will be added to the index. Pages must meet our quality criteria for inclusion, and use of these submission methods does not influence ranking.

I will be talking about today’s announcement (along with my counterparts at Google, Microsoft and Yahoo!) during the SiteMaps and Site Submission session at SES in New York later this morning. If you aren’t able to join us, more information is available at http://www.sitemaps.org and http://about.ask.com/en/docs/about/webmasters.shtml#22.

We are excited about our participation with the Sitemaps via robots.txt protocol and look forward to our collaboration with Google, Microsoft, Yahoo! and others in furthering important initiatives that make search easier for webmasters and more powerful for users.

Vivek Pathak
Infrastructure Product Manager
Ask.com
 

Posted by Ask Blog Editor | Permalink

TrackBack

TrackBack URL for this entry:
https://www.typepad.com/services/trackback/6a00d8341c539153ef00d8341c9c5953ef

Listed below are links to weblogs that reference Sitemaps Autodiscovery:

» Search Engines Unite On Sitemaps Autodiscovery from Search Engine Land: News About Search Engines & Search Marketing
Last November, Google, Microsoft and Yahoo united to support sitemaps, a standardized method of submitting web pages through feeds to the search engines. Today, the three are now joined by Ask.com in supporting the system and an extension of it called ... [Read More]

Tracked on Apr 11, 2007 5:49:50 AM

» Sitemaps Auto-Discovery from Minefeed.com
The big search players Google, Yahoo, Ask and MSN together announced they'll be adding a new way [...] [Read More]

Tracked on Apr 11, 2007 8:45:44 AM

» Sitemaps.org updated from devBlog
The sitemaps site had not been updated since November 2006, but it was updated today, as announced on the Official Google Webmaster Central Blog. Now the site is available in 18 languages, and the protocol has been updated to let the webmaster add the... [Read More]

Tracked on Apr 11, 2007 9:27:56 AM

» Put the link to your Sitemap in your robots.txt file from SEO 网站优化推广
The major search engines have announced a new way to support Sitemaps: including a link to the sitemap file directly in the robots.txt file, like this: Sitemap: http://www.mysite.com/sitemap.xml. The search engine companies currently supporting this are Google, Yahoo, Ask and MSN; Chinese search engine companies are clearly not part of this circle. The benefit is that webmasters no longer need to go to each search engine's webmaster tools (or similar section) to submit their sitemap file; the search engines' spiders will fetch the robots.txt file themselves and read the... [Read More]

Tracked on Apr 11, 2007 11:46:35 AM

» Search Engines Unite On Sitemaps Autodiscovery from Minefeed.com
Last November, Google, Microsoft and Yahoo united to support sitemaps, a standardized method of [...] [Read More]

Tracked on Apr 11, 2007 2:17:44 PM

» Sitemap Submission: A Thing of the Past from Pronet Advertising
There was a time when it was a recommended practice to submit a sitemap of your site to search engines to help them better crawl your site. After today's announcement at SES, manual sitemap submission has become a thing of the past. [Read More]

Tracked on Apr 11, 2007 2:36:50 PM

» Get your site crawled by using sitemaps from Rehuel punt kom
I discovered that my articles were displayed on the first page of Google results, when searches are conducted using simple words I mention in my blog. blog worth puts me in the top 3 results, with a link to Can you trust blog worth calcul... [Read More]

Tracked on Apr 12, 2007 9:05:54 AM

» GNC-2007-04-13 #258 from Geek News Central Podcast
Full packed show tonight with a cool discovery at the end. Next show we may simulcast live on ustream.tv Looking for website Feedback you can win money listen for details. Will play the donated music in the next show. Sponsors:... [Read More]

Tracked on Apr 13, 2007 2:05:15 AM

» Sitemaps Made Simple from Shopping Cart Software News & Tips
Google, Ask and Yahoo have together announced the launch of Sitemaps Autodiscovery.  This enables webmasters to simply specify the location of their sitemap within the robots.txt file and universally submit their content to the search ... [Read More]

Tracked on Apr 23, 2007 9:08:09 AM

» What's new with Sitemaps.org? from Google Stuff.
What has the Sitemaps team been up to since we announced sitemaps.org? We've been busy trying to [Read More]

Tracked on Aug 3, 2007 1:45:24 PM

Comments

Great job, Ask! Congratulations!!

Posted by: John | Apr 11, 2007 7:38:28 AM

This is great, and will save a lot of time for search engines and webmasters.

Great info.

Cheers

Posted by: Keith Cash | Apr 11, 2007 10:02:53 AM

Is it possible to add more than one "SITEMAP: $" in a robots.txt-file?

Posted by: Daniel Aleksandersen | Apr 11, 2007 11:50:44 AM

Does Ask! look for Sitemap presence in robots.txt files at the moment, or has this feature not yet been implemented?

Posted by: Daniel Aleksandersen | Apr 11, 2007 5:11:17 PM

That's great news. It's nice to see all the engines work together to make our lives easier.

Posted by: Dustin | Apr 12, 2007 1:01:32 PM

this is a great thing for large sites like ours with hundreds of thousands of pages.

Posted by: Matt Ellsworth | Apr 12, 2007 6:24:23 PM

Great update from ASK :)

Pratheep

Posted by: Pratheep | Apr 12, 2007 11:54:55 PM

Fantastic. The timing was perfect; we are working on them right NOW.

Posted by: Chris | Apr 13, 2007 3:53:44 PM

I do like this initiative, but it is going to make things more difficult for installations that serve multiple sites (with different domains) from the same file system. CMS solutions like http://drupal.org offer the ability to do this; I suppose we'll have to use an Apache rewrite or something to serve a domain-specific robots.txt file.

Posted by: Mike Gifford | Apr 14, 2007 11:27:29 AM

Good to see more use being made of Robots.txt. How about incorporating other useful hints for search engines such as a geographic location to ensure sites are listed in the correct country search. (This could not be abused as there's only one robots.txt file per site and only one location would be accepted.)

Posted by: Stephen Newton | Apr 17, 2007 8:54:31 AM

If you have any problems to create XML sitemap files for your sites, try Sitemap Writer Pro. It is a simple tool that helps you to create and keep up-to-date sitemaps for your websites.
Sitemap Writer Pro has new tools: an FTP manager for uploading sitemaps, a search engine notification tool (which now supports Ask.com and MSN.com), a site crawler for adding URLs to the sitemap, and a Yahoo Index viewer.

Posted by: Arthur | Apr 20, 2007 1:28:09 AM

Is it possible to add more than one "SITEMAP: $" in a robots.txt-file?
Posted by: Daniel Aleksandersen | Apr 11, 2007 2:50:44 PM

Hi, Daniel. There is no restriction on the number of sitemap directives. We will process all the sitemap directives given in the robots.txt file. --Ask.com
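For example (the file names here are hypothetical), a robots.txt file may simply list several Sitemaps, one directive per line:

```
Sitemap: http://www.example.com/sitemap-articles.xml
Sitemap: http://www.example.com/sitemap-images.xml
Sitemap: http://www.example.com/sitemap-archive.xml
```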

Posted by: The Ask.com Blog | Apr 23, 2007 10:34:40 AM

Does Ask! look for Sitemap presence in robots.txt files at the moment, or has this feature not yet been implemented? Posted by: Daniel Aleksandersen | Apr 11, 2007 2:50:44 PM

Hi Again, Daniel! Yes, Ask.com is already processing the sitemap directive in robots.txt. --Ask.com

Posted by: Ask.com Blog | Apr 23, 2007 10:36:11 AM

Good to see more use being made of Robots.txt. How about incorporating other useful hints for search engines such as a geographic location to ensure sites are listed in the correct country search. (This could not be abused as there's only one robots.txt file per site and only one location would be accepted.) Posted by: Stephen Newton | Apr 17, 2007 8:54:31 AM

Stephen, thanks for the suggestion. A number of suggestions also came up during the robots.txt summit at SES New York. Ask.com will be working together with the other search engines to enhance robots.txt and Sitemaps, which should benefit webmasters, searchers, and search engines. --Ask.com

Posted by: Ask.com Blog | Apr 23, 2007 10:37:56 AM

Does ask! Look for Sitemap presence in robots.txt files at the moment...
This is seo!!!

Posted by: emlak | Apr 26, 2007 3:47:22 AM


ways up

Posted by: andrea | May 31, 2007 11:47:18 AM

This is great news and will save lots of effort. I assume I'll thus need a /robots.txt file?
Daniel

Posted by: Dan | Oct 4, 2007 6:13:34 AM

Hi,

how many sites are using / providing Sitemaps to search engines?

Kind regards

Posted by: S.W.Schilke | Oct 30, 2007 8:32:31 AM

Thanks for posting this. Very helpful information. I was wondering if there was an easier way to create all those sitemaps, and now I know there is. Cheers.

Posted by: trademark registration | Nov 23, 2007 12:19:40 AM

The comments to this entry are closed.

Opinions expressed here and in any corresponding comments are the personal opinions of the original authors, not of IAC Search & Media and may not have been reviewed in advance.
