
SEO for Bing - Google and Bing Indexing Differences

Bing and Yahoo are now (mostly) sharing their search functions, and further integration will occur over the next several months.

This combination has resulted in a 25% market share of the Search landscape for Bing/Yahoo! If you have so far ignored the growing importance of Bing, now is the time to start looking at how to optimize your website for Bing search results. This post will take a look at how indexing occurs with Bing.

Google is clearly the leader in the Search space and a very mature and sophisticated search engine. It does a remarkable job of indexing a wide variety of content.

Canonical Requirements: Google is very good at determining a website’s Canonical URL even if a website is not coded to properly return the Canonical URL.

Bing, on the other hand, does not support the Canonical tag and does not offer Canonical URL management in its Webmaster Center.

Page Size: Back in the early days of Google, googlebot would only crawl the first 100k of any given page.
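Because Bing does not read the rel="canonical" hint that Google honors, the usual engine-agnostic workaround is to enforce a single canonical version of every URL with server-side 301 redirects, which both crawlers understand. Below is a minimal sketch of that idea as an Express middleware in TypeScript; the preferred hostname www.example.com and the trailing-slash rule are assumptions chosen for illustration, not anything specified by Bing.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();
const CANONICAL_HOST = "www.example.com"; // assumed preferred hostname

// Send every request to a single canonical form of its URL so that both
// Google and Bing index one version of each page.
app.use((req: Request, res: Response, next: NextFunction) => {
  const host = req.headers.host ?? CANONICAL_HOST;

  // Strip a trailing slash from everything except the site root.
  let path = req.url;
  if (path.length > 1 && path.endsWith("/")) {
    path = path.slice(0, -1);
  }

  if (host !== CANONICAL_HOST || path !== req.url) {
    // A 301 tells crawlers that the redirect target is the canonical URL.
    res.redirect(301, `https://${CANONICAL_HOST}${path}`);
    return;
  }
  next();
});

app.get("/", (_req: Request, res: Response) => {
  res.send("Home page");
});

app.listen(3000);
```

The same effect can be achieved at the web-server or CDN layer; the point is simply that the redirect, unlike the tag, works regardless of which engine is crawling.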

Bing Offers Recommendations for SEO-Friendly AJAX: Suggests HTML5 pushState

Bing has announced support for HTML5 pushState as a way to implement AJAX on a site so that Bing can crawl and index the URLs and content.

As Google has supported this implementation since early 2012, site owners finally have an AJAX option that can be crawled and indexed by both major search engines in the United States. (The ease of implementation is another story altogether.) Bing tells me that while they still support the #! version of crawlable AJAX originally launched by Google, they’re finding it’s not implemented correctly much of the time, and they strongly recommend pushState instead.
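To make the pushState recommendation concrete, here is a rough browser-side sketch in TypeScript, using the tabbed-content case discussed in the next section: tab content is fetched with AJAX, but each tab still gets a real URL via history.pushState. The element selectors, the /tabs/<name> URL scheme, and the ?fragment=1 query parameter are invented for this example, and it assumes the server can also render full HTML at those URLs for crawlers that do not run JavaScript.

```typescript
// Minimal sketch of pushState-based tab navigation in the browser.
// Assumes the server can also render full HTML at /tabs/<name> for crawlers.

const panel = document.getElementById("tab-panel") as HTMLElement;

async function showTab(name: string, pushUrl: boolean): Promise<void> {
  // Fetch only the HTML fragment for this tab.
  const response = await fetch(`/tabs/${name}?fragment=1`);
  panel.innerHTML = await response.text();

  if (pushUrl) {
    // Record a real, crawlable URL for the tab in the browser history.
    history.pushState({ tab: name }, "", `/tabs/${name}`);
  }
}

// Intercept tab clicks instead of letting them trigger a full page load.
document.querySelectorAll<HTMLAnchorElement>("a.tab-link").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    void showTab(link.dataset.tab ?? "overview", true);
  });
});

// Keep the visible tab in sync with the URL on back/forward navigation.
window.addEventListener("popstate", (event) => {
  const state = event.state as { tab?: string } | null;
  void showTab(state?.tab ?? "overview", false);
});
```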

Why AJAX Can Be Difficult To Crawl & Index

One common use of AJAX is to make the website experience faster for a visitor, but this implementation can have drawbacks for SEO. A web developer could implement this in one of several ways. A separate URL for each tab: with this implementation, when the visitor clicks a tab, a new request is made to the server for a completely new page.
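The "separate URL for each tab" approach can be as simple as one server-rendered route per tab, so every tab is an ordinary page any crawler can fetch; the trade-off is a full page load on every click. A hypothetical Express sketch in TypeScript (the tab names, URL pattern, and markup are invented for illustration):

```typescript
import express from "express";

const app = express();
const TABS = ["overview", "specs", "reviews"]; // hypothetical tab names

// Each tab is a full, server-rendered page with its own URL, so clicking a
// tab issues a normal request and crawlers can index every tab directly.
app.get("/product/:id/:tab", (req, res) => {
  const { id, tab } = req.params;
  if (!TABS.includes(tab)) {
    res.status(404).send("Unknown tab");
    return;
  }
  res.send(`
    <html>
      <body>
        <nav>${TABS.map((t) => `<a href="/product/${id}/${t}">${t}</a>`).join(" ")}</nav>
        <main>Content for the "${tab}" tab of product ${id}</main>
      </body>
    </html>
  `);
});

app.listen(3000);
```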

Bing Launches Way to "Disavow" Links, But Why?

For years, Google’s webmaster guidelines have noted that attempts to manipulate Google’s algorithms with artificial external link profiles (paid links, link schemes and the like) are violations and that Google may take action (by removing the site from the index or lowering its ranking).

This year, Google started alerting site owners about “unnatural links”, recommending that they remove them. Google also launched a new algorithm called Penguin, aimed at flagging sites that attempt to manipulate Google’s algorithms with spam techniques such as artificial link profiles. One of the recommendations Google has given for recovering from Penguin is to have spammy links removed, but what if that’s not possible?

Some in the SEO community are worried about negative SEO. If spammy links pointing at your site can hurt you, then can’t competitors just buy a bunch of them and point them at your site? So now, a feature for disavowing such links has finally been launched… by Bing. Wait, what? Confusingly, the FAQ then says:

Webmaster Guidelines - Bing Webmaster Tools