Summary: Users experience psychological and physiological effects when interacting with web pages: frustration when they cannot complete tasks, and engagement at faster websites. Learn how web page response times affect user psychology and behavior. Previous research has shown that user frustration increases when page load times exceed eight to ten seconds without feedback (Bouch, Kuchinsky, and Bhatti 2000; King 2003). Newer evidence shows that broadband users are less tolerant of web page delays than narrowband users: a JupiterResearch survey found that 33% of broadband shoppers are unwilling to wait more than four seconds for a web page to load, whereas 43% of narrowband users will not wait more than six seconds (Akamai 2006).
Summary: Within the last five years, the size of the average web page has more than tripled, and the number of external objects has more than doubled. The data suggest that the more popular a web page, the smaller its total file size. The size of the average web page among the top 1,000 websites has more than tripled since 2008 (our last update, in May 2011, found it had more than septupled since 2003).
A recent study examined the effects of five web design features (customization, adaptive behavior, memory load, content density, and speed) on user preference for web-based services. The 2009 study, by Turkish and American HCI researchers Seneler, Basoglu, and Daim, tested site designs for online flight reservations. The results are valuable because insight into the relative importance to users of interface attributes can help web designers increase adoption and retention rates and boost online revenue.
When the Google Analytics tracking code gathers information about a pageview, how does it send that information back to Google's servers for processing? It sends it in the query string of a request for a small image file called __utm.gif. In this way, a log file is created that contains the visit information for every single pageview, transaction, and event.
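To make the mechanism concrete, here is a small Python sketch of what such a beacon request looks like and how its query string decodes. The parameter names are from the classic ga.js protocol; the URL and values are invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

# An illustrative classic (ga.js-era) __utm.gif request. The parameter
# names (utmwv, utmhn, utmdt, utmp, utmac) come from the classic
# tracking protocol; the values are made up for this example.
beacon = (
    "http://www.google-analytics.com/__utm.gif"
    "?utmwv=5.1.5"             # tracking code version
    "&utmhn=www.example.com"   # hostname of the tracked page
    "&utmdt=Example%20Page"    # page title
    "&utmp=%2Findex.html"      # page path of this pageview
    "&utmac=UA-12345-1"        # account the hit is reported to
)

# Decode the query string the same way a log processor would.
params = parse_qs(urlparse(beacon).query)
for name, values in params.items():
    print(name, "=", values[0])
```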
Cookies are at the heart of Google Analytics, not just because they are delicious, but because they provide a critical link in tracking individual visitors and visits. There is shockingly little documentation on the cookies created by the tracking code, what they store, or how they work. Because they are so integral to the Google Analytics reports, it is important to lift the hood and understand exactly what is going on. Flavors: first, to dispel some confusion.
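As a concrete example, the classic __utma cookie packs the visitor's identity and visit history into six dot-separated fields. A minimal sketch of decoding it, with an invented sample value:

```python
from datetime import datetime, timezone

# An illustrative classic __utma ("visitor") cookie value. Its six
# dot-separated fields are: domain hash, visitor ID, first-visit,
# previous-visit, and current-visit Unix timestamps, and the visit
# (session) count. The value below is invented for this example.
utma = "173272373.1943194313.1293000000.1295000000.1296000000.7"

fields = utma.split(".")
labels = ["domain hash", "visitor ID", "first visit",
          "previous visit", "current visit", "visit count"]

for i, (label, value) in enumerate(zip(labels, fields)):
    if i in (2, 3, 4):  # the three timestamp fields
        value = datetime.fromtimestamp(int(value), tz=timezone.utc).isoformat()
    print(f"{label}: {value}")
```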
Since last December's admission from Google + Bing's search teams regarding the direct impact of Twitter + Facebook on search rankings, marketers have been asking two questions: What signals are Google + Bing counting? How much influence do these social signals have on the results? Over the last few weeks, we've been collecting data and running calculations in an attempt to provide more insight into these answers.
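Studies of this kind typically report rank correlations between a feature (say, a URL's Facebook share count) and its ranking position. As a rough sketch of that calculation, with invented toy data and SciPy's spearmanr doing the work:

```python
from scipy.stats import spearmanr

# Invented toy data: for ten results on one query, the ranking
# position (1 = top) and the number of Facebook shares each URL has.
positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
fb_shares = [420, 380, 95, 200, 150, 40, 60, 12, 30, 5]

# A negative rho means more shares go with better (lower-numbered)
# positions; correlation alone says nothing about causation.
rho, p_value = spearmanr(positions, fb_shares)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```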
Webmaster Level: All

Since the initial roll-out of rich snippets in 2009, webmasters have shown a great deal of interest in adding markup to their web pages to improve their listings in search results. When webmasters add markup using microdata, microformats, or RDFa, Google is able to understand the content on web pages and show search result snippets that better convey the information on the page. Thanks to steady adoption by webmasters, we now see more than twice as many searches with rich snippets in the results in the US, and a four-fold increase globally, compared to one year ago.
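For readers who haven't seen such markup, here is a minimal sketch of the microdata flavor, marking up a review with schema.org-style item types. The vocabulary, property names, and values are illustrative, not taken from the post; the same data could equally be expressed in microformats or RDFa:

```html
<!-- Illustrative microdata markup for a review. itemscope/itemtype
     declare the item; itemprop labels each field for the crawler. -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Swedish Fish: a review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```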
Webmaster Level: All

As a webmaster, you have a unique understanding of your web pages and the content they represent. Google helps users find your page by showing them a small sample of that content: the "snippet." We use a variety of techniques to create these snippets and give users relevant information about what they'll find when they click through to visit your site. Today, we're announcing Rich Snippets, a new presentation of snippets that applies Google's algorithms to highlight structured data embedded in web pages. Rich Snippets give users convenient summary information about their search results at a glance.
Carpe diem on any duplicate content worries: we now support a format that allows you to publicly specify your preferred version of a URL. If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version. Let's take our old example of a site selling Swedish fish. Imagine that your preferred version of the URL and its content looks like this:
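The format announced here is the rel="canonical" link element, placed in the head of each duplicate page to point search engines at the preferred version. A minimal sketch, using a hypothetical example.com URL as the preferred version:

```html
<!-- Placed in the <head> of each duplicate or parameter-laden URL
     (e.g. ...?item=swedish-fish&category=gummi-candy&trackingid=1234),
     declaring the preferred version. The example.com URL below is a
     hypothetical stand-in, not one from the original post. -->
<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />
```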
Search engine optimization — SEO — may seem like alchemy to the uninitiated. But there is a science to it. Search engines reward pages with the right combination of ranking factors, or “signals.” SEO is about ensuring your content generates the right type of signals.
First of all, let me confess that the term 'fresh rank' has been stolen from fellow SEO blogger Justin Briggs; I am going to refer to one of his excellent posts throughout the rest of this one. You will no doubt know about Google's new QDF upgrade, an algorithm tweak designed to get you to 'fresh' content quicker, rather than bringing up old static results. You can see an example of it here:
Today, the New York Times published an article about a search engine optimization investigation of J.C. Penney. Perplexed by how well jcpenney.com did in unpaid (organic) search results for practically everything the retailer sold, the paper asked someone familiar with the world of search engine optimization (SEO) to look into it a bit more. The investigation found that thousands of seemingly unrelated websites (many that seemed to contain only links) were linking to the J.C. Penney website, and most of those links had highly descriptive anchor text.
Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value. If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.) Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero.
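A minimal sketch of this random-surfer computation as power iteration, with damping factor d = 0.85. The link graph below is a small invented example, not the A–E network from the figure:

```python
# Power-iteration PageRank with damping factor d = 0.85.
# The link graph is a small invented example: each key lists
# the pages it links out to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

d = 0.85                      # probability of following a link
n = len(links)
ranks = {page: 1.0 / n for page in links}

for _ in range(50):           # iterate until (approximately) converged
    new_ranks = {}
    for page in links:
        # Rank flowing in from every page that links here, split
        # evenly among that source page's outbound links.
        incoming = sum(ranks[src] / len(outs)
                       for src, outs in links.items() if page in outs)
        new_ranks[page] = (1 - d) / n + d * incoming
    ranks = new_ranks

for page, r in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {r:.3f}")
```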
Wow, my post about how Google makes algorithm changes sure got a LOT of attention. While I happened to think the post itself was pretty darn informative (if I can be so humble…lol), it turns out that the majority of folks visiting just wanted a copy of the 2011 Google Quality Raters Handbook. Makes sense, but as most know by now, I was contacted by Google and had to stop sharing and linking to that document. So, let’s move on and talk about these Google Quality Raters. Who are they? What do they do?