SEO



The Psychology of Web Performance - how slow response times affect user psychology

Summary: Users experience psychological and physiological effects when interacting with web pages: frustration when they cannot complete tasks, and greater engagement at faster websites.

Learn how web page response times affect user psychology and behavior. Previous research has shown that user frustration increases when page load times exceed eight to ten seconds without feedback (Bouch, Kuchinsky, and Bhatti 2000; King 2003). Newer evidence shows that broadband users are less tolerant of web page delays than narrowband users: a JupiterResearch survey found that 33% of broadband shoppers are unwilling to wait more than four seconds for a web page to load, whereas 43% of narrowband users will not wait more than six seconds (Akamai 2006).

Tolerable Wait Times

In a 2004 study, Fiona Nah found that the tolerable wait time (TWT) on non-working links without feedback peaked at between five and eight seconds (Nah 2004).

Average Web Page Size Septuples Since 2003 - web page statistics and survey trends for page size and web objects

Summary: Within the last five years, the size of the average web page has more than tripled, and the number of external objects has more than doubled.

The data appears to suggest that the more popular a web page, the smaller the total file size. The size of the average web page of the top 1,000 websites has more than tripled since 2008 (our last update in May 2011 found it had more than septupled since 2003). In the five years from 2008 to late 2012, the average web page grew from 312K to 1114K (see Figure 1), over 3.5 times larger (Domenech et al. 2007; Flinn & Betcher 2008; Charzinski 2010; Souders 2012). During the same five-year period, the number of objects in the average web page more than doubled, from 49.9 to 100 objects per page in November 2012.

Figure 1: Growth of the Average Web Page

Average Web Page versus Survey Size

The HTTPArchive.org data also reveals that the average web page for the top 292,880 pages is 1249K in size, made up of 86 objects on average.

Study: Web Users Prefer Speed Over Customization

A recent study examined the effects of five web design features - customization, adaptive behavior, memory load, content density, and speed - on user preference for web-based services.

The 2009 study by Turkish and American HCI researchers Seneler, Basoglu, and Daim tested site designs for online flight reservations. The results of this study are valuable because insights into the relative importance (to users) of interface attributes can help web designers increase adoption and retention rates, and boost online revenues. As you can see from Figure 1, they found that the most preferred feature was high speed, followed distantly by minimal memory load, adaptive behavior, low content density, and customization features.

Figure 1: Relative Importance of Interface Design Features

Broadband, Web Page Complexity and Response Times

As Web 2.0 technologies like Ajax and DHTML have become more widespread, websites have grown more complex.

How Google Analytics Gets Information

When Google Analytics code gathers information about each pageview, how does it send that information to the data collection servers so it can be processed?

The information is sent in the query string of a request for the __utm.gif file. This file is requested for every single pageview, transaction, and event. When the files are processed, Google Analytics can then string together the individual actions into a visit. Many organizations store a copy of every tracking request sent to Google Analytics data collection servers; this is accomplished with the _setLocalRemoteServerMode() function in ga.js. Once you have a local copy of Google Analytics tracking requests, you can process them with Angelfish Software.

Dissecting __utm.gif

A typical __utm.gif request is long and messy. Although the URL looks overwhelming, it's pretty simple when broken down: there is a long query string with a dozen or so URL-encoded parameters, depending on what triggered the request.
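To make this concrete, here is what a shortened __utm.gif request might look like. The parameter names are from the classic ga.js tracker, but every value below is a made-up placeholder:

    http://www.google-analytics.com/__utm.gif
        ?utmwv=5.3.8                    (tracking code version)
        &utmn=1069234976                (random number to prevent caching)
        &utmhn=www.example.com          (hostname)
        &utmdt=Example%20Page           (page title)
        &utmp=%2Fproducts%2Findex.html  (page path - the pageview itself)
        &utmac=UA-12345-1               (account ID)
        &utmcc=__utma%3D...             (cookie values, which tie hits into a visit)

And a minimal sketch of the local-logging setup mentioned above, using the classic synchronous ga.js API (the account ID is a placeholder):

    <script src="http://www.google-analytics.com/ga.js"></script>
    <script>
      var pageTracker = _gat._getTracker("UA-12345-1"); // placeholder account ID
      pageTracker._setLocalRemoteServerMode();          // also log __utm.gif hits to your own server
      pageTracker._trackPageview();
    </script>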

Cookies in Google Analytics

Cookies are at the heart of Google Analytics - not just because they are delicious, but because they provide a critical link in tracking return visitors and attribution. There is shockingly little documentation on the cookies created by the tracking code, what they store, or how they work. But they are so integral to the Google Analytics reports that it is important to lift the hood and understand exactly what is going on. (Update: Universal Analytics uses a different cookie "recipe" than the one outlined in this article.)

Flavors

First, to dispel some confusion: if I am visiting www.mysite.com and it sets cookies on my machine, those are first-party cookies. Google Analytics uses first-party cookies. Cookies come in two more flavors: session and persistent.

How Google Analytics Uses Cookies

Certain reports in Google Analytics rely heavily on cookies. Unique Visitors: cookies are a major component of unique visitor tracking. Activity: cookies store vital information about each visit. Traffic Source: cookies record where each visit came from. The main classic cookies are summarized below.
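For reference, the classic ga.js tracker sets four main cookies; the roles and lifetimes below are as documented for classic Google Analytics, and the sample value is made up but follows the documented format:

    __utma - visitor identity: domainHash.visitorId.firstVisit.previousVisit.currentVisit.sessionCount
             (persistent, two-year expiry, refreshed on each hit)
    __utmb - current session activity (30-minute expiry, refreshed on each hit)
    __utmc - session marker (session cookie, deleted when the browser closes)
    __utmz - traffic source / campaign attribution (six-month expiry)

    Sample: __utma=94762034.1243891623.1356035311.1356871103.1357749816.4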

Facebook + Twitter's Influence on Google's Search Rankings

Since last December's admission from Google + Bing's search teams regarding the direct impact of Twitter + Facebook on search rankings, marketers have been asking two questions: What signals are Google + Bing counting?

How much influence do these social signals have on the results? Over the last few weeks, we've been collecting data and running calculations in an attempt to provide more insight into these answers. Today, I'd like to share some results of that process. But before we begin, there are some important caveats. The data we're sharing below examines the top 30 ranking results for 10,217 searches performed on Google in late March (after the Panda/Farmer update, using top suggested keywords in each category from Google's AdWords data). However, this does not mean we can be confident that what we're measuring are actually ranking factors having a direct influence. With those caveats out of the way, let's look at some data!
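For context on the "calculations" mentioned above: studies like this typically report rank correlations rather than causation. A minimal sketch of Spearman's rank correlation (illustrative data and variable names, not the study's actual code or numbers):

    // Spearman's rank correlation (assumes no tied values, for brevity).
    function spearman(xs, ys) {
      const rank = (arr) => {
        const indexed = arr.map((v, i) => [v, i]).sort((a, b) => b[0] - a[0]);
        const r = new Array(arr.length);
        indexed.forEach(([, originalIndex], pos) => { r[originalIndex] = pos + 1; });
        return r;
      };
      const rx = rank(xs), ry = rank(ys), n = xs.length;
      // rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), valid when there are no ties
      const sumD2 = rx.reduce((s, r, i) => s + (r - ry[i]) ** 2, 0);
      return 1 - (6 * sumD2) / (n * (n * n - 1));
    }

    // Hypothetical: Facebook shares for five results vs. their ranking positions (1 = top).
    console.log(spearman([120, 80, 45, 30, 10], [1, 2, 3, 4, 5])); // -1: more shares, better position

A rho near -1 in this toy example means pages with more shares tend to sit at numerically lower (better) positions; as the article stresses, even a strong correlation would not prove a direct ranking factor.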

SEO Pricing: 600+ Agencies Share Costs of Services & Pricing Models

Near the end of December 2011, we ran a survey on this blog asking consultants and agencies of all sizes and geographies to contribute their pricing models and cost structures.

I'm pleased to share the results of that survey in the hopes that it will give everyone in the search industry a better idea of the range of fees and the services provided. Obviously, this data is imperfect - SEOmoz is not a professional data surveying firm, and our only tool was a basic list of questions on SurveyMonkey. That said, I'd be surprised if a professional surveyor found dramatically different data. There was enough participation to receive a trustworthy sample size, and firms provided their personal/contact information (many of whom I recognized while digging through the responses, but obviously will not be sharing identities publicly), which means we likely did not receive intentionally manipulative or misleading information.

Rich snippets: testing tool improvements, breadcrumbs, and events

Webmaster Level: All

Since the initial roll-out of rich snippets in 2009, webmasters have shown a great deal of interest in adding markup to their web pages to improve their listings in search results.

When webmasters add markup using microdata, microformats, or RDFa, Google is able to understand the content on web pages and show search result snippets that better convey the information on the page. Thanks to steady adoption by webmasters, we now see more than twice as many searches with rich snippets in the results in the US, and a four-fold increase globally, compared to one year ago. Here are three recent product updates.

Testing tool improvements

Despite the healthy adoption rate by webmasters so far, implementing the rich snippets markup correctly can still be a major challenge. If you’ve added markup in the past but haven’t seen rich snippets appear for your site, we encourage you to take a few minutes to try testing the markup again on the updated testing tool.

Introducing Rich Snippets

Webmaster Level: All

As a webmaster, you have a unique understanding of your web pages and the content they represent.

Google helps users find your page by showing them a small sample of that content - the "snippet." We use a variety of techniques to create these snippets and give users relevant information about what they'll find when they click through to visit your site. Today, we're announcing Rich Snippets, a new presentation of snippets that applies Google's algorithms to highlight structured data embedded in web pages.

Rich Snippets give users convenient summary information about their search results at a glance. To display Rich Snippets, Google looks for markup formats (microformats and RDFa) that you can easily add to your own web pages - a microformats sketch follows below. To prepare your site for Rich Snippets and other benefits of structured data on the web, please see our documentation on structured data annotations.
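As an illustration of what such markup looks like, here is a minimal hReview microformat sketch. The class names come from the hReview spec; the product name, reviewer, and rating are made up, and this is not the exact example from the original post:

    <div class="hreview">
      <span class="item"><span class="fn">Blast 'Em Up</span></span>
      reviewed by <span class="reviewer">Jane Smith</span>,
      rating: <span class="rating">4.5</span> out of 5
    </div>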

Specify your canonical

Carpe diem on any duplicate content worries: we now support a format that allows you to publicly specify your preferred version of a URL.

If your site has identical or vastly similar content that's accessible through multiple URLs, this format provides you with more control over the URL returned in search results. It also helps to make sure that properties such as link popularity are consolidated to your preferred version. Let's take our old example of a site selling Swedish fish. Imagine that your preferred version of the URL is one clean product URL, while users (and Googlebot) can access the Swedish fish through multiple less simple URLs, or through URLs with completely identical content that differ only by things such as a tracking parameter or a session ID. Now, you can simply add a <link> tag inside the <head> section of the duplicate content URLs to specify your preferred version, as sketched below. Of course, you may have more questions.
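A minimal sketch (the example.com URLs are placeholders standing in for the post's elided examples):

    Preferred URL:
        http://www.example.com/product.php?item=swedish-fish

    Duplicate URLs users and Googlebot might reach:
        http://www.example.com/product.php?item=swedish-fish&category=gummy-candy
        http://www.example.com/product.php?item=swedish-fish&trackingid=1234&sessionid=5678

    Tag to add inside the <head> of each duplicate page:
        <link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />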

Is rel="canonical" a hint or a directive? Sitemaps.org - Home. The Web Robots Pages. The Periodic Table Of SEO Ranking Factors. The Fresh Rank Algorithm, Is It More Important Than PageRank. First of all let me confess the term ‘fresh rank’ has been stolen from fellow SEO blogger Justin Briggs, I am going to refer to one of his excellent posts throughout the rest of this one. You will no doubt know about Google’s new QDF upgrade, an algorithm tweak designed to get you to ‘fresh’ content quicker, rather than bringing up old static results. You can see an example of it here; They’re not site links but links to fresh content on the BBC for the search term ‘football’. Google has stated that this affects around 35% of search queries, don’t get that mixed up with searches.

Now that is all well and good, but from my point of view I want to know a few key points: 1) How does Google decide what is fresh? 2) Is the link graph involved when deciding ‘freshness’? 3) How do links from these ‘fresh’ pages influence rankings for the target website? I wrote a really short post a few months ago comparing fresh links vs. text links vs. links placed in old content.

New York Times Exposes J.C. Penney Link Scheme That Causes Plummeting Rankings in Google

Today, the New York Times published an article about a search engine optimization investigation of J.C. Penney. Perplexed by how well jcpenney.com did in unpaid (organic) search results for practically everything the retailer sold, they asked someone familiar with the world of search engine optimization (SEO) to look into it a bit more. The investigation found that thousands of seemingly unrelated web sites (many that seemed to contain only links) were linking to the J.C. Penney web site. And most of those links had really descriptive anchor text. The New York Times presented their findings to Google.

So where did J.C. Penney's links come from?

"Link Schemes" and the Google Webmaster Guidelines

The web is big. Google was launched on a foundation of PageRank: the idea that people link to things they like and find valuable, so a page with a lot of links to it is probably more useful than a page without very many links.

PageRank

Figure: Mathematical PageRanks for a simple network, expressed as percentages. (Google uses a logarithmic scale.) Page C has a higher PageRank than Page E, even though there are fewer links to C; the one link to C comes from an important page and hence is of high value.

If web surfers who start on a random page have an 85% likelihood of choosing a random link from the page they are currently visiting, and a 15% likelihood of jumping to a page chosen at random from the entire web, they will reach Page E 8.1% of the time. (The 15% likelihood of jumping to an arbitrary page corresponds to a damping factor of 85%.)
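To make the random-surfer model concrete, here is a minimal power-iteration sketch in JavaScript (an illustrative implementation, not Google's; the tiny three-page graph at the end is made up):

    // Minimal PageRank by power iteration (illustrative, not Google's code).
    // links[i] lists the page indices that page i links to; d is the damping factor.
    function pageRank(links, d = 0.85, iterations = 50) {
      const n = links.length;
      let pr = new Array(n).fill(1 / n);             // start from a uniform distribution
      for (let it = 0; it < iterations; it++) {
        const next = new Array(n).fill((1 - d) / n); // the 15% "random jump" share
        for (let i = 0; i < n; i++) {
          if (links[i].length === 0) {
            for (let j = 0; j < n; j++) next[j] += (d * pr[i]) / n; // dangling page: spread evenly
          } else {
            for (const j of links[i]) next[j] += (d * pr[i]) / links[i].length;
          }
        }
        pr = next;
      }
      return pr; // values sum to ~1; multiply by 100 for percentages as in the figure
    }

    // Tiny made-up graph: page 0 links to 1; page 1 links to 0 and 2; page 2 links to 0.
    console.log(pageRank([[1], [0, 2], [0]]));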

Without damping, all web surfers would eventually end up on Pages A, B, or C, and all other pages would have PageRank zero. PageRank is an algorithm used by Google Search to rank websites in their search engine results. PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. In its damped form, the PageRank of a page $p_i$ among $N$ pages is

$$PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}$$

where $d$ is the damping factor (0.85 above), $M(p_i)$ is the set of pages linking to $p_i$, and $L(p_j)$ is the number of outbound links on page $p_j$.

Google Raters - All About Google Quality Raters

Wow, my post about how Google makes algorithm changes sure got a LOT of attention. While I happened to think the post itself was pretty darn informative (if I can be so humble… lol), it turns out that the majority of folks visiting just wanted a copy of the 2011 Google Quality Raters Handbook.

Makes sense, but as most know by now, I was contacted by Google and had to stop sharing and linking to that document. So let's move on and talk about these Google Quality Raters. Who are they? What do they do? Google Quality Raters are out there rating not only organic search results but also Google ads (AdWords) and videos, and probably more things, but those are the three types of raters I am sure of.

There is a good forum out there that is all "Quality Raters" info and discussion. The raters I will be talking about today are the ones that rate the organic results, called Search Quality Raters.

What Is a Google Search Quality Rater?

"There are a few names for this position."