
Perfecting Keyword Targeting & On-Page Optimization

(Last Updated: October 24, 2014 by Rand)

How do I build the perfectly optimized page? This is a challenging question for many in the SEO and web marketing fields. There are hundreds of "best practices" lists for where to place keywords and how to do "on-page optimization," but as search engines have evolved, and as other sources of traffic (social networks, referring links, email, blogs, etc.) have become more important and interconnected, the very nature of what's "optimal" is up for debate. My perspective is certainly not gospel, but it's informed by years of experience, testing, failure, and learning, alongside a lot of metrics from Moz's phenomenal data science team.

The perfectly optimized page should:

A) Have the best opportunity to rank highly in Google and Bing
B) Earn traffic from social networks like Twitter, Facebook, LinkedIn, Pinterest, Google+, etc.

In the old days of SEO, "on-page optimization" referred merely to keyword placement. Today the page itself must also be uniquely valuable, provide a phenomenal user experience, and be accessible to crawlers and bots.

On Page SEO Guidelines and Tips | Azure Web Design

On Page SEO is the process of optimizing your HTML pages for the search engines' perusal. It is by no means an exact science, but Google and the other major search engines have published guidelines we can follow for better rankings.

Server and File Settings: Make sure there is only one version of your site; 301 redirect the non-www version to the www version (or vice versa).

HTML Title and Head: Use the targeted keyword in the <title> tag, but keep in mind what visitors will see in the search results.

Your Content: Use only one <h1> tag per page, and use descriptive alt attributes on your images.

Linking: Add rel="nofollow" to links pointing to unimportant pages within your site.
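The checks above (a keyword-bearing title, a single <h1>, alt attributes on images) are mechanical enough to script. A minimal sketch using Python's standard-library html.parser; the OnPageChecker class and the sample page are hypothetical, not part of any published tool:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects basic on-page signals: <title> text, <h1> count,
    and the number of images missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and "alt" not in dict(attrs):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page that violates two of the guidelines above.
page = """<html><head><title>Blue Widgets | Example Shop</title></head>
<body><h1>Blue Widgets</h1><img src="w.png"><h1>Second heading</h1></body></html>"""

checker = OnPageChecker()
checker.feed(page)
print(checker.title)               # the <title> text
print(checker.h1_count)            # 2: violates the one-<h1> guideline
print(checker.images_missing_alt)  # 1 image lacks an alt attribute
```

A real audit would add many more signals, but the pattern (stream the HTML once, accumulate counters per guideline) stays the same.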

Robots.txt Tutorial

How to Create Robots.txt Files: Use our Robots.txt generator to create a robots.txt file.

Analyze Your Robots.txt File: Use our Robots.txt analyzer to analyze your robots.txt file today. Google also offers a similar tool inside Google Webmaster Central, which shows Google crawling errors for your site.

Example Robots.txt Formats

Allow crawling of everything:

User-agent: *
Disallow:

or

User-agent: *
Allow: /

Disallow crawling of everything:

User-agent: *
Disallow: /

Disallow crawling of a specific folder:

User-agent: *
Disallow: /folder/

Disallow Googlebot from crawling a folder, except for one file in that folder:

User-agent: Googlebot
Disallow: /folder1/
Allow: /folder1/myfile.html

Background Information on Robots.txt Files

Robots.txt files tell search engine spiders how to interact with your site's content. Note that robots.txt blocks crawling, not indexing: when you block URLs via robots.txt, Google may still show those pages as URL-only listings in its search results.

Crawl Delay

301 Redirect - How to Create Redirects

A 301 redirect is the most efficient and search-engine-friendly method for webpage redirection. It is not hard to implement, and it should preserve your search engine rankings for the redirected page. If you have to change file names or move pages around, it is the safest option. The code "301" is interpreted as "moved permanently". You can test your redirection with a search-engine-friendly redirect checker.

Below are a couple of methods to implement URL redirection via code and an htaccess redirect.

IIS Redirect: In Internet Services Manager, right-click on the file or folder you wish to redirect, then select the radio button titled "a redirection to a URL".

ColdFusion Redirect
PHP Redirect
ASP Redirect
ASP .NET Redirect
JSP (Java) Redirect

CGI PERL Redirect:

$q = new CGI;
print $q->redirect("http://www.example.com/");  # placeholder destination URL

Ruby on Rails Redirect:

def old_action
  headers["Status"] = "301 Moved Permanently"
  redirect_to "http://www.example.com/"  # placeholder destination URL
end

Redirect Old domain to New domain using htaccess redirect
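The last heading above refers to a domain-level htaccess redirect, but the snippet itself did not survive. A hedged sketch, assuming Apache with mod_rewrite enabled and using placeholder domain names:

```
# Hypothetical .htaccess: 301-redirect every URL on old-domain.com
# (with or without www) to the same path on www.new-domain.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

The R=301 flag makes the redirect permanent, and L stops further rule processing; $1 carries the original path so deep links are preserved.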
