
SEO Webmaster Developer Tools


Google site search is on the way out. Now what? How To Avoid SEO Disaster During a Website Redesign - Top Marketer Concerns - Search Engine Journal. Last week, I presented at the New Jersey Communications, Advertising and Marketing Association (NJ CAMA) about how to avoid SEO disaster during a website redesign.


Specifically, I covered how to avoid losing a majority of your rankings and organic search traffic when redesigning a website or migrating to a new CMS. During my career, I've unfortunately had to help companies that didn't take the necessary precautions before redesigning their sites and ended up paying a heavy price; some lost 60-70% of their natural search traffic after a redesign or migration. After the presentation, I received a lot of questions about the subject overall, but several topics stood out (and seemed to strike a nerve). That got me thinking about how many other people were also confused and concerned about those hot topics. You Want Scary? Before I begin, let's take a look at a few graphs that will scare the daylights out of any digital marketer. [The original post numbers eight such graphs here.] Which pages to "noindex". Nofollow and ecommerce cart/checkout pages.

Search Engine Optimization Blog. Anonymous asks: What does "noydir,noodp" mean?


The noydir and noodp values are Meta Robots commands that instruct search engine bots on how to handle data from the Yahoo! Directory and the Open Directory Project (DMOZ) when generating your site's listing. The noydir command corresponds to the Yahoo! Directory, and the noodp command corresponds to DMOZ.
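Both values are set with a standard robots meta tag in the page's <head>; a minimal sketch of the combined form:

<meta name="robots" content="noodp,noydir">

(Both directives are historical now: the Yahoo! Directory and DMOZ have since shut down, so modern pages no longer need them.)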

Google Downplays Use Of Priority & Change Frequency In XML Sitemap File. Indicate paginated content - Search Console Help. Sites paginate content in various ways.

For example: news and publishing sites often divide a long article into several shorter pages; retail sites may divide the list of items in a large product category into multiple pages; and discussion forums often break threads into sequential URLs. If you paginate content on your site and you want that content to appear in search results, we recommend one of three options: do nothing (paginated content is very common, and Google is generally able to find the most relevant page); specify a View All page; or use rel="next" and rel="prev" link elements to indicate the relationship between the component URLs, as sketched below.
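A minimal sketch of that markup as it would appear in the <head> of page 2 of a hypothetical three-page article:

<link rel="prev" href="https://example.com/article?page=1">
<link rel="next" href="https://example.com/article?page=3">

(Google has since retired rel=next/prev as an indexing signal, though the markup remains valid HTML for describing a sequence.)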

Featured snippets in search - Search Console Help. When a user asks a question in Google Search, we might show a summary of the answer in a special featured snippet block at the top of the search results page.

This featured snippet block includes a summary of the answer, extracted from a webpage, plus a link to the page, the page title and URL. A featured snippet might look something like this on the page: [screenshot shown in the original help article]. Where does the answer summary come from? The summary is a snippet extracted programmatically from a webpage. What's different about a featured snippet is that it is enhanced to draw user attention on the results page. How to Create a Request a Quote Form in WordPress. Search Engine Optimization Blog. The Hidden Power of Nofollow Links.

Robots.txt Noindex: the best-kept secret in SEO - DeepCrawl. Hands up, who knew you could add noindex directives within your robots.txt file before John Mueller spoke about it in the Google Webmaster Hangout this week?

Yes, that's a thing. And it's not a new thing, either: we know Google have supported this feature for at least seven years, because Matt Cutts first mentioned it back in 2008. DeepCrawl has also supported it since 2011.
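The directive follows the same pattern as Disallow; a minimal sketch, with a hypothetical path:

User-agent: Googlebot
Noindex: /archive/

Treat this one with caution: it was never officially documented, and Google has since advised against relying on it (see "Google Says Do Not Use Noindex In Robots.txt" further down this page).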

NoIndex for SEO. Using the NoIndex is like saying "Don't remember this page." Generally in SEO we want our pages in the Google indices, because a page that is not indexed cannot be found via search.

The NoIndex code snippet does exactly what it implies; it tells Google not to index the page, so that it cannot be found in search.
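In its most common form, that code snippet is a robots meta tag placed in the page's <head>:

<meta name="robots" content="noindex">

The same signal can also be sent without touching the HTML via an X-Robots-Tag: noindex HTTP response header, which is useful for non-HTML files such as PDFs.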

How Do I Noindex URLs? - Yoast Knowledge Base. Yoast SEO provides multiple options for setting a URL or group of URLs to noindex.

Individual post, page or other taxonomy item. Example: example.com/this-is-a-post-or-page/
All categories, tags or other taxonomies. Example: example.com/category/my-favorite-topic/
Subpages of archives. Example: example.com/category/my-favorite-topic/page/5/
Special pages. Example: search and 404 pages
Entire site. Example: example.com/

Noindex an Individual Post, Page or Other Taxonomy Item: the advanced tab (gear icon) inside the Yoast SEO meta box on the edit screen of the item allows you to set custom robots information like noindex, nofollow, and other advanced robots settings.

The layout may vary, as the example shown in the knowledge base is from a default post type. Noindex All Categories, Tags or Other Taxonomies: if you want to noindex a group of items, enable this feature using the steps below. By default, search pages and 404 pages will be set to noindex, follow. How to Optimize Your Shopify Store for Search Engines Using Screaming Frog SEO Spider 2.30.

SEO Spider Configuration. Other Configuration Options: URL Rewriting. The URL rewriting feature allows you to rewrite URLs on the fly.


For the majority of cases, the 'remove parameters' and common options (under 'options') will suffice. However, we do also offer an advanced regex replace feature which provides further control. Remove Parameters: this feature allows you to automatically remove parameters in URLs. If the website has session IDs which make the URLs appear something like 'example.com/?sid=12345' (a hypothetical session parameter), add that parameter here and the SEO Spider will then automatically strip the session ID from the URL.
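A sketch of both approaches, assuming the hypothetical sid parameter above: with Remove Parameters you simply add the parameter name (sid) to the list; with the regex replace feature you supply a pattern and an empty replacement, for example:

Regex: sid=[a-zA-Z0-9]+
Replace:

Either way, the session ID is stripped from every URL before crawling.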

Learn about robots.txt files - Search Console Help. A robots.txt file is a file at the root of your site that indicates those parts of your site you don't want accessed by search engine crawlers.

The file uses the Robots Exclusion Standard, a protocol with a small set of commands that can be used to indicate access to your site by section and by specific kinds of web crawlers (such as mobile crawlers vs. desktop crawlers). What is robots.txt used for? For non-image files (that is, web pages), robots.txt should only be used to control crawling traffic, typically because you don't want your server to be overwhelmed by Google's crawler or to waste crawl budget crawling unimportant or similar pages on your site.
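A minimal sketch of such a crawl-control file, with hypothetical paths, served from the site root (e.g. https://example.com/robots.txt):

User-agent: *
Disallow: /search/
Disallow: /calendar/

This asks all compliant crawlers to skip those two sections while leaving the rest of the site crawlable; note that it controls crawling only and does not by itself keep already-known URLs out of the index.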

Submit your updated robots.txt to Google - Search Console Help. The Submit function of the robots.txt Tester tool lets you put a new robots.txt file in place for your site and ask Google to crawl and index it more quickly.


Update and notify Google of changes to your robots.txt file by following the steps below. 1. Click Submit in the bottom-right corner of the robots.txt editor. Using Noindex, Nofollow HTML Metatags: How to Tell Google Not to Index a Page in Search. Indexing as many pages on your website as possible can be very tempting for marketers who are trying to boost their search engine authority. But while it's true that publishing more pages that are relevant for a particular keyword (assuming they're also high quality) will improve your ranking for that keyword, sometimes there's actually more value in keeping certain pages on your website out of a search engine's index.

Say what?! Stay with us, folks. Create a robots.txt file - Search Console Help. In order to make a robots.txt file, you need access to the root of your domain. If you're unsure how to access the root, contact your web hosting service provider. Also, if you know you can't access the root of the domain, you can use alternative blocking methods, such as password-protecting the files on your server and inserting meta tags into your HTML.
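For that meta tag route, the noindex and nofollow values discussed above can be combined in a single tag in the page's <head>; a minimal sketch:

<meta name="robots" content="noindex, nofollow">

Keep in mind that a crawler must be able to fetch the page to see this tag, so don't also block the same URL in robots.txt.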

You can make or edit an existing robots.txt file using the robots.txt Tester tool. Learn how to block URLs - Search Console Help. The Web Robots Pages. In a nutshell: web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site. There are two important considerations when using /robots.txt: robots can ignore it (especially malware robots that scan the web for security vulnerabilities), and the file is publicly available, so anyone can see what sections of your server you don't want robots to use.
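By contrast, here is a sketch of a file that shuts out one (hypothetical) robot while leaving the site open to everyone else; an empty Disallow value means nothing is blocked:

User-agent: BadBot
Disallow: /

User-agent: *
Disallow:

A robot obeys the most specific User-agent section that matches it, so BadBot follows the first record and every other crawler follows the second.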

How to Look at Your Website the Way Google Does. When you spend months or years on a website, not to mention thousands of dollars, it’s hard to step back and look at it objectively. Can you look at it through the eyes of your users? Can you look at it the way Google does? If you can look at your website the way Google does, you’ll probably discover areas in which your website needs work. So in that spirit, I’m going to teach you how you can see your website from Google’s perspective, and how you can then target the areas that need improvement. Robots.txt and Meta Robots - SEO Best Practices.

Page-loading Speed

How to Properly Implement a 301 Redirect - Bruce Clay, Inc. What is a 301 redirect? A 301 redirect is a command used to tell the search engines that a page has permanently moved, and that you want them to index the new page and drop the old one from their index. Think of it as a change-of-address card for the web. As long as everything is done correctly, a 301 redirect will ensure that you keep the rankings earned by the old page and prevent the duplicate content that could arise if the engines were to index both versions of your site. Read on for information about how to implement 301 redirects using .htaccess for Apache. First, make sure you have access to your server and your Apache configuration file, and that you can use .htaccess files.
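Once that's confirmed, a single permanent redirect is one line in .htaccess; a minimal sketch using Apache's mod_alias Redirect directive, with hypothetical paths:

Redirect 301 /old-page.html https://www.example.com/new-page.html

Pattern-based redirects (for example, moving an entire folder) use mod_rewrite's RewriteRule instead.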

Once you know that you have access to this file, your next step is to locate it and add rules like the one above. ManageWP Orion – Manage Multiple WordPress Websites from a Single Dashboard. If you have more than one website, maintenance can start taking up a lot of your time. Google Says Do Not Use Noindex In Robots.txt. Htaccess Redirects & Rewrite Generator for SEO by @aleyda. "This site may be hacked" message - Search Help. The "This site may be hacked" notification won't be removed until the webmaster of the site takes action; the help article walks through the steps to fix your website. Take Your WordPress Website Back To School - WP Shrug. As summer comes closer to an end and everyone is going back to school, it's the time of year when we spend more time inside reading, studying, researching, and just being online in general. This means readers will be searching through websites like yours. Philadelphia - Girl Develop It. iFrame Generator - Free Online iFrame Code Maker Tool. CcTLDs - Learn SEO.

The web is Doom - mobiForge. In July 2015, inspired by something Mat Marquis said at TXJS 2015, I suggested that the average web page weight would equal that of the Doom install image in about 7 months' time. Well, we've made it, albeit a bit later than expected. [The post shows the current page-weight chart here.] Subdomains vs. Subfolders, Rel Canonical vs. 301, and How to Structure Links for SEO. There are some basic questions about SEO that come up really frequently, and it's often easy to assume an answer that isn't exactly right. In today's Whiteboard Friday, Rand tackles three of them. Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Google Says Self-Signed SSL Certificate Won't Work For HTTPS Ranking Boost. We know Google has an HTTPS Ranking Boost.

Move a Site (incl. to HTTPS)

Head Of Google's Web Spam Team Matt Cutts Is Going On Leave. Cómo configurar el seguimiento de comercio electrónico - Ayuda de Analytics. La función de seguimiento de comercio electrónico está disponible tanto para las propiedades de sitios web como para las de aplicaciones, pero antes debe habilitar dicha función en el nivel de vista. Un programador cualificado debe seguir las instrucciones de la Guía para programadores correspondientes al entorno que utiliza (consúltelas a continuación) a fin de llevar a cabo la configuración técnica adicional. Para que sus informes muestren datos de comercio electrónico debe configurar el seguimiento de comercio electrónico. Cómo habilitar el seguimiento de comercio electrónico Haga clic en Administrador en la barra de menús situada en la parte superior de cualquier página de Analytics.

Use the drop-down menus to select the Account, Property, and View; a minimal tagging sketch follows below. Analytics. Effects of a long-term holding page/503 http code whilst site is being rebranded? How to Contribute. AMP Issues: please file any feedback you have about the actual project at the amphtml issue tracker. Web Tools - Google Developers. Create Your First AMP Page.
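Returning to the ecommerce tracking setup above: the developer-side tagging in classic analytics.js looks roughly like this; a minimal sketch with hypothetical transaction values:

ga('require', 'ecommerce');             // load the ecommerce plugin
ga('ecommerce:addTransaction', {
  'id': '1234',                         // required transaction ID
  'affiliation': 'Acme Store',
  'revenue': '11.99'
});
ga('ecommerce:send');                   // send the transaction to Analytics

Newer gtag.js and app implementations use different calls; the Developers Guide mentioned above covers each environment.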

How Loading Time Affects Your Bottom Line. Page loading time is obviously an important part of any website's user experience. How to Solve Keyword Cannibalization - SEO Tips. Keyword cannibalization isn't an issue that comes up in the SEO forums much, nor is it something that many SEOs feature prominently in site reviews (at least, in my experience), but it can be detrimental to potential rankings for several different reasons. First, I'll illustrate how keyword cannibalization happens. The Vary: User-Agent HTTP header guide. Hobo Best SEO Tools for 2016. Google Pagerank Update History (Last Toolbar Update December 2013). Groups. Indexing HTTPS pages by default. Canonical URLs in Yoast SEO - Yoast Knowledge Base: what it is and how (not) to use it. OnPage.org: Product, Pricing and Plans. Yahoo Mobile Search Gets AMP Support, Twitter Integration, New Image Search & More. Structured Data Testing Tool.

Download the Free MozBar SEO Plugin. Bounce Rate - Analytics Help. Data anomalies in Search Console - Search Console Help. Index Status report - Search Console Help. Official Google Webmaster Central Blog: Updating Our Search Quality Rating Guidelines. GWMTs: Sitemap XML files Indexed vs Submitted vs Index Status Vastly Different. Should I care?

Google Webmaster Tools Index Status is 0 but sitemap URL shows indexed. Google Webmaster Tools: Sitemap submitted vs indexed vs Index Status. Creating a Sitemap.xml file for an https site query. FAQ: Search Console / Webmaster Tools - Google Webmaster Help. FTP for Beginners. SEO best practices for URLs. SEO Cheat Sheet: Anatomy of a URL.

#28604 (Change default robots.txt). Why might Googlebot get errors when trying to access my robots.txt file? Google Panda 4, and blocking your CSS & JS. How to Optimize Your WordPress Robots.txt for SEO. #SMX Liveblog: Advanced Technical SEO Issues. Don't block Googlebot from crawling JavaScript and CSS. Select your mobile configuration - Webmaster's Mobile Guide. URLs blocked for smartphones - Search Console Help. Optimise Web's Mass 301 Redirects for 404 Pages. Web Fundamentals - Google Developers. Matt Cutts on How Google Handles 404 & 410 Status Codes. Magento SEO - The Definitive Guide. Warning About Contents in a JavaScript Array & Indexing in Google. Landing pages module for Magento - Yoast - Tweaking Websites. Learn about robots.txt files - Search Console Help. Google Adjusted How It Shows Index Count Estimates. Weekly SEO Recap: Panda U-Turns and new Bing tools. About our stats and data - Search Console Help.

How to use Google and Bing Webmaster Tools. Sitemap errors - Search Console Help. Crawl Delay in Robots.txt - Am I Missing Something? : bigseo.