A Guide to Google Webmaster Tools for SEO

Wed, 19/02/2014 - 16:33

Webmaster Tools can be viewed as Google’s primary line of communication with webmasters regarding the health of their sites and any issues that have arisen (or could potentially arise).

In addition to monitoring technical aspects (e.g. index status, crawl stats and page errors), Google Webmaster Tools can also be used to help improve the performance of your website.

To get started with Webmaster Tools, simply log in using your Google account and submit your website using the “add site” button.

Verification will be required to prove that you’re the site owner before you’ll start seeing any data populating the account. From experience, the easiest way to do this is using the HTML file upload or meta tag methods.
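For the meta tag method, Webmaster Tools generates a site-specific verification tag to place in the head of your home page. A minimal sketch is shown below; the content value is a placeholder, not a real token, so use the one generated for your own account:

```html
<head>
  <!-- Verification tag supplied by Google Webmaster Tools.
       The content value below is a placeholder example. -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Example Site</title>
</head>
```

Once the tag is live, click “Verify” in Webmaster Tools and Google will check for its presence.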

Alternatively, your site can be verified by linking your Webmaster Tools account with your Analytics account. Keep in mind that this can only be done if you’ve implemented the asynchronous code to track site activity.

To get you started, here are some of the areas you can look at to gain valuable insights regarding the performance and compliance of your website:

1.) Site settings: Accessible via the gear cog tab on the top right of the Webmaster Tools dashboard, the site settings section can be used to set your geographic target (e.g. America, United Kingdom, Australia). Note that the ability to edit this appears to be dependent on the top-level domain (TLD) registered. For example: a country-code domain such as .co.za is locked to target only South Africa, while a .com domain is more universal and can be targeted at various countries.

It’s worth changing this setting if you’ve registered a .com domain and you’re targeting users in South Africa, for example, to indicate to Google that this is your intention.

2.) HTML improvements: This is accessible via a dropdown in the search appearance section. Here you’ll see a list of URLs that have duplicate, missing or overly short meta data, which you can use to improve the overall optimisation of your site.

3.) Sitelinks: These refer to the collection of links that appear under the primary listing when a brand-related search for your website is done.

These are the pages on your site that Google has determined to be the most important (in addition to the home page). If a page is listed here but you’d rather not promote or place emphasis on it, you can “demote” it.

Note: You can’t choose the pages that appear here; you can only “demote” those that you don’t want to show up.

4.) Structured data: Here you can see all the structured mark-up (if you have any) that has been used on your site by source and data type. You can see the URLs where the structured data has been used and any items that have errors.
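As a simple illustration of the kind of mark-up this report picks up, a page might describe a product review using schema.org microdata. The names and values below are invented for the example:

```html
<!-- schema.org microdata for a review; the name and ratings
     below are invented example values -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Example Widget Review</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rated <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```

If an item like this is missing a required property, the structured data section will flag the URL with an error.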

5.) Search queries: This is a key section when looking at ways to improve performance.

Here you can see the queries that resulted in visits to your site, as well as impressions, clickthrough rate (CTR) and the average position at which your site ranked for each keyword. This section can be viewed by top queries or top pages.

The top pages section shows which pages received the most impressions and clicks, and what their average position was in the SERPs.

There are many insights you can glean from this information: for example, what type of content people are looking for.

Another example would be using the top pages results to improve meta data optimisation. For pages that received a high number of impressions but a comparatively low CTR, focus could be placed on improving meta data and calls to action to lift click-through rates.

Note: The query data in Webmaster Tools is aggregated. This means that data is only shown in reports once a particular keyword returns your pages in the Google search results a minimum number of times. Some queries are also filtered out for privacy reasons.

6.) Index status: This section shows the total number of pages from your site that have been indexed, blocked by robots.txt, removed or crawled. Forget about the basic tab and head straight for the advanced one in this section. You can use the graph on this page to identify any sudden drops in indexed pages and then investigate further.

This section recently helped in identifying an indexing issue one of our clients had. By cross-referencing the blocked URLs with the total indexed URLs, we were able to establish that the robots.txt file contained disallow rules for key pages that should not have been blocked, which had resulted in a massive decline in the overall number of URLs indexed.
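To illustrate the kind of misconfiguration described above, a robots.txt file might inadvertently block important sections of a site. The paths here are hypothetical:

```
# Hypothetical robots.txt. The first Disallow rule below would
# block a key section from being crawled, eventually causing
# those pages to drop out of the index.
User-agent: *
Disallow: /products/
Disallow: /internal-search/
Allow: /
```

Removing the offending Disallow line (here, /products/) allows Googlebot to recrawl and reindex those pages over time.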

7.) Remove URLs: If a page on your site has been indexed and wasn't supposed to be in the first place, simply create a removal request in this section.

8.) Crawl errors: This section can be used to identify server errors and not found (404) errors. Once redirects have been implemented for 404 pages, be sure to mark them as fixed.
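If your site runs on Apache, one common way to redirect a removed page is a permanent (301) redirect in the .htaccess file. This is a minimal sketch with hypothetical URLs, assuming mod_alias is enabled:

```
# Hypothetical .htaccess rules: permanently redirect an old,
# removed page (and an old section) to current equivalents
Redirect 301 /old-page.html /new-page.html
RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
```

Once the redirects are live and verified, the corresponding 404s can be marked as fixed in the crawl errors report.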

Given that Google is the most widely used search engine, using Webmaster Tools to ensure that your website is compliant with its guidelines is definitely best practice.

To learn more, visit the Google Webmaster Tools help section.