Google Webmaster Tools is a suite of free tools that not only makes it easier for Google to crawl your entire website, but also helps to improve your search engine ranking position. It offers a variety of functions for both advanced users and beginners. Another tool that works well alongside Google Webmaster Tools is Google Analytics, which gives you detailed traffic data about your website.
Why Use Google Webmaster Tools for Your Website?
Google Webmaster Tools offers many different functions, such as:
- Check the search queries that bring most visitors to your site.
- List your internal links and the links pointing to your site.
- Submit a sitemap for better visibility.
- Generate a robots.txt file and remove specific URLs from the SERPs.
- Check and adjust the crawl rate and view statistics on how Google crawls your site.
This short list only hints at the full range of Google Webmaster Tools features. It should already be clear, though, that the tool is very useful and worth enabling for any website or blog.
How to Use Google Webmaster Tools for Your Website?
To enable Google Webmaster Tools for your website, follow this simple procedure.
Step 1: Go to https://www.google.com/webmasters/tools/ and click to sign up for a new webmaster account. Enter your Google account details and click OK.
Step 2: After signing up, click ADD SITE. Enter the URL of your site and press Next; you will then reach the following page.
In this step you need to verify ownership of your site. If you are on a self-hosted WordPress site, simply download the HTML verification file and upload it to your website (either via FTP or by logging in to your hosting control panel).
Step 3: After verifying your ownership of the site, you will reach the Webmaster Tools dashboard page, as shown below.
The dashboard offers a number of options. You have now successfully signed up for your Google Webmaster Tools account and can use it to improve your site's performance in the SERPs.
Here are the fundamental aspects of Google Webmaster Tools and how to use them:
Table of Contents
- 1. Configuration
- 1.1 Settings
- 1.2 Sitelinks
- 1.3 URL Parameters
- 1.4 Change of Address
- 1.5 Users
- 1.6 Associates
- 2. Health
- 3. Traffic
- 4. Optimization
- 5. Labs
1. Configuration
Configuration enables you to customize various parameters and settings relating to your site. For example, you can target your website to a specific country, state the preferred form of your URL and add users and owners to your Webmaster Tools account, providing them with certain access, or permissions, to your account.
1.1. Settings
By using the Settings feature of Google Webmaster Tools, you can set a geographic preference (the country you want your website to target), set the preferred form of your domain name and change the speed with which Google crawls your website.
1.1.1. Geographic Targeting
If your site has a generic top-level domain, for example .com or .net, then you can use this setting to state any preferred geographical area you wish to target. Google’s listings/rankings for each page vary according to the country from which the search is made. Therefore, without geographic targeting, or if you select ‘unlisted’, the visible ranking of each of your web pages in each country will depend upon your server’s IP address and the sources of the backlinks to your page.
If you want your pages listed predominantly in the USA, for example, then you must select USA in the drop-down list offered on the Geographic Targeting page of the ‘Settings’ option. If your site has a country-specific TLD, such as .uk or .au, then you cannot change the geo-targeting setting: it is fixed to the country indicated by your top-level domain.
1.1.2. Preferred Domain
You can use the ‘preferred domain’ option to tell Google whether you want your website listed as www.yoursite.com or yoursite.com – each of these could be listed separately. This would dilute your ranking, because links, for example, may be split between the two, rather than accumulating to one domain.
Select the version you want to use from this menu, and Google will credit links and traffic from one version to the other. If you have registered both forms, it may be worthwhile using a 301 Redirect to send traffic from the secondary to the preferred domain. You may also have to verify both formats.
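If your site runs on an Apache server, a 301 redirect like this can be set up in the .htaccess file in your root directory. This is only a sketch, with www.yoursite.com standing in as a placeholder for your preferred domain:

```apache
# Hypothetical example: permanently (301) redirect the bare domain
# to the preferred www form, so links accumulate to one domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

Check the redirect in your browser afterwards: the bare-domain URL should land on the www version with the path preserved.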
1.1.3. Crawl Rate
Google’s crawl rate is not the crawl frequency, but the speed with which your root domain or subdomain is crawled (i.e. usually your Home Page or subdomain home page). It is usually best to permit Google to choose its own crawl rate, but if this is affecting your bandwidth, then you can change it.
1.2. Sitelinks
Google can include what are known as ‘sitelinks’ in its listing for a web page. These are generated automatically by an algorithm that determines whether clicking a given link would offer a visitor a shortcut to further information on your website relevant to their query. You can ‘demote’ a sitelink for 90 days if you do not want it listed in Google’s SERPs.
1.3. URL Parameters
In some cases, a specific web page can be reached by more than one form of the URL. If the same page (i.e. the same content) can be reached using different forms of the URL, then Google sees that as duplicate content, and this can negatively affect your ranking. You can state a canonical, or preferred, URL.
You can use the ‘URL Parameters’ tool in Webmaster Tools to include parameters that tell Google how to handle the alternative versions of a URL. This is a complex subject for the beginner, so click the above link for more information.
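One common way to state the preferred URL is a canonical tag in the page’s head section. The URLs below are hypothetical examples:

```html
<!-- Hypothetical example: the same page is reachable as
     /shoes?sort=price and /shoes?sessionid=123. The canonical tag
     in the page's <head> tells Google which URL to index. -->
<link rel="canonical" href="http://www.yoursite.com/shoes" />
```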
1.4. Change of Address
You should inform Google when you move a website to a new domain. Google recommends that you follow a specific procedure to avoid losing your current ranking position.
1.4.1. First copy your content to your new domain. Do not close your original site just yet, because your new domain must be populated and verified first for a seamless switch. It is recommended to move one directory or section of your website first and check that everything is working correctly.
1.4.2. Set up a 301 Redirect for each page, so that visitors to your original web pages are redirected to the new site. More information on 301 Redirects here. Check that the redirects are working – both for you and for all links leading to the redirected pages, including external links.
1.4.3. Add the new site to Webmaster Tools and verify ownership. Then use the box in the ‘Change of Address’ section of Webmaster Tools to inform Google of the new address.
You are also advised to inform owners of the external links of your new address, and maintain the 301 Redirects on the old address until this has been completed as far as you can.
1.5. Users
You can use the ‘Users’ parameter to add or remove users or owners. Click either ‘Manage Site Owners’ or ‘Add a New User’ to the right of the page. You can create a new owner who has exactly the same permissions on your site as you have, or you can restrict owners to a subdomain. Users have certain restricted permissions – they cannot, for example, manage Google Analytics on your site – and you can also limit a user’s Webmaster Tools permissions to view only.
1.6. Associates
Associates are not permitted to view site data or alter any parameters, but have restricted permissions to work on your behalf. Get more information on Associates here.
2. Health
The ‘Health’ section of Google Webmaster Tools provides information on any problems associated with your website. For example, it points out crawl errors that are preventing Googlebot from accessing your site and offers details of each specific error. This section also lets you know whether or not your robots.txt file is working correctly, allows you to see your site as Google’s algorithm sees it and gives you information on how your pages are being indexed.
2.1. Crawl Errors
When you check the Crawl Errors page in Google Webmaster Tools, you will find two sets of results: a) Site Errors, and b) URL Errors.
Site Errors lists errors that prevent Googlebot from accessing your site at all (Googlebot is Google’s web crawler). The URL Errors section lists the URLs that Googlebot could not find. Details of a specific error can be seen by clicking on the listed URL.
You should correct these errors, which may be due to an internal link to a page that has been deleted, or to a link to an external page that no longer exists. The number of URL errors can affect your ranking, so do your best to resolve the problem, even if it means removing the link from your page.
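As an illustration, the kind of triage involved can be sketched in a few lines of Python. The URLs and status codes below are made up:

```python
# Sketch: sort a list of crawl results by HTTP status code, so that
# broken links (404/410) and server problems (5xx) stand out.

def triage_crawl_errors(results):
    """Split (url, status) pairs into broken links and server errors."""
    broken = [url for url, status in results if status in (404, 410)]
    server_errors = [url for url, status in results if 500 <= status < 600]
    return {"broken": broken, "server_errors": server_errors}

# Hypothetical crawl results for illustration only.
crawl_results = [
    ("http://www.yoursite.com/old-page", 404),  # deleted page still linked internally
    ("http://www.yoursite.com/about", 200),     # fine
    ("http://www.yoursite.com/feed", 503),      # temporary server problem
]

report = triage_crawl_errors(crawl_results)
```

Pages in the "broken" list need their incoming links fixed or removed; server errors are usually temporary and worth re-checking later.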
2.2. Crawl Stats
The crawl stats give an indication of Googlebot’s activity over the past 90 days (equate Googlebot with the Google spider). The stats are in the form of graphs that tell you how many pages were crawled each day, the time taken to download a page and how many kilobytes were downloaded each day. This lets you know whether or not your website is being adequately visited by Google.
2.3. Blocked URLs
This section informs you whether or not your robots.txt file is effectively blocking the pages you want hidden from Googlebot. For example, you may have a number of different pages offering the same information in different languages: Google sees these as duplicate content. You can avoid this by blocking all but the page in your preferred language.
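A minimal robots.txt along these lines might look as follows. This sketch assumes the duplicate-language pages live under hypothetical /fr/ and /de/ directories:

```text
# Hypothetical robots.txt: let crawlers index the English pages but
# block the duplicate French and German versions.
User-agent: *
Disallow: /fr/
Disallow: /de/
Allow: /
```

The Blocked URLs section then lets you test sample URLs against this file to confirm the right pages are being blocked.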
2.4. Fetch as Google
This tool enables you to see your web pages as Googlebot sees them. If a spammer has added text to your source code that you cannot see on the rendered page, you can find it by using this tool. If you feel your web page is not being listed as high as it should be, you might find out why by using ‘Fetch as Google‘ – you might spot content that you did not know existed.
2.5. Index Status
This tool informs you how many of the pages on your website Google has been able to index (not necessarily list). It gives the total number of pages indexed, the number ever crawled, the number not selected for indexing because they are too similar to other pages on the web (duplicate content), and the number of pages blocked by your robots.txt file.
You should look for a gradual increase in the pages indexed: in fact you should be adding pages to your site on a regular basis for best results. If you see a drop, then you may have server or Googlebot access problems. For more information on understanding these graphs, check Index Status here.
2.6. Malware
Google can detect any malware on your website. If any malware is reported, then Click Here for Google’s recommended action.
3. Traffic
The Traffic section covers the exposure of your web pages to Google search engine users for specific keywords, information on popular keywords being used for your type of website, and details of the impressions of and clicks on your listings. You will also find information relating to your ranking, and to the internal and external links that affect your Google PageRank.
3.1. Search Queries
This tool offers you information related to the search queries that have resulted in the searcher being offered pages from your website. In other words, the keywords or search terms used by Google users searching for information online that resulted in your web pages being listed. The listing position could be on any page in Google’s listing for any query (i.e. search). The following information is provided:
This gives the total number of searches in which your pages were presented to the searcher over the period covered – generally the last 30 days, although you can select a different period. The latest period is compared with the period immediately before it.
3.1.2. Query List
The query list provides you with a list of keywords used by Google users that resulted in your web pages being displayed at least once. Data for the top 2,000 queries (keywords) are listed. This can give hints as to the keywords to focus on for AdSense sites.
3.1.3. Impressions
The impressions figure is the frequency with which pages from your website appeared in the SERPs, together with the % change compared with the previous period. The tool uses the last 30 days by default, but you can change this. It tells you whether or not work is needed on individual pages to maintain traffic.
3.1.4. Clicks
The above information refers to appearances, or listings. ‘Clicks’ refers to how many times your listing attracted clicks. While your SEO might be good enough to get a listing, it may be on page 50 (i.e. around position 500). Alternatively, even if your page did get a page #1 listing, perhaps your Title or Description Meta tags were not inspiring enough to attract a click.
3.1.5. Click Through Rate (CTR)
This is the percentage of impressions of your page in the results that ended with a click through to your web page, compared with the previous period. Thus, if your CTR increases from 25% to 30%, this column will show +5.
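The arithmetic behind that column can be sketched as follows; the click and impression figures are invented for illustration:

```python
# Sketch of how the CTR column is calculated, using made-up numbers.

def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    return 100.0 * clicks / impressions

previous = ctr(clicks=250, impressions=1000)  # 25.0%
current = ctr(clicks=300, impressions=1000)   # 30.0%
change = current - previous                   # shown as +5 in the column
```

Note that the column shows the change in percentage points, not a percentage change, which is why 25% to 30% appears as +5.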
3.1.6. Average Position
The average position in the rankings for that specific search term (keyword) over the period chosen (default 30 days), and how this has changed relative to the previous period of the same length. Green is good! Only the top-ranked page is taken for the average, so if one person finds your site in positions #1 and #5 for a specific search term, and another finds it in #3 and #8, then the Average Position will be #2: (1+3)/2.
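The worked example above can be reproduced in a few lines of Python:

```python
# Sketch of the Average Position calculation: only the best (top)
# position per search is counted towards the average.

def average_position(searches):
    """searches: one list of positions per search that showed the site."""
    best = [min(positions) for positions in searches]
    return sum(best) / len(best)

# One searcher saw the site at #1 and #5, another at #3 and #8.
avg = average_position([[1, 5], [3, 8]])  # (1 + 3) / 2 = 2.0
```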
3.2. Links to Your Site
This tool lists all the web pages that are linking to your website (your backlinks). It also provides information on which URLs are providing most links, and the pages on your website that are receiving most links. Note that the two forms of your website URL – http://www.yoursite.com and http://yoursite.com – are listed separately, so make sure you have each added to Webmaster Tools. It is highly recommended that you select a preferred domain, then both will effectively be regarded as the same. These links, and their quality, determine your Google PageRank – an important factor in your search engine ranking.
3.3. Internal Links
This tool reports pages that have links from other pages on your website. This indicates to Google your opinion of the relative importance of each page in your site. You can maximize the internal PageRank of specific pages by the way you organize your internal linking structure. For example, do not interlink every page with every other; instead, offer links from other pages to those you want to rank highest. However, each page should have at least one incoming link, or it is classed as an ‘orphan’ page. Your internal linking strategy is an important factor in the PageRank of specific pages on your website.
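As a rough illustration only – Google’s actual algorithm is far more sophisticated and not public – a simplified PageRank iteration shows how an internal link graph concentrates rank on the pages that receive the most links. The site structure below is hypothetical:

```python
# Illustrative sketch (not Google's actual algorithm): a simplified
# PageRank iteration over a small internal link graph.

def simple_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base share, plus shares passed
        # along by the pages that link to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical site: every page links to "home"; "home" links out.
site = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home"],
}
ranks = simple_pagerank(site)
```

In this toy graph "home" ends up with the highest rank, because both other pages pass their rank to it; that is the effect the internal-linking advice above is exploiting.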
4. Optimization
Optimization refers to search engine optimization in terms of making it easy for search engines to visit your site.
4.1. Sitemaps
Fundamentally, a sitemap is an XML file listing all the pages on your website and how they are interlinked. This facilitates Google’s scanning of your site, and enables Googlebot to find pages it might otherwise have missed. There is no guarantee, however, that all of your web pages will be visited and indexed.
A sitemap is particularly useful for telling Google about any videos, audio files or graphics on your site. It is also useful if your site contains dynamic content that changes regularly, or if your internal linking structure is inefficient or incomplete. A sitemap helps as well if your website is new and you do not yet have many backlinks to lead Googlebot to your web pages.
Your sitemap should always be up to date, so regenerate it each time you add or remove a page. There is no need to resubmit it – simply replace the old sitemap in your root directory with the new one. You can find a sitemap generator here, or use a plugin if you have a blog.
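For reference, a minimal sitemap file looks something like this; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical minimal sitemap with two pages. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2012-09-01</lastmod>
  </url>
  <url>
    <loc>http://www.yoursite.com/about</loc>
    <lastmod>2012-08-15</lastmod>
  </url>
</urlset>
```

Each `<url>` entry gives a page’s address and, optionally, when it last changed; generators and blog plugins produce this file for you.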
4.2. Remove URLs
The Remove URL tool is useful if you want particular content on your website removed from Google’s index while the bulk of your listed pages remain indexed. It should only be used to remove indexed content urgently, such as confidential information you have included by accident. It must not be used simply to take an old or unwanted site offline, or to remove pages you have deleted – for ordinary page removal, simply update your sitemap.
4.3. HTML Improvements
This tool checks the Meta data in the Head section of your HTML for problems. It reports on duplicate Title tags and Meta descriptions and checks whether they are too long or too short. It also reports on non-indexable content. It is important to act on the information provided in this section of Webmaster Tools: Meta data are given a high weighting by Google in its ranking algorithm.
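The length checks are simple to reproduce yourself. This sketch uses rough rule-of-thumb limits, since Google does not publish the exact thresholds it applies:

```python
# Sketch: flag Title tags and Meta descriptions that fall outside
# commonly recommended lengths. The limits here are assumptions
# (rules of thumb), not Google's published thresholds.

def check_meta(title, description):
    problems = []
    if not 10 <= len(title) <= 70:
        problems.append("title length")
    if not 50 <= len(description) <= 160:
        problems.append("description length")
    return problems

issues = check_meta(
    title="Home",  # too short to be descriptive
    description="A clear one- or two-sentence summary of what this page offers visitors.",
)
```

Here only the title is flagged; the description falls inside the assumed range.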
4.4. Content Keywords
The Content Keywords section lists what Google sees as the most significant keywords and variants for the site. If the keywords at the top of this list do not conform to the intended focus of your site, then you should check your content and perhaps revise the pages that are rich in the keywords Googlebot found. You can click on a keyword to find a representative sample of the pages on which it appears. It might be worthwhile comparing this list with your list of queries, and comparing the keywords you are using with those used by Google users seeking information in your niche.
4.5. Structured Data
Structured data refers to such components of a web page as price lists, reviews and menus. Google can generate detailed information for viewers on the results page such as menu price ranges, average review ratings and even photographs of menu dishes. This information comes in the form of rich snippets, and can be generated for these content types: people, products, music, recipes, reviews, businesses and events.
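As an illustration, a review marked up with schema.org microdata – one of the formats Google reads when building rich snippets – might look like this. The business name, reviewer and rating are invented:

```html
<!-- Hypothetical example of schema.org microdata for a review. -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed">Mario's Pizzeria</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  Reviewed by <span itemprop="author">Jane Doe</span>
</div>
```

With markup like this in place, Google can show the star rating directly in the search listing.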
4.6. Other Resources
There are currently three ‘Other Resources’ on the Google Webmaster Tools website. These refer to:
4.6.1. Rich Snippets Testing Tool: You can use this tool to test whether or not your data are structured sufficiently for Google to create a rich snippet.
4.6.2. Google Places: You can have your business listed in the top positions for local searches by using Google Places – a free of charge listing service. The service lists the top 7 local businesses according to your keywords between the top sponsored listings and the regular search results. This is a very useful service for those whose products are geographically specific, such as local restaurants and stores. Your site can be listed above all the others outside of your immediate area.
4.6.3. Google Merchant Center: This is where you can upload data about your products so that they can be found by Google Product Search and other search services offered by Google. The Google Merchant Center is a useful service for those offering products online.
5. Labs
Google Labs offers experimental features for testing that have not yet been generally released. You can use them, although they can be changed or removed at any time.
5.1. Author Stats
Search statistics are provided for web pages for which you are the verified author. To become an author, you first have to generate a Google+ profile with a recognizable headshot photo. You must then verify that you are the author of a website, or other web content, by one of the two methods detailed On This Page.
5.2. Custom Search
This page in Google Webmaster Tools gives you access to a customizable search box for your own websites, and also for other sites and web pages. You can customize the appearance of the results to suit your needs.
5.3. Instant Previews
This enables you to enter any URL, including subdomains and internal pages on your website, and receive a preview of how it will look on screen.
5.4. Site Performance
This tool measures the time taken on average to load the pages on your website. It also shows you how the load time has varied over the past few months.
These are the main aspects of Google Webmaster Tools of which you should be aware. Each tool has a specific use, and you will likely do better to learn how to use each one properly in turn than to attempt to apply them all at once. Understand each tool, and why you are using it, or it will be easy to misinterpret the results.
Since it is estimated that Google takes around 65%-67% of all searches, it makes sense to take the firm’s advice on how to improve your website. When you add Google Analytics to all of this, you have a very powerful and comprehensive suite of tools to tell you just about everything you need to know about your website or blog – except how to make money! That is up to you, although it’s a lot easier with a page #1 ranking!