Google Webmaster Tools Tutorial – Part 2: Integrating Webmaster Tools into Your Website – Continued

Google Index

The Google Index area covers your website's index status, its content keywords, and the ability to remove certain URLs of your website from the search results. These features can be handy if you have private information that you don't want available to the public, but make sure you have read Google's rules first.

Index Status

Within the Index Status page you can view stats on how many pages of your site have been crawled and indexed by Googlebot. As the image below shows, Google also folds site updates into the report, and in our case has combined data from both our www.example.com domain and our example.com domain. Additionally, if you click the "Advanced" tab you can see how many pages have been blocked from crawling. Compare this figure against the number of pages you deliberately blocked from Googlebot, and take whatever action is necessary to correct any discrepancies.

[Screenshot: Index Status report]

Content Keywords

The Content Keywords tab lets you view the most prominent keywords on your website and how Google's algorithm has ranked them. To look at one in more detail, simply click a keyword and you will be taken to a new page listing variants of that keyword and the top URLs on your website where Google found it.

[Screenshot: Content Keywords report]

Remove URLs

If you ever need to hide certain pages of your website from Google's search results, this is the place to do it. If you haven't already declared a page off-limits to crawling in your robots.txt file, you can request that Google remove the page from its listings. To do this, simply click the "Create a new removal request" button and enter the URL of the page you want removed.

[Screenshot: URL removal request]

After you click Continue you will be taken to a new page that confirms the URL you have specified for removal and presents a drop-down menu asking for a reason. Depending on what you are trying to accomplish, you can choose to remove the page from both the search results and the search results cache, remove it from the cache only, or remove every page under that directory. When you've made your decision, click "Submit Request" and Google will review the URL and your reason for removing it.

[Screenshot: URL removal reason]
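Keep in mind that a removal request only hides a page from the results; to keep it out for good, the page itself should also tell crawlers not to index it. A minimal sketch of such a tag, placed in the page's head section (the exact page it goes on is up to you):

    <!-- In the <head> of the page you want excluded from Google's index -->
    <meta name="robots" content="noindex">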

Crawl

Within the Crawl area you will find tools that help you discover crawl errors, confirm that Google can see your webpages, review your crawling stats, test your robots.txt file, and view your uploaded sitemaps. All of these tools exist to make sure your site is fully visible to Google and that the spider can reach every part of the site you want it to, so it is worth taking a few minutes to confirm everything is in order.
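If you haven't built a sitemap to upload yet, the format is simple. Here is a minimal sketch following the sitemaps.org protocol, assuming a single page at www.example.com (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
    </urlset>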

Crawl Errors

On the Crawl Errors page Google will let you know if it has had any issues while looking through your website. Problems range from whether your DNS is resolving and your server is online to whether the bot was able to fetch your robots.txt file properly. You'll also notice a graph in the middle of the page that plots the number of not-followed URLs over time. If you have any, it is worth investigating why the bot could not follow them in the first place.

Crawl Stats

The Crawl Stats page gives you insight into the raw numbers between your website and Googlebot. Here you'll find the pages crawled per day, the kilobytes downloaded per day by the bot, and the time spent downloading a page, each presented with its high, average, and low values.

Fetch as Google

This tool can be great if you believe Google is having a problem crawling your website as a whole or just a specific part of it. To get started, just enter your URL in the provided field and click Fetch. If Google is able to crawl the page successfully, you will see green check marks under the Status column. Additionally, you can use this tool when you add new pages to your site and want them in the index sooner than Googlebot would find and index them on its own. To do this, simply click the "Submit to index" button. A box will pop up asking whether you want to crawl only that specific URL or the URL and its direct links.

[Screenshot: Fetch as Google results]

Robots.txt Tester

The robots.txt tester is as straightforward as it sounds. If you want to try out updates to the file before fully implementing them on your website, you can test them here to double-check that everything you want crawled is still reachable by Googlebot. In our example below, we have added a couple of directories that are part of a WordPress installation that we don't want crawled.

[Screenshot: robots.txt tester]
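For reference, a robots.txt file along the lines of the one in our example might look like this; the directories you block will depend on your own installation:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/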

URL Parameters

As Google directly states, unless Googlebot is experiencing problems with your website you shouldn't mess around with these settings, as a misconfiguration can cause your website to be deindexed from Google entirely. If you wish to learn more about this feature, check out Google's own video below.

Security Issues

Unless you have a major security issue currently running amok on your website, you won't see anything under this category. If for some reason you do, contact your website's administrator immediately so the issue can be addressed as soon as possible.

Other Resources

The Other Resources tab gives a quick rundown of other tools Google makes available. A few worth folding into your online strategy include Google Places (now Google My Business), the Structured Data Testing Tool, and Custom Search, which lets visitors search your website quickly.
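As a taste of what the Structured Data Testing Tool checks, here is a minimal sketch of schema.org markup written as JSON-LD; the organization name and URL are placeholders for your own details:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "http://www.example.com"
    }
    </script>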

Wrapping Up

If you’ve made it with us this far, you should now have a pretty good understanding of the basics of Webmaster Tools and an idea of how you can use them to better promote your website in search results. When you’re ready, you can head over to part 3 of this tutorial to learn more advanced tips and tricks to really
