Google Webmaster Tools – A Starter’s Guide
Hi,
Today I’ll start a series of posts about Google Webmaster Tools. These posts will be tutorial-like, and I’m planning to cover the basic elements of the tools. If you’re already familiar with them, don’t waste your time. You can read one of my previous posts about:
or you can go somewhere else.
Anyway, let’s start. First of all, you need a Google account for this. After you log in to your Webmaster Tools account you’ll see the dashboard. It will look like this:
All of your sites will be listed on the dashboard. I’ve marked three points on the screenshot. The first one is “Messages”, the link to the message center. From time to time you will get notices about some of your sites, and these notices will be listed in your message center. (Sometimes the message center is unavailable, and I don’t know why. Don’t ask.) So what notices can you get? Things like “crawl rate changed”, reconsideration request results and crawl problems.
* I’ll explain this “crawl rate” thing later.
* Reconsideration requests seem important, but honestly I haven’t gotten any results from a reconsideration request yet. Normally you can ask Google to reconsider a site of yours that has been banned (Google thinks your site is spammy? dangerous?). After a reconsideration request, all you can do is wait for a reply. I’ve been waiting for more than a year, and when I get a reply I’ll let you know.
* Crawl problems are the most important ones, as you may guess. You have to follow these notices and try to resolve them immediately.
The second mark is the “add site” form. Using this form you can add your site to your Webmaster Tools account. Of course, you’ll need to verify your site. After you add a site, it’ll appear in the third marked area. Later you can use these links to go directly to that site’s reports. So if you haven’t added a site yet, just type your URL and hit the “Add Site” button. Now you have a site listed on the dashboard and a cross under the “Verified” column. So let’s click it and verify your site. Verification can be done in two ways:
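The two ways are adding a meta tag to your homepage or uploading an HTML file that Google names for you. As a rough illustration, the meta tag variant looks something like this (the content value below is a placeholder; copy the exact tag from your own verification page):

```html
<!-- Goes inside the <head> of your homepage. The content value is
     site-specific; Google generates it for you on the verification page. -->
<meta name="verify-v1" content="YOUR_UNIQUE_VERIFICATION_STRING=" />
```

Whichever method you pick, leave the tag or file in place after verification, since Google rechecks it from time to time.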
After you choose one of the two ways, just hit the “Verify” button and Google will handle the rest. Once verification completes you can go to the “Overview” section. This is the starting point the next time you click your site on the dashboard. Here is a screenshot of it:
Let’s start from the top. The “Home page crawl” section gives the time of the last crawl of your homepage. If your site is new, it’ll take some time before anything shows up in this section. In order to get indexed as quickly as possible you can follow my way. It’s a good idea to keep track of these crawl times; later you’ll be able to see how often Googlebot visits your site, and of course how that frequency changes. “Index status” gives an overview of your site’s index status: whether some of your pages are included in the index or not, and whether the pages from your sitemap are included or not. You can find the details of inclusion in other sections, so for now let’s skip it.
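If you’d rather track crawl frequency yourself instead of relying on this section, you can count Googlebot hits in your server’s access log. Here’s a minimal sketch in Python, assuming an Apache/Nginx-style combined log format (the log path is hypothetical; adjust it for your server):

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path

# In the combined log format the timestamp sits inside [..],
# e.g. [12/Mar/2009:10:15:32 +0000].
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

visits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        # Googlebot identifies itself in the User-Agent field.
        if "Googlebot" in line:
            match = date_re.search(line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                visits[day] += 1

# One line per day: watch how the numbers change over time.
for day, hits in sorted(visits.items()):
    print(day, hits)
```

Run it every week or so and you’ll see the trend in crawl frequency without waiting for the Overview page to update.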
Below that we have an important section: “Web crawl errors”. Let’s go over them:
* Errors for URLs in Sitemaps: This gives the number of erroneous URLs listed in your sitemap. If your sitemap is auto-generated (the output of a plugin, etc.), the URL structure will most probably be correct, so the errors will be due to server downtime or something like that. You have to view the “Details” and inspect the errors. If URLs are broken you should remove them from your sitemap; it’s a really bad idea to provide broken links in your sitemap. (I’ve put a small link-checking sketch after this list.) After you correct the problems, these errors will disappear during the next crawl.
* HTTP errors: This section contains URLs that return an HTTP error (401, 404, 407, etc.): “Article not found”, “Item not found” and so on. First of all, you have to think about why such a URL exists at all. How was Googlebot able to reach that URL? Who gave it a broken link? Maybe you changed your URL structure lately and created some broken links?
* Not found: Again, broken links (HTTP 404).
* URLs not followed: These errors are mostly due to redirects. You should always be careful with redirects.
* URLs restricted by robots.txt: I’ll go over robots.txt in a later post. If you don’t know what robots.txt is and some URLs are listed in this section, then there is a problem. You can use the robots.txt file to keep some of your URLs from getting indexed, so if this list contains a URL that you do want indexed, inspect your robots.txt (there’s a sample file after this list).
* URLs timed out: This section is also important. If Googlebot encountered a timeout, there is probably an issue with your web server, or maybe your pages are simply too large.
* Unreachable URLs: Get rid of these URLs or make them reachable.
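As promised in the sitemap item above, here’s a rough sketch that fetches a sitemap and checks every URL in it, flagging the kinds of problems from the list: broken links, redirects and timeouts. It uses only Python’s standard library; the sitemap address is a placeholder, so substitute your own:

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://example.com/sitemap.xml"  # placeholder; use your own

LOC_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

# Download the sitemap and pull out every <loc> entry.
with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)
urls = [loc.text.strip() for loc in tree.iter(LOC_TAG)]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as r:
            # urlopen follows redirects, so a changed final URL means this
            # sitemap entry redirects somewhere -- worth fixing in the sitemap.
            if r.geturl() != url:
                print(url, "redirects to", r.geturl())
    except urllib.error.HTTPError as exc:   # 4xx / 5xx responses
        print(url, "-> HTTP", exc.code)
    except Exception as exc:                # timeouts, DNS failures, etc.
        print(url, "-> FAILED:", exc)
```

Anything the script prints is a candidate for removal from (or correction in) your sitemap before Googlebot trips over it.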
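And since robots.txt came up in the “URLs restricted” item: it’s a plain-text file served from your site root (http://example.com/robots.txt) that tells crawlers what not to fetch. A minimal example, with made-up paths:

```
# robots.txt -- applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

If a URL you want indexed matches one of your Disallow rules, it will end up in the “URLs restricted by robots.txt” list, so that’s the first place to look.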
I guess this is enough for this post. I’ll continue later. See you.