Webmaster Tools – An online marketer's best friend

Webmaster Tools is widely ignored by a good number of online marketers, but why? It is the closest you can get to controlling how Googlebot actually reads your website. You have the ability to tell Googlebot which pages to crawl and which not to crawl via the robots.txt file. In Webmaster Tools this lives under the Health tab; to stop Google crawling a page, simply add "Disallow: /path-to-your-page" to your robots.txt. This is mainly used to block the admin pages of a CMS.
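
For example, a minimal robots.txt entry might look like this (the /wp-admin/ path is just an illustration taken from WordPress; substitute your own CMS's admin path):

    # Tell Googlebot not to crawl the CMS admin area
    # (/wp-admin/ is an example path - use your own CMS's admin path)
    User-agent: Googlebot
    Disallow: /wp-admin/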

Another tool in Webmaster Tools that can be a best friend to online marketers is the Disavow Tool. This is one of the more recent tools Google has introduced to fight spam and links that don't comply with its guidelines. Especially now that Panda is becoming a constant part of the search algorithm, this tool is crucial for discounting the bad links that could be harming your site. How do you find these bad links, you might ask? The answer, again, is Webmaster Tools.

To find the bad links pointing to your website, go to Traffic > Links to Your Site, where you will see every website linking to you. It is advisable to review these sites: if a bad site is linking to you (especially if you also link back to it), it could harm your site too, and you will need to use the Disavow Tool to tell Google that the site has nothing to do with yours.
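
Once you have compiled the offending links, you put them in a plain text file and upload it through the Disavow Tool. A minimal sketch of the file format, using made-up domains purely for illustration:

    # Sites I contacted for link removal and got no response
    # (these domains are made-up examples)
    domain:spammy-directory.example
    domain:paid-links.example
    # Individual pages can be disavowed by full URL instead
    http://blog.example/bad-links-page.html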

Tracking is such an important factor in SEO, and yes, we have Analytics for that, but what about the health of your website? Webmaster Tools has advanced features that concentrate on the movements of Googlebot around a site, such as Crawl Stats, which shows how many pages it crawls per day. Being able to see exactly what Googlebot is doing is one of the most useful things an online marketer can have.
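
If you want to cross-check what Webmaster Tools reports, your own server logs tell the same story. A rough sketch, assuming an Apache access log in its default location and the standard combined log format (adjust the path and format for your own server):

    # Count which URLs Googlebot requests most often
    # (field 7 is the request path in the combined log format)
    grep "Googlebot" /var/log/apache2/access.log \
      | awk '{print $7}' | sort | uniq -c | sort -rn | head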

Lastly, what can Webmaster Tools do to help a website that has been penalised? Everything to get it back into Google's good books, is the short answer. It gives you the chance to remove whatever has harmed the site, via the Remove URLs tab or the Disavow Tool for link removals, but more importantly you can use "Fetch as Google" to prioritise certain pages for re-crawling. This is useful for crawl errors, such as DNS errors, that temporarily stopped Google reaching a page. For bigger problems, such as harmful links pointing to your site or any other pre-penalisation issue that has since been fixed, you can send Google a reconsideration request. This is basically you telling Google that you have done what it asked, and asking whether the website is ready to be indexed again.
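
Before using "Fetch as Google", it can be worth confirming from your own machine that the problem really is fixed. A quick sketch using standard command-line tools (www.example.com and the page path are placeholders for your own site):

    # Check the domain resolves - a lingering DNS error would return nothing
    dig +short www.example.com

    # Check the page responds - you want to see a 200 status in the reply
    curl -I http://www.example.com/fixed-page/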

So, as you can see, all of the above features are crucial to us online marketers. I hope this helps you think a little more when using Webmaster Tools, as it is there to help us out!
