Conducting 15-minute checks with Google Webmaster Tools


So you just launched your awesome new site. You have a bunch of great content you spent hours creating, but over time you realize people just don’t find your stuff. Even when you do a Google search for your own site and its content, your site is nowhere near the top five or even the first page of results.

Being the DIY champ that you are, you head back to Google and search for answers. You look for ranking tips, watch video tutorials from web marketing gurus, and spend countless hours learning. Then, lo and behold, you stumble upon Google Webmaster Tools.

Google Webmaster Tools gives you exactly what you need to know to optimize your site. Here at CaMP Analytics, we make full use of this tool to make sure our sites do well under the rules dictated by the search engine giant. GWT will tell you what you’re doing wrong, so it helps to know the rules and work within them.

Here are a few ways we optimize our sites, Google God style.

In the Google Webmaster Tools dashboard you’ll see several options categorized by function. Let’s start with the Site Messages section.


1. Site Messages Section

This section shows messages or warnings that need urgent attention. The categories here (DNS, Server Connectivity, and Robots.txt Fetch) should always show a green icon with a check mark symbol.


2. Structured Data

The tool also offers you a way to check your structured data the way Google “sees” it and how it is presented in search results, including your authorship data. This feature comes in handy when you need to check how your structured data shows up on search pages.
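If you’re adding structured data to a page, one markup format the testing tools can read is JSON-LD (microdata and RDFa are also accepted; check Google’s current documentation for which formats each feature supports). A minimal sketch for an article page, with placeholder values rather than anything from a real site:

```html
<!-- Hypothetical example: placeholder headline, author, and date. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Conducting 15-minute checks with Google Webmaster Tools",
  "author": {
    "@type": "Person",
    "name": "Your Name"
  },
  "datePublished": "2014-05-01"
}
</script>
```

Paste markup like this into the structured data testing tool to confirm Google parses it the way you intend before it goes live.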


3. HTML Improvements

This section flags metadata problems: duplicate meta descriptions, descriptions that are too short or too long, and other issues found on your pages.

As a rule of thumb, our writers always follow the standard character counts for meta descriptions and title tags. The required counts change from time to time, so we also make it a point to keep our data up to date. As of May this year, Google’s metadata character limits are 55 characters for title tags and 155 for meta descriptions.
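Limits like these are easy to check automatically before a page ships. A minimal sketch in Python, using the 55/155 figures quoted above (the limits themselves change, so keep them in one place where they’re easy to update):

```python
# Character limits quoted in this post (May 2014); update as Google changes them.
TITLE_LIMIT = 55
DESCRIPTION_LIMIT = 155


def check_meta(title, description):
    """Return a list of human-readable warnings for over-long metadata."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(
            "title is %d chars (limit %d)" % (len(title), TITLE_LIMIT))
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(
            "description is %d chars (limit %d)"
            % (len(description), DESCRIPTION_LIMIT))
    return warnings
```

Run it over every page’s title and description and an empty list means you’re within the limits.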


4. Sitelinks

Sitelinks, the extra links shown under a result when someone searches for your site by name, can be demoted to reduce the chances of a particular link appearing. Demoting, however, needs proper testing and investigation. A common reason to demote is a link that has become dated; for example, a product is unavailable or is offered only seasonally.

Before demoting, it is advisable to check the site traffic and search queries coming through that sitelink.


5. Index Status

Check the Advanced tab of the Index Status report and examine the total pages indexed, the pages blocked by robots.txt, and the number of pages that were removed.

If you notice a gap between the number of pages that should be indexed and the number actually indexed, or an unexpected change in those numbers, you have to investigate immediately.
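A handy baseline for “pages that should be indexed” is the URL count in your XML sitemap, which you can then compare against the Index Status total. A minimal sketch, assuming a standard sitemaps.org-format sitemap:

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemaps.org protocol.
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def count_sitemap_urls(sitemap_xml):
    """Count the <url> entries in a sitemaps.org-format XML sitemap."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url", SITEMAP_NS))
```

Fetch your sitemap, run it through this counter, and if the result is far above the indexed total in GWT, start digging.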


6. Crawl Errors

The URL Errors report lists the specific errors Google encountered when trying to crawl your site on desktop and other devices. In the past, Google reported page errors under the redirecting URL; it has since changed this, and now reports the final destination URL where the errors were found.

If any of the categories under Crawl Errors shows a 100% error rate, the site is likely down or misconfigured. Here are a few items to check first:

  • Check that site reorganization hasn't changed permissions for a section of your site.
  • If your site has been reorganized, check that external links still work.
  • Review any new scripts to ensure they are not malfunctioning repeatedly.
  • Make sure all directories are present and haven't been accidentally moved or deleted.


7. Robots.txt tester

When you click on the robots.txt Tester, a popup message appears if Google is unable to fetch your robots.txt file. The alert is called Robots.txt Fetch Fail; when it appears, immediately check the actual file and inform your developer. Other errors are highlighted in red or yellow.

Google also highlights crawl errors right in the robots testing tool and allows you to edit and test the new version of this file.
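You can run the same kind of allow/block check outside the browser with Python’s standard-library robots.txt parser. A minimal sketch, with made-up rules and URLs for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary pages are crawlable; anything under /private/ is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))      # True
print(parser.can_fetch("Googlebot", "https://example.com/private/report")) # False
```

It’s a quick way to sanity-check an edited robots.txt against a list of important URLs before you upload the new version.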


8. Security Issues

This newer feature helps pinpoint issues when your site has been hacked, making it easier to recover.


9. PageSpeed Insights

Google PageSpeed Insights is a neat function for testing your site’s performance on mobile and desktop. Of course, performance will vary with your users’ network connections, so Google only considers network-independent components. These include the use of images, JavaScript and CSS, server configuration, and the HTML structure of your page.

A green check mark means you passed their rules, while the red and yellow exclamation point icons mean there are issues that need to be addressed immediately.

Google will also grade your site’s speed performance with a score between 0 and 100. Here at CaMP we always aim for a mark of 85 and above on both mobile and desktop.

There you have it. Now it’s your turn to set up your GWT account and check whether your site passes Google’s rules and standards. If you’d rather not mess with all the tech stuff and need help, drop us a line here or check out our services.