Webmaster Tools
Google Search Console allows us to check the indexing status of our websites and to optimize their visibility in search.
A site that's actively managed in Webmaster Tools has a better shot at ranking well.
Sitemap.xml
A sitemap.xml is simply a list of the pages on your website that search engines should crawl and index.
All our important pages should be in our sitemap – otherwise Google might not crawl and index them.
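A minimal sitemap.xml might look like this (the domain, paths, and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per important page -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products</loc>
  </url>
</urlset>
```

The file is usually placed at the root of the site and submitted through the Sitemaps section of Webmaster Tools.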
Robots.txt
A robots.txt file does the opposite: it tells search engines what they may not crawl.
For example, when we do online banking, we don't want Google indexing our private pages – so those URLs are disallowed in the robots.txt.
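A robots.txt for a site like this might look as follows (the paths are illustrative, not taken from any real bank):

```
User-agent: *
Disallow: /account/
Disallow: /secure/

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain (`/robots.txt`). Note that robots.txt only blocks crawling – for pages that must never appear in results, additional measures such as authentication are still needed.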
Errors
Server Errors 5xx
When you see this kind of error for your URLs, it means Googlebot
couldn't access your URL: the request timed out, or your site was busy.
As a result, Googlebot was forced to abandon the request.
Errors
Not Found 4xx
A 404 error is returned when Googlebot attempts to visit a page that
doesn't exist – either because you deleted or renamed it without
redirecting the old URL to a new page, or because of a typo in the link.
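When a page has been renamed, a permanent (301) redirect from the old URL to the new one fixes the 404. A minimal sketch, assuming an nginx server (the paths are placeholders):

```nginx
# 301-redirect a renamed page so the old URL keeps working
location = /old-page {
    return 301 /new-page;
}
```

Other servers have equivalents (e.g. `Redirect permanent` in Apache); the point is that the old URL should answer with a 301, not a 404.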
Errors
Soft 404
Usually, when a user tries to reach a page that doesn't exist, they get a
404 error, and the missing content won't be crawled or indexed by search
engines.
A soft 404 occurs when your server returns a real page for a URL that
doesn't actually exist on your site. This usually happens when your
server treats faulty or non-existent URLs as "OK" and redirects the
user to a valid page like the home page or a "custom" 404 page.
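A rough way to spot soft 404 candidates in your own crawl data is to flag URLs that answer 200 OK but whose body reads like an error page. A minimal sketch – the function name and phrase list are our own illustration, not anything Google exposes:

```python
def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Flag a likely soft 404: the server answers 200 OK,
    but the page body reads like a 'not found' message."""
    if status_code != 200:
        return False  # real errors (404, 5xx) are reported as-is
    phrases = ("page not found", "does not exist", "no longer available")
    text = body.lower()
    return any(phrase in text for phrase in phrases)
```

Running this over a crawl log highlights URLs worth checking; the real fix is to make such URLs return a genuine 404 (or 410) status.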
Errors
Access Denied
To crawl a page, Googlebot must be able to access it. If you're seeing
Access Denied errors, it may be because:
• Your site requires users to log in to view all or some of your content.
• Your robots.txt file is blocking Googlebot.
• A proxy, or your hosting provider, is blocking Google from accessing
your site.
Sitemaps
Not all pages get indexed, especially if you have blocked them in your
robots.txt file.
Watch out for large numbers of pages dropping out of the index – this
might be a sign of a Panda penalty.
Fetch As Google
Fetch As Google allows you to check whether:
• Your URLs are blocking Google
• Your pages are getting redirected
• Your website is temporarily down
• Your website is unreachable
It also lets you submit your pages to Google's index.
Manual Actions
If Google thinks you are violating their web spam guidelines – the same
kinds of issues targeted by Penguin and Panda – then they may issue you
a Manual Action.
These are not to be ignored. If you are served a Manual Action, you are
likely to be seriously demoted or removed from Google until you fix the
issue and ask for reconsideration.
Manual Actions
If your site isn't appearing in search results, check the Manual Action
page. If you think you are adhering to the guidelines, you can request a
review of your site directly from the Manual Actions page.
Messages
Types of messages:
• Sudden spikes or drops in your impressions or clicks (search queries)
• Your DNS server is down or misconfigured
• Your web server is firewalled off
• Your web server is refusing connections from Googlebot
• Your web server is overloaded or down
• Your site's robots.txt is inaccessible
• An increase in crawl errors
• Google thinks you are behaving in a spammy way
• Your CMS is due an update (e.g. WordPress)
• Malware has been detected on your site
• A Google+ link request
Sitelinks
If you rank well in Google, it may start showing multiple links to pages
of your website beneath your main result. We cannot choose which pages
appear as sitelinks, but we can demote the ones we don't want.