Breaking Bad SEO - The Science of Crawl Space


My original slide deck, delivered at the BrightonSEO conference back in 2014...
So Why Breaking Bad?
It's the story of a high school chemistry teacher who discovers he has cancer and starts producing crystal meth to earn some fast cash to help fund his treatment and leave some money for his family if he dies.
Got me thinking about SEO as a science - it requires knowledge, skill and regular testing to deliver the right results or improvements.
Even the lead characters reflect certain aspects of the SEO industry...
There is an episode called Crawl Space, which ties in nicely with this topic.

  1. Breaking Bad SEO The Science of Crawl Space
  2. Meet Walter…
  3. Jesse Pinkman
  4. The Totality of Possible URLs For a Website “[You]…have several options to optimize the “crawl space” (the totality of URLs on your site known to Googlebot) for unique content pages, reduce crawling of duplicative pages, and consolidate indexing signals.” Google Webmaster Central Blog http://googlewebmastercentral.blogspot.co.uk/2014/02/faceted-navigation-best-and-5-of-worst.html What is Crawl Space?
  5. Size Does Matter…
  6. • domain.com • domain.com/home • domain.com/home/ • domain.com/Home • domain.com/?source=ppc • domain.com/index.html How’d I Get so Many Pages?
  7. For SEO… There Can Only Be One!
  8. But Don’t Lose Your Head
  9. • URL Universe • Invoked URLs • URLs In Sitemaps • Crawlable URLs • Indexable URLs • Canonical URLs • Indexed URLs • PPC Landing URLs • PPC Display URLs • Socially Shared URLs • Organic Landing Page URLs • Externally Linked URLs • Mobile URLs • Translated/Regional URLs • Shortened URLs • Domain Duplicates What Makes Up Your Crawl Space?
  10. Poor management has implications: • Orphaned pages • Incomplete sitemaps • Dilution of backlinks and shares • Performance is harder to track • Limited volume of unique URLs in analytics • Crawl inefficiency for SEO • Traffic growth strategies just won’t work The Threats
  11. Do social signals impact organic rankings? • Twitter? • Facebook? • Google+ Potentially Social and Backlink Synergy
  12. Let’s Cook!
  13. Be the Boss (of URLs) • An SEO needs to be a great cook… • Make sure you have the right equipment • Follow an accurate formula • Use quality ingredients • Take ownership of your crawl space • Benchmark • URL Roadmap The Cook – YOU!
  14. The Lab
  15. Maximised + Minimised = Optimised Maximise indexable space • Increase volume of valuable pages • Increase crawl efficiency Minimise crawlable space • Define your crawl space • Identify and eliminate threats Optimise canonical space • A clean version of your website • Your URL à la carte The Formula
  16. • Robots.txt • Noindex • Redirects • XML Sitemaps • Internal linking • Inbound link equity • Landing page URLs • Canonical tag implementation • Consistent OG tags • Mobile setup • Hreflang Use Organic Ingredients
  17. So What’s the Recipe?
  18. Crawl Space Solutions in One… New Universal Crawl is now out of Beta! Review website, XML sitemaps & organic landing page data in one Universal Crawl with Deep Crawl. Take advantage of a significant head start in defining, managing & optimising your crawl spaces. Universal Crawl is the New Heisenberg…
  19. Deep Crawl Goes Universal
  20. Understand what’s in your crawl space… • Assess indexation reports • Review the current index • Test URL parameter changes • Quick improvements through GWT Parameter settings Identify Indexable & Non-indexable Parameters
  21. Make sure you’re not self harming • Check canonical implementation • All canonical pages should be linked internally • Assess pages without canonical tags Canonical URL Configuration
  22. Which URLs Are Being Shared?
  23. Where’s your thin content hiding out? Identify Low Value Navigational Pages
  24. Take a good look at your analytics… • Review URLs delivering minimal traffic • Identify and assess URLs outside of your canonical setup Identify Low Value Non Navigational URLs
  25. Identify Domain Aliases
  26. Monitor your domain portfolio & keep alert! www.robotto.org
  27. Check all disallowed URLs… • Webmaster Tools • Deep Crawl Indexation Reports Disallowed URLs
  28. Review & Validate all linked URLs… Identify All Linked URLs
  29. Crawl your sitemap regularly… • Run analysis and compare: • Scheduled sitemap crawls • Scheduled website crawls • All validated, linked URLs Compare Sitemaps
  30. Where’s the link equity? • Identify pages delivering traffic but not internally linked • Understand the link profile of all pages: • Crawl aggregated link data • DeepCrawl automatically applies link metrics to all reports: Compare Landing Page URLs to Linked URLs
  31. Watch your Language… Check your Hreflang! International SEO Considerations
  32. • Universal Crawl tests implementation across: • Sitemaps • Headers • On-page • Review a matrix of language alternatives for each page • Assess gaps and inconsistencies in the setup • Review a ‘Pages Without Hreflang Tags’ report • See David Sottimano’s MOZ post on HrefLang: http://moz.com/blog/hreflang-behaviour-insights International SEO Considerations
  33. • Review your options and consider your URL Universe • Set up your lab • Google Analytics • Google/BING Webmaster Tools • Deep Crawl – Universal Crawl • Follow the formula: • Maximised + Minimised = Optimised • Develop and test new recipes to focus your crawl spaces. #BreakingBadSEO
  34. Thanks, Keep in Touch… Deep Crawl www.deepcrawl.co.uk @deepcrawl Tony King @ToastedTeacake #breakingbadseo

Editor's notes

  • Why Breaking Bad?
    Story of a high school chemistry teacher who discovers he has cancer and starts producing crystal meth to earn some fast cash to help fund his treatment and leave some money for his family if he dies.
    Got me thinking about SEO as a science - requires knowledge, skill and regular testing to deliver the right results or improvements.
    Even the lead characters reflect certain aspects of the SEO industry…
  • Meet Walter, the chemistry teacher.
    Despite getting his hands pretty dirty throughout the series, Walt is fundamentally a good guy, and as the star of the show he should be considered a white hat SEO from a search perspective.
    As a scientist, Walt is methodical in his approach, he understands the principles and practices needed to achieve results and has the skill to produce the very best crystal meth.
  • Jesse Pinkman on the other hand is a Black Hat SEO.
    An ex-student of Walt’s who failed chemistry at school and became a drug dealer.
    Jesse knows his industry and has the right contacts to get by, but he isn’t really interested in delivering the very best results or long term plans.

    So why "Crawl Space"?
  • The Totality of Possible URLs For a Website
    Google regularly refer to crawl space - it’s fundamentally about knowing your site architecture, the first step towards successful website optimisation.
    Identifying your crawl space is the first step towards knowing that architecture.

    Why should we care about crawl space?

  • Because size matters.
    Google warn you about potential crawl space issues in GWT and outline some of the implications…
    Unnecessarily crawling a large number of duplicate content URLs
    Discovering undesired parts of your site
    Consuming more bandwidth than necessary
    Potential inability to completely index all of your site.
  • So check what’s in the search engine indexes already, is it a realistic number of pages for your site?
    Do you recognise the URL structure/formats being returned?
    The most common contributor to a large crawl space is duplicate content - seen here with BooHoo.com - which can arise for a number of valid and invalid reasons.

    For SEO, when it comes to URLs per page…
    (There can only be one!)
  • There can only be one.
  • But don’t lose your head…
    Auditing your Crawl Space will help you understand the full mix of URLs and help formulate a consistent implementation.
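To make the "one page, many URLs" problem concrete, here is a minimal Python sketch (standard library only) that collapses the kind of variants shown on slide 6 into a single normalised URL. The normalisation rules and the domain.com URLs are illustrative assumptions, not a universal recipe - lowercasing the path, for instance, is only safe if the server really treats /Home and /home as the same page.

```python
# Rough sketch: collapse common URL variants (case, trailing slash, tracking
# parameters, index files) to see how many "real" pages a crawl actually found.
# The normalisation rules are assumptions and must be validated per site.
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

TRACKING_PARAMS = {"source", "utm_source", "utm_medium", "utm_campaign"}

def normalise(url):
    parts = urlparse(url)
    path = parts.path.lower().rstrip("/")
    if path.endswith("/index.html") or path.endswith("/home"):
        path = path.rsplit("/", 1)[0]
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k.lower() not in TRACKING_PARAMS])
    return urlunparse((parts.scheme, parts.netloc.lower(), path or "/", "", query, ""))

crawled = [
    "http://domain.com",
    "http://domain.com/home",
    "http://domain.com/home/",
    "http://domain.com/Home",
    "http://domain.com/?source=ppc",
    "http://domain.com/index.html",
]
print(len(set(normalise(u) for u in crawled)), "unique page(s)")  # -> 1
```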
  • To build a picture of your site architecture and to discover your complete crawl space, you need to consider what contributes to your URL Universe – the place where all theoretically possible URLs exist.
    Invoked URLs - all of the URLs ever brought into existence, for any reason.
    Uninvoked URLs - URLs that haven’t been invoked (but could be?)
    Internally Linked URLs - all invoked URLs which are linked from other internal pages.
  • A lack of Crawl Space management can leave you wide open to threats, and lead to a whole load of GWT warnings!
  • Who cares!? Even if social signals are not used in a ranking algorithm, social media and SEO are both able to drive discovery at scale. They both have enormous reach so understanding the crawl space for search & social is crucial.
    URL aggregation is not necessarily required, but I’d recommend it.

    So let’s take a look at how to tackle all this…
    Let’s cook.
  • To get going we need all the components:
    A Cook
    A Lab
    The Formula
    Ingredients
    Recipe
  • Just like Walt, you need to be the boss, the head chef, the daddy, ideally a scientist – though that’s not a prerequisite if you have the right tools to help you take ownership of your crawl space.

    The Lab…
  • This is Jesse’s RV – effectively a mobile meth lab. We need to set up our lab with the tools and equipment necessary to develop search marketing recipes, and link up a range of data sources to help us understand the crawl space:
    Webmaster Tools - Google & Bing
    Landing page data (Analytics)
    Linked URL data - WMT/Ahrefs/Majestic/OSE
    Website crawler (Xenu/Screaming Frog/DeepCrawl)
    DC’s new Universal Crawl now includes Google Analytics landing page data, along with link equity and social tagging reports to assess a comprehensive crawl space in one.

    The Formula…
  • Maximised + Minimised = Optimised
    Discovery, management and optimisation of your crawl space is essential and lays the foundation for strong performance.
    All spaces need to be carefully defined and managed efficiently.
    Maximise indexable space, Minimise crawlable space, Optimise canonical space.

    Thankfully the search engines empower you with some great ingredients to help you develop your recipes…
  • As SEOs, we have a whole host of juicy ingredients at our disposal to help cook up an optimised crawl space - picking the very best ingredients will make all the difference when testing your recipes.

    Use the custom controls in DC to extract and schedule regular data comparisons for each source…
    Overwrite robots.txt rules to assess full crawl space as well as the one delivered to search engines.
    Schedule regular sitemap crawls and compare against internal and external links, review canonical setup and index controls to maintain consistency.
    - DeepCrawl Backlinks crawl
    - OG tags & Hreflang DeepCrawl report

  • New Universal Crawl out of Beta – Capture and audit comprehensive website, XML sitemap & organic landing page data in one Universal Crawl - designed to capture and inform your crawl space optimisation for search and social using a wide range of data sources.
  • DC’s Universal Crawl helps you quickly and easily understand your crawl space and identify URL sources, gaps, new formats and traffic value. You also get all the regular DC features, including fully customisable crawl settings, the ability to compare a test environment against the live site (supporting QA), custom extraction tools, and scheduled crawls that record change and impact and help quantify SEO deliverables.

    Let’s take a look at some recipes to control your crawl space…

    DeepCrawl automatically shows you what’s changed between crawls so you can understand how much of the site is changing.
    You might spot some URL formats which are changing very frequently and affecting crawl efficiency.
  • DeepCrawl indexation reports help you quickly assess all crawlable URLs, unique pages, noindex pages and identify URL parameters that you might not want indexed.
    Use Webmaster Tools data and site: checks to understand current search engine indexation, and use the Parameter Removal controls in DeepCrawl to test the impact of stripping parameters. Monitor crawl efficiency changes and, for a quick win, update your URL parameter settings in Webmaster Tools when you find the right formula.
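As a rough illustration of testing parameter changes before touching Webmaster Tools settings, here is a small Python sketch (standard library only) that estimates how many crawled URLs would collapse into one if a given parameter were ignored. The URLs and the 'sort' parameter are placeholders, not taken from the deck.

```python
# Sketch of the parameter-stripping idea: estimate how much of the crawl space
# collapses when one query parameter is ignored. Swap in your own crawl export.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_param(url, param):
    p = urlparse(url)
    q = urlencode([(k, v) for k, v in parse_qsl(p.query) if k != param])
    return urlunparse((p.scheme, p.netloc, p.path, "", q, ""))

crawled = [
    "https://example.com/dresses?sort=price",
    "https://example.com/dresses?sort=newest",
    "https://example.com/dresses",
    "https://example.com/shoes?sort=price",
]
before = len(set(crawled))
after = len({strip_param(u, "sort") for u in crawled})
print(f"{before} crawled URLs -> {after} once 'sort' is ignored")
```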
  • Use the ‘Canonicalized Pages’ reports to ensure your canonical implementation is correct.
    Identify canonical URLs which aren’t linked internally with the ‘Unlinked Canonical Pages’ report – you’d expect every canonical URL to be linked somewhere internally.
    The ‘Pages without Canonical Tag’ report shows you pages that are missing canonical tags; make sure there aren’t any important pages included here.
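A minimal spot-check of canonical implementation along these lines could look like the Python sketch below. It assumes the requests and beautifulsoup4 packages are installed and uses placeholder URLs; it is the underlying idea only, not how DeepCrawl works internally.

```python
# Sketch: fetch a page, read its rel="canonical" tag, and flag missing or
# mismatched canonicals. A real crawler does this for every page at scale.
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    return tag.get("href") if tag else None

for url in ["https://example.com/", "https://example.com/?source=ppc"]:
    canonical = canonical_of(url)
    if canonical is None:
        print(f"MISSING canonical tag: {url}")
    elif canonical != url:
        print(f"{url} canonicalises to {canonical}")
```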
  • Check the consistency of your canonical URLs against social URLs – are your OG & TwitterCard URLs consistent? DeepCrawl has a report called ‘Inconsistent Open Graph and Canonical URLs’ to automatically show any errors.

    You can also schedule custom extraction crawls to regularly assess social share equity changes for specific URLs using DC.
    Likewise for blog comments etc.
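The Open Graph vs canonical consistency check boils down to comparing two tags on each page. A hedged Python sketch of that comparison, again assuming requests and beautifulsoup4 and a placeholder URL:

```python
# Sketch: flag pages where og:url (what gets shared) differs from the
# canonical URL (what gets indexed), diluting share signals.
import requests
from bs4 import BeautifulSoup

def share_vs_canonical(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical_tag = soup.find("link", rel="canonical")
    og_tag = soup.find("meta", property="og:url")
    canonical = canonical_tag.get("href") if canonical_tag else None
    og_url = og_tag.get("content") if og_tag else None
    if canonical and og_url and canonical != og_url:
        print(f"Inconsistent: canonical={canonical} og:url={og_url}")

share_vs_canonical("https://example.com/some-page")
```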
  • Use DeepCrawl Min. content/html ratio reports to find potential Panda pages linked internally.

    All pages with less than 10 percent minimum content/HTML ratio. 
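The content/HTML ratio itself is simple to approximate: visible text length divided by raw HTML length. A rough Python sketch of that calculation, assuming requests and beautifulsoup4 and a placeholder URL; the 10 percent threshold simply mirrors the report described above.

```python
# Sketch: estimate the text-to-HTML ratio of a page to surface potentially
# thin content. Scripts and styles are stripped before measuring text.
import requests
from bs4 import BeautifulSoup

def content_html_ratio(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = soup.get_text(separator=" ", strip=True)
    return len(text) / max(len(html), 1)

url = "https://example.com/category/page-2"
ratio = content_html_ratio(url)
if ratio < 0.10:
    print(f"Potentially thin page ({ratio:.1%} text): {url}")
```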
  • Review and assess if you can make your crawl more efficient by excluding certain pages. Check your non-canonical URLs and use Universal Crawl to see non-navigational pages not driving traffic.
  • Understanding who’s working with you or who’s against you is important too. Review your domain aliases.
    Test all registered domains to check if they return a duplicate or redirect.
    Check www/non-www configuration and HTTP/HTTPS.
    Implement redirects or use cross domain canonical setups.
    Monitor domains using Robotto.org
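A domain alias review can be scripted as a simple status-code check: does each registered domain permanently redirect to the canonical host, or does it answer with duplicate content? A sketch using requests, with placeholder domain names standing in for your portfolio:

```python
# Sketch: check whether each alias 301-redirects to the canonical host.
# Anything that answers 200 itself is a likely duplicate of the main site.
import requests

CANONICAL_HOST = "https://www.example.com/"
aliases = ["http://example.com/", "http://example.co.uk/", "https://example.net/"]

for alias in aliases:
    resp = requests.get(alias, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code in (301, 308) and location.startswith(CANONICAL_HOST):
        print(f"OK         {alias} -> {location}")
    elif resp.status_code in (302, 307):
        print(f"TEMPORARY  {alias} redirects temporarily to {location}")
    else:
        print(f"DUPLICATE? {alias} returned {resp.status_code}")
```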
  • DeepCrawl reports all ‘Disallowed URLs’ so you can easily see what's already being excluded.
    Test changes to your robots.txt file using the Robots Overwrite functionality and develop an optimised file that increases disallowed URLs and focuses your crawl on primary pages - test the impact.
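For a quick disallowed-URL check outside of any crawler, Python's standard library robots.txt parser can tell you which URLs a given user agent may fetch. A minimal sketch with placeholder URLs; a proposed robots.txt can be tested the same way before release.

```python
# Sketch: check a list of URLs against the live robots.txt using only the
# standard library. A draft robots.txt can be fed in via parse() instead.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

urls = [
    "https://www.example.com/products/blue-widget",
    "https://www.example.com/search?q=widgets",
    "https://www.example.com/basket",
]
for url in urls:
    status = "crawlable" if rp.can_fetch("Googlebot", url) else "disallowed"
    print(f"{status:11} {url}")
```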
  • Check that your website internal linking is working towards an optimised crawl space.
    Use DeepCrawl internal broken links, redirected links, 4xx and 5xx error reports to identify internal links on your site that are broken or redirected and may be affecting your crawl efficiency.
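The underlying check is straightforward to sketch: pull the internal links from a page and report anything broken or redirected. The example below only looks at one start page and assumes requests and beautifulsoup4, with a placeholder start URL; a real crawl obviously covers every page.

```python
# Sketch: list internal links on one page and report broken or redirected ones.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

start = "https://www.example.com/"
host = urlparse(start).netloc
soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")

internal = {urljoin(start, a["href"]) for a in soup.find_all("a", href=True)
            if urlparse(urljoin(start, a["href"])).netloc == host}

for link in sorted(internal):
    status = requests.head(link, allow_redirects=False, timeout=10).status_code
    if status >= 400:
        print(f"BROKEN   {status} {link}")
    elif status in (301, 302, 307, 308):
        print(f"REDIRECT {status} {link}")
```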
  • Identify pages ‘Only in Sitemaps’ and not linked internally, plus all pages linked internally which are not in the Sitemaps. Schedule regular sitemap and comparative website crawls to assess what’s changed; you can monitor how much of the site is changing and map it against performance.
    You might spot some URL formats which are changing frequently and impacting crawl efficiency.
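The sitemap comparison itself is a set difference between the URLs declared in the XML sitemap and the URLs your crawl actually found linked. A minimal Python sketch, with a placeholder sitemap URL and a small stand-in for the crawler's output:

```python
# Sketch: parse the XML sitemap and diff it against internally linked URLs.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

in_sitemap = sitemap_urls("https://www.example.com/sitemap.xml")
crawled_urls = {"https://www.example.com/", "https://www.example.com/products/"}  # from your crawl

print("Only in sitemap (orphaned?):", in_sitemap - crawled_urls)
print("Linked but missing from sitemap:", crawled_urls - in_sitemap)
```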
  • By comparing sitemaps, GA and internal links, a Universal Crawl easily highlights URLs discovered ‘Only in Organic Landing Pages’ and not linked internally.
    You can add ‘Backlink Crawls’ to your DC projects, simply upload a comprehensive backlink profile URL list and let DC crawl and validate the URLs. This crawl also helps identify pages generating traffic but not necessarily linked internally, plus it automatically applies inbound link equity metrics to all DC reports at a URL level – very useful.
  • The correct use and implementation of HrefLang tags is important for effective International SEO, but it can be confusing and even experienced SEOs get it wrong.
  • DeepCrawl helps manage a seamless HrefLang integration by detecting hreflang tags in sitemaps, headers and on-page before showing a matrix of language alternatives for every page so you can see the gaps and inconsistencies in the setup.
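A basic on-page hreflang check can be sketched in a few lines: collect the language alternatives a page declares, then confirm each alternate links back, since hreflang annotations must be reciprocal. This assumes requests and beautifulsoup4 and a placeholder URL, and only covers on-page tags; tools like DeepCrawl also cover sitemaps and headers.

```python
# Sketch: read a page's hreflang alternates and check for missing return tags.
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return {tag["hreflang"]: tag["href"]
            for tag in soup.find_all("link", rel="alternate", hreflang=True, href=True)}

page = "https://www.example.com/en/widgets"
for lang, alt_url in hreflang_map(page).items():
    if page not in hreflang_map(alt_url).values():
        print(f"Missing return tag: {alt_url} ({lang}) does not reference {page}")
```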
