This document discusses using different data sources like backlinks, log files, Google Search Console, and Google Analytics together to understand website performance. It provides examples of how to use backlink data from sources like DeepCrawl to identify positive and negative backlinks. It also discusses using log file data to optimize crawl budget by identifying non-indexable or low traffic pages. The document advocates bringing together internal and external ranking factors with user traffic data to develop a holistic "search universe" view of a website.
Jon Myers' talk on using crawl data with backlinks, log files, GSC and GA
1. Jon Myers
Chief Growth Officer - DeepCrawl
The Search Universe - Links, Log Files,
GSC and everything in between.
@deepcrawl @JonDMyers
2. What we will cover…
Using Crawling Data with:
• Backlinks
• Log File Data
• Backlink and Log File Data
• Where does GSC fit in and add
• What about the Consumer and where GA fits
6. Power of adding backlinks to internal links
7. Using backlinks data with crawl data
Things to consider:
• Backlink authority distribution?
• Which pages are the backlinks landing on?
• Positive/negative backlinks?
• Backlinks to low-DeepRank/deep-level pages?
• Backlinks to orphaned pages?
• Backlinks to broken or non-indexable pages?
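The checks above amount to joining a backlink export onto a crawl export and flagging mismatches. A minimal sketch with pandas, assuming hypothetical CSV-style exports with illustrative column names (`url`, `http_status`, `indexable`, `deeprank` from the crawl; `target_url`, `referring_domains` from the backlink source), not any tool's real schema:

```python
# Sketch: join backlink data onto crawl data and flag backlinks that point
# at broken or non-indexable pages. Column names are illustrative assumptions.
import pandas as pd

crawl = pd.DataFrame({
    "url": ["/a", "/b", "/c"],
    "http_status": [200, 404, 200],
    "indexable": [True, False, False],
    "deeprank": [3.1, 0.4, 1.2],
})
backlinks = pd.DataFrame({
    "target_url": ["/a", "/b", "/b", "/c"],
    "referring_domains": [12, 5, 3, 7],
})

# Aggregate backlink authority per target URL, then join onto the crawl.
agg = backlinks.groupby("target_url", as_index=False)["referring_domains"].sum()
merged = crawl.merge(agg, left_on="url", right_on="target_url", how="left")

# Flag backlinks landing on broken or non-indexable pages: wasted authority.
wasted = merged[(merged["referring_domains"].notna())
                & ((merged["http_status"] >= 400) | (~merged["indexable"]))]
print(wasted[["url", "http_status", "referring_domains"]])
```

The same join answers the distribution question: sorting `merged` by `referring_domains` against `deeprank` shows where external authority lands relative to internal authority.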
14. Power of adding log file data internally
15. Incorporating log file data into a crawl reveals where crawl budget is being wasted
Identify non-indexable pages receiving bot requests:
• Hits to redirect chains (reduce the number of redirects)
• Hits to 404/410 status codes (crawled occasionally)
• Hits to soft 404s or orphaned pages (redirect to another page?)
• Hits to 3xx status codes (investigate why they are still being crawled)
Find out how frequently search engine bots hit your site
For pages receiving low bot requests you could consider:
• Adding more internal links to that page
• Adding a “Last Modified” date to the sitemap
• Making sure no non-indexable pages are in the sitemap
• Splitting the sitemap into smaller ones, including one sitemap for new pages
Understand which bots crawl your site and with which user agents
I love log-files! xx
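The crawl-budget checks above boil down to grouping bot requests in the access log by status class. A minimal sketch, assuming common-log-format lines and a naive substring check for Googlebot (production verification should confirm the bot via reverse DNS, since user agents can be spoofed):

```python
# Sketch: count Googlebot hits per status class from access-log lines.
# 3xx/4xx buckets are candidates for wasted crawl budget.
import re
from collections import Counter

LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2019:13:55:36 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2019:13:55:37 +0000] "GET /gone HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2019:13:55:38 +0000] "GET /page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
pattern = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

status_hits = Counter()
for line in LOG_LINES:
    if "Googlebot" not in line:
        continue  # only count search engine bot requests (naive check)
    m = pattern.search(line)
    if m:
        status_hits[m.group("status")[0] + "xx"] += 1

print(status_hits)
```

Extending the `Counter` key to `(user_agent, status_class)` gives the per-bot, per-user-agent view mentioned above.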
23. Adding your traffic to the equation
1. It is one thing to have internal and external authority in
Google’s eyes
2. You also need to add consumer power
3. Where and what do your customers see and convert on?
4. Adding GA to the mix brings the traffic knowledge you need
24. A Simple URL story with Visits
myurl.com/my-page is Disallowed, yet gets 1000s of Visits!
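Catching the "disallowed but heavily visited" case can be automated. A minimal sketch using the standard library's `urllib.robotparser`, assuming a robots.txt string fetched elsewhere and a hypothetical visits-per-URL dict (e.g. exported from GA):

```python
# Sketch: flag URLs that robots.txt disallows but that still receive
# significant traffic. Domain, rules, and visit counts are illustrative.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """User-agent: *
Disallow: /my-page
"""
visits = {"https://myurl.com/my-page": 2768,
          "https://myurl.com/other-page": 40}

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for url, n in visits.items():
    if not rp.can_fetch("*", url) and n > 1000:
        print(f"Disallowed but heavily visited: {url} ({n} visits)")
```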
25. An advanced story with visits & backlinks
myurl.com/my-page (Disallowed): 2768 Visits, 1000s of Backlinks!
→ 301 redirection → myurl.com/redirect-page
→ 301 redirection → myurl.com/other-page (Noindexed, No Backlinks!)
26. A simple URL story with visits
“Google Drives Awareness but
people drive ROI!”
35. Your search universe from DeepCrawl
Correlation between:
• Backlink Authority
• Internal Link Authority “DeepRank”
• Logfile Googlebot Data
• Performance data from GSC
• Consumer and Traffic Data from GA
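The "search universe" correlation above can be sketched as a single per-URL table combining all five signals. A minimal illustration with pandas, where every column name and value is an invented placeholder standing in for the real exports:

```python
# Sketch: one row per URL combining crawl, backlink, log file, GSC and GA
# metrics, then a pairwise correlation matrix across the five signals.
import pandas as pd

universe = pd.DataFrame({
    "url": ["/a", "/b", "/c", "/d"],
    "backlink_authority": [80, 10, 55, 5],   # external links
    "deeprank": [4.2, 1.1, 3.5, 0.8],        # internal link authority
    "googlebot_hits": [120, 8, 90, 3],       # log file data
    "gsc_clicks": [900, 12, 600, 4],         # GSC performance data
    "ga_sessions": [1500, 30, 1100, 9],      # GA consumer traffic
})

corr = universe.drop(columns="url").corr()
print(corr.round(2))
```

Outliers in this matrix are the interesting pages: high backlink authority with low Googlebot hits, or high sessions on a URL with low DeepRank, are exactly the stories the previous slides walk through.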