If your website has disappeared from Google, or is simply not performing in search engines, learn how to audit it for search engine friendliness through an SEO audit.
For awareness – Where do I buy?
For research – Social media for oil and gas companies
For final product selections – Cost to outsource SEO versus keep in house
Like a checkup for your body, we use an audit to examine the inner workings of the website to see if it's in relatively good health, or if there are items that need to be addressed.
I don't mean to throw any of my colleagues under the bus, but developers aren't marketers. So when they build a website, they typically approach it in a very different way than a marketer would.
They focus on the functionality of the website and the integrity of the code. This can include developing applications and writing complicated JavaScript. In many cases, however, less attention is paid to things such as search engine optimization.
There is change ahead. The little adage we use in the SEO industry is that the only thing that stays the same in SEO is that nothing stays the same. Everything changes, from algorithm updates to the introduction of new products and protocols.
Living on the bleeding edge of these changes really sets a business and website ahead of the pack.
Foundational issues have the same consequences on websites as they do on buildings. If structural issues aren't corrected, they can lead to costly repairs in the future. This is especially true for things like your website's URL structure, its site speed, and its navigation system or site architecture.
If these components are caught early, it can save you many headaches in the future and ultimately help your website rank higher in the search results for a larger number of keywords.
The one notable item that is missing from the list is social media. Social certainly plays a role in Search today, however, it’s not strictly part of an SEO audit.
And I'll tell you why: Matt Cutts, a software engineer at Google and the head of its web spam team, announced that Google does NOT use things like the number of Twitter followers or Facebook fans you have in its algorithm to calculate search rankings.
Google simply returns these profiles in the search results if they are in fact crawlable, but you aren't going to get a ranking boost simply from developing these channels. If you use them as distribution networks, which is what we typically recommend in an SEO strategy, you may see some ranking benefits out of it. And if that sounds like something you'd like to learn more about, maybe Kevin and I can cover that in another presentation.
Keywords – the terms and phrases users will ideally search to find your website
Onsite Optimization – The optimization of code and architecture
Content – The amount, quality and arrangement of content
Links/Authority – The quantity, quality and optimization of hyperlinks
The first item is keyword research and with keyword research typically you are striving to accomplish two things.
The first is to find the keyword variations with the highest number of searches, and the second is to find the keywords with the lowest amount of competition. And this just makes sense: with high-search, high-competition keywords the rewards are huge, but the investment in optimizing for competitive keywords is obviously going to be much greater and ultimately may not be worth your time.
Secondly, a keyword with low searches and low competition may be easy to target, but the traffic could be so low that it isn't worth the investment of creating content and optimizing the website to target it.
The only caveat is if you are in a hyper-specific niche, which is something we see quite frequently in the oil and gas industry; there, you'll likely be targeting keywords that have low search volumes, which isn't always a bad thing. If no one else is targeting "industrial centrifuges", for example, and you're really the only game in town, it doesn't matter that a limited number of people are searching for it.
The ideal blend, however, is keywords with a high amount of traffic and a low amount of competition.
We use a tool called the Google Keyword Planner, which is now offered through Google AdWords, so you must have an account to access it. For starters, what's really great about this tool is that you have multiple filtering options. In this example I searched for the term "pickles", narrowed my geographic location to Calgary, and excluded the word "sweet" so keywords wouldn't display for "sweet pickles", because I'm a dill man myself.
Once we've selected our filtering options, we then search for the terms, as I mentioned, that have a high number of monthly searches with low competition. And as we can see, the search term "pickled beets", in Calgary, averaged 210 monthly searches with a low competition level. That's great and exactly what we want to see.
If we scroll down, we see "pickled egg recipe" had only 50 searches, yet a competition level of medium. That's not as good, because we would need to invest more resources to compete for the keyword, and it would ultimately drive less traffic to our website.
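To make the trade-off concrete, here is a toy sketch of how you might rank keyword candidates by favoring high monthly searches and penalizing competition. The scoring formula and the weights are my own assumptions for illustration; this is not how the Keyword Planner scores anything.

```python
# Toy keyword-opportunity scoring. The competition weights below are
# invented for demonstration; tune them to your own risk tolerance.
COMPETITION_WEIGHT = {"low": 1.0, "medium": 0.5, "high": 0.2}

def opportunity_score(monthly_searches, competition):
    """Higher score = more searches and/or less competition."""
    return monthly_searches * COMPETITION_WEIGHT[competition]

# Figures taken from the Keyword Planner example above.
keywords = [
    ("pickled beets", 210, "low"),
    ("pickled egg recipe", 50, "medium"),
]

ranked = sorted(keywords,
                key=lambda k: opportunity_score(k[1], k[2]),
                reverse=True)
for term, searches, competition in ranked:
    print(term, opportunity_score(searches, competition))
```

Under this scoring, "pickled beets" (210 searches, low competition) comes out well ahead of "pickled egg recipe" (50 searches, medium competition), matching the intuition above.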
Moving on to onsite optimization, and I don't know about you, but I can feel the power. That's because onsite optimization is the one item in SEO that you have full control over! Unlike the other components of SEO we're talking about today, you control your own destiny with onsite SEO.
Onsite SEO is obviously a large part of optimizing a website, so I’ve picked three sections I find to be most important.
For starters we have indexation, which is just a fancy way of saying Google is going to attempt to crawl the website and store the information it finds until someone searches for it. This is all well and good; however, in some cases issues with a website can prevent search engines from doing their job. I like to imagine Google as a robot vacuum cleaner.
The most common issues we see are with robots, duplicate content and the website’s crawl rate.
Robots: The robots exclusion protocol (robots.txt) tells search engines which parts of a website they may or may not crawl.
Duplicate content: Exactly as it sounds, when a website has two versions of the same content it leads to a duplicate content issue. The reason this is bad is that search engines don’t want to have to index multiple variations of the same web page. And in some cases if duplicate content is present throughout a website Google will simply stop crawling it.
Crawl Rates: These are a function of indexation, but if crawl rates are down it can be an indication that there is an issue with the website.
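As a concrete illustration of the robots exclusion protocol, here is a minimal robots.txt file. The domain and paths are hypothetical placeholders:

```
# robots.txt — lives at the root of the domain
User-agent: *                # rules apply to all crawlers
Disallow: /admin/            # hypothetical section we don't want crawled
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

A single stray `Disallow: /` in this file can tell search engines to skip the entire site, which is one of the most common (and most damaging) issues an audit catches.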
Architecture: refers to the structure of the website and how it is organized. A clean architecture that flows PageRank efficiently helps both search engines and visitors find your content.
Navigation: The layout of your navigation system
Speed: how quickly your page loads
Sitemap: a map of your website that can be submitted to Google. It basically leads Google directly to the pages on your website.
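For reference, here is what a minimal XML sitemap looks like, following the sitemaps.org protocol. The URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

You would host this at the site root and submit it to Google so the crawler doesn't have to discover every page on its own.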
Relevance: How relevant is your website to the terms and phrases you’re targeting
When I think of relevance, I think of the old satellite dishes from the '90s.
Meta data: Title tags and description tags, which are items Google uses to evaluate the context of a page
URL: The address of a web page – better to have keywords than parameters
Structured Data: Schema.org sends relevance signals to search engines to differentiate between similar pieces of content
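The three relevance signals above all live in a page's HTML. Here is a sketch of what they look like together in a page's head; the business name, URL, and wording are hypothetical examples, not a recommended template:

```html
<head>
  <!-- Title and description tags: the snippet searchers see in results -->
  <title>Industrial Centrifuges | Example Oilfield Services</title>
  <meta name="description"
        content="Industrial centrifuges for the oil and gas industry, serving Calgary and area.">

  <!-- Keyword-rich URL declared as the canonical version of this page -->
  <link rel="canonical" href="https://www.example.com/industrial-centrifuges/">

  <!-- Structured data (schema.org, JSON-LD) describing the organization -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Oilfield Services",
    "url": "https://www.example.com/"
  }
  </script>
</head>
```

Notice the canonical link also doubles as a duplicate-content fix: it tells Google which of several similar URLs is the one to index.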
Content today is… kind of a big deal! Not only does it work to position you or your organization as an expert in your niche, it also attracts additional traffic to your website.
As we all know, content can take many forms such as:………… But the big question is what type of content are we creating, and how can it be optimized for search engines?
So, there are a few items to consider. For starters…
Microformats: how does it look in the search results?
Shareable: are people going to pass it on, and have we created the circumstances to make it easy for them to do so?
Structure: are the titles and content optimized using SEO best practices?
Length: does it meet minimum content requirements?
Location: is it published in a location where it will have a positive impact?
Link worthy: would someone be willing to link to the content?
Matt Cutts is the Wayne Gretzky of the SEO industry. And here’s what he had to say about links.
If the concept of links is new to you, this will hopefully clarify a few things. This paragraph is from a patent Google applied for in 2004, and it reads…
Ranking in Google is like a high school popularity contest.
If you were to calculate the first line together, you'd get your high-school coolness factor.
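The popularity-contest idea can be sketched in a few lines of code. This is a minimal PageRank-style power iteration over an invented three-page link graph; it is a teaching toy, not Google's actual algorithm, and the 0.85 damping factor is just the commonly cited textbook value:

```python
# Minimal PageRank sketch: a page's score depends on the scores of the
# pages linking to it, just like popularity in high school.
DAMPING = 0.85

graph = {            # page -> pages it links out to (invented example)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

# Start every page with an equal share of rank, then iterate.
ranks = {page: 1.0 / len(graph) for page in graph}
for _ in range(50):
    new = {}
    for page in graph:
        # Each linking page passes along its rank, split across its links.
        incoming = sum(ranks[p] / len(graph[p])
                       for p in graph if page in graph[p])
        new[page] = (1 - DAMPING) / len(graph) + DAMPING * incoming
    ranks = new

print(ranks)  # C ends up with the most rank: it has the most inbound links
```

Page C is linked by both A and B, so it accumulates the highest score; that is the "coolness factor" in miniature, and it is why link building matters.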
To take things a step further, we also analyze other components of links.
Schema.org markup will send these signals back to Google.
Google Webmaster Tools is your two-way radio to Google: it sends information to you, and you can also send information to it. Configure Google Webmaster Tools and familiarize yourself with its functionality.