In any organization, the basics are often overlooked. It’s much more fun to work on exciting projects, such as launching a new website, than to fix systemic issues, such as making sure your data is reported accurately.
During the last six months, I have worked on five SEO audits, and each organization has had the same problem: the basics of SEO are overlooked (and ignored). I approach each audit the same way, by collecting the data:
- Run the site through Screaming Frog SEO Spider
- Review crawl errors in Google Webmaster Tools
- Run the site through SEOmoz
Each tool provides valuable data. In Webmaster Tools, a high number of crawl errors is usually found.
Instead of assigning a team to address these problems, resources are used for link-building, content creation and web design. These are all important factors in SEO, but if Google is specifically telling you it has problems crawling your site, why is there so little focus on fixing that? If Google has problems finding your pages, then chances are your potential customers do too.
Of the three site audits I have performed during the last two months, only one site has been quick to take action and address the issues. The audits usually have the same findings: duplicate content issues, URL parameters and session IDs, crawl errors, and so on. Each site audit is filled with recommendations and suggested next steps.
The following results are taken from each site’s Google Analytics organic traffic report for July 1st to July 30th. Can you guess which site implemented the findings from the SEO audit?
- Site 1: organic traffic decreased 13% compared with June 2012
- Site 2: organic traffic increased 10% compared with June 2012
- Site 3: organic traffic increased 350% compared with June 2012
Congrats! You guessed it: it’s Site 3.
Using Annotations, I made a note in Google Analytics for each improvement. Here is the list of improvements by date (a sketch of the redirect rules is shown after the list):
- Jul 10, 2012: Redirect implemented from non-www to www
- Jul 23, 2012: URL rewrite; improved page titles and meta descriptions
- Jul 24, 2012: New page titles added to the home and destination pages
- Jul 24, 2012: /index redirected to /
- Jul 25, 2012: New navigation menu launched
- Jul 26, 2012: Google Analytics goals implemented
- Jul 27, 2012: Improved internal anchor text
- Jul 27, 2012: Google Analytics tracking script moved to the top of the page
- Jul 27, 2012: XML sitemap submitted
- Jul 27, 2012: Crawl errors fixed; reduced from 155 issues to 6
- Jul 30, 2012: New meta descriptions added for top destination pages
- Jul 30, 2012: HTTPS pages now redirected to HTTP
- Jul 30, 2012: Google Analytics tracking script updated (only one per page)
- Jul 30, 2012: New XML sitemap submitted for the HTTP URLs
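The post doesn’t show how the three redirects above were implemented, but on an Apache server they could be handled with a few mod_rewrite rules. Here is a minimal sketch, assuming Apache with mod_rewrite enabled and using example.com as a placeholder for the actual domain:

```apache
# Minimal .htaccess sketch; assumes Apache with mod_rewrite enabled.
# example.com is a placeholder, not the audited site's real domain.
RewriteEngine On

# Jul 10: redirect non-www to www
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Jul 24: redirect /index to /
RewriteRule ^index$ http://www.example.com/ [R=301,L]

# Jul 30: redirect HTTPS pages to HTTP (the 2012 fix described above;
# today the redirect would normally go the other way)
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 (permanent) status matters here: it tells Google to consolidate the duplicate URLs onto a single canonical version, which is exactly what resolves the duplicate content issues the audit flagged.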
In less than a month, organic traffic has increased by 350% (based on weekly traffic) and organic leads now account for 50% of total conversions, up from 3%. No link-building was involved and no content creation strategies were implemented; we just fixed the basics that are reported in Webmaster Tools.
Has your organic traffic increased since fixing the crawl errors reported in Webmaster Tools? Feel free to comment below.
This is a great article! We’re working on SEO for all our sites, and duplicate content seems to be one of our biggest problems. How do you solve the duplicate content problem between a UK and a .com site (both English, both the same content) and a German and a Swiss site (one is in German and the other in Swiss German, but basically the same content)? Does the UK site have to redirect to the .com site or vice versa, for example?
Thanks for the comment, Jennifer.
This is a great question that a lot of businesses have issues with. At the end of 2011, Google introduced new markup that businesses can implement on their sites, referred to as rel="alternate" hreflang="x". It is designed for the following scenarios:
1. Multiregional websites using substantially the same content. Example: English webpages for Australia, Canada and USA, differing only in price
2. Multiregional websites using fully translated content, or substantially different monolingual content targeting different regions. Example: a product webpage in German, English and French
The rel="alternate" hreflang tag notifies Google that you have duplicate content, but that each version is targeted at a specific country or region.
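In practice, the annotations go in the head of every version of the page, and each version lists all of its alternates, including itself. Here is a sketch based on your scenario; the domains are placeholders, not your actual sites:

```html
<!-- Placed in the <head> of every version of the page.
     Placeholder domains; hreflang values: en for the default English
     site, en-gb for UK English, de-de for German in Germany, and
     de-ch for German as used in Switzerland. -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="de-de" href="http://www.example.de/" />
<link rel="alternate" hreflang="de-ch" href="http://www.example.ch/" />
```

So no, neither site has to redirect to the other: both stay live, and the markup tells Google which version to serve to which audience.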
More information can be found here: http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
Since the tag launched, there has been confusion about how to implement it correctly. In June 2012, Google added support for including it within XML sitemaps, and the following two articles provide great insight into how to implement it successfully:
http://searchenginewatch.com/article/2200741/International-SEO-Using-XML-Sitemaps-hreflang-for-Geotargeting-Beware-of-the-Kinks
http://searchengineland.com/how-to-implement-the-hreflang-element-using-xml-sitemaps-123030
Unfortunately, I have yet to implement this myself so I cannot be 100% sure of the correct approach, but it appears the XML sitemap approach is preferred by SEOs and webmasters. Here is an example of how the XML sitemap code would look: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865&topic=2370587&ctx=topic
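For reference, a sitemap following Google’s documentation above would look roughly like the sketch below, again with placeholder domains:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- One <url> entry per page version; every entry repeats the full
       set of alternates, including a reference to itself. -->
  <url>
    <loc>http://www.example.com/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <xhtml:link rel="alternate" hreflang="de-de" href="http://www.example.de/" />
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.ch/" />
  </url>
  <url>
    <loc>http://www.example.co.uk/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <xhtml:link rel="alternate" hreflang="de-de" href="http://www.example.de/" />
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.ch/" />
  </url>
  <!-- ...repeated in the same way for the .de and .ch URLs... -->
</urlset>
```

Every URL repeats the full set of alternates, which is why the sitemap approach gets verbose, but it keeps the markup out of your page templates entirely.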
Please let me know if you need further support.
Thanks for your reply and for starting us off in the right direction.