The 1 hour Technical SEO Audit for

A while back I wrote a post about finding your site’s biggest technical (SEO) flaws in 60 minutes, and I had Evan contact me about doing a similar type of audit for this website. Evan, you’re a brave man, and I promise to be as ruthless as possible ;)

The point of this post (audit) is to show you what a professional SEO can do for you in less than an hour to help you improve your site. Since I’ve only given myself 60 minutes to complete this, I’ll likely miss out on some things, but I’m going to concentrate on the big issues such as:

  1. Algorithm readiness – specifically Panda-related issues (duplicate content, freshness, page layout)
  2. Crawlability / Architecture & On page issues (markup, keyword targeting, JS etc..)
  3. Identifying opportunities for Evan (new markets, keywords to target)

Here are the tools I’m going to use to rip through this:

  1. Microsoft IIS SEO Crawler – Specific for types of redirection, markup errors, meta data
  2. Screaming Frog SEO Spider – All meta data, response codes, and more
  3. Seomoz toolbar – Nofollow links
  4. Searchmetrics Essentials – Check for traffic / ranking anomalies in a flash
  5. Spyonweb – Anything owned that might have dupe content?
  6. – Understand what server / technology is powering the site
  7. Pagerank toolbar for Chrome – Mainly for spotting architecture problems
  8. Chrome (regular), Firefox & Opera (JS disabled) – The browser line up
  9. HTTPfox – Firefox plug in to listen to HTTP requests
  10. Compare text files online  and cloaking checker – Check for bot / user cloaking
  11. Lots of Google search queries and looking through source code

The 1 hour SEO audit starts now…

Without analytics data, I need Searchmetrics to give me a picture of the site’s status (search visibility, problems, keywords ranking).  Right away I can see a steep drop around late February of 2011, which correlates very closely with the initial Google Panda update.

So I could be wrong and it could be something completely different (but I’m probably not wrong ;). This drop has my attention and I’ll be paying close attention to thin content, low value pages and duplicate content.

Potential Panda threats


This is an entire subdomain devoted to duplicating content based on article tagging. I’m about as happy about tag related pages as I am about sitting next to someone who smells on the bus.

Why is this a problem: This is a breeding ground for low quality pages that are essentially duplicate content. Duplicate content is a big no-no for SEO because it confuses ranking ability between pages and, in worst case scenarios, can harm the ability of all pages with similar content to rank well – if at all. Also, Google wanted to weed out low value, duplicate content pages in its series of Panda updates – this subdomain could be one of the victims.

How did I find it: By using this Google operator search query

How much of a problem is it: We’ll call it ~50k pages worth of duplicate content on a site of ~150k pages – so, pretty bad.

How does it happen: This page exists because it’s been tagged in a category called “growth” (maybe automatically by the CMS or an author). If you copy and paste a snippet of text from the page into Google, you should see the page ranking 1st if it’s unique – except this page isn’t ranking at all for the snippet of text!

What to do: Cull them all. With poor page titles such as “growth” and reeking of duplicate content, the best thing to do here is to 301 redirect all of these pages back to the root domain (brownie points if you can 301 each tag page to a related deep page on the main domain).
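A minimal sketch of that blanket redirect for Apache’s .htaccess, assuming mod_rewrite is available – “tags.example.com” and “www.example.com” are placeholders for the real subdomain and domain:

```apache
# Hypothetical sketch: 301 every page on the tags subdomain back to the
# main site's root. Swap the placeholder hostnames for the real ones.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^tags\.example\.com$ [NC]
RewriteRule ^ http://www.example.com/ [R=301,L]
```

Mapping each tag page to a related deep page instead would just mean one RewriteRule per tag (or a RewriteMap) – that’s where the brownie points come in.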

Potential Page Layout Algorithm Update


The forum content on this site is actually pretty darn good. Evan and co. post some great topics, but you might not think it’s that valuable based on first impressions. Google released a page layout algorithm update this year that targets pages with excessive ad content above the fold. Every forum page has a banner ad and a Google AdSense block before the actual content (example: ), which makes it a perfect victim for the update or future roll-outs. Also, in the example forum page I provided, you’ll notice &start=15 – remember that, I’ll explain below.

How did I find it: By using this Google operator search query and I found the &start=15 parameter by going to page 2 of a forum post and then used this Google operator search query.

How much of a problem is it: From what I can see, it doesn’t seem like the forum has been affected (please note, I don’t have access to Evan’s analytics and I could be dead wrong). However, I do know that the &start= parameter (for pagination) is causing around 3k duplicate pages.

How does it happen: (only explaining the start= parameter) This is basic pagination: when you have too many items / products / posts to consume on one page, you paginate. How you structure the URLs for pages 2, 3, 4 doesn’t really matter – it affects the site in exactly the same way.

What to do: 

For the page layout: Move the AdSense to the sidebar, or make it less prominent on the page – I would keep the banner ad at the top.

For the pagination parameter: In this case, we want Google to index all of the forum posts and give the forum post links pagerank – but not index the paginated URLs. This is a perfect case for using the robots directive NOINDEX,FOLLOW, which means “Hey Google, I want you to see what’s on pages 2, 3, 4, but I don’t need you wasting your time adding these pages to your index. Also, the first page is the only result I want users to see.” OK, that’s a simplification – hopefully you get the picture.
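Concretely, the directive is just one meta tag in the head of each paginated URL (a sketch – the exact placement depends on the forum software’s templates):

```html
<!-- On page 2, 3, 4... of a thread: let Googlebot crawl the page and follow
     the links on it, but keep the paginated page itself out of the index. -->
<meta name="robots" content="noindex,follow">
```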

Alternatively, you could use rel=canonical to point every paginated version back to the first page. Technically, it works – but it really shouldn’t, since the content on the paginated versions isn’t entirely duplicate. The rel=canonical tag transfers pagerank (page/link authority), whereas NOINDEX doesn’t. This is a bit devious, and that’s why it’s my second choice in this case – so use it at your own peril to game the engines!
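For completeness, that second option would look like this on each paginated URL – the thread URL here is a made-up placeholder:

```html
<!-- On a hypothetical /forum/viewtopic.php?t=123&start=15, pointing back
     to page 1 of the same thread: -->
<link rel="canonical" href="http://www.example.com/forum/viewtopic.php?t=123">
```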

Crawlability / Architecture & On page Issues

This is the bulk of this post, and I’ll keep it as brief as possible to fly through it.

Problem 1: A full copy of the website (www and non-www versions exist)

One of the basic checks of a seasoned SEO is to see whether both the www & non-www versions of the site exist. Basically, they are different sites that host the same content – and that’s just how the internet was made :) Although Google does a good job of trying to canonicalize to the preferred version you set in Webmaster Tools, it’s still splitting the domain’s link authority. People don’t always link to you with your preferred version, and since they are separate sites, you won’t get all of the “link juice”.

How did I find it: By using this Google operator search query

How much of a problem is it: It’s a biggie: I think this site is missing out on around 65 linking root domains and around 600 links.

How does it happen: When you buy a shiny new domain, both the www and non-www versions resolve by default.

What to do: Write a redirect rule (in this case in the .htaccess file for Apache) to 301 redirect all non-www pages to their www counterparts.
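A minimal sketch of that rule, assuming Apache with mod_rewrite enabled and example.com standing in for the real domain:

```apache
# 301 redirect every non-www request to its www counterpart,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```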

Problem 2: Unnecessary Redirection

Our principal example is: and the redirecting link is “#2: Oprah Winfrey, Harpo”

But here’s some more:


How did I find it: Screaming Frog, Internal links report & IIS Crawler. Confirmed it by using HTTPfox.

How much of a problem is it: A straight link to a page passes 100% of its link juice, but a 301 redirect won’t pass the full value. It’s hard to control on external sites because we can’t always control how someone links to us; however, it shouldn’t happen on our own sites.

How does it happen: You create a new page and 301 redirect the old page to the new one to conserve link authority – except you forget to update the internal links that still point to the old pages.

What to do: Change the links to the right targets. For example, the link to Oprah’s profile ( on this page ( should go directly to

Problem 3: More duplication, this time on the WordPress-powered blog.

This is the same type of problem I discussed above as a Panda threat, except this time it’s a stock problem with the WordPress CMS. Category, Author and Tag pages are duplicate content pages that can serve a user well in terms of navigation – but they become a problem when they compete with the actual blog posts.

How did I find it: Screaming Frog

How much of a problem is it: Right now there are ~2k pages with duplicate content – and the bigger this blog gets, the worse the problem will become.

How does it happen: It’s just a stock WordPress feature.

What to do: The easiest way to fix it is to apply the NOINDEX, FOLLOW directive again on all author, category and tag pages. To do this super easily I recommend Yoast’s SEO plugin – it’s so good it could put me out of a job, trust me on this one.

Alternatively, you can use the rel=canonical tag to point back to a main article. It works to transfer page/link authority, but it doesn’t reduce wasted Googlebot crawl, and it’s not the correct use of the tag (because the pages aren’t entirely duplicates – only sections are). Use at your own peril to game the engines!

Problem 4: Duplicate & inefficient page titles

Duplicate examples:

  •  title: What is CRM?
  •  title: What is CRM?
  •  title: Managing Change 
  • title: Managing Change 

Poor page title keyword targeting

  • title: Cars
  • title: Be

Page titles are crucial for SEO and need to be as descriptive as possible, as well as completely unique. There is no compromise here; we just have to do it to help Google match content to queries.

How did I find it: Screaming Frog

How much of a problem is it: These pages aren’t being seen, either because they have duplicate page titles and Google is confused about which page to return, or because the titles aren’t relevant matches for the queries – for example: “Be”.

How does it happen: I *think* this is due to user-generated content.

What to do: Evaluate each article, do some keyword research and title each page according to its content. As a general rule, each page title on your site needs to be absolutely unique. You can read more about cannibalization here.
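If you’d rather not eyeball a Screaming Frog export, a few lines of Python can flag duplicate titles from any URL-to-HTML crawl dump – a sketch with made-up URLs, so plug in your own crawl data:

```python
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by their <title> text; return only titles used more than once.

    `pages` maps URL -> raw HTML (e.g. collected by your crawler)."""
    titles = defaultdict(list)
    for url, html in pages.items():
        match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
        title = match.group(1).strip() if match else "(missing title)"
        titles[title].append(url)
    return {t: urls for t, urls in titles.items() if len(urls) > 1}

# Hypothetical example pages mirroring the duplicates above
pages = {
    "/articles/what-is-crm": "<html><title>What is CRM?</title></html>",
    "/forum/what-is-crm":    "<html><title>What is CRM?</title></html>",
    "/articles/cars":        "<html><title>Cars</title></html>",
}
print(find_duplicate_titles(pages))
# -> {'What is CRM?': ['/articles/what-is-crm', '/forum/what-is-crm']}
```

Anything this returns is a cannibalization candidate: two or more URLs fighting over the same title.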


Identifying opportunities

Bearing in mind this was a quick check – I looked at the top keywords surrounding “entrepreneur” and, sadly, didn’t see the site ranking on the first page for the top volume terms. So I’d like to see this site take a piece of the ~50k exact match searches per month (US) around these terms:

  • entrepreneurship
  • entrepreneurs
  • famous entrepreneurs
  • entrepreneur ideas
  • successful entrepreneurs
  • what is entrepreneurship
  • what is an entrepreneur
  • characteristics of an entrepreneur
  • young entrepreneur
  • how to become an entrepreneur
  • entrepreneurial skills
  • how to be an entrepreneur
  • business entrepreneur
  • entrepreneur forum
  • online entrepreneur
  • entrepreneur advice

Also, I’d like to point out that there’s a new kid on the block in this field that’s gaining momentum – how about we create some pages targeting “startups” ?


Things I should tell you

Overall, it’s a pretty good site – but there is opportunity. I’ve only included a few points in this post, but rest assured I performed well over 50 different checks in an hour. In the interest of keeping the post a reasonable length and making sure you’re still awake, I kept it brief.

A special thank you to Evan for volunteering his site for the audit – it’s been fun!

So what did you think, was it useful? Do you have any questions?

Feel free to leave them in the comments below, and feel free to send me a tweet every now and again ;)

Image credits:

Baby Panda


About the Author

I #Believe in entrepreneurs.

50 Responses to “The 1 hour Technical SEO Audit for”

  1. Awesome post David and thank you for the detailed analysis! I’ll respond with a more detailed comment but just wanted to start things off by saying thank you!

  2. AWEB-DESIGN says:

    You’re Smart! I have to look into this panda threat further

  3. AWEB-DESIGN says:

    Interesting Tactics!


  5. Modesto says:

    Hey David,

    Nice write up and great to see your thought process.

    Searchmetrics Essentials is a really handy service, I used to rely on SemRush but it seems it’s the right time for a switchover. Even the free version seems to provide some really useful insights.

    1. Hey, nice to see you here! I’ve only just started using Searchmetrics and I absolutely love it – but I think SemRush is still a solid tool as well. Give Searchmetrics Essentials a try, I’m sure you’ll love it too.

  6. John Doherty says:

    My coworker @dsottimano did a 60 minute site audit on a site…and was allowed to guest post it on that site. Word!

  7. Here are my more detailed comments – thanks again David!


    This used to be on the main domain and, as a result of Panda, we moved it to a subdomain because it was a different type of content.


    This was also moved to a subdomain after Panda because it was different content. Point noted on the ads – they only show to non-registered visitors, but it’s a great point. The forum’s traffic wasn’t hit by the page layout algorithm update, but that doesn’t mean it won’t be a target later on.

    I’m a little confused by the noindex of pages 2,3,4, etc of long posts. For example: (Page 2)

    has different content than (Page 1)

    Why wouldn’t I want both pages indexed since the content is different?

    Again, forum traffic has not been impacted, but I’m still open to following best practices here.

    3) www vs. non www

    Great point – I’ll have my server guy take a look!

    4) Unnecessary Redirection

    We created subdomains for 4 of our folders (in this case 514, 614, 592, and one other) to see if moving to subdomains would help or not, so we set up 301s to test it. So far the results haven’t really shown a big difference. We’ll likely revert back to non-subdomains for these folders if we don’t see an improvement.

    5) WordPress duplication

    We’ll check it out!

    6) Duplicate & inefficient page titles

    Duplicate – yes, this is due to user generated content – we can put the user name in the title tag though to remove the duplication.

    Inefficient – this is much harder because our users create the content, but we can suggest that authors with article titles of under 5 words add a more descriptive title.

    7) Opportunities

    Great ideas. In general we haven’t really optimized around specific keywords. A lot of the reason is we have user generated content so we can only guide and suggest – and remove content that doesn’t meet our standards. We do write all the famous entrepreneur profiles ourselves and tend to rank well for those articles. It could be an area of opportunity to create more content ourselves around entrepreneur related keywords though.

    1. No problem. I’m going to provide good honest feedback here and feel free to come back at me :)


      It makes sense that you’ve had to move to because of Panda. You might disagree with me here, but I don’t see why these pages even need to exist – if they exist, they become a real threat because they are essentially riddled with duplicate content. In my opinion, there isn’t one page on here that deserves to rank for its page title.

      If you can spend some time properly optimizing these pages (example: ) with a proper title (example: “Achieve your entrepreneurial goals”) and manage to create compelling unique content other than just listing articles – then I could see the value in them. Right now they even lack a basic CTA – something soft like a newsletter sign-up would make them much more valuable to you.


      Was it moved to a subdomain because it was a possible threat or because it had been hit? Would it make sense to show registered users some highly targeted ads as well?

      Using the NOINDEX directive won’t stop Google from crawling, nor will it stop Google assigning pagerank to the links it finds on a NOINDEX page. The directive only means that the page shouldn’t appear in search results – in this case, because page 1 should be the starting point.

      Take this example, for instance: the query was “How To Post A Forum Topic That Gets Responses” and it did in fact return a relevant forum page from your site. However, if you click on it, you’ll notice that it brings you to page 2:

      However, the conversation starts on page 1 – and for users, you’d want to see a forum thread from the beginning to follow the entire conversation.

      If we were to use the NOINDEX,FOLLOW tag on the paginated versions, we could ensure that the top level page was the only one ranking. Not only that, but we wouldn’t confuse Google by trying to rank different pages with the exact same page titles – in this case “Entrepreneur Forums • View topic – How To Post A Forum Topic That Gets Responses”

      3) You should see some nice uplift from that – just make sure the redirection is a 301 :)

      4) You can keep the subdomains if you wish (though subdomains automatically have less authority than subfolders); all you need to do is correct the internal links so they point to their final destination rather than hopping through a redirect.

      5) Cool.

      6) Create a policy that forces guest authors to create unique titles. All they need to do is perform this query: MY TITLE – if a matching result is returned, they need to create a new title. I’m a big believer that if you’re nice enough to allow guest posting, the guest author should at least meet minimum editorial and SEO standards, as it benefits them as well.

      7) Looking forward to seeing it!

  8. Awesome 60min site audit by @dsottimano RT @EvanCarmichael: The 1 hour Technical SEO Audit for


  10. Jason Acidre says:

    Solid and actionable pointers from distilled's @dsottimano on technical SEO auditing


  12. James Carson says:

    Probably the best post on auditing / Panda I've seen – by @dsottimano real life case study

  13. Jilan Wagdy says:

    RT @jasonacidre technical #SEO audit
    #google #google panda #googleanalytics
    #سيو #جوجل



  16. Julie Cheung says:

    RT @mrjamescarson: Probably the best post on auditing / Panda I've seen – by @dsottimano real life case study


  18. @dsottimano hi Dave, reading your post... a 301 doesn’t pass 100% link juice – what percentage would you say it does pass?

  19. joemarD says:

    Very impressive and interesting post!

  20. Isha says:

    Those are great tactics.

  21. The 1 hour Technical SEO Audit for via @EvanCarmichael by @dsottimano {Highly technical, great!}

  22. @Searchmetrics seriously, thank you for making my life infinitely easier. You saved me tons of time here:




  26. Mikael Rieck says:

    Hi David,

    First of all, thank you for this post. It gives a great view into the methods used by professional SEOs, with great points that we can check ourselves.

    Would you care to share what the price would be for someone wanting to hire you for a similar review (including the report)?



