SEO Client Story
I used to work for an SEO agency in Pittsburgh, where I dealt with interesting clients across a variety of industries, running both large and small sites. I encountered a number of funny incidents along the way, which I’d like to recount here, although names will be withheld.
No Google Traffic
After taking on this client I gained access to their WebTrends reports, which showed an astounding lack of Google organic traffic. I looked over the meta tags and page content, and everything seemed to be targeting the right set of keywords to some degree, although the on-page optimization could still use some improvement.
I knew they weren’t doing anything advanced like IP delivery, so I used Firefox with the User Agent Switcher extension and confirmed that with my user agent set to Googlebot, Slurp, or MSNBot I could browse the site without any problems. After checking the robots.txt, however, I found that Googlebot had been disallowed! When I asked the client’s developer why they had decided to ban Googlebot, their response was: “It was crawling the site too often, and there were errors on some of the pages that were leading to open database connections and locking up the server.”
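The offending rule would have looked something like the two-line robots.txt below. As a minimal sketch (the example URL path is invented for illustration), Python’s standard-library robots.txt parser shows the effect: Googlebot is shut out of the entire site while other crawlers are untouched, which is exactly why the search engines the client cared about most never indexed a page.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt along the lines of what the client's developer had deployed:
# a blanket Disallow for Googlebot only.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is banned from every URL on the site...
print(parser.can_fetch("Googlebot", "/products/widget.html"))  # False

# ...while other crawlers, such as Yahoo's Slurp, match no rule and may
# crawl freely.
print(parser.can_fetch("Slurp", "/products/widget.html"))  # True
```

Note that a well-behaved crawler obeys robots.txt regardless of whether the pages themselves are reachable, which is why browsing the site with a spoofed Googlebot user agent worked fine and never revealed the block.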
Needless to say, the developers got a quick lesson in why banning Googlebot to mask programming errors is not good business practice.
Want to hear more stories? Do you have any of your own you’d like to share?