Matt Cutts and Vanessa Fox answer questions on the Site Review Panel at Pubcon in Las Vegas.
Today, during the Interactive Site Review Session, Google’s head of web spam, Matt Cutts, along with Vanessa Fox of NinebyBlue and Derrick Wheeler of Microsoft, took deep dives into a number of sites.
A few points in particular stood out and are worthy of coverage:
- Blocking Internet Archive may be a Negative Signal
Matt Cutts noted that spammers very frequently block archive.org from crawling and storing their pages, while few reputable sites do so — making it a potential spam signal to search engines. SEO Theory has a good writeup on when and why there may be legitimate reasons to do this, but webmasters seeking to avoid scrutiny may want to take heed.
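For reference, blocking the Internet Archive is typically done with a robots.txt directive targeting `ia_archiver`, the Archive’s crawler user-agent:

```
User-agent: ia_archiver
Disallow: /
```

If your site has a legitimate reason to use this (e.g. licensing restrictions), it’s worth weighing that need against the scrutiny it may invite.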
- Web Page Load Time can Positively Influence Rankings
Maile Ohye actually mentioned this at SMX East in New York, but Matt Cutts repeated it again today. In a nutshell – while slow page load times won’t negatively impact your rankings, fast load times may have a positive effect. This comes on a day when the Google Chrome blog introduced their new SPDY research project. I’m particularly happy about this news, because fast load times also have a positive second-order effect on SEO. Pingomatic recently published some excellent research on load times from Akamai noting that users’ expectations for faster web browsing have doubled in the past two years. In addition, fast-loading pages are, in my opinion, considerably more likely to earn links, retweets and other forms of sharing than their slow-loading peers. This tool from Pingdom is a great place to start testing your own site.
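Beyond third-party tools like Pingdom, you can get a rough baseline yourself. Here’s a minimal sketch (the function name and URL are my own, not from any panel) that times a single fetch of a page using only the Python standard library — note this measures raw HTML download time, not full rendering with images and scripts:

```python
import time
import urllib.request

def measure_load_time(url, timeout=10):
    """Fetch a URL and return (elapsed_seconds, bytes_downloaded).

    Measures only the HTML response, not subresources or rendering.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)

# Hypothetical usage:
# elapsed, size = measure_load_time("https://www.example.com/")
# print(f"Downloaded {size} bytes in {elapsed:.2f}s")
```

Running this a few times at different hours gives a crude but useful picture of how your server responds under varying load.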
- It May be Easier to Walk Away from Banned Domains
Sites that Google’s webspam team has severely penalized or banned entirely from the index can be very difficult to re-include, and thus, Matt suggested that “walking away” and “starting over” may be a more prudent strategy. In my opinion, this is largely due to link profile issues – if your site has a “spammy” link profile, it’s tough to ask an engineer to separate the wheat from the chaff manually (or algorithmically) and discount only the bad links. Thus, re-consideration requests may not be as effective a use of time as registering a new site and trying to re-build a more trusted presence.
- Repetition of Keywords in Internal Anchor Text (particularly in footers) is Troubling
During a specific site’s review, Matt noted that keyword usage in the anchor text of many internal links, particularly in the footer of a website, is seen as potentially manipulative. Yahoo!’s search engineers have noted this in the past and we at SEOmoz have seen specific cases where removal of keyword-stuffed internal links from a footer had immediate impacts on Google rankings (removing what appeared to be large negative ranking penalties sitewide).
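If you want to audit your own templates for this pattern, a simple sketch like the following can surface heavily repeated anchor text on a page. The class and function names are mine, and the threshold is an arbitrary illustration — this only counts repetition; it doesn’t judge intent:

```python
from collections import Counter
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collect the visible text of every <a> element on a page."""
    def __init__(self):
        super().__init__()
        self._in_anchor = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_anchor = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_anchor = False

    def handle_data(self, data):
        if self._in_anchor:
            self.anchors[-1] += data.strip()

def repeated_anchor_texts(html, threshold=3):
    """Return anchor texts appearing at least `threshold` times."""
    parser = AnchorTextCollector()
    parser.feed(html)
    counts = Counter(text.lower() for text in parser.anchors if text)
    return {text: n for text, n in counts.items() if n >= threshold}
```

Feeding this your footer markup (or the whole page) and scanning for high counts of the same commercial keyword phrase is a quick first pass before a manual review.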
- Having Multiple Sites Targeting Subsections of the Same Niche can be Indicative of Spam
Matt Cutts today mentioned that “having multiple sites for different areas of the same industry can be a red flag to Google.” Though Googlers have mentioned this before, today’s site review panel brought renewed attention to Google’s ability, and proclivity, to carefully consider not only an individual site, but all the other sites owned by that registrant/entity/person. Given Google’s tremendous amount of data on web usage behavior, many SEOs suspect that its tracking goes beyond domain registration records alone.
To read more, check out SERoundtable.