Q: In your book you recommended using a spider emulator to see what a webpage looks like to search engine robots. What tool do you use when you’re doing this?
A: These days, we use three different methods for getting the "spider's-eye view" of a site. Here they are, in order from simplest to most complicated:
* Rex Swain's HTTP Viewer at http://www.rexswain.com/httpview.html. Type in any URL and see what a typical client (browser or spider) will see. This is a great tool for a quick and easy look at a page.
* Fetch as Googlebot. This only works for sites for which you have a Google Webmaster Tools account (so it can't be used for competitive review, only for your own site). Sign up for Google Webmaster Tools by following the instructions here: https://www.google.com/webmasters/tools. Once you have your website verified, click Crawl > Fetch as Google and enter the URL of the page that you want to see. This method has the advantage that it is an accurate rendition of what Google sees, according to Google itself.
* Microsoft's IIS SEO Toolkit. This is useful when you want to crawl an entire site, and it is what we use for our site technical audits. It is a free crawler, though it requires a Windows PC to run: http://www.iis.net/downloads/microsoft/search-engine-optimization-toolkit
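If you're comfortable with a little scripting, you can also grab the raw HTML a spider receives yourself. The sketch below (not from the book, just an illustration) uses only the Python standard library; the Googlebot user-agent string is one commonly published example, and real crawler headers vary.

```python
# Sketch: fetch a page roughly the way a search engine spider would,
# by sending a spider-style User-Agent and reading the raw HTML.
import urllib.request

# Illustrative user-agent string; actual crawler headers differ.
SPIDER_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def build_spider_request(url, user_agent=SPIDER_UA):
    """Build a request carrying a spider-style User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

def fetch_as_spider(url, user_agent=SPIDER_UA):
    """Return the raw response body a crawler with this user-agent would get."""
    req = build_spider_request(url, user_agent)
    with urllib.request.urlopen(req, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset, errors="replace")

if __name__ == "__main__":
    # Print the first few hundred characters of the raw markup.
    print(fetch_as_spider("http://www.example.com")[:300])
```

Note that this shows you the raw markup before any JavaScript runs, which is exactly the point: it is closer to what a simple crawler sees than what your browser renders.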