
Dynamic Website SEO Terror Level Downgraded to Yellow

  • June 2, 2009
  • by Jennifer Grappone

Dynamic content used to be a red flag for search-engine-friendly design, but times have changed. Search engines now include dynamically generated pages in their indexes, but some particulars of dynamic pages can still be obstacles to getting indexed.

Whether it’s keeping in sync with inventory or updating a blog, if you’re a website owner you more than likely have some level of dynamic or CMS-managed content on your site (and if not, you should really look into it for your next redesign). Follow the guidelines here to avoid major pitfalls and ensure that your dynamic body of work is search engine friendly from head to toe.

Rule #1: Be sure that search engines can follow regular HTML links to all pages on your site.

Any website needs individually linkable URLs for all unique pages on the site.   This way every page can be bookmarked and deep linked by users, and indexed by search engines.  But dynamic websites have an additional concern: making sure the search engine robots can reach all of these pages.

For example, suppose you have a form on your website: you ask people to select their location from a pull-down, and then when people submit the form your website generates a page with content that is specifically written for that geographical area.  Search engine robots don’t fill out forms or select from pull-down menus, so there will be no way for them to get to that page.

This problem can be easily remedied by providing standard <a href> type HTML links that point to all of your dynamic pages. The easiest way to do this is to add these links to your site map.
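For illustration, a site map page for the location example above might contain nothing fancier than ordinary HTML links to each dynamically generated page (the /locations/ URLs here are hypothetical):

<!-- plain HTML links that any search engine robot can follow -->
<a href="/locations/los-angeles/">Los Angeles</a>
<a href="/locations/san-francisco/">San Francisco</a>
<a href="/locations/san-diego/">San Diego</a>

Once a robot finds the site map, it can reach every location page without ever touching the form.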

Rule #2: Set up an XML site map if you can’t create regular HTML links to all of your pages, or if it appears that search engines are having trouble indexing your pages.

If you have a large (10K pages or more) dynamic site, or you don’t think that providing static HTML links is an option, you can use an XML site map to tell search engines the locations of all your pages.
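If you haven’t seen one, an XML site map is simply a list of URLs in the standard format defined at sitemaps.org. A minimal sketch, using placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/church-bells/discount/</loc>
    <lastmod>2009-06-02</lastmod> <!-- optional: when the page last changed -->
  </url>
  <url>
    <loc>http://www.yoursite.com/church-bells/antique/</loc>
  </url>
</urlset>

Many CMS and shopping cart platforms offer plugins that will generate this file for you automatically.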

Most website owners tell Google and Yahoo! about their site maps through the search engines’ respective webmaster tools. But if you’re an early adopter, you should look into the new system whereby a site map can be easily designated in the robots.txt file using sitemap autodiscovery. Ask.com, Google, and Yahoo! currently support this feature. Cool!
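Autodiscovery amounts to a single line in your robots.txt file giving the full URL of your site map:

# robots.txt – sitemap autodiscovery
Sitemap: http://www.yoursite.com/sitemap.xml

Any robot that supports the feature will find your site map the next time it reads robots.txt, no webmaster tools account required.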

Rule #3: If you must use dynamic URLs, keep them short and tidy

Another potential problem – and this is one that is subject to some debate – is with dynamic pages that have too many parameters in the URL.  Google itself in its webmaster guidelines states the following: “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”

Here are a few guidelines you should follow for your website parameters:

  • Limit the number of parameters in the URL to a maximum of 2
  • Use the parameter “?id=” only when it refers to a session ID [Editor’s note: this is no longer a problem.]
  • Be sure that the URL functions if all dynamic items are removed
  • Be sure your internal links are consistent – always link with parameters in the same order and format
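To illustrate that last point: the two URLs below would typically return the same page, but because the parameters appear in a different order, a search engine may treat them as two separate URLs (the parameter names are hypothetical):

http://www.yoursite.com/prod.php?cat=bells&id=23485
http://www.yoursite.com/prod.php?id=23485&cat=bells

Pick one order and format, and use it in every internal link on your site.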

Rule #4: Avoid dynamic-looking URLs if possible

Besides being second-class citizens of search, dynamic-looking URLs are also less attractive to your human visitors.  Most people prefer to see URLs that clearly communicate the content on the page.  Since reading the URL is one of the ways that people decide whether to click on a listing in search engines, you are much better off having a URL that looks like this:

http://www.yoursite.com/church-bells/discount/

rather than this:

http://www.yoursite.com/prod.php?id=23485&blt=234

We also think that static-looking, “human-readable” URLs are more likely to receive inbound links, because some people will be less inclined to link to pages with very long or complicated URLs.

Furthermore, keywords in a URL are a factor, admittedly not a huge one, in search engine ranking algorithms. Notice how, in the above example, the static URL contains the keywords “discount” and “church bells” while the dynamic URL does not.

There are many tools available that will re-create a dynamic site in static form.  There are also tools that will rewrite your URLs, if you have too many parameters, to “look” like regular non-dynamic URLs.  We think these are both good options for dynamic sites. Intrapromote has a helpful post on dynamic URL rewriting.
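As a rough sketch of the rewriting approach: on an Apache server, a rule like the following in an .htaccess file would serve the dynamic script under the static-looking URL from our example (the paths and parameters are assumptions, not a drop-in recipe):

# .htaccess – map a clean URL onto the real dynamic script
RewriteEngine On
RewriteRule ^church-bells/discount/$ /prod.php?id=23485 [L]

Visitors and robots see only /church-bells/discount/, while the server quietly runs prod.php behind the scenes. See the Intrapromote post mentioned above for a fuller treatment.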

Rule #5: De-index stubs and search results

Have you heard of “website stubs”?  These are pages that are generated by dynamic sites but really have no independent content on them.  For example, if your website is a shopping cart for toys, there may be a page generated for the category “Age 7-12 Toys” but you may not actually have any products in this category.  Stub pages are very annoying to searchers, and search engines, by extension, would like to prevent them from displaying in their results.  So do us all a favor and either figure out a way to get rid of these pages, or exclude them from indexing using the robots.txt file or robots meta tag.

Search results from within your website are another type of page for which Google has stated a dislike: “Typically, web search results don’t add value to users, and since our core goal is to provide the best search results possible, we generally exclude search results from our web search index.” Here’s our advice: either make sure your search results pages add value for the searcher (perhaps by containing some unique content related to the searched term), or exclude them from indexing using the robots.txt file or robots meta tag.
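Either exclusion method is a small addition. If, say, your internal search results all live under a /search/ directory (a hypothetical path), robots.txt can block the whole directory at once, while an individual stub page can instead carry a robots meta tag in its <head>:

# robots.txt – keep robots out of internal search results
User-agent: *
Disallow: /search/

<!-- on a single stub page -->
<meta name="robots" content="noindex, follow">

The “noindex, follow” value tells the engines to skip the page itself but still follow its links to the rest of your site.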

Bonus Points: Handling duplicate content

While it’s not a problem that’s specific to dynamic sites, this rule is one that dynamic sites are more likely to break than static ones. If multiple pages on your site display materials that are identical or nearly identical, duplicates should be excluded from indexing using the robots.txt file or a robots meta tag.  Think of it this way: you don’t want all your duplicate pages competing with each other on the search engines.  Choose a favorite, and exclude the rest. [Editor’s note: we no longer (2009) recommend de-indexing duplicate content. A better approach is to either redirect your duplicate pages to the primary page using a server-side 301 redirect, or to set up a <link rel="canonical"> tag for any page that has been duplicated. A good explanation of best practices for handling duplicate content in 2009 can be found at Matt Cutts’ Blog.]
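To make the editor’s note concrete: the canonical tag goes in the <head> of each duplicate page, pointing at the primary URL you’ve chosen, while a 301 redirect can be declared in an Apache .htaccess file (both snippets reuse the hypothetical URLs from Rule #4):

<!-- on the duplicate page: point engines at the primary version -->
<link rel="canonical" href="http://www.yoursite.com/church-bells/discount/">

# .htaccess – permanently redirect an old duplicate to the primary page
Redirect 301 /church-bells/discount-old/ http://www.yoursite.com/church-bells/discount/

Of the two, the redirect is the stronger signal, since it consolidates visitors and link value onto a single URL.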

Dynamic content is usually timely and useful, which is why users love it, and the search engines want to list it. And now you know how to help your dynamic website reach its full search engine potential.


Our Favorite SEO Blogs and Tools

  • May 27, 2009
  • by Jennifer Grappone

In Search Engine Optimization: An Hour A Day, we listed some of our favorite SEO blogs and tools. Here are the links from the book, and some updates.

SEO Blogs We Love

  • Search Engine Land – search news by Danny Sullivan and his team of SEO experts
  • SEOMoz Blog with Rand Fishkin
  • Aaron Wall’s SEO Blog
  • Matt Cutts’ Blog (for Google information)
  • Matt McGee’s Small Business SEM
  • Problogger.net (for those with blogs)
  • Occam’s Razor – web analytics blog by Avinash Kaushik
  • Official Google Webmaster Central Blog

Fun Tools for Site Assessment

  • Link validators: elsop.com
  • Slow page load checker: WebPagetest, Webmaster Tools
  • Accessibility check: cynthiasays.com, w3.org (for more links to accessibility evaluation tools)
  • Backlink Checker: backlinkwatch.com, Yahoo! Site Explorer, OpenSiteExplorer
  • Social Media Search: Socialmention.com, Wiki of Social Search Tools
  • Firefox/Chrome Extensions:  SeoQuake, YSlow, Firebug

More SEO Tools We Recommend

  • Wordtracker Keyword Tool for researching keyword search popularity. Wordtracker also offers a limited free keyword research tool. (This is an affiliate link so if you purchase access through this link, Wordtracker will show us a little love, too!)
  • Keyword Discovery – this fee-based keyword research tool has garnered many positive reviews – free trial available
  • Google AdWords Keyword Tool – free; you can get keyword ideas and search popularity numbers here
  • SEOMoz Page Strength Tool (or see their other SEO Tools)
  • A truly fabulous spider emulator at seo-browser.com
  • MSN’s set of SEO tools includes finding similar keywords, clustering keywords, and several other features.
  • Rex Swain’s HTTP Viewer is a handy way to check if your 301 redirects are sending the right server message.
  • TouchGraph Google Browser – a cool way to visualize your website’s neighborhood
  • Socializer – links to social bookmarking sites
