Gravity Search Marketing

Dynamic Website SEO Terror Level Downgraded to Yellow

  • June 2, 2009
  • by Jennifer Grappone

Dynamic content used to be a red flag for search engine friendly design, but times have changed. Search engines now include dynamically generated pages in their indexes, though some particulars of dynamic pages can still be obstacles to getting indexed.

Whether it’s keeping in sync with inventory or updating a blog, if you’re a website owner you more than likely have some level of dynamic or CMS-managed content on your site (and if not, you should really be looking into it for your next redesign). Follow the guidelines here to avoid major pitfalls and ensure that your dynamic body of work is search engine friendly from head to toe.

Rule #1: Be sure that search engines can follow regular HTML links to all pages on your site.

Any website needs individually linkable URLs for all unique pages on the site.   This way every page can be bookmarked and deep linked by users, and indexed by search engines.  But dynamic websites have an additional concern: making sure the search engine robots can reach all of these pages.

For example, suppose you have a form on your website: you ask people to select their location from a pull-down, and then when people submit the form your website generates a page with content that is specifically written for that geographical area.  Search engine robots don’t fill out forms or select from pull-down menus, so there will be no way for them to get to that page.

This problem can be easily remedied by providing standard <a href> type HTML links that point to all of your dynamic pages. The easiest way to do this is to add these links to your site map.
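For the location-form example above, a plain HTML site map page might link to every geographic page directly, so robots never need to touch the form. A sketch (the script name and region parameter are hypothetical):

```html
<!-- Plain <a href> links that crawlers can follow; no form submission required -->
<ul>
  <li><a href="/locations.php?region=west">West Coast</a></li>
  <li><a href="/locations.php?region=midwest">Midwest</a></li>
  <li><a href="/locations.php?region=east">East Coast</a></li>
</ul>
```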

Rule #2: Set up an XML site map if you can’t create regular HTML links to all of your pages, or if it appears that search engines are having trouble indexing your pages.

If you have a large (10K pages or more) dynamic site, or you don’t think that providing static HTML links is an option, you can use an XML site map to tell search engines the locations of all your pages.
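An XML site map is just a list of URLs in a standard wrapper, so large dynamic sites typically generate it with a short script. A minimal sketch in Python using only the standard library (the URLs are placeholders):

```python
# Minimal XML sitemap generator (sketch; URLs are placeholders)
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return an XML sitemap string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.yoursite.com/",
    "http://www.yoursite.com/church-bells/discount/",
])
print(sitemap)
```

A real generator would pull the URL list from your database or CMS rather than a hard-coded list.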

Most website owners tell Google and Yahoo! about their site maps through the search engines’ respective webmaster tools. But if you’re an early adopter, you should look into the new system whereby a site map can be easily designated in the robots.txt file using sitemap autodiscovery. Ask.com, Google and Yahoo! currently support this feature. Cool!
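Autodiscovery is a one-line addition to your robots.txt file (the sitemap URL below is a placeholder):

```text
# robots.txt — sitemap autodiscovery
Sitemap: http://www.yoursite.com/sitemap.xml
```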

Rule #3: If you must use dynamic URLs, keep them short and tidy

Another potential problem – and this is one that is subject to some debate – is with dynamic pages that have too many parameters in the URL.  Google itself in its webmaster guidelines states the following: “If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”

Here are a few guidelines you should follow for your website parameters:

  • Limit the number of parameters in the URL to a maximum of 2
  • Use the parameter “?id=” only when it refers to a session ID [Editor’s note: this is no longer a problem.]
  • Be sure that the URL functions if all dynamic items are removed
  • Be sure your internal links are consistent – always link with parameters in the same order and format
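One way to keep parameter order consistent, per the last guideline, is to route every internal link through a single canonicalizing helper. A sketch in Python using only the standard library (the product ID is the one from the example below; the category parameter is made up):

```python
# Canonicalize query-parameter order so internal links are always consistent
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    """Rebuild a URL with its query parameters sorted alphabetically."""
    parts = urlsplit(url)
    params = sorted(parse_qsl(parts.query))
    return urlunsplit(parts._replace(query=urlencode(params)))

# Both orderings of the same link normalize to one URL
print(canonical_url("/prod.php?id=23485&cat=bells"))
print(canonical_url("/prod.php?cat=bells&id=23485"))
```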

Rule #4: Avoid dynamic-looking URLs if possible

Besides being second-class citizens of search, dynamic-looking URLs are also less attractive to your human visitors.  Most people prefer to see URLs that clearly communicate the content on the page.  Since reading the URL is one of the ways that people decide whether to click on a listing in search engines, you are much better off having a URL that looks like this:

http://www.yoursite.com/church-bells/discount/

rather than this:

http://www.yoursite.com/prod.php?id=23485&blt=234

We also think that static-looking, “human-readable” URLs are more likely to receive inbound links, because some people will be less inclined to link to pages with very long or complicated URLs.

Furthermore, keywords in a URL are a factor, admittedly not a huge one, in search engine ranking algorithms. Notice how, in the above example, the static URL contains the keywords “discount” and “church bells” while the dynamic URL does not.

There are many tools available that will re-create a dynamic site in static form.  There are also tools that will re-write your URLs, if you have too many parameters, to “look” like regular non-dynamic URLs.  We think these are both good options for dynamic sites. Intrapromote has a helpful post on dynamic URL rewriting.
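On an Apache server, for instance, mod_rewrite can serve the dynamic page under the static-looking URL from the example above. A sketch (your actual file names and parameters will differ):

```apache
# .htaccess sketch: map a static-looking URL onto the underlying dynamic script
RewriteEngine On
RewriteRule ^church-bells/discount/?$ prod.php?id=23485 [L,QSA]
```

The [QSA] flag preserves any extra query parameters a visitor arrives with, and [L] stops further rewriting.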

Rule #5: De-index stubs and search results

Have you heard of “website stubs?”  These are pages that are generated by dynamic sites but really have no independent content on them.  For example, if your website is a shopping cart for toys, there may be a page generated for the category “Age 7-12 Toys” but you may not actually have any products in this category.  Stub pages are very annoying to searchers, and search engines, by extension, would like to prevent them from displaying in their results.  So do us all a favor and either figure out a way to get rid of these pages, or exclude them from indexing using the robots.txt file or robots meta tag.

Search results from within your website are another type of page for which Google has stated a dislike: “Typically, web search results don’t add value to users, and since our core goal is to provide the best search results possible, we generally exclude search results from our web search index.” Here’s our advice: either make sure your search results pages add value for the searcher (perhaps by containing some unique content related to the searched term), or exclude them from indexing using the robots.txt file or robots meta tag.
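Both exclusion methods mentioned above are one-liners. In robots.txt you can block a whole directory of search results (the /search/ path is an assumption; use whatever path your site actually uses):

```text
# robots.txt — keep internal search results out of the index
User-agent: *
Disallow: /search/
```

For individual stub pages, a robots meta tag of `<meta name="robots" content="noindex">` in the page’s head accomplishes the same thing page by page.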

Bonus Points: Handling duplicate content

While it’s not a problem that’s specific to dynamic sites, this rule is one that dynamic sites are more likely to break than static ones. If multiple pages on your site display materials that are identical or nearly identical, duplicates should be excluded from indexing using the robots.txt file or a robots meta tag.  Think of it this way: you don’t want all your duplicate pages competing with each other on the search engines.  Choose a favorite, and exclude the rest. [Editor’s note: we no longer (2009) recommend de-indexing duplicate content. A better approach is to either redirect your duplicate pages to the primary page using a server-side, 301 redirect, or to set up a <link rel="canonical"> tag for any page that has been duplicated. A good explanation of best practices for handling duplicate content in 2009 can be found at Matt Cutts’ Blog]
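The canonical tag mentioned in the editor’s note goes in the head of each duplicate page and points at the preferred URL (the URL below reuses the earlier example):

```html
<!-- In the <head> of every duplicate page, pointing at the preferred version -->
<link rel="canonical" href="http://www.yoursite.com/church-bells/discount/" />
```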

Dynamic content is usually timely and useful, which is why users love it, and the search engines want to list it. And now you know how to help your dynamic website reach its full search engine potential.


Oops, I Redesigned My Website! An SEO Checklist

  • June 27, 2008
  • by Jennifer Grappone

A website redesign is a time for celebration! But if you recently redesigned your website without thinking about the effect on its search engine presence, you may be in for a rude awakening. Follow a few simple guidelines to be sure that your fabulous new site isn’t going incognito.

One question that we are asked over and over again is this: “I just redesigned my website.  How can I make sure that I don’t lose my search engine rankings?”

If you just launched a redesigned website, or you are about to go through a revamp, we’ve made this checklist for you.  Follow it, and your grand debut won’t become a search engine flop.

You Never Know What You’ve Got Until It’s Gone

How did visitors get to your old site? Knowing this will help you know where to focus your efforts when reclaiming lost traffic.  For a meaningful baseline, dig up this information about your old site:

  • Conversions. Do you have any data on sales, leads, or other performance of your old website?  Make a note of it so that you can compare it against the new site.
  • Search Engine Rankings. Did you track your old site’s search engine ranks? Is your new website targeting the same keywords? If so, you’ll want to keep a record of your old site’s ranks as a baseline. You can learn the best way to track search engine ranks in chapter 6 of our book, SEO: An Hour a Day, and you can record your rankings on our downloadable SEO rank tracking worksheet.
  • URL “Hot List.” What were the most visited pages on your old website? And, what were the most common entry pages (the pages that your audience comes to first)? These pages will be your highest priorities for “cleanup time” tasks (we’ll explain below).
  • Inbound Links. It’s important to know how many other sites are linking to your old website – especially if you’re changing URLs. While the previous tasks in this list require some recordkeeping before the new site is live, this is one you can do just as well after launch.  See our handy search shortcuts page to learn how to find out who’s linking to your site.

On the Day of Launch: Handle With Care

With key background info in hand, you’ll be in a good position to manage your site redesign with care. Here are some site launch best practices for a website redesign:

  • Avoid Changing URLs. Keep as many of your old URLs as possible. Don’t change your domain name if you can avoid it!
  • Page Redirects – must happen concurrently with launch. In an ideal world, every URL from your old website would redirect to an appropriate page on your new website.  But we know that this can be hard to achieve in reality.  So try this on for size: using a server-side 301 setting (your IT people will know what this is), redirect the following pages, in order of priority:
  1. Your Website Home Page (be sure you get all the variations – index.asp, index.php, or whatever you have)
  2. Any Pages with Special Status (e.g., Customer Support) That Makes Them Important to Your Business
  3. Your Top Entry Pages
  4. Your Top Most Popular Pages
  5. Pages on the Path to Conversion (the pages that visitors often visited on their journey from entering the site to converting)
  6. Any Other Pages on the Website.

Don’t delay this step – you need the redirects in place before the search engine robots come back and visit your site. And be sure, when setting up the redirect, that each page from your old site goes to a well-chosen landing page on your new site – not just the home page!

P.S.: Looking to go techie? Here’s a link to an excellent guide to redirects in all sorts of languages (PHP, .htaccess, Ruby on Rails and so on).
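On an Apache server, for instance, the highest-priority redirects can be set up in a few lines of .htaccess (the old and new paths below are placeholders for your own URLs):

```apache
# .htaccess sketch: permanent (301) redirects from old URLs to new ones
Redirect 301 /index.asp http://www.yoursite.com/
Redirect 301 /old-support.html http://www.yoursite.com/customer-support/
```

The 301 status tells search engines the move is permanent, so the new URL inherits the old one’s standing.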

  • Server Downtime. It probably goes without saying that server downtime is to be avoided; however, most reports we’ve read say that Google will come back again if a page fails once. Here are some tips if you’re moving your website to a different server.
  • Internal Links. No website is immune to broken links – even your own. After a redesign be sure to run a link validator on your website to be sure that those internal links have all been updated properly. These are available online or using website development software (in Adobe Dreamweaver, for example, select Site > Check Links Sitewide).
  • File Not Found Page. As a result of all the potential broken links listed above, your audience will probably see the “File Not Found” Page more often than usual after a site redesign.  Be sure it’s well written and explains the situation. (“To better serve our customers, we have redesigned our website” is good; “404 Error!  The URL you have typed is incorrect!” is not.)  Better yet, include links to those most-popular pages you figured out earlier.
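The core of the link validation described above is simple: collect every href on a page, then request each one and flag failures. A sketch of the first half in Python, using only the standard library (the sample HTML is made up):

```python
# Extract link targets from a page's HTML -- the first step of a link validator
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href value of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<p><a href="/church-bells/discount/">Discount bells</a> '
               '<a href="/contact/">Contact</a></p>')
print(collector.links)
```

A full validator would then request each collected URL and report any that return errors.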

After Launch

  • Monitor 404 errors. After your new site launches, keep an eye on your website stats, so you can see if there are a large number of “File Not Found” errors showing up for a particular URL. You can also see a list of broken links at Google Webmaster Tools. Any pages with a large number of errors should receive 301 redirects.
  • Inbound Links. There may be dozens or hundreds of links all over the web linking to non-existent URLs on your website.  Each of these should receive a polite request for an update – but don’t hold your breath.  In our experience, less than half of these requests result in an update. If you’ve got analytics in place, you can review which links are sending the most traffic and pester in order of importance.
  • Directory Listings. Really just a special case of “inbound links,” directory listings deserve a little special attention.  Take the time to submit whatever form you need to make sure that they are linking to the correct URLs.
  • XML Sitemap. Today’s search engines are smart enough to follow a 301 redirect through to a new page and index it properly – with no loss in search engine presence.  How long will this take?  We’ve seen it take 3 months before an old site is entirely flushed.  Can you speed up the process?  Maybe. Some SEOs swear by XML Sitemaps for getting pages indexed quickly. We haven’t seen this in action, but it certainly wouldn’t hurt. Here is a helpful article on how to submit your sitemaps.
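The 404 monitoring described in the first item above can also be scripted directly against your server logs. A sketch in Python, assuming common-format access log lines (the sample lines are made up):

```python
# Count 404 responses per URL from common-format access log lines
from collections import Counter

def count_404s(log_lines):
    """Return a Counter of requested paths that returned a 404 status."""
    misses = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) >= 3:
            request, status = parts[1], parts[2].split()
            if status and status[0] == "404":
                misses[request.split()[1]] += 1
    return misses

sample = [
    '1.2.3.4 - - [01/Jun/2008] "GET /old-page.html HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jun/2008] "GET /index.html HTTP/1.1" 200 1024',
    '5.6.7.8 - - [02/Jun/2008] "GET /old-page.html HTTP/1.1" 404 512',
]
print(count_404s(sample))
```

URLs that show up repeatedly in this count are the ones most in need of a 301 redirect.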

We know what you’re thinking: You worked so hard on your shiny new website, and just when you want to kick back and relax a little, we’ve created a substantial new pile of work for you. Is it worth it? Absolutely! You don’t want your redesign to turn away your most desirable visitors – people who are actively trying to come to your site! You could hand all this traffic over to your competitors, or you could identify your best sources of traffic, and take important steps to keep them coming!

Who We Are
Gravity Search marketing is led by SEO industry veteran and author Jennifer Grappone in Los Angeles. The company was founded in 2006 following the success of the book Search Engine Optimization: An Hour a Day (Wiley, 2006, 2008, 2011), which Jennifer co-authored. Gravity’s clients include Fortune 500 companies, global entertainment brands, niche B2Bs, large and small retailers, and nonprofits.
Our small, talented California-based team specializes in SEO, advertising, analytics, and online brand visibility. Senior Technology Manager Andrew Berg, who joined Gravity in 2009, elevates the company’s technical SEO expertise to an elite level.
Deeply dedicated to our clients’ success, we’re known for clear communications, effective SEO guidance, and a commitment to transparency and ethical business practices.
