SEO and Your Website Redesign: Avoiding the Pitfalls

Reading the SEO message boards regularly, one issue comes up again and again, usually after a site has plummeted in Google, Yahoo, and MS Live Search immediately following a redesign. The site owner is sweating, traffic has dropped through the basement, and he or she is scrambling to make things right. You can avoid most of that. If you carefully consider what search engines penalize, and work around those traps when redesigning your website, you can spiff up your site’s look without losing ground in the search engines.

The 404 Issue

The most devastating, and most avoidable, mistake made when rolling out a site redesign is to completely change the URL structure of a site without putting any safeguards in place. When a search engine spider, or a visitor who has bookmarked a page or followed a link, hits a “404 Page Not Found” message, you lose indexing in the search engines and the linkbacks that earned the page good SERPs in the first place, not to mention the visitor, who will most likely just hit the back button. The best way to ensure that your site won’t tank is either to keep the URL structure the same, or to put 301 (permanent) redirects in place via the .htaccess file, pointing each old page URL to the corresponding new page URL. cPanel hosting has a tool for this purpose, and most other hosting control panels do too: you simply enter the old URL you want to redirect from and the new URL you want to redirect to, and save. Have the full list of URL changes ready so you can put every redirect in place immediately upon switchover.
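On an Apache server, those redirects are plain one-line directives in the .htaccess file. Here’s a minimal sketch; the page paths are made-up placeholders, so substitute your own old and new URLs:

    # .htaccess — one permanent (301) redirect per moved page.
    # The paths below are hypothetical examples, not a prescription.
    Redirect 301 /about-us.html /about/
    Redirect 301 /products/widgets.html /shop/widgets/
    Redirect 301 /contact.htm /contact/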

When launching the new site you should also, when possible, notify sites that have linked to your pages. Include the URL on their site where your link is located, the old URL that will no longer be in effect, and the new URL. This courtesy will, in most cases, ensure that you continue to get a link from that source; not doing so might lose you the link if there isn’t at least a 301 in place, which hurts you in the search engines and annoys both visitors and the people who link to you. If you have hundreds of thousands of links this might not be possible, but consider at least hitting the high points on the PR scale and notifying the sites with the most Google juice. And though it’s best to notify people, if your 301 redirects are in place you at least won’t be breaking anyone’s links.

Of course, keeping the same URL structure is a much better way to handle all this, but sometimes that just isn’t possible, especially if you’re switching from a static HTML site to a database-driven site. In that case, 301s are a must. Since most dynamic sites are tested locally prior to deployment, you can generate sitemaps of both the old and the new site and use them to build your 301 redirection list.
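If the new URLs follow a predictable pattern, mod_rewrite can handle the whole mapping with a single rule instead of one redirect per page. This is a sketch under an assumed structure — old pages like /widgets.html moving to /index.php?page=widgets — so adjust the pattern and target to your own site:

    # .htaccess — pattern-based 301s from old static pages to dynamic equivalents.
    # Assumes /something.html now lives at /index.php?page=something (an
    # illustrative assumption; your own CMS’s URL scheme will differ).
    RewriteEngine On
    RewriteRule ^([a-z0-9-]+)\.html$ /index.php?page=$1 [R=301,L]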

On-Site SEO Page Factors

Another huge problem is changes made to on-site page factors. The most obvious of these live in the HTML head: the page title, the meta description, and any other meta tags. If these elements change, so will your SERPs and ranking, usually downward when coupled with a lot of other changes, so minimize these changes as much as possible. If you are switching from a static HTML site to a database-driven site, choose a platform that lets you control on-page factors such as the title, meta description, and meta keywords for each page, and keep them as close to the original head elements as you can on every page of your site.
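Concretely, these are the head elements to carry over. The values shown are placeholders; the point is simply that each new page’s head should match its old counterpart as closely as possible:

    <!-- Carry these over from the old page as closely as possible;
         the values here are placeholder examples. -->
    <head>
      <title>Acme Widgets | Hand-Made Widgets Since 1999</title>
      <meta name="description" content="Hand-made widgets, shipped worldwide." />
      <meta name="keywords" content="widgets, hand-made widgets, widget shop" />
    </head>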

The Codebase: Making the Skeleton Stronger

The third thing that affects SERPs and ranking is harder to control, since a site redesign by its very nature means code changes, but it is less of a concern if handled properly. Because the source order of information within the HTML will change, the way the search engines look at your pages may change too, at least for a while. But if you upgrade your code to cleaner, leaner, standards-compliant table-less XHTML/CSS, you may come out better in the end, even if there is a dip at first. One of the things search engines take into account is the code-to-content ratio: you want to decrease the amount of code and increase the amount of content a spider finds when parsing the HTML, whether it’s static or dynamically generated. The easiest way to do this is to move all CSS and JavaScript into separate files, pulled in through the head via <link> and <script src> references. That part of the code can then be cached, which lightens your HTML, good for visitors and spiders alike. And a well source-ordered layout puts your content front and center for both the spiders and your visitors, and is friendlier to those using alternative devices such as screen readers.
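Put together, a lean, source-ordered skeleton might look like the sketch below. The file names and ids are placeholders, but the shape is the point: styles and scripts externalized, and the main content appearing before the sidebar in the source:

    <!-- Sketch of a lean, source-ordered page; names are placeholders. -->
    <head>
      <title>Page Title</title>
      <link rel="stylesheet" type="text/css" href="/css/style.css" />
      <script type="text/javascript" src="/js/site.js"></script>
    </head>
    <body>
      <div id="content"><!-- main content comes first in the source --></div>
      <div id="sidebar"><!-- secondary navigation follows --></div>
    </body>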

Taking these simple precautions when launching a new site design will considerably lessen the negative impact. Though you may see a drop in the first few weeks as the search engines adjust to your new codebase, you should rebound to your former, or possibly even better, search engine placement in no time.

About the Author

BJ Novack is also known as Kickass Web Design, a Web Design Boutique specializing in Custom WordPress Themes, Cubecart Templates, ModX Templates, and other Dynamic Templating for Small Business and Non-Profit Websites.
