404s: A Great Way to Lose Site Traffic!

404 errors, dead pages, broken links: whatever you like to call them, they are a great way to increase your bounce rate, decrease your online revenue and make you look sloppy. This form of conversion suicide can be avoided by monitoring your site closely, both to make sure you are not generating 404 errors yourself and to catch others on the web who are linking to your site incorrectly.

Not ensuring that all site pages are linked correctly and redirected appropriately is like waving goodbye to site visitors. There are a few common reasons why your site may return 404 errors. One is a URL rewrite: folders may have changed, yet certain internal pages never made it into the rewrite file. Other reasons for 404 File Not Found errors include internal links that point to pages caught in the scenario above, or simply a typo in an internal link. Another cause may not be your fault at all: someone elsewhere on the web may be linking to your domain, but to an incorrect page. This can also tie back to the first reason, where pages left out of the URL rewrite mean inbound links now arrive at pages that no longer exist.
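The rewrite-file problem above boils down to a redirect map: every old path retired by a rewrite should answer with a 301 pointing at its new home instead of falling through to a 404. Here is a minimal sketch in Python; the paths in `REDIRECTS` are hypothetical, and a real site would express the same map in its server or CMS configuration.

```python
# Hypothetical old-path -> new-path map built during a URL rewrite.
REDIRECTS = {
    "/old-services.html": "/services/",
    "/about-us.php": "/about/",
}

def resolve(path):
    """Return (status, path) for an incoming request path."""
    if path in REDIRECTS:
        # 301 (permanent) keeps visitors and preserves inbound-link value.
        return 301, REDIRECTS[path]
    return 200, path  # not retired: serve the page as requested
```

Any retired page missing from the map is exactly the "didn't make the rewrite file" case described above.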

I know, we have all been here before: we recognize our mistakes, run our heads into a wall a few times and move on. Below are a few suggestions I have found helpful for assessing the potential for 404s on your site.

  • Utilize site crawling tools. I use OptiSpider and like it because it quickly crawls the site and reports any 404 errors, so I know off the bat what needs to be fixed. With any new client, this is one of the first areas I assess. There’s no telling what your neighbor’s cousin Eddy missed when he redesigned your site last year.
  • Google Webmaster Tools can be very helpful for assessing dead pages, broken links and so on. The Crawl Errors section will tell you which pages are listed as unreachable and which inbound links from other sites are breaking.
  • Generate an XML sitemap. Sitemap generators are another form of crawling your site and will let you know what page errors they find. Google Webmaster Tools, mentioned above, will also report errors in submitted sitemaps, but this is another way to assess the situation from a different vantage point.
  • It should go without saying that you should have a custom 404 page for your site. If not, go ahead and run your head into a wall a few more times and come back to this post. A custom 404 page lets you tell visitors that they have reached a non-existent page and gives them the navigation they need to find the information they were looking for. This page should also carry the analytics tracking code that (hopefully) every other page template on your site possesses. I, along with the other 99% of webmasters and search marketers, use Google Analytics. Since the custom 404 page carries our tracking code, it is easy to find 404 errors that are currently indexed in search engines or are coming from referring sites. Simply go to the Content section, filter down to your 404 page, and do reverse path research to see which referring source, keyword and so on brought about the 404. I like this method because you can drill down on a recent 404 from a search engine listing and hopefully repair it before you lose the listing.
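The core check that a site crawling tool automates can be sketched in a few lines: request each URL and flag any that come back 404. This is a minimal standard-library version in Python, not how a commercial crawler actually works; a real tool also discovers the URLs itself by following links across the whole site.

```python
import urllib.error
import urllib.request

def find_dead_links(urls, timeout=10):
    """Return (url, status) pairs for links that answer HTTP 404."""
    dead = []
    for url in urls:
        # HEAD is enough to get the status without downloading the page.
        req = urllib.request.Request(url, method="HEAD")
        try:
            urllib.request.urlopen(req, timeout=timeout)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                dead.append((url, err.code))
    return dead
```

Feed it the internal links pulled from your pages and fix whatever it reports before your visitors find those pages first.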
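Generating the XML sitemap suggested above is also straightforward once you have a list of live URLs. A minimal sketch using Python's standard library, with placeholder URLs; a real build would take its URL list from a crawl of the site or from the CMS.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (as a string) for the given absolute URLs."""
    # Namespace required by the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")
```

Submit the resulting file in Google Webmaster Tools and it will report any URLs it cannot reach.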

Managing your 404s is not that hard once you have adopted a few methods such as those above. If you are working with an internal web team or a third-party development resource, it will also save a lot of time to host a quick SEO 101 crash course on proper page redirection and internal linking practices, to get everyone on the right page, no pun intended.

Need help making your site perfect?