Organic SEO Blog

Friday, January 20, 2006

Link Architecture Critical To Organic SEO Success!

Just got off the phone with a senior engineer at Google. The conversation turned from the feds hanging out in Google's lobby today, unsuccessfully requesting the last two weeks of Google keyword search data, to linking and redirects on websites.

It was confirmed that recent algorithmic updates have been rolled out that reduce the value of incoming links and lower third-party link casting scores. Too many sneaky redirects were dragging down the relevancy of Google's organic results.

Instead, a renewed focus of the Googlebot spiders is link navigation. So many websites have moved to sticky include files, especially large, dynamically generated database sites, housing their links in JavaScript and Flash, despite the fact that the Googlebot spiders often cannot follow such links.
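To make the point concrete, here is a minimal sketch, in Python and in no way Googlebot's actual code, of how a text-based crawler discovers links: it reads the raw HTML and collects plain <a href> values. The markup below is made up for the example; notice that the link hidden in an onclick handler and the link written out by JavaScript never surface.

# Minimal sketch of text-based link discovery (illustrative, not Googlebot).
# Only plain <a href> anchors in the raw HTML are found; links that exist
# solely inside JavaScript handlers or script-written markup are invisible.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

raw_html = """
<a href="/products/widgets.html">Widgets</a>
<span onclick="window.location='/products/gadgets.html'">Gadgets</span>
<script>document.write('<a href="/specials.html">Specials</a>');</script>
"""

collector = LinkCollector()
collector.feed(raw_html)
print(collector.links)  # ['/products/widgets.html'] -- only the plain anchor is discovered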

Creating a secondary navigation is a step in the right direction. However, these secondary navigation links are just that, secondary, and as such typically cannot make enough of an impact to offset or compensate for the poorly designed, less-than-search-engine-friendly dynamic link navigation used for the primary links of the website.

Why should a search engine spider favor or trust sites that use dynamic link navigation, especially if the site combines one bad design idea with another and also uses URLs that feature dynamic query strings? This deadly dynamic combination of JavaScript/Flash link navigation on "character-heavy, spider-choking query string URLs" typically leaves a site invisible to new in-market searchers, found by no keyword search, floating around aimlessly in the invisible 'dark web'.
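For illustration, here is a hypothetical audit helper, not any Google tool, that flags the kind of character-heavy, spider-choking URLs described above. The parameter names and thresholds are assumptions picked for the example.

# Hypothetical audit helper: flag URLs with long query strings, many
# parameters, or session IDs. Thresholds and parameter names are assumptions.
from urllib.parse import urlparse, parse_qs

SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}  # assumed names

def is_spider_unfriendly(url, max_params=2, max_length=100):
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    has_session_id = any(p.lower() in SESSION_PARAMS for p in params)
    return (len(params) > max_params
            or len(url) > max_length
            or has_session_id)

urls = [
    "http://www.example.com/products/widgets.html",
    "http://www.example.com/catalog?cat=12&subcat=44&item=9987&sort=price&sessionid=AB12CD34",
]
for url in urls:
    print(is_spider_unfriendly(url), url)
# False http://www.example.com/products/widgets.html
# True  http://www.example.com/catalog?cat=12&...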

How can a Google spider be expected to "trust" sites with dynamic link navigation or redirect links to actually serve users relevant content, or even to render properly in all browsers, on PDAs, on wireless web-enabled cell phones, or in alternative web browsers designed for people with disabilities?

Google encourages sites with problematic dynamic link navigation schemes to create a Google Sitemap URL and an XML feed providing a text URL directory, a text-based crawler roadmap for the spiders. This is a step in the right direction, but even these spider sitemaps cannot compensate for the critical site design flaw of housing the primary link navigation in JavaScript and Flash.
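For reference, one of those spider sitemaps is nothing more than a bare list of URLs. The sketch below builds one in the standard sitemaps.org XML format, using placeholder example.com addresses, which is exactly why such a file lists URL names and nothing else.

# Minimal sketch of a sitemaps.org-style XML sitemap; URLs are placeholders.
# As the post notes, this is a crutch, not a fix for JavaScript/Flash navigation.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

pages = [
    "http://www.example.com/",
    "http://www.example.com/products/widgets.html",
]
print(build_sitemap(pages))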

The spiders are programmed to crawl and index rendered pages and engage their semantic text matching capabilities on the 'true' rendered pages of sites, not on specialty text-based spider sitemaps that only list URL names and feature zero relevant page content.

In other words, why sacrifice the keyword ranking capabilities of your best pages by linking to them with links embedded in JavaScript or Flash?


When the powers that be at your company begin knocking on your cube walls and tapping your shoulder at the water cooler, moaning over their inability to find the company site on Google searches for the company name and exclusive product/brand names, try explaining why JavaScript- and Flash-based links are being used in the primary link navigation and how pretty they are.

Do you think upper management will understand why the site is not being found on searches for the company by name?

I suggest touching up the resume and confirming your COBRA plan eligibility before starting this discussion. Is it fair to conclude that search engine visibility far outweighs design aesthetics? It should, because without search engine exposure no one will ever see the company Picasso.

Consider that 90% of all first-time website visitors are delivered to a site by a major search engine.

Link Architecture and W3C Code Validation Are Absolutely Critical To Organic SEO Success!

---