Organic SEO Blog


Wednesday, October 20, 2004

Reprinted from the BBC, London

Here is a recent piece on the corporate search developments at Google. Similar programs are said to be coming out worldwide in the coming days.

Google rolls out corporate search

Google has begun the global roll-out of a technology which enables firms to "google" their own networks easily. The search firm, recently floated on the Nasdaq stock market, says companies will soon discover search as a "critical business function".
This could kickstart the corporate search market, now worth just $500m.
But the new "Google in a box" solution carries its own hazards, making all documents on an intranet visible to all, unless security steps are taken.
The Google look and feel
Google's technology comes in a shiny bright-yellow box, which slots into a company's server racks.
The search engine inside is similar to Google's own web search technology, tweaked for the enterprise market, and runs on an Intel computer running a version of the open-source Linux operating system.
However, it is not new. For the past two years, customers in the United States have already been able to buy their in-house Google.
Google says that after some fine-tuning the technology has "matured". Even large companies should be able to index and search their sprawling intranets within a few hours of switching on the Google search appliance, which can recognise about 250 different file formats.
Users of these intranets - websites that are visible only within the company - will get their search results in a format that is very similar to Google's traditional internet search, including small excerpts from the documents found and links to cached versions of the documents listed.
The appliance can also be deployed as a search engine for a company's public website.
And those who want to avoid the Google 'look and feel' can customise the software.
The hunt for information
The launch comes days after Google launched a free software tool that allows users to index all the information on the hard drive of their personal computer - from e-mail to all kinds of documents.
But the desktop search does not reach corporate networks like those in place at Paragon Bank Mortgage, Raleigh, North Carolina's leading home mortgage bank.
"A lot of the world's information is not public but behind firewalls in private networks," said Dave Girouard, the general manager of Google's enterprise business solutions at the European launch of the search appliance in London.
He happily quotes surveys that suggest that in some knowledge-driven companies workers may spend a quarter of their time looking for information, while poor search facilities on corporate websites are a sure turn-off for potential customers.
And without naming names he takes swipes at the "poor quality of search results" and the "long expensive installation" of rival products.
Investment bank Morgan Stanley, for example, saw the number of internal searches grow eleven-fold once Google's search technology was added to the company's network.
And even though the market for enterprise search is small right now, Mr Girouard expects it to take off once companies realise how search can make their workers work faster.
The Google appliances are aimed at medium-sized and large corporations. The most basic version, which can index about 150,000 documents, costs £19,000 ($34,000) for a two-year licence.
Larger versions, dubbed "Mama Bear and Papa Bear" solutions by Mr Girouard, come in clusters of five or eight search boxes and can handle up to 15 million documents and 1,000 queries a minute. In the US market Google's licence fees for such heavy-duty search applications start at about $660,000 (£370,000).
Total information awareness
But there are security risks too. Once a whole intranet is indexed and cached in a Google server, it presents a neat takeaway package of a company's secrets.
Server room security suddenly becomes even more important - although Google's David Bercovich hastens to point out that somebody stealing the yellow Google box would find it very difficult to extract any information from the machine.
The other problem is information awareness. Every document can easily be found.
Firms that don't have proper security systems in place that clearly define who is allowed to look at which documents might suddenly find their workers googling for terms like "salary". Split seconds later they might find a spreadsheet detailing the salaries of the management team, which used to be hidden away in a little-known corner of the intranet.
Google says its search engine links in with various software applications from other vendors that regulate access to sensitive documents. Workers will only get search results for the documents that they are actually allowed to see.
Firms or institutions that have failed to put such basic security in place might find themselves in trouble.
Another drawback: the system can only return documents that can be found using an http or https address on the network. This takes many customer relationship management systems out of the equation.
But Google's Dave Girouard promises that his developers are "working hard" to make it possible to search all the information available in a company.

Donald Bice
Senior Project Engineer
www.peakpositions.com

Thursday, October 14, 2004

Hello All,

I thought a great introductory post would be to discuss some of the leading site design obstacles that prevent web sites from being found.

Move your web site out of the invisible 'dark web'!


The results on the major search engines are produced by machines: data collection and storage systems programmed to respond to millions of user keyword queries every day. The server farms responsible for displaying those organized series of hypertext links and text are under tremendous demand; their spiders must not only visit sites and collect information, they must then store and 'bank' the URLs being collected. As we review some common roadblocks and barriers that the spiders must overcome on their content acquisition journey, keep in mind that these hypertext spiders are programmed to review (index) billions of web pages (URLs) written in all types of code and program variations.

Site Design Problems: Besides the design problems that leave web pages missing from the search engine results entirely, there are many other design elements commonly used by Web developers that can prevent the search engine spiders from indexing the pages of a site and result in poor rankings. Here are some common web site design elements that keep web sites invisible and floating around aimlessly in the Dark Web rather than being found consistently in the search engines on the keyword searches that matter.

Technical Barriers That Prevent Indexing: Many types of technical barriers continue to prevent the search engine spiders from fully absorbing the page content and information contained within billions of web sites. Marketing managers, IT directors, webmasters, and Internet publishers must consider a few conditions whenever publishing on the Internet if the spiders are to properly record their pages and URLs and 'open up' their domains to the algorithmic robot crawlers, and thus the public. To be found early, often, and consistently in the major search engines, website publishers and Internet marketing professionals must keep a few of these top issues in mind. Some spiders struggle out of the gate as they land on a website for the first time and fail to record the site because of a poorly composed robots exclusion file. The major search engine spiders will ignore even the most popular sites if the contents of the robots.txt file are incorrect.
Our technicians at Peak Positions offer proprietary solutions that help streamline spider indexing through the construction and placement of a comprehensive robots exclusion file, which helps organize the table of contents so desperately sought by the major search engine spiders. Feel free to contact our technical team if you have questions, concerns, or are uncertain of the required protocols for robots exclusion files.
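
For illustration, here is a minimal robots exclusion sketch (the directory names below are placeholders, not a client configuration). It welcomes every crawler while keeping it out of areas that should not be indexed; note that a single stray "Disallow: /" would instead lock the spiders out of the entire site, which is exactly the kind of poorly composed file that keeps even popular domains out of the indexes.

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/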

W3C HTML Code Compliance and Validation: Invalid HTML code is one of the leading causes of search engine positioning problems. Code validation and code compliance allow search engine spiders to move comfortably through URLs and also prevent 'spider traps' and 'denial of service'.
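
As a simple illustration, a page that passes the W3C's free validation service at http://validator.w3.org starts with a proper document type declaration and keeps its markup clean. A minimal sketch (the title and content are placeholders):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Page Title Describing the Relevant Content</title>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
</head>
<body>
<h1>Primary Topic Heading</h1>
<p>Relevant page content the spiders are seeking.</p>
</body>
</html>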

Broken Links: Broken links and server downtime also prevent sites from being found on the search engines, especially when a lead spider crawling the site is interrupted or lands on a broken link. Scheduled server maintenance can interfere with the spiders in the same way.
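
One simple way to catch broken links before the spiders do is to run a small script against a list of your site's URLs. A minimal sketch in Python (the URLs shown are placeholders; a real check would walk every internal link):

import urllib.error
import urllib.request

def check_links(urls):
    # Print the HTTP status of each URL so broken links can be repaired before a spider lands on them.
    for url in urls:
        try:
            request = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(request, timeout=10) as response:
                print(url, response.status)
        except urllib.error.HTTPError as err:
            print(url, "BROKEN:", err.code)
        except urllib.error.URLError as err:
            print(url, "UNREACHABLE:", err.reason)

check_links([
    "http://www.example.com/",
    "http://www.example.com/old-page.html",
])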

Content Management Systems: Content management systems that refresh or deliver updated content on a regular basis often create tremendous confusion for search engines, resulting in URL trust factors that restrict websites from attaining exposure and reaching the qualified, in-market users searching for their services on Google, Yahoo, MSN, and AOL. The page contents are dynamically updated and change regularly, which is a 'red flag' in itself, and the code involved is frequently invalid, leaving 'spider traps' all over the pages of the site. If your company is using a content management system and your site is not being found on the search engines, contact our skilled technical team and we can design an affordable optimization solution that will deliver your content in a clean and valid format to users worldwide. Again, search engine spiders are seeking relevant content; let's help the world find your valuable content by shining light onto your web site and removing it from the Dark Web.

Frames Website Design: Frames website design is oftentimes a major web site optimization problem. While the search engine spiders can crawl pages from a frames-based design, they cannot accurately parse page text or index page content correctly. Frames web sites usually earn little if any consistent keyword rankings in the major search engines.
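
If a frames design cannot be abandoned outright, one conventional mitigation is to place crawlable markup and links inside a noframes element so the spiders still find real content. A minimal sketch (the file names are placeholders):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Frameset//EN" "http://www.w3.org/TR/html4/frameset.dtd">
<html>
<head><title>Frames Page With a Crawlable Fallback</title></head>
<frameset cols="25%,75%">
<frame src="navigation.html" name="nav">
<frame src="content.html" name="main">
<noframes>
<body>
<p>Spiders that cannot parse frames see this content instead:
browse our <a href="content.html">main content</a> and <a href="navigation.html">site navigation</a>.</p>
</body>
</noframes>
</frameset>
</html>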

JavaScript & Cascading Style Sheets (CSS): Incorrect use of JavaScript & Cascading Style Sheets (CSS) to code web pages usually results in volumes of redundant code and issues with nesting and tables that weights down the spiders slowing their crawl and making them perform Olympic feats just to get through all of the invalid, non-compliant W3C HTML code. Often times the JavaScript errors inflate the size of the pages making them much too large. The opportunity for code errors and W3C HTML code compliance issues increases with the size of the site as the fundamental web page code errors are multiplied as the spiders try to crawl the inside pages of the site. Many search engine trade associations and SEO consultants recommend that individual web page sizes remain under 100k, however larger page sizes can achieve top five rankings if specific code definitions are established and the search engine spiders receive clear instructions. We highly recommend code validation and incorporating the use of W3C HTML compliant code guidelines that allows the search engine spiders to access a single reference points that eliminate redundant attribute definitions within the page code. Valid W3C HTML code that complies to the established world wide web consortium standards allows the search engine spiders to quickly and easily locate Relevant Page Content. That's why fully optimized and W3C HTML code compliant web sites that allow the spiders to quickly find relevant page content, when submitted properly, consistently enjoy page one, top five keyword rankings, long-term on the major search engines.

Dynamic Pages: Dynamically generated, database-driven web sites often face unique obstacles with the search engine spiders. Do your URLs contain query strings (e.g. URLs ending like this: ?a=1&b=2&c=3)? Peak Positions is considered one of only a handful of natural search engine optimization companies able to provide comprehensive search engine marketing solutions for large dynamic websites that are database driven. Contact our dynamic website search engine optimization (SEO) consulting specialists today: info@peakpositions.com
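
One common remedy is URL rewriting, which presents the spiders with static-looking addresses that map internally to the database query string. A hypothetical sketch for an Apache server with mod_rewrite enabled, placed in an .htaccess file (the paths and script name are placeholders):

RewriteEngine On
# /products/123.html is served internally by /catalog.php?a=products&id=123
RewriteRule ^products/([0-9]+)\.html$ /catalog.php?a=products&id=$1 [L]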

Non-Compliant Site Submissions: If your company has used non-compliant and/or automated software submission programs at any time, the URLs and websites involved might be ignored by the major search engines.

Spam: In addition to making pages easy for spiders to record, it is important to avoid techniques employed by overzealous search marketers that are considered spam by the search engines. Cloaking is one such technique. It involves serving customized pages based strictly on IP address. The search engine spider IPs are programmed into the server with instructions to feed highly optimized garbage pages exclusively to the search engine spiders in an obscene effort to enhance the page's rank in the search results. When visitors click the link to view that site, a different page is shown that would not ordinarily rank as well. This is a deceptive practice, and defined as spam by the search engines. Such bait-and-switch techniques are heavily frowned upon by the search engines and may result in being tagged, removed or blacklisted from the search engine databases. Don't risk your corporate website's ability to be found in the search engines. Promotional search engine software programs also expose corporate websites to punishment by the search engine editors because they create gibberish and interfere with the sophisticated hypertextual database retrieval systems that are programmed to produce the most content relevant search results in milliseconds.
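
To make the pattern easier to recognize, here is a deliberately simplified Python sketch of the cloaking bait-and-switch described above (the page contents are placeholders and the IP address is from the reserved documentation range); it is shown so the practice can be identified, not recommended:

SPIDER_IPS = {"192.0.2.10"}  # placeholder address standing in for a known crawler IP

def serve_page(remote_addr, user_agent):
    # Return different HTML depending on who is asking -- the deception the engines penalize.
    is_known_spider = remote_addr in SPIDER_IPS or "Googlebot" in user_agent
    if is_known_spider:
        return "<html>keyword-stuffed page shown only to crawlers</html>"
    return "<html>the entirely different page human visitors actually see</html>"
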
Many new, so-called search engine optimization companies have misled, and continue to mislead, clients into believing that no website will ever be blacklisted, pulled, tagged, or removed from a search engine's database. One of the nation's largest insurance companies was blacklisted in early 2004, as was a spyware company, because of cloaking. We urge you to speak with several companies and always to ask: does the site optimization program being considered actually work to highlight and present the relevant content contained within the company website? If the program does not focus on relevant content, or is focused on anything other than relevant content, BUYER BEWARE.

As a test, the Peak Positions technicians recently optimized a small webpage about a Connect 4 computer game. Over time it will rank on the search engines for a broad search term like "Connect 4".

Donald Bice
Senior Project Engineer
Peak Positions, LLC.

Wednesday, May 19, 2004

Yahoo Site Match Paid Inclusion Program Misleads the Marketplace

Yahoo recently released its new Site Match paid inclusion program and quickly drew widespread criticism. The misleading and deceptive message that Yahoo is sending website owners and emarketers is that, in order to enhance your keyword rankings in Yahoo's natural search results, companies need only pay a Site Match inclusion fee. The program makes one wonder whether Yahoo will clearly identify all paid listings to users.

The Site Match inclusion fee is actually akin to a permanent, open, Pay Per Click invoice. Site owners and operators participating in the Site Match program will now pay for EVERY CLICK referred by the Yahoo search results. Yahoo's marketing messages suggest that Site Match is not entirely a Pay Per Click program.

Yahoo's marketing materials also suggest that Site Match participation results in enhanced keyword rankings on target keyword search results within the Yahoo search database, when in fact, it merely ensures a presence or inclusion anywhere within the Yahoo search index. Contrary to the marketing message implicit in Yahoo's Site Match promotional materials, inclusion in the program DOES NOT guarantee any particular keyword position in the Yahoo search results or 'enhanced' keyword placement.

Do websites still need to be open to the Yahoo-Inktomi Slurp and Archiver spiders in order to secure and maintain leading keyword positions in the natural search results on Yahoo? Do the Yahoo robot crawlers or spiders still need to be able to deep-index the website? If site design and site navigation obstacles are not overcome, is long-term search engine ranking success possible in Yahoo?

Don't jump into Site Match without first fully understanding that this program is not a turn-key electronic web marketing solution. This program ONLY provides Yahoo the ability to charge participants for EVERY CLICK received from their search results pages, and completely neglects the issue of search engine positioning on any other major search engine.

Before we address the other search engines, which represent nearly 70% of the domestic search marketplace, let's stay tuned to Yahoo and its most aggressive, profit-driven search engine marketing program to date. Yahoo intends to charge website owners participating in Site Match for EVERY CLICK received once an account is active. Will Yahoo searchers receive clear labeling of all paid sponsor positions?

Another important consideration for website owners and emarketers is the costly occurrence of fraudulent click activity. The opportunity for large scale, fraudulent, click manipulation of the Yahoo search results pages has now become larger than ever.

As Google cofounder Larry Page told the New York Times recently: "Any time you accept money to influence the results, even if it is just for inclusion, it is probably a bad thing."

Proceed with caution regarding Yahoo Site Match.

Jack Roberts
Peak Positions, LLC.
Search Engine Optimization
Search Engine Optimization News

Friday, May 14, 2004

Search Engine Industry Buzzing

Yahoo! and Google both removed adware/spyware giant whenU.com from their search results pages, accusing the e-marketing firm of cloaking in order to increase its rankings in the major search engines.
This isn't the first major company to have its website removed from the search engines for manipulative practices, and it almost certainly won't be the last. The veteran search technology specialists here at Peak Positions recall similar cases in the past, where major, household-name corporations employed cloaking in an effort to drive exposure and sales on the web, and were blacklisted from Google for doing so. So, what is cloaking anyway?

Cloaking

The term "cloaking" is used to describe a website that returns altered webpages to search engines crawling the site. In other words, the webserver is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings.

This is the definition of cloaking according to Google. Cloaking is, ostensibly, a quick, down-and-dirty, and clear pathway to search engine rankings success. Taking the easy road by creating separate, static, keyword- and link-heavy pages built specifically to appeal to search engine spiders, rather than retrofitting a large, dynamically generated website in which a company may have invested anywhere from 50 to 500 thousand dollars, is quick and painless.

Quick and painless can be very attractive, and has lured numerous top-level marketing teams with the promise of early returns. Most firms that practice cloaking emphasize quick turn-arounds and immediate results for their clients, and will often guarantee top 10 keyword rankings on the major search engines within 30 days. Marketing directors, feeling the enormous pressure to position their corporate website with authority on keyword targets, might find themselves succumbing to the lure of immediate results.

We always ask two questions of potential clients that express an interest or desire to engage in the dubious practice of cloaking:

1. How can any SEO firm guarantee rankings on search engines like Yahoo! or Google, when they have absolutely no direct influence over the order in which (natural - sometimes called editorial or organic) search results appear?
2. Wouldn't you rather serve your users and online customers, as well as the search engine spiders, by simply DOING THE RIGHT THING - which is delivering relevant content in the form of HTML pages?

Rather than cloaking or participating in other underhanded tactics such as page redirects, gateway pages, automated submissions, 30 days to number 1 bogus guarantees, et cetera, marketing directors need to remain focused on serving users first.

Serving users can take several forms, but they all rest on a single principle - regularly generating and delivering high-quality, relevant text content (preferably without neglecting good aesthetics). Many of our accounts have secured and maintained premium keyword rankings on the major search engines by focusing on content and enhancing the user experience. Content, content, content - content is king.

Next week, we'll begin our profile of proven, ethical search engine marketing techniques that have yet to be challenged by any other SEO techniques...


Jack Roberts & Joel Dalley
Peak Positions, LLC.
Search Engine Optimization

Tuesday, March 30, 2004

Review of SEO Best Practices

Below is a summary of Google's Quality Guidelines and Specific SEO Recommendations.

Quality Guidelines - Specific recommendations:

  • Avoid hidden text or hidden links.

  • Don't employ cloaking or sneaky redirects.

  • Don't send automated queries to Google.

  • Don't load pages with irrelevant words.

  • Don't create multiple pages, subdomains, or domains with substantially duplicate content.

  • Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.


  • Webmasters who seek long-term search engine rankings need to stay within these guidelines in order to ensure the best possible results.


    Joel Dalley
    Senior Project Engineer
    Peak Positions, LLC.
    http://www.peakpositions.com

    Friday, March 19, 2004

    Website Content Vs. Linking Strategies

    How important is content on a website? Let us first answer this prevalent question with another, more important question. What else is there? Really: why even have a website if you are not actually delivering quality information and relevant content to site visitors?

    Websites and companies that understand their purpose and concentrate on serving their users will always be successful. Many marketing directors and webmasters read the guidelines posted on Google and mistakenly begin to believe that incoming links from third party websites are the ‘magic silver bullet’ needed to drive their content-light sites into prominent keyword ranking positions on Google and Yahoo-Inktomi.

    Don't fall into the links trap and take your focus off of relevant content. Links from third-party websites into your site need to be considered, but not until the site is serving its users with quality content.

    Is your site delivering the content that users are seeking? Remain focused on serving users first and foremost with quality content before reaching out to other unrelated sites in an attempt to increase link popularity scores.

    Keep some of Google's basic principles in mind when working on a website. Make pages for users, not for search engine spiders. Don't deceive users or present different page content to search engine spiders than to users. Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers, FFAs, or "bad neighborhoods" on the web, as your own ranking may be adversely affected by those links. Don't use computer programs to submit pages; automated submission software consumes computing resources and violates Google's terms of service. When in doubt, remain focused on serving users with relevant content.

    Jack Roberts
    Vice President, Director of Client Services
    Peak Positions, LLC.
    www.peakpositions.com

    Tuesday, March 02, 2004

    New York Times Headline about the New Yahoo Search Very Misleading

    See Story Here

    We find the headline of this New York Times article to be very misleading.

    There is no reason to believe that Yahoo would change their keyword search business model. Why would they move to a 100% free search results model and write off a significant revenue stream? The Yahoo search index (directory) has always required a website listing fee for directory inclusion. Historically, the Yahoo directory was generated and maintained by human editors. Yahoo then switched to a Google Index/Yahoo Directory blended results page model. The blended results were a temporary measure, which gave Yahoo time to complete the transition to and incorporation of the expansive Inktomi spider generated search index system.

    The new Yahoo search results page model merely represents a shift from a human generated search index to a spider generated search index. However, nothing else has changed, least of all Yahoo’s fundamental search business model. Inclusion fees are still required to be listed with Yahoo, and these fees do little to ensure premium keyword rankings in their results pages.

    Considering the facts, we conclude that the headline of this New York Times story is sensational at best and patently disingenuous at worst. Why? The headline implies that Yahoo has shifted to a pay-to-play search business model, when in fact that has always been the case. The REAL shift at Yahoo search is that the order in which the search query results appear is now completely dictated by a software robot (i.e., the Inktomi/Yahoo Slurp and Archiver spiders) instead of a team of human editors.

    Business owners should remain focused on optimizing their corporate website properties for the search engine spiders and spider-based search indices, which determine over 90% of all keyword search results.



    Joel Dalley
    Senior Project Engineer
    Peak Positions, LLC.
    www.peakpositions.com