Organic SEO Blog


Tuesday, November 13, 2007



Microsoft CEO Steve Ballmer Sees Many Growth Opportunities

reprint from Reuters

Microsoft Corp Chief Executive Steve Ballmer said on Tuesday the company sees growth opportunities in emerging countries and the shift to digital advertising.

At Microsoft's annual shareholder meeting, Ballmer said Microsoft's sales in "BRIC" countries -- Brazil, Russia, India and China -- will grow to almost $3 billion in fiscal 2008 ending in June from about $1 billion three years ago.

It is still only a small percentage of Microsoft's estimated total revenue of close to $60 billion this year, but the company is working to increase revenue in those countries with new business models and better piracy control measures.

Ballmer reiterated the company's goal to be an advertising "powerhouse," saying that the $600 billion market for global advertising is moving to digital formats.

Redmond, Washington-based Microsoft surpassed even Wall Street's most bullish forecast with strong first-quarter earnings, boosted by healthy demand for computers and the hit video game title "Halo 3".

"We're confident that we can continue this momentum throughout this coming year," said Ballmer. "We're looking forward to a phenomenal holiday season."

At the meeting, Microsoft shareholders approved 10 directors to the board including eight independent members. At the board's recommendation, the shareholders also voted down two proposals, defeating one to stop doing business with governments that censor Internet use and another to establish a board committee for human rights.

Microsoft Chairman Bill Gates attended his last shareholder meeting as a full-time employee of the company he founded with childhood friend Paul Allen. Gates plans to switch to a part-time role at Microsoft in June, although he will remain the company's chairman.

Thursday, November 08, 2007



Many Websites Lose Google PageRank Overnight

Google's Latest Update Targets Illegal Paid Links


The latest Google PageRank update has resulted in many popular blogs and news websites losing some PageRank overnight. The PageRank reduction appears to be a reaction to websites that use link-building schemes that are disallowed by Google.

Google's webmaster guidelines encourage site owners not to participate in paid link schemes designed to artificially inflate Google organic keyword rankings. Google had not strictly enforced the rule until now, but recently decided the time had come, with the holiday shopping ecommerce rush rapidly approaching.

Some popular newspaper and magazine websites experienced a Google PageRank drop, including the Washington Post and Forbes.com. Both lost two full points of Google PageRank, falling from a PageRank of 7 down to a 5, despite thousands of backlinks and daily content publishing schedules.

Not all of the websites falling in PageRank were using or buying paid links. Some popular blogs lost Google PageRank as a result of being members of a blog network (splogs) designed to serve Google AdWords and also deliver other paid links from multiple revenue-generating sources other than Google. Many of these spam blogs (splogs) also had plenty of internal cross-links between sites and seemed to be creating volumes of blogs with similar, minimal unique content designed primarily to increase backlinks and artificially inflate Google organic keyword rankings.

Google identified many of these blogs as members of an automated blog network filled with inappropriate linking strategies. Google's stance on link-building communities is quite clear, yet many publishers and marketing companies are determined to bend every rule possible and continue to implement and participate in bogus link schemes designed with one primary intent: drive higher organic rankings in Google.

Here's Google's Clear Recommendation on Linking:

"Don't participate in link schemes designed to increase your site's ranking or PageRank."

Sure, many link-building techniques exist, but if the links are part of a "bad neighborhood" network, why would anyone want to participate or associate their site with a low-quality, non-relevant link network?

The Google update appears to be the first strict enforcement by Google of its webmaster guidelines on linking. As many Google editors have confirmed, the number of linking schemes and paid link farms has exploded in the last year, making it much more difficult to deliver relevant search results.

This recent action taken by Google to enforce their webmaster guidelines is only the first step, as Google engineers report that they will be enforcing their webmaster guidelines much more diligently in the coming months. The Google search databases and indexes have been growing more rapidly in recent months than ever before, and with this growth comes the daunting task of retaining high-quality search results. In terms of Google search engine optimization practices, linking and link schemes are the first area of stricter enforcement. More controls are also approaching with regards to "On Site Manipulation of the Google Algorithms," which will include more enforcement against sites that sell link placement and offer paid link opportunities designed to manipulate the Googlebot spiders first and serve users second.

In the case of some blogs losing PageRank, Google looks to have punished blogs that are promoting inappropriate link building techniques. Look for more PageRank updates from Google in the coming weeks.


Yahoo Natural Search - Results Update

Many Websites Lose Top Natural Ranking in Yahoo

Yahoo! has released an update to its natural search algorithm. The update began in late October and has been confirmed by Yahoo engineers.

Yahoo Search released this statement: "Over the last few days, we've been rolling out some changes to our crawling, indexing, and ranking algorithms. While we expect the update will be completed soon, as you know, throughout this process you may see some ranking changes and page shuffling in the index."

The Yahoo results update appears to be somewhat small when compared to Google's search update, which targeted specific websites seen as infringing on Google's Terms of Service (TOS) webmaster guidelines. It appears that Yahoo wanted to freshen up its search results and clean up its search decks.

Yahoo is attempting to increase content relevancy within its search results. Yahoo has launched Search Assist to work more closely with keyword searchers and improve Yahoo search results. In conjunction with releasing a new algorithm update, Yahoo has also expanded Yahoo Shortcuts and brought Yahoo Images and Video further out in front to display more prominently in its keyword search results pages.

Industry sources and some webmasters have been moaning and rumbling about the recent Yahoo search algorithm update. Here's an online forum post from one webmaster: "Every morning I check my Yahoo search results through a Keyword Ranking Tool and to my astonishment I found that 90% of my sites lost top keyword rankings in Yahoo. Many rankings had declined from top 3 into lower top 10 or fell deep into the top 50 and many of my previous top 10 rankings were barely holding on in the top 100 of Yahoo results."

After the Yahoo algorithm update, many websites appear to be falling fast in the Yahoo search results, but the recent update and cleanup of Yahoo search results should help searchers, as the Yahoo natural search results finally appear much more content relevant.


Shopping For Quality Organic SEO Services?

When planning to outsource organic SEO services, make sure your search engine optimization vendor keeps all work in-house.

One recent development in our search engine optimization field is gaining in popularity and beginning to compromise the integrity of our emerging industry. Many full-service SEM firms now providing organic SEO services are outsourcing the labor-intensive organic SEO portion of SEM contracts to the lowest bidder, usually a new, inexperienced organic SEO services firm based overseas.

Many advertising agencies and large marketing firms have recently created a "Full-Service SEO/SEM Division" and quickly discovered that, unlike SEM, which can often involve quick management of free software tools resulting in large invoices, handsome margins, and profit-taking opportunities, organic SEO services are time and labor intensive and require proven skill sets, all factors that can reduce margins on full-service SEM projects.

Quality organic SEO services involve time, care, and specialized skill sets that require analysis and identification of the unique code and link structures that each domain presents. Once the analysis and identification stages are carried out, the process and procedures required to implement the code and link revisions are mapped out, and the solutions are uploaded and executed. All of these organic SEO services must transpire before actual promotion of the website begins with link building, blogging, publishing, and promotion of the domain's content.

Many time-intensive organic seo services are required to achieve keyword ranking success in the major search engines. Maintaining top organic search positions is also labor and time-intensive and lies ahead of the initial development phases.

Clients with larger websites, powered either by a content management system or by an ecommerce platform driven by dynamic database calls, typically have the greatest need for proven organic SEO services. However, to many ONE STOP SEM SHOPS bundling SEO/SEM services and focused on their bottom line, the organic SEO services portion of SEM contracts is viewed as requiring too many man-hours, resulting in higher expenses and lower returns.

In other words, organic SEO services can become too labor and time intensive, taxing too many man-hours and running the risk of chewing up SEM profit margins.

Outsourcing expensive man-hours and reducing labor costs by moving the organic seo services portion of SEM contracts off-shore could increase returns. That is why outsourcing organic SEO services is gaining in popularity, especially with search marketing firms that have only recently expanded their marketing solutions portfolio to also include organic seo services.

New, inexperienced, offshore organic SEO services firms are springing up daily with the intent of partnering with one-stop SEM marketing agencies in hopes of assuming the labor-intensive portions of organic SEO services contracts.

Check out this blind partnership request we received from a new overseas-based search engine optimization vendor offering $3.00-an-hour labor at the ready. The email is reproduced below, complete with its misspellings and grammatical errors.

Many times the same software programs these offshore firms use to create email text are used to optimize and publish website content in similar fashion. It's all some sort of new hybrid SEO language of broken English (dubbed: SPAMLISH) that works to confuse both spiders and users, resulting in URL relevancy score reductions and lower keyword rankings.

Here is a recent email message we received requesting an organic seo services outsourcing partnership:

We offer services of top-level professionals only. Delhi and Pune are well-known for being a center of programming and software outsourcing services. There are dozens of technological universities in Delhi, educating thousands of software and website deve

We have very good setup for offshore development in Delhi [India] with very less overheads that's why we are able to provide the cheapest rates.

We have everything for development center like 24 hours electricity backup, good internet connection with backu

We are already working with two USA based company as an SEO offshore development center. As per our understanding we make a SEO team with four person [one SEO + two Link builder + one content writer]. One Project manager is needed on above 3 SEO teams.

-- end of email --

I must say, to learn of the possibility of "very less overheads" was most appealing.

At Peak Positions we receive these types of blind SEO outsource offers daily, as do many of our top competitors and advertising agencies. Most proven organic SEO services firms ignore and delete these blind emails daily.


Lately, though, it has become quite clear that some of our well-known search engine optimization competitors have started outsourcing the organic SEO services portion of large SEM agreements to unproven, inexperienced SEO firms based far outside the USA.

Several times in recent weeks we have started on new projects involving top corporate B2B and B2C ecommerce sites that were previously being optimized by well-known, full-service SEM competitors. After cracking into the sites we are encountering glaring evidence of error-filled code. Much of the code appears to be the work of foreign-based, machine-driven software programs, complete with frequent misspellings and loads of grammatical errors written in SPAMLISH.

In one case, the page code of a large Boston-based healthcare provider was masking paragraphs of grammatically incorrect text filled with repetitive and persistent keyword strings in three separate languages. It's no wonder the site had been temporarily pulled from the Google organic listings and was working with Google for reinstatement.

If you are looking for a proven SEO firm to provide quality organic SEO services, make sure you receive assurances prior to engagement that the organic SEO services included will all remain in-house. Many indications are that some well-known SEM firms could actually be pushing the organic SEO services portion of large projects out the back door to minimize labor costs and maximize profit on projects.

Do all that you can to ensure that your organic SEO services provider is not shipping portions of your contract overseas to seo firms offering cheap, inexperienced, foreign labor.

As an experienced market leader in the organic SEO services field, demand for Peak Positions has never been greater. I want to reiterate that we keep all SEO projects in-house. We do not play bait and switch with anyone. If we take on an organic SEO services project, we estimate man-hours, provide a fair quote, and manually amend linking structures and page code throughout. We do not mislead pending clients only to turn our backs upon agreement, farm out the organic SEO services, crank up the pay-per-click, the cold calls, and the self-hyped marketing machine, and run for more deposit slips. This is all great if you can sustain the model of gaining more new clients than you lose every week.

Our firm offers exclusive, proven organic SEO services with customized deliverables for each and every client. That's how quality organic SEO services are done. Every site is unique and needs unique care, attention, and customer service. That is why our clients rank, succeed, and keep coming back to us. That is why we have never sent a blind email to anyone and we do not make cold calls. Be careful when outsourcing organic SEO services, and make sure your SEO vendor is actually providing the time- and labor-intensive organic SEO services portions of the SEM agreement.

Saturday, November 03, 2007



Looking For Higher Rankings in Google Organic?

Get Back To The Basics

Address Core Algorithm Principles and Drive Natural Keyword Rankings

Google Architecture Overview
Here's a high level overview of the Google spidering and algorithm ranking system and how it works ... and it involves so much more than links.

Most of Google is implemented in C or C++ for efficiency and can run on either Solaris or Linux.

In Google, the web crawling (downloading of web pages) is done by several distributed crawlers or bot agents known as Googlebot spiders.

There is a URLserver that sends lists of URLs to be fetched to the crawlers. Web pages fetched by googlebots are then sent to Google's storeserver. The storeserver then compresses and stores the web pages into a repository. Every web page has an associated ID number called a docID which is assigned whenever a new URL is parsed out of a web page.
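To make the docID idea concrete, here is a minimal C++ sketch of how a crawling component might hand out docIDs as new URLs are parsed out of pages. The class and function names are our own illustration, not Google's actual code, and a real system would persist this table rather than keep it in memory.

```cpp
// Illustrative sketch: assign each newly seen URL a stable docID.
// Names (DocIdTable, get_or_assign) are hypothetical, not Google's.
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

class DocIdTable {
public:
    // Returns the existing docID for a URL, or assigns the next free one.
    uint64_t get_or_assign(const std::string& url) {
        auto it = ids_.find(url);
        if (it != ids_.end()) return it->second;
        uint64_t id = next_id_++;
        ids_.emplace(url, id);
        return id;
    }
private:
    std::unordered_map<std::string, uint64_t> ids_;
    uint64_t next_id_ = 0;
};

int main() {
    DocIdTable table;
    std::cout << table.get_or_assign("http://example.com/") << "\n";   // 0
    std::cout << table.get_or_assign("http://example.com/a") << "\n";  // 1
    std::cout << table.get_or_assign("http://example.com/") << "\n";   // 0 again
}
```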

The indexing function is performed by the indexer and the sorter. The indexer performs a number of functions. It reads the repository, uncompresses the documents, and parses them. Each document is converted into a set of word occurrences called hits. The hits record the word, position in document, an approximation of font size, and capitalization. The indexer distributes these hits into a set of "barrels", creating a partially sorted forward index. The indexer performs another important function. It parses out all the links in every web page and stores important information about them in an anchors file. This file contains enough information to determine where each link points from and to, and the text of the link.
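As a rough illustration of that indexing step, the sketch below tokenizes a document into hits (word, position, capitalization) and files them in a forward index keyed by docID. The structures are simplified in-memory stand-ins for the barrels, font size and the anchors file are omitted, and the names are ours.

```cpp
// Hedged sketch of indexing: parse a page into word hits and store them
// in a forward index keyed by docID. Structure names are illustrative.
#include <cctype>
#include <cstdint>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Hit {
    std::string word;
    uint32_t position;   // word offset within the document
    bool capitalized;    // rough stand-in for the capitalization bit
};

// Forward index: docID -> list of hits found in that document.
using ForwardIndex = std::map<uint64_t, std::vector<Hit>>;

void index_document(uint64_t doc_id, const std::string& text, ForwardIndex& fwd) {
    std::string word;
    uint32_t pos = 0;
    auto flush = [&]() {
        if (word.empty()) return;
        bool cap = std::isupper(static_cast<unsigned char>(word[0])) != 0;
        std::string lower;
        for (char c : word)
            lower += static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        fwd[doc_id].push_back({lower, pos++, cap});
        word.clear();
    };
    for (char c : text) {
        if (std::isalnum(static_cast<unsigned char>(c))) word += c;
        else flush();
    }
    flush();
}

int main() {
    ForwardIndex fwd;
    index_document(42, "Organic SEO drives organic rankings", fwd);
    for (const Hit& h : fwd[42])
        std::cout << h.word << " @ " << h.position
                  << (h.capitalized ? " (cap)" : "") << "\n";
}
```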

The URLresolver reads the anchors file and converts relative URLs into absolute URLs and in turn into docIDs. It puts the anchor text into the forward index, associated with the docID that the anchor points to. It also generates a database of links which are pairs of docIDs. The links database is used to compute PageRanks for all the documents.
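Since that links database of docID pairs is what feeds the PageRank computation, here is a hedged, textbook-style sketch of a PageRank power iteration over such pairs. The damping factor of 0.85 and the iteration count are conventional textbook values, not figures confirmed by Google, and the link data is made up for illustration.

```cpp
// Rough PageRank sketch over a tiny links database of (from, to) docID pairs.
// Damping factor and iteration count are conventional, not Google's values.
#include <cstdint>
#include <iostream>
#include <map>
#include <utility>
#include <vector>

int main() {
    // Tiny illustrative links database: pairs of (fromDocID, toDocID).
    std::vector<std::pair<uint64_t, uint64_t>> links = {
        {0, 1}, {0, 2}, {1, 2}, {2, 0}, {3, 2}
    };
    const double d = 0.85;       // damping factor
    const int iterations = 50;

    std::map<uint64_t, double> rank, out_degree;
    for (const auto& [from, to] : links) {
        rank[from] = rank[to] = 1.0;
        out_degree[from] += 1.0;
    }
    const double n = static_cast<double>(rank.size());
    for (auto& entry : rank) entry.second = 1.0 / n;   // uniform start

    for (int it = 0; it < iterations; ++it) {
        std::map<uint64_t, double> next;
        for (const auto& entry : rank) next[entry.first] = (1.0 - d) / n;
        for (const auto& [from, to] : links)
            next[to] += d * rank[from] / out_degree[from];
        rank = std::move(next);
    }
    for (const auto& [id, r] : rank)
        std::cout << "doc " << id << ": " << r << "\n";
}
```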

The sorter takes the barrels, which are sorted by docID and resorts them by wordID to generate the inverted index. This is done in place so that little temporary space is needed for this operation. The sorter also produces a list of wordIDs and offsets into the inverted index. A program called DumpLexicon takes this list together with the lexicon produced by the indexer and generates a new lexicon to be used by the searcher. The searcher is run by a web server and uses the lexicon built by DumpLexicon together with the inverted index and the PageRanks to answer queries.
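To picture the inversion the sorter performs, the following sketch turns a small forward index (docID to word occurrences) into an inverted index (word to postings). Plain words stand in for wordIDs, and everything fits in memory, unlike the real barrels, which are resorted in place on disk.

```cpp
// Hedged sketch of inverting a forward index into word -> postings lists.
// Words stand in for wordIDs to keep the example short.
#include <cstdint>
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

struct Posting { uint64_t doc_id; uint32_t position; };

int main() {
    // Forward index: docID -> (word, position) pairs, as built by the indexer.
    std::map<uint64_t, std::vector<std::pair<std::string, uint32_t>>> forward = {
        {1, {{"organic", 0}, {"seo", 1}}},
        {2, {{"seo", 0}, {"rankings", 1}}}
    };

    // Inverted index: word -> postings ordered by docID.
    std::map<std::string, std::vector<Posting>> inverted;
    for (const auto& [doc_id, hits] : forward)
        for (const auto& [word, pos] : hits)
            inverted[word].push_back({doc_id, pos});

    for (const auto& [word, postings] : inverted) {
        std::cout << word << ":";
        for (const Posting& p : postings)
            std::cout << " (doc " << p.doc_id << ", pos " << p.position << ")";
        std::cout << "\n";
    }
}
```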

Major Data Structures
Google's data structures are optimized so that a large document collection can be crawled, indexed, and searched at little cost. Although CPUs and bulk input/output rates have improved dramatically over the years, a disk seek still requires about 10 ms to complete. Google is designed to avoid disk seeks whenever possible, and this has had a considerable influence on the design of the data structures.

BigFiles
BigFiles are virtual files spanning multiple file systems and are addressable by 64 bit integers. The allocation among multiple file systems is handled automatically. The BigFiles package also handles allocation and deallocation of file descriptors, since the operating systems do not provide enough for our needs. BigFiles also support rudimentary compression options.

Repository

Repository Data Structure
The repository contains the full HTML of every web page. Each page is compressed using zlib. The choice of compression technique is a tradeoff between speed and compression ratio. We chose zlib's speed over a significant improvement in compression offered by bzip. The compression rate of bzip was approximately 4 to 1 on the repository as compared to zlib's 3 to 1 compression. In the repository, the documents are stored one after the other and are prefixed by docID, length, and URL. The repository requires no other data structures to be used in order to access it. This helps with data consistency and makes development much easier; Google rebuilds all the other data structures from only the repository and a file which lists crawler errors.
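Here is a minimal sketch of appending one repository record along those lines: zlib-compress the page and prefix it with docID, length, and URL. The exact field layout shown is an assumption for illustration, not Google's on-disk format.

```cpp
// Illustrative repository record writer: zlib-compressed page prefixed by
// docID, lengths, and URL. Layout is hypothetical. Compile with -lz.
#include <zlib.h>
#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

bool append_record(std::ofstream& repo, uint64_t doc_id,
                   const std::string& url, const std::string& html) {
    uLongf comp_len = compressBound(html.size());
    std::vector<Bytef> comp(comp_len);
    if (compress(comp.data(), &comp_len,
                 reinterpret_cast<const Bytef*>(html.data()), html.size()) != Z_OK)
        return false;

    uint32_t url_len = static_cast<uint32_t>(url.size());
    uint32_t body_len = static_cast<uint32_t>(comp_len);
    // Prefix: docID, URL length, compressed length, then URL and body.
    repo.write(reinterpret_cast<const char*>(&doc_id), sizeof(doc_id));
    repo.write(reinterpret_cast<const char*>(&url_len), sizeof(url_len));
    repo.write(reinterpret_cast<const char*>(&body_len), sizeof(body_len));
    repo.write(url.data(), url_len);
    repo.write(reinterpret_cast<const char*>(comp.data()), body_len);
    return repo.good();
}

int main() {
    std::ofstream repo("repository.dat", std::ios::binary);
    if (append_record(repo, 42, "http://example.com/", "<html>hello world</html>"))
        std::cout << "record written\n";
}
```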

Document Index
The document index keeps information about each document. It is a fixed width ISAM (Index Sequential Access Mode) index, ordered by docID. The information stored in each entry includes the current document status, a pointer into the repository, a document checksum, and various statistics including doc scores. If the document has been crawled, it also contains a pointer into a variable width file called docinfo which contains its URL and title. Otherwise the pointer points into the URLlist which contains just the URL. This design decision was driven by the desire to have a reasonably compact data structure, and the ability to fetch a record in one disk seek during a search. Additionally, there is a file which is used to convert URLs into docIDs. It is a list of URL checksums with their corresponding docIDs and is sorted by checksum. In order to find the docID of a particular URL, the URL's checksum is computed and a binary search is performed on the checksums file to find its docID. URLs may be converted into docIDs in batch by doing a merge with this file. This is the technique the URLresolver uses to turn URLs into docIDs. This batch mode of update is crucial because otherwise Google would have to perform one seek for every link, which, assuming one disk, would take more than 7 weeks to cover the Google link dataset.
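The checksum lookup can be illustrated with a short sketch: hash the URL, then binary-search a checksum-sorted list for its docID. The FNV-1a hash used below is our stand-in; the source does not say which checksum function Google computes.

```cpp
// Sketch of URL -> docID lookup via a checksum-sorted list and binary search.
// FNV-1a is an illustrative stand-in for the real checksum.
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct Entry { uint64_t checksum; uint64_t doc_id; };

uint64_t fnv1a(const std::string& s) {
    uint64_t h = 1469598103934665603ULL;
    for (unsigned char c : s) { h ^= c; h *= 1099511628211ULL; }
    return h;
}

// Binary search over entries sorted by checksum; returns true if found.
bool lookup(const std::vector<Entry>& index, const std::string& url, uint64_t& doc_id) {
    uint64_t key = fnv1a(url);
    auto it = std::lower_bound(index.begin(), index.end(), key,
        [](const Entry& e, uint64_t k) { return e.checksum < k; });
    if (it != index.end() && it->checksum == key) { doc_id = it->doc_id; return true; }
    return false;
}

int main() {
    std::vector<Entry> index = {
        {fnv1a("http://example.com/"), 0},
        {fnv1a("http://example.com/a"), 1},
        {fnv1a("http://example.com/b"), 2}
    };
    std::sort(index.begin(), index.end(),
              [](const Entry& a, const Entry& b) { return a.checksum < b.checksum; });

    uint64_t id;
    if (lookup(index, "http://example.com/a", id))
        std::cout << "docID = " << id << "\n";
}
```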

Lexicon
The lexicon has several different forms. The current implementation keeps the lexicon in memory on a machine with 256 MB of main memory. The current lexicon contains 14 million words (though some rare words were not added to the lexicon). It is implemented in two parts -- a list of the words (concatenated together but separated by nulls) and a hash table of pointers. For various functions, the list of words has some auxiliary information.
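A compact way to picture that two-part layout is a null-separated word buffer plus a hash table of offsets into it, as in this hedged sketch. The class name, the use of offsets rather than raw pointers, and the lookup interface are our own simplifications.

```cpp
// Illustrative two-part lexicon: null-separated word buffer + hash table
// of offsets into that buffer. Names and layout are hypothetical.
#include <cstddef>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

class Lexicon {
public:
    // Appends a word to the null-separated buffer and records its offset.
    void add(const std::string& word) {
        if (offsets_.count(word)) return;
        offsets_[word] = buffer_.size();
        buffer_.insert(buffer_.end(), word.begin(), word.end());
        buffer_.push_back('\0');
    }
    // Looks the word up in the hash table and reads it back from the buffer.
    const char* find(const std::string& word) const {
        auto it = offsets_.find(word);
        return it == offsets_.end() ? nullptr : buffer_.data() + it->second;
    }
private:
    std::vector<char> buffer_;                         // words separated by nulls
    std::unordered_map<std::string, size_t> offsets_;  // word -> offset into buffer
};

int main() {
    Lexicon lex;
    lex.add("organic");
    lex.add("seo");
    if (const char* w = lex.find("seo")) std::cout << w << "\n";
}
```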

Hit Lists
A hit list corresponds to a list of occurrences of a particular word in a particular document, including position, font, and capitalization information. Hit lists account for most of the space used in both the forward and the inverted indices. Because of this, it is important to represent them as efficiently as possible. We considered several alternatives for encoding position, font, and capitalization: simple encoding (a triple of integers), a compact encoding (a hand-optimized allocation of bits), and Huffman coding. In the end we chose a hand-optimized compact encoding since it required far less space than the simple encoding and far less bit manipulation than Huffman coding. Google's compact encoding uses two bytes for every hit.

There are two types of hits: fancy hits and plain hits. Fancy hits include hits occurring in a URL, title, anchor text, or meta tag. Plain hits include everything else. A plain hit consists of a capitalization bit, font size, and 12 bits of word position in a document. Font size is represented relative to the rest of the document using three bits; only 7 values are used, because 111 is the flag that signals a fancy hit. A fancy hit consists of a capitalization bit, the font size set to 7 to indicate it is a fancy hit, 4 bits to encode the type of fancy hit, and 8 bits of position. For anchor hits, the 8 bits of position are split into 4 bits for position in anchor and 4 bits for a hash of the docID the anchor occurs in. This helps Google with limited phrase searching.

Google uses font size relative to the rest of the document because, when searching, you do not want to rank otherwise identical documents differently just because one of the documents is in a larger font.
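To show how two bytes can carry a plain hit, here is a sketch that packs a capitalization bit, three font-size bits (with 7 reserved to flag fancy hits), and twelve position bits into a 16-bit value. The exact bit order is our assumption; the description above does not specify it.

```cpp
// Illustrative packing of a plain hit into 16 bits: [cap:1][font:3][position:12].
// Bit order is an assumption; only the field widths come from the description.
#include <cstdint>
#include <iostream>

uint16_t pack_plain_hit(bool capitalized, uint8_t font_size, uint16_t position) {
    if (font_size > 6) font_size = 6;        // value 7 is reserved for fancy hits
    if (position > 4095) position = 4095;    // positions past 12 bits saturate
    return static_cast<uint16_t>((capitalized ? 1u : 0u) << 15) |
           static_cast<uint16_t>(font_size << 12) |
           position;
}

int main() {
    uint16_t hit = pack_plain_hit(true, 3, 57);
    bool cap = (hit >> 15) != 0;
    uint8_t font = (hit >> 12) & 0x7;
    uint16_t pos = hit & 0x0FFF;
    std::cout << "cap=" << cap << " font=" << static_cast<int>(font)
              << " pos=" << pos << " (2 bytes)\n";
}
```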

Looking to Secure and Maintain Premium Keyword Rankings in Google?

Get Back to the Basics.

Address The Principles of the Algorithms.

At Google and Peak Positions It's All About Code!

Discover Algorithm Synchronization > an exclusive Peak Positions Technology.