Organic SEO Blog


Thursday, February 24, 2005

Amazon Search Engine A9 Seeking Search Technology Patent

Story from: www.Internetnews.com


A9 Search Looks to Patent Its Shtick
By Susan Kuchinskas

The A9 search service tries to boost search relevance by taking into account what the searcher did before. Now, it hopes to get a lock on the method.

On Tuesday, the United States Patent & Trademark Office published the Amazon.com subsidiary's patent application No. 2005003380. "Server architecture and methods for persistently storing and serving event data," filed in July 2003, describes A9's method of personalizing search results by including an individual's past searches and other behavior. The same application is on file in the European Union.

The patent application discloses what is likely already Amazon.com's de facto search architecture, both for A9 and the e-commerce site. The online retailer launched a beta version of the A9 portal in April 2004.

Its three-column search results return algorithmic results from Google and from Amazon.com's own "search inside the book." The third column, which can be hidden, displays the user's search history -- but only if the user logs in.

In May of that year, Udi Manber, the former chief algorithms officer at Amazon.com who now leads the A9 subsidiary, promised a trade show audience that the company would continue to develop new ways to increase the relevancy of search results. Manber is listed as an inventor of the technology, along with Taylor Van Vleet, Yu-Shan Fung and Ruben Ortega.

The highly detailed patent application describes a search application and a server that records which search results the user clicked on. It also can tell which results were displayed and which were clicked during previous searches. The system could flag those URLs the searcher had previously selected.

Searchers could opt to restrict a search to items previously viewed, items not previously viewed or items viewed within a particular time period.

Similarly, the system could highlight which search results were not displayed the last time a searcher used the same query. The system could do the same with things someone clicked on from the product catalog or other Amazon.com services.

For example, search results within an online auction site could indicate which of the located auctions the user had already viewed, when each auction was viewed, and possibly whether the user has submitted a bid on each such auction. A similar system could be used to show which blog or bulletin board postings or documents had already been read.
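To make the mechanics concrete, here is a minimal sketch in Python of the kind of event-history annotation and filtering the application describes. All names are illustrative; the patent specifies a persistent server-side event store, not this in-memory toy.

from datetime import datetime, timedelta

class EventHistory:
    """In-memory stand-in for the patent's persistent event store."""

    def __init__(self):
        self.clicks = {}  # url -> timestamp of the user's most recent click

    def record_click(self, url, when=None):
        self.clicks[url] = when or datetime.utcnow()

    def annotate(self, results):
        # Flag results the searcher has previously selected.
        return [{"url": u,
                 "previously_viewed": u in self.clicks,
                 "viewed_at": self.clicks.get(u)} for u in results]

    def filter(self, results, mode="viewed", within=None):
        # Restrict to viewed / not-viewed / viewed-within-a-time-period.
        now = datetime.utcnow()
        kept = []
        for u in results:
            seen = self.clicks.get(u)
            if mode == "viewed":
                if seen is None or (within and now - seen > within):
                    continue
            elif mode == "not_viewed" and seen is not None:
                continue
            kept.append(u)
        return kept

history = EventHistory()
history.record_click("http://example.com/a")
print(history.annotate(["http://example.com/a", "http://example.com/b"]))
print(history.filter(["http://example.com/a", "http://example.com/b"],
                     mode="viewed", within=timedelta(days=7)))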
In the application, Amazon claims that the server architecture described is much more sophisticated than those used by other e-commerce sites. While others track clicks, it states, "these types of records typically lack the level of detail and structure desired for flexibly building new types of real-time personalization applications."

"They're talking about a sophisticated event architecture for personalization, which certainly shows how far ahead of the game they are already," said Guy Creese, a principal in Ballardvale Research, a firm that consults on best online practices for businesses. He noted that the patent describes not only click analysis but also mouseover events. He also said researchers at MIT found that mouse movements are a valuable adjunct to click analysis in predicting Web users' interest in content.

"An interface might allow searchers to manage and save their searches, the application reads. As mentioned above, users may also be permitted to 'delete' specific events from their respective event histories," it says. The patent application makes clear that these results would not actually be deleted from the servers, but merely eliminated from the search results.

Creese said that while most e-commerce sites don't have such complicated tracking and personalization schemes, there's a trend for the biggest, richest competitors to gather highly detailed information about site visitors. "I can see where [such a patent] could lead to some problems for them," Creese said.

Slashdotters focused on one item in the patent describing how a searcher can choose not to have results from the search history displayed in the results. Even when they select this option, however, A9 will continue to store their histories, a fact that might not be clear to users.

While Amazon.com doesn't follow the best practice of asking users to agree to its privacy policy before they register, its tracking practices are clearly disclosed within the site's privacy policy, available via a link on the main page. A9 will read Amazon.com cookies and make use of that information, it says, and it also may combine it with information from other sources.

Those who don't want to be tracked can use generic.a9.com, an alternative search service that doesn't track search histories or behavior.

"You can't fault them on failure to disclose," said Lee Tien, a staff attorney for the Electronic Frontier Foundation (EFF). "Their honesty is so refreshing that you can see you have no privacy." The EFF advocates for consumer privacy and the security of online information.

Still, Tien said, the fact that Amazon.com is a seller of books and other information goods and, yet, collects so much user information, is troubling. In his view, the optional nature of A9 search doesn't ameliorate the harm.

"Part of the problem is that people aren't all that careful," he said. "We have a gazillion consumer protection laws that exist precisely because, in the real world, people make a lot of mistakes about this stuff."

The patent application, which could be granted either soon or within a year or two, only seeks to put a lock on the system Amazon.com likely already uses. If granted, it could slow down similar personalization schemes by search rivals.

An Amazon.com spokeswoman said she couldn't comment on pending patents.

www.A9.com could become a search engine player in the months to come.

Peak Positions SEO is also enthusiastic about another new search engine, http://www.dipsie.com/. Based in Chicago, this search engine upstart is focused on depth, turning the light on the billions of URLs yet to be included in the Google, Yahoo, or MSN databases. If Dipsie can become as deep as its technology promises, it is destined to become a top-five search engine by 2010. For an in-depth review of Dipsie, visit this search engine optimization news service.
Review of AOL Local Search
Published by: www.MediaPost.com

America Online today introduced a local search service that’s integrated with its AOL Search product, which is available both to AOL subscribers and consumers, Web-wide, who use the company’s online properties.

From all appearances, AOL Local Search is a one-stop shop. The company says it aggregates information listings and content from AOL Yellow Pages, CityGuide, Moviefone, MapQuest, and other AOL properties, giving people a single place to find local restaurants, events, movies, and shopping deals. It also offers directions on how to find all of these things.

It will be interesting to see how MapQuest plays into the local search services equation, as Google now offers a block-by-block, close-up view of local destinations and businesses. By all appearances, MapQuest dovetails nicely with AOL Local Search.

“Local search is an important and growing category of search services given that 20 percent of all online look-ups are for something nearby,” comments Jim Riesenbach, senior vice president, AOL Search and Directional Media, in the company’s statement on AOL Local Search.

AOL Local Search boasts features including one-click sorting to help consumers find exactly what they’re looking for. For example, they can choose an enhanced view or scanned view and refine results by type, such as business, movie, or event, as well as distance away, and rating.

Searchers can also view maps, photos, descriptions, and event schedules directly within the search results, access recent searches for quick reference, and save multiple ZIP code locations to automatically see results customized to specific geographic areas. AOL says it has up-to-date listings and information for more than 13 million businesses and points of interest from the AOL Yellow Pages, as well as restaurants, bars, music, and events in more than 300 cities across the country from AOL CityGuide.

Moviefone offers the latest information for 31,000 movie screens nationwide. AOL Local Search automatically displays the next movie show times from the three theaters nearest to the user on the results page, complete schedules of all nearby showings for the next three days, and the ability to purchase tickets on Moviefone.com.

Built-in maps and driving directions from MapQuest appear next to search results, with customized features to see all nearby results on one map and view the closest subway stations.
AOL Local Search also offers a shopping and transaction service with features like "what's on sale," which highlights weekly specials on apparel, consumer electronics, home improvement products, and food offered via local retailers on ShopLocal.com; opportunities to make restaurant reservations online through OpenTable; the ability to purchase $25 dining certificates for only $10 from Restaurant.com; and ticket purchasing for events and concerts through AOL Tickets.

AOL, through its partnership with Ingenio, will offer local businesses the opportunity to use the Pay Per Call Advertising Platform. AOL Local Search flags sales and specials and offers printable coupons for local merchants that are hoping to geo-target predisposed consumers who are ready to buy. AOL says AOL Local Search will soon offer the ability to find local news for communities and neighborhoods in every ZIP code nationwide through Topix.net, a hub for local news.

Wednesday, February 23, 2005

Bad Things Happen to Ad Executives Who Ignore Organic Search Engine Optimization

(including jail time!)


Time to Review Those Invoices From The Advertising Agency.

Two Found Guilty for Conspiracy at Ogilvy & Mather

Ad executives Shona Seifert and Thomas Early were both found guilty by a jury in New York City yesterday for conspiring at Ogilvy & Mather to fake timesheets on a federal government account, according to AdWeek.

Seifert had since moved to become president of Chiat Day. Early stepped down after being indicted. Sentencing dates have not been set. The charges could involve prison terms of up to five years. The prosecution asked the judge to order an electronic monitoring device placed on Seifert prior to sentencing because she is a foreign national, but the judge decided it was not necessary.

This is ridiculous, much like the Martha Stewart jail sentence, especially when you consider the amount of corruption that involves any government.
The Drive for Organic Search Engine Optimization at an All-Time High

Interest in organic search engine optimization has re-emerged with vigor among management teams and advertising agencies. As companies become more knowledgeable regarding web analytics, server log files, and website marketing strategies, decision makers are realizing that site visitors and referring URLs delivered from the organic search engine results are some of the most valuable website visitors.

It also is becoming clear that users are seeking content and information as they navigate; websites that lead with information first are reaping the benefits.

A recent study, "Search Before the Purchase," from DoubleClick and comScore Networks shows that 50% of all online purchases follow several product and category keyword searches.

The DoubleClick study analyzes consumer trends and specifics as to how users interact with the major search engines to become information-enabled on pending purchases. Search engines are helping companies impact consumers and B2B purchasing managers just prior to major purchases.

An example of a great B2B website that markets, profiles, and distributes electrical components, including rocker switches, is the dynamically driven database website at www.dnagroup.com; notice how DNA leads with research and information on the hundreds of products that it distributes.

DoubleClick studied the online tendencies of 1.5 million U.S. Internet users. The researchers tracked the web usage habits of consumers who had made at least one online purchase from the 30+ websites involved in the survey.

The study reports that securing and maintaining premium keyword positions in the organic search engine results pages can significantly impact the bottom-line revenues of any company.

Companies need to address search engine placement.

Approximately one month prior to purchase, B2B decision makers begin researching a given product, topic, or solution. The keyword searches usually begin with broad-based generic terms; as the window closes, searchers begin to qualify their keyword searches by adding a descriptive third or fourth word to their query.

Most qualified, in-market keyword searchers (even on B2B keyword searches) begin to interact with the companies they find in the top 7 listings on the organic search results pages approximately two weeks before they make a buying decision.

The study further confirms that organic website optimization is the most effective form of online advertising.

Tuesday, February 22, 2005

Nielsen Rating Services Faces Fire - an Update from Mediapost
(Do broadcast media like TV and radio actually deliver "real" ratings points and/or actually make an impact on consumers?)

From www.Mediapost.com - THE P&L BEHIND NIELSEN’S R&D –

If Nielsen needed to make a strong statement to demonstrate its commitment to its clients, and to the future of audience measurement, it certainly seems to have made one in the eight-page letter sent to leading clients Friday by CEO Susan Whiting.

The letter appears to address most of the significant concerns surrounding recent criticisms of Nielsen, including access to ratings data, the cost of contracts to access that data, and, perhaps most importantly, how Nielsen derives the data itself.

On the surface, Whiting says all the right things: "we've heard our clients loud and clear when they tell us they want more from Nielsen."

How much more isn't exactly clear. With the exception of the $2.5 million budget Nielsen says it has allocated for new, independent research and development on TV audience measurement, it's hard to pin the specifics down. Even that fund is somewhat subject to interpretation.

Is it an annual budget? Is it a budget to be allocated over an unspecified duration of R&D initiatives? Does it include the salaries and overhead of Nielsen employees Nielsen intends to allocate to the initiative? It's unclear, and seems to be up to the "small group of clients" Nielsen will name to steer the initiative and "direct the spending over the course of a year." Adds Whiting, "Once we and our clients have evaluated the success of this initiative during the first year, we will determine the size of the fund on an ongoing basis."

An Inside Look at the TV Ratings Racket:

Without knowing the specifics of the R&D plan and its budget, let's use some assumptions to do some math. Let's assume that ABC, CBS, NBC, and FOX are each paying an average of $15 million a year to Nielsen for their network ratings contracts. That adds up to $60 million annually.

Now let’s assume that Nielsen continues to charge the networks additional annual surcharges of at least 4 percent based on CPI (cost of living index). In essence, Nielsen is offering to return to the industry as a whole, last years' CPI increase for the Big 4 broadcast networks combined: .04 x $60 million = $2.4 million. Okay, so $2.5 million is nothing to sneeze at, but given Nielsen’s already high profit margins – about 23 percent – it is a reasonable gesture, and some overdue payback to the industry that creates its bottom line.

It’s also a necessary reinvestment on Nielsen’s part to protect its bottom line and the future of audience measurement. And posssibly an admission of guilt that the ratings system is quite flawed and out of date and has no way of keeping up with cable and satellite fragmentation or the lack of attention span that commercials has created with millions of American viewers.

If products and companies are seeking to actually make a lasting impact on consumers they need to address organic search engine optimization by working SEO into their annual media mix.

Television and radio advertising campaigns have been losing impact and reach in the North American consumer markets for the last decade, and the trend continues. Think about it: what can you recall from the last three TV or radio commercial messages you were exposed to?

Did these commercial messages or today's newspaper influence your buying decisions?

We Doubt it.

Consider that the most accurate, recent consumer studies document that consumer spending habits will be substantially impacted by fact-finding sessions on the internet.

Content is King!

Are target consumers finding your content on Google, Yahoo, and MSN?

Friday, February 18, 2005

New York Times To Buy About.com
by Wendy Davis, Friday, Feb 18, 2005 7:00 AM EST

THE NEW YORK TIMES CO. announced yesterday that it intends to purchase About.com from magazine company Primedia Inc. for $410 million in cash. The deal could more than double the online reach of the Times Company, which currently draws 13 million monthly visitors to its 40-plus Web sites, including NYTimes.com and Boston.com. Around 22 million users a month visit About.com, a consumer information site; the acquisition will also give the Times additional advertising inventory.

About.com was founded in 1996 as The Mining Co. by Scott Kurnit, an Internet pioneer who had been a top executive at MCI/News Corporation Internet Ventures and Prodigy Service Co. Kurnit's vision was to create a Web portal consisting of countless microsites produced by individual enthusiasts covering any topic an Internet user would want to access.

Kurnit sold About.com to Primedia in 2001 for $690 million, one of the last of the big Internet acquisitions before the crash. Under Primedia's ownership, About.com was integrated with and linked to an array of sites operated by Primedia's niche magazines.

Thursday, February 17, 2005

An Update on Motor Trend Magazine

OF ALL THE ENTHUSIAST MAGS OUT THERE, few can claim the cult-like following of Motor Trend (MT). It's a publication that hearkens back to the nonfragmented media days of yore, when information-craving readers gazed out bay windows as they awaited the arrival of the mailman. Given the mag's glut of tie-ins (a Speed Channel TV show, a syndicated radio show, a line of CDs with titles like "Rockin' Down the Highway"), it wouldn't be surprising if the MT brand was more recognizable than those of many of the automobiles it test-drives.

Alas, now that anybody with working knowledge of the '72 Nova, the ability to string together two coherent sentences, and high-speed Internet access can be an "expert" on all things auto, it's inevitable that there will be further shakeout among enthusiast titles. The March issue of Motor Trend, however, proves once again that the magazine isn't ready to let any of the whippersnappers invade its turf.

It also boasts a consistent rhythm - words, stats, recommendation - and an expansive point of view. For an example of the latter, check out Angus MacKenzie's March editorial, a decidedly non-jingoistic take on the ongoing domestic/import car debate ("the reality is that the auto industry is a business, not a national monument"). Does this jibe with the opinions of his readers? Who knows, but he articulates his case with surprising elegance.

The March issue is headlined by a look at "43 hot drives worth waiting for," which gives equal consideration to a $21,000 Saturn Aura Concept and a $150,000 Lexus LF-A. It poses the questions that car aficionados are likely to ask -- "Does that particular Lexus fit in with the company's brand image?" "Will the Cadillac STS-v be too mellow?" -- and offers smartly reasoned answers.

Another winner is the improbably titled feature, "In Search of the P.C. SUV" (it's improbable because, to hear our Birkenstock-wearing friends tell it, such an entity doesn't exist). One part investigative report and one part review of three aspirants to the title, the piece underscores the difficulty of simultaneously satisfying road warriors and environmentalists.

Motor Trend also scores points by resisting the temptation to worship at the altar of celebrity. Even the mag's Page Six-ish "We Hear..." sidebars traffic in a different breed of bold-faced name, like Peter Schwarzenbauer (hint: he's not a recurring player on "Boston Legal"). The mag's idea of a feud? Apparently the Ford Fusion is "taking on Accord. Again." Hilary Duff versus Lindsay Lohan it ain't.

To be sure, Motor Trend has its share of drawbacks. Despite the astonishing photography, the magazine is surprisingly flat from a design perspective; it could use more in the way of quick-hit info boxes, such as the listing in the March issue of test spots preferred by Aston Martin engineers. It also goes without saying that if, like me, your idea of hot-rodding involves a gently worn 2002 Passat, much of MT's content will go straight over your head.

On the other hand, after 55-plus years, why screw with a formula that works for millions of readers? Motor Trend excels at, uh, discerning motor trends, and is savvy enough not to stray from that singular strength. Half the publishing world could learn from its example.
Nielsen Study: High-Income Households Spending More Time And Money Online
by Gavin O'Malley, Thursday, Feb 17, 2005

INTERNET USERS WITH HOUSEHOLD INCOMES over $150,000 grew by nearly 20 percent between January 2004 and this January, Nielsen//NetRatings reported yesterday. Reaching 10.3 million last month, this group not only spends the most time online--76 hours per month--compared to other income segments, but consumes more Web pages--2,126 pages--than any other group.

Where are they spending their time online? Search engine, travel, financial, B2B, and entertainment sites, according to Nielsen//NetRatings. The top Web sites capturing the largest percentage of men in this income group were Fidelity Investments, Sabre Travel Network, CBS MarketWatch, United Airlines, and American Airlines.

The top Web sites drawing the highest percentage of high-income women were AOL Travel, Moviefone, AOL Living, Expedia, and AOL Entertainment.

"The rise in the number of high-income Web surfers, combined with their propensity to spend the most amount of time surfing and consuming Web pages as compared to everyone else, represents a solid opportunity for marketers," said Heather Dougherty, senior retail analyst at Nielsen//NetRatings, in a statement.

"Advertisers would do well to turn to the sites they surf to most efficiently reach this high-income group." Nielsen//NetRatings found that men and women living in high-income households shared similarities in their preferences for travel sites and their quest to retrieve information from the internet.

In terms of differences, men visited more financial sites, while women were drawn more to entertainment and shopping sites.

But other experts expressed doubts over the accuracy of Nielsen//NetRatings' findings. "Nielsen's numbers seem extremely aggressive," Vikram Sehgal, analyst at JupiterResearch, said. "Income levels have traditionally correlated with online access, but the divide is increasingly modest."

comScore Networks spokesman Graham Mudd considered Nielsen's numbers equally incongruous. "The high-income group was the first to enter the market, so it doesn't stand to reason that they should continue to lead by such a large margin," he said, adding, "their year-over-year growth is more likely to be in the high single digits."

Regardless, if companies are seeking to make marketing and brand impacts on high-income consumers, they need organic website optimization that keeps their websites competitive on the major search engines like Google, Yahoo, MSN, and AOL. Email marketing campaigns, banners, and content advertisements have not made a positive impact with educated, high-income consumers, who are keen to marketing gimmicks and aggressive branding tactics.

Wednesday, February 16, 2005

Does Google Enable Trademark Infringement?

A bunch of lawsuits charging search engines, including Google, with trademark infringement could upend online advertising, transforming the current system in which competitors jockey to have paid ads appear alongside the organic/natural search results.

The car insurer Geico, Playboy Enterprises, and American Blind and Wallpaper, a home furnishings company in Plymouth, Michigan, have filed formal lawsuits against Google, while other companies like WhenU, HP, and Lowe's are interested parties.

In its suit, American Blind and Wallpaper charges that by directing searches for the phrase American Blinds to ads for its rivals, Google abets infringement. At press time, a Google search of the firm's name produced natural results, including American Blind's homepage, along with ten sponsored links, six of which belong to competitors.

Google beat a similar lawsuit brought by Geico Insurance, but Netscape settled out of court with Playboy Enterprises over banner ads triggered by the trademarked words Playboy and Playmate. Still, American Blind and Wallpaper faces an uphill battle. Many in the legal community view the American Blind company name as just too generic.

Google AdWords needs to address Pay Per Click upgrades to ensure trademark protection policies and also to reduce or eliminate Pay Per Click fraud. We expect precedent-setting court rulings in the coming months that could alter the paid search advertising systems in place at Google and Yahoo/Overture.

Tuesday, February 15, 2005

Click Fraud and How to Prevent Fraudulent Clicks

Pay-per-click (PPC) search engine advertising continues to gain popularity in the online marketing world as an effective and inexpensive way to drive targeted visitors to web sites.

From 2002 to 2003 the pay per click paid search advertising market grew nearly 200 percent, mainly from the popularity of the Google AdWords and Overture PPC programs.

But PPC also has a growing negative: click fraud.

What is Click Fraud?
Click fraud is when anyone clicks your sponsored ad simply to ensure that you will be charged for the click. As Pay Per Click campaigns have grown in popularity and keyword prices and bidding have become more competitive, fraudulent pay per click activity is skyrocketing.

Companies that are active with Google AdWords and Overture are becoming frustrated with growing invoices that contain thousands of suspect clicks. Simultaneously, PPC rates by keyword continue to escalate.

For information on how to minimize and/or prevent Pay Per Click fraud, visit:
http://www.peakpositions.com/news/index.html#ppc
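As a first-pass illustration, most detection starts with log analysis: flag repeated clicks on the same ad from the same IP address inside a short window. The Python sketch below uses an invented log format; real fraud screening weighs many more signals (cookies, user agents, conversion behavior, distributed botnets).

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical click log entries: (timestamp, ip, ad_id). Real PPC logs differ.
clicks = [
    (datetime(2005, 2, 15, 9, 0, 0), "10.0.0.5", "kw-widgets"),
    (datetime(2005, 2, 15, 9, 0, 40), "10.0.0.5", "kw-widgets"),
    (datetime(2005, 2, 15, 9, 1, 10), "10.0.0.5", "kw-widgets"),
    (datetime(2005, 2, 15, 9, 5, 0), "10.0.0.9", "kw-widgets"),
]

def flag_suspects(log, window=timedelta(minutes=5), threshold=3):
    # Flag (ip, ad) pairs with >= threshold clicks inside any window.
    by_source = defaultdict(list)
    for ts, ip, ad in log:
        by_source[(ip, ad)].append(ts)
    suspects = []
    for key, times in by_source.items():
        times.sort()
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                suspects.append(key)
                break
    return suspects

print(flag_suspects(clicks))  # [('10.0.0.5', 'kw-widgets')]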
The Super Bowl ad tallies are still coming in. Today AOL reports that Pepsi's spot featuring supermodel Cindy Crawford racked up the most views on AOL.com and the AOL service. The spot was viewed 18.6 million times, more than double the 9.2 million views from the 2004 Super Bowl ad crop.

AOL made the 2005 Super Bowl spots available through Saturday February 12 on the Web at AOL.com and via the AOL service. AOL also reports that viewers accessed spots from its Classic Commercials package, which features commercials from previous Super Bowls, and watched them 3.8 million times.

AOL reports that the top 12 most viewed spots from Super Bowl XXXIX, which include views on both AOL and the AOL.com service where non-members can access the ads, are:
1. Diet Pepsi - Cindy Crawford -- 899,773 views
2. GoDaddy.com - Hearing -- 894,983
3. Bud Light - Skydiving -- 803,999
4. Bud Light - Cedric -- 648,430
5. Ford Mustang - Winter -- 605,585
6. Ameriquest - Store Trip -- 600,599
7. Diet Pepsi - P. Diddy -- 596,986
8. Bud Light - Sharing -- 573,280
9. FedEx - Ten Things -- 553,023
10. Ciba Vision - Bubbles -- 526,338
11. Bubblicious - Lebron -- 472,534
12. Visa - Super Heroes -- 459,337

AOL's top five most viewed "classic" commercials were:
1. Coke - Mean Joe Greene
2. Apple - 1984
3. Pepsi - Two Boys
4. Doritos - Laundromat
5. Honda

Consumers voted for their favorite classic commercial on AOL.com and the AOL services.
The top three were:
1. Coke - Mean Joe Greene: 29 percent
2. Apple - 1984: 5 percent
3. Pepsi - Inner Tube: 5 percent

Finally, AOL reported that Anheuser-Busch's "Tribute to U.S. Troops" won the Super Sunday AOL Ad Poll. Visitors cast approximately 525,018 votes on AOL.com and the AOL service, as well as through mobile devices.

The "Tribute" ad garnered 15 percent of the vote, versus 11 percent for the second place finisher, Bud Light's "Skydiving" spot. Ameriquest's "Romantic Dinner" ad came in third with 8 percent of the vote, and its other ad, "Store Trip" and Diet Pepsi's "P. Diddy" ad each had 7 percent to round out the top 5 positions.
February 15, 2005
New Marketing Study Finds Radio Considered Worst Media for Accountability and Return on Investment

In a study sponsored by Arbitron and the Radio Advertising Bureau, media buyers, planners, and marketers rated radio the worst among TV, newspapers, magazines, and even the Internet in terms of accountability, schedule integrity, and audience measurement. National radio spending has been losing market share in the past few years, while local radio grew three percent in 2004. Local radio now makes up four out of five radio dollars. Most companies report that organic search engine optimization, SEO, and natural website optimization are delivering the highest return on investment.

A 2005 SEO case study profiles a 4000% return on investment in less than six months.


Advertising Agency Sues Deadbeat Client for Non-Payment

In a move rare among overly client-sensitive ad agencies, AKQA is very publicly suing a former client for refusing to pay its bills, according to ClickZ. A now-defunct division of infoUSA allegedly stiffed its agency, which later became part of AKQA, on an $817,388 bill for creating a multi-million dollar cross-media campaign for Video Yellow Pages USA.com. That division was found by a court to be liable for the fees, but infoUSA is appealing, defending itself by saying that it should not be held accountable for its e-commerce division's Internet marketing bills.

Monday, February 14, 2005

Ask Jeeves Breaks TV Campaign, Heats Up Search Marketing War

ASK JEEVES THIS WEEK WILL launch a new TV ad campaign, following a similar consumer branding campaign by search engine rival MSN, in what is heating up to be a search marketing war that utilizes traditional media buys.

The effort marks Ask Jeeves' return to television advertising after a four-year absence. "We really see this as a first step in a marketing program," said Greg Ott, vice president of marketing. The campaign's goal, he said, is "helping people re-engage with the Ask Jeeves brand."


Currently, Ask Jeeves is the sixth most popular search engine, behind Google, Yahoo!, MSN, America Online, and meta-search portal Information.com. Last week, the company announced the acquisition of blog search engine Bloglines.

The campaign, a series of six 15-second spots designed by TBWAChiatDay, San Francisco, will run on network television shows including "American Idol," "Arrested Development," and "The O.C."--which, coincidentally, recently aired an episode in which a character mentioned competing search engine A9.


This Recap was reprinted from a Search Engine Optimization News Article Release by Media Daily News from MediaPost.

Friday, February 11, 2005

Google Announces Pay Per Click Rate Increases
Business Week Article

Keywords For Ad Buyers: Pay Up. It's costing more and more to link Net ads to popular search terms.

Even Google's biggest boosters expressed surprise at the results. On Feb. 1 the search giant reported $400 million in net earnings for fiscal '04 -- a whopping 278% gain -- on revenue of $3.2 billion. What accounted for the outsize profits? The high prices Google charges for search keywords, for one. Industrywide, they were up an average of 43.7% last year, according to search marketing firm iProspect.com Inc. And the most sought-after words have become far more dear: "background check" rose 258% in a year. On the day of the earnings announcement, Google CEO Eric E. Schmidt told analysts: "There does not seem to be price resistance" from advertisers.

Still, with keyword prices soaring, businesses from eBay Inc. (EBAY) to EspressoZone.com are rethinking how they use this powerful advertising medium. "It's incumbent upon us...to figure out how to moderate these quite significant increases in media costs," eBay Chief Executive Meg Whitman declared in January following disappointing fourth-quarter results, when rising search marketing costs helped pinch profit margins. To ensure they get the best bang for their search-advertising buck, eBay and others are doing everything from using different search keywords to redesigning their Web sites.

The goal: getting folks who click on their search ads to actually buy something. "Search advertising takes a significant amount of testing, work, and knowledge," says George Collins, CEO of Espresso Zone, a seller of coffee items in Ashtabula, Ohio. "Otherwise you're just wasting money." The approach to purchasing keywords is changing fast. Companies bid to place their ads alongside keywords relevant to their products. A travel agent, say, might bid on "Caribbean cruise" and "airline tickets."

But many advertisers have found such generic keywords to be pricey and less likely to cover the ad cost. Instead, they are bidding on a larger array of more specific keywords: a telecom company might choose "wireless plan" rather than "cell phone" in an effort to lure the most interested buyers. In the past year the average advertiser's roster of keywords grew by 50%, according to Efficient Frontier Inc., a Mountain View (Calif.) company that manages $100 million in keyword purchases for its clients.

At the same time, online sellers are looking for ways to capture customers after the search ad has directed them to their Web sites. There's plenty of room for improvement. Commerce sites typically convert 2.4% of visitors into customers, according to a study by retailer association Shop.org. The proliferation of broadband should help lift that number. According to e-tailers, broadband users are as much as twice as likely to become paying customers as those using dial-up modems, which slow shopping.
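To see why conversion rates govern what a keyword is worth, consider the standard break-even arithmetic. In the Python sketch below, only the 2.4% conversion rate comes from the Shop.org figure above; the order value and margin are assumptions:

conversion_rate = 0.024       # Shop.org's average, cited above
average_order_value = 80.00   # assumed
gross_margin = 0.30           # assumed

# Expected gross profit per search-ad click is the most an advertiser
# can pay per click and still break even.
break_even_cpc = conversion_rate * average_order_value * gross_margin
print(f"Break-even cost per click: ${break_even_cpc:.2f}")  # $0.58

# Doubling conversion (say, with a keyword-specific landing page)
# doubles the affordable bid.
print(f"At 4.8% conversion: ${break_even_cpc * 2:.2f}")     # $1.15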

SNAPPIER PRESENTATION: E-tailers are researching how visitors navigate their sites and are working to better convert them into long-term customers. One way: customizing the e-tailer's "landing page" -- where the searcher is first directed -- to a specific keyword. That makes it easier for a customer to get the info she wants and more likely that she will buy.

One company especially adept at converting search ads into customers is Shopping.com. Thanks to its improved presentation of information on prices, availability, and the like on a wide range of products, the comparison-shopping site says it has raised website conversion rates on those products by more than a third. That means each search visitor became one-third more valuable, making it easier for Shopping.com to afford the rising cost of keywords.

And that, of course, is great news for Google.

Thursday, February 10, 2005

Recent Interview with Google Director of Search Quality - Peter Norvig
Peter Norvig confirms that content is king and metadata cannot be trusted

Advancements are being made to target cloaking, sneaky redirects, and sites with misleading meta tags. The closing paragraphs of this interview are a must-read for any party involved in the marketing of a website.

Semantic Web Ontologies: What Works and What Doesn't
Google's director of search quality discusses challenges of automation, knowledge, spam, and more.

Peter Norvig: (Mr. Norvig is director of search quality at Google.)
The four leading individual challenges at Google organic search:

First is a chicken-and-egg problem: how do we build this information? Because what's the point of building the tools unless you've got the information, and what's the point of putting the information in there unless you have tools? A friend of mine just asked if I could send him all the URLs on the web that have dot-RDF, dot-OWL, and a couple other extensions on them; he couldn't find them all. I looked, and it turns out there's only around 200,000 of them. That's about 0.005% of the web. We've got a ways to go.
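As an aside, Norvig's 0.005% figure squares with the roughly 4.3 billion pages he cites later in the talk; the check is a couple of lines of Python:

semantic_docs = 200_000
web_pages = 4_300_000_000  # the 4.3 billion figure later in the talk
print(f"{semantic_docs / web_pages:.4%} of the web")  # ~0.0047%, i.e. about 0.005%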

The next problem is competing ontologies. Everybody's got a different way to look at it. You have some tools to address it. We'll see how far that will scale. Then the Cyc problem, which is a problem of background knowledge, and the spam problem. That's something I have to face every day. As you get out of the lab and into the real world, there are people who have a monetary advantage to try to defeat you.

So, the chicken-and-egg problem. That's "What interesting information is in these kind of semantic technologies, and where is the other information?" It turns out most of the interesting information is still in text. What we concentrate on is how do you get it out of text. Here's an example of a little demo called IO Knot. You can type a natural language question, and it pulls out documents from text and pulls out semantic entities. And you see, it's not quite perfect—couldn't quite resolve the spelling problem. But this is all automated, so there's no work in putting this information into the right place.

In general, it seems like semantic technology is good for defining schemas, but then what goes into the schemas? There's a lot of work to get it there. Here's another example. This is a Google News page from last night, and what we've done here is apply clustering technology to put the news stories together in categories, so you see the top story there about Blair, and there're 658 related stories that we've clustered together. Now imagine what it would be like if instead of using our algorithms we relied on the news suppliers to put in all the right metadata and label their stories the way they wanted to. "Is my story a story that's going to be buried on page 20, or is it a top story? I'll put my metadata in. Are the people I'm talking about terrorists or freedom fighters? What's the definition of patriot? What's the definition of marriage?" Just defining these kinds of ontologies, when you're talking about these kinds of political questions rather than about part numbers, becomes a political statement.

People get killed over less than this. These are places where ontologies are not going to work. There's going to be arguments over them. And you've got to fall back on some other kinds of approaches. The best place where ontologies will work is when you have an oligarchy of consumers who can force the providers to play the game. Something like the auto parts industry, where the auto manufacturers can get together and say, "Everybody who wants to sell to us do this." They can do that because there's only a couple of them. In other industries, if there's one major player, then they don't want to play the game because they don't want everybody else to catch up. And if there's too many minor players, then it's hard for them to get together.

Semantic technologies are good for essentially breaking up information into chunks. But essentially you get down to the part that's in between the angle brackets. And one of our founders, Sergey Brin, was quoted as saying, "Putting angle brackets around things is not a technology by itself." The problem is what goes into the angle brackets. You can say, "Well, my database has a person name field, and your database has a first name field and a last name field, and we'll have a concatenation between them to match them up." But it doesn't always work that smoothly.

Here's an example of a couple days' worth of queries at Google for which we've spelling-corrected all to one canonical form. It's one of our more popular queries, and there were something like 4,000 different spelling variations over the course of a week. Somebody's got to do that kind of canonicalization. So the problem of understanding content hasn't gone away; it's just been forced down to smaller pieces between angle brackets. So there's a problem of spelling correction; there's a problem of transliteration from another alphabet such as Arabic into a Roman alphabet; there's a problem of abbreviations, HP versus Hewlett Packard versus Hewlett-Packard, and so on. And there's a problem with identical names: Michael Jordan the basketball player, the CEO, and the Berkeley professor.
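As an editorial aside, the canonicalization step Norvig describes can be illustrated with a toy Python sketch using simple edit-distance matching against a known-good list. Production spelling correction at search scale is statistical and trained on query logs; this only shows the mapping of variants to one canonical form, and the example phrases are our own:

import difflib

# Known canonical forms; a real system learns these from query logs.
CANONICAL = ["britney spears", "hewlett-packard", "michael jordan"]

def canonicalize(query):
    # Map a variant to the closest canonical form, if any is close enough.
    match = difflib.get_close_matches(query.lower(), CANONICAL, n=1, cutoff=0.7)
    return match[0] if match else query

for variant in ["brittany spears", "britny spears", "hewlet packard"]:
    print(f"{variant!r} -> {canonicalize(variant)!r}")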

And now we get to this problem of background knowledge. The Cyc project went about trying to define all the knowledge that was in a dictionary, a Dublin Core type of thing, and then found what we need was the stuff that wasn't in the dictionary or encyclopedia. Lenat and Guha said there's this vast storehouse of general knowledge that you rarely talk about, common-sense things like "water flows downhill" and "living things get diseases." I thought we could launch a big project to try to do this kind of thing. Then I decided to simplify a little -- just put quote marks around it and type it in. So I typed "water flows downhill" and I got 1,200 hits. [That first hit] says, "lesson plan by Emily, kindergarten teacher." It actually explains why water flows downhill, and it's the kind of thing that you don't find in an encyclopedia. The conclusion here is Lenat was 99.999993% right, because only 1,200 out of those 4.3 billion cases actually talked about water flowing downhill.

But that's enough, and you can go on from there. You can use the web to do voting: you say "this pump goes uphill," and that only happens 275 times, so the downhill wins, 1,200 to 275. Essentially what we're doing here is using the power of masses of untrained people who you aren't paying to do all your work for you, as opposed to trying to get trained people to use a well-defined formalism and write text in that formalism; let's just use the stuff that's already out there. I'm all for this idea of harvesting this "unskilled labor" and trying to put it to use using statistical techniques over masses of large data and filtering through that yourself, rather than trying to closely define it on your own.
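The voting idea reduces to comparing hit counts for competing assertions. A sketch in Python, with the counts reported in the talk hard-coded (in practice each count would come from querying the index):

# Hit counts as reported in the talk; in practice each count would come
# from querying the index for the quoted assertion.
hits = {
    "water flows downhill": 1200,
    "this pump goes uphill": 275,
}

def vote(counts):
    # The web "votes": the assertion with more hits wins.
    return max(counts, key=counts.get)

winner = vote(hits)
print(f"{winner!r} wins, {max(hits.values())} to {min(hits.values())}")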

The last issue is the spam issue. When you're in the lab and you're defining your ontology, everything looks nice and neat. But then you unleash it on the world, and you find out how devious some people are. This is an example; it looks like two pages here. This is actually one page. On the left is the page as Googlebot sees it, and on the right is a page as any other user agent sees it. This website—when it sees Googlebot.com, it serves up the page that it thinks will most convince us to match against it, and then when a regular user comes, it shows the page that it wants to show (CLOAKING).

What this indicates is, one, we've got a lot of work to do to deal with this kind of thing (CLOAKING), but also you CAN'T TRUST THE METADATA.

You can't trust what people are going to say.

In general, search engines have turned away from metadata, and they try to hone in more on what's exactly perceivable to the user.

For the most part we throw away the meta tags, unless there's a good reason to believe them, because they tend to be more deceptive than they are helpful.

And the more there's a marketplace in which people can make money off of this deception, the more it's going to happen.

Humans are very good at detecting this kind of spam, and machines aren't necessarily that good.

So if more of the information flows between machines, this is something you're going to have to look out for more and more.

This text is excerpted from SDForum's Semantic Technologies Seminar, cohosted by www.AlwaysOn-network.com

--

Comments from Jack Roberts, Peak Positions, LLC

Peter Norvig is absolutely correct that metadata cannot be trusted.

It seems that refining page-text semantic matching systems and extending algorithm crawling capabilities to focus strictly on page text has to become a reality.

Page Text or Text Content is all that can be fully trusted.

Filtering and sorting search results solely on page text, and employing robot agents using anonymous IPs to detect cloaked URLs, would ensure significant improvements in search quality and results page relevance.
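A minimal sketch of that crawler-side check in Python: fetch the same URL with a crawler user agent and a browser user agent, then compare the responses. The URL is a placeholder, and a real system would also crawl from anonymous IPs, since cloakers key on known crawler IP ranges, not just the User-Agent header.

import urllib.request

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(url):
    # Compare what a "crawler" and a "browser" receive from the same URL.
    as_bot = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    as_user = fetch(url, "Mozilla/5.0")
    # Crude size-based test; a real system would compare extracted text.
    longest = max(len(as_bot), len(as_user), 1)
    return abs(len(as_bot) - len(as_user)) / longest > 0.5

print(looks_cloaked("http://www.example.com/"))  # placeholder URL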

One can only hope that post-IPO Google will remain focused on the quality of its organic keyword search results. All external indications are that quality improvements are on the horizon at Google.

Jack Roberts
Vice President, Director of Client Services
Peak Positions, LLC
http://www.peakpositions.com


*Join Peak Positions, Kansas City Star and several senior level search engineers from Google at our upcoming private Search Engine Optimization Seminar in Kansas City, Missouri (USA) - Spring 2005.

The Google Browser -- Coming Soon

Google is probably developing its own web browser. Google continues to deny it, but the rumor mill is very active. So, is Google going to develop its own Google browser? It sure looks that way.

The browser as a search tool
Well, it certainly makes sense. The popularity of toolbars and web browser search fields has amply demonstrated that many searchers use these forms when searching the web.

It is, after all, much simpler to fill in a form or a search field that is already on your screen than to go to the home page of the relevant search engine and do your search from there.

Moreover, Google is becoming more and more like a portal with a large number of search services.

The Google browser would present all the various search services in an easily accessible way, including web, directory, news, and shopping search. All the toolbar features would be included, as would Google's desktop search for searching your own computer.

Such a browser would in essence be the ultimate portal to the web, combining web page display with some of the best search tools.

Text ads bring in money
Even if Google gives away the browser for free (which they are most likely to do), the fact that the browser would direct the searchers to Google's search results pages would give Google ample opportunity to present pay-per-click text ads, bringing in much wanted revenue.
When Gmail finally gets out of beta, Google could also include its online email solution, letting people search their email accounts directly from the browser.

Microsoft and the Explorer
However, there is one "con". The browser market is totally dominated by Microsoft, as the Internet Explorer browser is included in the Windows operating system.

Others would argue that this is exactly why Google should present its own browser, as the Explorer obviously will be used as a gateway to the new MSN search engine, Google's competitor.

The fact that Microsoft has neglected to improve its browser, leading to an increasing number of downloads of alternative browsers like Firefox and Opera, also proves that it is possible to get at least some searchers to start using the new browser.

Furthermore, the Explorer is haunted by security flaws and spyware attacks. The Google brand does not bring up such associations, at least not yet.
Think about it: A download link at the Google home page must be the ultimate advertising space for such a browser.

Gbrowser.com
The arrival of the GBrowser is very likely indeed.
Google has registered the domain gbrowser.com. Admittedly, that could be to stop others from doing the same, or just to keep the options open. Nevertheless, the domain name certainly proves that they are thinking about it.

The Motley Fool reported in September that Google had held a Mozilla Development Day on its campus, "where programmers spent the day improving the renamed Netscape browser."
There is more. Recently Ben Goodger and Darin Fisher, two of the men behind the Firefox browser, became employees of Google.

Mozilla Firefox
The Firefox browser is the most popular incarnation of the Mozilla browser, a technology grown out of the once-popular Netscape browser.

Mozilla is a so-called open source project, meaning that Google does not have to pay a lot of money to use this technology as the basis of their own browser.

In Mozilla's own words:
"We offer a full suite of integrated Internet applications including a web browser, e-mail client, address book, web page composer, Internet chat software and calendar application. Our web browser offers innovative and powerful features such as tabbed browsing (which lets you view multiple pages in one window), pop-up blocking and advanced privacy and security controls. "
The browser is already there, and it is better than the Internet Explorer. All Google has to do is to "googlify" it, make the necessary adjustments and add a Google logo.

What this would mean in practice is that Google would conquer a significant portion of the desktop from Microsoft. Given that people spend more and more time on the web, that would be a very important victory indeed.

Moreover, remember that we are not only talking about web search here. The combination of a Gbrowser, Gmail, and Google desktop search means that people could use the browser to search the web, their email, and their computer.

In other words: Google would be able to present one unified open interface to the PC as well as the internet. This does not make a new operating system, for sure, but it is getting close.
Google Financial Update

Yesterday, during an analyst confab, Google CEO Eric Schmidt said the company plans to up its investment in advertising services. Google says it will commit nearly 70 percent of its investment toward improving its core Web search business and to its advertising business.
Nearly 20 percent of Google's resources will be earmarked to develop new and improve existing search products, while 10 percent of the company's vast resources will be devoted to emerging services.

Schmidt told analysts that Google's ad services were popular with medium-sized companies but that the search giant needed to focus more on Web search ad services for large and small advertisers. The search giant has expanded its direct sales staff and is moving to improve existing methods of tracking ad performance.

Interestingly, Schmidt told analysts that Google will eventually require users to input personal information to use selected services. Currently, Google requires no password or personal information to use its services. Schmidt didn't say when this would occur.

Another little tidbit: Some of Google's services remain in test or the ever popular "beta," for up to five years.

GO DADDY -- GOING DOWN?

High Fines for Broadcast Performers
A media critique by Wayne Friedman,
Thursday, February 10, 2005

BRAVE ARE THOSE MINORITY OFFICIALS in government these days who seemingly go against a cascade of morality players when it comes to TV and radio.

Take note, network and advertising executives. And take your hats off in respect for two Democrats -- Reps. Henry Waxman (D-Calif.) and Janice Schakowsky (D-Ill.). They are the only two of 48 who voted against a House Commerce Committee bill that would dramatically raise the fines against networks and performers on indecent broadcasts (definition: Howard Stern).

For individual performers, the fines will climb from $11,000 per incident to $500,000. This is much stiffer than the Federal Communications Commission's rules, where performers get one warning.

"I don't like censorship," Waxman said to The Hollywood Reporter. "I don't like the impact this is having because of self-censorship. I don't believe that the government should be in there to stand as a censor." (Hey Harvey - they are public airwaves correct ? paid for by tax dollars right ? let's let them say or show or do anything they want, this will help America's youth right ?)

Similar brave sentiments came from Schakowsky: "I am more concerned about infringing the First Amendment than I am about my children or my children's children seeing Janet Jackson's nipples," she said. (Great! Is this an endorsement of nudity on prime-time television? Hopefully Ms. Schakowsky is not stumping on this ticket!)

In this tough political and advertising environment -- where liberal tones are not only viewed as out of step but increasingly anti-American -- those remarks are getting less airplay in an increasingly noisy world of rampant over-protectionism. In the usual effort for balanced reporting, it's always good to see the press give such remarks some ink.

For networks and advertisers -- even cable networks that don't abide by FCC rules (like NBC, CBS, ABC, or Fox do?) -- this will only mean more work for their businesses. Standards and practices executives must be working overtime, perhaps adding staff, to ensure that everything works according to some sort of hyper-clean, way-over moral standard. It's an extra business cost hurting any bottom line.

Last year it was Janet Jackson during the Super Bowl. This year there was no such controversy, but the unprecedented move of Fox and the NFL pulling a racy GoDaddy.com commercial during the broadcast of the Super Bowl is a hint of things to come.

The powers that be may be thinking of programming content in their indecency efforts, but can advertising content be far behind? Already there are efforts to curb food advertising as it targets children.

And that's not all. Lawmakers have been grumbling about cable, satellite TV, and satellite radio, in their hopes of bringing them into the government's indecency domain.

Voices of dissent - on whatever low volume - are a good way to mete out fairness.

Wayne Friedman is a veteran media and advertising writer based in Los Angeles

---

We thought these frequently asked questions regarding SEO consulting from Jill Whalen were a fairly accurate representation of organic website optimization issues:

SEO Frequently Asked Questions (FAQ)
Author: Jill Whalen
Missing On Google

Q. My site is showing up for my major keyword phrases in Yahoo and MSN but I'm nowhere to be found in Google. Why does Google hate me? (Or alternatively, my web site was doing well on Google but its rankings have suddenly plummeted. Am I penalized?)

Jill: If you're going to be in the SEO biz, or even if you're just trying to get your own personal business site more exposure in the search engines, you need to realize that rankings (and the traffic they may bring) are not static. You may get comfortable seeing your site rank highly for your most coveted keyword phrases, but don't ever assume it will remain there forever.

Sites do not get penalized or banned unless something has been done which deceives the search engines. Deception generally comes in the form of hiding stuff.

If you're not playing games with the search engines, then you don't ever have to worry about penalties. If your site is suddenly gone, it's most likely because of a major algorithm shift (or the site is new and Google's "new site sandbox" has come into play, your site has Flash on the homepage, or you have mirror websites with the same, redundant content and metas per page).

The search engines are constantly tweaking their algorithms, and new sites are always being created, so ranking fluctuations are part of the normal course of business.

Because of this, it is crucial to optimize your site for lots and lots of related keyword phrases (as long as you realize that every page has a distinct keyword set, and expecting to rank on hundreds of keywords with one URL is a VERY BAD IDEA!).

This will ensure that when some phrases go AWOL, the others will perform well for you (MAYBE). I can't stress enough how important this strategy is to your SEO campaign, as well as your peace of mind. Never be married to any one or two specific keyword phrases. Yes, it's cool to rank highly for the most coveted ones, but if they're that important to you, then you should purchase Pay Per Click sponsored ads that are triggered by them.

The best advice I can give you is to change your mindset from "rankings" to "targeted traffic and conversions." I know I sound like a broken record with this, and it may even seem like a convenient excuse; however, if you don't want to make yourself crazy, it's best to ignore rankings, and instead work hard at making your site better and better. (AMEN - publish new pages, go deeper with content that matters, serve users from every perspective on your products and services, and become an authority on the topic or theme your website is based on. Also follow up on leads, drive conversions, and forget about becoming an obsessed Google PageRank toolbar junkie who seeks links from meaningless FFA link farms totally unrelated to your site's content, always seeking the easy way out via the next manipulative, short-term "30 days to the top - guaranteed for $39.95 or your money back" "Traffic Blazer" garbage sales tactic.)

While your various keyword phrases are on an emotional roller coaster at Google, you won't even notice a blip in your traffic or sales if you've got all your bases covered. It may be cliché, but it really does work and it does pay off in the long run.

Where Do I Place Keywords?

Q. I heard from the dogcatcher that I need to place my North Carolina mortgage company keyword phrases in: [bold] [italics] [H1s] [alt tags] [Meta tags] [anchor text] [Title tags] [body text] [the first few words on my site] [the first paragraph of my North Carolina Marriage Counseling website] [the last paragraph of my site] [my cousin Vinnie's site]. Is this true?
Jill: The most important places to utilize your researched keyword phrases (anywhere from 3-5 of these per page) are 1) your Title tags, 2) the visible copy that people read, and 3) onsite and offsite links (aka the "anchor text").

Whether they're in the first paragraph, first words, last words, or whatever really doesn't make all that much difference. I long ago stopped worrying about specific places and coding and simply use them where they make sense from a reader's perspective. I would definitely avoid using them in "ALL the right places" as listed in the question above, however. This is because if you pull all the tricks out of your SEO bag, your page will simply reek of SEO. If it makes sense to have a headline that uses a keyword phrase, then go for it, but don't feel that you have to create headlines where none were needed. If it makes sense to describe a graphic with a North Carolina Home Loan company keyword phrase, then you shouldn't hesitate to do it. The important thing is not to do anything just because you think you have to in order for the search engines to like you.

There are very, very few "have to's" when it comes to SEO.

This is because SEO is an art, not a science.
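To make those three placements concrete, here is a minimal sketch of a page that works its researched phrase into the Title tag, the visible copy, and the anchor text. The phrase "organic search engine optimization" and the URL are ours, purely for illustration:

<html>
<head>
<!-- 1) The Title tag carries the researched keyword phrase -->
<title>Organic Search Engine Optimization Services | Example Company</title>
</head>
<body>
<h1>Organic Search Engine Optimization</h1>
<!-- 2) The phrase appears naturally in the visible copy people actually read -->
<p>Our organic search engine optimization consultants help corporate
websites earn lasting keyword rankings in Google, Yahoo, and MSN.</p>
<!-- 3) Onsite (and offsite) links use the phrase as anchor text -->
<p><a href="/organic-seo/">Learn more about organic search engine optimization</a></p>
</body>
</html>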

Is Content Really King?

Q. If you're so smart and keep telling everyone that content is king, then how come the top pages for the keyword [insert any word here] don't have lots of visible content?

Jill: I do believe that content is king (ABSOLUTELY - CONTENT IS KING!), because that's what the people who visit your site are looking for. However, content means different things for different sites.

Content can simply be your product offerings. For instance, sites from well-known brands very often have no visible copy on their home pages, but still rank highly for some very general keywords. This is often due to their strong brand, which brings in tons and tons of natural links to the site.

Natural links are those that people add to their own websites just because they found them relevant to whatever point they were trying to make. Bloggers do this often, and so do people on forum threads.

Let's face it: well-known brands are talked about a lot, both in real life and online. It makes perfect sense that if you're searching for something general like "pizza," Pizza Hut and Domino's will show up at the top of the results. It's not necessary for them to say all over their site that they are all about pizza because that's a given.

For those of you who don't have a big brand, you can certainly create a home page that doesn't describe what you offer in clear words, but you'd better be prepared to put all your faith in a link-building campaign.

Always remember that there are tons of ways to obtain targeted search engine traffic, and the methods I espouse are just one way (another is to use similar efforts like those employed by this preclinical drug development company). They happen to work for my clients and me, and I like them because they focus on making the site better overall.

However, every site is unique and you have to decide what the best overall strategy is for YOUR site. There's no sense in compromising your message for search engine rankings, as you'll be less apt to convert your visitors into taking your desired action. On the other hand, if you have no visitors, it really doesn't matter what your message is, now does it?

The article above was written by Jill Whalen, with comments from Carol Jacobsen of Peak Positions.


Wednesday, February 09, 2005

Google added to its arsenal of search services yesterday with a map service http://maps.google.com/ that's designed to help consumers get driving directions and search for a variety of businesses.

The map service, released in test, uses data from digital mapping providers TeleAtlas and Navteq. Yahoo! and Microsoft already offer map-based directions.

Google Maps lets users search by keyword, address, city name, or ZIP code. A request for driving directions asks for a user's beginning and end points, similar to the MapQuest service.

Users can get detailed views by zooming in on any given route or location. Google Maps http://maps.google.com/ also lets users click and drag a mouse across a map to view nearby areas.

Google recently rolled out a video search service that enables users to look for video content. Google's separate image search index, meanwhile, now spans 1.1 billion searchable images, according to the company.

Just for kicks, here's a look from above at Peak Positions, the leading Michigan SEO Company.

Google Maps - Don't Leave Home Without It!


Publisher Survey Points to Optimism for 2005
9 Feb 2005

Advertising.com's second annual survey of publishers found optimistic revenue projections for 2005, predicting growth especially in text links, large rectangles, and small banners, as well as in rich media, streaming, behavioral targeting, and organic search engine optimization.

Publishers expect traditional advertisers to continue to shift dollars online, particularly in rich media, streaming content, contextual targeting and large rectangles.

More than two thirds of publishers said they support rich media now, helping that category account for the brunt of increased revenue expectations.

Half say they currently support contextual targeting, and a quarter claim to support behavioral targeting. Demand for organic website optimization and SEO consulting services is expected to be at an all-time high as senior-level corporate marketing teams and major advertising agencies begin to partner with organic SEO specialists who can drive keyword results in the editorial search results pages of Google, Yahoo, and MSN.

More Information on 2005 Advertising Trend Projections is available at: Advertising.com

Saturday, February 05, 2005

A member of our technical team was recently asked to document the leading reasons why websites are not found on the search engines. I know many of you will find this piece very helpful...enjoy


Search Engine Placement
Moving out of the invisible 'dark web'
Leading Reasons Why Websites Are Not Found On The Search Engines

The databases that power the results on the major search engines are simply machines and data collection storage bins programmed to respond to millions of user keyword query requests on a daily basis. The server farms responsible for displaying the organized series of hypertext links and text are under tremendous demand and stress: the spiders must not only visit sites and collect information, they must then store and 'bank' the URLs being collected. As we begin to review some common search engine placement roadblocks and barriers that the spiders must overcome on their content acquisition journey, keep in mind that these hypertext spiders are programmed to review (index) billions of web pages (URLs) in all types of code and program variations.

Site Design Problems:
Besides the website design problems that lead to web pages missing from the search engine results, there are many other design elements commonly used by Web developers that can prevent the search engine spiders from indexing the pages of a site, resulting in poor rankings. Here are some common web site design elements that keep web sites invisible, floating around aimlessly in the Dark Web rather than being found consistently in the search engines on keyword searches that matter.

Technical Barriers That Prevent Search Engine Placement:
Many types of technical barriers continue to prevent the search engine spiders from fully absorbing the page content contained within billions of web sites. Marketing managers, IT Directors, Webmasters, and Internet Publishers must consider a few conditions whenever publishing on the Internet, so that the search engine spiders can properly record their pages/URLs and websites, 'opening up' their domains to the algorithmic robot crawlers and thus the public.

In order to be found early, often, and on a consistent basis in the major search engines, website publishers and Internet marketing professionals must keep a few of these top issues in mind:

Some spiders struggle out of the gate as they land on a website for the first time and fail to record the site due to a poorly composed robots exclusion file. The major search engine spiders will ignore even the most popular sites if the code composition and contents of the robots.txt file are incorrect.

Our technicians offer proprietary search engine placement solutions that help streamline spider indexing through the construction and placement of a comprehensive robots exclusion file, organizing the table of contents so desperately sought by the major search engine spiders. Feel free to contact our technical team if you have questions, concerns, or are uncertain of the required protocols for robots exclusion files.
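For reference, here is a minimal sketch of a correctly composed robots exclusion file; the disallowed directories are hypothetical examples, and your own file must reflect your own site structure:

# robots.txt - must live at the root of the domain,
# e.g. http://www.example.com/robots.txt
# Allow all compliant spiders to crawl the site, but keep them
# out of the script and private directories.
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

A file with an empty Disallow line (User-agent: * followed by Disallow: with no path) tells the spiders that the entire site is open to crawling.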

W3C HTML Code Compliance and Validation:
Invalid HTML code is one of the leading causes of search engine positioning problems.
Code validation and code compliance allow search engine spiders to move comfortably through URLs and also prevent 'spider traps' and 'denial of service'.
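A page cannot validate without a document type declaration, so a reasonable starting point - sketched here with the HTML 4.01 Transitional DOCTYPE as one example - is to declare a DOCTYPE and then run each page through the W3C's free validator at http://validator.w3.org/:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
 "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title>Page Title Here</title>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
</head>
<body>
<p>Valid, well-formed markup gives the spiders a clean path through the page.</p>
</body>
</html>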

Broken Links:
Broken links and server downtime also prevent sites from being found on the search engines, especially if a lead spider crawling (or attempting to crawl) the site is interrupted or lands on broken links. Broken links, server downtime, and server maintenance often interfere with search engine spiders as they attempt to crawl and index websites.

Content Management Systems:
Content Management Systems that refresh or deliver updated content on a regular basis often create tremendous confusion for the search engines, resulting in URL trust factors that restrict websites from attaining exposure and reaching the qualified, in-market users searching for their services on Google, Yahoo, MSN, and AOL. The page contents are dynamically updated and changing regularly, which is a 'red flag' in itself, and the code involved is often invalid, leaving 'spider traps' all over the pages of the site. If your company is using a content management system and your site is not being found on the search engines, contact our skilled technical team and we can design an affordable optimization solution that will deliver your content in clean, valid HTML format to users worldwide.

Again, search engine spiders are seeking relevant content; let's help the world find your valuable content by shining light onto your web site and removing it from the Dark Web.

Frames Website Design is a leading Search Engine Placement Barrier:
Frames website design is oftentimes a major website optimization problem. While the search engine spiders can crawl pages from a frames-based design, they cannot accurately parse page text and index page content correctly. Frames web sites usually achieve little if any consistent keyword rankings in the major search engines.

JavaScript & Cascading Style Sheets (CSS):
Incorrect use of JavaScript and Cascading Style Sheets (CSS) to code web pages usually results in volumes of redundant code and issues with nesting and tables that weigh down the spiders, slowing their crawl and making them perform Olympic feats just to get through all of the invalid, non-compliant W3C HTML code. Oftentimes JavaScript errors inflate the size of the pages, making them much too large. The opportunity for code errors and W3C HTML code compliance issues increases with the size of the site, as fundamental web page code errors are multiplied when the spiders try to crawl the inside pages of the site.

Many search engine trade associations agree: W3C HTML code optimization that complies with the established World Wide Web Consortium standards allows the search engine spiders to quickly and easily locate relevant page content.

That's why fully optimized, W3C HTML code compliant web sites that allow the spiders to quickly find relevant page content, when submitted properly, consistently enjoy long-term page one, top five keyword rankings on the major search engines.

Dynamic Pages Present Unique Search Engine Placement Issues:
Dynamically generated web sites that are database driven often face unique search engine placement obstacles with the search engine spiders. Do your URLs contain these types of query strings? (e.g. URLs ending like this: ?a=1&b=2&c=3) Peak Positions is considered one of only a handful of natural search engine optimization companies that is able to provide comprehensive search engine marketing solutions for large dynamic websites that are database driven.
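One common remedy - a sketch only, assuming an Apache web server with mod_rewrite enabled, and with a hypothetical URL pattern and parameter name - is to publish static-looking URLs and rewrite them internally to the dynamic query string:

# .htaccess sketch: present a spider-friendly static URL such as
#   /products/123.html
# while the server internally fetches the dynamic page
#   /catalog.php?a=123
RewriteEngine On
RewriteRule ^products/([0-9]+)\.html$ /catalog.php?a=$1 [L]

The spiders see and store the short static URL; the database-driven script still does the work behind the scenes.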

Non-Compliant Site Submissions Hurt Search Engine Placement:
If your company has used non-compliant and/or automated software submission programs at any time, the URLs and websites involved might continue to be ignored by the major search engine spiders.

Spam Impedes Search Engine Placement:
In addition to making pages easy for spiders to record, it is important to avoid techniques employed by overzealous search marketers that are considered spam by the search engines.

Cloaking is one such technique. It involves serving customized pages based strictly on IP address.

The search engine spider IPs are programmed into the server with instructions to feed highly optimized garbage pages exclusively to the search engine spiders in an obscene effort to enhance the page's rank in the search results. When visitors click the link to view that site, a different page is shown that would not ordinarily rank as well. This is a very deceptive and risky practice, and is labeled as "spam" by the search engines.
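Purely so the pattern is recognizable - this is a hypothetical sketch of the deceptive technique described above, not something we endorse or deploy - cloaking code boils down to branching on the visitor's IP address:

<?php
// cloak.php - illustration of the bait-and-switch pattern.
// A known spider address receives keyword-stuffed copy;
// everyone else receives the real page. This is exactly
// the behavior that gets sites blacklisted.
$spider_ips = array('64.68.82.1'); // hypothetical spider IP
if (in_array($_SERVER['REMOTE_ADDR'], $spider_ips)) {
    include 'optimized-for-spiders.html';
} else {
    include 'real-page.html';
}
?>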

Such bait-and-switch techniques are heavily frowned upon by the search engines and may result in being "tagged", "removed" or "blacklisted" from major search engine databases.

Don't risk your corporate website's ability to be found in the search engines.

Promotional search engine software programs also expose corporate websites to punishment by the search engine editors because they create gibberish and interfere with the sophisticated hypertextual database retrieval systems that are programmed to produce the most content relevant search results in milliseconds.

Many new self-proclaimed search engine optimization companies have misled, and continue to mislead, clients into the notion that no website can ever be "blacklisted", "pulled", "tagged" or "removed" from a search engine's database.

One of the nation's largest insurance companies, as well as a spyware company, was "blacklisted" in early 2004 because of cloaking. We urge you to speak with several site optimization firms and ask of each: 'Does the program being considered actually work to highlight and present the relevant content contained within the company website?'

If the website optimization program being considered does not focus on relevant content or is focused on anything other than relevant content, BUYER BEWARE.



Promoting sites on established trademarks or copyright-protected brand names is risky business. We were recently contacted by a company who wanted to know if we could help them increase their search engine exposure on the keyword phrase "Planet Beach Franchise Opportunity"; they were interested in being found above the actual parent brand company, Planet Beach. These kinds of expectations are somewhat unrealistic and require gaming or manipulating the search engines in ways that could infringe on an established trademark brand name. It seemed like risky business and ethically questionable. We urged this company to find another Internet marketing vendor, or possibly to purchase an automated submission software program like Submit Wolf or Web Position Gold that would allow them to submit as often as they like using nearly any keyword combination they desired.
Recently the topic of ColdFusion versus PHP was raised in an SEO blog.
Here are some topline recommendations for database-driven websites.

1) If at all possible, avoid using ColdFusion or .cfm URLs.

ColdFusion is just not worth the long-term headaches it causes. As a recent post on Google states, CFM URLs tend to swing wildly whenever the database updates.

2) It is always very important to concentrate on the HTML. Regardless of the structure, it all comes down to the quality of the HTML.

Both ColdFusion and PHP are programs that generate HTML output. The web browser makes a request to a .cfm or .php page; the server then processes the page and sends the resulting HTML back to the client.

The key to every search engine optimization strategy is to make sure that the HTML generated meets good SEO requirements. There are many website optimization services that will review your site and give you suggestions for improving your search engine rankings, for a price. However, it comes down to following a few basic HTML principles (a PHP sketch of a page that meets them follows below):
* Descriptive text on each page
* Proper meta tags
* Relevant content on the pages
* Clear markup and valid HTML with CSS styling
* Well-formed HTML: clean tables and nesting
* Search engine friendly URLs: consider parsing or masking dynamic URLs

By using this sort of technology, you allow search engines to index dynamic content, because the dynamic web page content is published under static URLs.
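As a rough illustration of those principles - a sketch only, with a hypothetical page name and copy - a .php page that generates clean, spider-friendly HTML might look like this:

<?php
// product.php - a hypothetical database-driven page that still
// emits descriptive, valid, spider-friendly HTML.
$title       = 'Widget Model 100 | Example Company';
$description = 'Specifications and ordering information for the Widget Model 100.';
?>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
 "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title><?php echo $title; ?></title>
<meta name="description" content="<?php echo $description; ?>">
</head>
<body>
<h1>Widget Model 100</h1>
<p>Relevant, readable page copy goes here; the browser and the
spiders both receive nothing but plain HTML.</p>
</body>
</html>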

In most cases PHP holds many advantages over ColdFusion in terms of search engine friendliness. ColdFusion combined with poor directory structures creates a "death march" in the search engines - kind of like buying a Raleigh mortgage loan from an insurance quote service. CFM web sites with even a hint of database structure problems, nesting errors, table issues, etc. are prone to fall every 60 days or so in Google, Yahoo, and MSN.

ColdFusion (CFM) URLs can be quite cumbersome and difficult for the hypertext spiders to convert and store. These pages are 'heavy,' and the heavier URLs tend to sink and fall often, like Raleigh real estate values.

Our search engine placement company helps many leading companies rename URLs and move away from ColdFusion and .cfm. It has been proven that, to do well in the search engines, it always pays to parse and minimize URL strings in the main sections of the site.

Steve Costello
SEO Technical Coordinator
Peak Positions, LLC
http://www.peakpositions.com


Thursday, February 03, 2005

Search Marketing Association Forming in North America
Almost every professional industry has an association that sets best-practice standards and represents the best interests of its profession. Organizations such as the American Medical Association, the Associated Locksmiths of America, and various law societies work to establish legitimacy in both professional practice and public perception.

About two years ago, a number of search engine marketers, including www.peakpositions.com, decided to establish an industry association representing search engine marketers, leading to the creation of the Search Engine Marketing Professional Organization, or SEMPO. When it was first formed, SEMPO (www.sempo.org) was welcomed with great interest by a search engine marketing community desperate to establish credibility as the mainstream world began to understand the unparalleled power of search marketing.

Since then, SEMPO has seen a great deal of controversy and is seen by most in the SEM industry as something of a dead duck. After the defections of some of the most respected names in the SEM industry, such as Christine Churchill and Mike Grehan, new SEM associations have formed in the UK and EU, and another is starting to take shape here in North America.

With the apparent failure of SEMPO, a new initiative has been formed by Calgary-based SEO Ian McAnerin, who recently wrote, "One of the problems that I saw with SEMPO is that although they state that they have the goal of representing the industry as a whole, in reality it turned out to be only representing the big names". McAnerin is a graduate of the University of Alberta law school and continues to be a member of SEMPO, though he has resigned all official positions he held in order to avoid conflicts of interest.

During a recent trip to the UK, McAnerin decided to establish the Search Marketing Association of North America, or SMA-NA. Currently formed as a "working group", SMA-NA hopes to officially launch itself at the New York Search Engine Strategies Conference in early March. SEMPO has made at least one strong contribution to the SEM sector with the December release of its detailed survey (PDF format) outlining trends in search marketing. The results of this survey are a must-read for anyone interested in SEM. In every other sense, however, SEMPO has not captured the allegiance of the majority of SEM firms or individual practitioners.

Much of the blame for this failure focuses on two important issues: communication and money. In its early days, SEMPO failed to communicate well with its members and others in the SEM industry. There are varying levels of membership in SEMPO, but promotional services seem to be reserved for the search engine marketing firms that pay the most for membership. As the vast majority of SEM practitioners are either self-employed or are small 2-to-20-person businesses, the cost of membership is too high for most of the industry. Some notable names in the SEM sector, including www.peakpositions.com, have quit the organization, while many other current members are simply allowing their memberships to lapse.

Compounding these problems is the continuing controversy over the extremely high fees originally charged to the organization by its first (and now outgoing) president, Barbara Coll. McAnerin's initiative seems to have put the fear of competition into SEMPO, with chairperson Coll allegedly demanding to know the names of members of the SMA-NA working group. (To make it easier for her, and to disclose what might be perceived as a bias on my part, McAnerin has invited me to sit on the working group.) Coll defends SEMPO, stating that they are looking at the "big picture", and cites the December release of the SEMPO marketing survey as evidence.

In her defense of SEMPO, Coll couldn't resist taking a pot-shot at the newly formed SMAs, stating in a ClickZ article, "The vision of SEMPO is to be involved in the industry, not in the members necessarily. The research we put out in December showed that we're thinking long-term. I don't think the regional SMA groups are going to focus on the industry, they seem to be about making sure the members are getting benefits". McAnerin envisions two unique arms of the SMA-NA: a non-profit arm to act as a lobbying and event-promoting organization, combined with a for-profit arm offering benefits to members such as discounts, special offers, and members-only RFPs.

A similar model is used by one of the world's largest industry associations, the American Marketing Association (www.marketingpower.com).

Wednesday, February 02, 2005

Google released its fourth-quarter 2004 results and also discussed new products and plans in the works for the coming months. Here is an update from Google on its fiscal status, with a couple of key points.

GOOGLE ANNOUNCES RECORD REVENUES FOR FOURTH QUARTER AND FISCAL YEAR 2004

MOUNTAIN VIEW, Calif. - February 1, 2005 - Google Inc. (Nasdaq: GOOG) today announced financial results for the quarter ended December 31, 2004.
Google reported record revenues of $1.032 billion for the quarter ended December 31, 2004, up 101% year over year. Google reports its revenues, consistent with GAAP, on a gross basis without deducting traffic acquisition costs ("TAC"), or the portion of revenues shared with our partners.

Income from operations, on a GAAP basis, was $303 million, or 29.4% of revenues, for the quarter ended December 31, 2004, compared to $86 million, or 16.9% of revenues, for the prior year's quarter. Income from operations includes a $60 million non-cash, stock-based compensation charge, compared to an $85 million non-cash, stock-based compensation charge in the prior year's quarter.

Net income on a GAAP basis for the quarter ended December 31, 2004 was computed from the following income statement line items: the above-mentioned revenues of $1.032 billion, traffic acquisition costs of $378 million, other costs and expenses before stock-based compensation of $291 million, stock-based compensation of $60 million, other income of $7 million, and a provision for income taxes of $106 million.

Some Wall Street analysts use non-GAAP measures to analyze our operating results. For instance, they may subtract traffic acquisition costs of $378 million from revenues of $1.032 billion to arrive at a net revenues amount. Also, certain analysts may arrive at net income before stock-based compensation by subtracting traffic acquisition costs of $378 million, other costs and expenses before stock-based compensation of $291 million (less other income of $7 million) and a provision for income taxes of $106 million from revenues of $1.032 billion.

Net income on a GAAP basis in the fourth quarter of 2004 was $204 million, or $0.71 per share on a diluted basis, based on 285.9 million weighted average shares outstanding.

Net cash provided by operating activities for the twelve months ended December 31, 2004 totaled $977 million as compared to $395 million last year, an increase of 147%. Adjusted EBITDA, another measure of liquidity, increased by $236 million, or 125%, to $425 million (or 41% of revenues) in the fourth quarter of 2004 from $189 million (or 37% of revenues) in the same quarter of 2003 and from $321 million (or 40% of revenues) in the third quarter of 2004.

"Google had an exceptional quarter. Revenues and profits increased significantly, our execution was solid across the company and most importantly, our relationship with our users, partners and advertisers became even stronger," said Eric Schmidt, Google chief executive officer. "All of this happened while we continued to innovate, expand around the world and make strategic, long-term investments."
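To make the release's net income computation easier to follow, here is the arithmetic worked through from the figures above (all amounts in millions):

  Revenues                                        $1,032
  less traffic acquisition costs                  -  378
  less other costs and expenses before
       stock-based compensation                   -  291
  less stock-based compensation                   -   60
  plus other income                               +    7
  less provision for income taxes                 -  106
  -----------------------------------------------------
  GAAP net income                                 $  204

Dividing $204 million by the 285.9 million diluted weighted average shares gives the reported $0.71 per share.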
Q4 Financial Highlights
Revenues - Revenues in the quarter totaled a record $1.032 billion, representing a 28% increase over the third quarter of 2004 and a 101% year-over-year increase. This revenues growth reflects strong traffic and monetization growth in the quarter as well as advertisers’ growing recognition of the Internet as an effective advertising medium.
Google-Sites Revenues - Google-owned sites generated $530 million or 51% of total revenues. This represents an increase of 118% over the fourth quarter of 2003.
The Google Network - Revenues generated on Google's partner sites, through AdSense programs, contributed $490 million, or 48% of total revenues, a 92% increase over the Network revenues generated in the same quarter last year.

TAC - Traffic Acquisition Costs, the portion of revenues shared with Google's partners, increased to $378 million, or 77% of Google Network revenues. This compares to total payments to partners of $216 million in the fourth quarter of 2003, or 85% of Google Network revenues.

Income from operations - Income from operations in the quarter, on a GAAP basis, totaled $303 million, or 29.4% of revenues, and included a non-cash charge of $60 million for stock-based compensation. This compares to operating income of $86 million, or 16.9% of revenues, in the same period of 2003, when the stock-based compensation charge was $85 million. This improvement in operating margins was primarily due to decreases in stock-based compensation expense and traffic acquisition cost as a percentage of revenues.

Income Taxes - Google recorded a provision for income taxes of $106 million in the fourth quarter, an effective tax rate of 34%, as compared to a $62 million provision for income taxes and a 70% effective tax rate in the fourth quarter of 2003. The provision for income taxes in the fourth quarter was reduced by $66 million related to certain items: $42 million due to ISO disqualifying dispositions and $24 million related to certain stock-based compensation charges recognized prior to the IPO. The company expects its effective tax rate before these items to generally trend lower over the long term to a rate below 35%. However, the company may experience volatility in its quarterly effective tax rates as a result of certain stock option activity.

Net Income - Net income on a GAAP basis increased to $204 million, or 19.8% of revenues, in the fourth quarter of 2004 as compared to $27 million, or 5.3% of revenues, in the fourth quarter of 2003. Earnings on a diluted per share basis were $0.71 in the fourth quarter of 2004 as compared to $0.10 in the fourth quarter of 2003.

Fiscal Year 2004 Financial Highlights

Revenues - Google reported revenues for calendar year 2004 of $3.189 billion, a 118% increase over the $1.466 billion reported for 2003. Google saw growth throughout the year both in our domestic business and internationally, both on Google owned sites and on the Google Network. Specifically, revenues from Google owned sites increased 101% on a year over year basis, from $792 million to $1.6 billion. The company attributes the rapid growth to increased visits to its sites and to its accumulated knowledge of how to more effectively and efficiently monetize that traffic. Revenues from the Google Network grew 147% in the year, from $629 million in 2003 to $1.6 billion in 2004. This growth reflects growth in the number of partners in the network in both the AdSense for Search and the AdSense for Content programs, as well as improved monetization of those programs.

Income from Operations - On a GAAP basis (including the settlement charge), our operating income in 2004 rose 87% on a year-over-year basis to $640 million, representing a full year operating margin of 20.1%. Excluding the effect of a third-quarter 2004 non-cash, non-recurring charge associated with a legal settlement, our operating income for the full year 2004 increased 146% to $841 million, up from $342 million the prior year.
Again, excluding the aforementioned charge, our operating margin for the full year rose to 26.4% from 23.4% the previous year. Operating income as reported in both 2004 and 2003 was reduced by significant non-cash stock-based compensation charges of $279 million and $229 million, respectively.

Net Income - Net income for the year 2004 rose to $399 million from $106 million in 2003, an increase of 278%. Adjusting for the impact of the third quarter 2004 settlement charge and for certain tax benefits, net income totaled $406 million in 2004. On a GAAP basis, earnings per share increased to $1.46 this year from $0.41 in 2003.

Cash Flow - Net cash provided by operating activities increased 147% to $977 million for the twelve months ended December 31, 2004 from $395 million in 2003. Free cash flow is an alternative measure of liquidity to GAAP net cash provided by operating activities and is calculated as operating cash flows less capital expenditures. Capital expenditures were $319 million in the twelve months ended December 31, 2004 as compared to $177 million in the twelve months ended 2003. Free cash flow for the twelve months ended December 31, 2004 totaled $658 million as compared to $219 million for the same period in 2003, an increase of 200%.

Adjusted EBITDA - Adjusted EBITDA is defined as EBITDA before the non-cash stock-based compensation charge, the non-cash and non-recurring settlement charge and in-process R&D. It is another alternative measure of liquidity to GAAP net cash provided by operating activities. Adjusted EBITDA increased to approximately $1.280 billion in 2004, or 40% of revenues, from $638 million, or 44% of revenues, in 2003. For the fourth quarter of 2004, adjusted EBITDA increased to $425 million from $321 million in the third quarter. The reconciliation of adjusted EBITDA to the GAAP measure of liquidity, net cash provided by operating activities, is set forth at the back of this release.

Cash - As of December 31, 2004, Google had a cash, cash equivalents and short-term investments balance of $2.132 billion.

On a worldwide basis, Google employed 3,021 full time employees as of December 31, 2004, up from 2,668 as of September 30, 2004.

A live audio webcast of Google's fourth-quarter earnings release call will be available at http://investor.google.com/news.html. The call begins today at 1:30 p.m. (PST) / 4:30 p.m. (EST). This press release, the financial tables, as well as other supplemental information, including the reconciliations of certain non-GAAP measures to their nearest comparable GAAP measures, are also available at that site. A replay of the call will be available beginning at 7:30 p.m. EST through midnight Tuesday, February 15, by calling (888) 203-1112 in the United States or (719) 457-0820 for calls from outside the United States. The required confirmation code for the replay is 0676870.

Forward-looking statements
This press release contains forward-looking statements that involve risks and uncertainties, including statements relating to Google's market opportunity and future business prospects. Actual results may differ materially from the results predicted, and reported results should not be considered as an indication of future performance. Factors that could cause actual results to differ from the results predicted are included in Google's quarterly reports on Form 10-Q and from time to time in other reports filed by Google with the Securities and Exchange Commission.

About non-GAAP financial measures

To supplement Google's consolidated financial statements presented in accordance with GAAP, Google uses non-GAAP measures of certain financial performance and liquidity. These non-GAAP measures are comprised of income from operations, net income and net income per share before material non-recurring and other items, as well as free cash flows and adjusted EBITDA. Google's management believes that income from operations before material non-recurring and other items and net income before material non-recurring and other items provide meaningful supplemental information regarding the company's core operating results, because they exclude amounts that are not necessarily related to those core results and are of a substantially non-recurring nature. Google's management believes that free cash flows and adjusted EBITDA provide meaningful supplemental information regarding liquidity.

Google believes that both management and investors benefit from referring to these non-GAAP financial measures in assessing the performance of Google’s ongoing operations and liquidity and when planning and forecasting future periods. These non-GAAP financial measures also facilitate management’s internal comparisons to Google’s historical operating results and liquidity. Google computes its non-GAAP financial measures using the same consistent method from quarter to quarter and year to year. The accompanying tables have more details on the GAAP financial measures that are most directly comparable to non-GAAP financial measures and the related reconciliations between these financial measures.
Media Contacts: Google Inc.
Steve Langdon, 650-623-4950, slangdon@google.com
David Krane, 650-623-4096, david@google.com