SEO Vs. Personalization

When the link structure of the Internet is no longer a primary signal to a web search engine, what happens to Search Engine Optimization (SEO)? What other signals might replace or augment current techniques?

  • Social Graph
  • Implicit web
  • Tagging
  • Semantic Web
  • Status Updates 
  • Comments

What will drive the next generation of web search? Let's review a few of the leading candidates.

Social Graph

This is already an important signal, though few have mastered it: Microsoft purchased a stake in Facebook, Google has OpenSocial, and News Corp is pushing hard with MySpace.

Implicit Web

This is already an important set of signals, but they are not very public. Every click you make, every search you enter, everything you purchase online: someone is gathering that data.
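To make that concrete, here is a rough sketch in Python of how those implicit events could be rolled up into a per-URL signal. The event fields and weights are made up for illustration, not anyone's actual schema.

    from collections import Counter

    # Hypothetical implicit-web events: clicks, a search, and a purchase,
    # each tied to a URL and a timestamp (field names are invented).
    events = [
        {"type": "click",    "url": "http://example.com/widgets", "ts": 1228953600},
        {"type": "search",   "url": "http://example.com/widgets", "ts": 1228953660},
        {"type": "purchase", "url": "http://example.com/widgets", "ts": 1228953720},
        {"type": "click",    "url": "http://example.com/blog",    "ts": 1228953780},
    ]

    # Assumed weights: a purchase says more about intent than a click.
    weights = {"click": 1.0, "search": 2.0, "purchase": 5.0}

    popularity = Counter()
    for event in events:
        popularity[event["url"]] += weights[event["type"]]

    print(popularity.most_common())  # URLs ranked by implicit-signal strength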

Tagging

Delicious made this concept popular, and it has already been folded into the list of existing signals leveraged by search engines like Google.

Semantic Web

This is moving slowly into the limelight, one triple at a time.
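For readers new to the term, a triple is just a subject/predicate/object statement about something on the web. A minimal sketch with made-up URIs:

    # Hypothetical RDF-style triples: each is a (subject, predicate, object) statement.
    triples = [
        ("http://example.com/people/alice", "foaf:knows", "http://example.com/people/bob"),
        ("http://example.com/people/alice", "foaf:name",  "Alice"),
    ]

    # A search engine could mine statements like these directly instead of
    # inferring relationships from links and anchor text.
    for subject, predicate, obj in triples:
        print(subject, predicate, obj)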

Status Updates 

This is a feature of most social systems; it also stands alone as a signal that can be mined. Twitter and FriendFeed are perfect examples of this activity.

Authority

The concept of a trusted source has been around for years, but it is becoming more important to web search engines as they branch into universal search.
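To tie these candidates together, here is a minimal sketch of how such signals could augment a link-based score. The signal names and weights are purely hypothetical; no engine publishes its blend.

    # Hypothetical per-document signal scores, each normalized to the 0..1 range.
    def blended_score(doc):
        weights = {
            "links":        0.40,  # classic link-graph signal
            "social_graph": 0.15,  # endorsements from the searcher's network
            "implicit":     0.15,  # clicks, purchases, other behavioral data
            "tags":         0.10,  # folksonomy labels (Delicious-style)
            "semantic":     0.05,  # structured triples about the page
            "status":       0.05,  # mentions in status updates
            "authority":    0.10,  # trusted-source boost
        }
        return sum(weight * doc.get(name, 0.0) for name, weight in weights.items())

    # Example: a page with weak links but strong social and behavioral signals.
    print(blended_score({"links": 0.2, "social_graph": 0.9, "implicit": 0.8, "status": 0.7}))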

Summary

I have hinted at a few of the emerging signals; what else should we be looking at?


What's the difference between a web search engine and a publisher?

Two guys sit down next to each other at a new tech meetup.

One guy says, "Hi, I'm Robert," while pointing at his name tag. "What's your name and what do you do?"

The other guy says, "I'm David. My company aggregates content and we sell ads based on it."

Robert says, "Cool, I do the same, but we also have lots of pictures."

David says, "Yeah, we have tons of pictures as well."

"Which media company do you work for?"

"Google."


The Why and How of Add-on-Con

Dec 11, 2008 was the first Add-on-Con, and it surpassed my expectations. As a first-time conference producer I realized a few things about myself and learned a few things about creating a conference. For me personally, this event brought up how much I like creating new things. I get a rush from seeing an opportunity, thinking of an idea, and then taking it to market. Creating Add-on-Con was fun, a little nerve-racking, but it still provided the appropriate rush. Enough people purchased tickets in advance to suggest the turnout was going to meet my goals, but you never know. In the end the event exceeded the food and premium items ordered and even sold reduced-price admission at the door. Along the way I learned a few things about producing a conference that I thought were worth sharing. I will start with marketing, because so many technologists think of it as a black box.

Add-on-Con Marketing Strategy

I used several techniques to get the word out quickly.

  • We published 2 press releases through the wire services, which cost around $600 per release. A quick comparison of the server logs showed that the sites that published our press releases generated about 2% of the overall traffic.
    • We used 2 different wire services to distribute the press releases, and PR Newswire had significantly broader coverage than the other service.
  • Articles written prior to the event generated about 8% of the traffic; my personal favorite was written by Christian Zibreg, “Browser rivals to sit down and discuss the future of browsing”.
  • Twitter generated about 4% (surprise)
  • Brightkite generated about .5% (surprise)
  • SV, NY and Boulder Denver Meetup message boards generated about 1% (I thought these would generate more traffic, especially the SV Meetup)
  • Paid ads generated about 2% of the traffic and cost us $20. I was able to purchase most terms for under $0.05
  • I purchased $50 worth of event promotion from LinkedIn.com and it generated 0% of the traffic
  • Media sponsorships
    • Techcrunch donated a 125×125 pixel ad that ran on their home page for 2 days; it generated about 4% of the traffic. I believe that, had this started earlier, it would have generated significantly more traffic.
    • Mashable mentioned the event in their Monday events section and it generated 0% of the traffic
    • GarysGuide generated 0% of the traffic
    • Center Networks generated 0% of the traffic
  • Blogs generated the rest
    • Several of the bloggers received discount codes for their readers.
      • VC bloggers generated 15% of the traffic and contributed 34% of the paying attendees. (wow)
      • Individual tech blogs generated about 4% of the traffic and contributed 5% of the paying attendees (wow)
    • The Mozilla and Microsoft blogs generated about 25% of the traffic.

I was surprised by the amount of Twitter traffic and by the lack of traffic from the event listings and Meetup message boards.

I want to thank every person, news service and news agency that talked about Add-on-Con 08; it was greatly appreciated. Thanks also to the people from my company OneRiot, as well as AdaptiveBlue and Sxipper; without their help this event would not have happened.

Thank You


Does Size Matter?

Coverage vs. Quality in a Web Search Engine

Does quality matter if you do not have enough coverage? This entrepreneur says no.

What is enough coverage? If we think of a vertical search engine, how much coverage is needed to display results that are better than a general web search engine like Google or Yahoo?

Techmeme.com, for example, uses somewhere around 100 trusted sources to determine the top tech news stories, but is that enough? The popularity of Techmeme has stayed consistent for the past year according to Compete.com, fluctuating between 200K and 300K per month.

What about general web search engines? Does the consumer require millions of results in order to trust a web search engine, or could a smaller, more focused service capture general web search market share?

This entrepreneur believes the answer is yes.

So how big is enough?

A back-of-the-envelope calculation suggests the index could be 10% of what web search engines index and update today, if the goal were to answer 80% of all searches. The trick is figuring out which 10% to index and update.

According to search analysts:

  • 40% of web searches are navigational
  • 40% are browsing
  • 20% are long tail

If you forget the long tail, because that does require a very large index, you are left with 80%. If we look at the general popularity of websites over the past year, the majority of all internet traffic is driven by the top 1,000 websites.

So if a web search engine indexed the top 1,000 sites, would it cover the needs of the general internet user? The answer is no; the web crawler has to be smarter, because websites gain and lose popularity quickly outside of the top 100.

We have been brainwashed to believe size matters

In turn, all of our search engines are designed the same way and deliver pretty much the same results. This entrepreneur believes that if a company could create an index of what people want and update it frequently enough, it is statistically possible to satisfy 80% of web searchers with an index that is 10% of the size of today's top web search companies'.
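A quick simulation shows why that is at least statistically plausible: if query volume is heavy-tailed (Zipf-like), a small head of popular queries accounts for most of the volume. The distribution parameters below are guesses for illustration, not measurements.

    # Assumption: query volume follows a Zipf distribution with exponent s
    # over N distinct queries (both numbers are illustrative guesses).
    N = 1_000_000
    s = 1.0

    weights = [1.0 / (rank ** s) for rank in range(1, N + 1)]
    total = sum(weights)

    # What share of all query volume do the most popular 10% of queries cover?
    head = int(0.10 * N)
    head_share = sum(weights[:head]) / total
    print(f"top 10% of distinct queries covers about {head_share:.0%} of query volume")

With these toy numbers the head covers roughly 80% of the volume, which is the intuition behind the 10%/80% claim; the hard part, as noted above, is figuring out which head to index and keeping it fresh.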

Why should we care about the size? The answer is simple: to support an index the size of Google's or Yahoo's, the data has to be optimized for specific types of queries. For example, Google was designed to support navigational and long-tail searches, which covers 60% of what people do on the internet. This is not a bad strategy, but it does lead to an interesting question/opportunity: would people switch to a new search engine if they could get the correct answer 80% of the time?

