SEO vs. Personalization

When the link structure of the Internet is no longer a primary signal to a web search engine, what happens to Search Engine Optimization (SEO)? What other signals might replace or augment current techniques?

  • Social Graph
  • Implicit Web
  • Tagging
  • Semantic Web
  • Status Updates
  • Comments
  • Authority

What will drive the next generation of web search? Let’s review a few of the leading candidates.

Social Graph

This is already an important signal, though few have mastered it: Microsoft purchased a stake in Facebook, Google has OpenSocial, and News Corp is pushing hard with MySpace.

Implicit Web

This is already an important set of signals, but they are not very public. Every click you make, every search you enter, everything you purchase online – someone is gathering that data.
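To make this concrete, here is a minimal sketch of what one such implicit signal might look like once captured. The field names and values are my own illustration, not any vendor’s actual schema.

```python
from dataclasses import dataclass
from time import time

@dataclass
class ImplicitEvent:
    user_id: str      # anonymized identifier for the person acting
    kind: str         # "click", "search", or "purchase"
    target: str       # URL clicked, query entered, or item bought
    timestamp: float  # when the signal was observed

log = []

def record(user_id, kind, target):
    """Append one implicit signal; a real system would stream these to storage."""
    log.append(ImplicitEvent(user_id, kind, target, time()))

record("u42", "search", "best running shoes")
record("u42", "click", "http://example.com/shoes")
print(log)
```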

Tagging

Delicious made this concept popular and it has already been folded into the list of existing signals leveraged by search engines like Google.

Semantic Web

This is moving slowly into the limelight, one triple at a time.
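For readers who have not met a triple, here is a minimal sketch: each statement is a (subject, predicate, object) tuple, and a graph of them can be queried by pattern matching. The URIs below are illustrative, not a real published vocabulary.

```python
# A tiny Semantic Web graph: (subject, predicate, object) triples.
triples = [
    ("http://example.org/Me.dium", "rdf:type",       "schema:WebSite"),
    ("http://example.org/Me.dium", "schema:about",   "web search"),
    ("http://example.org/Me.dium", "schema:founder", "http://example.org/author"),
]

def objects_of(subject, predicate):
    """Return every object matching a (subject, predicate, ?) pattern."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects_of("http://example.org/Me.dium", "schema:about"))  # ['web search']
```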

Status Updates 

This is a feature of most social systems; it also stands alone as a signal that can be mined. Twitter and FriendFeed are perfect examples of this activity.

Authority

The concept of a trusted source has been around for years, but it is becoming more important to web search engines as they branch into universal search.

Summary

I have hinted at a few of the emerging signals; what else should we be looking at?


Google: Good Cop or Bad Cop?

Disclaimer: This is my personal blog. The views expressed on these pages are mine alone and not those of my employer.

Google – Good Cop

Browsers use a technology called a Rendering Engine to determine how to lay out a web page. Rendering Engines use a technology called a Parser to determine the structure of the content. The Rendering Engine is at the heart of a web browser, and it is one of the reasons the same web page can look different in different browsers.

Well, guess what: a Web Search Engine uses similar technology. Web Search Engines use something called a Crawler or a Spider to walk the Internet and retrieve web pages. Prior to inserting the data into the index, the software deconstructs the web page using technology similar to a browser’s Rendering Engine. This can be beneficial to a web page owner. For example, lots of inbound links can help convince a web search engine to send thousands, if not millions, of visitors to a site. This is one of the reasons Wikipedia is always returned in the top 10 for so many searches.
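As a rough illustration of that deconstruction step, here is a minimal sketch of a crawler’s link extraction using Python’s standard html.parser module. Real crawlers and rendering engines do far more, but the principle is the same.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Pull outbound links out of a fetched page before indexing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<html><body><a href="http://en.wikipedia.org/">Wikipedia</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['http://en.wikipedia.org/']
```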

Google – Bad Cop

[Image via Wikipedia: numeric example of PageRank in a small system.]

The same technology can also be used to hurt a website. If two websites have similar content and one has a high PageRank and the other has a low PageRank, the one with the higher PageRank will capture nearly all of the search referral traffic from Google.
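For the curious, here is a minimal power-iteration sketch of PageRank over a toy three-page graph. It follows the standard published formulation, not anything Google actually runs.

```python
# Minimal PageRank sketch: power iteration over a toy link graph.
damping = 0.85
graph = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

rank = {page: 1.0 / len(graph) for page in graph}
for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in graph:
        incoming = sum(rank[p] / len(links)
                       for p, links in graph.items() if page in links)
        new_rank[page] = (1 - damping) / len(graph) + damping * incoming
    rank = new_rank

print(rank)  # pages with more (and better-ranked) inbound links score higher
```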

Google Juice, as it is called, leverages PageRank as well as many other techniques to evaluate a web site. Some of the techniques are common sense, others are known by a small handful of SEO experts, and others are known only to those working at Google. Now, this wouldn’t be such a big deal if Google did not own 70% of search referral traffic, and if the other top search engines did not use techniques similar to Google’s PageRank to rank results.

To illustrate the Bad Cop techniques, let’s review a recent issue that was aired publicly between Twitter and Google. As the story goes, Google asked Twitter to change the Bio links on its user profile pages to nofollow. Doing this meant each user who worked hard to build up followers on Twitter could not leverage those links to promote their own blog or company’s website. Twitter ultimately did what Google wanted and changed the links to nofollow.
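A minimal sketch of what that change means mechanically: when a crawler builds its link graph, it simply drops edges marked rel="nofollow", so they pass no PageRank. The links below are illustrative.

```python
# Hypothetical links extracted from a profile page: href plus rel attribute.
links = [
    {"href": "http://myblog.example.com", "rel": "nofollow"},  # Twitter bio link
    {"href": "http://twitter.com/about",  "rel": ""},          # ordinary site link
]

# Only followable links become edges in the link graph.
followable = [l["href"] for l in links
              if "nofollow" not in l["rel"].split()]

print(followable)  # the bio link is gone; it no longer counts as an endorsement
```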

What I find interesting about this issue is how Google’s near-monopoly on search is acting like an oligopoly: Google’s Bad Cop techniques actually benefit all of the search engines that leverage PageRank-style indexes.

The consumer is the one losing out in today’s search market. Yes, search is better than it was 5 years ago, but that does not mean it cannot be significantly better. Consumers need a voice, and they need to understand why certain pages are ranked higher than others.

How do we change this oligopoly? Do we have options?

The consumer does have a choice. Up-and-coming search engines like Me.dium, the one I founded, enable the end user to influence the search results. If you do not like what you see after typing a term or phrase into the Me.dium web search engine, download the toolbar and surf to the web pages that you think are better.

Similar to Wikipedia, where the people get to create, update, and police the information, Me.dium users validate web pages before they are returned in search results. Power to the People.


Why SEO is destined to change within the next 2 years

Search Engine Optimization (SEO) has numerous techniques for getting URLs into the top spots at the different search engines.

  • Clean URLs
  • Sitemaps
  • Paid blogging
  • Link purchasing

SEO experts are starting to master the process. For example, in the past week several news stories have covered people publicly poking fun at Google’s core algorithms via its web trends service.

Social Media Optimization (SMO) is the first new kid on the block to emerge after SEO: a set of methods for generating publicity through social media. These techniques are proving effective at driving traffic from sites like Digg.

The underlying link structure of the web is most likely not going to change anytime soon, but the engines that interpret it are evolving quickly, and new dimensions of the internet are starting to emerge. The Social Graph is the best example from the past few years, but others are coming.

These alternative dimensions, or indexes, are creating buzz. Me.dium, for example, launched a search engine last week that leverages attention data to determine which pages should be indexed and in what order they should be displayed. The attention data replaces the need for a web crawler in Me.dium’s case, which means the global link structure of the internet is less important.
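As an illustration of the idea (my own sketch, not Me.dium’s actual algorithm), ranking by attention data can be as simple as counting where people actually spend their time:

```python
from collections import Counter

# Hypothetical attention events: one entry per observed page visit.
attention_events = [
    "http://example.com/a",
    "http://example.com/b",
    "http://example.com/a",
    "http://example.com/a",
]

def rank_by_attention(events):
    """Order URLs by how much attention they received, most first."""
    return [url for url, _ in Counter(events).most_common()]

print(rank_by_attention(attention_events))
# ['http://example.com/a', 'http://example.com/b']
```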

Obviously, for Me.dium to succeed it has to provide a method for SEO to exist. Software is all about information ecosystems. Ecosystems made Microsoft, Oracle, SAP, Facebook, and Google. If Me.dium is to succeed as a standalone brand, it must find a way to demonstrate its value to its customers and then create an ecosystem. Ecosystems also create viral loops, or double viral loops. When this is accomplished, the network is able to grow at an alarming rate.

What might Attention Data Search Optimization look like?

Attention Data Search Optimization (ADSO) is in its early days. I do not think anyone can predict what it will look like, but if I were to guess, I would suggest something similar to Google AdWords. The core algorithms of companies like Me.dium could be designed with dials that can be tweaked at run-time. These dials would enable ADSO practitioners to increase the relevance of one URL over another in real time.

A few possible ways this could be accomplished:

  • ADSO idea 1 – use good old-fashioned money: an auction market is created and the highest bidders influence the results.
  • ADSO idea 2 – use influence credits: active participants in the system earn credits, which can be spent to influence the weight of one URL over another in real time (see the sketch after this list).
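Here is a minimal sketch of both ideas as run-time dials layered on top of an organic relevance score. Every name and number below is hypothetical.

```python
# Organic relevance scores plus two hypothetical "dials".
base_score = {"http://example.com/a": 0.50, "http://example.com/b": 0.48}
bids       = {"http://example.com/b": 0.05}   # idea 1: auction money
credits    = {"http://example.com/a": 0.01}   # idea 2: influence credits

def adso_score(url):
    """Combine the organic score with whatever dials have been turned."""
    return base_score[url] + bids.get(url, 0.0) + credits.get(url, 0.0)

results = sorted(base_score, key=adso_score, reverse=True)
print(results)  # b's bid nudges it past a in real time
```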

What do you think ADSO might look like?

