About Online Matters


Why Search Engine Optimization Matters

Yesterday, a reasonably well-known blogger, Derek Powazek, let out a rant against the entire SEO industry.  Against my strongest desire to give his article any further validation in the search engine rankings (where it now ranks #10), it gets a link here, because at the end of the day the Web is about transparency, and I truly believe that any argument must win out in the realm of ideas.  The article, and the responses both on his website and on SearchEngineLand, upset me hugely for a number of reasons:

  1. The tone was so angry and demeaning.  As I get older (and, I hope, wiser), I want to speak in a way that bridges differences and heals breaches, not one that stokes the fires of discord.
     
  2. I believe the tone was angry in order to evoke strong responses, which build links, which in turn rank the piece high in the search engines.  Link building is a tried-and-true, legitimate SEO practice, so the tactic itself invalidates Derek’s entire argument that understanding and implementing a well-thought-out SEO program is so much flim-flam.  Even more important to me: do we need to communicate in angry rants in order to get attention in this information- and message-overwhelmed universe?  Is that what we’ve come to?  I sure hope not.
     
  3. The article’s advice about user experience coming first was right (and has my 100% agreement).  But its assumptions about SEO, and therefore its conclusions, were incorrect.
     
  4. The article’s erroneous conclusions will hurt a number of people who could benefit from good SEO advice.  THAT is probably the thing that saddens me most – it will send people off in a direction that will hurt them and their businesses substantially.  Good SEO is not a game.  It has business implications and by giving bad advice, Derek is potentially costing a lot of good people money that they need to feed their families in these tough times.
     
  5. The number of responses in agreement with his blog was overwhelming relative to the number that did not agree.  That also bothered me – that the perception of our industry is such that so many people feel our work does not serve a legitimate purpose.
     
  6. The comments on Danny Sullivan’s response to Derek were few, but they were also pro-SEO (of course).  Which means that the two communities represented in these articles aren’t talking to each other in any meaningful way.  You agree with Derek, comment to him.  You agree with Danny, comment there.  Like attracts like, but it doesn’t ultimately lead to the two communities bridging their differences.

I, too, started to make comments on both sites.  But my comments rambled (another one of those prerogatives I maintain in this 140-character world), and so it became apparent that I would need to create a blog entry to respond to the article – which I truly did not want to do because, frankly, I don’t want to "raise the volume" of this disagreement between SEO believers and SEO heretics.  But I have some things to say that no one else is saying, and they go to the heart of the debate on why SEO IS important and is absolutely not the same thing as good user experience or web development.

So to Danny, to Derek, and to all the folks who have entered this debate: I hope you find my comments below useful and, if not, my humble apologies for wasting your valuable time.

Good site design is about the user experience.  I started my career in online and software UE design when that term was an oxymoron.  My first consulting company, started in 1992, was inspired by David Kelley, my advisor at Stanford, CEO of IDEO (one of the top design firms in the world), and now founder and head of the Stanford School of Design.  I was complaining to David about the horrible state of user interfaces in software and arguing that we needed an industry initiative to wake people up.  His response was "If it’s that bad, go start a company to fix it."  Which I did.  That company built several products that won awards for their innovative user experience.

That history, I hope, gives credibility to my next statement: I have always believed, and will always believe, that good site experience trumps anything else you do.  Design the site for your customer first.  Create a "natural" conversation with them as they flow through the site and you will keep loyal customers.

Having said that, universal search engines do not "think" like human beings.  They are neither as fast nor as capable of understanding loosely organized data.  They work according to algorithms that attempt to mimic how we think, but they are a long way from actually achieving it.  These algorithms, as well as the underlying structures used to make them effective, must also run in an environment of limited processing power (even with all of Google’s server farms) relative to the volume of information, so they have also made trade-offs between accuracy and speed.  Examples of these structures are biword indices and positional indices.  I could go into the whole theory of information retrieval, but suffice it to say that a universal search engine needs help in interpreting content in order to determine relevance.
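To make that concrete, here is a minimal sketch of a positional index – my own toy illustration in Python, not any engine’s actual code.  Real engines use far more compact, compressed encodings, but the idea is the same: for each term, record which documents contain it and at which positions, so phrase and proximity matching become cheap lookups.

```python
from collections import defaultdict

def build_positional_index(docs):
    """docs: dict of doc_id -> text. Returns term -> {doc_id: [positions]}."""
    index = defaultdict(lambda: defaultdict(list))
    for doc_id, text in docs.items():
        for pos, term in enumerate(text.lower().split()):
            index[term][doc_id].append(pos)
    return index

docs = {
    1: "search engines index the web",
    2: "the web needs search engine optimization",
}
index = build_positional_index(docs)
print(dict(index["search"]))  # {1: [0], 2: [3]}
print(dict(index["web"]))     # {1: [4], 2: [1]}
```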

Meta data is one area that has evolved to help the engines do this.  So, first and foremost, the search engines expect and need us to include data especially for them – data that has nothing to do with the end-user experience and everything to do with being found relevant and precise.  This is the simplest form of SEO.  There are two points here:

  1. Who is going to decide what content goes into these tags? Those responsible for the user experience?  I think not.  The web developers? Absolutely positively not.  It is marketing and those who position the business who make these decisions.
     
  2. But how does marketing know how a search engine thinks?  Most do not.  And there are real questions of expertise here – albeit, for this simple example, small ones that marketers can (and are) learning.  What words should I use for the search engines to consider a page relevant, and which of them go into the meta data?  For each meta data field, what is the best structure for the information?  How many marketers, for example, know that a title tag should only be 65 characters long, that a description tag needs to be limited to 150 characters, that the words in anchor text are a critical signaling factor to the search engines, or that alt-text on an image can help a search engine understand the relevance of a page to a specific keyword/search?  How many know the data from the SEOmoz Survey of SEO Ranking Factors showing that the best place to put a keyword in a title tag for search engine relevance is in first position, and that relevance drops off exponentially the further back in the title the keyword sits?  (I sketch these simple constraints in code just after this list.)  On this last point, there isn’t one client who hasn’t asked me for advice.  They don’t and can’t track the industry and changes in the algorithms closely enough to follow this.  They need SEO experts – members of the trained and experienced professionals in the SEO industry – to help them, and this is just the simplest of SEO issues.
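Here is that sketch – a toy checker that encodes the 65- and 150-character limits and the keyword-position finding cited above.  Those figures are survey data as reported in the text, not official engine specifications:

```python
def check_title(title, keyword):
    """Flag common title-tag problems for a target keyword."""
    issues = []
    if len(title) > 65:
        issues.append(f"title is {len(title)} chars; may be truncated past 65")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        issues.append("target keyword missing from title")
    elif pos > 0:
        issues.append(f"keyword starts at char {pos}; relevance decays the later it appears")
    return issues

def check_description(desc):
    """Flag an over-length meta description."""
    return [f"description is {len(desc)} chars; limit is ~150"] if len(desc) > 150 else []

print(check_title("Affordable SEO Services | Acme Web Marketing", "SEO services"))
# ['keyword starts at char 11; relevance decays the later it appears']
```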

How about navigation?  If you do not build good navigational elements into deeper areas of the site (especially on large sites) specifically for search engines, or you build them in a way that a search engine can’t follow (e.g. through the use of Javascript in the headers, or Flash as the sole navigation mechanism throughout the site), then the content won’t get indexed and the searcher won’t find it.  Why are good search-specific navigational elements so important?  It comes back to limited processing power and time.  Each search engine has only so much time and power to crawl the billions of pages on the web – numbers that grow every day, and where existing pages can change not just every day but every minute.  These engines set rules about how much time they will spend crawling a site, and if your site is too hard to crawl or too slow, many pages will not make it into the indices and the searcher, once again, will never find what could be hugely relevant content.
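To see why, consider what a crawler’s parser actually receives.  This is a simplified illustration of my own – real bots are far more sophisticated – of the fact that only links present in the delivered markup can be followed:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect hrefs exactly as a simple crawler would see them."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = """
<nav><script>buildFancyMenu()</script></nav>  <!-- JS-only nav: invisible here -->
<footer><a href="/products">Products</a> <a href="/support">Support</a></footer>
"""
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/products', '/support'] -- only the static links survive
```

The Javascript-built menu contributes nothing to the link list; the plain footer links are the only paths the bot can follow into the site.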

Do UE designers or web developers understand these rules at a high level?  Many now know not to use Javascript in the headers, to be careful how they use Flash and, if they do use it in the navigation, to have alternate navigational elements that help the bots crawl the site quickly.  Is this about user experience?  Only indirectly.  It is absolutely positively about search engine optimization, however, and it is absolutely valid in terms of assuring that relevant content gets put in front of a searcher.

Do UE designers or web developers understand the gotchas with these rules?  Unlikely.  Most work in one organization with one site (or a limited number of sites).  They haven’t seen the actual results of good and bad navigation across 20 or 50 or 100 sites and learned from hard experience what is a best practice.  They need an SEO expert, someone from the SEO  industry, to help guide them.  

Now let’s talk about algorithms.  Algorithms, as previously mentioned, are an attempt (and a crude one, given our current understanding of search) at mimicking how searchers (or, with personalization, a single searcher) think, so that searches return results relevant to that searcher.  If you write just for people, and structure your pages just for readers, you are doing your customers a disservice, because what a human can understand as relevant and what a search engine can grasp of meaning and relevance are not the same.  You might write great content for people on the site, but if a search engine can’t understand its relevance, a searcher who cares about that content will never find it.

Does that mean you sacrifice the user experience to poor writing?  Absolutely, positively, without qualification not.  But within the structure of good writing and a good user experience, you can design a page that signals to the search engines – with their limited time and ability to understand content – which keywords are relevant to that page.

Artificial constraint, you say?  How is that different from the constraints I face when trying to get my message across with a good user experience in a data sheet?  How is it different from having 15 minutes to get a story across in a presentation to my executive staff in a way that is user-friendly and clear in its messaging?  Every format, every channel for marketing has constraints.  The marketer’s (not the UE designer’s and not the web developer’s) job is to communicate effectively within those constraints.

Does a UE designer or web developer understand how content is weighted to create a ranking score for a specific keyword within a specific search engine?  Do they know how position on the page relates to how the engines consider relevance?  Do they understand how page length affects the weighting?  Take this example.  If I have two pages, the second of which consists of two exact copies of the content on the first, which is more relevant?  From a search engine’s perspective they are equally relevant, but if a search engine just counted keyword occurrences, the second page would rank higher.  A fix is needed.

One way that many search engines compensate for page-length differences is through something called pivoted document length normalization (write me if you want a further explanation).  How do I know this?  Because I am a search engine professional who spends time every day learning his trade, reading on information retrieval and studying the patents filed by the major search engines to understand how the technology of search can or may be evolving.  Because – since I can’t know exactly what algorithms are currently being used – I run tests on real sites to see the impact of various content elements on ranking.  Because I do competitive analysis on other industry sites to see what legitimate, white-hat techniques they have used and what content they have created (e.g. videos on a YouTube channel that then point to their main site) to signal the relevance of their content to the search engines.
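To show what that fix buys you, here is a sketch of pivoted length normalization applied to the two-page example above.  The formula follows the published IR literature (a dampened term frequency divided by a normalizer pivoted around the average document length); what any particular engine actually runs is, of course, unknown – this is an illustration, not Google’s algorithm:

```python
import math

def raw_score(doc, keyword):
    # naive scoring: just count occurrences -- the doubled page wins
    return doc.lower().split().count(keyword)

def pivoted_score(doc, keyword, avg_len, slope=0.2):
    words = doc.lower().split()
    tf = words.count(keyword)
    if tf == 0:
        return 0.0
    dampened_tf = 1 + math.log(1 + math.log(tf))         # diminishing returns on repetition
    norm = (1 - slope) + slope * (len(words) / avg_len)  # pivot around the average length
    return dampened_tf / norm

page = "seo advice " * 50          # 100 words, 'seo' appears 50 times
doubled = page * 2                 # identical content pasted twice: 200 words

avg_len = (100 + 200) / 2
print(raw_score(page, "seo"), raw_score(doubled, "seo"))
# 50 100  -> naive counting ranks the padded page twice as high
print(round(pivoted_score(page, "seo", avg_len), 2),
      round(pivoted_score(doubled, "seo", avg_len), 2))
# 2.78 2.55 -> after pivoting, the two pages score nearly the same
```

Under naive counting the padded page wins two-to-one; after pivoting, the two pages come out nearly identical – which matches the intuition that duplicated content adds no relevance.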

And to Derek’s point, what happens when the algorithms change?  Who is there watching the landscape for any change, like an Indian scout in a hunting party looking for the herd of buffalo?  Who can help interpret the change and provide guidance on how to adapt content to maintain the best signals of relevance for a keyword to the search engines?  Derek makes this sound like an impossible task and a lot of hocus-pocus.  It isn’t, and it’s not.  Professional SEO consultants do this for their clients all the time by providing good maintenance services.  They help their clients’ content remain relevant – and, hopefully, ranking high in the SERPs – in the face of constant change.

So to ask again, do UE designers or product managers understand these issues around content?  At some high level they may (a lot don’t).  Do web developers? Maybe, but most don’t because they don’t deal in content – it is just filler that the code has to deal with (it could be lorem ipsum for their purposes).  Do any of these folks in their day-to-day struggles to do their jobs under tight time constraints have the time to spend, as I do, learning and understanding these subtleties or running tests? Absolutely, positively not.  They need an SEO professional to counsel them so that they make the right design, content and development choices.

I’ll stop here.  I pray I’ve made my point calmly and with a reasoned argument.  Please let me know.  I’m not Danny Sullivan, Vanessa Fox, Rand Fishkin, or Stephan Spencer, to name a few of our industry’s leading lights.  I’m just a humble SEO professional who adores his job and wants to help his clients rank well with their relevant business information.  My clients seem to like me and respect what I do, and that gives me an incredible amount of satisfaction and joy. 

I’m sorry, Derek.  I respect your viewpoint and I know that you truly believe what you are saying.  But as an honest, hard-working SEO professional, I couldn’t disagree with you more.


Matt Cutts, Nofollow, and the Consistently Inconsistent

I have avoided (like the plague) weighing in on the tempest Matt Cutts unleashed at SMX Advanced in June regarding Google’s change to the handling of the nofollow attribute for PageRank sculpting.  I have avoided it for two reasons:

  1. In my mind, more has been made of it than its true impact on people’s rankings.
     
  2. As far as I’m concerned, in general (and note those two words) the use of nofollow is a last resort and a crutch for less-than-optimal internal cross-linking around thematic clusters.  When internal cross-linking is done right, I don’t believe the use of nofollow is that impactful.

Bruce Clay had a great show on Webmaster Radio on the subject of the nofollow controversy, and basically he was of the same opinion as me.  There are also many more heavyweights who have weighed in than I care to name.  So adding my comments to the mix isn’t all that helpful to my readers or the SEO community generally.

But I was searching today for some help on undoing 301 redirects when I found this section on the SEOmoz blog (click here for the whole article) from 2007 that provides some historical context for these conversations – so I thought I’d share it here.  My compliments to Rand Fishkin of SEOmoz for reproducing this content:

“2. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love?

A) Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages

(Matt’s precise words were: The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robots.txt’ed out), but nofollow on individual links is simpler for some folks to use. There’s no stigma to using nofollow, even on your own internal links; for Google, nofollow’ed links are dropped out of our link graph; we don’t even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.)

B) Sometimes – we don’t generally encourage this behavior, but if you’re linking to user-generated content pages on your site whose content you may not trust, nofollow is a way to tell us that.

C) No – nofollow is intended to say “I don’t editorially vouch for the source of this link.” If you’re placing un-trustworthy content on your site, that can hurt you whether you use nofollow to link to those pages or not.”
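For concreteness, here is a small sketch of the two granularities Matt describes – nofollow on an individual link versus the page-level robots meta tag.  The markup and the audit class are my own hypothetical illustration, not anything Google publishes:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Classify links by the two granularities: link-level vs page-level nofollow."""
    def __init__(self):
        super().__init__()
        self.page_nofollow = False   # set by <meta name="robots" content="nofollow">
        self.links = []              # (href, followed?) pairs
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots" \
                and "nofollow" in (a.get("content") or "").lower():
            self.page_nofollow = True            # page level: every link is dropped
        elif tag == "a" and a.get("href"):
            followed = "nofollow" not in (a.get("rel") or "").lower()
            self.links.append((a["href"], followed))

audit = NofollowAudit()
audit.feed('<a href="/about">About</a> <a rel="nofollow" href="/ugc/123">UGC</a>')
print(audit.page_nofollow)  # False
print(audit.links)          # [('/about', True), ('/ugc/123', False)]
```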

Just some interesting background as you consider the current debate.
