About Online Matters

Archive for October, 2009

Technical SEO: Introduction to Site Load Times and Natural Search Rankings

It is one of those nights.  Those pesky technicolor dreams woke me up at 2:30 and wouldn’t let me go back to sleep.  But under the heading “turning lemons into lemonade,” at least I have some extra time to write my blog, even as I am piled high with end-of-month deadlines.

Today’s topic is part of my Technical SEO series (I just named it that – now I have to go back and change all my titles and meta tags…sigh): site load times and whether or not they affect how you rank in the SERPs.  It is another one of those topics that came out of SMX East.  In this case it was Maile Ohye, Senior Support Engineer at Google, who spoke to this issue.  Maile is a wonderfully knowledgeable evangelist for Google.  I have seen her speak at many shows.  Her presentations are always clear and contain good, actionable techniques for improving your rankings in Google’s SERPs.  I am not alone in thinking her knowledgeable.  Stephan Spencer, one of the guys I most look up to in SEO, thought enough of Maile to interview her in August of 2007, and she was also recently interviewed by SEOMoz, another leading light in the industry (and if you haven’t used their pro tools, then you are one arrow short of a full quiver for your SEO work).

So when Maile says “stuff,” I listen.  In her talk at SMX East, she noted that poor site load times (we are talking something between good and absolutely horrible) could harm your rankings in Google search results.  Let me define the problem, then try to explain what Maile was referring to, and finally give my take on all this.

Basic Concepts of Site Loading Times for Getting Indexed

On the one hand, the fact that site loading times affect search rankings isn’t news.  Let’s take some time to lay a bit of foundation, because the how of site speeds affecting search rankings didn’t really hit me until Maile’s talk.  It’s one of those things that is obvious once you think about it, but it doesn’t really come top of mind when you are focused on specific tasks in an SEO project.  It’s a “given” in the background of your work.  Unless the site is so horribly slow that it is obviously impacting the user experience, you really don’t think about load times when you are focusing on keywords and meta tags.  The site works, move on.

But that’s not really true from the perspective of the search bots.  Google and the other engines have to crawl billions of pages on the web on a regular basis, bring that information back, and then index it.  Some pages can be crawled infrequently, but as more of the web moves to real-time information driven by social media, the bots have to crawl more sites in real time in order to provide good results.  But there are only so many bots and so much time to crawl these billions of pages.  So if you are Google, you write your bots with algorithms that allocate this scarce resource most efficiently and, hopefully, fairly.

How would you or I do this?  Well, if I were writing a bot, the first thing I would give it is a time limit based on the size of the site.  That’s only fair.  If you have the ability to create more content, bravo.  I want to encourage that, because it is beneficial to the community of searchers.  So all other factors being equal (e.g. site loading time), I want to allocate time to ensure all your pages get into the index.  There is also the issue of search precision and relevance: I want all that content indexed so I can present the best results to searchers.   

Of course, I can’t just set a time limit based on the number of pages.  What if one site has long pages and another one short, pithy pages (clearly not mine!)?  What if one site has lots of images or other embedded content while another does not?  My algorithm has to be pretty sophisticated to determine these factors on the fly and adapt its baseline timeout settings to new information about a site as it crawls it.

The next algorithm I would include would have to do with the frequency at which you update your data.  The more often you update, the more often I need to have my bot come back and crawl the changed pages on your site. 

Another set of algorithms would have to do with spam.  From the perspective of my limited resource and search precision, I don’t want to include pages in my index that are clearly designed only for the search engines, that are link spam, or that contain only PPC ads and no relevant information for the searcher.

You get the picture.  I only have a limited window of time to capture continually changing data from the web in order for the data in my index to be reasonably fresh.  Therefore I’ve got to move mountains (of data) in a very short period of time, with only so many processing cycles to apply.  And the variables I have to control for in my algorithms are numerous and, in many cases, not black and white.
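To make this thought experiment concrete, here is a toy sketch in Python of how such a crawl scheduler might weigh these signals.  Every function name, number, and threshold below is my own invention for illustration – this is emphatically not Google’s actual algorithm:

```python
# Hypothetical crawl-budget allocator -- a toy model of the trade-offs
# described above, NOT any search engine's real algorithm.

def crawl_budget_seconds(page_count, avg_page_kb, updates_per_day,
                         spam_score, base_seconds_per_page=0.5):
    """Return the crawl time (in seconds) a bot might allocate to a site.

    page_count      -- number of pages discovered on the site
    avg_page_kb     -- average page size, used to adjust the per-page baseline
    updates_per_day -- how often content changes; fresher sites earn more visits
    spam_score      -- 0.0 (clean) to 1.0 (pure spam)
    """
    if spam_score > 0.8:                 # clearly spam: spend no cycles on it
        return 0.0
    size_factor = avg_page_kb / 20.0     # heavier pages need a bigger baseline
    freshness_factor = 1.0 + min(updates_per_day, 10) / 10.0
    budget = page_count * base_seconds_per_page * size_factor * freshness_factor
    return budget * (1.0 - spam_score)   # discount doubtful sites

# A large, fresh, clean site earns far more crawl time than a doubtful one:
print(crawl_budget_seconds(10000, 40, 5, 0.0))   # generous budget
print(crawl_budget_seconds(10000, 40, 0, 0.7))   # sharply discounted
```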

This is where site load times come in.  If a site is large but slow, should it be allocated as much time as it needs to be indexed?  Do I have enough processing cycles to put up with the fact that it takes three times as long as a similar site to be crawled?  Is it fair, given a scarce resource, to allocate time to a slow site if it means I can’t index five other better-performing sites in my current window of opportunity?  Does it optimize search precision and the relevance of results I can show to searchers?  And last but not least, as one of the guardians of the Web, is poor site performance something I want to encourage from the perspective of user experience and making the Web useful for as many people as possible?  Let’s face it, if the web is really slow, people won’t use it, and there will be fewer eyeballs available to view the ads from which I stand to make money.

Hello?  Are you there?  Can you say “zero tolerance?”  And from the perspective of the universal search engines, there is also my favorite radio station – “WIFM”: What’s In it For Me?  Answer: nothing good.  That is why, as an example, Google has made page load times a factor in AdWords Quality Score.

So, in the extreme case (let’s say a page takes 30 seconds to load), the bots won’t crawl most, if any, of the site.  The engines can’t afford the time and don’t want to encourage a poor user experience.  So you are ignored – which means you never get into the indexes.

When Is a Page’s or Site’s Loading Time Considered Slow?

What is an “extreme case?”  I have looked it up, and the answer is not a fixed number.  Instead, for Google, the concept of “slow loading” is relative:

The threshold for a ‘slow-loading’ landing page is the regional average plus three seconds.

The regional average is based on the location of the server hosting your website. If your website is hosted on a server in India, for example, your landing page’s load time will be compared to the average load time in that region of India. This is true even if your website is intended for an audience in the United States.

Two things to note about how we determined the threshold: 

  • We currently calculate load time as the time it takes to download the HTML content of your landing page. HTML load time is typically 10% to 30% of a page’s total load time. A three-second difference from the regional average, therefore, likely indicates a much larger disparity.
  • We measure load time from a very fast internet connection, so most users will experience a slower load time than we do.
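To make the quoted rule concrete, here is a minimal sketch of the check as I understand it: time the download of the HTML only, then compare against the regional average plus three seconds.  The regional averages and helper names are my own illustrative assumptions, not anything Google publishes:

```python
# Minimal sketch of the quoted AdWords rule: a landing page is "slow" if its
# HTML download time exceeds the regional average plus three seconds.
# The regional averages below are made-up placeholders.
import time
from urllib.request import urlopen

REGIONAL_AVG_SECONDS = {"us": 1.2, "in": 2.0}   # hypothetical averages

def html_load_time(url):
    """Time the HTML download only -- per the quote, images, CSS, and scripts
    (often 70-90% of total load time) are not counted."""
    start = time.monotonic()
    with urlopen(url, timeout=30) as response:
        response.read()
    return time.monotonic() - start

def is_slow(url, region):
    return html_load_time(url) > REGIONAL_AVG_SECONDS[region] + 3.0

print(is_slow("http://www.example.com/", "us"))
```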

Moreover, Google has a sliding scale with which it grades a site.  The following quote applies to AdWords and landing pages, but my guess is that similar algorithms and grading are used in determining how often and how long a site is crawled:

A keyword’s load time grade is based on the average load time of the landing pages in the ad group and of any landing pages in the rest of the account with the same domain. If multiple ad groups have landing pages with the same domain, therefore, the keywords in all these ad groups will have identical load time grades.

Two things to note:

  • When determining load time grade, the AdWords system follows destination URLs at both the ad and keyword level and evaluates the final landing page.
  • If your ad group contains landing pages with different domains, the keywords’ load time grades will be based on the domain with the slowest load time. All the keywords in an ad group will always have the same load time grade.
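Again, a small sketch makes the slowest-domain rule concrete.  Only the “all keywords inherit the grade of the slowest landing-page domain” logic comes from the quote; the grade labels and cutoffs are invented for illustration:

```python
# Sketch of the quoted grading rule: every keyword in an ad group receives
# the load time grade of the SLOWEST landing-page domain in that group.
# The grade scale and thresholds here are illustrative assumptions.

def ad_group_load_time_grade(domain_avg_load_times):
    """domain_avg_load_times: {domain: average HTML load time in seconds}"""
    slowest = max(domain_avg_load_times.values())   # slowest domain governs
    if slowest < 2.0:
        return "Good"
    elif slowest < 5.0:
        return "OK"
    return "Poor"

# One slow domain drags down every keyword in the ad group:
print(ad_group_load_time_grade({"fast.example": 1.1, "ok.example": 1.8}))    # Good
print(ad_group_load_time_grade({"fast.example": 1.1, "slow.example": 6.4}))  # Poor
```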

We’ll stop here for today.  Next time, we’ll talk about what happens in the nether regions between fast and clearly slow.


Why Search Engine Optimization Matters

Yesterday, a reasonably well-known blogger, Derek Powazek, let out a rant against the entire SEO industry.  (Against my strongest desire to give his article any further validation in the search engine rankings, where it now ranks #10, it gets a link here, because at the end of the day the Web is about transparency and I truly believe that any argument must win out in the realm of ideas.)  The article, and the responses both on his website and on SearchEngineLand, upset me hugely for a number of reasons:

  1. The tone was so angry and demeaning.  As I get older (and I hope wiser), I want to speak in a way that bridges differences and heals breaches, not stokes the fire of discord.
     
  2. I believe the tone was angry in order to evoke strong responses, build links, and rank high in the search engines.  Link building is a tried-and-true, legitimate SEO practice, and Derek’s use of it here undercuts his entire argument that understanding and implementing a well-thought-out SEO program is so much flim-flam.  Even more important to me: do we need to communicate in angry rants in order to get attention in this information- and message-overwhelmed universe?  Is that what we’ve come to?  I sure hope not.
     
  3. The article’s advice about user experience coming first was right (and has my 100% agreement).  But its assumptions about SEO, and therefore its conclusions, were incorrect.
     
  4. The article’s erroneous conclusions will hurt a number of people who could benefit from good SEO advice.  THAT is probably the thing that saddens me most – it will send people off in a direction that will hurt them and their businesses substantially.  Good SEO is not a game.  It has business implications and by giving bad advice, Derek is potentially costing a lot of good people money that they need to feed their families in these tough times.
     
  5. The number of responses in agreement with his blog was overwhelming relative to the number that did not agree.  That also bothered me – that the perception of our industry is such that so many people feel our work does not serve a legitimate purpose.
     
  6. The comments on Danny Sullivan’s response to Derek were few, but they were also pro-SEO (of course).  Which means that the two communities represented in these articles aren’t talking to each other in any meaningful way.  You agree with Derek, comment to him.  You agree with Danny, comment there.  Like attracts like, but it doesn’t ultimately lead to the two communities bridging their differences.

I, too, started to make comments on both sites.  But my comments rambled (another one of those prerogatives I maintain in this 140-character world), and so it became apparent that I would need to create a blog entry to respond to the article – which I truly did not want to do because, frankly, I really don’t want to "raise the volume" of this disagreement between SEO believers and SEO heretics.  But I have some things to say that no one else is saying, and it goes to the heart of the debate on why SEO IS important and is absolutely not the same thing as good user experience or web development.

So to Danny, to Derek, and to all the folks who have entered this debate, I  hope you find my comments below useful and, if not, my humble apologies for wasting your valuable time.

Good site design is about the user experience.  I started my career in online and software UE design when that term was an oxymoron.  My first consulting company, started in 1992, was inspired by David Kelley, my advisor at Stanford, CEO of IDEO (one of the top design firms in the world), and now founder and head of the Stanford School of Design.  I was complaining to David about the horrible state of user interfaces in software and arguing that we needed an industry initiative to wake people up.  His response was "If it’s that bad, go start a company to fix it."  Which I did.  That company built several products that won awards for their innovative user experience.

That history, I hope, gives credibility to my next statement: I have always believed, and will always believe, that good site experience trumps anything else you do.  Design the site for your customer first.  Create a "natural" conversation with them as they flow through the site and you will keep loyal customers.

Having said that, universal search engines do not "think" like human beings.  They are neither as fast nor as capable of understanding loosely organized data.  They work according to algorithms that attempt to mimic how we think, but they are a long way from actually achieving it.  These algorithms, as well as the underlying structures used to make them effective, must also run in an environment of limited processing power (even with all of Google’s server farms) relative to the volume of information, so they have also made trade-offs between accuracy and speed.  Examples of these structures are biword indices and positional indices.  I could go into the whole theory of information architecture, but suffice it to say that a universal search engine needs help in interpreting content in order to determine relevance.

Meta data is one area that has evolved to help the engines do this.  So, first and foremost, the search engines expect and need us to include data especially for them – data that has nothing to do with the end-user experience and everything to do with being found relevant and precise.  This is the simplest form of SEO.  There are two points here:

  1. Who is going to decide what content goes into these tags? Those responsible for the user experience?  I think not.  The web developers? Absolutely positively not.  It is marketing and those who position the business who make these decisions.
     
  2. But how does marketing know how a search engine thinks?  Most do not.  And there are real questions of expertise here – albeit, for this simple example, small ones that marketers can (and do) learn.  What words should I use for the search engines to consider a page relevant, which then go into the meta data?  For each meta data field, what is the best structure for the information?  How many marketers, for example, know that a title tag should only be 65 characters long, that a description tag needs to be limited to 150 characters, that the words in anchor text are a critical signaling factor to the search engines, or that alt text on an image can help a search engine understand the relevance of a page to a specific keyword/search?  How many know the data from the SEOMoz Survey of SEO Ranking Factors showing that the best place to put a keyword in a title tag is in first position, and that relevance drops off exponentially the further back in the title the keyword sits?  On this last point, there isn’t one client who hasn’t asked me for advice.  They don’t and can’t track the industry and changes in the algorithms closely enough to follow this.  They need SEO experts – trained, experienced professionals – to help them, and this is just the simplest of SEO issues.  (A simple checker for these rules of thumb is sketched below.)
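Here is that simple checker – a quick sketch of the rules of thumb just described (65-character titles, 150-character descriptions, keyword in first position).  The thresholds are the heuristics cited above, not anything the engines publish, and the function is mine, not a standard tool:

```python
# Quick audit of the meta-tag heuristics discussed above. The 65/150-character
# limits and the keyword-first rule are the rules of thumb cited in the text.

def audit_meta(title, description, keyword):
    issues = []
    if len(title) > 65:
        issues.append("title exceeds 65 characters and may be truncated in the SERPs")
    if len(description) > 150:
        issues.append("description exceeds 150 characters")
    position = title.lower().find(keyword.lower())
    if position == -1:
        issues.append("keyword missing from title")
    elif position > 0:
        issues.append("keyword is not first in the title; relevance decays with position")
    return issues or ["looks good"]

print(audit_meta(
    "Site Load Times and Natural Search Rankings | Online Matters",
    "How page load times influence crawling, indexing, and rankings.",
    "site load times"))
```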

How about navigation?  If you do not build good navigational elements into deeper areas of the site (especially large sites) that are specifically for search engines, and/or you build navigation in a way that a search engine can’t follow (e.g. by the use of JavaScript in the headers or Flash as the single navigation mechanism throughout the site), then the content won’t get indexed and the searcher won’t find it.  Why are good search-specific navigational elements so important?  It comes back to limited processing power and time.  Each search engine has only so much time and power to crawl the billions of pages on the web – numbers that grow every day, and where existing pages can change not just every day but every minute.  These engines set rules about how much time they will spend crawling a site, and if your site is too hard to crawl or too slow, many pages will not make it into the indices and the searcher, once again, will never find what could be hugely relevant content.
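A tiny experiment makes the point.  The sketch below plays the role of a bot’s HTML parser: it discovers only the links that exist as <a href> tags in the markup, so navigation built purely by JavaScript is invisible to it.  (A simplified model, of course – real bots are more capable – but the principle is exactly the one described above.)

```python
# Why JavaScript-only navigation hides pages: a crawler parsing raw HTML sees
# only <a href> links. Links a script would build at runtime never appear.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

html_nav = '<nav><a href="/products">Products</a><a href="/blog">Blog</a></nav>'
js_nav = '<nav><script>buildMenu(["/products", "/blog"]);</script></nav>'

for markup in (html_nav, js_nav):
    extractor = LinkExtractor()
    extractor.feed(markup)
    print(extractor.links)   # ['/products', '/blog'] for HTML nav; [] for JS nav
```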

Do UE designers or web developers understand these rules at a high level?  Many now know not to use JavaScript in the headers, to be careful how they use Flash and, if they do use it in the navigation, to have alternate navigational elements that help the bots crawl the site quickly.  Is this about user experience?  Only indirectly.  It is absolutely positively about search engine optimization, however, and it is absolutely valid in terms of assuring that relevant content gets put in front of a searcher.

Do UE designers or web developers understand the gotchas with these rules?  Unlikely.  Most work in one organization with one site (or a limited number of sites).  They haven’t seen the actual results of good and bad navigation across 20 or 50 or 100 sites and learned from hard experience what is a best practice.  They need an SEO expert, someone from the SEO  industry, to help guide them.  

Now let’s talk about algorithms.  Algorithms, as previously mentioned, are an attempt (and a crude one based on our current understanding of search) at mimicking how searchers (or with personalization a single searcher) think so that searches return relevant results to that searcher.  If you write just for people, and structure your pages just for readers, you are doing your customers a disservice because what a human can understand as relevant and what a search engine can grasp of meaning and relevance are not the same.  You might write great content for people on the site, but if a search engine can’t understand its relevance, a searcher who cares about that content will never find it. 

Does that mean you sacrifice the user experience to poor writing?  Absolutely, positively, without qualification not.  But within the structure of good writing and a good user experience, you can design a page that helps/signals the search engines, with their limited time and ability to understand content, what keywords are relevant to that page. 

Artificial constraint, you say? How is that different than the constraints I have when trying to get my message across with a good user experience in a data sheet?  How is that different when I have 15 minutes to get a story across in a presentation to my executive staff in a way that is user friendly and clear in its messaging?  Every format, every channel for marketing has constraints.  The marketer’s (not the UE designer’s and not the web developer’s) job is to communicate effectively within those constraints. 

Does a UE designer or web developer understand how content is weighted to create a ranking score for a specific keyword within a specific search engine?  Do they know how position on the page relates to how the engines consider relevance?  Do they understand how page length affects the weighting?  Take this example.  If I have two pages, the second of which contains two exact copies of the content on the first, which is more relevant?  From a search engine’s perspective they are equally relevant, but if a search engine just counted all the words on the second page, it would rank higher.  A fix is needed.

One way that many search engines compensate for page length differences is through something called pivoted document length normalization (write me if you want a further explanation).  How do I know this?  Because I am a search engine professional who spends time every day learning his trade, reading on information architecture, and studying the patents filed by the major search engines to understand how the technology of search can or may be evolving.  Because – since I can’t know exactly what algorithms are currently being used – I run tests on real sites to see the impact of various content elements on ranking.  Because I do competitive analysis on other industry sites to see what legitimate, white-hat techniques they have used and what content they have created (e.g. videos on a YouTube channel that then point to their main site) to signal the relevance of their content to the search engines.
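For those who do want the further explanation, here is the gist of pivoted length normalization (after the scheme described by Singhal, Buckley, and Mitra in 1996) in a few lines of Python.  The slope and pivot values are illustrative; real engines tune their own:

```python
# Pivoted document length normalization: divide a term's weight by a factor
# that interpolates between a fixed pivot and the document's actual length.
# Slope and pivot values below are illustrative, not any engine's settings.

def pivoted_norm(doc_length, pivot, slope=0.25):
    return (1.0 - slope) * pivot + slope * doc_length

def normalized_tf(term_count, doc_length, avg_doc_length, slope=0.25):
    return term_count / pivoted_norm(doc_length, avg_doc_length, slope)

# The duplicated-page example above: doubling the content doubles the raw term
# count (a 2x advantage under naive counting), but normalization shrinks that
# edge -- and pure length normalization (slope=1.0) erases it entirely.
avg_len = 500
print(normalized_tf(10, 500, avg_len))              # original page: 0.02
print(normalized_tf(20, 1000, avg_len))             # doubled page: 0.032, not 2x
print(normalized_tf(20, 1000, avg_len, slope=1.0))  # 0.02 -- equal again
```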

And to Derek’s point, what happens when the algorithms change?  Who is there watching the landscape for any change, like a scout in a hunting party looking for the herd of buffalo?  Who can help interpret the change and provide guidance on how to adapt content to maintain the best signals of relevance for a keyword to the search engines?  Derek makes this sound like an impossible task and a lot of hocus-pocus.  It isn’t, and it’s not.  Professional SEO consultants do this for their clients all the time by providing good maintenance services.  They help their clients’ content remain relevant – and, hopefully, ranking high in the SERPs – in the face of constant change.

So to ask again, do UE designers or product managers understand these issues around content?  At some high level they may (a lot don’t).  Do web developers? Maybe, but most don’t because they don’t deal in content – it is just filler that the code has to deal with (it could be lorem ipsum for their purposes).  Do any of these folks in their day-to-day struggles to do their jobs under tight time constraints have the time to spend, as I do, learning and understanding these subtleties or running tests? Absolutely, positively not.  They need an SEO professional to counsel them so that they make the right design, content and development choices.

I’ll stop here.  I pray I’ve made my point calmly and with a reasoned argument.  Please let me know.  I’m not Danny Sullivan, Vanessa Fox, Rand Fishkin, or Stephan Spencer, to name a few of our industry’s leading lights.  I’m just a humble SEO professional who adores his job and wants to help his clients rank well with their relevant business information.  My clients seem to like me and respect what I do, and that gives me an incredible amount of satisfaction and joy. 

I’m sorry, Derek – I respect your viewpoint and I know that you truly believe what you are saying.  But as an honest, hard-working SEO professional, I couldn’t disagree with you more.


Do Americans Want Advertising Even if Tailored?

There is one frustration about blogging.  Just when I get rigorous and religious about posting on a planned publication schedule, along comes some piece of news about something I wrote that throws off the production line and causes me to interrupt the previously scheduled program and cover it.  Don’t you just love the real-time web?  Yes, in fact, you do.  So I have to be flexible and learn to balance thought with immediacy, like a one-legged man standing on a log rolling down a set of rapids.

So what happened?  Yesterday I published an article about the fact that we should be using the term tailored versus targeted advertising, and how our research showed that consumers would accept any number of emails as long as they were tailored to their specific concerns.  Today, the Berkeley Center for Law & Technology at UC Berkeley School of Law (Berkeley Law) and the Annenberg School for Communication at the University of Pennsylvania published a study reporting that, contrary to my research, most adult Americans (66%) do not want online advertisements tailored (yes, they used the right term) by marketers to their specific interests.  Moreover, when Americans are informed of three common ways that marketers gather data about people in order to tailor ads, between 73% and 86% say they would not want such advertising.

The study is called "Americans Reject Tailored Advertising and Three Activities that Enable It."  What I find most surprising is that the study reports that even 18-24-year-olds feel this way (55%) – lower than the average, but still higher than I would have thought.  86% of young adults, the study goes on to say, don’t want tailored advertising if it is the result of following their online behavior, and 90% reject it if it results from following their activities offline.

I went through the study in some detail and, so you know, I am very comfortable with survey methodologies and dealing with the validity of statistical results.  As part of this, I am always looking for holes in the methodology that could indicate bias.  The Berkeley/U Penn study had a better methodology than two prior surveys on the subject – one conducted in 2008 by Consumers Union, and one conducted in 2008 by TRUSTe and repeated in 2009 by the Privacy Consulting Group.  So I think that, as a general matter, this survey is a better indicator of attitudes than anything done previously.  Having said that, there are two issues I have with the study:

  1. Nowhere in the study were consumers asked whether or not they would put up with tailored advertising if it is what pays for the information they get for free on the web.  This is a critical issue, because at the end of the day, consumers know that they are making this tradeoff – otherwise they would have fled sites with advertising in droves.
     
  2. Asking people about what they would do is always dangerous.  There is a famous study by Oral-B (I think) where they put consumers into focus groups and asked them what color of toothbrush they liked most.  Universally, the two top colors reported were red and blue.  The participants were thanked for their participation and told that they could take several Oral-B toothbrushes free from baskets as they walked out the door (and yes, the baskets had a random mix of all colors).  What was chosen most frequently?  Amber.  So you see, what people tell you and how they act can be very different.

I would like to see a follow-on study that actually tests behavior, not responses to questions, especially in light of objection #1.  Having said that, I think the study is definitely on to something and we as an industry need to pay attention before concern turns into anger and the public vocally demands tighter privacy laws from their legislators.

The survey also uncovered other attitudes we need to be concerned about, and I have not seen them reported – so I quote them here for your convenience: 

  • Even when they are told that the act of following them on websites will take place anonymously, Americans’ aversion to it remains: 68% “definitely” would not allow it, and 19% would “probably” not allow it.
     
  • A majority of Americans also does not want discounts or news fashioned specifically for them, though the percentages are smaller than the proportion rejecting ads.
     
  • 69% of American adults feel there should be a law that gives people the right to know everything that a website knows about them.
     
  • 92% agree there should be a law that requires “websites and advertising companies to delete all stored information about an individual, if requested to do so.”
     
  • 63% believe advertisers should be required by law to immediately delete information about their internet activity.
     
  • Americans mistakenly believe that current government laws restrict companies from selling wide-ranging data about them. When asked true-false questions about companies’ rights to share and sell information about their activities online and off, respondents on average answer only 1.5 of 5 online laws and 1.7 of the 4 offline laws correctly because they falsely assume government regulations prohibit the sale of data.
     
  • Signaling frustration over privacy issues, Americans are inclined toward strict punishment of information offenders. 70% suggest that a company should be fined more than the maximum amount suggested ($2,500) “if a company purchases or uses someone’s information illegally.”
     
  • When asked to choose what, if anything, should be a company’s single punishment beyond fines if it “uses a person’s information illegally,” 38% of Americans answer that the company should “fund efforts to help people protect privacy.”  But over half of American adults are far tougher: 18% choose that the company should “be put out of business” and 35% select that “executives who are responsible should face jail time.”

Well, at least I guess we can put to bed the question of tailored versus targeted (although the survey did interchange them regularly).  See how good I am as an evangelist?


Own Your Online Brand – knowem.com

More from SMX East…. 

This is a review of a service critical to owning your brand online called knowem.  For $65, knowem.com opens accounts under your chosen brand name(s) on 120 social media sites so that you lock down your brand identity on a majority of key venues on the web.  In today’s social media-intense web, owning your username at as many social media sites as possible is essential for brand and reputation management, as you will see.

A True Story of Online Branding and Social Media

I had an interesting lesson in personal branding a while back.  In 1996, in the early days of the Internet, I was at Sun Microsystems as the head of the Java ecommerce group (which, btw, produced the first electronic wallet and a Java virtual machine called Java Card that today underlies cell phones worldwide).  I needed a login for our intranet, and since my undergraduate specialty was Anglo-Saxon and medieval literature, I chose a knight-like title (which I will not publish here for security reasons).  For my mnemonic convenience, that login migrated to numerous sites over the years as the Internet grew.

For whatever reason, one day I decided to look at how many entries I had for my personal identity online.  When I typed in ‘Arthur Coleman’ I had/have a few entries in the top 10, but there was/is an Arthur Coleman who is a photographer (and owns arthurcoleman.com), an Arthur Coleman Danto who is a writer (so lots of content plus a Wikipedia entry), a lawyer named Arthur Coleman, as well as a wanted criminal on ‘Crime Stoppers’ (many people confuse us….).  But when I typed in ‘arthurofsun’ there were 3,110 entries, all belonging to me, mainly due to my numerous entries on social media sites over the years.

This was a wakeup call (you may notice I have lots of those – it’s a prerogative I guard zealously).  What I learned was that my online "avatar" was my "true" brand online – not my name.  Since my brand was accidental, had nothing to do with my business, and did not match my site/business name, I was going to have to consciously "rebrand" myself and my web presence(s) for the sake of brand consolidation.  My personal brand had to be chosen carefully, because I was only going to change it once.  As with any brand, changing it is painful: it causes confusion in the marketplace, as well as a short-term loss of awareness as people have to unlearn the old name and relearn the new.  Online, there is also a substantial temporary loss of PageRank, and thus rank in the SERPs for your favorite keywords, as sites are ported from an old URL to a new one.  This is especially true for me because my old site (www.rethought.net) had been in existence since 2001, which gave it a substantial amount of authority that I was going to lose by moving to a younger (new) site.

I also realized – and I was probably one of the earliest to get this – that just as it is essential to own the URL for your brand name, it is equally essential to own any account online with your brand, especially on the social media sites.  Why? 

  • Because those entries will all show up under a search for your username online, so you want to consolidate its power and own the entire first page for reputation management purposes. 
     
  • Imagine some crank putting up an angry rant against your company.  While you may not prevent him from getting on the first page of the SERPs without further work by your team, owning all the slots around your brand name (like I did for my username) makes it more difficult for any third party to rank on page one. 
     
  • Or imagine someone with an account ID that is your brand name making a post about something unsavory.  If the social media site has enough authority, the entry may well rank high on the first page.  Even indirectly it can create a negative brand perception if someone assumes that the blogger is you (or someone in your organization) because the user name is an exact match for your brand name.
     
  • I don’t quite understand the reason yet, but it appears to me that authority accrues to the username itself if it is used frequently and in multiple social media contexts to make posts or add comments.  So the more usernames/sites you control and use, the more authority you can create for your brand term  to leverage for other keywords.

Where is My %&@^!! Signup Service – Introducing knowem

So now we need to sign up for every potential social media service online – right?  Great.  What are the steps?

  1. I have to figure out what sites there are, and, at the same time, which I think are really important. 
     
  2. I then go to each site, sign up with a username and – if it is by any chance taken – find an alternate brand-related username that I will use in those cases on all sites.
     
  3. I then receive a confirmation email and must respond to activate the account.
     
  4. I insert my picture (or brand graphic).
     
  5. I fill in a variety of personal information.
     
  6. I then need to create links between the various services to simplify login or to allow for single entry into multiple sites.

Sound time consuming?  It is.  When I did it, it took me about two weeks to get all the sites I had identified (which was only a moderate subset of the total, it turns out) signed up and linked together correctly.  I had to do it between other work, the entry was repetitive, and I didn’t always remember what I had done on each site, so I often had to go back to verify or fix something I’d done wrong.  It was a boring, low-value task that I tried, in vain, to get my son (13 years old, and also known affectionately as "slave boy") to do.  But it was even too menial for him to touch at $10/hour.  (What is it with kids nowadays anyway?  You’d think they’d want a childhood or something.)

So when I heard about knowem at SMX East yesterday I jumped online and even before the session was over, had signed up for the premium service to go lock down all the other sites relevant to my online brand I had previously missed. knowem checks the availability of your brand name, user name or vanity URL on 120 popular social media websites.  Their tag line is "thwart social media identity theft", which says a lot about their view of how important a username is to branding online.

At knowem, you enter a URL and the service returns a list of which sites have your brand name available. 

As I looked at the list, it indicated availability for some sites where I knew I had already created accounts.  It also indicated my username (onlinematters) wasn’t available on some sites where, in fact, it was.   So something about their account retrieval algorithm still isn’t perfect.  But I imagined they would figure this out once they attempted to create the accounts — which, in fact, they did.
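My guess – and it is only a guess – is that their checker does something like the sketch below: request the would-be profile URL and treat a 404 as "available."  The URL patterns are hypothetical, and this is exactly the kind of heuristic that misfires (some sites return 200 for reserved names, others redirect), which would explain the errors I saw:

```python
# Naive username-availability probe of the sort knowem presumably automates.
# Profile URL patterns are hypothetical; real sites vary and often break this
# heuristic by returning 200 or a redirect for names that are actually taken.
from urllib.error import HTTPError
from urllib.request import urlopen

PROFILE_URL_PATTERNS = {
    "twitter": "https://twitter.com/{name}",
    "delicious": "https://delicious.com/{name}",
}

def seems_available(site, name):
    try:
        urlopen(PROFILE_URL_PATTERNS[site].format(name=name), timeout=10)
        return False               # the profile page exists: name likely taken
    except HTTPError as error:
        return error.code == 404   # a 404 suggests the name is free

print(seems_available("twitter", "onlinematters"))
```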

I also noted some limits:

  1. Even this extensive list didn’t cover all the sites I had found and where I had created accounts.  Notable misses are: LinkedIn, Flickr, Facebook, and Wikipedia (huge misses, obviously), ping.fm, Plaxo Pulse, Bebo, Hi5, Mashable, Friendster, getsatisfaction, Goodreads, HubPages, HubSpot, imeem, Jabber, Joost, Ning, Orkut, Pandora, Present.ly, Reunion.com, ShareThis, ShoutEm, SmugMug, ShoutBack, Spurl, Streetmavens, Tagworld, Tickle, TypePad, UrbanSpoon, Utterli, Vimeo, Yammer, Yelp, Zimbio.
     
  2. Email accounts (e.g. brand@live.com) and Skype were not included.  Skype especially is important.
     
  3. No coverage of Twitter-related accounts/add-ons.  An example is Tweetmeme, which has a strong impact on rankings (see my previous post "Social Media Channels – You CAN Own the First Page").

Still, I was happy that they had identified a goodly number of sites I hadn’t identified or tested for impact for my online brand, so I decided to sign up for the service.  This costs $65. A screen pops up and asks you for all the basic information the service needs to establish an account for you.

 

[Screenshot: the knowem signup page]

 

Once you complete your purchase, knowem promises to begin signing you up for services, using the information you provided, within 48 hours.  In my case it was more like 12.

Still, even with a service like knowem, there is no free lunch.  Once knowem starts its process, you will receive numerous emails (as many as 120) to confirm your signups – which knowem cannot do for you.  Likewise, knowem does not upload your visual avatar and cannot fill in the fields that each account requires after confirmation.  So you still have substantial work to do.  But with knowem:

  1. Your coverage/control of your online brand in the current universe of social media sites is extensive.
     
  2. You don’t have to go site by site to sign up, which is incredibly repetitive.
     
  3. Often the confirmation email takes you directly to the right page to fill in your information – just this element is a substantial time saver over having to remember the site name and type in the URL.

Epilogue

As an aside, at the time I had no idea just how awful this process would be.  Let me tell you that doing a rebrand online is 10x worse than doing it in traditional venues.  I am still having problems with Twitter, for example, where my new account (onlinematters) isn’t showing up in people search even though it has more followers than any of my accounts save my previously established one – so people aren’t linking to it.  (PLEASE DO, if you have the time.)

This is one reason I am so happy with knowem, despite its limitations.  There are enough problems that emerge in even the smoothest brand transition that I will sign up for any service that can, at a reasonable cost, provide organization to a portion of my process and save me time.


Targeted Advertising: Tailoring Paid Search

I just got back from SMX East, and will have lots to write in the next few days.  One thing I’ve noticed: Danny Sullivan and his team can write up entire articles on the same day that a piece of news comes out.  Of course, that is their job, but here is Danny presenting, moderating, and hosting SMX East and finding time to write articles in between.  It’s a skill I have yet to develop.  I need to concentrate to write and can’t do it in the middle of sessions or hectic meetings (although I can do it just fine at Starbucks).

So while I’m a bit behind the folks at SearchEngineLand, I think you will still find the information useful.  My first entry is a quick discussion of the term "targeted advertising."  In the "BigWig Crystal Ball Panel," Sara Holoubek, a well-known interactive technology consultant and speaker, objected to the term.  "It’s a terrible term from a branding perspective," she said.  "Who wants to be a target?"

She recommended, instead, the term "tailored advertising."  "I don’t care if you advertise to me,"  she continued in her chat, "as long as it is relevant to my concerns."  Her acceptance of advertising tailored to her interests corresponds to the results of market research we did on email advertising a few years ago.  In that research, consumers indicated they would accept an unlimited number of marketing emails as long as the content and offers were immediately relevant to their current needs. (From a practical perspective, "immediately relevant" is the difficult term because what is immediately relevant can change within minutes.)

Her comment was a real "aha!" moment for me – and I told her so afterwards.  This needed change in terminology is so obvious that, as a brand-savvy marketer, it should have screamed "bad positioning" to me from the get-go.  But then, just as brilliant engineering designs are only obvious after a gifted engineer visualizes the solution, so brilliant marketing ideas are only obvious after a smart person has spoken them at some important conference.

So I want to both thank Sara for waking me from my stupor on this subject and support her, starting today, by evangelizing the change in terminology from targeted advertising to tailored advertising.

Now that that’s settled, on to changing "public relations" to "perception manipulation"…no, no, I mean "image enhancement" (only kidding).
