About Online Matters

Archive for the ‘Search Engine Optimization’ Category

Dilbert’s Take on SEO

While we’re on the subject of SEO humor, I thought I’d throw in this wonderful piece from Dilbert. My guess is that a lot of folks looking for SEO help would be willing to sacrifice an ox (as in the Biblical stories) to get a first position in the SERPs on a keyword core to their business. But, of course, that is just a guess. No one really searches on “sacrifice an ox for SEO”.  And a search on the term brings back a very weird link from the University of Oregon in first position and The Catholic Encyclopedia in second.  I doubt either of those organizations is into ritual sacrifice for any reason.  Then again, I don’t get out as much as I used to.

(Embedded Dilbert comic strip, via Dilbert.com)


Funniest SEO Keywords I’d Love to Optimize

As I’ve mentioned before, I’m a lover of words.  So I take a short break from the serious work of online marketing to enjoy the beauty of our language and contemplate how single letters can not only change meaning but add humor.  After all, in SEO words are our business.

It seems The Washington Post runs a regular feature called The Style Invitational in which it invites readers to change a single letter in words to create a non-existent new word and come up with a definition for the new version.  It also has a variant of the game where the reader is asked to give a humorous definition for an existing word.  They both produce hilarious results, but I’m going to focus on the former game.  Here’s an example:

ignoranus: An individual who is both stupid and an asshole
(Pardon the offensive language. I’m quoting it.)

These will have you doubled over in laughter, but since we are in the SEO game, I thought it would be fun to look up the exact match volumes in the Google AdWords Keyword Tool.  It turns out (surprise, surprise!) that people actually search on these new terms.  So for all of the SEO experts in the room, you now have data to justify creating pages, content, and tags for these words. The words are shown in the table below. By the way, I am in stitches that “bozone” and “karmageddon” are the two most-searched-for terms. Like, wow. I mean, like, really? Dude, it’s as if some karmic word God has touched all humans with the ability to recognize the truly funny. Enjoy!

Word | Definition | Exact Match Volume
Ignoranus (n.) | An individual who is both stupid and an asshole | 480
Cashtration (n.) | The act of buying a house, which renders the subject financially impotent for an indefinite period of time | 91
Intaxication (n.) | Euphoria at getting a tax refund, which lasts until you realize it was your money to start with | 110
Reintarnation (n.) | Coming back to life as a hillbilly | 210
Bozone (n.) | The substance surrounding stupid people that stops bright ideas from penetrating. The bozone layer, unfortunately, shows little sign of breaking down in the near future | 1,900
Foreploy (n.) | Any misrepresentation of yourself for the purpose of getting laid | 46
Giraffiti (n.) | Vandalism spray-painted very, very high | 390
Sarchasm (n.) | The gulf between the author of sarcastic wit and the person who doesn’t get it | 1,000
Inoculatte (n.) | To take coffee intravenously when you are running late | 36
Osteopornosis (n.) | A degenerate disease | 46
Karmageddon (n.) | It’s like, when everybody is sending off all these really bad vibes, right? And then, like, the Earth explodes and it’s like, a serious bummer | 1,600
Decafalon (n.) | The grueling event of getting through the day consuming only things that are good for you | 210
Glibido (n.) | All talk and no action | 58
Dopeler Effect (n.) | The tendency of stupid ideas to seem smarter when they come at you rapidly | 63
Arachnoleptic Fit (n.) | The frantic dance performed just after you’ve accidentally walked through a spider web | 91
Beelzebug (n.) | Satan in the form of a mosquito that gets into your bathroom at 3 in the morning and cannot be cast out | 170
Caterpallor (n.) | The color you turn after finding half a worm in the fruit you’re eating | 16

Highlights from SMX West 2010 – Part 1

I have been away from the blog for waaay too long. Not a good thing. As someone asked yesterday at the “Ask the SEOs” session at SMX West: “Is it more important to blog often, to blog long articles less frequently, or to optimize your posts for SEO in order to rank well in the SERPs for the keywords you care about?”  The answer from the panel of luminaries – Greg Boser, Bruce Clay, Rand Fishkin (substituting for Rae Hoffman), Todd Friesen, Vanessa Fox, Aaron Wall, and Jill Whalen – varied depending on who responded.  But to me, if I had to choose only one, it would be blog often: that gets you crawled more frequently, broadens your keyword base, and generates more followers, because people tend to subscribe to folks who produce regular, fresh content.  So I’ve broken my own rule.

Well, let’s see if I can change that with a quick post today.  To help those who couldn’t be at SMX West these last three days, I will list some tidbits here that I found useful.   These will go all over the place, so bear with me. This is Part 1 – I will cover more of the highlights in subsequent posts.

Also, before I get started, let me say a big “thanks” to the folks at SearchEngineLand for putting on another great and informative event. It was fun and you never fail to educate me on things I hadn’t discovered for myself. I’ll see you, of course, at SMX Advanced in June.

Insights for Mobile Search

  1. Isolate mobile into its own unique campaign.
    • KPIs are lower than in traditional paid search – so isolate it to protect your other programs.
    • Isolating helps you optimize to raise KPIs of mobile specifically.
    • Adwords lets you segment your campaigns by carrier and site you are on. Cool!
  2. In mobile search there are only 5 results above the fold, and you need to be in position 1 or 2 or you might as well not be there at all – there is a HUGE falloff after those two positions, with literally near-zero clicks.
  3. Mobile query lengths are shorter – no big surprise there.  Query length example: in a study of 414 broad match queries on mobile, the average query length was 2.8 words and the longest phrase was 8 words (used 3 times).  So it is very important to run broad match on mobile campaigns.
  4. Length of query versus Click Through Rate is pretty similar between mobile and desktop search.
  5. Most popular categories for mobile right now are sports, celebrity, news, wallpapers, videos, and ringtones.
  6. Other Best Practices for Mobile
    • Reinforce mobile friendliness in ads – “mobile optimized”, “4 UR phone”, “mobile ready” – this indicates to the viewer that the site will work on their phone with the best performance.
    • Have a URL that indicates mobile friendliness – it’s another signal to the consumer that you’ve made the experience usable on mobile devices.
    • Build landing pages specifically for mobile, and test them separately (see the minimal detection sketch after this list).
    • Simplify the experience for mobile – you have to really get the content down to an essence – we saw an example of Home Depot versus Best Buy. Home Depot’s cluttered page really made the experience unusable.
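
To make the “separate mobile URL / mobile landing page” idea from the list above a bit more concrete, here is a minimal sketch of server-side user-agent detection in Python. The m.example.com host, the keyword list, and the function name are my own illustrative assumptions, not a recommendation of a particular detection approach.

```python
# Minimal sketch: route obviously-mobile user agents to a dedicated mobile URL.
# The hostname and keyword list are illustrative assumptions; a maintained
# device-detection library will do a far better job than a hand-rolled list.
MOBILE_HINTS = ("iphone", "ipod", "android", "blackberry",
                "windows phone", "opera mini", "symbian")

def mobile_landing_url(user_agent, path):
    """Return the mobile URL to redirect to, or None to stay on the desktop site."""
    ua = (user_agent or "").lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return "http://m.example.com" + path   # hypothetical mobile host
    return None

# Example: an iPhone visitor landing on a campaign page
print(mobile_landing_url(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 3_0 like Mac OS X)", "/deals"))
# -> http://m.example.com/deals
```

The principle is the same regardless of how detection is done: recognize the device and send it to the simplified experience you built for it.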

New Tools to Explore

Of course I’m going to have a section on this, toolhound that I am.  Just a list for you to explore:

  • usertesting.com. Quick testing of your user experience for landing pages.  Cool idea: run this on your competitors’ sites as well – $29/user.  I’ve used this, btw – it has drawbacks, but for testing landing pages it is very appropriate.
  • crossbrowsertesting.com. See how your page looks in various browsers easily.
  • attentionwizard.com. A free tool that shows where people are looking on your page.
  • clicktale.com.  Records sessions and gives feedback/analytics on experience.  Free for one domain and 2-page playback.
  • crazyegg.com. Simple and affordable heat mapping tools that allow you to visually understand user behavior.
  • Lynx or seobrowser.com. Tools that allow you to see your pages the way the search engine crawlers see them (a quick DIY approximation appears after this list).
  • Charles. An HTTP proxy / HTTP monitor / reverse proxy that enables a developer to view all of the HTTP and SSL/HTTPS traffic between their machine and the Internet.
  • Wave toolbar. Provides a mechanism for running WAVE reports directly within Firefox for debugging site issues.
  • GSiteCrawler. Another free sitemap generator for Google, Yahoo, and Bing.
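
If you just want a quick do-it-yourself approximation of what Lynx or seobrowser.com shows, the sketch below fetches a page and boils it down to the visible text and link targets a crawler would care about. It assumes the third-party requests and beautifulsoup4 packages and uses a placeholder URL; it is a rough stand-in, not a replacement for the real tools.

```python
# Rough "crawler's-eye view": fetch the raw HTML, drop scripts and styles,
# and keep only the visible text plus link targets. Assumes `requests` and
# `beautifulsoup4` are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def crawler_view(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):      # crawlers index content, not code
        tag.decompose()
    text = " ".join(soup.get_text(separator=" ").split())
    links = [a["href"] for a in soup.find_all("a", href=True)]
    return text, links

if __name__ == "__main__":
    text, links = crawler_view("http://www.example.com/")
    print(text[:300])
    print(links[:10])
```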

That’s it for today. Back to the day job.


Technical SEO: Site Loading Times and SEO Rankings Part 2

In my last post, I discussed the underlying issues regarding site loading times and SEO rankings.  What I tried to do was help the reader understand why site loading times are important from the perspective of someone designing a search engine that has to crawl billions of pages.  The post also outlined a few of the structures they would have to put in place to accurately and effectively crawl all the pages they need in a limited time with limited processing power.  I also tried to show that a search engine like Google has a political and economic agenda in ensuring fast sites, not just a technical one.  Google wants as many people/eyeballs on the web as possible, so it is to its advantage to ensure that web sites provide a good user experience.  As a result, it feels quite justified in penalizing sites that do not have good speed/performance characteristics.

As you would expect, the conclusion is that if your site is hugely slow you will not get indexed and will not rank in the SERPs.  What is “hugely slow”?  Google has indicated that slow is a relative notion, determined by the loading times typical of sites in your geographical region.  Having said that, relative or not, from an SEO perspective I wouldn’t want a site where pages take more than 10 seconds on average to load.  We have found from the sites we have tested and built that average load times above roughly 10 seconds per page have a significant impact on getting indexed.  From a UE perspective, there is some interesting data suggesting that the limit on visitors’ patience is about 6-8 seconds.  Google has studied this data, so it would probably prefer to set its threshold in that region.  But I doubt it can.  Many small sites are not that sophisticated, do not know these kinds of rules, and do not know how to check or evaluate their site loading times.  Beyond that, there are often problems with hosts that cause servers to run slowly at times, and Google has to take that into account as well.  So I believe the timeout has to be substantially higher than 6-8 seconds; 10 seconds as a crawl limit is my guess.
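
If you have no idea where your own pages fall relative to that rough 10-second guess, a few lines of Python will give you a ballpark number. This is a minimal sketch, assuming the requests package and a hand-picked sample of URLs (the ones below are placeholders); it measures full download time from a single location, which is not how Google measures anything, but it will tell you whether you are anywhere near the danger zone.

```python
# Ballpark check of average page download time. Assumes `requests` is
# installed and that the URL list is a representative sample of the site.
import time
import requests

URLS = [
    "http://www.example.com/",          # placeholder URLs for illustration
    "http://www.example.com/about/",
    "http://www.example.com/blog/",
]

def timed_fetch(url):
    start = time.perf_counter()
    requests.get(url, timeout=30)       # download the full response body
    return time.perf_counter() - start

times = [timed_fetch(u) for u in URLS]
average = sum(times) / len(times)
print("average load time: %.2f s" % average)
if average > 10:
    print("warning: well beyond the ~10 s rule of thumb discussed above")
```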

I have yet to see a definitive statement from anyone as to what the absolute limit on site speed is before indexing ceases altogether (if you have a reference, please post it in the comments).  I’m sure that if a bot comes to your first page and that page exceeds the bot’s timeout threshold, your site won’t get spidered at all.  But once the bot gets past the first page, it has to do an ongoing computation of the site’s average page loading time to determine whether that average exceeds the built-in threshold, so at least a few pages would have to be crawled in that case.
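
Nobody outside the search engines knows the real thresholds or formulas, but the reasoning above can be made concrete. The sketch below is purely an illustration with made-up numbers: a per-page timeout that would stop a crawl at the very first slow page, plus a running-average cap that cuts a crawl short once the site as a whole proves too slow. None of this is Google’s actual algorithm.

```python
# Purely illustrative model of the crawl-abandonment logic described above.
# The 10-second per-page timeout and 6-second running-average cap are
# invented numbers, not anything the search engines have published.
PER_PAGE_TIMEOUT = 10.0   # give up on any single page after this long
AVG_THRESHOLD = 6.0       # give up on the site once its running average exceeds this

def simulate_crawl(page_load_times):
    crawled, total = 0, 0.0
    for seconds in page_load_times:
        if seconds > PER_PAGE_TIMEOUT:
            break                 # a page times out: stop crawling here
        crawled += 1
        total += seconds
        if total / crawled > AVG_THRESHOLD:
            break                 # site average too slow: remaining pages go unindexed
    return crawled

# A site whose pages keep getting slower: only the first four get crawled.
print(simulate_crawl([3.0, 5.0, 9.5, 9.8, 9.9, 9.9]))   # -> 4
```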

Now here’s where it gets interesting.  What happens between “fast” (say, loading times under 1-2 seconds, which is actually fairly slow but a number Matt Cutts, in the video below, indicates is OK) and the timeout limit?  And how important is site speed as a ranking signal?  Let’s answer one question at a time.

When a site is slow but not slow enough to hit any built-in timeout limits (the ones not tied to the number of pages), a couple of things can happen.  We do know that Google allocates bot time based on the number of pages on the site and the number of pages it has to index or re-index.  So for a small site that performs poorly, it is likely that most of the pages will get indexed.  Likely, but not guaranteed: it all depends on how much cumulative time lag the site builds up against the average.  If a site is large, you can almost guarantee that some pages will not be indexed, as the cumulative lag will eventually hit the threshold the bots set for a site with that number of pages.  Pages that never get indexed cannot rank, so you will not get the benefit of that content in your rankings.

As an aside, there has been a lot of confusion around the <meta name="revisit-after"> tag.  The revisit-after meta tag takes this form: <meta name="revisit-after" content="5 days">.  It supposedly tells the bots how often to come back and reindex the specific page (in this case, every 5 days).  The idea is that you can improve the crawlability of your site by telling the bots not to index certain pages all the time, but only some of the time.  I became aware of this tag at SMX East, when one of the “authorities” on SEO mentioned it as usable for this purpose.  The trouble is that, from everything I have read, the tag is completely unsupported by any of the major engines, and was only ever supported by one tiny search engine (SearchBC) many years ago.

But let’s say you are one of the lucky sites where the site runs slowly but all the pages do get indexed.  Do Google or any of the other major search engines use the site’s performance as a ranking signal?  In other words: all my pages are in the index, so you would expect them to be ranked based on the quality of their content and the authority derived from inbound links, site visits, time-on-site, and other typical ranking signals.  Performance, you might assume, is not a candidate for a ranking signal and isn’t important.

If you thought that, then you were wrong. Historically Google has said, and Matt Cutts reiterates in the video below, that site load times do not influence search rankings.  But while that may be true now, it may not be in the near future.  And this is where Maile’s comments took me by surprise.  In a small group session at SMX East 2009, Maile Ohye of Google was asked about site performance and rankings.  She indicated that for the “middle ground” sites that are indexing but loading slowly, site performance may already be used to influence rankings.  Who is right, I can’t say; these are both highly respected professionals who choose their words carefully.

(Embedded video: Matt Cutts of Google discussing site speed and rankings.)

Whatever is true, Google is sending us signals that this change is coming.  Senior experts like Matt and Maile don’t say these things lightly; they are well-considered and probably approved positions that they are asked to take.  This is Google’s way of preventing us from getting mad when the change occurs: Google has the fallback of saying “we warned you this could happen.”  Which, from today’s viewpoint, means it will happen.

Conclusion: Start working on your site performance now, as it will be important for SEO rankings later. 

Oh and, by the way, your user experience will just happen to be better, which is clearly the real reason to fix site performance. 

And it isn’t only Google that may make this change.  Engineers from Yahoo! recently filed a patent with the title “Web Document User Experience Characterization Methods and Systems” which bears on this topic.  Let me quote paragraph 21:

With so many websites and web pages being available and with varying hardware and software configurations, it may be beneficial to identify which web documents may lead to a desired user experience and which may not lead to a desired user experience. By way of example but not limitation, in certain situations it may be beneficial to determine (e.g., classify, rank, characterize) which web documents may not meet performance or other user experience expectations if selected by the user. Such performance may, for example, be affected by server, network, client, file, and/or like processes and/or the software, firmware, and/or hardware resources associated therewith. Once web documents are identified in this manner the resulting user experience information may, for example, be considered when generating the search results.

It does not appear that Yahoo! has implemented any aspect of this patent yet, and who knows what the Bing agreement will mean for site performance and search.  But clearly this is a “problem” that the search engine muftis have set their sights on, and I would expect that if Google does implement it, others will follow.


Social Media Strategy and Search Engine Optimization – You CAN Own the First Page

I have been talking in my last few entries about the power of social media channel architectures for search engine optimization and brand building.  It’s always hard to understand these approaches, or their true efficacy for SEO, in the abstract.  Is this just a nice idea or does it really work?  In a land where data talks and you-know-what walks, do you have data to support the ROI of creating a social media channel architecture?  Can you really move the needle on the SERPs and, if so, by how much?

There are people in SEO far better known than I am (Adam Audette of Zappos, for one, and Jordon Kasteler of Search and Social, for another) who, I didn’t realize, were mining the same path and have reported the results of their efforts.  But I can now also provide substantiation of my techniques, which are especially strong for “easy to rank” and “moderately difficult to rank” keywords and which I am about to test on painfully competitive keywords.

Shown below are two results of my social media channel strategy.  Both are from posts on this blog, which runs on an SEO-optimized version of WordPress.  The first was a post published on September 26 about .htaccess files; the search was done on October 5 while I was at SMX East.  I was writing a primer on htaccess grammars and wanted to rank on that term so that new SEOs (and maybe experienced ones) would find me.  Traffic isn’t the issue with this post; reputation management and brand building are.  It is part of my strategy of trying to become a top SEO in the industry (for those who don’t know me, I am a fierce competitor and don’t know how to set small goals).  So it was OK that it was a low-traffic term.

In this case, I was able to rank in 9 of the top 10 positions after one week: basically, my entry owns the first page.  This happens because I don’t depend on my blog alone to rank.  Bill Joy of Sun, whom I had the pleasure to work with while I was on the Java team, used to say, “95% of the smart people in your industry don’t work for you, so you need to find a way to leverage their intelligence for your success.”  That is Joy’s Law, and it is one of the conceptual underpinnings used to market Java in its early days.  Similarly, in SEO, my social media strategy leverages the intelligence and resources of bigger organizations to maximize my content’s impact for search engine optimization purposes.

Figure 1
Rankings Using Social Media Channel Architecture for the Keyword “htaccess grammar”

While my blog ranks first and second in the SERPs, as well as ninth and tenth, the other entries (3, 4, 6, and 8) come from what I call the social media amplifier, as my entries flow through FriendFeed, youare, Twitter, and identi.ca.

Very low volume, very low competition keyword.  OK.  Here are the results for a search on “social media channels”, which is higher volume (73 searches, but still low).  Note that the blog holds positions 1, 2, and 5, but the higher positions are due to the SEO power and flow-through from Tweetmeme.  The post was published on September 8 and, sad to say, I don’t have the search date, although I believe it was around September 15.  (You will note I failed to capture the top of the search page, but I hope you will trust my reporting this one time.)

Figure 2
Rankings Using Social Media Channel Architecture for the Keyword “social media channel”


Also, if you search today for “social media channel architectures” (the keyword I optimized for), you will find the entry in positions 1, 2, 3, 4, and 5, thanks to the leverage of Tweetmeme and FriendFeed.

Realize this is before the creation of any special content specifically for social media sites, such as an article on HubPages or Knol, or a presentation on SlideShare.  What would happen to that content as it runs through the amplifier still needs to be tested.

My next experiment will be to try this approach on really tough keywords.  We’ll see then just how much effort it takes to rank there. 
