Archive for March, 2010
Well, it’s Monday. A good Monday with some interesting insights.
I will continue with tool review going forward, but I’m finding that I need to document our work on our website performance as we go along or else we lose the data from the intermediate steps, and there have already been several that have been implemented. So let me bring you up to speed.
After my last post about the site and reviewing the data from the Google Site Performance tab in Google Webmaster Tools, I was able to visualize (see the image) what was going on. As the image shows, performance jumped around substantially from mid-September, when I started the blog, until early-to-mid December. These jumps did not coincide in any major way with the debugging and latency improvements that I had been working on, except in December, around the time of my last post. That change seemed to have cut my latency in half, which was what Pingdom had shown. So perhaps I was moving in the right direction.
Things continued to improve steadily through January, even though I had not changed any further settings. This again suggested that being hosted on a shared server, whose performance my ISP had perhaps improved, might be the reason for the unpredictable performance changes, good or bad. But then in mid-January, I started to see a jump in latency times again.
At the same time, I wanted to continue debugging AboutOnlineMatters site latency and implement some of the changes recommended by ySlow, such as gzip compression, entity tags, and Expires headers. To do that, I needed direct access to my Apache server. Given these two facts, I decided that it was time to remove the server as a factor and host the blog myself.
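For readers who want a sense of what those changes involve, here is a rough sketch of the kinds of Apache directives in play. This is not my actual configuration, just an illustrative fragment; it assumes mod_deflate, mod_expires, and mod_headers are enabled on the server:

```apache
# Enable gzip compression for text-based responses (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Set far-future Expires headers on static assets (requires mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 week"

# Remove entity tags, which are often misconfigured on shared or
# clustered hosts and defeat caching (requires mod_headers)
FileETag None
Header unset ETag
```

The exact content types and expiry windows are choices you tune per site; the point is simply that these are server-level settings, which is why shared hosting without Apache access was a dead end.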
On February 6, we moved the site onto our own hosted setup. This is basically a dedicated server (we do have a few other small sites running, but they are using insignificant server resources) and I have direct access to all the configuration settings. From that time forward, as the chart shows, site latency has decreased continually until it is now close to its historical lows.
I’ll leave it there for now – following my rule of short posts. We’ll pick up the next steps I took tomorrow.
It is back to the blogstone. And once again, I have broken my own rule about writing long posts infrequently. This one is a continuation of my previous posts on improving web site performance. What especially motivated me to go back to this topic was a request I received from Justified on my site performance posts:
My fellow classmates use your blogs as our reference materials. We look out for more interesting articles from your end about the same topic. Even the future updates about this topic would be of great help.
What a nice compliment. I wouldn’t be a very good marketer if I didn’t respect the wishes of my “customers.” So, I continue the series on web site performance issues and my saga to improve the performance of this blog. Having said that, a number of things have happened since that last post.
First, as noted in a previous post, I had the opportunity to go to SMX West earlier this month. While there, I attended a session titled “Diagnosing Technical SEO Issues”, with Adam Audette, Patrick Bennett, Gabe Gayhart, and Brian Ussery as the panelists. One thing I learned is that the term “site performance” has a general usage different from what I am covering here. Site performance is usually defined as including:
- How easy a site is to crawl.
- Infrastructure issues, including URL structures, template coding, directory structures, and file naming conventions.
- Latency issues such as html redirects, http headers, image compression, and all the other items I have been covering in this series.
The point is, this series of posts is about only one element of site performance: web site latency and response times as seen by Google and other search engines. In the future, I will use this technical term in these posts. I have to decide, for purposes of rankings, whether to change the names of my posts, the URLs, and all the core metadata to reflect this change, or whether I will stay with web site performance as the keyword I want to optimize for. That decision will probably be made based on the keyword search volumes as shown in the Google AdWords Keyword Tool. (Actually, I have now changed the keyword I am optimizing for to web site latency, as I am testing some theories I have on page optimization in the SERPs that have nothing to do with site performance. So it just goes to show…)
Second, as also noted in the third post in this series on web site latency, Google has announced and deployed a new web site performance tool within Google Webmaster Tools, as well as a Firefox/Firebug plugin. So in order to continue to explore the topic of AboutOnlineMatters site latency, I need to cover that tool. But then we get into the whole issue of the core set of site performance tools to use for evaluating site latency issues. We already discussed and showed our results from Pingdom’s latency analysis tool, but there are many more, some of them providing similar analysis and, as I was bemused to discover, often providing differing results for the same items.
So what I’ve decided to do is to provide some discussion of web site latency and performance tools and toolbars before we get back to analyzing AboutOnlineMatters, and then I can show how I used the tools to debug my site latency issues.
Here are the tools I plan to cover, and just so you know, I may cover some or all of them in flash/video, which would be a first for this blog. Although I’m not a big video fan (I can take in more info more quickly by reading), I know many people prefer that format, so I want to try and accommodate them along with my current readers.
|Charles||A desktop application that provides an HTTP proxy / HTTP monitor / reverse proxy that enables a developer to view all of the HTTP and SSL / HTTPS traffic between their machine and the Internet. This includes requests, responses, and the HTTP headers (which contain the cookies and caching information). A great tool for understanding what calls/requests are being made and how they impact web site latency.|
|curl [url]||curl is a downloadable command line tool for transferring data with URL syntax.|
|dynamic drive||Image Optimizer is a web-based service that lets you easily optimize your gifs, animated gifs, jpgs, and pngs, so they load as fast as possible on your site. It provides images in a range of filesizes (for the same image dimensions) by decreasing the DPI of the image. It also easily converts from one image type to another. Upload size limit is 300 kB.|
|Firebug||Firebug is a Firefox plugin that provides a number of tools for developers and technical SEO work, including web site latency and performance analysis. I will cover many of the plugins later, if I get the chance. In the meantime, take a look at this article at WebResourcesDepot to find a good list of useful Firebug plugins.|
|Google Page Speed||Page Speed is an open-source Firefox/Firebug add-on that performs several tests on a site’s web server configuration and front-end code. It provides a comprehensive report and score on issues that can affect web site latency, as well as recommendations for improving site latency. This is how Google sees your web site latency and is the first tool you should run to understand if you have web site performance problems from Google’s perspective, which over time will have a larger impact on your rankings.|
|HttpWatch||HttpWatch is a desktop (downloadable) HTTP viewer and debugger that integrates with IE and Firefox to provide seamless HTTP and HTTPS monitoring without leaving the browser window. It is similar in functionality to Charles.|
|Live HTTP headers||A Firefox toolbar plugin that allows you to view the HTTP headers of a page while browsing. Analyzing headers helps you understand whether key functions that affect web site latency and performance, like gzip, are active on the web server serving up pages.|
|Lynx||A downloadable text browser that allows you to view your site as the search crawlers do. Also a way of ensuring that people with text-only browsers can use the site – however this is a pretty minimal use nowadays.|
|NetExport||NetExport is a Firebug 1.5 extension that allows exporting all collected and computed data from the Firebug Net panel. The structure of the created file uses the HTTP Archive 1.1 (HAR) format (based on JSON).|
|ShowSlow||ShowSlow is an open source tool that helps monitor various web site latency and performance metrics over time. It captures the results of YSlow and Google Page Speed rankings and graphs them, to help you understand how various changes to your site affect its performance. This is a great tool to see how the two tools’ results compare, but also to understand which items they are analyzing. ShowSlow can be run from within your Firefox/Firebug toolbar or be installed on your server. Be forewarned: to run it on your toolbar you will need to make some settings changes on the about:config page, and your results will show publicly on www.showslow.com.|
|Site-perf.com||Site-Perf.com is another performance analysis tool that visually displays web page load times. It is similar to Pingdom’s Full Page Test Tool, although it provides a little bit more detail and better explanations of what the load times mean. It also has a network performance test tool that is handy in understanding what portion of your web site latency and performance issues are coming from your host rather than from the site – and let me tell you that can be a lifesaver as you watch your performance go from great to lousy to great again. The page test tool provides an accurate, realistic, and helpful estimation of your site’s loading speed. The script fully emulates natural browser behavior downloading your page with all the images, CSS, JS and other files, just like a regular user.|
|Smush.it||Smush.it runs as a web service or as a Firebug plugin that comes with ySlow V2. It uses optimization techniques specific to each image format to remove unnecessary bytes from image files. It is a “lossless” tool, which means it optimizes the images without changing their look or visual quality. After Smush.it runs on a web page, it reports how many bytes would be saved by optimizing the page’s images and provides a downloadable zip file with the minimized image files.|
|Wave Toolbar||The WAVE Toolbar provides button options and a menu that will modify the current web page to reveal the underlying page structure information so you can visualize where web site latency issues may be occurring. It also has a built in text-browser comparable to Lynx.|
|Web Page Test||webpagetest.org is a hosted service that provides a detailed test and review of web site latency and performance issues. It is probably the most complete single tool I have found for getting an overview of what is happening with your website. I like this better than ySlow or ShowSlow, but I would still use Google Page Speed as well, since that is how Googlebot sees web site performance.|
|wget||wget is a free utility for the non-interactive download of files from the web. It runs in the background (so you can be doing other things) and supports http, https, and ftp protocols, as well as retrieval through http proxies. You can use it, for example, to create a local version of a remote website, fully recreating that site’s directory structure.|
|Xenu Link Sleuth||Xenu Link Sleuth spiders web sites looking for broken links. Link verification is done on ‘normal’ links, images, frames, backgrounds, and local image maps. It displays a continuously updated list of URLs, which you can sort by different criteria.|
|ySlow||ySlow, developed by Yahoo!, is a FireFox/FireBug plugin. It is a general purpose web site latency and performance optimizer. It analyzes a variety of factors impacting web site latency, provides reports, and makes suggestions for fixes. This has been the most commonly used tool for analyzing web site performance until now.|
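To give a flavor of how the command-line tools in this list fit into a latency-debugging workflow, here is a small shell sketch using curl. The URL is a placeholder, not a real endpoint; the flags shown are standard curl options:

```shell
#!/bin/sh
# Ask for the response headers only, advertising gzip support; a
# "Content-Encoding: gzip" line in the output means compression is active.
curl -s -I -H "Accept-Encoding: gzip" "http://www.example.com/"

# Break the request latency into phases with curl's --write-out variables:
# DNS lookup time, TCP connect time, and total transfer time.
curl -s -o /dev/null \
  -w "DNS: %{time_namelookup}s  Connect: %{time_connect}s  Total: %{time_total}s\n" \
  "http://www.example.com/"
```

Run against your own site, the first command tells you at a glance whether gzip is on, and the second gives you a quick latency breakdown without firing up any of the browser-based tools above.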
While we’re on the subject of SEO humor, I thought I’d throw in this wonderful piece from Dilbert. My guess is that a lot of folks looking for SEO help would be willing to sacrifice an ox (like in Biblical stories) in order to get a first position in the SERPs on a keyword core to their business. But, of course, that is just a guess. No one really searches on “sacrifice an ox for SEO”. And a search on the term brings back a very weird link from the University of Oregon in first position and The Catholic Encyclopedia in second position. I doubt either of those organizations or publishers are into ritual sacrifice for any reason. Then again, I don’t get out as much as I used to.
As I’ve mentioned before, I’m a lover of words. So I take a short break from the serious work of online marketing to enjoy the beauty of our language and contemplate how single letters can not only change meaning but add humor. After all, in SEO words are our business.
It seems The Washington Post runs a regular feature called The Style Invitational in which it invites readers to change a single letter in words to create a non-existent new word and come up with a definition for the new version. It also has a variant of the game where the reader is asked to give a humorous definition for an existing word. They both produce hilarious results, but I’m going to focus on the former game. Here’s an example:
ignoranus: An individual who is both stupid and an asshole
(Pardon the offensive language. I’m quoting it.)
These will have you bending over in laughter, but since we are in the SEO game, I thought it would be fun to see the exact match volumes from the Google AdWords Keyword Tool. It turns out (surprise! surprise!) that people actually search on these new terms. So for all of the SEO experts in the room, you now have data to justify creating pages, content, and tags for these words. The words are shown in the table below. By the way, I am in stitches that ‘bozone’ and ‘karmageddon’ are the two most searched-for terms. Like, wow. I mean, like really? Dude, it’s as if some karmic word God has touched all humans with the ability to recognize the truly funny. Enjoy!
|Word||Definition||Exact Match Volume|
|Ignoranus (n.)||An individual who is both stupid and an asshole||480|
|Cashtration (n.)||The act of buying a house, which renders the subject financially impotent for an indefinite period of time||91|
|Intaxication (n.)||Euphoria at getting a tax refund, which lasts until you realize it was your money to start with||110|
|Reintarnation (n.)||Coming back to life as a hillbilly||210|
|Bozone (n.)||The substance surrounding stupid people that stops bright ideas from penetrating. The bozone layer, unfortunately, shows little sign of breaking down in the near future||1,900|
|Foreploy (n.)||Any misrepresentation of yourself for the purpose of getting laid||46|
|Giraffiti (n.)||Vandalism spray painted very, very high||390|
|Sarchasm (n.)||The gulf between the author of sarcastic wit and the person who doesn’t get it||1,000|
|Inoculatte (n.)||To take coffee intravenously when you are running late||36|
|Osteopornosis (n.)||A degenerate disease||46|
|Karmageddon (n.)||It’s like, when everybody is sending off all these really bad vibes, right? And then, like, the Earth explodes and it’s like, a serious bummer||1,600|
|Decafalon (n.)||The grueling event of getting through the day consuming only things that are good for you||210|
|Glibido (n.)||All talk and no action||58|
|Dopeler Effect (n.)||The tendency of stupid ideas to seem smarter when they come at you rapidly||63|
|Arachnoleptic Fit (n.)||The frantic dance performed just after you’ve accidentally walked through a spider web||91|
|Beelzebug (n.)||Satan in the form of a mosquito that gets into your bathroom at 3 in the morning and cannot be cast out||170|
|Caterpallor (n.)||The color you turn after finding half a worm in the fruit you’re eating||16|
Like many I have an iPhone. Admittedly I still have a 2G because I’ve become more thrifty in my old age (funding a national-level gymnastics career and private school tuition will have that effect) even at the risk of having my Silicon Valley friends call me a technology troglodyte. But I am an avid user of all the main apps (I can bump with the best of them), use it for location-based searches (e.g. AroundMe, Google maps), send images to Facebook, Tweet in real-time at events, Ping when I can, check my blog traffic with Google apps for the iPhone, and know how to plan/execute advertising campaigns specifically on mobile phones. No one who knows me would say I am in any way behind on my use of mobile technology.
But until now, I’ve never felt like mobile has really changed the basic way I have experienced the world. I go to trade shows and listen to all the ideas for the latest mobile services or how mobile concepts should change my daily life, but I have never felt that I had crossed a Rubicon with the real-time nature of mobile in the same way I did when I got my first laptop or sent my first Tweet.
That changed yesterday. I had the pleasure to take my family to a performance of The Smuin Ballet at the Sunset Center in Carmel. I am an admitted ballet snob, and Smuin is a wonderful company with talented, disciplined dancers and creative choreography. So whenever they are in Carmel we go to see them. The second act was a performance of Smuin’s Medea, which was first performed in 1997. It is a dramatic ballet that retells the story of Medea, the wife of Jason, the intrepid explorer of Jason and the Argonauts.
The myth of Jason and Medea is a dark and haunting tale of revenge and self-destruction. But in the dark as the curtain rose on Act 2, despite being a student of mythology, I couldn’t for the life of me remember even the outlines of the tale to tell my wife and eight year old daughter, neither of whom knew the first thing about this particular myth.
So what did I do? I pulled out my handy iPhone and while shielding it so as not to disturb others, I did a web search on Medea and pulled up the Wikipedia entry.
Despite the dark and small print, I was able to glean from Wikipedia the details of the story. Jason met the sorceress Medea, daughter of King Aeetes of Colchis. Medea falls in love with Jason, and he convinces her to help him to acquire the Golden Fleece in return for a pledge of marriage. After acquiring the Fleece, Jason and Medea flee to Corinth and have children. In Corinth, King Creon offers his daughter Glauce to Jason in marriage, and Jason feels he cannot pass up the opportunity to marry a royal princess. Despite explanations and promises of support from Jason, Medea feels betrayed. She gives Glauce a wedding gown covered in poison that kills the bride. Then to ensure that nothing of Jason’s will outlive him, she kills their two sons.
I quickly related the story to my wife and daughter (who was especially engaged by the dark and intense drama onstage). I then sat back and enjoyed the performance, knowing that they could now understand what they were seeing and, as a result, enjoy it with more insight.
Not a big deal, you might say. I say otherwise. Prior to this, I had used my mobile capabilities to get directions or find a resource nearby. I was using the mobile device as a tool for location-based information. It was a parallel use to a GPS, which while practical, was not a fundamental change in my experience over a map. It was easier than a map and had better information, but the experience was just a replacement of one form of information (paper) for another (digital).
In this case, for the first time I used my iPhone to plug into the wisdom of the commons, into the global village, to enhance and extend the quality or content of an experience. In other words, the phone and the information it provided in real-time became an integral part of the experience, although it was unintentional (from the perspective of the choreographer or the dancers) and because it was unintentional, it was distracting. But this little thing, this one act, was a fundamental change in how I interacted with the world. This was not a substitution of paper-based data with a digital version. Instead it delivered the true promise of the mobile web. It allowed me to access a completely new set of information that was then added to a real-time, real world event to enhance the experience. The equation is real-time event + Internet information = a real-time multimedia experience.
Admittedly, the experience was not perfect because it wasn’t intentionally developed by the show’s producers. But what if Smuin developed a mobile video app that during the Intermission allowed audience members with iPhones to see the full story of Medea in an entertaining way that also tied the story to how the ballet attempted to recreate that story in dance. That would be a more intentional, and less invasive, way to provide the same integrated (but completely new) experience.
So yesterday for the first time, I experienced the true power of the mobile web. I can now say “Ich bin ein Mobile Netizen.”