Tag Archives: web performance

Display w3c navigation timings for any web page

I’ve been trying to find a tool that will display all the W3C navigation timings for any web page I might be browsing. I was surprised when I didn’t find it in Chrome dev tools (really?), nor in Speed Tracer, nor in the format I wanted among the available Chrome extensions (hint: product opportunity for someone).

I actually spent a whole afternoon, while creating my first “Hello World” extension, considering writing a Chrome extension myself to show the navigation timing data for any web page. But I would have been slow learning the necessary JavaScript and the messaging protocol Google requires for extensions.

And then I came across this “adorable” bookmarklet from @kaaes that does just that.
[Image: breaking_down_onLoad]

All you have to do is drag the bookmarklet to your bookmarks bar, click it on any web page you are browsing, and voila!

[Image: w3c_nav_timings]
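
If you’re curious what’s under the hood, the heart of such a bookmarklet is tiny. Here’s a minimal sketch of my own (not @kaaes’ actual code) that logs each navigation timing attribute as an offset from navigationStart – paste it into the dev tools console, or compress it onto one line behind a javascript: prefix to make a bookmarklet:

    (function () {
      var t = window.performance.timing;
      var start = t.navigationStart;
      for (var key in t) {
        // Skip methods (e.g. toJSON) and events that never fired (value 0).
        if (typeof t[key] === 'number' && t[key] > 0) {
          console.log(key + ': ' + (t[key] - start) + ' ms');
        }
      }
    })();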

So far, this is the handiest tool I have found to get a quick glimpse at the navigation timings for your website. These web performance timings are becoming the de facto standard, and that will be even more true when browsers support the forthcoming resource timings (giving us the waterfall report as well).

Hope you find this a useful bit of kit for your web performance tool belt.
And a shout out to @kaaes for sharing this with everyone!

Ken

How to select the most important web performance metric as a KPI – #feelsfast

We all know intrinsically that website performance is important. It has a tremendous impact on all of the business KPIs that measure the success of our online endeavor. I think website performance gets so much attention for two reasons: (1) it’s the most obvious symptom of bad results, and (2) it is easy to measure.

In my larger philosophical views on Customer Experience (CX) I’ve suggested…

PX > CX > UX = #usable + #feelsfast + #emotive

Feelsfast here represents “a lack of perceived latency.”

Fifteen years ago, when I first started thinking about web performance, we only had network-oriented metrics to understand web page performance. Today, there is a larger set of collectible metrics to measure many aspects of the User Experience (UX) spectrum. And today’s web applications, because they are pretty fat clients, must take client-side performance into account as well.

We are constantly reminded of the importance of performance by vendors, the media and customers through their actions.

What is the most important metric to measure web performance as a KPI? It’s the one that best represents User Experience or a lack of perceived latency.

We have network metrics. These are old-school metrics that focus on how long it takes your server and the network to deliver web page resources to the browser’s network layer.

  • DNS lookup time – time to resolve the DNS name
  • TCP connect time – time to establish the TCP connection
  • SSL handshake time – time to perform the SSL handshake
  • Time to first byte – time to receive the first packet of data
  • Time to receive the data – time to download the rest of the response data
  • Fullpage time – time to load the web page and all its resources
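
All of these classic metrics can be derived directly from the W3C navigation timings covered next. A minimal sketch, assuming it runs after the page has finished loading:

    (function () {
      var t = window.performance.timing;
      console.table({
        // Classic network metrics, derived from W3C navigation timings.
        dnsLookup: t.domainLookupEnd - t.domainLookupStart,
        tcpConnect: t.connectEnd - t.connectStart,
        // secureConnectionStart is 0 on plain HTTP pages.
        sslHandshake: t.secureConnectionStart > 0 ? t.connectEnd - t.secureConnectionStart : 0,
        timeToFirstByte: t.responseStart - t.requestStart,
        dataReceive: t.responseEnd - t.responseStart,
        // loadEventEnd stays 0 until the load event has finished.
        fullpage: t.loadEventEnd - t.navigationStart
      });
    })();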

Most of today’s modern web browsers supplement this with a richer set of data based on the W3C Navigation Timing standard:

  • navigationStart – time that the action was triggered
  • unloadEventStart – time of start of unload event
  • unloadEventEnd – time of completion of unload event
  • redirectStart – time HTTP redirection begins
  • redirectEnd – time HTTP redirection completes
  • fetchStart – time that request begins
  • domainLookupStart – time of start of DNS resolution
  • domainLookupEnd – time DNS resolution completes
  • connectStart – time when the TCP connect request begins
  • connectEnd – time when the TCP connect completes
  • secureConnectionStart – time just before secure handshake
  • requestStart – time that the browser requests the resource
  • responseStart – time that the browser receives first packet of data
  • responseEnd – time the browser receives the last byte of data
  • domLoading – time that the document object is created
  • domInteractive – time when the browser finishes parsing the document
  • domContentLoadedEventStart – time just before the DOMContentLoaded event
  • domContentLoadedEventEnd – time just after DOMContentLoaded event
  • domComplete – time when document readiness is set to “complete”
  • loadEventStart – time when the page load event is fired
  • loadEventEnd – time when the page load event completes
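
The payoff of this richer data is that you can carve the client-side portion of the load into phases the old network metrics never saw. Another minimal sketch, again assuming it runs after the load event has completed:

    (function () {
      var t = window.performance.timing;
      console.table({
        // Request sent until the last byte of the document arrived.
        networkAndServer: t.responseEnd - t.requestStart,
        // Document created until the parser finished.
        domParsing: t.domInteractive - t.domLoading,
        // Time spent in DOMContentLoaded handlers.
        domContentLoaded: t.domContentLoadedEventEnd - t.domContentLoadedEventStart,
        // Time spent in load event handlers.
        loadEvent: t.loadEventEnd - t.loadEventStart,
        // The whole page load, end to end.
        total: t.loadEventEnd - t.navigationStart
      });
    })();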

This is a nice visual of the W3C timings.

[Image: timing-overview]

And we have visual timing metrics available from various tools:

  • IE11 brings us msFirstPaint as part of the browser timings
  • webpagetest.org gives us start render, filmstrip view, and the innovative speed index
  • AlertSite.com can provide visual capture and metrics for FirstPaint and Above the Fold using Firefox

How do you choose which web performance metric has the most value as a KPI when all of these have some value? The key is to identify, for any particular monitored application or web page, which metric best represents a user’s perception of latency – in other words, whether it feels fast. This is likely one of the more modern metrics: loadEventEnd, FirstPaint, Speed Index, or Above the Fold.

Once selected, this #feelsfast metric should become a critical business KPI, tracked and managed as such.
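
Tracking it can be as simple as beaconing the chosen number home on every page view. A hedged sketch – ‘/kpi-beacon’ is a placeholder for whatever collection endpoint you actually use, and loadEventEnd stands in for whichever #feelsfast metric you selected:

    // Report the chosen #feelsfast metric on every page view.
    window.addEventListener('load', function () {
      // Wait one tick so loadEventEnd is populated.
      setTimeout(function () {
        var t = window.performance.timing;
        var feelsFastMs = t.loadEventEnd - t.navigationStart;
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/kpi-beacon', true); // placeholder endpoint
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.send(JSON.stringify({ page: location.pathname, feelsFastMs: feelsFastMs }));
      }, 0);
    });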

Are you giving the web performance component of UX enough attention?

Ken

64% Chrome Browser Share! Are They Chrome Hipsters?

OK, I’m a big fan of Chrome, and I’ve pretty much replaced Firefox on all my machines with it, but I’m still having a hard time believing the numbers in New Relic’s infographic. New Relic says this data is aggregated from over 3 million application instances.

The breakdown shows:

  • Chrome 64.3%
  • Firefox 16.3%
  • IE 14.8%
  • Safari 4.2%
  • Other 0.4%

May I correct what I said before? I’m absolutely sure New Relic can correctly aggregate data like this from their massive collection. It’s just that I’m having a hard time believing these numbers represent the mainstream. I still know too many people who are using Windows on the desktop and are happy with the latest IE browsing experience. And although I’m still a considerable Mac fan, the newest household addition is a Surface Pro, and on it IE provides a better user experience right now than even the latest Chrome beta.

If this number is true for IE, it would spell almost certain doom for Microsoft. Maybe they are not the King Kong of the client operating system world anymore, but this information is just for desktop and laptop computers. I know a few people who really like their Windows phones, and after some use I think there is a fair amount to like about Windows 8.1 (of course, I did have to install a third-party Start button ;) In fact, I find myself swiping the screen on the MacBook Pro sometimes lately. But if it is doom for Microsoft, should I return my Surface Pro before the 30 days are up?

One reasonable explanation might be that New Relic targets a certain developer / startup type of user and maybe their applications are a little more targeted at – dare I say – Chrome Hipsters :)

Just ranting a bit.

Ken

PX > CX > UX = #usable + #feels_fast + #was_emotive <-- the battle for business supremacy

Over the last few years I have become inspired or perhaps possessed with a certain awe about how touch interfaces, Mobile, Cloud and Social have converged to change the focus of most successful organizations from delivering usable products to delivering meaningful and pleasurable experiences worth sharing. Digital experience influences more and more of our business landscape from how customers find us, to how they learn about and perceive our reputation, to their on-boarding experience. It’s the experience to date (hey I just made up a new term ETD), the sum of the whole experience delivered to the PEOPLE who are our users, that drives this.

Delivering experiences that people feel good about, find memorable and want to share is the next battle for business supremacy.

I’ve suggested previously that, because our users are people, and understanding the customer journey and how to deliver amazing experiences starts with people, this should not be the practice of customer experience (CX) but rather people experience (PX). Further, if we are focusing for today (and we are) on digital experiences, then we are really talking about user experience (UX).

I would suggest to you that this equation holds true:

PX > CX > UX = #usable + #feels_fast + #was_emotive

Of course, this is rooted in the fact that software systems are now systems of engagement and not just systems of record. We count on our software systems to help improve our reputation with our customers, and to help our employees do the same. Every software system built ultimately has an impact on People Experience. And I want to emphasize how important it is that even our internal systems provide pleasurable experiences to employees, because happy employees make for happy customers.

[Image: Aberdeen Research interpretation of Andrew’s CX hierarchy]

This concept is borrowed from a SlideShare deck (slide 15) by Steven Anderson in 2006, and the clarified graphic is courtesy of Aberdeen Research.

They are both a refinement of earlier research from Carnegie Mellon on human-computer interfaces in the early ’90s.

This is what we all should be striving for in the software systems that drive our interactions with customers and prospects – and also in the software systems that support real-world interactions for our employees, our inventory, or our returns process. What does this refined CX pyramid look like to you? Does it remind you of Maslow’s hierarchy of needs? Take a look.

[Image: maslow]

Just like with Maslow’s hierarchy, the basic needs and basic tasks at the bottom are much easier to achieve than the needs at the top, like self-actualization. And yet, that is what is required of everyone involved in designing customer experiences now.

How can we build applications that create pleasurable experiences? By understanding the PEOPLE who will be using them. That’s probably a lot easier than self-actualizing.

Understand the people who are your users and do more than help them get it done – strive for delight. Think about those smaller parts of the interaction that don’t require building the Starship Enterprise. The Kano model is a good strategy here. Where could you introduce parts of the interaction that are different and appealing?

Let’s look at one ingenious example in the travel aggregation space – Hipmunk.com. Search for any flight…go ahead. Notice that cute little button in the sort bar that says sort by “agony.” How can that not make you smile if you’ve ever travelled through airports?

What appealing little capabilities are you adding to your UX to help delight people?

Ken

Uptimerobot – An accurate, easy to use, and free website monitoring service

I am on a quest to share the best free tools and services to help maximize availability, performance, and user experience for your critical customer-facing applications. Uptimerobot is a nice-looking, easy-to-use, basic website monitoring service. Even better, like all of the tools I’ve been sharing over the last few weeks, it’s FREE!

And these guys don’t skimp.
You get 50 – that’s right, I said 50 – individual monitors that can run as often as every 5 minutes.

They offer a reasonably handsome dashboard to view summary statistics for all of your website monitors. The monitors support HTTP, HTTPS, ping, port checking, and keyword monitoring. I think they should merge the keyword monitoring into the HTTP monitoring (see the sketch below), but that’s such a minor quibble.
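
To see why, consider that a keyword check is really just an HTTP check plus a content match. A minimal Node.js sketch of the idea – illustrative only, not Uptimerobot’s actual implementation, and the keyword here is a made-up example:

    // Minimal HTTP + keyword uptime check. Illustrative only.
    var https = require('https');

    function checkSite(url, keyword, callback) {
      https.get(url, function (res) {
        res.setEncoding('utf8');
        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
          var statusOk = res.statusCode >= 200 && res.statusCode < 300;
          // Up only if the status is OK and the keyword appears in the page.
          callback(statusOk && body.indexOf(keyword) !== -1);
        });
      }).on('error', function () {
        callback(false); // network errors count as down
      });
    }

    checkSite('https://apmexaminer.com/', 'APM', function (isUp) {
      console.log(isUp ? 'UP' : 'DOWN');
    });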

[Image: uptime_robot_dashboard]

Here’s a quick example of the availability and performance data for the web monitor pointed at APMexaminer.com. Right now, I think you can only view this performance data for the last 24 hours.

[Image: uptime_robot_monitor_performance_uptime]

I’ve only had one outage on my website so far. For some reason, I thought it was a good idea to turn on FastCGI on my web server. WordPress or MySQL wasn’t happy, and the site crashed the next night and was down for almost 3 hours. Uptimerobot sent accurate and reliable notifications – albeit sparse on diagnostic info – indicating when my website was down. After @dreamhost support helped me resurrect it, along with some advice about turning off the FastCGI option, I promptly received notification that the website was back up, along with the duration of the downtime.

Uptimerobot doesn’t have a lot of advanced features, like web performance or transaction monitoring using real web browsers, real-user monitoring statistics for every site visitor, or even the ability to select which geographic locations perform the monitoring.

In my considerable experience, a lot of people are just looking for a good basic website monitoring service that provides reliable notification when their website is unavailable and lets them see a little basic HTTP performance data – and that is something Uptimerobot does very decently.

Thanks for bringing something excellent to the community.

At this point in our journey together, please don’t tell me YOU don’t have basic Website monitoring in place.

Ken