Beware of the Robots!

February 7th, 2014

How Internet Bots Are Clouding Insight from Web Metrics: A Case Study in Screen Size for Responsive Design

(Image: Google web crawling bots)

With our strong emphasis on the use of data in the digital experience design process, one of the first things we do when kicking off a new engagement is to ask the data a key question: Who, using what devices, are we designing for?

We want to know the devices, browsers and operating systems that current visitors are using, so that we can optimize the experience for that mix and deliver a responsive design that works best at the most common screen resolutions we are seeing (and will be seeing moving forward). Knowing the networks (mobile/fixed) and bandwidth is also helpful for determining the typical, sweet-spot experience a user will have with the new sites and apps we are designing.

We started working with a new client in the B2B technology space a few months back and dug into their Google Analytics data from day one. First we wanted to look at the typical screen sizes, to determine whether a responsive, adaptive or hybrid design was the best approach for them and which breakpoints were optimal.

(Screen resolution settings in Google Analytics)

Pulling the screen resolutions out of Google Analytics for the last quarter, adding a secondary dimension of Device Category (so we could filter to just desktop, removing tablet and mobile), and then importing the data into Excel to extract and average the horizontal and vertical resolutions separately, we ended up with the following ‘average’ screen size:

1,119 x 798 pixels
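For those who prefer to skip the Excel step, the calculation is easy to sketch in a few lines of Python. This is a minimal sketch, assuming a CSV export from Google Analytics with “Screen Resolution” (e.g. “1024x768”) and “Sessions” columns; the file name and column headers are hypothetical stand-ins.

```python
import csv

# Minimal sketch: session-weighted average of GA screen resolutions.
# The file name and column headers are hypothetical stand-ins for a
# Google Analytics export.
def average_resolution(path):
    total = width_sum = height_sum = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                width, height = map(int, row["Screen Resolution"].split("x"))
            except ValueError:
                continue  # skip "(not set)" and other malformed values
            sessions = int(row["Sessions"].replace(",", ""))
            total += sessions
            width_sum += width * sessions   # weight each size by its sessions
            height_sum += height * sessions
    return width_sum / total, height_sum / total

print(average_resolution("ga_desktop_resolutions.csv"))  # e.g. (1119.0, 798.0)
```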

With an aspect ratio close to 4:3 (4:2.8, to be precise), this seemed low compared to what we usually see, but it was within the realm of the possible. However, the average screen size is not that helpful for determining design sizes; we need to know the full distribution of screen sizes. Taking the raw horizontal screen resolutions and turning them into a cumulative chart produced this more visual representation of the data.

(Chart: cumulative horizontal screen resolution, desktop only)

The chart shows, for each horizontal screen size, what percentage of users have a screen at least that wide – so at 1 pixel, 100% of users have a larger screen, and so on. Mobile and tablet remain filtered out for now.
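The cumulative view itself is just a running sum over the sorted widths. A minimal sketch, with illustrative (width, sessions) pairs standing in for the real export:

```python
# Illustrative (horizontal resolution, sessions) pairs; the real data
# comes from the same Google Analytics export as above.
rows = [(1024, 7500), (1280, 1200), (1366, 600), (1920, 700)]

total = sum(sessions for _, sessions in rows)
remaining = total
for width, sessions in sorted(rows):
    # Share of sessions on a screen at least this wide.
    print(f"{width:>5}px: {remaining / total:6.1%} of sessions at or above")
    remaining -= sessions
```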

Instantly we can see a problem. 75% of all users seem to be at a screen resolution of 1024×768. And, as we have filtered out tablets, this must mean a lot of people have very small screens. In fact, a cursory glance at the Best Buy website shows that it’s not even possible to buy a computer or monitor with such a low resolution today.

So, we have a bot problem.

Some recent research from Incapsula (report here) showed that more than half of internet traffic is bots – some good (like Google’s search crawlers and monitoring tools) and some nefarious (like scrapers and malware). And both our friendly and less friendly bots tend to report as ‘standard’ agents, i.e., 1024×768 running on Windows and IE.

So, with our bot infestation confirmed, we went back to Google Analytics and worked closely with the client to identify and filter out as many of these bots as possible. The easy ones (like Google) were already filtered. Some more were easy to spot, based on identifiers and IP networks. But for others we had to dig deeper and look at behavior on the site, such as visiting every page – something a real user would rarely, if ever, do!
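For illustration, here is a simplified sketch of that screening logic in Python. The field names, the identifier lists and the 90% page-coverage threshold are our own illustrative assumptions, not a Google Analytics feature:

```python
# Hypothetical per-session record from an analytics export.
KNOWN_BOT_AGENTS = ("bot", "spider", "crawler", "monitor")
KNOWN_BOT_NETWORKS = ("amazonaws.com", "googlebot.com")

def looks_like_bot(session, total_pages_on_site):
    agent = session["user_agent"].lower()
    if any(token in agent for token in KNOWN_BOT_AGENTS):
        return True  # the easy ones: self-identifying agents
    if session["network_domain"] in KNOWN_BOT_NETWORKS:
        return True  # traffic from known crawler/hosting networks
    # The behavioral tell: visiting nearly every page on the site in
    # one session, something a real user would rarely, if ever, do.
    return session["pages_viewed"] / total_pages_on_site > 0.9

session = {"user_agent": "Mozilla/5.0", "network_domain": "comcast.net", "pages_viewed": 412}
print(looks_like_bot(session, total_pages_on_site=430))  # True: ~96% page coverage
```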

The filtering had a dramatic effect, as the chart below with total traffic to the site over the last five months shows.

(Chart: visits per day over the last five months)

Once we had stabilized the data, we could revisit the screen resolution and the other key data that would help inform the design. The chart below shows the “before & after” data for the horizontal screen resolution.

(Chart: horizontal screen resolution, before and after bot filtering)

This totally transforms the view of the physical setup of our users’ desktop devices. The data also looks more ‘natural’, with jumps at standard resolutions. The average resolution moved from 1,119×798 to 1,501×922 – a significant change.

So, now that we were confident we had killed off most of the bots (victory to the humans!), we could get back to the first question: which screens should we optimize the design for?

Merging the tablet and mobile data back in, and adding the vertical resolution to the horizontal chart, we ended up with this true picture of the users of the current experience.

(Chart: visits by screen resolution, all devices)

Quickly we can see the main device resolutions and the share of users at each. The big takeaway for this project was that 65% of users see the site on a screen wider than 1,200 pixels, and almost 20% use monitors at the full HD width of 1,920.
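Once the data is clean, checking candidate breakpoints is a one-liner per cutoff. A quick sketch, with illustrative session counts chosen to roughly match the shares above:

```python
# Illustrative sessions per horizontal resolution (not the client's data).
widths = {1024: 20, 1152: 15, 1280: 25, 1440: 10, 1680: 10, 1920: 20}

total = sum(widths.values())
for cutoff in (1024, 1200, 1920):
    share = sum(n for w, n in widths.items() if w >= cutoff) / total
    print(f">= {cutoff}px: {share:.0%} of sessions")  # 100%, 65%, 20%
```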

This changed our design strategy completely. Instead of capping the layout at 1,024 pixels, we are now optimizing the desktop experience at 1,200 pixels and the tablet experience at resolutions closer to 1,000, all within an overall responsive framework.

The lessons?

Always validate your web analytics to ensure your data is correct. Bots are insidious on the web, so filter them out. And since actual desktop users are on big screens, optimize for them, not just for tablets and mobile.

Post-PC Digital Teams

October 24th, 2013

Mark Ryan, Chief Analytics Officer of Extractable, and I were recently interviewed by Sam Stern, a Senior CX Analyst at Forrester, for his new report, “Digital Customer Experience Teams in the Post-PC Era”, and I wanted to share a few thoughts on the topics raised.

Sam’s report looks at how companies are trying to shape their digital teams against an environment in which customers are moving between devices (PCs, phones, tablets) and channels in a very fluid manner, and demanding a consistent and functional experience across them all.

We had a great conversation, sharing anecdotes and strategies from our clients, comparing our thoughts against what Sam was hearing from others during his research.

There were many observations and opportunities that we shared with Sam, based on what we have been seeing with our clients and engagements over the last few years, but two major insights stood out that I wanted to discuss in more depth today:

#1 – The external / internal cycle

As we know, larger organizations find it hard to change quickly, and this is especially true when confronted with new technology that not only challenges the current way business is done, but also requires new skills or ways of thinking to take advantage of it. So, what we often see is the creation of a new group or team, outside the standard organization, with a remit to take advantage of the new technology. We saw it with the ‘web team’ in the past, and to a lesser extent with SEO and social media in more recent years.

Over time, as the technology becomes mainstream, the external team is absorbed back into the organization, changing it in the process.

Today, we are seeing some clients and organizations with ‘mobile teams’ as a reaction to the rapid growth of the new, post-PC, multi-device world we are now in. However, it makes no sense to have a separate mobile team, as mobile devices are just one touchpoint that customers have with an organization. Of course, certain key mobile skills are needed to fully utilize mobile platforms, but they should be delivered within the context of the total customer experience.

One good question to ask of anyone proposing a separate team focused on mobile is: “If a social media tool is delivered to the customer via a mobile device, is that the responsibility of the mobile or social teams?”

The takeaway: Focus on the total customer journey and not the specific delivery device.

#2 – The business is the experience

In his report, Sam talks about the importance of getting business stakeholders involved in the customer experience design process, and how customer journey mapping can be a good tool to aid in that process.

We strongly agree.

With one of our clients in the financial services industry, we are working on a major overhaul of their B2B broker portal. The project’s business stakeholders deeply understand their business but are new to the customer experience design world. Consequently, it is proving hard for them to clearly express experience requirements and to give feedback on the more advanced interactions being contemplated.

Just last week, during a call to review some wireframe concepts, one stakeholder asked: “I understand how the navigation is supposed to work, but how will our users interact with the menu on the left?” The menu on the left was just our index of the screens we were showing in the session, not part of the design – a sign of unfamiliarity with the process we live by every day. However, it is our responsibility, as experience designers, to ensure those stakeholders are part of the success of the new experience, not the other way round.

Journey mapping is one tool to help with this, allowing stakeholders to see the total customer journey and where digital can assist customers.

Customer Journey Workshop

(Journey mapping exercise in progress)

Another technique we are using much more frequently is to put higher-fidelity concepts in front of stakeholders so we can gauge their reactions earlier in the process. Once they see how principles that sometimes seem esoteric come alive in a ‘real’ experience, it’s much easier for them to express valuable feedback.

Combining customer journey maps with interactive concepts at key touch points can be a powerful way to rally an organization behind digital change.

How has your digital team changed? Do you see big changes in the future?

And, thanks to Sam Stern for helping to drive an ecosystem-wide approach to building digital customer experience teams.

Data-Driven Design: The results are in!

May 2nd, 2012

Here at Extractable, we are strong advocates of the data-driven design approach to creating powerful, enjoyable and successful web and digital experiences for users. You’ve heard us talking about our experiences with data-driven design on this blog, at conferences and in person, but we wanted to go further and learn how firms across America, both B2B and B2C, use data in their design processes, what tools they use and what outcomes they generate.

So we commissioned a study from Forrester Consulting to do just that, and the results are now in!

The full study – “Data-Driven Design,” a commissioned study conducted by Forrester Consulting on behalf of Extractable, April 2012 – is now available for download on our site.

We have also created a great infographic of the key findings, available here.

In this blog post, we will focus on a few key findings and what we at Extractable have learned from the study. Over the next few days, we will dissect the study further with a series of deeper posts on key findings.

We saw two key findings in the study.

Firstly, some 60% of firms surveyed had seen improvements in their websites due to their use of data. And if a company also reported having a repeatable design process, the share reporting improvement grew to 71%. This is a powerful result. To us, as advocates of incorporating data in the design process, it is key validation that the process can produce measurable business outcomes. The projects we undertake for our clients using data-driven design are seeing positive results daily, and the study now shows this pattern across a larger and more diverse sample.

However, the second key finding is that many firms are struggling with data: measuring the wrong kinds of data; missing out on key inputs; and even ignoring key data points.

Some specific examples from the study include:

  • Companies don’t know how to apply, or can’t apply, the right tools and processes to optimize their sites. Only 28% of companies are happy with the tools and techniques they use to measure their websites today, and fully 52% believe there are other tools that could provide them with better insight.
  • Many firms are measuring the wrong kinds of data. Sites are often measured on metrics that don’t show business value. For example, 46% of respondents indicated they used “time on site” as a key measure. This doesn’t always indicate a positive experience – it could also mean that users are lost trying to navigate the site.
  • Some key data is being ignored. 37% report ignoring data that has been uncovered, and 34% say they gather data but do not use it. Sometimes this seems to be a case of the ‘Highest Paid Person’s Opinion’ overriding the insights drawn from key data.

Based on the study and our own experiences, it seems that many firms understand the value of data and are looking for the best ways to use it, but they fall short of creating a strong data-driven design process, supported by the right skills, teams and tools.

After reviewing the study, the strategy team here at Extractable synthesized our thoughts into some core recommendations. To make the most of data-driven design, companies should look to:

  • Define the metrics for the site based primarily on the central business goals/outcome of the site (or other digital asset), such as sales, leads, customer service efficiency, etc.
  • Measure the right set of data that will allow you to see the effect of the changing elements of the experience on that goal/outcome.
  • Include a wider set of data/tools, including behavioral tools as well as data from non-web systems such as financial/sales and customer service tools, to ensure you understand why users are doing what they are doing.
  • Apply these insights to the design process. Test the updated designs, analyze the results and repeat.

As a reminder, the full study can be downloaded here:

Additionally, Extractable is hosting a webinar on the study’s findings, featuring guest speaker Adele Sage, an analyst at Forrester Research, Inc. Sign up here.

My love/hate relationship with users

September 13th, 2011

Here at Extractable we employ a data-driven user experience design approach. That is a rather formal way of saying that we gather as many types of user behavior input into the design process as possible, from deep web analytics through user concept testing to full usability testing. And once a site is live, we aggressively optimize it based on actual usage patterns.

In the last few weeks we have been working on a series of contextual user interviews, user concept tests and formal usability tests for a range of clients, including a B2B semiconductor manufacturer, a consumer-facing financial services startup and a credit union.

This is where both my frustration with and love of users kick in. Just on Friday we sat with one user (a contextual interview, at his desk) for a couple of hours. At one point he was searching our client’s current site for a specific document – a document that was highlighted (in bold, with a big icon) at the top right of the page. He diligently scrolled through pages of document listings below the fold, clicked on every tab and scanned the page as if he were a leopard stalking its prey. All to no avail. He never saw the document icon/link that took up perhaps 15% of the page real estate.

Throughout this process I had to physically restrain my arm from pointing at the screen, and my tongue from shouting out some expletive along the lines of “What is that big red thing in the corner?!”

Of course I exaggerate my reaction, but it was a frustrating experience, as the user was not doing what I, or the rest of our design team, expected him to do. And this is why I love data based on real users and their real behaviors. They challenge us to think beyond our expectations of what users should be doing, and especially to challenge the received wisdom that exists in many companies, often of the form “No, our customers don’t want to do that.” This challenge – knowing that each of our ideas will need to successfully run the gauntlet of user research and testing – pushes us to think wider, deeper and, hopefully, more innovatively.

Plus, sometimes user research delivers incredible insights, like the user last week who had downloaded pretty much the entire product catalog of our client and all its competitors, and had then built out his own taxonomy to store all the files! That one session gave us more insight into how users see the client’s products than many hours of stakeholder sessions with the client’s product managers did.

But, of course, not every session delivers such value. One testing participant, when asked what else she might want to see on the test site, answered, “More information.” When asked what type of information, she responded, “Just more information.” Which takes me back to my starting point: I love users and all the things we learn from them to help us deliver rich user experiences, but sometimes it can be quite a challenging experience for the user experience design team.