December 20, 2019
Have you heard about the successful start-up all over TechCrunch because they don't collect any metrics? Where every department is entrusted to make decisions based on past experience and gut feel, and where ROI is listed in their mission statement as "Rely On Instinct?"
Of course not. No such company exists, and it never will. Running a business without metrics is laughably risky, and there isn't an investor or entrepreneur on the planet who thinks otherwise.
So why aren't you collecting metrics for your digital signage project? Whether the project is yours or a client's, chances are you've somehow decided that experience and gut feel, though nuts for running a business, are perfect tools for steering a signage deployment.
Why do we do this to ourselves?
There are three key reasons most digital signage projects choose to do without analytics.
- Absence of analytics capability in the underlying software platform.
- Absence of mind share because there is no history of using analytics on a project.
- Absence of experience or confidence in executing an analytics initiative well.
As I like to say to my son, reasons aren't excuses. However, knowing the reason provides you with a guidepost for making things better. And frankly, the first two reasons are easily corrected.
(Is "budget" a fourth reason? If you come around to believing every project needs analytics, then budget can't be a reason, because no project can be without it. Besides, if you don't measure, you risk going over budget. Don't be penny wise, pound foolish.)
Does the software platform lack good analytics capability? You've got the wrong platform. Time to find a new option that takes analytics seriously.
Is it not something your team has done in the past? Unless you think there is nothing left in this world to learn, maybe things should change.
No experience or confidence? Ok, now that's a good one. No shame there and it's not your fault. We've established that analytics is atypical for individual projects, so your lack of knowledge is collateral damage. Of course, the solution is education – but where to start?
Three charts approach
Let's keep it simple and identify three charts you should use in every digital signage analytics project. The first two will help you measure the effectiveness of your deployment. The third will shine a light on your most important measures.
I'll cover data collection in a separate article, so for now I'm going to assume you have a means of collecting data. First, however, let me address the elephant in the room.
There are no direct measures for fully non-interactive signage. You'll need secondary measures and instinct.
This is not as controversial as it may seem. There are many ways to interact with digital signage. Some are active, like touch, RFID badge swipes, and voice activation. Others are passive, like camera vision and motion detection. It is these latter, passive approaches that can be used to generate data for analysis.
Thus, all signage installations can be a primary data source with some degree of accuracy. It just takes vision and investment on the part of the project owner to make it happen.
Average dwell time
Average dwell time is an unvarnished measure with clear implications. With this number, you can unambiguously identify the level of engagement your signage content is achieving, and do this on a screen-by-screen basis.
Dwell time is the amount of time a person (or people) is actively interacting with your content. The interaction could be explicit, like tapping buttons or entering information in forms. But interaction could be implicit as well, like watching a video. Either way, dwell time tells you how long a given user has engaged with your content.
Average dwell time gives you an overall view of how engaging a given screen has been. The timeline is up to you – hourly, daily, weekly. And the focus could be at any comparative level – per screen, per floor, per geographic location, and more.
During your signage pilot phase, establish a dwell time baseline. What is the average across all locations, all screens, over a given time? Then, beyond the pilot, track average dwell time and compare it to the baseline.
Average dwell time is the perfect way to identify the engagement level of your content. Changing content, or location, or format, or many other aspects of your deployment will likely alter the average, an ideal moment for running A/B tests to identify methods for improving engagement. It's an indispensable measure.
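Computing average dwell time from raw interaction logs is straightforward. Here is a minimal sketch, assuming each engagement has already been recorded as a screen ID plus start and end timestamps (the screen names and log format are hypothetical):

```python
from collections import defaultdict

def average_dwell_times(engagements):
    """Compute average dwell time in seconds, per screen.

    `engagements` is a list of (screen_id, start_ts, end_ts) tuples,
    with timestamps expressed as seconds since some epoch.
    """
    totals = defaultdict(lambda: [0.0, 0])  # screen -> [total seconds, count]
    for screen, start, end in engagements:
        totals[screen][0] += end - start
        totals[screen][1] += 1
    return {screen: total / count for screen, (total, count) in totals.items()}

# Illustrative data: two engagements on the lobby screen, one on floor 2.
log = [
    ("lobby", 0, 45),
    ("lobby", 100, 130),
    ("floor2", 0, 12),
]
print(average_dwell_times(log))  # {'lobby': 37.5, 'floor2': 12.0}
```

The same aggregation works at any comparative level mentioned above (per floor, per location) by changing the grouping key, and a baseline from the pilot phase is just this average computed over the pilot's date range.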
Session count
If you know anything about website analytics, then you're familiar with the notion of a session: the set of actions performed by a unique visitor. That action could be reading a marketing promotion, or it could be actively interacting with content. Each visitor is represented by a unique session. The more sessions, the more people; the more people, the more engagement your signage deployment is delivering.
There's a catch here, and that's how to identify a session. With websites, session identification is comparatively easy: every request carries an IP address that web servers (and analytics tools like Google Analytics) can see, typically supplemented by a browser cookie. Each new identifier is treated as a new user and thus a new session, and Google Analytics goes one step further, folding multiple visits from the same user into a single session if they happen in close succession.
Humans don't have IP addresses, so how can you differentiate one person from another and thus one session from another? The key is to create one or more scenarios that will indicate, with a high level of probability, that a new person has approached your signage and thus a new session has begun. Options include:
- Automatically returning to the home screen after a set amount of time during which no interaction has occurred. It's likely that the next interaction is a new person.
- Using computer vision and eye tracking to identify engagement.
- Adding a Home or Start button. A tap of this button likely indicates a new person has begun their interaction.
These approaches aren't foolproof, but in practice they should identify new sessions correctly the large majority of the time; think 80/20 accuracy, if not better.
By measuring session count – which, like average dwell time, can be based upon any particular timeline or environmental contingency like location – you'll know (almost) exactly how many people have interacted with your content. There is no ambiguity about what this number means; you have to decide what your target should be. More A/B testing could reveal approaches that would increase usage.
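The inactivity-timeout heuristic from the first bullet above can be sketched in a few lines. This assumes you already have a list of interaction timestamps for a screen; the 60-second timeout is an illustrative value you would tune for your own deployment:

```python
def count_sessions(interaction_times, timeout=60):
    """Count sessions from a list of interaction timestamps (seconds).

    Any gap longer than `timeout` seconds between consecutive
    interactions is assumed to mean a new person has walked up,
    i.e. a new session has begun.
    """
    sessions = 0
    last = None
    for t in sorted(interaction_times):
        if last is None or t - last > timeout:
            sessions += 1
        last = t
    return sessions

# Taps at 0, 5, and 12 seconds form one session; the taps at 200/210
# and at 500 each follow a gap longer than 60 seconds.
taps = [0, 5, 12, 200, 210, 500]
print(count_sessions(taps))  # 3
```

A Home/Start button tap or a computer-vision "new face detected" event would slot into the same logic as an explicit session-start signal instead of the timeout.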
Single value chart
The hardest part of any analytics endeavor is identifying the key measures that are meaningful and actionable. Your goal should be to avoid MUMs: measurable but useless metrics. Average dwell time and session count are actionable because they deliver unambiguous insight about your deployment, but there are certainly other useful measures unique to your needs and your deployments.
A formal exercise involving stakeholders should be conducted to identify project goals and the Key Performance Indicators, or KPIs, for those goals. Maybe there's only one – for example, collecting email addresses. Or perhaps there are multiple goals because there are multiple stakeholders. Regardless, if you have a goal, you have a structure for identifying KPIs.
Tracking KPIs is best achieved through use of a single value chart, the classic chart type for a dashboard. Identify the current, actual value of the KPI, associate it with the target value and – if possible with your analytics platform – indicate the KPI value trend. At a glance you can identify progress toward your goals.
TIP: When discussing goals and KPIs, consider a discussion about the actions you will take in response. Ideally, you would have clear steps to take in response to KPI actuals and trends. For example, if your project goal is the collection of email addresses, a KPI could be the session count. If the session count is quite low, perhaps it's time to start A/B testing different messages or signage. If the session count is high, perhaps you should consider the addition of a second kiosk.
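As a sketch of the data behind a single value chart, here is one way to derive the actual, target, and trend for a KPI from a simple series of period totals. The email-collection figures are invented for illustration, echoing the example goal above:

```python
def kpi_summary(values, target):
    """Summarize a KPI series for a single value chart.

    `values` is a chronological list of per-period totals
    (e.g. email addresses collected per week); the trend compares
    the current period to the one before it.
    """
    current = values[-1]
    previous = values[-2] if len(values) > 1 else None
    if previous is None or current == previous:
        trend = "flat"
    elif current > previous:
        trend = "up"
    else:
        trend = "down"
    return {
        "actual": current,
        "target": target,
        "pct_of_target": round(100 * current / target, 1),
        "trend": trend,
    }

weekly_emails = [12, 18, 25]  # hypothetical weekly collection totals
print(kpi_summary(weekly_emails, target=40))
# {'actual': 25, 'target': 40, 'pct_of_target': 62.5, 'trend': 'up'}
```

A dashboard would render just the actual, a target marker, and an up/down arrow for the trend, which is exactly the at-a-glance progress check described above.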
Word of caution
Be very careful about reaching inappropriate conclusions when studying collected data.
You know the old saw: "correlation does not imply causation." If in-store revenue goes up when winter settles in, does that mean cold weather causes people to buy more? A more sensible explanation is that the Christmas and New Year's holidays are the cause. Be careful about inferences.
And be careful what you wish for. If you're hoping for particular outcomes, you may view data with rose-colored glasses and reach inappropriate conclusions. If a particular set of items is never viewed on an endless aisle kiosk, is it because those items are not popular? Or is it possible that the kiosk design makes it hard to discover or understand that category of items, so no one decides to view them?
These are healthy concerns and your attention will ensure eyes-open use of analytics. The result won't just be a more successful project, it will also be the experience and confidence to tackle more complex and ambitious projects in the future. Good analytics means good decisions. Who doesn't want that?
This article was written by Geoffrey Bessin from Digital Signage Today and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to email@example.com.