EPISODE 35 – TOPIC SUMMARY AND GUEST:
Please join me in welcoming back our most frequent guest, Tamer Ali, Co-Founder and Director of Authentic Learning Labs. Tamer is a learning systems entrepreneur who is always ahead of the extended enterprise curve. Most notably, about seven years ago he founded Crowd Wisdom, one of the first pure extended enterprise learning management systems. Now he’s back with Authentic Analytics, an advanced reporting platform for training professionals. So today we’re talking about how innovative measurement methods are shaping extended enterprise learning.
- Advanced reporting is more than just a data dump. It relies on visualization and interpretation techniques that make data meaningful.
- Analytics tools aren’t new. However, for a variety of reasons, learning leaders have been slow to adopt advanced reporting as a practice.
- If you sell instructional content, your data can tell you how your business can do better. And finding those answers in your data is likely to be easier than you think.
Welcome back, Tamer! Let’s start with your journey. What led you from developing the Crowd Wisdom learning platform to your mission now with advanced reporting at Authentic Learning Labs?
When we developed Crowd Wisdom, many organizations were trying to make academic or corporate or open source platforms like Moodle work for extended enterprise learning. Although those tools are good, they weren’t designed to serve disparate extended enterprise audiences.
So we tried to solve that problem. And over the years, as customers adopted our platform, another issue became obvious. Many organizations were ignoring measurement.
Across the board, we saw (and continue to see) that organizations are collecting lots of data from their platforms. But they’re just not using that data.
Hmm. Why do you think that’s the case?
There’s a lot of talk. Our industry tends to jump on jargon bandwagons and talk about concepts like xAPI and data warehouses and business intelligence. It’s all with good intent. Even so, there’s a very clear lack of business insights being pulled from data – especially in the learning sphere.
So, how are you addressing this?
We founded Authentic Labs several years ago to deal with some of these measurement gaps in the marketplace.
For example, reporting was an immediate need. Plenty of reports existed, but useful business insights weren’t part of reporting capabilities.
It’s not because there’s a lack of data. There’s plenty of data being generated. And data warehousing and storage are now a trivial cost. But the value is in what you do with that data. This is where advanced reporting tools and techniques can make a big difference.
Interesting. I have a theory about that.
Many learning systems are still employee-focused. They try to address external audiences through “audience management” functionality. But their analytics are still very employee-focused…
Built-in reporting capabilities really aren’t business-oriented. And with so many employee-focused learning platforms claiming to serve extended enterprise needs, it’s hard to even know what extended enterprise analytics is. Does that seem valid?
Very much so, John. It’s like the early days of the elearning movement, with the term “page-turner.” That became a popular way to label bad content – when people simply converted printed books directly into digital format for web delivery. But this resulted in static, one-dimensional content.
It seems like the concept of learning reporting has taken a similar route, by being relegated to a sort of dispensary checkbox model. Did John Leh take training? Yes. Okay, mark the checkbox.
Too many learning reports are built on this kind of checkbox data. But this reporting approach hides a lot of very important factors involving deeper learner behaviors and relevant business impact. These things aren’t visible in checkbox reports. We want to help capture these more strategic insights and bring them to light.
In learning reports, much of what we call business intelligence is low on intelligence. It really focuses on activity. It’s about answering questions like “Hey, did this person complete this course?” But that’s not very useful, so we’ve moved beyond questions like that.
Interesting. “Page-turner” once meant something very positive, but then it became very negative…
That’s true! At one point “page-turner” was a good thing. And in the case of learning reports, the promise of business intelligence is also a good thing. But reporting has become synonymous with data dumps. And that kind of reporting doesn’t help a training organization scale or solve specific problems.
What problems are organizations trying to solve?
Over the years, we’ve seen more organizations buying analytics software. Not just ours, but Tableau or Domo or Microsoft Power BI. That’s good for adoption. We’ve also seen more organizations hiring dedicated business analysts, which is also a good sign. And organizations with more resources are now investing in data warehouse initiatives.
So learning decision-makers mention these three things more often now. In fact, it’s almost like name dropping. “Oh yeah, we have a data warehouse team.”
But even with these resources, few organizations can answer their most pressing questions. And that’s where we tell people to begin.
It’s important to start with your business questions. What business are you in? What strategic challenges are you facing? What issues are central to your success?
We usually work with publishers or educators in a particular field, or with entities that are trying to train their channel or their partners. So, we focus on how they can align education data to support strategic business insights. It’s just that simple.
Once key business issues are defined, what next?
Then we help determine how to address those needs with appropriate reports and visualizations. But it’s not about reporting, per se. This may seem a little dismissive, but we’re not in the reports business. We’re actually in the story-making business.
We want to convey stories that support a specific business hypothesis or objective. So it’s not just about producing cool visual reports and dashboards – even though we think aesthetics are important. But the end goal is to provide meaningful insights that help organizations solve specific business challenges.
So, without necessarily naming customers, can you share some examples?
One of the simplest examples is an inventory analysis for a course product line. In the past, content usage and completion was monitored in weekly or quarterly intervals. But when we started looking at aggregate data over longer timeframes, we noticed that a particular course series wasn’t engaging anyone. However, it was costing a lot to produce and translate into multiple languages. So the very quick solution was to discontinue that line of courses.
Sounds simple enough…
Another one was a new business that needed data to support executive management decisions. We created a custom dashboard for the C-level team and the board, so critical data and supporting reports were always available whenever they met.
Within a minute, these executives could view the latest key metrics. Then they could scan through either an email or a secure online report to see all the underlying data and related indicators.
One thing we emphasize is that indicators and alerts are just as important as the data itself. In other words, we don’t assume that a pie chart or bar chart, alone, can tell the whole story. Wherever it’s helpful, we add indicators, alerts and insights of our own on top of the data.
Good point. So when data reveals that a trend is going south, you can flag it as a forward indicator?
Exactly. We can highlight anomalies that are outside of a pattern. It might seem out of the norm if we’re looking at a short-term reporting period. But over a longer timeframe, it could be part of a pattern.
So when we see something that seems unusual, we’ll pan out and look at the data in a larger scope. It might be a one-off outlier. Or it could be part of a consistent but infrequent sequence that isn’t visible in your regular reporting view.
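That “pan out and look at a larger scope” step can be illustrated with a small sketch. This is not Authentic Analytics’ actual method; the weekly completion counts, the deviation threshold, and the even-spacing heuristic are all assumptions made up for illustration. The idea: flag weeks that deviate sharply from the mean, then check whether the flags land at regular intervals, which would suggest a recurring but infrequent pattern rather than a one-off outlier.

```python
from statistics import mean, stdev

# Hypothetical weekly completion counts for one course over 16 weeks.
weekly = [52, 48, 55, 50, 9, 51, 49, 53, 47, 10, 50, 54, 48, 52, 8, 51]

def flag_outliers(series, z=1.5):
    """Return indices whose value deviates more than z standard deviations from the mean."""
    m, s = mean(series), stdev(series)
    return [i for i, v in enumerate(series) if abs(v - m) > z * s]

outliers = flag_outliers(weekly)           # the three dip weeks: [4, 9, 14]

# A single flagged week looks like a one-off outlier. Evenly spaced flags
# suggest a recurring pattern (e.g. a dip every fifth week) that a short
# reporting window would never reveal.
gaps = [b - a for a, b in zip(outliers, outliers[1:])]
recurring = len(outliers) >= 3 and len(set(gaps)) == 1
```

Here the dips recur every five weeks, so `recurring` comes out true; a quarterly report that happened to span only normal weeks would miss the pattern entirely.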
Or say you’re looking at assessment scores and course completion reports and you realize that some people are abusing a reward structure that a partner training system was offering to drive course completions. We found this for one of our customers.
English-speaking participants were misusing the ability to take courses in different languages, not just English. They were finding the same course in other languages and completing them so they could receive rewards they didn’t really deserve. We were able to identify and prevent future abuse, and our customer minimized wasted funds.
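A pattern like that becomes easy to surface once you group completions by learner and course while ignoring the language dimension. Here’s a minimal sketch; the record layout, IDs, and course names are hypothetical, and it assumes translated editions share a course ID.

```python
from collections import defaultdict

# Hypothetical completion records: (learner_id, course_id, language).
completions = [
    ("u1", "safety-101", "en"),
    ("u1", "safety-101", "es"),
    ("u1", "safety-101", "fr"),
    ("u2", "safety-101", "en"),
    ("u2", "ethics-200", "en"),
]

def flag_cross_language_repeats(records):
    """Flag (learner, course) pairs completed in two or more languages."""
    langs = defaultdict(set)
    for learner, course, lang in records:
        langs[(learner, course)].add(lang)
    return {key: sorted(ls) for key, ls in langs.items() if len(ls) > 1}

# u1 completed the same course in three languages -- a likely reward-abuse signal.
suspicious = flag_cross_language_repeats(completions)
```

The same grouping trick generalizes to any reward scheme where “distinct completions” can be gamed through near-duplicate catalog entries.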
Excellent. So to summarize, it’s really easy for people to become overwhelmed when trying to tease out specific intelligence from aggregate data over a long timeframe. Is that right?
Exactly. It’s one thing to look at aggregate data because you’re in a hurry to obtain accreditation or prove performance to internal stakeholders for a specific business initiative. It’s another to make that kind of analysis an ongoing practice.
And organizations with a lot of resources may have three legs of a strong foundation for advanced measurement:
- Sophisticated analytics software
- A data warehouse or data lakes
- A person or team that understands data analytics and business intelligence
But often, there’s no consistent practice of evaluating data through a learning lens, because these initiatives are usually organization-wide. So the organization invests in central resources and then lends them to the learning team when needed. But because these engagements can be sporadic, insights can fall through the cracks.
As a consultant, I help organizations define their LMS requirements. And increasingly, clients say they want to tie in with a data lake. Or they want to push all their LMS data to a central analytics platform like Domo or Tableau…
Here’s my standard follow-up question for them: “Okay. But what are you going to do with that data? What will this help you do that you’re not doing right now with the LMS?” So far, no one has had a good answer for that.
Yes, exactly. We have the same challenge with people who want an LRS. They’ll say, “We have an LRS and we’d love to use it more.” So next we’ll ask if they’re using xAPI.
Nope. They aren’t actually doing anything with xAPI, but they want to know if we can work with it, anyway.
Well, the answer is yes. xAPI is a great concept and we support it fully. But what problems do they want to solve with it? If they’re just tinkering, fine. That’s okay. But it’s not a material business challenge.
I hear ya…
On the other hand, what if someone is setting up a new Tesla manufacturing facility in South America, and they need to train 5000 operations people rapidly on the intricacies of moving from a gas engine to an electric vehicle? There’s a business initiative that could make great use of xAPI.
Knowing the difference involves business maturity and adoption of analytics as a process. And I think it’s a challenge for learning practitioners, in general.
We need to stop thinking about soft scores and start thinking about quantitative impact. We have to think about putting science behind our profession, which will then elevate our strategic value and our contribution to corporate strategy.
Is the low-hanging fruit in the advanced analysis of pure LMS data, since it’s already there and it doesn’t need to be combined with other business data? Or is the low-hanging fruit when you combine LMS data with data from a CRM or other systems to create insights?
Great question! The cheap wins are definitely the lower-hanging fruit. And it’s very easy, right? Any organization can produce checkbox-type charts that show completions or grades for courses that someone says they hated.
But we think our customers want to do more. And that’s how we provide clear value and differentiation. We help them look at complementary data from other sources that can actually elevate the whole conversation around learning.
Say we take CRM data from Salesforce or SugarCRM, and then bring in data from an association AMS. Then we complement that with demographic data that’s readily available from Google Analytics. We can also include data from Workday or another human capital management system. Analyzing this combination of data can create a meaningful story that your senior executives and everybody else will appreciate.
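Mechanically, that combination is just a join across systems on a shared key. A toy sketch in Python, with made-up record layouts standing in for real LMS and CRM exports (the field names, email key, and renewal flag are all assumptions, not any vendor’s actual schema):

```python
# Hypothetical rows exported from an LMS and a CRM, keyed by email address.
lms_rows = [
    {"email": "ana@example.com", "courses_completed": 7, "avg_score": 91},
    {"email": "bo@example.com", "courses_completed": 1, "avg_score": 55},
]
crm_rows = [
    {"email": "ana@example.com", "account": "Acme", "annual_renewal": True},
    {"email": "bo@example.com", "account": "Globex", "annual_renewal": False},
]

# Index the CRM side, then join on email (an inner join in miniature).
crm_by_email = {row["email"]: row for row in crm_rows}
joined = [
    {**lms, **crm_by_email[lms["email"]]}
    for lms in lms_rows
    if lms["email"] in crm_by_email
]

# With learning and business data side by side, you can start asking questions
# like: do accounts whose learners complete more training renew more often?
renewal_by_engagement = {
    row["account"]: (row["courses_completed"], row["annual_renewal"])
    for row in joined
}
```

In practice a BI tool or data warehouse does this join at scale, but the principle is the same: once learning records and business records share a key, learning activity can be read against business outcomes.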
That’s strategic. It’s not trying to apply a one-dimensional report card to the concept of business impact. These are two entirely different things, right?
Don’t get me wrong. A report card is good in the right situation. If you’re a parent like me, you appreciate a report card and you use it to compare your kids to everyone else’s.
But can a report card keep pace with the comparisons that drive strategic business value? Does it address corporate strategy? Can a report card tell your association if you’re investing the right resources in continuing education programs? Can it tell your training company if you’re focused on the right product lines for the education market and whether your support costs are decreasing as a result?
The answer is no. A simple report card isn’t enough. But a more robust analysis is. And we’ve seen it directly affect the volume of a customer’s Zendesk trouble tickets.
I bet organizations usually have a long list of challenges that you could help them address. How do you help them prioritize…?