by vincentbarr on 8/25/16, 12:50 AM with 27 comments
by calinet6 on 8/25/16, 1:25 AM
However, let's apply the Pareto principle: 80% of the needs are served by 20% of the effort. I think for most (let's say 80%) of what most (about 80%) of companies really need, a computer can get at least most of the way there, with much less time and effort than a human.
For example, sure, you need a human to really understand your web analytics. However, to understand it at a basic level, pull out clusters of interesting patterns, and discover anomalies and outliers that you wouldn't have gone looking for otherwise, the computer can get the average intelligent person (not an analyst) most of the way there, and certainly can provide a great deal of value for generally much less cost than a human analyst on salary.
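The kind of automated anomaly-spotting described above can be surprisingly simple. Here is a minimal sketch using a plain z-score test on daily pageview counts; the data and the threshold are hypothetical, and real tools use far more sophisticated models, but the idea is the same:

```python
# Minimal sketch: flag days whose traffic deviates sharply from the norm.
# Data and threshold are made up for illustration.
from statistics import mean, stdev

def find_anomalies(series, threshold=3.0):
    """Return indices whose value lies more than `threshold`
    standard deviations from the mean (a plain z-score test)."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series)
            if sigma > 0 and abs(x - mu) / sigma > threshold]

# 13 ordinary days plus one traffic spike nobody went looking for
pageviews = [1020, 980, 1005, 990, 1010, 1000, 995,
             1015, 985, 1008, 992, 1003, 998, 4200]
print(find_anomalies(pageviews))  # [13] — the spike on the last day
```

A non-analyst can act on "day 14 looks unusual" without ever writing the query themselves; the analyst only gets pulled in when the anomaly needs explaining.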
Plus, this frees analysts up for work that truly needs their expertise. As many data folks will tell you, it'd be great if everything they did used their full skill set; but often they spend half the day pulling reports that a computer truly could have produced for the requester without any intervention, if designed correctly (not easy, mind).
So, it's not about computers replacing humans entirely. It's about reducing waste, finding ways to cover common cases and repeated work easily, and freeing up human minds for what they're really good at and needed for.
by bmh100 on 8/25/16, 3:51 AM
by tuna-piano on 8/25/16, 2:14 AM
1. Static reports that were previously delivered weekly are replaced with daily-updated interactive tools in Tableau (or similar), which let business users drill down and filter.
2. Reports that would previously have required custom SQL are instead built by a business user (or a Tableau power user) - with much quicker turnaround and better results.
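The second point boils down to interactive aggregation: the kind of one-off question ("revenue by region — now just for March") that used to mean filing a ticket for a custom SQL report. A rough stdlib sketch of that drill-down-and-filter step, with made-up names and figures:

```python
# Hypothetical data; the column names and numbers are illustrative only.
from collections import defaultdict

rows = [
    {"region": "East", "month": "Feb", "revenue": 120},
    {"region": "East", "month": "Mar", "revenue": 150},
    {"region": "West", "month": "Feb", "revenue": 90},
    {"region": "West", "month": "Mar", "revenue": 200},
]

def revenue_by_region(rows, month=None):
    """Sum revenue per region, optionally filtered to one month --
    the 'drill down and filter' a self-service tool makes interactive."""
    totals = defaultdict(int)
    for r in rows:
        if month is None or r["month"] == month:
            totals[r["region"]] += r["revenue"]
    return dict(totals)

print(revenue_by_region(rows))         # {'East': 270, 'West': 290}
print(revenue_by_region(rows, "Mar"))  # {'East': 150, 'West': 200}
```

In a self-service tool the follow-up filter is one click instead of a second ticket on someone else's backlog.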
And of course, Excel, the most widely used self-service BI tool, is used successfully every day by almost every business in the world. Its use has its problems, but overall it's extremely valuable... and it's definitely self-service and definitely BI.
by dworin on 8/25/16, 2:51 AM
Domain expertise is hugely important for making sense of data. Self-service lets domain experts look at data themselves, quickly. They may have to learn some data-sensemaking skills, but a data expert would have to learn the specific domain instead (often much harder).
I'm noticing that more and more people in a variety of fields have at least a passable understanding of how to make sense of data. For quick questions, self-service access to data makes the process much faster with little risk.
I've been in organizations that tried to put data behind gatekeepers who would protect users from making mistakes. In those cases, we made a lot more mistakes because not enough analysis was done, or people didn't have access to data.
I've been in other organizations where we let everyone look at the data. Sure, some people made mistakes, but we used that as an opportunity to teach.
If I had to bet on which type of firm would win, I'd bet on the latter. I'm deeply skeptical of the promises made by BI vendors, but self-service analytics isn't one of them.
by nwenzel on 8/25/16, 2:20 AM
And yes, software makers exaggerate in their advertising.
Compared to the author's other posts, this one seems hurried and maybe a little grumpy. But it still makes an important point.
by fiatjaf on 8/25/16, 2:39 AM
by sgt101 on 8/25/16, 8:39 AM
by kfk on 8/25/16, 10:38 AM
by gp05zmzpjl on 8/25/16, 5:35 AM
Yes, there are pitfalls if you give people in ~B access to the BI tool. They could make bad decisions based on the data. On the other hand, they are already making bad decisions without the data. In a world where |B| is limited, it can be hard to make the right tradeoffs. You might choose to let people in ~B use the tool but have a policy against sharing the results without a once-over from someone in B.
by akerfonta on 8/25/16, 4:40 AM
To me, what is wrongly called self-service analytics is often really self-service reporting. You are consuming the end result, not the analysis itself.
by dgudkov on 8/25/16, 3:04 AM
by skybrian on 8/25/16, 2:53 AM
by ommunist on 8/25/16, 5:55 AM
by buckbova on 8/25/16, 2:39 AM
by newjersey on 8/25/16, 1:08 AM
by baconner on 8/25/16, 4:34 AM
Self-service BI (SSBI) tools are not intended to take the place of analytic skill any more than they take the place of domain expertise, and that has never been the meaning of the term. The promise of SSBI is to reduce the incredible friction domain experts traditionally had to deal with to get their key business questions answered. Yes, your users need to develop other analytic skills to go along with their domain expertise! Turns out most of us have stronger and weaker points and have to learn and evolve our skills to get our jobs done well. Taking on SSBI means exactly that for your users, who most likely have at least one of the key skills (domain expertise) already, and maybe more.
Using an exploratory SSBI tool is a conversation with your data. One question leads to another leads to another, and if the alternative is having to stop and ask another department to put each follow-up question on their backlog, the conversation is basically broken - business users often just stop asking and revert to pure gut-feel decision making. I think most of the progress made over the past 20 years in BI has been about making this kind of process more agile, in the same sense as iterative development. SSBI is part of that. The inversion of the analytic process in big data systems is part of that as well. ML can also play a role, in the right circumstances.
What we can do, as BI vendors, is build tools and documentation that guide users who start with only part of the skills they need into learning the rest while using the tools we provide. We can present defaults that guide the user towards visualizations and views that are easier to interpret. We can embed analytic skill building into our applications via tutorials and hints. We can build metadata up as users inform us about the data while they use it, rather than requiring them to do it up front in a big-bang DW modeling session. We can inspect the data with simple heuristics to hint to the user how to use it with the tool, or to apply better defaults. We can build better cleansing, munging, and data consolidation tools. We can build tools that let data analyst teams turn things around more quickly, too. And yes, perhaps we can try using machine learning to suggest possible avenues for the user to explore, which _of course_ must be interpreted by a user with the right skills, because ML is going to be wrong a lot of the time and users need to understand false positives. It's all part of the process.
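The "inspect the data with simple heuristics" idea can be sketched concretely: guess a column's role from its values so the tool can pick a sensible default visualization. The categories and rules below are illustrative, not any vendor's actual implementation:

```python
# Crude column-role inference a tool might use to choose chart defaults.
# Rules are deliberately simplistic and purely illustrative.
def guess_column_role(values):
    """Return 'measure', 'date', or 'dimension' based on crude checks."""
    if all(isinstance(v, (int, float)) for v in values):
        return "measure"                        # numeric -> aggregate it
    if all(isinstance(v, str) and v[:4].isdigit() and "-" in v
           for v in values):
        return "date"                           # looks like ISO-style dates
    return "dimension"                          # fall back to a grouping key

print(guess_column_role([3, 1, 4, 1, 5]))               # 'measure'
print(guess_column_role(["2016-08-25", "2016-08-26"]))  # 'date'
print(guess_column_role(["East", "West"]))              # 'dimension'
```

A measure defaults to aggregation, a date to a time axis, a dimension to a grouping - small nudges like this are exactly how a tool builds skill into its defaults rather than demanding it up front.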
Bailing out on SSBI because of marketers being marketers just isn't pragmatic. Better that we keep evolving our products from both ends: the data science team's and the business user's.