Some Assembly Required
The Third Route to Embedded BI
Dec 10, 2019
All roads to embedded business intelligence (BI) software begin with the same question: should we build our own analytics solution, or buy something we can plug into our SaaS product and bring to market right away? It’s the classic build-or-buy conundrum, and the vast majority of companies end up buying because it’s almost always more expedient and cost-efficient than building. But starting at this build-or-buy crossroads, I think, popularizes the notion of enterprise-grade embedded BI as a turnkey, plug-and-play solution, and that’s not the best way to regard it.
Embedding BI into your enterprise software application is like remodeling your kitchen. Presented with the build-or-buy question, most of us default to “buy” because we’re not carpenters. So we start looking through Lowe’s and IKEA catalogs for suitable cabinets. Can you imagine anyone tearing out a glossy cabinet ad with the intention of installing that exact configuration in their kitchen?
Embedded software, like cabinetry, should be modular so that it can be arranged to fit the space it’s in. Cabinets come in different colors so they can coordinate with the walls and the flooring. Handles and knobs are sold separately so you can settle on the ones that work best for you. There are dozens of customizations and extensions to consider: Do you want doors to open on the right or the left? Frosted glass or clear? Gap or no gap between the cabinets and the ceiling? What about countertops?
We should be putting at least as much effort into integrating our embedded BI solutions as we put into remodeling our kitchens. What software-as-a-service (SaaS) vendors and other embedded BI hopefuls think of as the build-or-buy decision is really more like a build-or-assemble decision. It may sound like I’m arguing semantics, but the difference between a cookie-cutter implementation and a thoughtful one is stark, as stark as the difference between a kitchen laid out for the people who cook in it and one assembled straight from the catalog page. One works. The other makes you want to cry.
Brian O’Neill, founder of Designing for Analytics, has done product design and consulting for companies including DELL/EMC, Tripadvisor, Fidelity, JP Morgan Chase, and numerous SaaS providers for over 20 years. A champion of human-centered design, O’Neill confirms that implementers of embedded BI tend to overlook the impact of integration on user experience.
“Administrators of the host application have to properly customize the embedded BI solution in order for it to begin generating customer value,” he asserts. “If they do not integrate it in such a way that the analytics are providing valuable decision support to the users, then it doesn't matter what tool they use. Customers may simply stop using the BI application, given the amount of tool time that is required to extract any value.”
There’s a direct correlation between how much planning you invest in your integration project and how “native” the embedded solution feels to users. You may be able to get a minimum viable product (MVP) to market in a week, but is that accelerated timeline worth the risk of a poor first impression?
Those just beginning to explore the BI space may be unfamiliar with what an embedded integration might entail, so here’s a list of domains likely to factor into a deployment strategy:
- Data preparation: Connecting to data sources and ensuring that the information is ready for reporting. This might include ETL, data warehousing, and/or building out data management practices.
- API integration: Forging a seamless connection between the host application and the embedded BI application, which could include breaking the latter into parts and exposing those parts in different areas of the host.
- Configuration: Adjusting administrative settings to best meet your users’ needs. This could include user tenanting, security, data aliasing, and performance controls.
- Extension: Adding custom code to the embedded solution so that it’s even more uniquely suited to the host environment.
- Visual integration: Adjusting colors, copy, and images so that they feel native to the host application.
- Localization: Preparing the BI application to accommodate international users with local currency, language, and time zone conversions.
- Report library: Equipping users with a variety of ready-made or “canned” reports they can either run or edit to their specifications.
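To make the API integration and configuration items above a little more concrete, here is a minimal sketch of one common embedded BI pattern: the host application builds a signed, short-lived embed URL that scopes the analytics view to a single tenant and user. The endpoint, parameter names, and signing scheme below are all hypothetical; real embedded BI products each define their own.

```python
import hashlib
import hmac
import json
import time
from base64 import urlsafe_b64encode
from urllib.parse import urlencode

# Hypothetical values: every embedded BI vendor defines its own
# embed endpoint, claim names, and signing scheme.
EMBED_BASE_URL = "https://bi.example.com/embed/dashboard/42"
EMBED_SECRET = b"shared-secret-from-the-bi-vendor"

def signed_embed_url(tenant_id: str, user_id: str, ttl_seconds: int = 300) -> str:
    """Build a short-lived, tenant-scoped URL for an embedded dashboard.

    Signing the claims keeps users from editing the query string to see
    another tenant's data (the "user tenanting" concern above), and the
    expiry limits how long a leaked URL stays useful.
    """
    claims = {
        "tenant": tenant_id,                    # row-level security scope
        "user": user_id,                        # per-user personalization
        "exp": int(time.time()) + ttl_seconds,  # expiry timestamp
    }
    payload = urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode()).decode()
    signature = hmac.new(EMBED_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{EMBED_BASE_URL}?{urlencode({'claims': payload, 'sig': signature})}"

# The host application would then drop this URL into an iframe, e.g.:
# <iframe src="{{ signed_embed_url(current_tenant, current_user) }}"></iframe>
```

However a given product implements it, the point is the same: the “seamless connection” in the API integration bullet is host-side work, and decisions like token lifetime and tenant scoping belong to your integration plan, not to the vendor’s defaults.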
The amount of time you allocate to each of the above components depends largely on your data preparedness, business requirements, and resources. Research suggests, however, that companies should avoid spending too long on their first deployment. In its Buyers Guide for BI Software, the Business Application Research Center (BARC) draws on an ongoing survey of BI implementations around the world and finds that “project implementation times appear to have a direct impact on the level of all business benefits.” Implementation times between zero and three months appear to yield the best business outcomes; implementations requiring six months or more suffer the most. In light of this, BARC recommends aiming for a “three-month implementation window for the first application” and, to facilitate this goal, taking an “incremental approach by breaking the project into a series of smaller projects.”
In short, do plan your embedded BI integration the way you would plan your kitchen: with care and in stages so that you’re not stuck eating microwave meals for months on end. Put in the work of framing your requirements during the product evaluation stage so that you use your resources efficiently when it comes time to prepare for your initial launch. Implementing a refined embedded BI solution, even if you approach it like the project it is, is still incredibly fast and efficient compared to the alternative of building a solution from scratch. No miter saws, no DIY videos, just an analytics engine waiting for your team to tell it where to go.
Originally published with Dataversity.