The world is full of paradoxes. Here’s one for data analytics professionals: analytics becomes pervasive when it disappears.
For decades, business intelligence (BI) and analytics tools have failed to penetrate more than 25% of an organization. And within that 25%, most workers use the tools only once or twice a week. Embedded analytics changes the equation. By inserting charts, dashboards, and entire authoring and administrative environments inside other applications, embedded analytics dramatically increases BI adoption. The catch is that most business users don’t know they’re “using BI”—it’s just part of the application they already use. The best BI tools are invisible.
By placing analytics at the point of need—inside operational or custom applications—embedded analytics closes the last mile of BI. Workers can see the impact of past actions and know how to respond to current issues without switching applications or context. Analytics becomes an indispensable part of the way they manage core processes and solve problems. As a result, embedded analytics has a much higher rate of adoption than traditional BI or analytics.
Embedded analytics makes existing applications more valuable for both software vendors and enterprises. Independent software vendors (ISVs) say that embedded analytics increases the value of their applications and enables them to charge more for them. Enterprise organizations embed analytics into operational applications, such as Salesforce.com, and intranet portals, such as SharePoint. In both cases, embedded analytics puts data and insights at users’ fingertips when they need them most—both to gain insights and to take action.
ISV requirements. In the embedded world, ISVs have more stringent requirements than traditional organizations. (See “Twelve Evaluation Criteria” below.) ISVs must ensure an embedded product looks and feels like their host application, and thus require greater levels of customization and extensibility.
Also, cloud ISVs require embedded products that work in multi-tenant environments, with seamless user administration and custom deployments. Many ISVs offer tiered pricing, which requires embedded products with flexible user provisioning. Finally, because ISVs can’t always estimate how many customers will purchase the analytics, they need flexible and affordable pricing models.
Enterprise requirements. Traditional enterprises have fewer requirements than ISVs, but that is changing. More companies are pursuing digital strategies that require customer-facing Web applications. And although most don’t charge for analytics, as ISVs do, many view data analytics as a key part of the online customer experience. For example, mutual funds now provide customers with interactive dashboards where they can slice and dice their portfolios and take actions such as buying and selling funds. Thus, their requirements for customization, extensibility, multi-tenancy, and security have grown significantly in recent years.
Once organizations decide to embed analytics, they need to make a few key decisions. The first is whether to build their own analytics or buy a commercial off-the-shelf tool.
Build. Organizations with internal developers are always tempted to build their own analytics. But unless the analytics are simple and users won’t request changes, it’s almost always smarter to buy from a commercial vendor. Commercial analytics products deliver best-of-breed functionality that would take in-house developers years to replicate—time better spent building the host application.
Buy. Many analytics vendors have made their tools more configurable, customizable, and easier to integrate with host applications. Most see embedded analytics as a big market and aim to make their tools blend seamlessly with third-party applications. They also make it easy to customize the tool without coding, including changing the graphical user interface (GUI) or the ways users navigate through the tool or interact with its components. When extensive customization or integration is required, customers can use application programming interfaces (APIs) to fine-tune the tool’s look and feel and to create new data connectors, charts, event actions, and export types.
The second decision is to figure out the architecture for embedding analytics. From our research, we’ve
discovered three primary approaches. (See figure 1.)
1. Detached analytics. This is a lightweight form of embedding where the two applications—host and analytics—run separately but are tightly linked via URLs. This approach works well when multiple applications use the same analytics environment, portal, or service. A common example is Google Analytics, a commercial service that multiple groups inside an organization might use to track Web traffic on various internal websites. There is no commonality between the host application and analytics tool, except for a shared URL and shared data. The two applications might also share a common authentication mechanism to facilitate single sign-on (SSO). This approach is rather uncommon these days.
2. Inline analytics. With inline analytics, output from an analytics tool is embedded into a host application—it looks, feels, and acts like the host but runs as a separate element, tab, or module within it. For example, a newspaper might embed a chart within the text of an article on a webpage. Or an ERP application might present users with a dashboard upon logging in that displays summary activity from each module in the application. Or there might be a separate tab where customers can view analytics about their activity within the application. In most
cases, the embedded components sit within an iFrame, which is an HTML container that runs inside a webpage. iFrames were once the predominant method for embedding analytics content but are disappearing due to security and other concerns. (See next section.)
3. Fused analytics. Fused analytics delivers the tightest level of integration with a host application. Here, the analytics (e.g., a chart, table, or graphical component) sit side by side with the host application components and communicate bi-directionally with them. This creates a “fused” or integrated environment in which end users aren’t aware that a third-party tool is part of the experience.
For example, an inventory manager can view inventory levels in various warehouses and place replenishment orders without leaving the screen. Or a retail store manager can view daily demand forecasts and then click a button to create the shift schedule for the following week. Fused analytics is facilitated by JavaScript libraries that control front-end displays and REST API calls that activate server functions. Analytics tools with extensive API libraries and programming frameworks make all their functionality available within a host application, including collaboration capabilities, alerts, reporting, and augmented analytics features.
Most analytics vendors can support inline analytics without much difficulty. They simply provide “embed code”—a snippet of HTML and JavaScript—that administrators can insert into the HTML code of a webpage. The embed code calls the analytics application to display specified content within an iFrame on the webpage. People who use YouTube and other social media services are familiar with embed code.
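The embed-code pattern can be sketched as follows. The vendor host, dashboard ID, and signed token below are hypothetical placeholders; each product defines its own embed URL format and authentication scheme.

```javascript
// Sketch of a typical analytics embed snippet, built programmatically.
// The URL structure and "token" query parameter are illustrative assumptions,
// not any specific vendor's API.
function buildEmbedSnippet({ host, dashboardId, token, width = "100%", height = "600" }) {
  const src = `https://${host}/embed/dashboards/${dashboardId}?token=${encodeURIComponent(token)}`;
  // The returned string is what an administrator would paste into a webpage.
  return `<iframe src="${src}" width="${width}" height="${height}" frameborder="0"></iframe>`;
}

const snippet = buildEmbedSnippet({
  host: "analytics.example.com",
  dashboardId: "sales-overview",
  token: "signed-sso-token",
});
console.log(snippet);
```

In practice the token is typically signed server-side so the iFrame can authenticate the user without a separate login.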
iFrames are a quick and easy way to embed third-party content, and most analytics vendors support them.
But iFrames have disadvantages. Because they are frames or windows inside a webpage that are controlled by an external application or service, they pose considerable security risks. Also, they operate independently of the host webpage or application—the host can’t manipulate what’s inside the iFrame, and vice versa. For example, hitting the back button doesn’t change what’s inside the iFrame.
Furthermore, iFrames behave differently depending on the browser, making them difficult to manage. Consequently, a growing number of organizations refuse to allow iFrames, and the software industry is moving away from them. Fused analytics, by contrast, requires tight integration between the analytics tool and the host application.
Fused analytics also requires a much greater degree of customization, extensibility, flexibility, and integration than many analytics vendors support out of the box. To simplify fused analytics, many BI vendors have wrapped their APIs in programming frameworks and command line interfaces (CLIs) that make it easy for programmers to activate all functions in the analytics tool. These JavaScript frameworks and CLIs have been a boon to analytics embedding. Nonetheless, companies that want to fuse analytics into an application need to look under the covers of an analytics tool to discover its true embeddability. (See “Select a Product” below.)
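To make the fused pattern concrete, here is a minimal sketch of bi-directional host-to-analytics communication using a tiny publish/subscribe bridge. The event names and payload shapes are illustrative assumptions, not any vendor's actual API; commercial JavaScript frameworks expose equivalent hooks.

```javascript
// Minimal event bridge: the host and the embedded analytics component
// subscribe to each other's events instead of living in isolated iFrames.
class Bridge {
  constructor() { this.handlers = {}; }
  on(event, fn) { (this.handlers[event] ??= []).push(fn); }
  emit(event, payload) { (this.handlers[event] ?? []).forEach((fn) => fn(payload)); }
}

const bridge = new Bridge();
const chartState = { region: "all", selection: null };

// The embedded chart re-queries when the host changes a filter...
bridge.on("filters:changed", ({ region }) => { chartState.region = region; });
// ...and the host reacts when a user clicks a data point in the chart.
bridge.on("chart:pointSelected", (point) => { chartState.selection = point; });

// Simulated interaction: the host narrows the region, the user clicks a point.
bridge.emit("filters:changed", { region: "west" });
bridge.emit("chart:pointSelected", { sku: "A-100", units: 42 });
```

The same pattern underlies real embedding frameworks, where the bridge is the vendor's JavaScript SDK and the emitted events trigger REST calls to the analytics server.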
Another major decision is selecting a commercial analytics product to embed. Selecting a product that lacks a key feature you need, such as alerts, PDF export, or printing, can undermine adoption and imperil your project. Or maybe the product doesn’t work seamlessly in a multi-tenant environment, making administration cumbersome and time-consuming and contributing to errors that undermine customer satisfaction. Or your deployment drags out by weeks or months because most customizations require custom coding.
This report is designed to help you avoid these and other pitfalls of embedded analytics. Whether you are an independent software vendor (ISV) that needs to know how to embed analytics in a commercial, multi-tenant cloud application or the vice president of application development at a major corporation who wants to enhance a homegrown application, this report will provide guidance to help you ensure a successful project.
The report outlines a four-part methodology.
The report’s appendix drills into the evaluation criteria in depth, providing questions that you should ask
prospective vendors to ensure their products will meet your needs.
Many teams start their embedded analytics project by selecting an analytics product to deploy. Although choosing the vendor to power your analytics is a critical step, it shouldn’t be the first milestone you tackle. Instead, start by considering these questions: What are you trying to build, for whom, and why? These essential questions will help you better understand your product goals and will aid in selecting the
best tool to achieve your goals.
We recommend a six-step model to ensure that your analytics not only succeed technically but also achieve your business goals.
Setting the goals for your analytics project is an essential first step to ensure that all key stakeholders—from the executive team to the end users—are fully satisfied upon project completion.
There are three basic steps to planning for a successful analytics project: define table stakes, define delighters, and define what’s out of scope.
The composition of the project team can be a key element in the success or failure of an analytics project. For example, more than a few projects have been derailed at the last minute when the legal team—not included in the process—surfaced major issues with the plan. When structuring your analytics product team, consider including the following roles in your “core team” for the project:
> Product owner/head of product
> CTO
> Head of development/engineering
> Head of sales
> Head of marketing
> Head of operations and support
Next, identify roles that, although not involved in daily decisions, will need to be consulted along the
way, including finance, billing, legal, and sales enablement/training.
A best practice is to get the key project stakeholders in a single room to discuss core elements in a facilitated session. Although it might be necessary to have some participants attend remotely, in-person attendance makes for a faster and more effective session.
Too often project teams—whether analytics-focused or otherwise—fail to create a plan to guide the key steps required to bring embedded analytics to market. Without a plan, teams are liable to spend too much time gathering requirements and too little time analyzing persona needs. Without planning, the time required to perform key tasks, such as resolving issues from beta testing, might be overlooked. The
steps to creating a basic, but useful, plan are simple:
Set project goals. Setting project goals before any technical work starts is a good way to ensure that everyone involved agrees on the success criteria. Set aside time to create project goals as the first step in your analytics plan.
Set a timeline. A timeline may seem obvious, but it’s important to plan for intermediate milestones in addition to the overall start and end dates.
It’s easy to forget that although you, as a member of the product team, might be fully aware of everything that’s taking place within your analytics project, others might not know about your progress. In the absence of information, you might find that inaccurate data is communicated to customers or other interested groups. You can prevent this by establishing a communication plan, both for internal personnel and for potential customers. Although the plans will be different for those inside your walls versus external parties, all communication plans should include:
> Regular updates on progress
> Revisions of dates for key milestones
> Upcoming events such as sneak peeks or training sessions
Once you’ve started your product development effort, particularly once you’ve started beta testing or rollout, it can be hard to identify when critical problems surface. That’s why setting metrics and tripwires is a good idea during the planning phase.
Metrics are used to measure product performance and adoption. An embedded product should have
a dashboard that enables ISVs and OEMs to monitor metrics and activity across all tenants using the
product, alerting administrators when performance goes awry. Consider tracking:
> Product uptime
> Responsiveness of charts and dashboards (i.e., load time)
> Data refresh performance and failures
> Number of reloads
> Total users
> Users by persona type
> Number of sessions
> Time spent using the analytics
> Churn (users who don’t return to the analytics application)
> Functionality used, e.g., number of reports published, alerts created, or content shared
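As an illustration, two of these metrics (total users and churn) can be derived from a simple log of session records. The session format and the 30-day churn window are assumptions made for this sketch, not prescriptions.

```javascript
// Compute total users and churn rate from hypothetical session records
// of the form { user, day }, where "day" is a day number.
function usageMetrics(sessions, { churnAfterDays = 30, today }) {
  const lastSeen = {};
  for (const { user, day } of sessions) {
    lastSeen[user] = Math.max(lastSeen[user] ?? 0, day);
  }
  const users = Object.keys(lastSeen);
  // A user has churned if they have not returned within the churn window.
  const churned = users.filter((u) => today - lastSeen[u] > churnAfterDays);
  return { totalUsers: users.length, churnRate: churned.length / users.length };
}

const sessions = [
  { user: "ana", day: 1 }, { user: "ana", day: 95 },
  { user: "bo", day: 2 },  { user: "cy", day: 90 },
];
const m = usageMetrics(sessions, { today: 100 });
// ana and cy were seen within the last 30 days; bo has churned.
```

In a real deployment these figures would come from the product's administrative dashboard, sliced per tenant and per persona.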
Tripwires alert you to critical situations before they cause business-disrupting problems. They are metric values that, if exceeded, trigger a response from the development team. For example, you might have a tripwire stating that if product uptime falls below 99.9%, the general rollout of the analytics product will cease until the issue is resolved. Each metric should have an established tripwire, and each tripwire should contain the following elements:
> A metric value that, if exceeded, triggers a response.
> A predetermined response such as “stop rollout” or “roll back to the previous version.”
> A responsible party who reviews the metric/tripwire and determines if action is required.
Although metrics and tripwires don’t ensure project success, they can greatly reduce the time, and the stress on the team, required to address problems.
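A tripwire check following the three elements above might be sketched like this. The metric names, thresholds, responses, and owners are all illustrative.

```javascript
// Each tripwire pairs a threshold test with a predetermined response
// and a responsible party, mirroring the three elements listed above.
const tripwires = [
  { metric: "uptimePct", breachedWhen: (v) => v < 99.9,
    response: "stop rollout", owner: "head of engineering" },
  { metric: "dashboardLoadSec", breachedWhen: (v) => v > 5,
    response: "roll back to the previous version", owner: "product owner" },
];

// Return the tripwires breached by the current metric readings.
function evaluateTripwires(metrics) {
  return tripwires
    .filter((t) => t.metric in metrics && t.breachedWhen(metrics[t.metric]))
    .map((t) => ({ metric: t.metric, response: t.response, owner: t.owner }));
}

const alerts = evaluateTripwires({ uptimePct: 99.8, dashboardLoadSec: 2.1 });
// Uptime has fallen below its 99.9% floor, so rollout should stop.
```

Wiring checks like these into the monitoring dashboard turns the tripwire from a planning artifact into an automated guardrail.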
It’s a common mistake to think either that you fully understand the users’ needs or that all users are the same. Many teams launch embedded analytics products without considering the detailed needs of target users or even the different user types they might encounter. Avoid this situation by creating detailed user personas and doing mission/workflow/gap analysis.
Here’s how it works:
Step 1: Choose personas. The best way to create an engaging data product is to deliver analytics that solve users’ problems. This is difficult to do for a generic “user,” but it can be accomplished for a specific user “persona” or user type. Start by picking two or three key user types (personas) for whom you will tailor the analytics. These might be strategic users looking for patterns and trends (like executives) or tactical users focused on executing work steps (like salespeople or order fulfillment teams). Although you may ultimately add functionality for many personas to your analytics, start with personas that can impact adoption—key decision makers—first. Get these user types engaged with your analytics application and they can help drive adoption among other users.
Step 2: Identify mission. For each chosen persona, the next task is to understand the user’s mission. What is the person trying to accomplish in their role? Are they trying to improve overall sales? Are they striving to increase revenue per customer? Understanding what the persona must accomplish will help you determine where analytics are needed and at what cadence.
Step 3: Map workflows and gaps. Now that you understand each persona’s mission, the third step is to outline the workflow they follow and any gaps that exist. These steps—and gaps—inform the project team where they can add analytics to assist the persona in accomplishing their mission. Keep it simple. If your persona is “head of sales” and the mission is “increase sales,” a simple workflow might be: (a) review sales for the month and identify underperforming segments, (b) identify actions taken within those segments, and (c) recommend more effective tactics to managers.
Within this workflow, you might find opportunities where analytics can improve the effectiveness of the head of sales. Perhaps reviewing sales performance or identifying underperforming segments requires running reports rather than simply reviewing a dashboard. Maybe seeing what actions have been taken requires investigating deep within the customer relationship management (CRM) system and correlating actions back to segments.
By finding information gaps within workflows and understanding personas’ missions, project teams can ensure that they put high-value analytics in front of users. It becomes less of a guessing game—replicating existing Microsoft Excel-based reports and hoping the new format attracts users—and more of a targeted exercise. Only analytics that truly add value for the persona are placed on the dashboard, in a thoughtful layout that aids in executing the mission. Engagement increases as target users solve problems using analytics.
Once you’ve defined user requirements, you need to turn them into specifications for selecting a product. The following evaluation criteria will help you create a short list of three or four vendors from the dozens in the market. The criteria will then guide your analysis of each finalist and shape your proof of concept.
We’ve talked with dozens of vendors, each with strengths and weaknesses. Analyst firms such as Gartner and Forrester conduct annual evaluations of Analytics and BI tools, some of which are publicly available on vendor websites. G2 provides crowdsourced research on BI tools, while the German research firm BARC publishes a hybrid report that combines analyst opinions and crowdsourced evaluations.
However, these reports generally don’t evaluate features germane to embedded analytics. That’s because the differentiators are subtle and often hard to evaluate, since doing so requires diving into the code.
For companies that want to tightly integrate analytics with a host application, there are three key things to look for:
> How quickly can you deploy a highly customized solution?
> How scalable is the solution?
> Does the vendor have a developer’s mindset?
Deployment speed. It’s easy to deploy an embedded solution that requires minimal customization. Simply replace the vendor logo with yours, change font styles and colors, copy the embed code into your webpage, and you’re done. But if you want a solution that has a truly custom look and feel (i.e., white labeling), with custom actions (e.g., WebHooks and updates), unique data sources and export formats,
and that works seamlessly in a multi-tenant environment, then you need an analytics tool designed from the ground up for embedding.
The best tools not only provide rich customization and extensibility, but they also do so with minimal coding. They’ve written their own application so every element can be custom-tailored using a point-and-click properties editor. They also provide templates, themes, and wizards to simplify development and customization. And when customers want to go beyond what can be configured out of the box, the
tools can be easily extended via easy-to-use programming frameworks that leverage rich sets of product APIs that activate every function available in the analytics tool.
Moreover, the best embeddable products give host applications unlimited ability to tailor analytic functionality to individual customers. This requires analytics tools to use a multi-tenant approach that creates an isolated and unique analytics instance for each customer. This enables a host company to offer tiered versions of analytic functionality to customers, and even allow customers to customize their analytics instance. This mass customization should work whether the host application uses multi-tenancy and/or containerization or creates separate server or database environments for each customer.
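One way to picture this mass customization is a per-tenant feature map: each tenant resolves to its tier's feature set plus any custom add-ons. The tier names and feature lists below are illustrative assumptions, not a vendor's actual catalog.

```javascript
// Hypothetical per-tier feature sets for a multi-tenant analytics product.
const tierFeatures = {
  basic:   ["dashboards"],
  plus:    ["dashboards", "drilldown", "benchmarks:internal"],
  premium: ["dashboards", "drilldown", "benchmarks:internal",
            "benchmarks:external", "authoring"],
};

// Resolve the features a given tenant is entitled to: the tier's
// baseline plus any tenant-specific add-ons, so one multi-tenant
// deployment can serve many distinct configurations.
function featuresForTenant(tenant) {
  return [...tierFeatures[tenant.tier], ...(tenant.addOns ?? [])];
}

const acme = { id: "t-001", tier: "plus", addOns: ["export:pdf"] };
const acmeFeatures = featuresForTenant(acme);
```

The same lookup can drive both UI visibility (which buttons a tenant's users see) and server-side authorization (which API calls are honored).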
Scalability. It’s important to understand the scalability of an analytics solution, especially in a commercial software setting where usage could skyrocket. The tool needs strong systems administration capabilities, such as the ability to run on clusters and support load balancing. It also needs a scalable database—whether its own or a third party’s—that delivers consistent query performance as the number of concurrent users climbs and data volumes grow. Many vendors now offer in-memory databases or caches to keep frequently queried data in memory to accelerate performance. The software also must be designed efficiently with a modern architecture that supports microservices and a granular API. Ideally, it works in a cloud environment where processing can scale seamlessly on demand.
Developer mindset. When developers need to get involved, it’s imperative that an analytics tool is geared to their needs. How well is the API documented? Can developers use their own integrated development environment, or must they learn a new development tool? Can the tool run on the host application server or does it require a proprietary application server and database? How modern is the tool’s software architecture? Does it offer JavaScript frameworks, which help simplify potentially complex or repetitive tasks by abstracting API calls and removing the need for deep knowledge of the analytics tool’s APIs by your developers?
Increasingly, companies are adopting modern software architectures and don’t want to pollute them with monolithic, proprietary software from third parties. The embedded analytics solutions of the future will insert seamlessly into host code running on host application and Web servers, not proprietary servers and software.
It’s important to know what questions to ask vendors to identify their key differentiators and weak spots. Below is a list of 12 criteria to evaluate when selecting an embedded analytics product. (See the appendix for a more detailed description of each item.)
These criteria apply to both ISVs and enterprises, although some are more pertinent to one or the other. For instance, customization, extensibility, multi-tenancy, and pricing/packaging are very important for ISVs; less so for enterprises.
With a plan and tool selected, the next step is to begin development. But perhaps not the development that might initially come to mind. We recommend that, alongside the technical implementation of your analytics, you develop the business aspects of your project. This phase requires you to fully consider how the analytics will be introduced to the market—how they will be packaged, priced, and supported post-launch.
Start by designing the packaging for your analytics. Packaging is particularly important for software vendors who sell a commercial product. But enterprises that embed analytics into internal applications can also benefit from understanding these key principles.
Teams often consider analytics to be an “all-or-nothing” undertaking. You either have a complete set of analytics with a large set of features, or you don’t have any analytics at all.
But this approach fails to consider different user needs. Expert users may desire more sophisticated analytics functionality, while novice users may need less. The “all or nothing” approach also leaves you with little opportunity to create an upsell path as you add new features. It’s better to segment your analytics, giving more powerful analytics to expert users while allowing other users to purchase additional capabilities as they need them.
You never want to give users every conceivable analytical capability from the outset. Instead, use a tiered model. If you’ve ever signed up for an online service and been asked to choose from the “basic,” “plus,” or “premium” version of the product, you’ve seen a tiered model. Keep it simple; don’t add too many levels from which buyers are expected to choose. For example, you might use the following tiers:
> Basic. This is the “entry level” tier and should be the default for all users. You put these introductory, but still useful, analytics in the hands of every single user so that they understand the value of analytical insights. Most organizations bundle these analytics into the core application without charging extra, but they usually raise the price of the core application by a nominal amount to cover costs.
> Plus. These are more advanced analytics, such as benchmarking against internal teams (e.g. the western region vs. the eastern region), additional layers of data (e.g. weather data, economic indicators, or financial data), or the ability to drill deeper into charts. This tier should be priced separately, as an additional fee on top of the core application.
> Premium. The top tier will be purchased for use by power users, analysts, or other more advanced users. Here, you might add in features such as external benchmarking (e.g. compare performance to the industry as a whole), and the ability for users to create entirely new metrics, charts, and dashboards. This will be the most expensive tier.
Architecting your offering in this format has several key benefits for data product owners:
> It doesn’t overwhelm the novice user. Although offering too little functionality isn’t ideal, offering too much can be worse. Upon seeing a myriad of complex and perhaps overwhelming features, users may decide the application is too complicated to use. These users leave and rarely return.
> It provides an upgrade path. Over time, you can expect customers to become more sophisticated in their analysis needs. Bar charts that satisfied users on launch day might not be sufficient a year down the road. The tiered model allows customers to purchase additional capabilities as their needs expand—you have a natural path for user growth.
> It makes it easier to engage users. How can you entice customers to buy and use your data product unless they can see the value that it delivers? Including a “basic” analytics tier with minimal, but still valuable, functionality is the answer. The basic tier can be offered free to all customers as a taste of what they can experience should they upgrade to your advanced analytics tiers.
Unfortunately, not all customers will be satisfied by your analytics product, even if it’s structured into a tiered model. Some will require custom metrics, dashboard structures, and more data. Here are some “add-on” offerings that you can charge extra for:
> Additional data feeds. Although your core analytics product might include common data sources, some customers will require different or more data feeds for their use cases. These might include alternative CRM systems, alternative financial systems, weather, economic data, or even connections to proprietary data sources.
> Customized models. A “custom data model” allows buyers to alter the data model on which the product is based. If a buyer calculates “customer churn” using a formula that is unique to them, support this model change as an add-on.
> Visualizations. Customers often request novel ways of presenting information, such as new charts, unique mobile displays, or custom integrations.
> More data. The product team can augment an analytics application by providing more data: Seven years instead of five, detailed transactional records instead of aggregated data.
Data applications can be complex, and they are often deeply integrated into many data sources. For this reason, you might consider offering services to augment your core analytics product:
> Installation/setup. Offer assistance to set up the analytics product, including mapping and connecting to the customer’s data sources, training in-house support personnel, and assisting with loading users.
> Customization. Offer to create custom charts, metrics, or data models.
> Managed analytics. Occasionally, a data product customer requests assistance in interpreting the analytics results. This can take the form of periodic reviews of the analytics (e.g., quarterly check-ups to assess performance) or an “expert-on-demand” service where the customer calls when they have analysis questions.
The situations above are very different from normal product technical support. Managed analysis services can be a highly lucrative revenue source, but they can also consume more resources than anticipated and skew your business model from a product orientation to a services model.
Pricing your analytics sounds like a simple proposition—far less complex than the technical product implementation—but that’s rarely the case. In fact, we’ve seen more than a few instances where the pricing portion of the project takes longer than the actual analytics implementation. But determining the fees to charge for your data product doesn’t have to be daunting. Here are our guidelines to help you
avoid pricing pitfalls.
Charge for your analytics. Analytics can help users make decisions faster and more accurately, and can improve overall process cycle times. Such business improvements have value, and you should charge for providing it. Some product teams decide to offer analytics free of charge because they doubt the analytics add enough value to justify a fee. But when there is an apparent mismatch between the value and the price of an analytics product, the answer is to revisit the persona/mission/workflow/gap step of the development process, not to give the analytics away.
Start early. Setting pricing late in the process is a mistake because once the product is fully defined and the costs are set, the flexibility you have for creating pricing options is severely limited. Start early and craft price points appropriate for each product tier (basic, plus, premium) rather than trying to rework key project elements just before launch day.
Keep it simple. Complicated pricing models turn off buyers. They cause confusion and slow the buying cycle. Limit the add-ons available and include key functions in the price of the core application.
Make the basic tier inexpensive. Keep the basic tier as inexpensive as possible. You want people to try analytics and get hooked so they purchase a higher tier. Roll the extra price into the cost of the core application and ensure that every user has access to the basic tier.
Match your business model. If your core application is priced based on a fee per user per month, add a small percentage to that fee to account for the additional value of analytics. Do not add a new line item called “Analytics Functionality” that jumps out at the buyer. Make analytics a part of what your product does.
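To make the "small percentage" approach concrete, here is a minimal sketch of tiered per-user pricing with the analytics value folded into the core fee rather than broken out as a separate line item. All tier names, fees, and uplift percentages are invented for illustration:

```python
# Hypothetical tier table: base per-user monthly fee plus a small
# analytics uplift, rolled into one price (no separate line item).
TIERS = {
    "basic":   {"base_fee": 20.0, "analytics_uplift": 0.05},
    "plus":    {"base_fee": 35.0, "analytics_uplift": 0.08},
    "premium": {"base_fee": 60.0, "analytics_uplift": 0.10},
}

def monthly_price(tier: str, users: int) -> float:
    """Monthly price for a tenant: base fee times (1 + uplift), per user."""
    t = TIERS[tier]
    per_user = t["base_fee"] * (1 + t["analytics_uplift"])
    return round(per_user * users, 2)

print(monthly_price("basic", 50))  # 1050.0 for 50 basic-tier users
```

The buyer sees a single per-user price per tier; the analytics uplift exists only in your internal model.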
Many teams spend significant energy designing analytics, creating dashboards, and thinking through pricing, but most forget to consider support processes. Embedded analytics delivered as part of a customer-facing data product is inherently different from enterprise or “inside your walls” analytics. Data products require marketing plans, sales training, and other processes that will allow a properly designed analytics application to flourish post-launch. Here’s how to get started planning your application’s supporting processes.
List the Impacted Processes
The first step to getting your supporting processes ready is to enumerate exactly what will be impacted. There are two ways to go about this step:
1. Brainstorm the processes. The product team spends about 45 minutes brainstorming a list of potentially impacted processes. This is no different from any other brainstorming session—just be sure that you are listing processes (e.g., “process to intake/resolve/close support tickets”) and not organizational names that are less actionable (e.g., “the operations group”).
2. Work from other product lists. If you are part of an organization that has fielded products in the past, you might already have checklists for “organizational readiness” lying around. If so, the list of support processes developed by another product team might be a great place to start. You’ll find that you need to customize the list a bit, but the overlap should be significant, saving you time.
Here is a list of processes commonly impacted by a new data product:
> Provisioning or installation process
> New user onboarding process
> Trouble ticket or issue management
> User experience processes
> Product road mapping process, including request intake
> Utilization tracking or monitoring
> Sales training process
> Billing process
> Decommissioning or deactivation process
Define Changes to Processes
The next step is to determine the degree to which each of the listed processes might need to change to
support your new data product.
> Create a simple flow chart for each potentially impacted process.
> Apply scenarios. Pretend an issue occurs with your analytics. Can your existing process address it? Add or modify process steps to accommodate the analytics product.
> Publish and train. Present the new processes to any teams that might be impacted and store the process documentation wherever other process documents are kept.
Create Metrics
With new processes in place, you need to monitor process performance to ensure that everything is working as planned. For each new or modified process, create metrics to track cycle times, throughput, failure rate, cost, and customer satisfaction. Benchmark these against comparable existing processes to ensure they are performing at parity.
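As a starting point, the first three of those metrics can be computed from nothing more than a log of process records with open/close timestamps and an outcome flag. A minimal sketch, assuming a hypothetical ticket schema (field names and data are invented):

```python
# Process-performance metrics from a simple ticket log.
from datetime import datetime

tickets = [
    {"opened": datetime(2024, 1, 1), "closed": datetime(2024, 1, 3), "failed": False},
    {"opened": datetime(2024, 1, 2), "closed": datetime(2024, 1, 4), "failed": True},
    {"opened": datetime(2024, 1, 5), "closed": datetime(2024, 1, 6), "failed": False},
]

def cycle_time_days(ts):
    """Average days from open to close."""
    return sum((t["closed"] - t["opened"]).days for t in ts) / len(ts)

def failure_rate(ts):
    """Share of tickets that ended in failure."""
    return sum(t["failed"] for t in ts) / len(ts)

print(cycle_time_days(tickets))  # average cycle time in days
print(failure_rate(tickets))     # fraction of failed tickets
```

Run the same calculations over the pre-launch process log to get your parity benchmark.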
The process isn’t over when you’ve deployed your analytics and brought users on board. In fact, the most successful analytics teams view embedded analytics as a cycle, not a sprint to a finish line. Post-launch, you need to consider what is working and what isn’t; and you need to add or fine-tune functionality to better meet persona needs. You need to continuously improve to ensure that analytics adoption and usage doesn’t decline over time.
People always find unique ways of using analytics functionality. Learn what your users are doing; their experiments can guide your project development. Here are three ways to gather feedback on your analytical application:
> Use analytics on analytics. Some platforms allow you to monitor which dashboards, charts, and features are used most frequently. Track usage at the individual and aggregate level. Which functionality is being used? How often is it reloaded? How many sessions? How many new, recurring, and one-time users? What is the three-month growth by user, tenant, user type, and so on?
> Monitor edge cases. The users who complain the most, submit requests, and call your help desk are a gold mine of information. They will often talk—at length—about what functionality can be implemented to make the analytics better for everyone. Don’t ignore them.
> Shoulder surfing. Shoulder surfing is an easy way to gather insights. Get permission from users to observe them interacting with the analytics product on their own tasks at their own computers in their own environments. Shoulder surfing can uncover incredible insights that users might fail to mention in a formal focus group.
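The "analytics on analytics" questions above reduce to simple aggregations over a usage log. Here is a sketch assuming a flat list of usage events with user, dashboard, and month fields; the schema and data are hypothetical, though most embedded-analytics platforms expose a similar event log:

```python
# Aggregating a usage-event log: top dashboards, recurring users,
# and monthly active users (a proxy for growth over time).
from collections import Counter, defaultdict

events = [
    {"user": "ana", "dashboard": "cash_flow", "month": "2024-01"},
    {"user": "ana", "dashboard": "cash_flow", "month": "2024-02"},
    {"user": "ana", "dashboard": "cash_flow", "month": "2024-03"},
    {"user": "ben", "dashboard": "budget",    "month": "2024-02"},
    {"user": "ben", "dashboard": "budget",    "month": "2024-03"},
    {"user": "cho", "dashboard": "cash_flow", "month": "2024-03"},
]

# Which dashboards are used most?
dashboard_use = Counter(e["dashboard"] for e in events)

# Sessions per user, split into one-time vs recurring users.
sessions_per_user = Counter(e["user"] for e in events)
recurring = [u for u, n in sessions_per_user.items() if n > 1]

# Distinct active users per month, to track growth.
monthly_users = defaultdict(set)
for e in events:
    monthly_users[e["month"]].add(e["user"])

print(dashboard_use.most_common(1))  # most-used dashboard
print(sorted(recurring))             # users with more than one session
print({m: len(u) for m, u in sorted(monthly_users.items())})
```

Reviewing these aggregates monthly, alongside edge-case feedback and shoulder surfing, gives you a rounded picture of real usage.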
Although you started with a limited number of personas and workflows during the initial implementation, the key to sustaining your analytics is to expand both the personas served and the use cases addressed. If you started with an executive persona, consider adding tactical personas that need information about specific tasks or projects. Also, add workflows for existing personas. For example, add a budgeting dashboard for the CFO to complement the cash flow analytics previously deployed.
Unfortunately, in the absence of new information, users will assume that no progress is being made. Even if you can’t add all the personas, workflows, and functionality required immediately, make sure to create a communication plan so users understand what’s coming next and for whom.
Embedding another product in your application is not easy. There’s a lot that can go wrong, and the technology is the easy part. The hard part is corralling the people and establishing the processes required to deliver value to customers.
Here are key success factors to keep at the forefront of your mind during an embedded analytics project:
1. Know your strategy. If you don’t know where you’re going, you’ll end up somewhere you don’t want to be. Define the goal for the project and keep that front and center during design, product selection, and implementation.
2. Know your users. Pushing dashboards to customers for the sake of delivering information will not add value. Dashboards, whether embedded or not, need to serve an immediate need of a specific target group. Identify and address information pain points and you’ll succeed.
3. Identify product requirements. It’s hard to tell the difference between embedded analytics tools. Use the criteria defined in this report to find the best product for your needs. It may not be one you already know!
4. Define a go-to-market strategy. Here’s where the wheels can fall off the bus. Before you get too far, assemble a team that will define and execute a go-to-market strategy. Especially if you are an ISV, get your sales, marketing, pricing, support, training, and legal teams together at the outset. Keep them informed every step of the way. Make sure they provide input on your plan.
Following these key principles will help ensure the success of your embedded analytics project.
For more information or enquiries about Qlik products and services, feel free to contact us below.
Connect with SIFT Analytics
As organisations strive to meet the demands of the digital era, SIFT remains steadfast in its commitment to delivering transformative solutions. To explore digital transformation possibilities or learn more about SIFT’s pioneering work, contact the team for a complimentary consultation. Visit the website at www.sift-ag.com for additional information.
About SIFT Analytics
Get a glimpse into the future of business with SIFT Analytics, where smarter data analytics driven by smarter software solutions is key. With our end-to-end solution framework backed by active intelligence, we strive towards providing clear, immediate and actionable insights for your organisation.
Headquartered in Singapore since 1999, with over 500 corporate clients in the region, SIFT Analytics is your trusted partner in delivering reliable enterprise solutions, paired with best-of-breed technology throughout your business analytics journey. Together with our experienced teams, we will journey with you to integrate and govern your data, to predict future outcomes and optimise decisions, and to achieve the next generation of efficiency and innovation.
The Analytics Times
The Analytics Times is your source for the latest trends, insights, and breaking news in the world of data analytics. Stay informed with in-depth analysis, expert opinions, and the most up-to-date information shaping the future of analytics.
Published by SIFT Analytics
SIFT Marketing Team
marketing@sift-ag.com
+65 6295 0112
SIFT Analytics Group
Explore our latest insights