Blog
Vivek Rajagopal
Aug 17, 2022
Fascinatingly, business intelligence (BI) implementations are only half as likely to succeed as missions to the moon! Although 41 of the 111 missions to the moon have been unsuccessful in the last six decades, studies estimate the failure rates for BI implementations to be up to 80%. However, despite the "astronomical" failure rates, a look at the giant market size and growth rate of the BI market—estimated to be upwards of $24 billion and growing at close to 9% CAGR—indicates that the business community badly wants the BI system to work.
Key Criteria
That leads us to a basic question on what constitutes a success or a failure in a BI system. There are two fundamental indicators of a successful BI implementation. First, is BI institutionalized in the organization? This signifies that using BI for decision making is not a choice but is part of the process. Second, is there an objective mechanism defined to identify use cases that need to be taken up by BI? This implies that the BI roadmap is strategic and not opportunistic or unstructured.
1. Institutionalization
For any process to be institutionalized, it needs to be recognized by the organization as useful and usable, or it must be mandated by law. Although BI definitely isn’t mandated by law, it has to be usable, and it must be perceived as useful. The usefulness of the BI system largely depends on the impact it is expected to create and the credibility of the information it presents. The usability of the BI system is broadly defined by the ease of learning, the ease of use and the availability of the system. Although fancy dashboards carrying every type of chart known to mankind look good on brochures, the practicality of the solution is often a critical factor in determining its usability.
2. Use-Case Identification
The key factors that affect the use-case identification process are the availability of data around the use case, the quality of the data, the value the use case is expected to add and the technical complexity of developing the use case. A high-level process map of the organization along with the data and analytics coverage around each process can present a clear picture of the overall penetration of analytics. High-impact processes with sufficient data coverage but low analytics coverage can be a starting point to identify good use cases. This process can be repeated for subprocesses within the high-level process.
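The shortlisting logic described above can be sketched as a simple filter-and-rank step. The process names, scores and thresholds below are hypothetical, purely to illustrate the idea:

```python
# Illustrative use-case shortlist: keep processes with high data coverage but
# low analytics coverage, then rank them by business impact.

processes = [  # hypothetical process map with coverage scores
    {"name": "order-to-cash",  "impact": 9, "data_coverage": 0.8, "analytics_coverage": 0.2},
    {"name": "procure-to-pay", "impact": 7, "data_coverage": 0.3, "analytics_coverage": 0.1},
    {"name": "hire-to-retire", "impact": 5, "data_coverage": 0.9, "analytics_coverage": 0.7},
]

def shortlist(processes, min_data=0.6, max_analytics=0.4):
    """Return high-impact processes that have data but little analytics."""
    candidates = [
        p for p in processes
        if p["data_coverage"] >= min_data and p["analytics_coverage"] <= max_analytics
    ]
    return sorted(candidates, key=lambda p: p["impact"], reverse=True)
```

The same filter can then be re-run at the subprocess level once the high-level candidates are chosen.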
You Need Champions, Too
The frameworks around institutionalization and use-case identification are critical to the success of a BI implementation, but so are the people chosen to create and manage them. Champions need to be identified to own each of the processes outlined: an impact tracker who maintains a log of all the use cases and their impact, a person in charge of training and adoption, and a champion for the user experience the BI system provides. Similarly, owners from the BI team should be tagged to the high-level processes and their subprocesses.
And Don't Forget Your Stakeholders
All the factors we’ve specified so far are internal to the BI team, but the success of a BI implementation will involve the commitment of several stakeholders external to the team as well. The functional teams play a key role in creating the process maps, helping quantify the impact, championing adoption and providing feedback to the BI team. The IT team plays an important role in providing data access and ensuring that the infrastructure around data movement is adequate.
The BI team needs to ensure that communication channels with all the external stakeholders are strong, active and structured. Most importantly, the management team needs to understand the importance of BI and its limitations and be determined and patient enough to see the BI implementation through.
There will be many nuanced calls and compromises that your organization will face in its BI journey. But it's a journey well worth it. In 2016, IBM analysts estimated that bad data costs the U.S. economy alone $3.1 trillion per year! Although one can always argue about the rigor of such estimates, there are several other studies that have reinforced the value a successfully implemented BI system offers. However, if the BI implementation just isn't working out, there are always simpler things the organization can try. How about a mission to the moon?
An impact assessment is crucial for any analytics project or solution, often as important as the project or solution itself. Essentially, an impact analysis does exactly what it says: It allows analytics teams to understand what impact the project or solution actually had compared to its desired result.
The impact assessment should happen at two phases in the lifecycle of an analytics project. First, the potential impact needs to be calculated when the project is conceptualized. Then, once the solution has been deployed, actual impact needs to be continuously calculated.
In order to understand how this is done, let's look at two broad categories of analytics solutions.
1. Problem-Solving: Problem-solving analytics is built around a clear objective, such as improving revenue or throughput. Problem-solving analytics solutions need to have a clear impact associated with them.
2. Exploratory: Exploratory analytics is more of an investment aimed at uncovering insights in poorly analyzed areas or "dark data." Exploratory analytics is broader but expected to lead to problem-solving analytics use cases down the line.
Measuring the impact, however, is often not straightforward. In this article, I will examine seven potential avenues for approaching impact assessment.
1. Savings In Man Hours
This is the most basic impact that an analytics project can deliver. Recognizing this benefit often helps organizations move into a shared services model, centralizing and automating more and more analyses.
2. Impact On The Frequency Of Decisions
Certain reports are often presented for decision making every month or every quarter. Automating the insights from the report to a daily or weekly report allows more frequent interventions. A course correction that happens every month could, therefore, happen every week, i.e., the inefficiency build-up over 30 days can now be restricted to only seven days. The difference between these inefficiency build-ups can be calculated as an impact of the analytics project.
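The arithmetic behind this comparison can be sketched as follows, under the simplifying assumption that inefficiency accumulates linearly at a constant daily cost between course corrections (all figures are illustrative):

```python
# Compare inefficiency buildup under monthly vs. weekly course corrections,
# assuming a linear drift: day 1 costs 1x, day 2 costs 2x, ..., day n costs n*x.

def cycle_inefficiency(daily_drift_cost: float, cycle_days: int) -> float:
    """Inefficiency accumulated over one review cycle (triangular sum)."""
    return daily_drift_cost * cycle_days * (cycle_days + 1) / 2

def saving_per_month(daily_drift_cost: float, old_cycle: int = 30, new_cycle: int = 7) -> float:
    """Reduction in buildup when reviews move from old_cycle to new_cycle days."""
    cycles_per_month = old_cycle / new_cycle  # ~4.3 weekly resets per month
    return (cycle_inefficiency(daily_drift_cost, old_cycle)
            - cycles_per_month * cycle_inefficiency(daily_drift_cost, new_cycle))
```

With a drift cost of $1,000 per day of delay, for example, moving from monthly to weekly reviews saves roughly $345,000 of accumulated inefficiency per month under this toy model.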
3. Impact Of Earlier Interventions
In certain cases, it may not be possible or practical to increase the frequency of analysis or of decision making. However, it may still be possible to compress the time to insights. If an analytics project can compress the time to arrive at insights compared with arriving at them manually, then the analytics team can evaluate the time to intervention. For example, if a report takes three days to prepare manually and is analyzed by management one day later, the impact analysis should show that automation saves three to four days of inefficiency.
4. Impact Of Informed Decisions
There are often factors or perspectives that an analytics solution can surface by analyzing data at scale, analyzing a wider spectrum or employing a technique that is not easy to replicate manually. The potential impact of such solutions, however, is difficult to calculate. People would still have made those decisions before the analytics solution existed, filling in any gaps with their experience or knowledge of the area.

To calculate the impact of an informed decision, it is important to define the objective of the analytics project at a very granular level. Consider a project aimed at revenue improvement. An organization's revenue is often too broad an objective to be attributable to a single analytics project, as changes in revenue are driven by several internal and external factors. To successfully analyze the impact of the solution, understand how the project attempts to enhance revenue. Is it intended to improve conversions of a particular cohort of customers from a particular channel? If so, the objective, and the target metric, needs to be exactly that.
5. Impact On Data Quality
Analytics solutions, especially in their first deployment, tend to bring data-quality issues to the fore. Teams may have been resolving such issues manually or making decisions based on incorrect data. An analytics solution can help the team track the extent of data-quality gaps or non-compliance, and it can strengthen processes around data capture. To measure the impact on data quality, we can take two approaches: checking how higher-quality data would have changed decisions of the past, and checking how poorer-quality data would change decisions of the present.
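The first approach, back-testing past decisions against corrected data, can be sketched as below. The restocking rule and the numbers are invented for illustration:

```python
# Re-run a past threshold decision on corrected data and measure how often
# the decision would have flipped, as a proxy for the cost of bad data quality.

def restock_decision(stock_level: float, threshold: float = 100.0) -> bool:
    """Hypothetical rule: reorder when recorded stock falls below the threshold."""
    return stock_level < threshold

def flipped_decision_rate(raw, corrected, threshold: float = 100.0) -> float:
    """Share of past decisions that change once data errors are fixed."""
    flips = sum(
        restock_decision(r, threshold) != restock_decision(c, threshold)
        for r, c in zip(raw, corrected)
    )
    return flips / len(raw)

raw_levels       = [95, 180, 60, 130, 105]  # as originally recorded
corrected_levels = [105, 180, 60, 90, 105]  # after data-quality fixes
```

Here two of the five past decisions flip once the data is corrected. The second approach works the same way in reverse: degrade current data and see how fragile today's decisions are.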
6. Impact On Direct KPIs
This is probably the core measurement that needs to be done. If the objective or the direct KPIs of an analytics project have been clearly defined, it is easy to go back to the function that is responsible for driving the KPIs and ask them for a target or a goal they want to get to through the analytics solution. When the solution is straightforward and the objective is clear, the impact on the direct KPI is the most important factor for an organization.
7. Impact On Indirect KPIs
While the objective of an analytics project may be clearly defined, the KPI or metric it improves may not appear important or relevant enough to excite the management team. It is important to connect the metric to a clearly quantified financial impact, which may require extrapolations or simulations. For example, reducing customer wait times in a process increases throughput, which frees up bandwidth to handle more crucial demand and creates capacity for additional revenue. It may also improve customer satisfaction, which can increase revenue by a certain factor in the medium to long term.
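The wait-time example above can be turned into a back-of-the-envelope extrapolation. All inputs below are assumptions to be agreed with the functional team, not measured figures:

```python
# Extrapolate an indirect financial KPI (extra revenue) from a direct one
# (reduced wait time), assuming each transaction occupies wait + service time.

def extra_monthly_revenue(baseline_wait_min: float,
                          improved_wait_min: float,
                          service_time_min: float,
                          hours_per_day: float,
                          days_per_month: int,
                          revenue_per_transaction: float) -> float:
    available = hours_per_day * 60 * days_per_month  # capacity in minutes
    baseline_throughput = available / (baseline_wait_min + service_time_min)
    improved_throughput = available / (improved_wait_min + service_time_min)
    return (improved_throughput - baseline_throughput) * revenue_per_transaction

# e.g. cutting wait from 10 to 5 minutes on a 20-minute service,
# 8 hours a day over 25 days, at $40 per transaction:
# extra_monthly_revenue(10, 5, 20, 8, 25, 40) -> 3200.0
```

The simulation deliberately stops at capacity; whether the freed-up capacity actually converts to revenue is a separate assumption the functional team should validate.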
Conclusion
Impact assessment requires logical thinking and a good dose of creativity. It also requires common sense and due diligence. A colleague from the industry once mentioned that their analytics team had gotten so creative in calculating the impact on revenue that they ended up with an impact higher than the actual revenue itself. That said, when the potential impact of a solution is clearly quantified, it becomes a powerful sales pitch and a critical driver of adoption. Nothing is more persuasive than concrete visibility of impact, and a clear and concise impact assessment can provide exactly that.
In today’s fast-paced environment, healthcare organizations need to be agile and adapt quickly to dynamic market needs. Analytics has become a crucial tool for healthcare organizations, enabling them to gain insights into patient experience, clinical outcomes, and financial performance. However, when it comes to implementing analytics solutions, many healthcare organizations face a daunting challenge.
The traditional approach is to build large, monolithic systems that require extensive resources and long implementation cycles. This means it could take months or even years before the organization sees any tangible results.

Imagine your analytics solution as a toolbox. Traditionally, businesses have built a large, all-in-one toolbox that contains every tool they could possibly need. This toolbox is expensive, heavy and difficult to move around. It takes a long time to find the right tool for the job, and the more tools that are added, the more cluttered and disorganized the toolbox becomes. There is also a need for a middleman who understands which tool should be handed to which workman. But there is a better way to organize and be more efficient when it comes to analytics implementation.
The Modular SaaS Analytics Approach
Imagine instead that you have a modular toolbox. This toolbox contains smaller, specialized toolsets tailored to specific healthcare areas or users. Each toolset can be easily added or removed, and because they are designed for specific tasks, they are more efficient and effective than the tools in a traditional toolbox. This makes it easier for workers to find the right tool for the job themselves and reduces clutter and confusion. This is the kind of impact that a modular SaaS analytics approach can offer. It comes with several benefits:
• Target High-Impact Process Areas: With a modular approach, healthcare organizations can target specific high-impact process areas, such as patient turnaround times, and show immediate success on patient experience. This can help build momentum and confidence in the analytics program.
• Flexibility and Scalability: Modular analytics solutions are more flexible and scalable than monolithic systems. Imagine that as soon as you digitize your lab and radiology systems, you start getting analytics around those areas.
• Adaptability and Learning: The modular approach encourages a learning culture. We can track how different users adopt each module and turn off analytics that a certain user role does not use. This makes it easy to identify and retain the analytics solutions that are actually useful and having an impact, instead of carrying a large set that is never used, and it declutters what each user sees.
• Accelerate Time to Value: One of the biggest advantages of modular analytics is that it can accelerate time to value. Tomorrow, if a healthcare organization decides to focus on population health management, they can quickly activate the corresponding module of the analytics solution. With this, they can quickly identify high-risk patient populations and proactively manage their care to prevent disease progression and avoid costly hospitalizations.
Best Bang for Your Buck
While each healthcare organization may have unique processes and operations, many of the challenges it faces are similar to those faced by other healthcare businesses. By adopting readily available solutions to these challenges, healthcare organizations can benefit from best-in-class solutions that have already been tested and proven in the market. Instead of investing time and resources in designing customized analytics solutions, which could take months or even years to develop and implement, healthcare organizations can focus on their core work: financial and clinical outcomes.
Modular analytics solutions also deliver a better return on investment than monolithic systems. By using pre-built components, businesses can significantly reduce the cost of developing analytics solutions while attaining significant business impact with greater agility. That agility helps the organization demonstrate the value of analytics to stakeholders and drive further strategic investment in analytics capabilities. By embracing this approach, healthcare organizations can achieve their goals faster and more efficiently and stay competitive in the healthcare market.
Most organizations are probably familiar with the concept of analytics maturity. They know that a high level of analytics maturity helps drive better business outcomes. Different frameworks, such as the one developed by Gartner, help organizations self-assess their current analytics maturity. But the key challenge lies in improving and maintaining high levels of maturity. It's like trying to hit a moving target: just when you think you've got it figured out, the goalposts shift. This is also one of the primary reasons why many analytics implementations don't succeed, as Vivek Rajagopal writes here. The journey towards analytics maturity can be challenging, and it requires a strategic and systematic approach. In this article, we will explore the different levers that organizations can use to gain analytics maturity.
1. Process Maturity
Process maturity refers to the degree to which an organization has documented and standardized its decision-making processes. When it comes to analytics, process maturity is critical because it enables the organization to establish a clear line of sight between the data and the decision-making process. The absence of documented processes and decision flows can lead to fragmented decision-making, which can hinder the effectiveness of analytics.
Organizations can improve their process maturity by documenting their decision-making processes and ensuring that they are standardized across the organization. This involves mapping out the various decision points in the organization and creating a clear flow of information between them. Once the processes are documented, the organization can use them to identify areas for improvement and optimize their decision-making.
2. Data Integration Maturity
Data integration maturity refers to the degree to which an organization has integrated its data sources and made them available for analytics. Data stored in silos can make it difficult to access and use for analytics. Additionally, a lot of peripheral data may be manually maintained.
To improve their data integration maturity, organizations need to digitize and standardize their data and integrate it into a single source of truth. This involves identifying the various data sources in the organization and mapping out how they can be integrated into a single database. By doing so, organizations can ensure that their data is accurate and complete, making it easier to extract insights from it.
3. Solution Maturity
Solution maturity refers to the degree to which an organization is using analytics to drive insights and automate decision-making. At the lowest level of maturity, users are creating their own analysis with data, while at the highest level, insights are driving automated actions.
To improve their solution maturity, organizations need to move beyond basic reporting and visualization tools and towards more advanced analytics solutions. These solutions should be designed to provide actionable insights to users at the right time, enabling them to make informed decisions quickly and efficiently. Additionally, organizations should be exploring the use of machine learning and artificial intelligence to automate decision-making and improve efficiency.
4. Adoption Maturity
Adoption maturity refers to the degree to which analytics has been institutionalized within the organization. At the lowest level of maturity, analytics is used on an ad hoc basis by individual teams, while at the highest level, analytics is trusted and used to make decisions every day.
To improve their adoption maturity, organizations need to invest in training and education programs to help users understand the value of analytics and how to use it effectively. Additionally, organizations should be creating a culture of data-driven decision-making, where analytics is seen as a critical component of the decision-making process.
5. Impact Monitoring Maturity
Impact monitoring maturity refers to the degree to which an organization is monitoring the impact of its analytics initiatives. At the lowest level of maturity, impact monitoring is unavailable, while at the highest level, impact monitoring is automated.
To improve their impact monitoring maturity, organizations need to establish clear metrics for measuring the impact of their analytics initiatives. These metrics should be tied to specific business outcomes and should be regularly monitored to ensure that the organization is on track to achieve its goals. Additionally, organizations should be exploring the use of automated monitoring tools to streamline this process.
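A minimal sketch of what automated impact monitoring could look like, with hypothetical metric names and targets:

```python
# Track each initiative's metric against its agreed target and flag drift,
# so off-track initiatives surface automatically rather than in ad hoc reviews.

from dataclasses import dataclass

@dataclass
class ImpactMetric:
    name: str
    target: float
    higher_is_better: bool = True

    def on_track(self, observed: float) -> bool:
        """True when the observed value meets or beats the target."""
        if self.higher_is_better:
            return observed >= self.target
        return observed <= self.target

metrics = [
    ImpactMetric("channel conversion rate", target=0.12),
    ImpactMetric("avg report turnaround (days)", target=2.0, higher_is_better=False),
]

def off_track(metrics, observations):
    """Names of metrics whose latest observation misses the target."""
    return [m.name for m, obs in zip(metrics, observations) if not m.on_track(obs)]
```

In practice, the observations would be pulled from the warehouse on a schedule, and the off-track list would feed an alert or a review dashboard.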
In conclusion, gaining analytics maturity is a journey that requires a strategic approach: start small, but start where it's impactful. By focusing on specific high-impact process areas, organizations can identify the most impactful use cases and develop analytics solutions to address them. Organizations should also realize that analytics is not just about implementing new tools or technologies; it is about changing the way people work and think about data. Ultimately, the goal of analytics maturity is to empower people to make confident decisions and drive even greater outcomes.