Designing a modern, resilient analytics stack is not only about building an intelligent data architecture on cloud-native technologies, or ensuring that an advanced analytics platform has a comprehensive set of data connectors and interactive data visualization tools. In an ideal environment, data is continuously translated into information and insights that guide informed decisions and actions.
First and foremost, a strong data foundation calls for a business-oriented mindset: identify the capabilities the business is looking for. It is about setting out a sustainable business strategy that helps deliver superior performance and drive sustainable, responsible growth. Organizations today understand that to remain competitive, they need to continue on their path to becoming data intelligent.
It is necessary to define the motivational triggers and review high-impact use cases while ensuring awareness and alignment among business and IT stakeholders to achieve the intended results. Once the motivations, objectives and expected benefits are assessed, the next stage calls for a full assessment of the current analytics platform or digital data estate and its components, both at a high level of understanding and at a detailed level of evaluation.
An effective data strategy is one that is aligned to the business drivers and capabilities. It is relevant and contextualized to the organization’s needs, and evolutionary in the sense that it is continuously updated through periodic reviews of progress.
A distributed data strategy approach such as data mesh is a new paradigm for managing and operationalizing data: data is broken down into specific products, each designed to meet a specific need and managed by the team closest to the use case. Adopting this domain-oriented decentralization takes pressure off the central team while each product delivers its intended outcomes over time. The practice helps an organization become data driven because, across the data life-cycle, it improves data quality, accelerates time to insight and strengthens governance.
Individual teams can contextualize their own insights.
Federated data governance brings data quality back to centre stage.
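As a rough illustration of the mesh idea (all names and structures here are hypothetical, not from any specific data mesh implementation), a domain-owned data product can ship with its own local quality checks, while a federated governance layer applies organization-wide rules on top:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch: each domain team owns a data product and defines
# its own quality checks; federated governance supplies global policies
# that are enforced uniformly across every product.

@dataclass
class DataProduct:
    name: str
    owning_domain: str  # the team closest to the use case
    records: list[dict] = field(default_factory=list)
    local_checks: list[Callable[[dict], bool]] = field(default_factory=list)

    def publish(self, global_policies: list[Callable[[dict], bool]]) -> list[dict]:
        """Return only records that pass both local and federated checks."""
        checks = self.local_checks + global_policies
        return [r for r in self.records if all(c(r) for c in checks)]

# Federated governance: one rule defined centrally, applied everywhere.
def no_missing_customer_id(record: dict) -> bool:
    return record.get("customer_id") is not None

orders = DataProduct(
    name="orders",
    owning_domain="sales",
    records=[
        {"customer_id": 1, "total": 40.0},
        {"customer_id": None, "total": 5.0},  # fails the federated policy
    ],
    local_checks=[lambda r: r["total"] >= 0],
)
clean = orders.publish([no_missing_customer_id])
print(len(clean))  # only the valid record survives
```

The point of the sketch is the split of responsibility: the sales team owns the product and its local checks, while the central function only defines policies, not pipelines.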
In my current set-up, the data and analytics team is responsible for building out data, data science and analytics as a core capability to help the business become data intelligent. The key themes of the strategy are as follows:
Delivering intelligent systems at scale by leveraging big data and data science.
Driving assisted and predictive decision-making by unlocking insights from the deluge of customer and external data.
Progressing towards better data through accessibility and continuous improvement.
A single source of truth with one version of the facts. In my experience, a single source of truth was the mantra for every data warehouse project. With data lakes, the idea became to hoard all the data and figure it out when required. This market emphasis on accumulating data to make sense of later only worsened data quality, as there was no clarity around where to put the quality gates or tests. There was, however, no discussion of how to leverage trusted data to support decision making and day-to-day operations.
With a single version of truth, the impact on the client’s business is as follows: decision makers have access to clean, consumable and connected information for improved insights to support strategic decisions and routine operations.
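One answer to the question of where to put the quality gates is at ingestion, so that untrusted records never reach the single source of truth at all. A minimal sketch, assuming a hypothetical required-field rule (the field names and data are invented for illustration):

```python
# Hypothetical sketch: a quality gate applied at ingestion time, before
# records land in the trusted store, rather than cleaning up afterwards.

def quality_gate(record: dict, required: set[str]) -> bool:
    """Pass only records whose required fields are present and non-empty."""
    return all(record.get(f) not in (None, "") for f in required)

REQUIRED_FIELDS = {"customer_id", "order_date"}

incoming = [
    {"customer_id": "C-001", "order_date": "2023-05-01"},
    {"customer_id": "", "order_date": "2023-05-02"},  # fails the gate
]

accepted = [r for r in incoming if quality_gate(r, REQUIRED_FIELDS)]
rejected = [r for r in incoming if not quality_gate(r, REQUIRED_FIELDS)]
print(len(accepted), len(rejected))
```

Rejected records would typically be routed to a review queue rather than silently dropped, so the gate improves quality without losing information.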
Trying to make sense of overwhelming data breeds confusion. Not all data is equally useful: some is a distraction at best and a misleading source of information at worst.
Executive sponsorship and leadership communication, commitment and involvement in the decision-making process are equally important and required to ensure success.