In the modern business world, data has become a critical asset that businesses are eager to harvest.
Every system interaction generates data. Automated batch jobs start and stop at precisely scheduled times, ensuring that critical business processes run nonstop, each adding to a continuously growing amount of transactional data.
Using this data the right way has become a battle of titans. Consistently good decision-making and fast maneuverability have become necessities to keep your competitors at bay. But has this abundance of data necessarily made our decisions better? And do we trust the analytical results we are served?
Not many decades ago, all important data was kept neatly organized in binders on the bookshelf of each organization. Accounting requirements were the same, business processes and manufacturing requirements were the same, and the products coming out were basically the same. Businesses were still growing and were also profitable…
Extrapolated to a bookshelf format, today's databases would drown even the biggest warehouses in stacks of paper binders, leaving basically no space for anything else. Given the sheer space requirements, it would be close to impossible to organize the information meaningfully.
*** This must make one reflect ***
- How much of the data we generate carries actual value?
- Do the vast amount of data and the system complexity disturb the picture and make it more difficult to extract the meaningful and valuable data?
- How many people in our organization understand the underlying assumptions of the various reports, and do some of those assumptions negatively affect our decisions?
I cannot help but wonder why so many executives still rely on their experience and gut feeling when all necessary information is supposedly available in the analytics presented to them. How often are analytical conclusions revised because of wrong assumptions or an incomplete picture? Subject matter experts come back a few days later, explaining that certain details were not accounted for, and that what they said before was not exactly true anyway. So now we favor this decision instead…
As a BI consultant, I quite often see legacy systems with a lot of customizations; likewise, the data warehouse is packed with custom interpretations, perhaps hidden away in backend SQL queries. ERP system knowledge is scattered across a handful of solution specialists, each holding important nuances about why business processes are run a certain way.
Maybe the people who designed and coded the various reports are no longer with the organization, and nobody truly understands why the interpretation was done this way. The affected reports, however, have been part of management board presentations for years and have been running untouched ever since.
Often, data is only accessible through front-end tools. Business Intelligence teams often preach that businesspeople do not understand the system terms well enough, so all fields presented to them have been renamed and remodeled so they can be “better understood” and, more importantly, so that everyone sees the same…
If you ask people about the BI deliveries, you often hear a lot of pain points: distrust of the numbers, a missing understanding of the underlying modeling and assumptions, and perhaps also complaints about the stability and update frequency of the data.
Attempts to reconcile and validate reports have perhaps never been carried out, or key aspects were not fully understood; several special cases may not be accounted for, and so on.
At VLK IT Consulting, we believe that people working in the organization need a much deeper understanding of the underlying assumptions and raw data structures at hand. With that understanding, the business takes more ownership of its data, starts to experiment with it, and in turn becomes much more successful in utilizing it effectively.
It is key to understand that this discipline and data curiosity do not fit every businessperson equally well, but BI needs to be anchored solidly deep within the organization.
At least one person needs to take ownership of all data requirements and reporting for each department. Finding and educating these people, and making them capable of just that, is where the real payoff lies.
One complication is that employee retention has also changed in recent years. Whether you like it or not, data interpretation is very person-driven and requires a ton of process knowledge. When key people are no longer with the organization, the sense of ownership and the understanding of existing reports drop drastically. Trust in the existing analytics needs to be reestablished, and rebuilding the reports is often seen as a major job.
Another important area that is often neglected is data governance and data maintenance. Most people agree that “garbage in” equals “garbage out”; however, companies frequently have less success establishing data ownership and solid data-cleaning processes to ensure their data quality.
Therefore, a data model that gives carefully selected businesspeople easy access to raw data is key to establishing ownership, understanding, and reliable data at the heart of the organization, where data belongs and creates the most value. Help your business make data a friend instead of an enemy.
If this sounds familiar to you, please feel free to reach out. Let us start the journey, get you back in control, and make your data the asset it ought to be.