A global power plant operator uses a Snowflake data warehouse on the Microsoft Azure cloud for reporting KPIs related to power plant operations, environmental management, and health & safety at work. The existing reporting processes are being modernised, which involves tasks such as designing a new technical data model and migrating the data processing steps onto new technology.
As part of this process standardisation and modernisation, a large set of KPIs covering power plant operations, environmental management, and health & safety at work is migrated to the organisation's central Snowflake data warehouse on the Microsoft Azure cloud. The technical data model is improved, with most parts now following a star schema design. The data transformation logic is implemented as Python scripts running on Azure Batch. Snowflake features such as time zone management and Time Travel (for reports comparing different versions of the data) are used. A dedicated web frontend, written in Python, allows users to start a data loading process manually, for example after uploading an Excel file.
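To illustrate the Time Travel feature mentioned above, the following is a minimal sketch of how a versioned report query could be built in Python. The table name `FACT_KPI` and the helper function are hypothetical, not taken from the actual project; Snowflake's `AT (TIMESTAMP => ...)` clause is standard syntax for reading a table as it existed at an earlier point in time.

```python
from datetime import datetime, timezone

def time_travel_query(table: str, as_of: datetime) -> str:
    """Build a Snowflake Time Travel query that reads a table as it
    existed at a given point in time (AT ... TIMESTAMP syntax)."""
    ts = as_of.strftime("%Y-%m-%d %H:%M:%S")
    return (
        f"SELECT * FROM {table} "
        f"AT (TIMESTAMP => '{ts}'::TIMESTAMP_TZ)"
    )

# Read last month's snapshot of a (hypothetical) KPI fact table, e.g. to
# compare it against the current reporting run.
previous = time_travel_query(
    "FACT_KPI", datetime(2023, 5, 1, tzinfo=timezone.utc)
)
print(previous)
```

Comparing the result of such a query with the current table contents is one way to implement "reports comparing different versions" without maintaining manual snapshot copies.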
The previous KPI reporting was based on a mix of technologies, for example MS Excel or data transformations implemented directly in Power BI. To standardise the process and improve data quality, the KPIs are now calculated by Python scripts and stored centrally in the Snowflake database, in line with the organisation's best practices.
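The kind of calculation moved from Excel into Python scripts can be sketched as follows. The availability factor shown here is a common power plant KPI, but the schema, class, and function names are illustrative assumptions, not the project's actual model.

```python
from dataclasses import dataclass

@dataclass
class PlantMonth:
    """One month of reported hours for a single plant (illustrative schema)."""
    plant_id: str
    operating_hours: float
    period_hours: float

def availability_factor(rows: list[PlantMonth]) -> dict[str, float]:
    """Availability factor per plant: operating hours / total period hours.
    A stand-in for the kind of KPI formerly computed in an Excel sheet."""
    totals: dict[str, list[float]] = {}
    for r in rows:
        op, per = totals.setdefault(r.plant_id, [0.0, 0.0])
        totals[r.plant_id] = [op + r.operating_hours, per + r.period_hours]
    return {p: round(op / per, 4) for p, (op, per) in totals.items()}

rows = [
    PlantMonth("P1", 700.0, 744.0),
    PlantMonth("P1", 650.0, 720.0),
]
print(availability_factor(rows))  # {'P1': 0.9221}
```

Centralising such logic in versioned Python scripts, with the results written to Snowflake, gives every report the same figures instead of each Excel file or Power BI dataset recomputing them independently.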