Short description:

For reporting on gas storage capacity, a global power plant operator uses Snowflake on Azure as its database. Additional business logic was implemented to integrate data from further data sources for reporting purposes.

Some data sources periodically save XML files to an Azure Blob Storage. Azure Function Apps (written in C#) are triggered whenever a new file is saved to the Blob Storage; they process the data in the XML files and store it in the Snowflake database. For further processing, Python scripts running on Azure Batch accounts are used. The ETL tool Talend starts the Azure Batch jobs that run these Python scripts. Talend is also used to query other data sources that provide API endpoints; this data is likewise saved directly to Snowflake and further processed by the Python scripts. The Python scripts mainly execute SQL statements for data transformation on Snowflake. The existing reports in Tableau and Power BI were adjusted to use the prepared, clean data on Snowflake as their data source.
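The XML-processing step itself is implemented in C# inside the Function Apps; as a hedged sketch in Python (the language of the downstream scripts), parsing an incoming file and preparing a parameterized insert for Snowflake could look roughly like this. All element names, table names, and columns here are assumptions for illustration, not taken from the actual project:

```python
import xml.etree.ElementTree as ET

def parse_measurements(xml_text):
    """Extract storage measurements from an XML payload.

    The element and attribute names used below (measurement, storageId,
    timestamp, fillLevel) are hypothetical; the real source files differ.
    """
    root = ET.fromstring(xml_text)
    rows = []
    for node in root.iter("measurement"):
        rows.append({
            "storage_id": node.get("storageId"),
            "timestamp": node.findtext("timestamp"),
            "fill_level_mwh": float(node.findtext("fillLevel")),
        })
    return rows

# A parameterized INSERT such as this could then be run against Snowflake,
# e.g. with snowflake-connector-python's cursor.executemany(INSERT_SQL, rows).
# Table and column names are placeholders.
INSERT_SQL = (
    "INSERT INTO RAW.STORAGE_MEASUREMENTS "
    "(STORAGE_ID, TS, FILL_LEVEL_MWH) "
    "VALUES (%(storage_id)s, %(timestamp)s, %(fill_level_mwh)s)"
)

sample = """<measurements>
  <measurement storageId="ST-01">
    <timestamp>2023-01-15T06:00:00Z</timestamp>
    <fillLevel>1250.5</fillLevel>
  </measurement>
</measurements>"""

rows = parse_measurements(sample)
```

In the real pipeline this logic runs inside a Blob-triggered Function App, so each new file is processed exactly once as it arrives.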

Technical description:

The customer uses a central application for the capacity management of its gas storages. The application imports technical and business data from external systems, calculates the free and occupied capacities of the gas storages, and uses these values for mid-term and long-term capacity renting. Additionally, the free capacity for intraday trading is calculated.
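As a minimal sketch of the calculation described above, free capacity is essentially the total working volume minus what is occupied and already rented out. The function, its parameters, and all figures below are hypothetical; the application's actual business rules are more involved:

```python
def free_capacity(total_mwh, occupied_mwh, rented_mwh=0.0):
    """Free capacity = total working volume minus occupied volume and
    volume already committed to renting contracts (simplified assumption)."""
    free = total_mwh - occupied_mwh - rented_mwh
    return max(free, 0.0)  # capacity cannot be negative

# Capacity available for mid-term and long-term renting:
rentable = free_capacity(total_mwh=5000.0, occupied_mwh=3200.0)

# Capacity left over for intraday trading after rental commitments:
intraday = free_capacity(total_mwh=5000.0, occupied_mwh=3200.0,
                         rented_mwh=1500.0)
```

Here `rentable` comes out to 1800.0 MWh and `intraday` to 300.0 MWh under the assumed figures.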