Maintained and developed for Munich Re by ERGO Technology & Services S.A.
Data Pipeline Factory
The Data Pipeline Factory is an Azure cloud-based solution built to provide the right data at the right time for analytical solutions.
The Data Pipeline Factory is a managed service for analytical projects that addresses the challenges of data integration:
- Required data sources are stored in disparate internal and external environments and come in different formats
- Data changes constantly
- Related data is siloed, and the same entity appears under different identifiers
- Information is hidden in raw data without context
- Unstructured data contains a wealth of information that otherwise cannot be used
- Cyberattacks threaten data in motion and at rest
- Regulatory requirements need to be met (e.g., GDPR, HIPAA)
What Data Pipeline Factory is based on
Our team consists of highly skilled professionals with experience in Big Data & ETL.
Our solution is built on top of Azure and addresses the data integration challenges outlined above.
We monitor and operate data pipelines to meet the SLAs agreed for analytical solutions.
Capabilities of the Data Pipeline Factory
- Dynamic data integration (e.g., internal data, client data, third-party data, web data)
- Trusted data products (e.g., data typing, data matching, data mapping, data enrichment, metadata management)
- Smart solutions for structuring data (e.g., natural language processing, table recognition, auto-mapping to target formats)
- Standardization of data (e.g., mapping to a uniform target format, sharing of algorithms, creating data pools)
- Secure & compliant platform (i.e., certified technical architecture, monitoring and exception handling)
The Data Pipeline Factory principles
Unlike traditional infrastructure and operations services with highly standardized processes and strict responsibilities, the Data Pipeline Factory follows principles from agile development:
Product teams keep end-to-end responsibility (“You build it, you run it.”), including product-specific 2nd- and 3rd-level support. A dedicated team, Digital Operations, augments DevOps activities for operations & monitoring, case management, and support for cross-sectional components to ensure efficient and consistent delivery of our digital products.
The Data Pipeline Factory handles external, customer-facing, business-critical services that require a high degree of customer centricity, courtesy, and confidence in user interactions, as well as professional handling of operations and support. High customer satisfaction is a key objective for Munich Re as the global leader in reinsurance services.
Managed Service, Cloud, Multi-Tenant
The Data Pipeline Factory is a managed service running in the cloud with a multi-tenant architecture.
The Data Pipeline Factory services are available in all geographies where the Big Data Analytics Platform is available.
The services and their underlying processes and tools are continuously improved and optimized. This goes hand in hand with the product teams' agile CI/CD approach.
Primary insurance data from customers
The continued demand for new pipelines and the increasing number of active pipelines require a shift toward a more mature operating model.
Technologies we use