Case

Enterprise Data Hub: Development and Integration

Big Data

Customer

International bank

Industry

Financial sector

Scale

1000+ employees

Challenge

Build a multifunctional hub for aggregating, processing, and presenting data, combined with secure real-time exchange of enterprise data.

Solution

The structure of functional components:
Data Broker (Apache Kafka) – ensures real-time data exchange.
Big Data Platform (Hortonworks / Cloudera Hadoop Data Platform, Hortonworks DataFlow) – stores all data arriving through the data broker.
Logical Data Warehouse (Tibco Data Virtualization) – a tool for business users to access and manage data. It also enables fast data access for online monitoring of the end-to-end technical process (from data collection to data usage) and supplies data for technical monitoring.
Data Governance – data management processes covering data quality and security.
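To illustrate the broker component above: Kafka routes each keyed record to a fixed partition so that events for the same entity are processed in order. The sketch below mimics that key-to-partition routing in plain Python (Kafka itself uses a murmur2 hash); the event fields and topic sizing are illustrative assumptions, not details from the case study.

```python
import hashlib
import json

def partition_for(key: str, num_partitions: int) -> int:
    """Stable key -> partition mapping, analogous to Kafka's keyed routing.

    Kafka's default partitioner uses murmur2; md5 is used here only to
    keep the sketch dependency-free while preserving the key property:
    the same key always lands on the same partition.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Hypothetical enterprise event; field names are assumptions.
event = {"account_id": "ACC-001", "type": "payment", "amount": 125.50}

# Serialize the value as JSON bytes, as a producer typically would.
payload = json.dumps(event).encode("utf-8")

# Route by account so all events for one account stay ordered.
partition = partition_for(event["account_id"], num_partitions=12)
```

Keying by a business identifier such as the account is what makes per-entity ordering possible while still spreading load across partitions.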

Result

A scalable architecture built on Apache Kafka provides real-time data exchange around the clock. The system processes about 2-3 million events per day, with peaks of up to 200 events per second. Data governance was streamlined with respect to data quality and security.
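A quick sanity check on the figures above: 2-3 million events per day averages out to roughly 23-35 events per second, so the quoted 200 events per second is a peak rate, several times the daily average.

```python
SECONDS_PER_DAY = 86_400

# Daily volumes quoted in the case study.
low, high = 2_000_000, 3_000_000

avg_low = low / SECONDS_PER_DAY   # ~23 events/s
avg_high = high / SECONDS_PER_DAY  # ~35 events/s

# The 200 events/s figure is therefore a burst capacity,
# well above the sustained average load.
peak = 200
headroom = peak / avg_high  # ~6x the average at the high end
```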


Would you like to see the full case study?

Fill out the form and we will contact you right away.