Optimization of TCO - part I: Footprint

Data is at the heart of banking, so efficiently running systems are vital to a bank's operations. Explore how we helped a bank when natural growth and the additional onboarding of datasets flooded their SAP HANA hot memory.


A data warehouse built on the data vault technique, fed by hundreds of sources, enables reporting along two separate timelines, a necessary condition for regulators. The platform provides data to both internal stakeholders and European authorities.


Data volumes were growing, through both natural growth and the additional onboarding of datasets, and the SAP HANA hot memory was being flooded. As a consequence, processing memory was squeezed: out-of-memory errors and intense swapping occurred. Instability loomed, and additional investment in hardware seemed unavoidable.


The read and write patterns of the data processes were analysed and a prioritised plan of steps was created. The underlying tables were partitioned, and the data processes were pruned so that only the necessary data is read. Infrequently accessed data was moved to the SAP HANA NSE warm store. Design recommendations were drawn up for new software, proactively helping to manage data volume growth.
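As a rough illustration of the partitioning and warm-store steps, the SQL below range-partitions a large table by load date and marks the historical partition as page loadable, i.e. NSE warm data that stays on disk and is paged into memory only when accessed. Table, column, and date values are hypothetical, not the client's actual schema.

```sql
-- Hypothetical example: range-partition a large table by load date
-- so that data processes can prune to the partitions they need
ALTER TABLE SAT_TRANSACTION
  PARTITION BY RANGE (LOAD_DATE) (
    PARTITION '2015-01-01' <= VALUES < '2020-01-01',
    PARTITION OTHERS
  );

-- Move the rarely read historical partition to the NSE warm store:
-- page-loadable data is held on disk and loaded page-wise on access,
-- freeing hot (column-loadable) memory for active processing
ALTER TABLE SAT_TRANSACTION
  ALTER PARTITION 1 PAGE LOADABLE;
```

With partitions aligned to the read patterns, queries that filter on the partitioning column touch only the hot partitions, which is what keeps the hot footprint near flat as history accumulates.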


Hot footprint reduction (measured 1 year later, and still growing) compared to the 'do nothing' scenario

-7 TB

No additional investment in hardware or licenses was needed, nor is any foreseen for the years to come. This also facilitates a cloud move and leaner cloud consumption.

- x.x00.000 EURO

Predictable, near-flat hot-data growth curve


Stable system: no more out-of-memory errors or excessive swapping, with a positive effect on System Operations activities


Improved perception and adoption of the reporting platform


Optimization, modelling, simplification




SAP HANA, SAP HANA NSE, Informatica PowerCenter