DAR was dealing with a huge amount of data. They were ingesting about 40 million rows of data every day, which translated to high computation and storage costs on a daily basis. They wanted to investigate cost-saving measures they could apply to the data loading process while maintaining the integrity and quality of the data they provided to their clients.
At the time, DAR’s existing setup captured and recorded data for more than 1,000 assets every 15 seconds, ingesting from multiple sources simultaneously. With this setup, they saw frequent outages during high-volume periods and increasing database contention between updates and reads. To make the process more efficient and cost-effective, they needed a more robust processing system.