For compliance and latency in banking, move the data closer to the customer

Driven by a new breed of fintech applications, financial services organizations look to the cloud for infrastructure that is always-on, resilient, and able to support the real-time processing of transactions. In this post, we investigate how these organizations can deliver peak performance while ensuring compliance with local data protection and privacy laws.

Tech giants like Facebook and Google, along with new-age fintech apps like Betterment and Robinhood, have driven consumers to expect fast, feature-rich applications, so performance is a concern for every business. For financial services organizations, one component of performance demands particular attention: keeping latency low. Cloud-native applications offer banks another significant benefit here, since they can support very high volumes of transactions while keeping latency low.

Traditionally, the main drivers for reducing latency have been to speed up transactions and increase revenue – or minimize potential lost revenue. In its Gospel of Speed from 2012, Google revealed that a 400ms delay in search page generation caused a 0.44% drop in search volume. Now, with every Google search, you’ll see how many milliseconds it took to return results. Around the same time, an Amazon staff engineer revealed in a presentation that every 100ms of latency cost the company 1% in sales. There are countless examples of companies losing revenue to delays or downtime.

In financial services, low latency is most often associated with high-frequency trading – trading that is entirely automated and optimized to take advantage of changing market prices. These applications make millions of decisions per millisecond, and receiving data a fraction of a second faster than a competitor’s systems can equate to millions of dollars. Few other industries face latency demands this extreme, let alone under strict compliance guidelines and regulations.

But low latency should be a concern for every financial services organization. A recent study found that nearly 90% of business leaders need latency of 10ms or less to ensure their applications’ success. Financial services organizations must also consider the effect of latency on new use cases like cryptocurrency, edge computing, artificial intelligence (AI), and machine learning (ML). With low latency, data scientists can make informed business intelligence decisions in real time, and banks can use AI for real-time fraud detection.

Multi-region deployments and geo-partitioning

In a distributed application, the majority of data moves between components over public networks. This means even a perfectly architected application can experience lag if it has to communicate with a database thousands of miles away. Because most banks have operations spanning regions, nations, and even the globe, they need to make infrastructure decisions with these dispersed applications and customers in mind. The solution is simple in theory – put the data closer to the application or customer. One way to keep data closer to users is through a multi-region deployment.
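
To make this concrete, here is a minimal sketch of what a multi-region setup can look like, assuming a CockroachDB cluster running its multi-region SQL (v21.1+) whose nodes were started with --locality=region=... flags, driven from Python over the PostgreSQL wire protocol. The database name, region names, and connection string are illustrative, not prescriptive.

    # A minimal sketch: declare a primary region and add others, assuming a
    # CockroachDB cluster whose nodes carry --locality=region=... labels.
    # Database name, regions, and connection string are illustrative.
    import psycopg2  # CockroachDB speaks the PostgreSQL wire protocol

    conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
    conn.autocommit = True

    with conn.cursor() as cur:
        # Data is replicated across these regions for resiliency, while
        # reads and writes are served from the region closest to the client.
        cur.execute('ALTER DATABASE bank SET PRIMARY REGION "us-east1"')
        cur.execute('ALTER DATABASE bank ADD REGION "us-west2"')
        cur.execute('ALTER DATABASE bank ADD REGION "europe-west1"')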

For a U.S. financial software company seeking a new database solution for its customer identity and access management (CIAM) layer, a multi-region deployment was the way to achieve both high performance and consistency. The CIAM layer was initially built on Oracle with multi-region replication via GoldenGate. However, the company soon discovered this configuration did not provide the speed or the always-available login experience it needed: customers would experience a lag in authentication after creating an account, resulting in a poor user experience. The team decided to deploy CockroachDB across three AWS regions in the U.S., which brought resiliency by replicating data and distributing the replicas to maximize geo-diversity.

However, multi-region deployments can be complicated for organizations running distributed databases, because managing state across a set of machines is never straightforward. Organizations need to determine whether the benefits outweigh the costs, since falling back to a single-region deployment is detrimental to both speed and availability. This is where geo-partitioning of data comes in. Geo-partitioning provides row-level replication control, meaning organizations can attach individual rows of data to a specific location.
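
As a sketch of what that row-level control looks like in practice: in current CockroachDB versions, geo-partitioning is surfaced through the REGIONAL BY ROW table locality, which gives every row a hidden crdb_region column that determines where that row lives. The table and column names below are illustrative, and the database is assumed to already be multi-region, as in the earlier sketch.

    # A sketch of row-level placement via REGIONAL BY ROW. Assumes the
    # multi-region database from the earlier sketch; names are illustrative.
    import psycopg2

    conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
    conn.autocommit = True

    with conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS customers (
                id    UUID PRIMARY KEY DEFAULT gen_random_uuid(),
                email STRING NOT NULL
            )
        """)
        # Home each row individually: CockroachDB adds a hidden crdb_region
        # column whose value decides where the row's replicas are placed.
        cur.execute("ALTER TABLE customers SET LOCALITY REGIONAL BY ROW")
        # Pin one customer's row to the EU region at insert time.
        cur.execute(
            "INSERT INTO customers (crdb_region, email) VALUES ('europe-west1', %s)",
            ("customer@example.com",),
        )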

A global financial data firm, for example, deployed CockroachDB to reduce latency across four GCP regions and two on-premises data centers by creating a single hybrid, geo-partitioned deployment. The firm had outgrown its expensive and dated Oracle database architecture. It chose CockroachDB for migrating its identity access management microservice because geo-partitioning provided a way to authenticate entities even when their data is strongly tied to specific geographic regions.

Geo-partitioning also continues to work when a customer moves or travels, which is crucial for payment applications.
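
Under the REGIONAL BY ROW scheme sketched above, for instance, re-homing a relocated customer can be as simple as updating the row’s region column; the database migrates the row’s replicas in the background. The names here are again illustrative.

    # If a customer relocates, re-home their row by updating crdb_region;
    # CockroachDB rebalances the row's replicas behind the scenes.
    import psycopg2

    conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
    conn.autocommit = True

    with conn.cursor() as cur:
        cur.execute(
            "UPDATE customers SET crdb_region = 'us-east1' WHERE email = %s",
            ("customer@example.com",),
        )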

Don’t forget regulations

Beyond the speed complications of operating in multiple regions, banks and other financial organizations need to consider data regulations.

Data privacy is a hot-button issue, with new laws and regulations coming into effect every year.

At the start of 2020, more than 120 countries had enacted more than 200 laws to protect data and consumer privacy. These regulations range from newer state-level mandates, like the California Consumer Privacy Act of 2018 (CCPA), which gives consumers more control over the personal information businesses collect about them, to sweeping regulations like the European Union’s General Data Protection Regulation (GDPR), which covers everything from data collection and sharing to data storage, erasure, and destruction.

For organizations growing a broad regional or global customer base, the most important aspect of these regulations to keep in mind is that they often prohibit storing certain data outside certain boundaries. For example, a U.S. bank with customers in Europe may need to store those customers’ data within the EU. This points to another important benefit of geo-partitioning beyond keeping data closer to the application or customer: the ability to pin data to a specific location can help ensure compliance in countries or regions that require data to be stored within their borders.
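
As a hedged illustration of that pinning, a table holding EU customers’ records can be constrained to an EU region outright using a table-level locality; the table name and region below are illustrative. (For strict domiciling, CockroachDB can additionally restrict replica placement, but that is beyond this sketch.)

    # A sketch of pinning data for compliance: keep an entire table's data
    # (and its leaseholders) in an EU region so it is stored within EU
    # borders. Table name and region are illustrative; the database is
    # assumed to include "europe-west1" among its regions.
    import psycopg2

    conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
    conn.autocommit = True

    with conn.cursor() as cur:
        cur.execute(
            'ALTER TABLE eu_customer_records '
            'SET LOCALITY REGIONAL BY TABLE IN "europe-west1"'
        )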

CockroachDB from Cockroach Labs is the only database solution that offers geo-partitioning for multi-region deployments. Using these capabilities, developers at banks and financial services organizations can designate where data should be stored at the database, table, and row level. With this, organizations can deliver their applications with the lowest possible latency while remaining compliant with the latest data protection and privacy regulations.

Download the eBook How Financial Service Companies Can Successfully Migrate Critical Applications to the Cloud to learn more.

About the author

Jessica Edwards

Jessica Edwards is a founding member of the Cockroach Labs team and is the Head of Corporate Marketing. She has been marketing for technical products and companies for a dog's age, and worked with non-profits for years before moving into the tech space. She has a deep love of storytelling, education, knowledge-sharing, and community building. After 13 years in NYC, she recently moved to Portland, OR. She is still getting used to the rain.
