Evolving Your Regulatory Compliance Architecture to Create Strategic Advantage
White Paper
Datameer WHITE PAPER
PAGE 2
Regulations in the financial sector can be especially numerous and complicated. These
regulations may often overlap and sometimes can even contradict each other. The
schedules are complex and deadlines change frequently, but institutions are expected
to be ready regardless. Ten years after the financial crisis, we’re seeing institutions struggle to
meet the complexity of these regulations.
There has been a relative slowdown in new regulations in the last couple of years, although
new revisions of existing regulations continue to keep regulatory teams at financial firms
busy. Even amid this continued activity, forward-thinking institutions are thinking
more strategically about their compliance programs.
Newer areas of interest for regulators include the global impact of Fintech and managing
the influx of new market entrants. There is now the possibility of non-bank institutions
entering financial services. One example is Amazon, which recently inquired about a banking
license. These developments are sure to generate new regulations that all institutions need
to prepare for.
Tug of War Between Banks and Regulators
Both the regulators and financial institutions have been severely tested in the era since the
2008-2011 recession. This has resulted in stress and trust issues in the financial system.
On the institution side, FT Research reported that banks have paid between $150 billion
and $200 billion in fines in the last 10 years. These fines have had major consequences for the
banks, both in terms of the balance sheet and in public trust. Suspicion of financial
institutions has led directly to activist movements like Occupy Wall Street.
The credibility of regulators in government has also been seriously undermined, as many
in the banking industry believe some regulations are too onerous. Both sides are highly
sensitive about how to do better going forward.
The Ever Evolving Regulatory Environment
We have seen an extraordinary array of regulations in the last 8 to 10 years, both globally and especially in the US and EU. Starting with the US Dodd-Frank Act during the Great Recession (late 2000s and early 2010s), the wave continued in Europe with a series of regulations, most recently around data privacy and protection concerns in the form of GDPR.
The Next Era of Regulations
Regulators have learned a lot in the last 8 to 10 years. They are assessing the effectiveness
of what has been done since the crisis and are responding accordingly. In general, regulators
are moving away from releasing large, complex new regulations and instead breaking them
up over several years.
New threats, like the proliferation of cyber-attacks and release of private customer data
into the public domain, will usher in the next era of regulations after the financial crisis.
Those types of activities call for high levels of sophisticated control, and we can expect
to see regulations tightening around these vulnerabilities.
From a firm’s perspective, the consequences of these new threats go beyond the monetary.
A brand’s reputation and image– which has a significant impact on market capitalization–
can be severely damaged in the wake of such an attack.
The Only Constant is Change… and Cost
The only constant in this space has been change. Dozens of new rules and future regulations
make planning difficult. Requirements and implementation timelines change frequently,
making compliance a costly activity for the banks in terms of both time and money.
For example, Basel III (a voluntary regulatory framework for market liquidity risk and stress
testing) was originally planned to launch in May 2018, but was moved back to 2019 as
regulatory bodies were late in getting the full suite of standards published. Banks were
given a little breathing room, but are still scrambling and expending resources to meet
even the extended deadline.
Another example is the U.S. Comprehensive Capital Analysis and Review (CCAR). In order to
meet the deadlines of this annual US stress-testing regulation, the largest banks in the US will
likely spend several billion dollars on just this one single piece of regulatory compliance.
Version 1 Architecture: ~2010 to ~2013
Between 2010 and 2013, banks were using any tool they could to meet the deadlines.
Version 1 “architecture” was tactical invention on the fly, leveraging any and all existing
infrastructure, technology processes, and legacy computing systems. It was an ad hoc
process, with a lot of manual activities and human dependencies. This resulted in errors
and significant key person risk.
Central to the challenge was data sourcing. In order to meet particular requirements,
institutions were forced to create a “golden source” of data by finding and stitching
together fragmented information. It was extremely challenging for banks to understand
the lineage of that data and its original provenance, not to mention demonstrating that to
the regulators.
Compliance teams would use offline tools like Microsoft Excel that introduced errors and
risk during data manipulation. Reconciliation efforts were often a costly, but necessary,
part of producing an accurate regulatory report.
Institutions also had a very poor business understanding of data provenance in this V1
era. It took a long time for businesses to take responsibility for the ownership of data.
Version 2 Architecture: ~2013 to ~2017
Version 2 architecture– which emerged sometime around 2013 and continues today–
is characterized by institutions investing money to try to reduce the tactical complexity
coming out of Version 1.
Institutions started to make significant investments in technology, people, and operational
processes. Their goal was to reduce delivery times, improve reports, cut down on error
rates and significant reconciliations, and ultimately, improve the predictability of results.
The Evolving Regulatory Compliance Architecture
Over the last 6-8 years, financial institutions have responded to new regulations in a variety of ways. Some institutions have done well from a technical and business perspective, while others have done very poorly. In general, architectural solutions to compliance have occurred in two phases.
The other major initiative of V2 architecture was the push for data governance and full
lifecycle data management. This initiative was driven by the need for more qualitative
analysis. Regulators not only cared that the numbers were correct (quantitative), but
they also wanted to see how you arrived at those conclusions. Many institutions in V1
succeeded quantitatively, but failed qualitatively.
Despite the improvements, most V2 architectures still depended on legacy infrastructure.
Instead of investing in new infrastructure, institutions chose to build new capabilities
around their legacy systems to make up for deficiencies. This created even more
complexity and new challenges.
Initial Regulatory Solutions Created Stress Points
While regulators have considered the early results of regulation moderately successful,
the implementation of V1 and V2 architectures has created significant stress inside the
institutions.
The first major challenge has been the reassignment or creation of teams to focus on
regulatory compliance. Part of this challenge includes budget allocation and high costs
(remember the CCAR example in the previous section).
Compliance has also put tremendous stress on internal technology infrastructure.
Companies have had to provision and re-configure infrastructure to create environments
where compliance reports and obligations could be developed, tested, and ultimately put
into production.
On the business side of financial firms, almost every group has been impacted. Lines
of business have had to adapt to changing regulations, while entirely new departments
have been created around risk and regulatory affairs. In general, the operational side of the
business has taken on the lion’s share of the ownership for maintaining compliance.
Continuing Trends
A series of trends that developed during the first 10 years of regulations is
continuing through today. These trends include:
• Regulator pressure and enforcement scrutiny
• A crisis around data management
• The quest for efficiency and effectiveness
Regulatory pressure will continue for the foreseeable future, whether it means complying
with the existing regulations or adjusting to timelines and targets.
This pressure has effectively created a crisis in data management across the whole
industry. Institutions must manage:
• a rapidly expanding universe of data necessary to meet these rules
• increasing volumes from normal market and customer activity
• the increasing velocity of the life cycle of that data.
The solutions and approaches employed to date are not sustainable for long-term cost
effectiveness and efficiency. This is due to the siloed approach of V1 and V2 architecture
that results in duplication and rework.
Institutions are on a quest to make these processes as fully automated as possible,
ultimately moving them into the core of the business. This is how companies can
leverage their data to gain strategic advantage.
New Investment in Data Management
In 2017, a number of industry observers commented that banks still have a long way to go
in improving their overall data management. This speaks to the magnitude and complexity
of the challenge at hand. Financial institutions are still learning to manage the ever-
changing rules and scope of compliance regulations.
Notably, in a report by KPMG, Ten Key Trends and Regulatory Challenges for 2018, the
consulting firm stated a major area of focus will be:
“Investments in platforms, systems, tools, and algorithms to capture, aggregate, govern, and analyze data from customers, financial activity, employee behavior, and third party transactions.”
There will continue to be a significant push to invest in strategic core data management
and data analytic infrastructure for the purposes of meeting the regulations. That brings
with it other opportunities to gain strategic advantage.
Evolving the Architecture
As we think about evolving the architecture into a future state, there are a number of
drivers pushing financial institutions, including pressures from internal and external
activity and, frankly, requirements to simply “do better”. These include:
• The increasing need to foster digital transformation to meet customer demand and
competitive threats, which always starts with data
• The rise of FinTech, which promises to take revenue away from traditional institutions
and saw $13.8 billion of VC money invested in new startups in 2015
• Stubbornly high cost/income ratios, driving cost effectiveness and cost efficiencies to
be one of the main challenges for financial institutions
The complexity introduced by Version 1 and 2 architectures has created cost, latency,
and friction that need to be addressed. The siloed approach of V2 architecture is outdated
and counter to the direction that banks and institutions need to move.
Keys to Better Regulatory Compliance Architecture
Forward-thinking institutions pursue strategies to leverage their investments to reach
strategic and operational objectives, even if regulatory demand grows less aggressive.
Institutions should leverage their investment in compliance to create greater strategic
value. Those that know how to do this will be the winners in the future.
We see six key attributes that can lead to a regulatory compliance architecture
that creates greater strategic value:
• Simplification - consolidation and elimination of duplication from earlier architectures
• Sustainability - the flexibility to meet future needs in both capacity and performance.
• Delivery Speed - the ability to deliver results and new projects in a much faster time
frame.
• Automation - the elimination of manual intervention that is slow, high-risk, costly, and
error-prone.
• Qualitative - a highly-qualitative approach that supports both “how it’s done” and
accuracy of the numbers.
• Modular - a services-based architecture that allows for future integration with FinTech
capabilities and technologies.
Implementing a Compliance Architecture
A Version 3.0 Architecture
The first two versions of regulatory compliance architecture may have gotten
the basics done, but they were tactical - aimed at becoming compliant - rather
than at driving strategic business advantage. The net effect was higher cost and
increased duplication, complexity, and risk in both architecture and delivery.
The next version of architecture – Version 3.0 – addresses these major issues. It focuses
on four major areas:
• Data Harmonization
• Integration into the Business
• Embracement of AI
• Cloud-native Architecture
Let’s examine each of these four areas in more detail.
Data Harmonization
New studies have found that the data collection and manipulation for compliance can
be used to improve significant business decisions – better budgeting, optimizing balance
sheet allocations, establishing better risk thresholds, and making acquisition and
divestiture decisions. This advantage would not have existed without the infrastructure
and the data harmonization efforts in a compliance architecture.
In order to achieve this everyday compliance, you need to have a pristine, fully-automated
common data environment. This includes consolidation of data, clear lineage of all data,
excellent governance, clear ownership, and the ability to create business intelligence.
Harmonizing the data and creating a central data strategy also eliminates duplication,
removing redundant data silos and the resources used to manage them. Yet it can also
retain flexibility for the business. A Single Version of the Truth can be managed for certain
data, while LOBs can be free to create their own versions of these assets when applicable.
Integration into the Business
A second principle underlying Version 3 architecture is to integrate regulatory capabilities
into the core of the business so that they become part of daily operations. This starts
and ends with data. Integrating compliance into the business core requires operational
convergence– the harmonizing of regulatory data with broader operational data by
eliminating silos and putting all data in one place.
While the goal of a Version 2 architecture was accurate reporting and regulatory compliance,
the objective of a Version 3 architecture is to run a lean, accurate, cost-efficient business,
where regulatory reporting is simply a byproduct. Compliance is an output benefit and
not an end in itself.
Embracement of AI
In a common data environment, AI can be used to facilitate model-driven decision making
to proactively manage risk. In other words, this would allow you to inject artificial
intelligence into the data fabric and use that as a decision making framework. This would
allow institutions to proactively investigate incoming or even anticipated regulations,
which is a fundamental shift from the first two versions of compliance architecture.
From a risk perspective, AI introduces the ability to have real-time risk management.
Many of the regulations around risk, credit, and liquidity that have been put on hold will
become possible.
The data environment in V3 architecture can also serve as an on-ramp for integration
with FinTech innovation. One of the biggest challenges for FinTech startups when working
with the large financial institutions is the lack of good data access, which inhibits their
ability to stand up pilots and Proof of Value projects (POVs). Banking institutions with
clean, clear data access are going to have a competitive advantage when embracing
new, innovative technologies.
The introduction of AI and other new technologies is another reason why KPMG is
forecasting a significant increase in spending on data platforms and tools in 2018.
Version 3 architecture will guide regulatory infrastructure into the core business and
create advantages that would not have existed otherwise.
Cloud-Native Architecture
A broader technology trend is the move to the public cloud. Financial services firms have
not been leaders in this space for a variety of reasons, particularly concerns around data
privacy and how they manage their data for regulatory compliance.
However, the cloud-native paradigm– including leveraging containers and functions (or
Lambdas as Amazon calls them)– is an architecture that allows for great modularity,
agility, and future-proofing. As financial institutions look to move to the cloud, it makes
increasing sense to design V3 architecture in a cloud-native format.
There are already conversations happening internally at institutions about cloud-native
architecture. This expertise needs to be brought to the regulatory infrastructure.
The Results of Version 3 Architecture
The clear strategic direction for financial institutions is defined by Version 3 architecture:
moving regulatory infrastructure into the core of the business. This shift won’t happen
overnight (there are still institutions struggling with Versions 1 and 2), but the
desired future state seems to be well understood, and clearly it all starts and ends with
the data.
A V3 architecture provides a number of strategic benefits to institutions:
1. Agility - the business flexibility and speed of a V3 architecture fosters the ability to
embrace competitive threats as well as new opportunities, new regulations, and new
innovation.
2. Clear data ownership - a V3 architecture gives the Lines of Business clear
ownership of their data to innovate and create new products and services.
3. Cost effectiveness - by converging data infrastructures and removing duplication, a
V3 architecture will offer tremendous cost effectiveness benefit over multiple years.
4. Reduces operational risk - a V3 architecture will help institutions stay compliant
every single day, rather than only at reporting deadlines.
5. Creates the path to AI adoption - with AI only going to continue to increase in
significance, a V3 architecture fosters its adoption on a clear, comprehensive set
of data.
These benefits can lead to faster reporting, the ability to manage change more effectively,
and better data governance. It also increases trust in the data, so the veracity of the output
has a far greater degree of acceptability.
In this continuous mode, institutions can be compliant every day or every hour, building
that compliance into the way they run the business rather than making it a special event.
Meeting Critical Requirements
One common topic in regulatory data compliance is the notion of a “single version of the truth” (or SVOT). When addressing regulatory compliance questions, banks and other large institutions want to rely on a single set of data to prove compliance, rather than having to stitch multiple data sets together. This idea developed out of the chaotic V1 architecture era and is meant to reduce the risk of error and delays.
A 2017 article by Thomas Davenport in the Harvard Business Review has somewhat
debunked the idea of an SVOT. Instead, Davenport suggests having two different data
strategies: a defensive data strategy and an offensive data strategy.
Offensive vs. Defensive Data Strategies
Offensive and defensive data strategies have different objectives. The objective of a
defensive data strategy is to keep data secure and private, while also maintaining proper
governance and regulatory compliance. This requires understanding all requirements
and effectively implementing the right processes to meet them. In the end, you create a
Single Version of the Truth (SVOT) dataset to share with regulators.
With an offensive data strategy, the goal is to improve the firm’s competitive position,
enter new markets, or grow the business. In these cases, you may want to manipulate
data in a different way, or at least look at it through a different lens. An institution doesn’t
necessarily have to comply with a “single version of the truth” when taking on these
initiatives. Different LOBs can create their own versions of the truth, creating Multiple
Versions of Truth (MVoT) for these datasets.
The defensive data strategy is usually the responsibility of the data engineers and the Chief
Data Officer. The offensive strategy, however, is best left in the hands of business analysts.
Analysts need the freedom to create new “versions of the truth” that help them develop
new business initiatives.
The offensive and defensive data strategies are not completely separate. Ideally, the
offensive version of the truth is derived directly from the defensive version. In other
words, the defensive version is the “common shared model” of the data, which analysts
then copy and manipulate to fit their business needs.
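The SVOT/MVoT relationship can be sketched in a few lines of code. The following is a minimal illustration only; the record fields, desk names, and helper names are hypothetical, not any institution’s actual data model:

```python
import copy

# Defensive layer: the Single Version of the Truth (SVOT), a governed,
# shared model owned by data engineering. Records here are invented examples.
SVOT = (
    {"trade_id": 1, "desk": "rates", "notional": 5_000_000, "currency": "USD"},
    {"trade_id": 2, "desk": "fx",    "notional": 2_000_000, "currency": "EUR"},
    {"trade_id": 3, "desk": "rates", "notional": 7_500_000, "currency": "USD"},
)

def derive_mvot(svot, predicate, enrich):
    """Offensive layer: copy the shared model, then filter and enrich it
    for one line of business. The SVOT itself is never mutated."""
    return [enrich(copy.deepcopy(rec)) for rec in svot if predicate(rec)]

# A rates-desk analyst builds their own "version of the truth".
def add_bucket(rec):
    rec["size_bucket"] = "large" if rec["notional"] >= 5_000_000 else "small"
    return rec

rates_view = derive_mvot(SVOT, lambda r: r["desk"] == "rates", add_bucket)
```

The key property is that the shared model is copied, never mutated, so the defensive SVOT stays intact for regulators while each line of business shapes its own view.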
Benefits of Offensive and Defensive Data Strategies
There are a few key benefits to having multiple versions of truth for your data. First is the
balance of control and flexibility for offensive data initiatives. Controls over the offensive
data are not too strict, but because it is derived from the SVOT, it is not too flexible, either.
The second benefit is the balance of trust and uniqueness in the data. You know the data
has been confirmed and verified, but it can also be transformed into something unique
for business initiatives.
Having an offensive and defensive data strategy is the kernel of the strategic advantage
that can be derived from regulatory compliance. However, institutions need the right
tools to make this idea a reality. That is where analytic data management comes in.
Analytic Data Management for Compliance Architecture
Businesses today need to think beyond regulations and focus on analytic data management.
Analytic data management for compliance architecture gives institutions the ability to
take and use all available data for strategic business uses. There are several key features
of analytic data management architecture:
1. Multiple Pipelines. Institutions need the ability to create pipelines from all available
sources of data. This means incorporating the SVOT data set with all other managed
data so it can be used by data scientists and business analysts.
2. Data Lake. These data pipelines should be run and managed directly inside your
data lake. Moving volumes of data this size costs considerable money and time.
You need the ability to keep and effectively manage all data in one place.
3. Ad-Hoc Exploration. Compliance architecture should support ad-hoc exploration
of the data. Data scientists and business analysts need the ability to explore,
understand, and do “forensics” on the data at will.
4. Recording Metadata. Any information about data usage must be recorded into
enterprise governance catalogs. Often, these catalogs are needed for compliance
reporting processes.
5. Integration and Movement. Institutions need the ability to push data into other tools
or to other locations in the organization.
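The metadata-recording feature above can be illustrated with a toy governance catalog. The event fields below are assumptions made for illustration, not any specific catalog product’s schema:

```python
import time

# A toy governance catalog: every time a dataset is used, the usage event is
# recorded so it can later feed compliance reporting processes.
CATALOG = []

def record_usage(dataset, user, action):
    """Append one usage event to the enterprise governance catalog."""
    CATALOG.append({
        "dataset": dataset,
        "user": user,
        "action": action,       # e.g. "read", "export", "join"
        "ts": time.time(),      # when the usage occurred
    })

record_usage("svot_trades", "analyst_a", "read")
record_usage("svot_trades", "analyst_a", "export")
```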
Core Data Pipelines
The primary pieces of infrastructure in the analytic data management architecture are the
core data pipelines. Core data pipelines feed all data to the SVOT data set and fuel the
compliance process.
But once the data is in one place, line of business teams can use it to drive new
initiatives. These teams can create new pipelines– derived from the SVOT data– that are
specialized to their own needs.
This is how defensive and offensive data strategies work together. Businesses start with
a defensive strategy– gathering all data in one place for compliance– and evolve to a
defensive/offensive strategy where they can use that same data to drive new initiatives
within the business.
Full Data Lifecycle Platform
To meet the needs of regulatory compliance as well as proactive analytic data management,
institutions need a platform that covers the entire lifecycle of their data.
This platform must have a number of critical capabilities. It has to enable automation,
security, and governance while also managing enormous amounts of data. It has to allow
data engineers and business analysts to create their own data pipelines, ranging from
simple integration and blending, to data preparation, to advanced analytic enrichment.
Your team needs the ad hoc exploration capabilities to dig in and understand what the
data is saying, and then the ability to let any downstream tool consume the end data
they’ve developed.
Let’s look more closely at the capabilities needed in a full data lifecycle platform. These
capabilities are best broken up into two broad categories: Functional and Control.
Functional Capabilities
To enable a sustainable regulatory architecture, a data platform must provide key functional
capabilities that liberate your business analysts, giving them the data they need to freely
do their jobs. This also requires a cooperative process between the data engineer and
the business analyst, in which the data engineer creates the SVOT common data sets that
the business analyst then consumes, explores, refines, and uses for their MVoT datasets.
Multi-Source, Large-Scale Aggregation
In a regulatory compliance environment, institutions need to aggregate large volumes of
data that come from multiple sources in multiple formats. Sifting through, classifying, and
normalizing this complex data presents a major challenge. And once normalized, it must
also be continually aggregated at scale.
This requires a platform that can work with complex, multi-variant data formats and
easily blend this data together. It also requires the ability to execute jobs at scale, in the
billions of records, to aggregate the data into a common dataset.
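As a small illustration of multi-source normalization and aggregation, the sketch below blends hypothetical CSV and JSON feeds into one common schema. The feed contents and field names are invented for the example:

```python
import csv
import io
import json

# Hypothetical feeds: the same kind of trade data arrives as CSV from one
# system and JSON from another. Each is normalized into a common schema.
CSV_FEED = "id,amount\nA1,100.0\nA2,250.5\n"
JSON_FEED = '[{"trade": "B1", "amt": 75.25}]'

def normalize_csv(text):
    """Map the CSV feed's columns onto the common schema."""
    return [{"trade_id": r["id"], "amount": float(r["amount"])}
            for r in csv.DictReader(io.StringIO(text))]

def normalize_json(text):
    """Map the JSON feed's fields onto the same common schema."""
    return [{"trade_id": r["trade"], "amount": float(r["amt"])}
            for r in json.loads(text)]

# Blend both sources, then aggregate. The same pattern applies when the job
# runs at billions of records on a distributed engine instead of in memory.
common = normalize_csv(CSV_FEED) + normalize_json(JSON_FEED)
total = sum(r["amount"] for r in common)
```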
Agile Data Modeling
With regulatory rules ever changing, the data models must continually be adjusted to
meet these new rules. An analyst must also be able to see how model adjustments
impact results at every step in the process.
To this end, a platform must have an agile modeling environment that can create pipelines
that are both flexible and powerful. An interactive interface allows analysts to add functions
and change the data model, then immediately see the results. If the results aren’t what
they’re looking for, they can instantly revert back to a previous version and try again.
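The versioned, instantly revertible modeling described above can be sketched as follows. Representing the model as a simple list of named transform steps is an assumption made for illustration, not any particular product’s format:

```python
# A toy "agile modeling" session: each change to the pipeline model creates a
# new version, and the analyst can instantly revert to any earlier one.
class ModelHistory:
    def __init__(self):
        self._versions = [[]]            # version 0: an empty model

    @property
    def current(self):
        return self._versions[-1]        # the latest version of the model

    def add_step(self, step):
        # Append a new version rather than mutating the old one,
        # so every prior state remains available.
        self._versions.append(self.current + [step])

    def revert(self, version):
        # Reverting is itself a new version, preserving the full history.
        self._versions.append(list(self._versions[version]))

h = ModelHistory()
h.add_step("filter: region == 'EU'")     # version 1
h.add_step("aggregate: sum(notional)")   # version 2
h.revert(1)                              # back to just the filter
```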
Advanced Data Curation and Smart Enrichment
Many regulatory compliance reports require more complex, advanced analytics. The
institution may also be expanding to use AI or ML in their regulatory compliance analytics.
Sophisticated analytics require the data platform to support more advanced data curation,
such as organizing and shaping data for time series or windowing analysis. Feature
engineering may also be needed to identify the key attributes to feed AI and ML models.
Smart enrichment is also a required form of advanced curation. This involves running
algorithms across large datasets to identify hidden relationships and patterns within the
data. This enables it to be enriched with classifications or different groupings that would
otherwise not have been identifiable.
Ad-Hoc Data Exploration
When regulatory compliance problems are identified, “forensic” teams need to dig deeper
into the data to discover the source of the compliance issues. But sifting through the
large volumes of data in a regulatory architecture can be a daunting task, impossible to
achieve with everyday discovery tools.
Built-in interactive, visual data exploration enables analysts to dig deep into the data
directly where it lies, eliminating the costly process of moving the data. To be effective
and fast, the visual exploration must provide free-form exploration across any possible
dimension, value, or attribute. To achieve this, a platform needs an interactive, dynamic,
indexing-style system that gives you the ability to explore billions of records while keeping
response times down to mere seconds.
Collaboration and Reuse
Finally, a platform must enable collaboration and reuse. Once a data engineer creates
the SVOT– a common shared model– individual analysts need the ability to produce
derivatives of their own that link back to that model. Then the analyst can add their own
levels of enhancement and enrichment around the model to produce a final data set that
is specific for their particular function and their specific initiative.
Critical Control Capabilities
The other set of key capabilities are best described as “Control” capabilities. Control
capabilities include automated operations, highly scalable execution, and most
importantly, security and governance.
Strong security ensures your data is encrypted, physically in the right place, and used in
the proper ways. You need a regulatory compliance platform that has:
• Role-based security
• Enterprise security integration
• Encryption and obfuscation
• Secure impersonation
You also need data governance capabilities to make sure you understand where and
how the data is used. Critical governance features include:
• Usage and behavior auditing
• Data retention policies
• Full Lineage
• Enterprise governance integration
These capabilities ultimately help you answer the questions: “Where did this data come
from?”; “How was it manipulated along the way?”; “How were the end results produced?”;
and finally, “Where did those end results go?”
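Those four questions map naturally onto a lineage graph. The sketch below is a minimal, hypothetical illustration of recording sources and transforms so that provenance can be walked back on demand; the dataset names are invented:

```python
from dataclasses import dataclass, field

# Each derived dataset remembers where it came from and how it was
# manipulated, which is exactly what the governance questions ask.
@dataclass
class Dataset:
    name: str
    sources: list = field(default_factory=list)     # "Where did it come from?"
    transforms: list = field(default_factory=list)  # "How was it manipulated?"

def lineage(ds):
    """Walk back through the sources to reconstruct full provenance."""
    chain = [ds.name]
    for src in ds.sources:
        chain.extend(lineage(src))
    return chain

raw = Dataset("core_banking_extract")
svot = Dataset("svot", sources=[raw], transforms=["dedupe", "normalize"])
report = Dataset("ccar_report", sources=[svot], transforms=["aggregate"])
```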
Both functional and control capabilities are necessary to unlock the strategic advantage
of regulatory compliance. The calling card of V3 architecture will be the combination of
all the capabilities in the core business.
Datameer
Datameer helps organizations gain the maximum value from their data by creating
secure, scalable and accessible business data pipelines that connect users to the data
they need when they need it. Datameer offers a complete platform for data ingestion,
preparation, enrichment and exploration that simplifies and accelerates the time-
consuming, cumbersome process of turning complex, multi-source data into valuable
business-ready information.
Datameer lets you create, manage and deploy business-driven data pipelines for faster,
smarter insights that drive greater agility and better outcomes. The intuitive visual
interface, dynamic modeling and enterprise grade features enable you to deliver more
business-ready information to fuel your analytics initiatives.
Complete (End-to-end) business data platform
Datameer offers a platform the covers the complete data lifecycle. It complements your
eco-system, working with ANY data source (over 70), in ANY location (on-premise, cloud
or hybrid) and feed ANY destination (data science tools, and data warehouses).
Scalable & Secure Enterprise-grade platform
Datameer offers three critical capabilities to enable industrial-grade data pipelines and
management:
• Scalable execution model and engine for high performance and responsiveness when
exploring large volumes of data
• Role-based security, encryption, integration with enterprise security controls and
other security features ensure the highest degree of data privacy
• Data sharing, lineage tracking, auditing and other governance features facilitate data
stewardship and ensure proper use of data for regulatory compliance
Intuitive, Business-ready Interface
To facilitate agile creation of data pipelines for faster analytics, Datameer offers an
interactive, no-coding approach that includes:
• An interactive spreadsheet-style interface with over 270 powerful functions helps
transform, blend, organize and slice even the most complex data
• Wizard-led auto-discovery of complex data formats and a schema-on-read
architecture enables easy, rapid data modeling
• Visual data profiling at every step in the pipeline enables analysts to see the shape and
distribution of the data to drive the proper end result
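The schema-on-read idea mentioned above can be sketched in a few lines: raw data lands without an upfront model, and a field-level schema is inferred only when the data is read. This is a minimal conceptual sketch with made-up field names, not Datameer's actual implementation.

```python
# Schema-on-read: infer the schema from raw records at query time,
# instead of enforcing one when the data is written.
import json

def infer_schema(records):
    """Union the fields seen across records and note each field's types."""
    schema = {}
    for record in records:
        for name, value in record.items():
            schema.setdefault(name, set()).add(type(value).__name__)
    return schema

# Semi-structured data lands "as is" -- records need not share all fields.
raw = [
    '{"trade_id": 1, "amount": 250.0, "desk": "FX"}',
    '{"trade_id": 2, "amount": 99.5, "desk": "Rates", "counterparty": "ACME"}',
]
records = [json.loads(line) for line in raw]
print(infer_schema(records))
```

A schema-on-write system would have rejected the second record for its extra field; here the optional `counterparty` column simply appears in the inferred schema.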
Easy Discovery & Exploration
A critical aspect of achieving faster time to insight is the ability to explore data at scale
and let the answers reveal themselves. To this end, Datameer provides critical features including:
• Visual exploration at the speed of thought on extremely large datasets, with response
times measured in seconds, speeds discovery
• Unconstrained exploration across any dimension, value and metric without pre-
modeling enables discovery of unknown insights
• Integration with popular BI tools allows analysts to use their familiar tools to dig deep
into the data without moving or copying that data
Business Challenges
In 2016, a leading bank faced a set of converging business challenges. First, they
were staring down regulatory compliance deadlines for BCBS 239. Second, their
market was getting incredibly competitive. At the same time, the bank wanted to grow
internationally and reach new markets across the Americas, Europe, and Asia.
Finally, they faced major demographic changes. There was a new generation of customers,
millennials, who were coming of age and wanted to bank in a different way. They didn’t
want to go to a branch; millennials wanted to bank online and from their smartphones. The
bank had to wrap their arms around the challenge of serving this new breed of customer.
Technical Challenges
Along with these business challenges, the bank also faced extreme technical challenges.
They had to manage tremendous volumes of data, both for their regulatory compliance
and to drive new initiatives within the bank. This data came from a number of different
sources and in a number of different forms; they needed to rationalize and normalize all
of this data together.
The questions the bank wanted to ask were extremely complex and nuanced. The bank
went from tracking 50 attributes to over 500 attributes for specific data sets. At this
point the question became, “Which of these attributes, and ranges of attributes, are
the most important to us?” The bank needed to dig deeper into their data to get the
answers they wanted.
Solution
The bank first focused on regulatory compliance for BCBS 239. The first step was to
create a risk data lake. The bank built out a risk data aggregation and risk reporting
(RDARR) platform on top of their data lake
using Datameer. They aggregated all of their data in one place so they could facilitate
and operationalize their BCBS 239 reporting.
The bank also gave their analysts access to the data lake for ad-hoc exploration. When
analysts identified a problem to solve, they could dig into the data and explore it in
different ways and from various angles. So while the data lake enabled regulatory
compliance, it also enabled a wider variety of uses across the bank.
Customer Case Study
They quickly grew the data lake to support a variety of different use cases. By focusing
on one specific initiative at a time, the bank expanded their data analytics program in a
responsible way. They gave self-service access to business analysts, but did so in a way
where the data was secure, governed, and monitored.
The data itself was trustworthy because the data platform automatically recorded
metadata. This metadata told other users who else had used the data, why it was
published, and how it should be used. Other users could confidently reuse the data if
they had the privileges to do so.
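As a rough illustration, the kind of metadata such a platform might record automatically for a published data set could look like the following. All names and fields here are illustrative assumptions, not Datameer's API.

```python
# Hypothetical metadata record for a governed, published data set:
# captures who published it, why, its lineage, and an access audit trail.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetMetadata:
    name: str
    published_by: str       # who published the data set
    purpose: str            # why it was published
    source_datasets: list   # lineage: upstream data sets it was built from
    allowed_roles: list     # governance: who may reuse it
    access_log: list = field(default_factory=list)  # who else used it

    def record_access(self, user: str, role: str):
        """Audit every read; deny users without a permitted role."""
        if role not in self.allowed_roles:
            raise PermissionError(f"{user} ({role}) may not use {self.name}")
        self.access_log.append((user, datetime.now(timezone.utc)))

meta = DatasetMetadata(
    name="bcbs239_liquidity_view",
    published_by="risk_engineering",
    purpose="BCBS 239 liquidity risk reporting",
    source_datasets=["trades_raw", "counterparty_master"],
    allowed_roles=["risk_analyst", "auditor"],
)
meta.record_access("jdoe", "risk_analyst")  # permitted, and logged
```

Because lineage and the audit trail travel with the data set itself, a downstream analyst can judge fitness for reuse, and an auditor can trace every access, without asking the original publisher.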
Results
The bank implemented their BCBS 239 compliance process in a matter of months. Then
they were able to expand into seven different lines of business over the following months.
Their data lake is now a major piece of the company’s Center of Excellence (CoE). Over
700 analysts use the data lake to create new data sets and solve problems interactively.
The bank has over 1,000 published data sets that are reused across the business, which
has created faster analytic cycles. They’ve reduced the risk reporting times for BCBS
239 and other compliance regulations. They also reduced the overall cost of reporting by
removing manual aspects of the process, thus removing the “risk” from their risk reporting.
Conclusion
The transformation from V1 to V3 regulatory compliance architecture is filled with
confusion, obstacles, and complex technical challenges. The key to overcoming
these hurdles is strong internal mobilization and alignment of your teams. Every
stakeholder, from leadership to front-line employees, needs a clear vision of the
“future state” of your compliance architecture.
Part of that vision is understanding how regulatory architecture can become a strategic
advantage. Once a single source of truth is in place, business analysts can create
new models to serve their specific initiatives without sacrificing strong security
or governance. This type of freedom and power requires a certain set of capabilities
around both functional and operational activities.
As we saw in our case study, the right tools can enable a company to move quickly
from regulatory compliance success to strategic data excellence in a matter of
months. Datameer has helped enterprises around the world scale their regulatory data
operationalization efforts to do just that. To learn more, please visit www.datameer.com.