market data day 23/03/2011 - session 2 - best practice in data management
TRANSCRIPT
Best practice in data management
Walter Gries, Moscow, 23rd March 2011

Data Flow Automation and Complex Data Management
From many data sources to one you can trust?
Regulatory Tsunami
Challenges in Data Processing
Business says
• Very often there is no process or technical infrastructure in place to provide a consolidated source for price and reference data, so this information is maintained manually in different systems. This creates valuation risk between front, middle and back office; the resulting reconciliation work and high degree of inconsistency lead to costs!
• Maintaining static data and applying changes across different systems causes high effort and frequent failures; delivery to downstream systems is often inconsistent
Issues Summary
• High error rates due to manual entry or processing
• Expensive and inefficient data integration effort
• High, non-optimised vendor data costs
• Business exposure: poor quality or availability of data, leading to market risk exposure and often over-provision for capital adequacy
• Increased operational risk: is the right information always available in the right place, at the right time and in the right format?
• Valuation risk and associated problems due to inconsistent sources of pricing and instrument information
• Failed trades due to poor-quality reference data
• Inadequate or difficult-to-measure data quality
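The last point — data quality that is hard to measure — can be made concrete with simple, repeatable metrics. The sketch below (field names, ISINs and rules are invented for illustration, not taken from the presentation) computes completeness and validity rates for a batch of instrument records:

```python
# Minimal sketch of measurable data-quality metrics for reference data.
# All field names, sample records and validation rules are illustrative assumptions.

def completeness(records, required_fields):
    """Share of records in which every required field is populated."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(records)

def validity(records, field, rule):
    """Share of populated values for `field` that satisfy `rule`."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    if not values:
        return 0.0
    return sum(1 for v in values if rule(v)) / len(values)

instruments = [
    {"isin": "DE0001102333", "currency": "EUR", "price": 101.25},
    {"isin": "US912828U816", "currency": "USD", "price": None},   # missing price
    {"isin": "BAD",          "currency": "EUR", "price": 99.80},  # malformed ISIN
]

print(completeness(instruments, ["isin", "currency", "price"]))  # 2 of 3 complete
print(validity(instruments, "isin", lambda v: len(v) == 12))     # 2 of 3 valid
```

Tracking such ratios per feed and per asset class over time turns "poor data quality" from an anecdote into a reportable metric.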
Data Management Maturity

Level 1: Initial
• Little distinction between IT systems and data content
• Data management processes disorganised and performed ad hoc
• Data not viewed by either business or executive management as a priority
• Data is accessible, but not always available
• Data is not secure and not auditable
• Data problems are mostly 'invisible'
• Content management is not an architectural consideration

Level 2: Reactive
• Fundamental data management practices defined and documented
• Governance introduced at local level
• Data policies for creation and change management exist but are not institutionalised
• 'Data as value' is a concept understood by some
• Little organisational buy-in to the importance of an enterprise-wide approach to managing data
• ROI is hard to measure

Level 3: Defined
• Business analysts control the data management process with IT in support
• Data recognised as a business enabler and an enterprise asset
• Executive management appreciates the role of data governance and commits resources
• Data administration complements database administration
• Data present in both business and IT development discussions
• Centre of excellence coordinates work streams

Level 4: Managed
• Data treated as a critical corporate asset
• Unified data governance strategy exists throughout the enterprise, with executive support
• Business users take an active role in data strategy and delivery
• Processes monitored and controlled through data collection and analysis, making them quantitatively predictable
• Business process interaction documented and planning centralised

Level 5: Optimised
• Organisation in continuous improvement mode
• Process enhancement managed through monitored feedback and quantitative assessment of data inconsistencies
• Enterprise-wide business intelligence possible
• Organisation agile enough to respond to evolving business objectives
• Process improvement is everyone's responsibility, with a focus on changing processes that aren't optimal
How to eat an elephant?
Best practice and real-life scenarios in data management
• Set objectives
– Analyse what can and should be changed
– Spend most of the time on scoping and business analysis: review workflows and processes, asset classes, vendors, data needs and duplicate usage, etc.
– Don't look for mere process automation; expect a tool that helps with data management processing
– Define project phases
• Consider resource needs for business analysis, documentation and testing
• Phase the implementation by asset class, department or specific business functionality
Best practice and real-life scenarios in data management
• Aim for
– Clear roles and responsibilities for data ownership, quality and standards
– A golden copy established for data types, with flexible and intelligent data distribution
– Use of industry standards wherever possible
Best practice and real-life scenarios in data management
• Data sourcing
– A defined sourcing strategy for all data assets, covering sourcing (external or internal) and single- or multi-vendor sourcing with defined rules
– Optimisation of external and internal data sources: source and cleanse data only once
– Leveraging synergies between reference data and market data sourcing
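One common form of the "sourcing rules" and golden-copy idea above is a per-attribute vendor precedence waterfall: for each field, take the value from the highest-priority source that supplies it. The vendor names, priorities and sample data below are assumptions for illustration only:

```python
# Hedged sketch of a golden-copy consolidation rule: per attribute, take the
# value from the highest-priority vendor that supplies it.
# Vendor names and the priority order are invented for this example.

VENDOR_PRIORITY = ["vendor_a", "vendor_b", "internal"]

def golden_record(per_vendor):
    """per_vendor: {vendor: {field: value}} -> one consolidated record."""
    merged = {}
    fields = {f for rec in per_vendor.values() for f in rec}
    for field in sorted(fields):
        for vendor in VENDOR_PRIORITY:
            value = per_vendor.get(vendor, {}).get(field)
            if value not in (None, ""):
                merged[field] = value
                break  # first populated value in priority order wins
    return merged

feeds = {
    "vendor_a": {"isin": "DE0001102333", "price": None},
    "vendor_b": {"isin": "DE0001102333", "price": 101.25, "rating": "AAA"},
}
print(golden_record(feeds))  # price and rating fall through to vendor_b
```

In practice the priority order would typically vary by asset class and attribute, which is exactly the "rules defined" part of the sourcing strategy.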
Best practice and real-life scenarios in data management
• Lessons learned
– Design and development of rules is best done as an Agile process in which rules are designed, developed and analysed iteratively
– An integrated team of business experts, analysts, developers and quality assurance staff allows for time-boxed rules development
– Keep enhancements to a minimum for the first implementation
– Leverage the vendor's "best practice" design (e.g. the data model)
– Design the distribution approach early, including use of a downstream operational data store
Regulatory pressure for risk-proof processing
Managing risk is about managing data
Use Case: Requirements for Credit Risk
• Credit risk
– Risk that a loss may be incurred in the event that a counterparty defaults
– Banking book and trading book
– Global regulations: Basel II
• Reporting and monitoring
– Credit risk models and limits
– Regulatory capital reserve to cover risk
– Impacts the volume of business the bank can undertake
– Monthly reporting moving to near real time
– Stress testing and validating the identity of counterparties
• Requirements
– Consistency of client data such as names and ratings
– Accurate cross-referencing of ratings against legal entities
– Organisational structures and relationships
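The consistency and cross-referencing requirements above amount to a reconciliation check: the same legal entity must carry the same rating in every source system. A minimal sketch (the system names, entity identifiers and ratings are made up for illustration):

```python
# Hedged sketch: flag legal entities whose credit rating differs across
# source systems. System names, LEIs and ratings are invented examples.

def rating_mismatches(sources):
    """sources: {system: {lei: rating}} -> {lei: {system: rating}} for conflicts only."""
    by_lei = {}
    for system, ratings in sources.items():
        for lei, rating in ratings.items():
            by_lei.setdefault(lei, {})[system] = rating
    return {lei: seen for lei, seen in by_lei.items()
            if len(set(seen.values())) > 1}

sources = {
    "front_office": {"LEI001": "AA-", "LEI002": "BBB"},
    "risk_system":  {"LEI001": "AA-", "LEI002": "BB+"},
}
print(rating_mismatches(sources))
# {'LEI002': {'front_office': 'BBB', 'risk_system': 'BB+'}}
```

Each flagged entity becomes an exception to be resolved against the golden copy before capital figures are computed from it.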
Use Case: Data Services for Credit Risk
Load vendor data and proprietary data (Trading Systems)
• Closing prices, daily fixings, credit spreads, trades, trading calendars, counterparties and ratings
Normalise and cleanse data (Service of Repository, SoR)
• Pool retail and corporate data by rating groups
Perform analytics to generate derived data (Risk Analytics Engine)
• Retrieve interest rates from the SoR
• Build yield curves and volatility surfaces, interpolate and store in the SoR
• Revalue collateral agreements
Perform scenario generation (Scenario Generation Engine)
• Retrieve prices and holding information from the SoR
• Determine risk factors and store in the SoR
• Perform scenario generation
Perform risk reporting (Risk Management System)
• Retrieve prices, curves, surfaces, trades and T&Cs from the SoR
• Aggregate results for PFE etc. according to regulatory rules
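The pipeline above can be sketched as a chain of steps that read from and write to a shared Service of Repository. The step bodies below are deliberately toy placeholders (the real systems, field names and analytics are far richer); only the orchestration shape is the point:

```python
# Toy orchestration of the credit-risk data-services steps, with a dict
# standing in for the Service of Repository (SoR). All data is illustrative.

sor = {}

def load_vendor_data():
    # Load closing prices, fixings, spreads, trades, counterparties, ratings.
    sor["raw"] = [{"isin": "DE0001102333", "price": 101.25},
                  {"isin": "US912828U816", "price": None}]

def normalise_and_cleanse():
    # Drop records that fail basic cleansing rules (here: missing price).
    sor["clean"] = [r for r in sor["raw"] if r["price"] is not None]

def risk_analytics():
    # Placeholder for curve building / revaluation: store a derived number.
    prices = [r["price"] for r in sor["clean"]]
    sor["derived"] = {"avg_price": sum(prices) / len(prices)}

def report():
    # The Risk Management System reads only derived, cleansed data.
    return sor["derived"]

for step in (load_vendor_data, normalise_and_cleanse, risk_analytics):
    step()
print(report())  # {'avg_price': 101.25}
```

The design point the slide makes is that every engine reads from and writes back to the same repository, so cleansing happens once rather than per consumer.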
Use Case: Data Services for Market Risk
Load vendor data and proprietary data (Trading Systems)
• T&Cs, closing prices, trading calendars, credit spreads
Normalise and cleanse data (Service of Repository, SoR)
Perform analytics to generate derived data (Risk Analytics Engine)
• Retrieve interest rates from the SoR
• Build yield curves, interpolate and store in the SoR
• Build volatility surfaces, interpolate and store in the SoR
Perform scenario generation (Scenario Generation Engine)
• Retrieve OTC, holding and price data from the SoR
• Derive risk factors, perform scenario generation and store in the SoR
Perform market risk analytics generating sensitivities, VaR and stress scenarios (Risk Management System)
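"Build yield curves, interpolate and store" typically starts from a handful of quoted tenor points. A minimal sketch using linear interpolation between tenors (the curve points are invented; production systems use richer bootstrapping and interpolation schemes):

```python
# Sketch of yield-curve interpolation between quoted tenor points.
# The sample EUR curve is invented for illustration; linear interpolation
# is one simple choice among many used in practice.

def interpolate(curve, t):
    """curve: list of (tenor_years, rate) sorted by tenor; t: tenor to look up.
    Flat extrapolation outside the quoted range."""
    if t <= curve[0][0]:
        return curve[0][1]
    if t >= curve[-1][0]:
        return curve[-1][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return r0 + w * (r1 - r0)

eur_curve = [(1.0, 0.010), (2.0, 0.015), (5.0, 0.022)]
print(interpolate(eur_curve, 1.5))  # ~0.0125, halfway between the 1y and 2y rates
```

The interpolated curve would then be stored back in the SoR so that every downstream engine prices against the same numbers.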
Next Generation Data Management
• Data Sources: multi-vendor feeds, in-house applications and repositories
• Feed Handlers: real-time, snapshot, interactive, batch file, in-house systems
• Specialised Storage & Management: normalised data universe and SoR
• Data Management Services: acquisition, validation and normalisation; data quality management; event and exception management; reporting and metrics; distribution and orchestration
• Data Management Modules: Reference, Pricing, Counterparty, Corporate Actions, plus 3rd-party modules (EDM)
• Data Services Framework: Canonical Data Model (CDM) and Application Programmable Interface (API)
• Business Integration: third-party plug-in, application and process integration
• Role-Based User Interfaces: Data Operations, Technical Operations, Systems Administration, Business Analysis (Data Operations Manager, Technical Operations Manager)
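The Canonical Data Model mentioned above is the single normalised shape that every module and vendor feed maps into. A hedged sketch of what one canonical instrument record and a feed-handler mapping might look like (all field and vendor names here are assumptions, not the actual CDM of any product):

```python
# Hedged sketch of a canonical data model entry: one normalised instrument
# record that each feed handler maps its vendor-specific format into.
# Field names and the vendor record layout are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Instrument:
    isin: str
    currency: str
    asset_class: str
    identifiers: dict = field(default_factory=dict)  # vendor-specific ids

def from_vendor_a(raw):
    """Map one (hypothetical) vendor-A record into the canonical shape."""
    return Instrument(
        isin=raw["ISIN"],
        currency=raw["CCY"],
        asset_class=raw["TYPE"].lower(),
        identifiers={"vendor_a": raw["ID"]},
    )

bond = from_vendor_a({"ISIN": "DE0001102333", "CCY": "EUR",
                      "TYPE": "BOND", "ID": "A42"})
print(bond.asset_class)  # bond
```

Because every feed handler targets the same canonical type, downstream services (validation, distribution, the API) are written once against the CDM rather than once per vendor.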
The Ripple Effect
• Once you create a new item, it is immediately available for use in the following areas:
– Forms Designer (UI)
– Client-side JScript
– Workflow Engine (triggers, data, …)
– Advanced Find & Query
– Analytics
– Workflow and Reporting Wizards
– Services (Data/Metadata API)
– Mail Merge
– Data Import, Mapping & Duplicate Detection
Canonical Data Modelling
Key Findings – what the market tells us
Enterprise Data Management is in its early stages of development, but qualifies as the "next big thing"?