IMS and Information Governance – IMS UG May 2013, Dallas
TRANSCRIPT
© 2013 IBM Corporation
IMS
IMS – Data Governance Overview
Dennis Eichelberger, IT Specialist – IMS Advanced Technical Skills, [email protected]
IMS and Data Governance
Why discuss data governance?
What is data governance?
How does IMS implement data governance?
What are today's challenges?
What happens when you’re NOT in control of your business data…
Healthcare – “Dozens of women were told wrongly that their smear test had revealed a separate infection after a hospital error, an independent inquiry has found…. Confusion arose because the hospital decided to use a code number to signify “no infections”, not realizing that it was already in use at the health authority, where it meant “multiple infections”….”
Retail – “Hackers have stolen 4.2 million credit and debit card details from a US supermarket chain by swiping the data during payment authorization transmissions in stores…”
Banking – “A major US bank has lost computer data tapes containing personal information on up to 1.2 million federal employees, including some members of the U.S. Senate….The lost data includes Social Security numbers and account information that could make customers of a federal government charge card program vulnerable to identity theft….”
Banking – “A rogue trader accused of the world’s biggest banking fraud was on the run last night after fake accounts with losses of £3.7 billion were uncovered…. The trader used his inside knowledge of the bank’s control procedures to hack into its computers and erase all traces of his alleged fraud.” Mr Leeson said: “Rogue trading is probably a daily occurrence within the financial markets. What shocked me was the size. I never believed it would get to this degree of loss.”
Public Sector – “Two computer discs holding the personal details of all families in the UK with a child under 16 have gone missing…. The Child Benefit data on them includes name, address, date of birth, National Insurance number and, where relevant, bank details of 25 million people…”
WASHINGTON – “FINRA announced today it has censured and fined a financial services company $370,000 for making hundreds of late disclosures to FINRA’s Central Registration Depository (CRD) of information about its brokers, including customer complaints, regulatory actions and criminal disclosures. Investors, regulators and others rely heavily on the accuracy and completeness of the information in the CRD public reporting system – and, in turn, the integrity of that system depends on timely and accurate reporting by firms.”
…resulting in a broad range of potentially life-threatening consequences
• Incorrect classification → life-threatening consequences
• Ineffective security → brand damage, financial loss
• Physical data loss → identity theft
• Late disclosures, inaccurate data → heavy fines, legal implications
• Physically unprotected data loss → fraud on a massive scale
• Poor internal controls → bankruptcy, financial ruin, penalties
Need to manage the information
What is Data Governance and Information Governance?
• Data:
– Structured
– Unstructured
– Metadata
– Video, audio, multi-media
– Print, email, and archived
– Software code
– Patents, IP
– Protocols, message streams

• Information:
– Data which has been processed and transformed in order to provide insight and answers to business questions

Effective management of data quality needs initiatives which:
• span the whole organisation – not just within the silos
• get to the root of the problem – not just the symptoms
• allocate clear, measurable responsibilities
This is Data Governance

Effective use of business information needs a framework which:
• manages the underlying data assets effectively through Data Governance
• matches the supply of information with its demand from the business
• underpins the business requirements with a solid information architecture
This is Information Governance

Information Governance: ‘The specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival, and deletion of information. It includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.’ ~ Gartner Inc.
As a business, we need to use these terms consistently
Data Governance Creates Order out of Data Chaos
Data Governance is the exercise of decision rights to optimize, secure and leverage data as an enterprise asset.

Governing the creation, management and usage of enterprise data is not an option any longer. It is:
• Expected by your customers
• Demanded by the executives
• Enforced by regulators/auditors

Data Governance:
• Orchestrates people, process and technology toward a common goal
• Promotes collaboration
• Derives maximum value from information
• Leverages information as an enterprise asset to drive opportunities
• Safeguards information and ensures the highest quality
• Manages it throughout its lifecycle
IBM Information Governance approach
A good Information Governance program supports compliance initiatives, reduces cost and minimizes risk to enable sustainable profitable growth.

Validated by the Information Governance Council – top global companies, business partners and industry experts
http://www.infogovcommunity.com/

Applied with a unified process – requirements-driven, aligned with business goals to solve business problems

Deployment accelerated with the Council-built Maturity Model – a framework (disciplines, levels) as a starting point and for prioritizing actions
Information Governance Domains (framework diagram):
• Outcomes: Business Outcomes / Reporting – Data Risk Management & Compliance, Value Creation
• Core Disciplines: Data Quality Management, Information Life-Cycle Management, Information Security and Privacy
• Supporting Disciplines: Data Architecture, Classification & Metadata, Audit Information Logging & Reporting
• Enablers: Organizational Structures & Awareness, Policy, Data Stewardship
The outcomes require and are supported by the core disciplines; the core disciplines in turn require and are supported by the supporting disciplines and enablers. Business Intelligence & Advanced Analytics enhances the framework.
IMS – All You Need in One
A z/OS middleware that inherits all the strengths of zEnterprise

A Messaging & Transaction Manager
– Based on a messaging and queuing paradigm
– High-volume, rapid-response transaction management for application programs accessing IMS and/or DB2 databases and MQ queues
– “Universal” application connectivity for SOA integration
– Integrated with business rules & business events

A Database Manager
– Central point of control and access for the IMS databases
– A hierarchical database model
• Used by companies needing high transaction rates
• Provides database recoverability
– Now provides “universal” database connectivity based on JDBC / DRDA
• Lots of new features in that space! Stay tuned

A Batch Manager
– Standalone z/OS batch support
– Batch processing regions centrally managed by the IMS control region
• Manages the batch-oriented programs, providing checkpoint/restart services
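The checkpoint/restart idea behind the batch manager bullet can be pictured with a small sketch. This is an illustration only, with invented names; a real IMS BMP would issue CHKP/XRST calls rather than manage a position dictionary:

```python
# Minimal sketch of checkpoint/restart batch processing (invented
# names; not the IMS API).

processed = []

def process(rec):
    # Stand-in for real per-record work.
    processed.append(rec)

def run_batch(records, checkpoint, interval=100):
    """Process records from the last committed restart position,
    saving a new position every `interval` records."""
    start = checkpoint.get("position", 0)
    for i in range(start, len(records)):
        process(records[i])
        if (i + 1) % interval == 0:
            checkpoint["position"] = i + 1  # commit a restart point

ckpt = {"position": 0}
run_batch(list(range(250)), ckpt)
print(len(processed), ckpt["position"])  # 250 200
```

After a failure, a rerun with the same checkpoint would resume at the last committed position instead of reprocessing everything from the start.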
Dynamics of an Information Ecosystem … with IMS in perspective
(Diagram: THE INFORMATION SUPPLY CHAIN. Machine data, application data, transactions, social media and content flow into enterprise applications and on to analytic and mobile/cloud applications. Across the chain: manage, integrate & govern; analyze for new insights from big data; reduce the cost of data; trust & protect information; govern quality, security & privacy, lifecycle and standards.)
z/OS Database Manager Positioning
Hierarchical:
– Operational data
– Very high performance
– Real-time, mission-critical work
– Time-sensitive, response-oriented
– Complex data structures with many levels

Relational:
– Tabular data
– Temporal data
– Warehousing
– Complex and/or ad hoc queries
– Decision support

(Example hierarchy from the slide: a CUSTOMER root segment with BILL and COMMAND children, and PRODUCT and ARTICLE segments beneath COMMAND.)

IMS on z/OS: built for performance and recovery. DB2 for z/OS: enhanced for business analytics.
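The hierarchy in the example can be sketched as a small tree. This toy Python model (segment names from the slide, structure assumed from the diagram) shows the top-down, left-to-right order in which hierarchical retrieval visits segments:

```python
# Toy model of a hierarchical (IMS-style) database record.
# Segment names follow the slide's example; this is an illustration,
# not the IMS API.

class Segment:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def hierarchical_order(seg):
    """Yield segment names in top-down, left-to-right (preorder)
    sequence, the order an unqualified sequential retrieval visits
    the record."""
    yield seg.name
    for child in seg.children:
        yield from hierarchical_order(child)

# CUSTOMER root with BILL and COMMAND children; PRODUCT and ARTICLE
# assumed to sit beneath COMMAND, as read from the diagram.
record = Segment("CUSTOMER", [
    Segment("BILL"),
    Segment("COMMAND", [Segment("PRODUCT", [Segment("ARTICLE")])]),
])

print(list(hierarchical_order(record)))
# ['CUSTOMER', 'BILL', 'COMMAND', 'PRODUCT', 'ARTICLE']
```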
IMS 12 on zEC12 provides superlative security, compliance, performance, efficiency, and industrial-strength transaction and database management.

Revolutionize your IMS with zEC12!
• Most secure system with 99.999% reliability
• Optimized data serving with the largest cache in the industry
• Leadership in performance with z/OS using the 5.5 GHz six-way processor chip
• Ability to process terabytes of data quickly
• Millions of transactions per day with sub-second response times
• Faster problem determination with IBM zAware for improved availability
• Java exploitation of Transactional Execution for increased parallelism and scalability
• A 31% improvement to PL/I-based CPU-intensive applications with the new Enterprise PL/I for z/OS and updated C/C++ compilers
• Increased performance through Flash Express paging of large pages via z/OS 1.13
• Additional gains: XML hardware acceleration; streamlined and secured SOA applications with IBM WebSphere DataPower; centrally monitored, controlled and automated operations across heterogeneous environments with IBM Tivoli OMEGAMON
IMS 12 on zEC12 shows up to 30% improvement in transaction rate
IMS DB in Perspective
Native Qualities of Service:
• High capacity: HALDB & DEDB
• High availability: IMS data sharing
• Performance without extra CPU cost: roughly half the MIPS and half the DASD of relational

Application Development:
• Multi-language AD support: COBOL, PL/I, C, …, Java
• XML support: decomposed or intact
• Java SQL support (JDBC): IMS Java
• Access from CICS and IMS applications, and from batch: IMS since the early days

Open Access and Data Integration:
• DRDA Universal Driver: IMS 11+ Open Database

Data Management:
• Metadata management: IMS 12+ Catalog
• Advanced space management capabilities: DFSMS family
• Health check, pointer validation & repair: IMS Tools
• Backup and recovery advanced solutions: IMS Tools
• Reorganization for better performance: IMS Tools

Enterprise Data Governance:
• Compression and encryption: InfoSphere Guardium tools
• Audit for every access: InfoSphere Guardium tools
• Data masking: InfoSphere Optim family
• Creation of test databases: InfoSphere Optim family
• Data growth management: InfoSphere Optim family

Operational Business Analytics & Reporting:
• Cognos & SPSS

Information Integration & Data Synchronization:
• Fast integration in Web 2.0 applications: IMS Open Database
• Data federation: InfoSphere Classic Federation
• Replication to IMS (towards active/active solutions): InfoSphere IMS Replication
• Replication to relational: InfoSphere Classic Replication Server & Classic CDC
• Publication of DB changes: InfoSphere Classic Data Event Publisher
IMS Explorer for Development – View Examples
• Much easier to understand the database structure
• SQL & result sets
• z/OS perspective
IMS Explorer for Development – View Examples
• Multiple logically related databases
• Manufacturing – assembly parts arrival time to the assembly line
InfoSphere Optim solutions – managing data throughout its lifecycle in heterogeneous environments

(Diagram: data moves across Production, Training, Development, Test, Archive and Retire environments, with Discover / Understand / Classify and Subset / Mask / Compare / Refresh capabilities.)

• Discovery: speed understanding and project time through relationship discovery within and across data sources; understand sensitive data to protect and secure it
• Test Data Management: easily refresh & maintain right-sized non-production environments while reducing storage costs; improve application quality and deploy new functionality more quickly
• Data Masking: protect sensitive information from misuse & fraud; prevent data breaches and associated fines
• Data Growth Management: reduce hardware, storage & maintenance costs; streamline application upgrades and improve application performance
• Application Retirement: safely retire legacy & redundant applications while retaining the data; ensure application-independent access to archived data
Managing Test Data in Non-Production – Optim Test Data Management
• Create right-sized test environments, providing support across multiple applications, databases and operating systems
• Deploy new functionality more quickly and with improved quality & customer satisfaction
• Compare results during successive test runs to pinpoint defects and errors
• On z/OS: support for DB2, IMS, VSAM

(Diagram: a 1 TB production database or production clone is subset into 100 GB Development, Test, Training and QA environments.)
http://www-01.ibm.com/software/data/data-management/optim/core/test-data-management-solution-zos
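Subsetting production into right-sized test environments hinges on keeping child rows consistent with the chosen parent rows. A minimal sketch, with invented table and column names (not the Optim product logic):

```python
# Sketch of right-sizing a test environment by subsetting: pick a
# slice of parent rows and propagate the keys so child rows stay
# referentially intact. Table and column names are invented.

def subset(customers, orders, keep_ids):
    """Return the kept customers plus only the orders that
    reference a kept customer."""
    kept_customers = [c for c in customers if c["id"] in keep_ids]
    kept_orders = [o for o in orders if o["customer_id"] in keep_ids]
    return kept_customers, kept_orders

customers = [{"id": i} for i in range(1, 11)]
orders = [{"order": n, "customer_id": (n % 10) + 1} for n in range(30)]

small_cust, small_orders = subset(customers, orders, keep_ids={1, 2})
print(len(small_cust), len(small_orders))  # 2 6
```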
Data Masking and Protection - OPTIM Data Masking Option
Reduce risk of exposure during data theft:
– Fines and lawsuits
– Negative publicity
– Customer loss
– Loss of intellectual property

Personally identifiable information (PII) is masked with realistic but fictional data for testing & development purposes.
http://www-01.ibm.com/software/data/data-management/optim/core/data-privacy-solution-zos/
De-identify for privacy protection
Deploy multiple masking algorithms
Provide consistency across environments and iterations
No value to hackers
Enable off-shore testing
On z/OS: Support for DB2, IMS DB, VSAM
InfoSphere Optim Data Masking Solution / Option
Example 1 – Personal Info Table, before and after masking:
PersNbr FirstName LastName
08054 Alice Bennett → 10000 Jeanne Renoir
19101 Carl Davis → 10001 Claude Monet
27645 Elliot Flynn → 10002 Pablo Picasso

Example 2 – Event Table (masked keys propagated):
PersNbr FstNEvtOwn LstNEvtOwn
27645 Elliot Flynn → 10002 Pablo Picasso
27645 Elliot Flynn → 10002 Pablo Picasso

Data masking techniques include:
• String literal values
• Character substrings
• Random or sequential numbers
• Arithmetic expressions
• Concatenated expressions
• Date aging
• Lookup values
• Generic mask
Referential integrity is maintained with key propagation.

Patient Information:
Patient No. | SSN | Name | Address | City State Zip
112233 | 123-45-6789 | Amanda Winters | 40 Bayberry Drive | Elgin IL 60123
123456 | 333-22-4444 | Erica Schafer | 12 Murray Court | Austin TX 78704

Data is masked with contextually correct data to preserve the integrity of test data:
• Satisfy privacy regulations
• Reduce risk of data breaches
• Maintain value of test data
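Key propagation works when masking is deterministic: the same input always yields the same fictional value, so a masked key matches wherever the row appears. A toy sketch using an invented keyed-hash scheme and lookup lists (not the Optim algorithms):

```python
# Sketch of deterministic data masking: a keyed hash picks a
# fictional replacement, so every occurrence of the same real value
# masks identically across tables ("key propagation"). The secret
# and lookup lists are invented for illustration.
import hashlib
import hmac

SECRET = b"masking-key"  # would be managed securely in practice
FIRST = ["Jeanne", "Claude", "Pablo"]
LAST = ["Renoir", "Monet", "Picasso"]

def mask_name(first, last):
    """Replace a real name with a fictional one chosen by a keyed
    hash of the original value."""
    digest = hmac.new(SECRET, f"{first} {last}".encode(), hashlib.sha256)
    n = int.from_bytes(digest.digest()[:4], "big")
    return FIRST[n % len(FIRST)], LAST[n % len(LAST)]

# The same person masks the same way wherever the row appears:
a = mask_name("Elliot", "Flynn")
b = mask_name("Elliot", "Flynn")
print(a == b)  # True
```

Because the replacement is drawn from fictional lookup lists, the output is realistic for testing but has no value to an attacker.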
Managing Data Growth in Production – OPTIM Data Growth
• Segregate historical data to a secure archive
• Align performance to service-level targets
• Reclaim underutilized capacity
• On z/OS: support for DB2, IMS DB, VSAM

(Diagram: current production data is selectively archived to historical archive data/metadata – reporting data, historical data, reference data – with selective retrieval and universal access to application data via applications, XML, and ODBC/JDBC.)
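Selective archiving amounts to splitting rows around a retention cutoff, keeping current data in production and moving the rest to the archive. A minimal sketch with invented field names:

```python
# Sketch of selective archiving: segregate historical rows from
# current production data by a cutoff date, as in the diagram.
# Field names and the cutoff are invented.
from datetime import date

def selective_archive(rows, cutoff):
    """Split rows into (current, historical) around a cutoff date."""
    current = [r for r in rows if r["closed"] >= cutoff]
    historical = [r for r in rows if r["closed"] < cutoff]
    return current, historical

rows = [
    {"id": 1, "closed": date(2009, 3, 1)},
    {"id": 2, "closed": date(2012, 7, 15)},
    {"id": 3, "closed": date(2013, 1, 2)},
]
current, hist = selective_archive(rows, cutoff=date(2011, 1, 1))
print([r["id"] for r in current], [r["id"] for r in hist])  # [2, 3] [1]
```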
InfoSphere Optim Application Retirement
• Preserve application data in its business context
• Retire out-of-date packaged applications as well as legacy custom applications
• Shut down legacy systems without a replacement

(Diagram: before retirement, each user accesses a separate database of application data; after consolidation, users access archived data through an archive engine.)
Secure & Protect High Value Databases - Guardium Data Encryption
http://www-01.ibm.com/software/data/guardium/
Provides:
• z/OS integrated software support for data encryption
• Operating system software API interface to cryptographic hardware (CEX2C/3C/4C hardware features)
• Enhanced key management for key creation and distribution
– Public and private keys
– Secure and clear keys
– Master keys
• Created keys are stored and accessed in the Cryptographic Key Data Set (CKDS) with a unique key label; the CKDS itself is secured via the System Authorization Facility
Secure & Protect High Value Databases - Guardium Data Encryption
• Non-invasive architecture
• Clear and secure keys
• Hardware-enabled = minimal performance impact
• Supports DES, TDES & AES algorithms
• Supports 56-, 128- & 256-bit encryption
• Installed at the segment level
• No DBMS or application changes
http://www-01.ibm.com/software/data/guardium/

(Diagram: clear text is encrypted to cryptotext.)
Secure & Protect High Value Databases – Guardium Real-Time Database Monitoring
• Non-invasive architecture
• Heterogeneous, cross-DBMS solution (DB2, DB2 for z/OS, IMS, VSAM)
• Does not rely on native DBMS logs
• Minimal performance impact
• No DBMS or application changes
• Activity logs cannot be erased by attackers or rogue DBAs
• Automated compliance reporting, sign-offs & escalations (SOX, PCI, NIST, etc.)
• Granular, real-time policies & auditing
• Single point of monitoring across DBMSs
http://www-01.ibm.com/software/data/guardium/
Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Architecture
Secure & Protect High Value Databases – Guardium Real-Time Database Monitoring: Sample Report
Shown here is an IMS BMP job that ran for 2 minutes. The job TSTCMDDC accessed database AUECCMDD. You can also see the user ID and the PSB used by the job. The IMS Context column lists, in sequence, the calls made to the database.
Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12

Benefits:
– Ad hoc reporting access
– Report on data reflecting the most current state of the business
– React faster to trusted data
– Market-leading BI solution for IMS customers

Roadmap for customers:
– Cognos 10.2 & IMS 11: the IMS 11 JDBC driver is NOT certified with Cognos 10.2, even though the Open Database architecture is available.
– Cognos 10.2 & IMS 12: the IMS 12 JDBC driver with the Catalog is certified with Cognos 10.2.
• New functions that allow enhanced predicates to be exploited by Cognos
• IMS catalog for z/OS centralized metadata management

(Diagram: authors use Cognos report authoring and Framework Manager; consumers receive published reports; Cognos BI reaches the IMS data store and data model through the IMS Universal JDBC driver.)
Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
Available with Cognos 10.2
Technologies are in Place for Mainframe Apps Extensibility
(Diagram: application investment protection. Customers are here today: WAS, MQ, CICS & IMS connectors, data warehouse, XML asset management, SOA, business processes, compliance and service management, with applications in ASM, COBOL, PL/I, C and Java. Technology is in place to go here next: hybrid computing / workload optimization, analytics, business rules, zLinux & zBX.)
Big Data and IMS Databases
IMS integration with the BigInsights application connectors– Merge trusted OLTP data with the Big Data platform
Integrate IMS with the Big Data Machine Data Accelerator (MDA)– Correlate log records from off-platform application servers with IMS log records
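The correlation idea in the MDA bullet can be sketched as a join of two log streams on a shared transaction identifier. Record layouts here are invented for illustration, not actual IMS log record formats:

```python
# Sketch of correlating off-platform application-server log events
# with IMS log records that share a transaction identifier.
# Record layouts are invented.

def correlate(app_logs, ims_logs):
    """Pair each app-server event with the IMS log records that
    carry the same transaction id."""
    by_tran = {}
    for rec in ims_logs:
        by_tran.setdefault(rec["tran_id"], []).append(rec)
    return [(evt, by_tran.get(evt["tran_id"], [])) for evt in app_logs]

app_logs = [{"tran_id": "T1", "server": "was01"},
            {"tran_id": "T2", "server": "was02"}]
ims_logs = [{"tran_id": "T1", "type": "A"},
            {"tran_id": "T1", "type": "B"}]

pairs = correlate(app_logs, ims_logs)
print([(e["tran_id"], len(recs)) for e, recs in pairs])  # [('T1', 2), ('T2', 0)]
```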
Traditional approach (structured, analytical, logical): traditional sources – structured, repeatable, linear – feed the data warehouse: transaction data, internal app data, mainframe data, OLTP system data, IMS operational data.

New approach (creative, holistic thought, intuition): new sources – unstructured, exploratory, iterative – feed Hadoop and Streams: web logs, social data, text data (emails), sensor data (images), RFID.

Enterprise integration connects the two.
An enterprise information hub on a single integrated platform
• Best in OLTP & transactional solutions: industry-recognized leader for mission-critical transaction processing systems (OLTP)
• Best in analytics: industry-recognized leader in business analytics and data warehousing solutions (business analytics, predictive analytics)
• zEnterprise: recognized leader in workload management with proven security, availability and recoverability
• DB2 Analytics Accelerator for z/OS, powered by Netezza: recognized leader in cost-effective, high-speed deep analytics
• Data mart consolidation: unprecedented mixed-workload flexibility and virtualization providing the most options for cost-effective consolidation
• Best in flexibility: ability to start with your most critical business issues, quickly realize business value, and evolve without re-architecting
• Best in batch workload: efficient execution environment close to the data with first-class I/O technology
Why Assess Information Maturity
Information Maturity is typically assessed to provide a snapshot in time of an organisation's ability to manage information, as defined by the maturity model.

This is most often used to benchmark and compare maturity:
• Across time within an organisation
• Between organisations

With the aim of improving the ability to manage information over time:
• Reduce the time needed to access information
• Reduce stored-information complexity
• Lower costs through an optimized infrastructure
• Gain insight through analysis & discovery
• Leverage information for business transformation
• Gain control over master data
• Manage risk and compliance via a single version of truth

This implies that an information maturity assessment should be an ongoing activity.
A Model for Information Maturity: An Evolution for our Clients
(Chart: Business Value of Information vs. Information Management Maturity.)

Level 1 – Data to Run the Business:
• Data: structured content, static
• Integration: disjointed, siloed, non-integrated solutions
• Applications: stand-alone modules; application-dependent
• Infrastructure: monolithic, platform-specific
• Basic reporting & spreadsheet-based; manual, ad hoc dependence; information overload; no version of truth; hindsight-based

Level 2 – Information to Manage the Business:
• Data: structured content; organized
• Integration: some integration; silos still remain
• Applications: component-based applications
• Infrastructure: layered architecture, platform-specific
• Basic search, query, reporting and analytics; some automation; disparate work environments; limited enterprise visibility; multiple versions of the truth

Level 3 – Information as a Strategic Asset:
• Data: standards-based, structured & some unstructured
• Integration: integration of silos; virtualization of information
• Applications: services-based
• Infrastructure: component/emerging SOA, platform-specific
• Introduction of contextual, role-based work environments; enhanced levels of automation; enhancement of existing processes and applications; integrated business performance management; single version of truth; insight through analytics, in real time

Level 4 – Information to Enable Innovation:
• Data: seamless & shared; information separated from process; full integration of structured and unstructured
• Integration: information available as a service
• Applications: process integration via services; in-line business apps
• Infrastructure: resilient SOA; technology-neutral
• Role-based work environments commonplace; fully embedded capabilities within workflow, processes & systems; information-enabled process innovation; enhanced business process & operations management; foresight, predictive analytics

Level 5 – Information as a Competitive Differentiator:
• Data: all relevant internal and external information seamless and shared; additional sources easily added
• Integration: virtualized information services
• Applications: dynamic application assembly
• Infrastructure: dynamically re-configurable; sense & respond
• Flexible, adaptive business environments across enterprise and extraprise; enablement of strategic business innovation; optimization of business performance and operations; strategic insight
Data Governance Workshop Key Steps
1. Conduct interviews of key IT/business leaders and the DG Council
2. Assess Data Governance maturity and target capabilities
3. Identify the gap to the future state (18 months)
4. Develop a roadmap for delivering capabilities
5. Develop recommendations and next steps
Information Maturity Assessment – Gap Summary
Maturity categories:
1. Organizational Structures and Awareness
2. Data Stewardship
3. Policy
4. Value Creation
5. Data Risk Management & Compliance
6. Information Security & Privacy
7. Data Architecture
8. Data Quality Management
9. Classification & Metadata
10. Information Life-Cycle Management
11. Audit Information Logging & Reporting

Maturity scale: Level 1 Initial; Level 2 Managed; Level 3 Defined; Level 4 Quantitatively Managed; Level 5 Optimizing

Scope of services: assess current state; determine future state (in 12-18 months); identify required capabilities and initiatives; capability gap
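The capability gap per domain is simply the target maturity level minus the current level. A small sketch with made-up example scores (not assessment results):

```python
# Sketch of a gap summary: score each governance domain's current
# and target maturity (levels 1-5) and rank the capability gaps.
# The scores below are invented examples.

LEVELS = {1: "Initial", 2: "Managed", 3: "Defined",
          4: "Quantitatively Managed", 5: "Optimizing"}

assessment = {
    "Data Stewardship": (1, 3),          # (current, target)
    "Policy": (2, 4),
    "Data Quality Management": (2, 3),
}

def gap_report(assessment):
    """Return (domain, gap) pairs, largest gaps first."""
    gaps = [(d, target - current)
            for d, (current, target) in assessment.items()]
    return sorted(gaps, key=lambda item: -item[1])

print(gap_report(assessment))
```

Ranking the gaps this way gives a simple first cut at prioritizing which domains the roadmap should address first.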
Implementation Roadmap (2011 – 2012 – …, starting July)

1. Organizational Structures and Awareness
• OSA1: Communication and DG ownership
• OSA2: DGC undertaking critical projects
• OSA3: Establish COE & execution committee
• OSA4: Data stewards across business/IT areas

2. Data Stewardship
• DS1: Stewards clearly identified/defined
• DS2: Pilot program across departments
• DS3: Data steward accountability

3. Policy
• POL1: Policy prioritization
• POL2: Fleshing out policy details
• POL3: Policy communication, enforcement and compliance

4. Value Creation
• VC1: Develop DG scorecard
• VC2: Selective LOB projects using DG
• VC3: Selective cross-LOB projects using DG

5. Data Risk Management & Compliance / 6. Information Security & Privacy
• Pre-assessment internal survey
• Assessment for baseline; establish a data-centric security reference architecture
• Baseline vulnerability assessment
• Data discovery (structured)
• Activity monitoring of current privileged-user access to systems
• Verify that Level 4 has started by comparing governance success with assessment findings for people and process; adjust privileged access rights
• "Sensitive" data policy; document controls in place, mapped to requirements for data security and compliance
• Risk assessment for current controls
• Data-centric security architecture
• Automated activity monitoring
• Establish and mandate a de-identification program for non-production systems (test, QA, development)
• Align perimeter & identity controls with activity monitoring
Pre-Workshop Survey Results - Executive Summary
Next Steps
1. Communication of workshop assessment results
2. Validate the Data Governance plan and objectives
• Align current business and IT initiatives with the IBM workshop assessment
• Prioritize Data Governance initiatives and integrate them with the planned project sequence, for the short term and long term
3. Create a discovery roadmap with prioritized initiatives
4. Implement a Data Governance project management office
• Obtain executive sponsorship
• Define structure and responsibilities, and identify the core team
• Define quality metrics and reporting
5. Conduct detailed workshops / execute prioritized initiatives, e.g. data quality, classification and metadata management
• Adopt metadata-driven Data Governance in IT
• Acquire metadata management, analysis, and quality tools
• Analyze current data quality
• Implement process improvement for data quality
6. Define the metrics to identify how the business realizes returns on investment in the collection, production, and use of data
7. Identify areas where additional consulting would accelerate the timeline
Key Observations and Opportunities
REF 1 – Observation: policy, standards, data definition. Opportunity: efficiency; data integrity.
REF 2 – Observation: organizational awareness and enterprise solutions. Opportunity: communication; SME availability; level of influence.
REF 3 – Observation: value creation process (enterprise and LOB). Opportunity: data and analytics optimization for the business; higher ROI / faster payback.
REF 4 – Observation: metrics – data quality, business impact. Opportunity: monitored data quality (early risk identification); quantified risk.
REF 5 – Observation: organizational effectiveness and data accountability (stewardship). Opportunity: ownership; efficiency; data quality.
REF 6 – Observation: internal data access and sensitive data location/control. Opportunity: risk mitigation & compliance.
REF 7 – Observation: unstructured content. Opportunity: cost avoidance; risk mitigation.
Information Governance – Company’s Assessment results
(Framework diagram, per domain:)
• Core Disciplines: Data Quality Management/Discovery, Information Life-Cycle Management, Information Security and Privacy
• Supporting Disciplines: Data Architecture, Classification & Metadata, Audit Information Logging & Reporting
• Enablers: Organizational Structures & Awareness, Policy, Data Stewardship
• Outcomes: Business Outcomes / Reporting – Data Risk Management & Compliance, Value Creation
Information Governance Maturity Assessment current and target mapping
Assessed current state
Planned future state
Prioritized* Domains with Recommended Action Plan
Prioritization has been made by workshop attendees based on:
• Highest gaps between current and to-be positions
• Evaluation of acceptance capability / feasibility by the organization
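The gap-based prioritization described above can be sketched in a few lines. This is an illustrative Python sketch only; the domain names and 1-to-5 maturity scores are hypothetical, not the workshop's actual assessment data.

```python
# Illustrative sketch: rank governance domains by the gap between the
# assessed current state and the planned future (to-be) state.
# All domain names and scores below are hypothetical.

def prioritize(assessments):
    """Return domains sorted by maturity gap (target - current), largest first."""
    gaps = [(target - current, domain)
            for domain, (current, target) in assessments.items()]
    return [domain for gap, domain in sorted(gaps, reverse=True)]

assessments = {
    "Data Quality Management": (2, 4),
    "Security & Privacy": (3, 4),
    "Metadata Management": (1, 4),
}
print(prioritize(assessments))  # largest gap first
```

In practice a second factor, like the feasibility score mentioned above, would be weighted into the sort key rather than ranking on gap alone.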
© 2013 IBM Corporation40
IMS
IMS and Data Governance
Palisades, New York: May 14-15, 2013
Chicago: May 21-22, 2013
São Paulo, BR: June 3, 2013
Costa Mesa, CA: June 4-5, 2013
Boeblingen, DE: June 5, 2013
Taipei, Taiwan: June 2013
United Kingdom: June 2013
Charlotte, NC: June 25-26, 2013 (tentative)
© 2013 IBM Corporation41
IMS
IMS and Data Governance
Data governance
Regulation compliance
Avoiding media embarrassment
Competitive edge
IMS Enterprise Suite V2.2 Explorer
InfoSphere Optim for Test Data Management
InfoSphere Optim for data and application retirement
InfoSphere Guardium for data protection
Encryption of IMS data
S-TAP monitors
Data Maturity assessment workshop
Information Governance Wildfire Workshops
Data Governance for System z Workshop (DGSYSZ)
Palisades, NY: May 14-15, 2013
Chicago, IL: May 21-22, 2013
Costa Mesa, CA: June 4-5, 2013
This workshop, like all Wildfire Workshops, is offered at no fee to qualified customers.
IBM Advanced Technical Skills Wildfire Workshop
With the complexity of today’s information ecosystems, organizations must improve the level of trust users have in information, ensure consistency of data, and establish safeguards over information. When information is trusted, business can optimize outcomes.
Join us for one and a half days at the IBM Data Governance for System z Workshop. Meet with experts to understand business and IT implications of Data Governance, Real Time Analytics, and Operational Data Warehousing, and learn how the IBM System z platform can help you meet, simplify, and reduce the cost of meeting your data governance requirements.
Workshop Topics:
• Drivers of Information Governance
• Data & Information Governance: What Are They?
• Enablers of Enterprise Data Governance Strategy
  – Policy
  – Data Stewardship
  – Organizational Structure & Awareness
• Pillars of Data Governance
  – Data Quality Management
  – Information Life Cycle
  – Security, Privacy, & Compliance
  – Master Data Management
• Enterprise Data Governance on System z
• Data Architecture on System z
• Role of DB2 for z/OS & IMS in Data Governance
• Operational Analytics & Real Time Analytics on System z
• Data Governance Assessment
Audience: Attendance of this workshop is recommended for Chief Technology Officers, Architects, Data Stewards, IT Management, Owners of Business Analytics, Data Warehouse Owners, Line of Business Application Owners, DBA Management, and Test & Development Management.
Enrollment: To enroll, please work with your IBM sales representative and enroll together by visiting the following website: https://www.ibm.com/servers/eserver/zseries/education/topgun/enrollment/esfldedu.nsf/0/0D284179789982B5852578B8004C07B4?EditDocument
For more information on enrollment or other Wildfire administration questions, contact Judy Vadnais-Keute at [email protected]. For more information on this Data Governance for System z Workshop, contact Peter Kohler at [email protected].
© 2012 IBM Corporation
IMS
Copyright IBM Corp. 2008
© 2013 IBM Corporation
®
IMS
IMS – Data Governance Overview
Dennis Eichelberger
IT Specialist – IMS Advanced Technical Support
[email protected]
© 2013 IBM Corporation2
IMS
IMS and Data Governance
Why discuss data governance?
What is data governance?
How does IMS implement data governance?
What are today’s challenges?
© 2013 IBM Corporation3
IMS
What happens when you’re NOT in control of your business data…
Healthcare – “Dozens of women were told wrongly that their smear test had revealed a separate infection after a hospital error, an independent inquiry has found….Confusion arose because the hospital decided to use a code number to signify “no infections”, not realizing that it was already in use at the health authority where it meant “multiple infections”….
Retail – “Hackers have stolen 4.2 million credit and debit card details from a US supermarket chain by swiping the data during payment authorization transmissions in stores..”
Banking – “A major US bank has lost computer data tapes containing personal information on up to 1.2 million federal employees, including some members of the U.S. Senate….The lost data includes Social Security numbers and account information that could make customers of a federal government charge card program vulnerable to identity theft….”
Banking – “A rogue trader accused of the world’s biggest banking fraud was on the run last night after fake accounts with losses of £3.7 billion were uncovered….The trader used his inside knowledge of the bank’s control procedures to hack into its computers and erase all traces of his alleged fraud. Mr Leeson said ”Rogue trading is probably a daily occurrence within the financial markets. What shocked me was the size. I never believed it would get to this degree of loss.”
Public Sector – “Two computer discs holding the personal details of all families in the UK with a child under 16 have gone missing….The Child Benefit data on them includes name, address, date of birth, National Insurance number and, where relevant, bank details of 25 million people…”
WASHINGTON – “The FINRA announced today it has censured and fined a financial services company $370,000 for making hundreds of late disclosures to FINRA’s Central Registration Depository (CRD) of information about its brokers, including customer complaints, regulatory actions and criminal disclosures. Investors, regulators and others rely heavily on the accuracy and completeness of the information in the CRD public reporting system – and, in turn, the integrity of that system depends on timely and accurate reporting by firms.”
So what I did was select a handful of incidents taken from the press - all public domain - that show what happens when you’re not in control of your data. They range from incorrect classification of data and security breaches to false information and late disclosures. Data governance applies to every industry irrespective of size and geography.
The healthcare item for example - a batch of women were incorrectly told that their cervical smear test had multiple infections - not the kind of letter you want to receive in the post. The error was that the Hospital lab had used a code to signify that a selection of smears had no infections, sent the details to the governing authority who were using the same code but to mean multiple infections. So potentially there could have been numerous patients starting some quite nasty treatments unnecessarily.
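The root cause of the healthcare item above is an ungoverned code value shared between two systems. A toy Python illustration (the code tables and values below are hypothetical, invented only to show the hazard) makes the collision concrete:

```python
# Hypothetical illustration of the reported mix-up: the same code value means
# different things in two systems because there is no shared, governed code table.

HOSPITAL_LAB_CODES = {"0": "no infections"}          # meaning at the hospital lab
HEALTH_AUTHORITY_CODES = {"0": "multiple infections"} # meaning at the authority

result_code = "0"  # sent from the lab without an agreed-upon meaning
print("Lab meant:", HOSPITAL_LAB_CODES[result_code])
print("Authority read:", HEALTH_AUTHORITY_CODES[result_code])
```

A governed metadata catalog gives every code value one agreed definition, which is exactly what would have prevented this letter from ever being sent.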
The one below concerns a bank in Europe where a trader had used his inside knowledge of the back office systems to try to remove any traces of his alleged fraud. Nick Leeson states that rogue trading is probably a daily occurrence, but he couldn’t believe it had reached such a large figure before being noticed.
Up the top a US supermarket had 4.2 million credit card details stolen during the card swiping process.
And below, two discs containing unencrypted details of families entitled to child benefit (any family with a child under 16) were sent in the post to another department but never arrived. I was one of the 25 million people affected and I received a letter stating that our bank details and daughter’s information were on one of the discs that went missing.
At the top there is an example of computer tapes going missing with details of 1.2 million federal employees on them – leaving them exposed to identity theft.
And finally the F.I.N.R.A. has been fining companies for late submission on information about their brokers, complaints, regulatory action, criminal disclosures. They state that “Investors, regulators rely heavily on the accuracy and completeness of the information in the public reporting system – and, in turn, the integrity of that system depends on timely and accurate reporting by firms”
© 2013 IBM Corporation4
IMS
…. Resulting in a broad range of potentially life threatening consequences
Healthcare – Dozens of women were told wrongly that their smear test had revealed a separate infection after a hospital error, an independent inquiry has found….Confusion arose because the hospital decided to use a code number to signify “no infections”, not realizing that it was already in use at the health authority where it meant “multiple infections”….
Retail – Hackers have stolen 4.2 million credit and debit card details from a US supermarket chain by swiping the data during payment authorization transmissions in stores..
Banking – A major US bank has lost computer data tapes containing personal information on up to 1.2 million federal employees, including some members of the U.S. Senate….The lost data includes Social Security numbers and account information that could make customers of a federal government charge card program vulnerable to identity theft….”
Banking – Rogue trader accused of the world’s biggest banking fraud was on the run last night after fake accounts with losses of £3.7 billion were uncovered….The trader used his inside knowledge of the bank’s control procedures to hack into its computers and erase all traces of his alleged fraud. Mr Leeson said ”Rogue trading is probably a daily occurrence within the financial markets. What shocked me was the size. I never believed it would get to this degree of loss.”
Public Sector – Two computer discs holding the personal details of all families in the UK with a child under 16 have gone missing….The Child Benefit data on them includes name, address, date of birth, National Insurance number and, where relevant, bank details of 25 million people…”
WASHINGTON – The FINRA announced today it has censured and fined a financial services company $370,000 for making hundreds of late disclosures to FINRA’s Central Registration Depository (CRD) of information about its brokers, including customer complaints, regulatory actions and criminal disclosures. “Investors, regulators and others rely heavily on the accuracy and completeness of the information in the CRD public reporting system – and, in turn, the integrity of that system depends on timely and accurate reporting by firms.”
Incorrect classification.. Life-threatening consequences
Ineffective security.. Brand damage, financial loss
Physical data loss.. Identity theft
Late disclosures, inaccurate data.. Heavy fines, legal implications
Physical unprotected data loss.. Fraud on a massive scale
Poor internal controls.. Bankruptcy, financial ruin, penalties
Need to manage the information
So moving on, the next slide shows that there are wide ranging consequences
Incorrect classification leading to life threatening consequences
Poor internal controls leading to financial ruin and bankruptcy
Ineffective security can result in brand damage and financial loss
Physical loss of data possibly resulting in potential fraud and identity theft and
Late disclosures leading to fines, legal implications and resignations.
So don’t slip up as it can cost you and your organization lots of money and longer term problems.
© 2012 IBM Corporation
IMS
What is Data Governance and Information Governance?
• Data:
  – Structured
  – Unstructured
  – Metadata
  – Video, Audio, Multi-Media
  – Print, Email, and Archived
  – Software Code
  – Patents, IP
  – Protocols, Message Streams
• Information:
  – Data which has been processed and transformed in order to provide insight and answers to business questions
Effective management of data quality needs initiatives which:
• span the whole organisation – not just within the silos
• get to the root of the problem – not just the symptoms
• allocate clear, measurable responsibilities
This is Data Governance
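"Clear, measurable responsibilities" implies quality you can actually measure. A minimal Python sketch of one such metric, assuming a toy record layout with hypothetical field names:

```python
# Illustrative data-quality metric: field completeness across a set of records.
# The record layout and field names below are hypothetical.

def completeness(records, field):
    """Fraction of records where the given field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

customers = [
    {"name": "Ada", "ssn": "123-45-6789"},
    {"name": "Bob", "ssn": ""},             # missing value drags the score down
    {"name": "Cleo", "ssn": "987-65-4321"},
]
print(round(completeness(customers, "ssn"), 2))  # 0.67
```

A data steward can then be held accountable for keeping such a score above an agreed threshold, which is the "measurable" part of the bullet above.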
Effective use of business information needs a framework which:
• manages the underlying data assets effectively through Data Governance
• matches the supply of information with its demand from the business
• underpins the business requirements with a solid information architecture
Information Governance: ‘The specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archival, and deletion of information. It includes the processes, roles, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals.’ ~ Gartner Inc.
This is Information Governance
As a business, we need to use these terms consistently
© 2013 IBM Corporation
IMS
Data Governance Creates Order out of Data Chaos
Orchestrates people, process and technology toward a common goal
Promotes collaboration
Derive maximum value from information
Data Governance is the exercise of decision rights to optimize, secure and leverage data as an enterprise asset.
Governing the creation, management and usage of enterprise data is not an option any longer. It is:
Expected by your customers
Demanded by the executives
Enforced by regulators/auditors
Leverage information as an enterprise asset to drive opportunities
Safeguard information
Ensure highest quality
Manage it throughout its lifecycle
© 2012 IBM Corporation
IMS
IBM Information Governance approach
A good Information Governance program supports compliance initiatives, reduces cost and minimizes risk to enable sustainable profitable growth.
Validated by the Information Governance Council
Top global companies, business partners and industry experts
http://www.infogovcommunity.com/
Applied with a Unified Process
Requirements driven, aligned with business goals to solve business problems
Accelerates deployment with the Council-built Maturity Model
A framework (disciplines, levels) as a starting point and for prioritizing actions
The IBM Information Governance Council Maturity Model measures information governance competencies of organizations based on the 11 crucial domains of information governance maturity, such as organizational awareness and risk lifecycle management.
Illustrated here are those domains which have been grouped based upon their primary relationships. These groupings are:
• Outcomes• Enablers• Core Disciplines• Supporting Disciplines
For example, consider that quality and security/privacy requirements for data need to be assessed and managed throughout the information lifecycle. Executive-level endorsement and sponsorship is an enabler for stewardship of information that requires standardization across processes and functional boundaries. Consistency in practice can be enabled through stewardship when there are enterprise-level policies and standards in place for information governance disciplines.
These domains or categories of the maturity model are further refined into multiple sub-domains or sub-categories for assessing maturity.
The maturity model is described on the next slide.
SOURCE: “The IBM Information Governance Council Maturity Model: Building a roadmap for effective information governance” white paper, October 2007, http://www-01.ibm.com/software/sw-library/en_US/detail/Z992137B74662E40.html
© 2012 IBM Corporation
IMS
IBM Information Governance approach
© 2013 IBM Corporation11
IMS
IMS – All You Need in One
A z/OS middleware that inherits all the strengths of zEnterprise
A Messaging & Transaction Manager
– Based on a messaging and queuing paradigm
– High-volume, rapid-response transaction management for application programs accessing IMS and/or DB2 databases, MQ queues
– “Universal” application connectivity for SOA integration
– Integrated with Business Rules & Business Events
A Database Manager
– Central point of control and access for the IMS databases
– A hierarchical database model
  • Used by companies needing high transaction rates
  • Provides database recoverability
– Now provides “Universal” database connectivity based on JDBC / DRDA
  • Lots of new features in that space! Stay tuned
A Batch Manager
– Standalone z/OS batch support
– Batch processing regions centrally managed by the IMS control region
  • Managing the batch-oriented programs, providing checkpoint/restart services
Messaging & Transaction Manager
An IMS control program receives a transaction request, stores the transaction on a message queue (in memory or in a shared structure), and then invokes its scheduler to start the business application program in a message processing region.
The message processing region retrieves the transaction from the IMS message queue and processes it, reading and updating resources like IMS Databases, DB2 databases and WebSphere MQ queues ensuring proper management of the transaction scope.
The IMS application itself decides to send a response message back, to start another IMS transaction asynchronously or to access synchronously a set of services.
IMS Batch Manager
A very important advantage of the IMS environment is that it provides an embedded batch container available for both types of configuration, DBCTL or DCCTL.
For batch workload, IMS plays the role of syncpoint manager and provides very important backup/restart capabilities with repositioning of resources at the latest checkpoint. IMS coordinates resource access while protecting data integrity, and allows parallel access between transactional workload and batch workload based on efficient locking mechanisms provided by resource managers.
Batch processing regions are called BMPs for non-Java applications or JBPs for Java batch applications.
For historical reasons, IMS still supports a standalone batch environment that runs in a single address space for batch applications or utilities. This environment is not covered in this document, even though it is still used a lot by customers who did not need parallel processing between online and batch.
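The control-region flow just described (queue the incoming message, schedule a processing region, run the application program against the database) can be caricatured in a few lines. This is a conceptual toy only, not IMS's actual scheduling; the transaction code and account data are invented for illustration.

```python
# Toy simulation of the IMS message flow: a control region queues a
# transaction, then a message processing region dequeues and runs it.
from collections import deque

message_queue = deque()          # stands in for the IMS message queue
database = {"ACCT1": 100}        # stands in for an IMS database segment

def control_region_receive(tran_code, payload):
    """Queue the incoming transaction, then 'schedule' a processing region."""
    message_queue.append((tran_code, payload))
    message_processing_region()

def message_processing_region():
    """Dequeue one transaction and run the matching application program."""
    tran_code, payload = message_queue.popleft()
    if tran_code == "DEPOSIT":   # hypothetical transaction code
        database[payload["acct"]] += payload["amount"]

control_region_receive("DEPOSIT", {"acct": "ACCT1", "amount": 50})
print(database["ACCT1"])  # 150
```

The real value of the control region is everything this toy omits: syncpoint coordination, locking, logging, and checkpoint/restart around that dequeue-and-run step.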
© 2013 IBM Corporation12
IMS
Dynamics of an Information Ecosystem … with IMS in perspective
[Diagram: THE INFORMATION SUPPLY CHAIN. Machine data, application data, transactions, social media, and content are managed, integrated, and governed on their way to enterprise, analytic, and mobile/cloud applications, yielding new insights from big data. Governance spans quality, security & privacy, lifecycle, and standards; the overarching goals are to reduce the cost of data, trust and protect information, and analyze it.]
© 2013 IBM Corporation13
IMS
z/OS Database Manager Positioning
Hierarchical
– Operational data
– Very high performance
– Real-time, mission-critical work
– Time-sensitive, response oriented
– Complex data structures with many levels
Relational
– Tabular data
– Temporal data
– Warehousing
– Complex and/or ad hoc queries
– Decision support
[Diagram: a sample hierarchical database with a CUSTOMER root segment and dependent BILL, COMMAND, PRODUCT, and ARTICLE segments.]
DB2 for z/OSEnhanced for business analytics
IMS on z/OSBuilt for performance and recovery
Database Manager positioning is based on two types of fundamentally different data -- Operational and Informational. Operational data is more application oriented, constantly updated, and must support daily operations. Informational data is subject oriented, non-volatile, and supports decision making. Hierarchical and relational were developed with inherently different characteristics. Hierarchical is more efficient in data access and storage with strict rules for access. Relational makes data access easier when not needing to be defined in advance. Thus each plays a different, critical role - best at what each is designed for. Hierarchical is best for mission-critical work requiring the utmost in performance. Relational is best for decision support where application productivity is required. Both embrace hierarchical XML data for business-to-business data interchange. Both continue to be enhanced to address different application requirements, creating more overlap in their capabilities. IBM continues to invest in both with complementary solutions. How IMS fits into your strategy:
Move operational data to the top bullet; utmost performance; real-time mission-critical work.
All deal with mission-critical info - in the case of IMS it is really fast, sub-second response time for operations.
Relational: warehousing (put data in the warehouse), complex queries (against your warehouse), decision support (to help with your decision support).
One is for data warehousing and one is for operational.
XML:
XML is popular for messaging; XML as data is struggling to make the promise a reality.
Two parts of XML: exchange of messages (metadata is encapsulated in a message), and XML as a data repository structure.
XML: document exchange and storage.
High-level structure vs. unstructured in DB2. IMS is a terrific repository of XML documents. If the XML data is very structured and very hierarchical in nature, then IMS is the better fit; if the XML data is very unstructured and you need DB2’s unstructured indexing, then DB2 would suit you better. We can put multiple XML metadata overlays in IMS - we don’t do anything to change the existing database, and your access with DL/I remains untouched. You can have two views, an XML view and a DL/I view of your database, without changing or restructuring the data.
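The hierarchical model discussed above can be pictured as a segment tree visited in top-down, left-to-right order, loosely like a DL/I "get next" sweep. A toy Python sketch, modeled on the CUSTOMER example on the slide; it is not the DL/I API, just an illustration of the traversal order.

```python
# Sketch of a hierarchical database: segments form a tree, and retrieval
# visits them in hierarchical order (parent first, then children left to
# right), in the spirit of a DL/I "get next" (GN) sweep.
from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    children: list = field(default_factory=list)

def get_next(segment):
    """Yield segment names in hierarchical (top-down, left-to-right) order."""
    yield segment.name
    for child in segment.children:
        yield from get_next(child)

customer = Segment("CUSTOMER", [
    Segment("BILL"),
    Segment("COMMAND", [Segment("PRODUCT", [Segment("ARTICLE")])]),
])
print(list(get_next(customer)))
# ['CUSTOMER', 'BILL', 'COMMAND', 'PRODUCT', 'ARTICLE']
```

The fixed parent-before-child access path is what gives hierarchical databases their strict access rules and their efficiency for the operational workloads described above.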
© 2013 IBM Corporation14
IMS
IMS 12 on zEC12 provides superlative Security, Compliance,
Performance, Efficiency, and Industrial-Strength Transaction and
Database management
Revolutionize your IMS with zEC12!
• Most secure system with 99.999% reliability
• Optimized data serving with the largest cache in the industry
• Leadership in performance with z/OS using the 5.5 GHz 6-way processor chip
• Ability to process terabytes of data quickly
• Millions of transactions per day with sub-second response times
• Faster problem determination with IBM zAware for improved availability
• Java exploitation of Transactional Execution for increased parallelism and scalability
• A 31% improvement to PL/I-based CPU-intensive applications based on the new Enterprise PL/I for z/OS and updated C/C++ compilers
• Increased performance through Flash Express and pageable large pages via z/OS 1.13
Additional gains include: XML hardware acceleration; streamlined and secured SOA applications with IBM WebSphere DataPower; centrally monitored, controlled and automated operations across heterogeneous environments with IBM Tivoli OMEGAMON.
IMS 12 on zEC12 shows up to 30% improvement in transaction rate
Upper left box we consider our “IMS zEC12 Mission Statement”
The box on the right includes the key zEC12-specific value statements that also apply to IMS, as well as the three key features that IMS can/will exploit. Java exploitation of Transactional Execution for increased parallelism and scalability: for IMS Java applications on zEC12, clients will see both enhanced performance and reduced TCO through specialty engine offload capabilities.
A 31% improvement to PL/I-based CPU intensive applications based on NEW Enterprise PL/I for z/OS and Updated C/C++ compilers – For IMS PL/I, C/C++ applications on zEC12, a recompile will lead to increased performance.
Increased Performance through Flash Express and pageable large pages via z/OS 1.13 exploitation – IMS exploits the Flash Express/Pageable Large Pages feature of z/OS 1.13 to improve performance and speed data access.
Lower left box shows additional gains that our clients can see by including specific hardware appliances or software.
And lastly, our current performance improvement metric for IMS 12 on the zEC12. Awesome. 12 on 12 leads to great synergy.
© 2013 IBM Corporation15
IMS
IMS DB in Perspective
Native Quality of Services
– High capacity: HALDB & DEDB
– High availability: IMS Data Sharing
– Performance without extra CPU cost: 1/2 the MIPS and 1/2 the DASD of relational
Application Development
– Multi-language AD support: COBOL, PL/I, C, … Java
– XML support: decomposed or intact
– Java SQL support (JDBC): IMS Java
– Access from CICS and IMS applications, from batch: IMS since the early days
Open Access and Data Integration
– DRDA Universal Driver: IMS 11+ Open Database
Data Management
– Metadata management: IMS 12+ Catalog
– Advanced space management capabilities: DFSMS family
– Health check, pointer validation & repair: IMS Tools
– Backup and recovery advanced solutions: IMS Tools
– Reorganization for better performance: IMS Tools
Enterprise Data Governance
– Compression and encryption: InfoSphere Guardium Tools
– Audit for every access: InfoSphere Guardium Tools
– Data masking: InfoSphere Optim family
– Creation of test databases: InfoSphere Optim family
– Data growth management: InfoSphere Optim family
Operational Business Analytics & Reporting
– Cognos & SPSS
Information Integration & Data Synchronization
– Fast integration in Web 2.0 applications: IMS Open Database
– Data federation: InfoSphere Classic Federation
– Replication to IMS, toward active/active solutions: InfoSphere IMS Replication
– Replication to relational: InfoSphere Classic Replication Server & Classic CDC
– Publication of DB changes: InfoSphere Classic Data Event Publisher
© 2013 IBM Corporation16
IMS
IMS Explorer for Development – View Examples
Much easier to understand the database structure
SQL & result sets
z/OS Perspective
© 2013 IBM Corporation17
IMS
IMS Explorer for Development – View Examples
Multiple Logically related databases
Manufacturing – assembly parts arrival time to assembly line
© 2013 IBM Corporation18
IMS
Easily refresh & maintain right sized non-production environments, while reducing storage costs
Improve application quality and deploy new functionality more quickly
Speed understanding and project time through relationship discovery within and across data sources
Understand sensitive data to protect and secure it
InfoSphere Optim solutions
Managing data throughout its lifecycle in heterogeneous environments
Production
Training
Development
Test
Archive
• Subset • Mask
• Compare • Refresh
Reduce hardware, storage & maintenance costs
Streamline application upgrades and improve application performance
Data Growth Management
Test Data Management
Data Masking Protect sensitive information from misuse & fraud Prevent data breaches and associated fines
Retire
Discover
Understand
Classify
Safely retire legacy & redundant applications while retaining the data
Ensure application-independent access to archive data
Application Retirement
IBM InfoSphere Optim Solutions allows you to manage data through its lifecycle in heterogeneous environments.
You may have a lot of data scattered around the organization – how do you find it? How do you know how it relates to other enterprise data? IBM InfoSphere Optim provides a solution to Discover the data and the relationships as information comes into the enterprise.
You need to develop applications and functionality that can best maintain your data – and you need to effectively test those applications. We provide a solution for DBAs, testers and developers to effectively create and manage right size test data while protecting sensitive test data in development and test environments.
The day-to-day challenges of managing the lifecycle of your data are intensified by the growth of data volumes. IBM InfoSphere Optim provides intelligent archiving techniques so that infrequently accessed data does not impede application performance, while still providing access to that data. IBM InfoSphere Optim provides a Data Growth solution that helps reduce hardware, storage and maintenance costs.
Over time, the applications managing your data will need to be upgraded, consolidated and eventually retired – but not the data. Many organizations today are overburdened with redundant or legacy applications – e.g. as organizations are merged or acquired, so are their IT systems. By leveraging InfoSphere Optim’s Application Retirement solution and archiving best practices you can ensure access to business-critical data for long-term data retention, long after an application’s life expectancy.
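The data-growth idea above (move infrequently accessed data out of production while keeping it accessible) can be sketched as follows. The record fields and cutoff rule are hypothetical; Optim's real archiving is policy-driven and far richer than this toy.

```python
# Illustrative sketch of archiving by access date: records untouched since a
# cutoff move to an archive store that can still be queried independently.
# Field names ("last_access", "total") are invented for this example.

def archive_inactive(production, archive, cutoff):
    """Move records last accessed before the cutoff into the archive."""
    stale = [k for k, rec in production.items() if rec["last_access"] < cutoff]
    for key in stale:
        archive[key] = production.pop(key)

production = {
    "ORD1": {"last_access": 2008, "total": 40},
    "ORD2": {"last_access": 2013, "total": 75},
}
archive = {}
archive_inactive(production, archive, cutoff=2010)
print(sorted(production), sorted(archive))  # ['ORD2'] ['ORD1']
```

The point of the sketch is the invariant, not the mechanics: no record is deleted, it only changes tier, so the archived order remains retrievable for retention or retirement purposes.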
Managing Test Data in Non-Production – OPTIM Test Data Management
Create right-sized test environments, providing support across multiple applications, databases and operating systems
Deploy new functionality quicker and with improved quality & customer satisfaction
Compare results during successive test runs to pinpoint defects and errors
On z/OS: Support for DB2, IMS, VSAM
Diagram: a 1 TB Production database (or Production clone) is subset into right-sized 100 GB Development, Test, Training and QA environments.
http://www-01.ibm.com/software/data/data-management/optim/core/test-data-management-solution-zos
One piece of solution delivery is managing test data: creating non-production environments – Test, Development and Training – with the least amount of data in the shortest amount of time. As you can see in this example, our production database is made up of over 2 terabytes of data. If we replicate it everywhere, we have to pay for software and hardware to support it, and the duplication effort is costly in time and difficulty. Effective Test Data Management lets you build just what is needed – quickly, easily and significantly more cost-effectively.
We begin with a production system or a clone of production. Optim extracts the desired data records, based on user specifications, and safely copies them to a compressed file. It loads the file into the target Development, Test or QA environment. After running tests (for DB2 only), it can compare the results against the baseline data to validate them and identify errors. The database can be refreshed simply by re-inserting the extract file, ensuring consistency.
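The extract step described in these notes – pull a user-specified subset of production records while keeping related rows together – can be sketched in miniature. This is an illustrative sketch only: the table names, columns and Python representation are invented, and Optim itself drives extraction from database definitions, not application code.

```python
# Hypothetical tables: "customers" is the parent, "orders" the child.
def extract_subset(customers, orders, wanted_ids):
    # Keep only the requested parent rows...
    cust_subset = [c for c in customers if c["id"] in wanted_ids]
    kept_ids = {c["id"] for c in cust_subset}
    # ...and only the child rows that reference a kept parent, so the
    # extract remains referentially intact.
    order_subset = [o for o in orders if o["cust_id"] in kept_ids]
    return cust_subset, order_subset

customers = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"},
             {"id": 3, "name": "Cho"}]
orders = [{"order": 10, "cust_id": 1}, {"order": 11, "cust_id": 2},
          {"order": 12, "cust_id": 2}, {"order": 13, "cust_id": 3}]

cust, ords = extract_subset(customers, orders, wanted_ids={2})
```

The point is the invariant, not the code: every child row in the extract has its parent present, which is what makes the right-sized copy usable for testing.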
Data Masking and Protection - OPTIM Data Masking Option
Reduce risk of exposure during data theft:
– Fines and lawsuits
– Negative publicity
– Customer loss
– Loss of intellectual property
Personally identifiable information (PII) is masked with realistic but fictional data for testing & development purposes.
http://www-01.ibm.com/software/data/data-management/optim/core/data-privacy-solution-zos/
De-identify for privacy protection
Deploy multiple masking algorithms
Provide consistency across environments and iterations
No value to hackers
Enable off-shore testing
On z/OS: Support for DB2, IMS DB, VSAM
Data Privacy protects an organization's data in both production and non-production environments with encryption and masking technology. Remember, we don't keep the data from being stolen; instead we render the data unusable and of no value if stolen. This protects the business both financially and from loss of information, and provides IT with a simple-to-use solution that supports a common way of protecting data across the enterprise.
Here is an example of the context-aware data masking that IBM Optim Data Privacy can perform. The real credit card number is transformed to another seemingly valid credit card number that conforms to all the rules for a valid Visa credit card number (e.g. it has 16 digits and starts with 4). Similarly, the first and last name of the actual person "Eugene V. Wheatley" is transformed to the fictional yet valid name "Sanford P. Briggs".
IBM Optim Data Privacy provides the ability to easily perform this type of de-identification of your sensitive data automatically. Most importantly, it can do this data transformation in a way that is appropriate to the context of the application: the results of the transformation have to make sense to the person reviewing the test results. For example, fields containing alphabetic characters should be substituted with other alphabetic characters, in the appropriate pattern. Also, the transformed data must be within the range of permissible values. For example, if your diagnostic codes are four digits long but only range from 0001 to 1000, a masked value of 2000 would not be appropriate. And if an address is needed, you would like to use a street address that actually exists rather than something meaningless like XXXXXX as a street name.
This is called "context aware" data transformation, or data masking, and is a core capability of IBM Optim Data Privacy.
http://www-01.ibm.com/software/data/data-management/optim/core/data-privacy-solution-zos/
IBM Optim Data Privacy comes with a multitude of these built-in masking functions, as well as the ability to define your own transformations. There is no longer a reason to needlessly expose your sensitive data in your test environments ever again.
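The credit-card example above can be sketched to show what "context aware" means in practice: the masked value is fictional but still passes the structural checks a real value would. This is a toy sketch, not Optim's algorithm; the secret key and hashing scheme are invented for illustration.

```python
import hashlib

def luhn_check_digit(digits15: str) -> str:
    # Compute the check digit that makes the full 16-digit number pass
    # the Luhn test, the same test a real card number must pass.
    total = 0
    for i, ch in enumerate(reversed(digits15)):
        d = int(ch)
        if i % 2 == 0:  # these positions are doubled under Luhn's rule
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def mask_card(pan: str, secret: str = "demo-key") -> str:
    # Deterministic masking: the same real PAN always maps to the same
    # fictional PAN, keeping repeated masking runs consistent.
    digest = hashlib.sha256((secret + pan).encode()).hexdigest()
    body = "4" + "".join(str(int(c, 16) % 10) for c in digest[:14])  # keep the Visa "4" prefix
    return body + luhn_check_digit(body)

masked = mask_card("4111111111111111")
```

The masked number has 16 digits, starts with 4, and carries a valid Luhn check digit, so downstream validation logic accepts it even though it identifies no real account.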
InfoSphere Optim Data Masking Solution / Option
Example 1 (before masking):
Personal Info Table – PersNbr, FirstName, LastName: 08054 Alice Bennett; 19101 Carl Davis; 27645 Elliot Flynn
Event Table – PersNbr, FstNEvtOwn, LstNEvtOwn: 27645 Elliot Flynn

Example 2 (after masking):
Personal Info Table: 10000 Jeanne Renoir; 10001 Claude Monet; 10002 Pablo Picasso
Event Table: 10002 Pablo Picasso
Data masking techniques include:
– String literal values
– Character substrings
– Random or sequential numbers
– Arithmetic expressions
– Concatenated expressions
– Date aging
– Lookup values
– Generic mask
Referential integrity is maintained with key propagation
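The key-propagation idea can be sketched with a deterministic lookup: because the same source value always maps to the same replacement, the masked parent and child tables still join. The replacement pools and table layouts below are invented for illustration; this is not Optim's lookup mechanism.

```python
import hashlib

# Hypothetical replacement pools; a real lookup table would be far larger.
FIRST = ["Jeanne", "Claude", "Pablo", "Sanford", "Amanda"]
LAST = ["Renoir", "Monet", "Picasso", "Briggs", "Winters"]

def pick(value, pool):
    # Deterministic lookup masking: the same source value always maps to
    # the same replacement, which is what keeps masked values consistent
    # across tables, environments and test iterations.
    idx = int(hashlib.sha256(value.encode()).hexdigest(), 16) % len(pool)
    return pool[idx]

persons = [{"PersNbr": "27645", "FirstName": "Elliot", "LastName": "Flynn"}]
events = [{"PersNbr": "27645", "FstNEvtOwn": "Elliot", "LstNEvtOwn": "Flynn"}]

# Mask both tables independently; the join still holds afterwards.
for p in persons:
    p["FirstName"] = pick(p["FirstName"], FIRST)
    p["LastName"] = pick(p["LastName"], LAST)
for e in events:
    e["FstNEvtOwn"] = pick(e["FstNEvtOwn"], FIRST)
    e["LstNEvtOwn"] = pick(e["LstNEvtOwn"], LAST)
```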
Patient Information (masked example):
Patient No. | SSN | Name | Address | City, State, Zip
112233 | 123-45-6789 | Amanda Winters | 40 Bayberry Drive | Elgin, IL 60123
123456 | 333-22-4444 | Erica Schafer | 12 Murray Court | Austin, TX 78704
Data is masked with contextually correct data to preserve integrity of test data
– Satisfy privacy regulations
– Reduce risk of data breaches
– Maintain value of test data
Managing Data Growth in Production – OPTIM Data Growth
Segregate historical data to secure archive
Align performance to service level targets
Reclaim underutilized capacity
On z/OS: Support for DB2, IMS DB, VSAM
Diagram: current data stays in Production; historical data is selectively archived (data and metadata) and can be selectively retrieved. Universal access to the archived application data is provided for reporting, historical and reference use via applications, XML and ODBC/JDBC.
Here is how Optim works. This is a typical example of a Production environment prior to archiving: both active and inactive data are stored in the Production environment, taking up most of the space on the production server. Optim safely moves the inactive or historical data to an archive, which can be stored in a variety of environments. Optim provides universal access to this data through multiple methods, including report writers such as Cognos and Crystal Reports, XML, ODBC/JDBC, application-based access (Oracle, Siebel, etc.) or even MS Excel. Finally, data can be easily retrieved to an application environment when additional business processing is required. As you can see, Optim's broad capabilities for enterprise data management give CIOs a comprehensive solution for dealing with data growth.
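The selective archive / selective retrieval flow above can be sketched as a simple partition by activity date. The cutoff, field names and in-memory representation are invented; the real product archives to managed storage tiers, not Python lists.

```python
from datetime import date

def selective_archive(rows, cutoff):
    # Rows untouched since the cutoff move out of the production store...
    active = [r for r in rows if r["last_used"] >= cutoff]
    archive = [r for r in rows if r["last_used"] < cutoff]
    return active, archive

def selective_retrieve(archive, key):
    # ...but remain retrievable when business processing needs them back.
    return [r for r in archive if r["id"] == key]

rows = [{"id": 1, "last_used": date(2012, 1, 5)},
        {"id": 2, "last_used": date(2013, 4, 1)},
        {"id": 3, "last_used": date(2011, 7, 9)}]

active, archive = selective_archive(rows, cutoff=date(2013, 1, 1))
```

The production store now holds only current data, which is the performance win the slide describes; the archive still answers point queries.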
InfoSphere Optim Application Retirement
Preserve application data in its business context
Retire out-of-date packaged applications as well as legacy custom applications
Shut down legacy system without a replacement
Diagram:
Infrastructure before retirement – each user accesses a separate database of application data.
Archived data after consolidation – users access the archived data through an archive engine.
Secure & Protect High Value Databases - Guardium Data Encryption
http://www-01.ibm.com/software/data/guardium/
Provides:
– z/OS integrated software support for data encryption
– Operating system software API interface to cryptographic hardware (CEX2C/3C/4C hardware feature)
– Enhanced key management for key creation and distribution (public and private keys; secure and clear keys; master keys)
– Created keys are stored and accessed in the Cryptographic Key Data Set (CKDS) with a unique key label; the CKDS itself is secured via the System Authorization Facility
zSeries encryption
Using the CEX hardware accelerator provides minimal impact on performance.
Both clear keys and secure keys are supported.
A clear key is not encrypted and can be found in a storage dump; this is not truly acceptable for security purposes. A secure key is encrypted under the system master key while at rest AND will not appear in storage or in a dump of the system.
DES (Data Encryption Standard), TDES (Triple DES) and AES (Advanced Encryption Standard) are all supported encryption algorithms. DES supports 56-bit keys only and is not considered strong by today's standards. TDES and AES support 128-bit keys and are considered acceptable. AES also supports 192- and 256-bit encryption; 256-bit is considered strategic.
An encryption exit is installed in the IMS database definition at the segment level and implemented during an unload/reload of the database.
There are no changes to the application programs accessing the data.
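The shape of a segment-level encryption exit – one symmetric routine applied per segment on store and on retrieve, invisible to applications – can be sketched as follows. This is a toy: the SHA-256-based XOR keystream below stands in for AES only because Python's standard library has no AES, and a real IMS exit would call ICSF hardware crypto services with a secure key.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key. Toy stand-in for
    # AES; real exits use ICSF/CEX hardware crypto, not this.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def segment_exit(segment: bytes, key: bytes) -> bytes:
    # Symmetric: the same call encrypts the segment on the store path
    # and decrypts it on the retrieve path.
    ks = keystream(key, len(segment))
    return bytes(a ^ b for a, b in zip(segment, ks))

seg = b"PATIENT SEGMENT DATA"
enc = segment_exit(seg, key=b"master-key")
dec = segment_exit(enc, key=b"master-key")
```

Because the transformation happens at the segment boundary, the application program sees only clear text on both paths, matching the "no application changes" point above.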
Secure & Protect High Value Databases - Guardium Data Encryption
Non-invasive architecture
Clear and Secure Keys
Hardware enabled = Minimal performance impact
Supports DES, TDES & AES algorithms
Supports 56, 128 & 256 bit encryption
Installed at the segment level
No DBMS or application changes
http://www-01.ibm.com/software/data/guardium/
Diagram: clear text is transformed into cryptotext by zSeries encryption.
Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring
Non-invasive architecture
Heterogeneous, cross-DBMS solution
Does not rely on native DBMS logs
Minimal performance impact
No DBMS or application changes
Activity logs cannot be erased by attackers or rogue DBAs
Automated compliance reporting, sign-offs & escalations (SOX, PCI, NIST, etc.)
Granular, real-time policies & auditing
Single point of monitoring across DBMSs: DB2 & DB2 for z/OS, IMS, VSAM
http://www-01.ibm.com/software/data/guardium/
Let's talk about our solution!
Heterogeneous support for databases and applications.
S-TAP agents: lightweight, cross-platform support with NO changes to databases or applications. They also monitor direct access to databases by privileged users (such as SSH console access), which can't be detected by solutions that only monitor at the switch level.
Collectors handle the heavy lifting (continuous analysis, reporting and storage of audit data), which reduces the impact on the database server.
Our solution does not rely on log or native audit data: DBAs can (and sometimes have to!) turn that off, and logging greatly impacts performance on the database server as you increase granularity.
Real-time alerting – not after the fact. Monitor ALL access.
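The "granular, real-time policies" point can be sketched as a filter over a stream of access events: flag privileged-user access to sensitive objects as it happens, rather than mining native DBMS logs afterwards. The event fields and the policy below are invented; Guardium expresses policies in its own rule language, not Python.

```python
# Hypothetical policy: which users and objects the rule cares about.
PRIVILEGED = {"dbadmin"}
SENSITIVE = {"PATIENT", "PAYROLL"}

def alerts(events):
    # Match each event against the policy as it arrives, so the alert
    # fires in real time and cannot be erased later from a log.
    return [e for e in events
            if e["user"] in PRIVILEGED and e["object"] in SENSITIVE]

stream = [
    {"user": "appsrv1", "object": "ORDERS", "verb": "GU"},
    {"user": "dbadmin", "object": "PAYROLL", "verb": "GHU"},
    {"user": "dbadmin", "object": "ORDERS", "verb": "GN"},
]
hits = alerts(stream)
```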
Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Architecture
Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Sample report
This sample report shows an IMS BMP job that ran for two minutes: job TSTCMDDC accessed database AUECCMDD. The report also shows the user ID and the PSB used by the job, and the IMS Context column lists the database calls made, in sequence.
Secure & Protect High Value Databases - Guardium Real-Time Database Monitoring Sample report
Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
Benefits:
– Ad hoc reporting access
– Report on data reflecting the most current state of the business
– React faster to trusted data
– Market-leading BI solution for IMS customers
Roadmap for customers:
– Cognos 10.2 & IMS V11: the IMS 11 JDBC driver is NOT certified with Cognos 10.2, even though the Open Database architecture is available.
– Cognos 10.2 & IMS V12: the IMS 12 JDBC driver with the IMS catalog is certified with Cognos 10.2.
• New functions allow enhanced predicates to be exploited by Cognos
• The IMS catalog provides centralized metadata management on z/OS
Diagram: a report author uses Cognos Framework Manager to publish a data model; consumers run published reports through Cognos BI, which accesses the IMS data store via the IMS Universal JDBC driver.
Operational Business Analytics on IMS Data – Cognos Reporting with IMS 12
Available with Cognos 10.2
Technologies are in Place for Mainframe Apps Extensibility
Diagram: customers are here today – WAS, MQ, CICS & IMS connectors, Data Warehouse, XML, Asset Mgmt, SOA, Business Processes, Compliance, Service Mgmt. Technology is in place to go here next – hybrid computing, workload optimization, analytics, business rules, zLinux & zBX. Application investment protection spans ASM & COBOL & PL/I & C as well as Java.
Big Data and IMS Databases
IMS integration with the BigInsights application connectors– Merge trusted OLTP data with the Big Data platform
Integrate IMS with the Big Data Machine Data Accelerator (MDA)– Correlate log records from off-platform application servers with IMS log records
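The MDA correlation idea – line up off-platform application server log records with IMS log records for the same unit of work – can be sketched as a join on a shared transaction identifier. The record layouts below are invented for illustration; real IMS log records are typed binary records, not dictionaries.

```python
def correlate(app_logs, ims_logs):
    # Index the IMS records by transaction id once...
    ims_by_txn = {}
    for rec in ims_logs:
        ims_by_txn.setdefault(rec["txn_id"], []).append(rec)
    # ...then pair each application-server record with the IMS records
    # for the same transaction, so one unit of work reads end to end.
    return [(a, ims_by_txn.get(a["txn_id"], [])) for a in app_logs]

app_logs = [{"txn_id": "T1", "server": "was01", "elapsed_ms": 120}]
ims_logs = [{"txn_id": "T1", "type": "x'07'", "region": 3},
            {"txn_id": "T2", "type": "x'07'", "region": 4}]

pairs = correlate(app_logs, ims_logs)
```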
Traditional approach (structured, analytical, logical): traditional sources – structured, repeatable, linear – feed the Data Warehouse: transaction data, internal app data, mainframe data, OLTP system data, IMS operational data.
New approach (creative, holistic thought, intuition): new sources – unstructured, exploratory, iterative – feed Hadoop/Streams: web logs, social data, text data (emails), sensor data (images), RFID.
Enterprise integration connects the two.
An enterprise information hub on a single integrated platform
– Best in OLTP & transactional solutions: industry-recognized leader for mission-critical transactional systems (Transaction Processing Systems, OLTP)
– Best in analytics: industry-recognized leader in Business Analytics and Data Warehousing solutions (Business Analytics, Predictive Analytics)
– zEnterprise: recognized leader in workload management with proven security, availability and recoverability
– DB2 Analytics Accelerator for z/OS, powered by Netezza: recognized leader in cost-effective, high-speed deep analytics
– Data mart consolidation: unprecedented mixed-workload flexibility and virtualization, providing the most options for cost-effective consolidation
– Best in flexibility: ability to start with your most critical business issues, quickly realize business value, and evolve without re-architecting
– Best in batch workload: efficient execution environment close to the data with first-class I/O technology
Why Assess Information Maturity
Information Maturity is typically assessed to provide a snapshot in time of an organisation's ability to manage information, as defined by the maturity model
This is most often used to benchmark and compare maturity• Across time within an organisation• Between organisations
With the aim to improve the ability to manage information over time• Reduce the time needed to access information• Reduce stored information complexity• Lower costs through an optimized infrastructure• Gain insight through analysis & discovery• Leverage information for business transformation• Gain control over master data• Manage risk and compliance via a single version of truth
This implies that an information maturity assessment should be an ongoing activity
A Model for Information Maturity: An Evolution for our Clients
(Chart axes: Business Value of Information vs. Information Management Maturity)

Level 5 – Information as a Competitive Differentiator
• Data: All relevant internal and external information seamless and shared; additional sources easily added
• Integration: Virtualized information services
• Applications: Dynamic application assembly
• Infrastructure: Dynamically re-configurable; sense & respond
• Flexible, adaptive business environments across enterprise and extraprise
• Enablement of strategic business innovation
• Optimization of business performance and operations
• Strategic insight

Level 4 – Information to Enable Innovation
• Data: Seamless & shared; information separated from process; full integration of structured and unstructured
• Integration: Information available as a service
• Applications: Process integration via services; in-line business apps
• Infrastructure: Resilient SOA; technology neutral
• Role-based work environments commonplace
• Fully embedded capabilities within workflow, processes & systems
• Information-enabled process innovation
• Enhanced business process & operations management
• Foresight, predictive analytics

Level 3 – Information as a Strategic Asset
• Data: Standards-based, structured & some unstructured
• Integration: Integration of silos; virtualization of information
• Applications: Services-based
• Infrastructure: Component/emerging SOA, platform specific
• Introduction of contextual, role-based work environments
• Enhanced levels of automation
• Enhancement of existing processes and applications
• Integrated business performance management
• Single version of truth
• Insight through analytics, real-time

Level 2 – Information to Manage the Business
• Data: Structured content; organized
• Integration: Some integration; silos still remain
• Applications: Component-based applications
• Infrastructure: Layered architecture, platform specific
• Basic search, query, reporting and analytics
• Some automation
• Disparate work environments
• Limited enterprise visibility
• Multiple versions of the truth

Level 1 – Data to Run the Business
• Data: Structured content, static
• Integration: Disjointed, siloed, non-integrated solutions
• Applications: Stand-alone modules; application-dependent
• Infrastructure: Monolithic, platform specific
• Basic reporting & spreadsheet-based
• Manual, ad hoc dependence
• Information overload
• No version of truth
• Hindsight-based
Data Governance Workshop Key Steps:
– Conduct interviews of key IT/business leaders and the DG Council
– Assess Data Governance maturity and target capabilities
– Identify the gap to the future state (18 months)
– Develop a roadmap for delivering capabilities
– Develop recommendations
– Next steps
Information Maturity Assessment – Gap Summary
Maturity categories:
1. Organizational Structures and Awareness
2. Data Stewardship
3. Policy
4. Value Creation
5. Data Risk Management & Compliance
6. Information Security & Privacy
7. Data Architecture
8. Data Quality Management
9. Classification & Metadata
10. Information Life-Cycle Management
11. Audit Information Logging & Reporting
Maturity scale: Level 1 – Initial; Level 2 – Managed; Level 3 – Defined; Level 4 – Quantitatively Managed; Level 5 – Optimizing
Scope of services: assess current state; determine future state (in 12-18 months); identify required capabilities and initiatives; capability gap
Implementation Roadmap (2011-2012, starting July)
1. Organizational Structures and Awareness
– OSA1: Communication and DG ownership
– OSA2: DGC undertaking critical projects
– OSA3: Establish COE & execution committee
– OSA4: Data stewards across business/IT areas
2. Data Stewardship
– DS1: Stewards clearly identified/defined
– DS2: Pilot program across departments
– DS3: Data steward accountability
3. Policy
– POL1: Policy prioritization
– POL2: Fleshing out policy details
– POL3: Policy communication, enforcement and compliance
4. Value Creation
– VC1: Develop DG scorecard
– VC2: Selective LOB projects using DG
– VC3: Selective cross-LOB projects using DG
5.-6. Data Risk Management & Compliance / Information Security & Privacy
– Assessment for baseline; establish a Data Centric Security Reference Architecture
– Baseline vulnerability assessment
– Data discovery (structured)
– Activity monitoring of current privileged user access to systems; later, automated activity monitoring
– Verify that Level 4 has started by comparing governance success with assessment findings for people and process; adjust privileged access rights
– "Sensitive" data policy: document controls in place mapped to requirements for data security and compliance
– Risk assessment for current controls
– Establish and mandate a de-identification program for non-production systems (Test, QA, Dev)
– Align perimeter & identity controls with activity monitoring
– Pre-assessment internal survey
Pre-Workshop Survey Results - Executive Summary
Next Steps
1. Communication of workshop assessment results
2. Validate Data Governance plan and objectives
– Align current business and IT initiatives with the IBM workshop assessment
– Prioritize Data Governance initiatives and integrate with the planned project sequence, for short term and long term
3. Create a Discovery Roadmap with prioritized initiatives
4. Implement a Data Governance Project Management Office
– Obtain executive sponsorship
– Define structure and responsibilities, and identify the core team
– Define quality metrics and reporting
5. Conduct detailed workshops / execution of prioritized initiatives, e.g. Data Quality, Classification and Metadata Management
– Adopt metadata-driven Data Governance in IT
– Acquire metadata management, analysis, and quality tools
– Analyze current data quality
– Implement process improvement for data quality
6. Define the metrics to identify how the business realizes returns on investment in the collection, production, and use of data.
7. Identify areas where additional consulting would accelerate the timeline.
Key Observations and Opportunities
REF | Observation | Opportunity
1 | Policy, standards, data definition | Efficiency; data integrity
2 | Organizational awareness and enterprise solutions | Communication; SME availability; level of influence
3 | Value creation process (enterprise and LOB) | Data and analytics optimization for the business; higher ROI / faster payback
4 | Metrics – data quality, business impact | Monitored data quality (early risk identification); quantified risk
5 | Organizational effectiveness and data accountability (stewardship) | Ownership; efficiency; data quality
6 | Internal data access and sensitive data location/control | Risk mitigation & compliance
7 | Unstructured content | Cost avoidance; risk mitigation
Information Governance – Company's Assessment results
Business Outcomes / Reporting: Value Creation; Data Risk Management & Compliance
Core Disciplines: Data Quality Management/Discovery; Information Life-Cycle Management; Information Security and Privacy
Supporting Disciplines: Data Architecture; Classification & Metadata; Audit Information Logging & Reporting
Enablers: Organizational Structures & Awareness; Policy; Data Stewardship
(In the framework diagram, the enablers are required by, support, and enhance the disciplines and outcomes.)
Information Governance Maturity Assessment current and target mapping
Assessed current state
Planned future state
Prioritized* Domains with Recommended Action Plan
Prioritization was made by workshop attendees based on:
• Highest gaps between current and to-be positions
• Evaluation of acceptance capability / feasibility by the organization
IMS and Data Governance
Palisades, New York: May 14-15, 2013
Chicago: May 21-22, 2013
São Paulo, BR: June 3, 2013
Costa Mesa, CA: June 4-5, 2013
Boeblingen, DE: June 5, 2013
Taipei, Taiwan: June 2013
United Kingdom: June 2013
Charlotte, NC: June 25-26, 2013 (tentative)
Messaging & Transaction Manager
An IMS control program receives a transaction request, stores the transaction on a message queue (in memory or in a shared structure), and then invokes its scheduler to start the business application program in a message processing region.
The message processing region retrieves the transaction from the IMS message queue and processes it, reading and updating resources such as IMS databases, DB2 databases and WebSphere MQ queues while ensuring proper management of the transaction scope.
The IMS application itself decides whether to send a response message back, to start another IMS transaction asynchronously, or to access a set of services synchronously.
IMS Batch Manager
A very important advantage of the IMS environment is that it provides an embedded batch container, available for both the DBCTL and DCCTL configurations.
For batch workload, IMS plays the role of syncpoint manager and provides very important backup/restart capabilities, with repositioning of resources at the latest checkpoint. IMS coordinates resource access while protecting data integrity, and allows parallel access between transactional and batch workloads based on efficient locking mechanisms provided by the resource managers.
Batch processing regions are called BMPs for non-Java applications and JBPs for Java batch applications.
For historical reasons, IMS still supports a standalone batch environment that runs in a single address space for batch applications or utilities. This environment is not covered in this document, even though it is still used heavily by customers who have not needed parallel processing between online and batch.
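The queue-then-schedule flow in the notes above can be sketched in miniature: the control program enqueues the incoming transaction, and a message processing region dequeues it and dispatches the application program by transaction code. All names here are illustrative Python stand-ins, not IMS APIs.

```python
from collections import deque

# Toy stand-in for the IMS message queue.
message_queue = deque()

def control_program(txn):
    # Receive the transaction request and store it on the message queue.
    message_queue.append(txn)

def message_processing_region(handlers):
    # Retrieve the queued transaction; the "scheduler" picks the business
    # application program by transaction code and runs it.
    txn = message_queue.popleft()
    return handlers[txn["trancode"]](txn)

handlers = {"DEPOSIT": lambda t: {"reply": "credited " + str(t["amount"])}}
control_program({"trancode": "DEPOSIT", "amount": 100})
reply = message_processing_region(handlers)
```

Decoupling receipt from processing through the queue is what lets IMS scale message processing regions independently of transaction arrival.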
IMS and Data Governance
Data governance drivers: regulation compliance; avoiding media embarrassment; competitive edge
IMS Enterprise Suite V2.2 Explorer
InfoSphere Optim for Test Data Management
InfoSphere Optim for data and application retirement
InfoSphere Guardium for data protection
– Encryption of IMS data
– S-TAP monitors
Data Maturity assessment workshop
Information Governance Wildfire Workshops
Data Governance for System z Workshop (DGSYSZ)
Palisades, NY: May 14-15, 2013
Chicago, IL: May 21-22, 2013
Costa Mesa, CA: June 4-5, 2013
This workshop, like all Wildfire Workshops, is offered at no fee to qualified customers.
IBM Advanced Technical Skills Wildfire Workshop
With the complexity of today's information ecosystems, organizations must improve the level of trust users have in information, ensure consistency of data, and establish safeguards over information. When information is trusted, business can optimize outcomes. Join us for one and a half days at the IBM Data Governance for System z Workshop. Meet with experts to understand the business and IT implications of Data Governance, Real-Time Analytics, and Operational Data Warehousing, and learn how the IBM System z platform can help you meet, simplify, and reduce the cost of meeting your data governance requirements.
Workshop Topics:
• Drivers of Information Governance
• Data & Information Governance: What Are They?
• Enablers of Enterprise Data Governance Strategy
− Policy
− Data Stewardship
− Organizational Structure & Awareness
• Pillars of Data Governance
− Data Quality Management
− Information Life Cycle
− Security, Privacy, & Compliance
− Master Data Management
• Enterprise Data Governance on System z
• Data Architecture on System z
• Role of DB2 for z/OS & IMS in Data Governance
• Operational Analytics & Real-Time Analytics on System z
• Data Governance Assessment
Audience: Attendance is recommended for Chief Technology Officers, Architects, Data Stewards, IT Management, Owners of Business Analytics, Data Warehouse Owners, Line of Business Application Owners, DBA Management, and Test & Development Management.
Enrollment: To enroll, please work with your IBM sales representative and enroll together by visiting the following website: https://www.ibm.com/servers/eserver/zseries/education/topgun/enrollment/esfldedu.nsf/0/0D284179789982B5852578B8004C07B4?EditDocument
For more information on enrollment or for other Wildfire administration questions, contact Judy Vadnais-Keute at [email protected]; for more information on this Data Governance for System z Workshop, please contact Peter Kohler at [email protected].
IBM Smart Analytics System 9600