
Page 1: MetaVista Proposal 10-014-ITS Final

A Proposal For

California Prison Health Care Services Information Technology Services Division

PO Box 4038 Sacramento, CA 95812-4038

June 7, 2010

Submitted By

MetaVista Consulting Group 2411 15th Street, Suite A

Sacramento, CA 95818-2264 Main Office: (916) 296-2290

Fax: (916) 443-4830

Information Technology Consulting Services Enterprise Data Warehouse Services

RFO # 10-014-ITS


MetaVista Consulting Group • www.metavista.com • (916) 296-2290

2411 15th Street • Suite A • Sacramento • California • 95818-2264

June 7, 2010

Cynthia Basa-Pinzon
California Prison Health Care Services
P.O. Box 4038
Sacramento, CA 95812-4038

Re: Request for Offer (RFO) 10-014-ITS

Dear Ms. Basa-Pinzon:

The MetaVista Consulting Group is pleased to submit this Proposal in response to RFO 10-014-ITS for Information Technology Consulting Services: Enterprise Data Warehouse Services, dated May 13, 2010.

We understand the importance of creating an enterprise data warehouse for both CPHCS and CDCR in general. The Receivership and its many initiatives have high visibility within and outside of California state government, and a major underlying objective is to achieve acceptable and sustainable levels of inmate-patient medical care.

Our proposed team has exceptional qualifications for performing the work described in the RFO; the team includes two senior data warehouse / business intelligence consultants, supported by staff with a broad range of skills spanning IT project management, GUI development, business analysis, and infrastructure implementation. We are confident that our team and our approach will meet or exceed CDCR’s requirements, and that our solution will provide the best overall value to the State.

We at MetaVista appreciate very much the opportunity to assist CDCR with this project, and we look forward to contributing to its success. If you have any questions about our proposal, please contact me at 916-549-1800 or by email at [email protected]. Or, contact Jay Jackson at 916-295-6074 or by email at [email protected].

Sincerely,

Charles A. Ritchie, PMP
CIO and Principal Consultant
MetaVista Consulting Group

The above signed is authorized to bind MetaVista contractually. He holds the title and position of Chief Information Officer.


Formal Offer for RFO #10-014-ITS Enterprise Data Warehouse Services


Table of Contents

Table of Contents ................................................................ ii
MetaVista MSA Contract ............................................................ 1
Proof of Workers Compensation Insurance ........................................... 7
Completed Rate Sheet (Exhibit B-1) ................................................ 8
Offeror Declaration Form (GSPD-05-105), Attachment A ............................. 10
Small Business Certification ..................................................... 11
Payee Data Record (STD 204), Attachment B ........................................ 12
MetaVista’s Data Warehouse Methodology ........................................... 13
    Data Warehouse Project Steps ................................................. 14
    An Introduction to Next Generation Enterprise Data Warehouses ................ 18
    Reference Methodologies, Processes and Standards ............................. 20
Approach for Completing Requested Services ....................................... 23
    Introduction ................................................................. 23
    Response to Scope of Services ................................................ 23
    Specific Deliverables ........................................................ 24
        1. Project Management Plan ............................................... 24
        2. Reporting Requirements and Custom Reports ............................. 26
        3. Performance Metrics – Key Performance Indicators ...................... 29
        4. Strategy Validation ................................................... 31
        5. Solution Design ....................................................... 32
        6. Technical Roadmap ..................................................... 40
        7. Information/Data Architecture, Policies, and Standards ................ 42
        8. Data Security Architecture, Policies, and Standards ................... 44
        9. Resource and Transition Plans ......................................... 44
        10. Business and Technical Implementation Plan ........................... 45
        11. Data Cleansing, Extraction, Transformation, and Loading Plan ......... 47
        12. Data Warehouse Governance Plan ....................................... 48
        13. Disaster Recovery Plan ............................................... 50
        14. Service Level Agreement(s) ........................................... 51
        15. Training and Knowledge Transfer ...................................... 51
        16. Other Data Warehouse, Business Intelligence, and Reporting Needs ..... 52
        17. Monthly Status Reports ............................................... 53
Assumptions ...................................................................... 54
MetaVista’s Expertise and Experience ............................................. 55
    Representative Projects ...................................................... 55
Customer References .............................................................. 57
    Reference 1: Access to Care Project (CPHCS) .................................. 57
    Reference 2: IT Consulting Services for FASIMS (CDFA) ........................ 59
    Reference 3: Configuration Data Service (Wells Fargo Bank) ................... 61
Consultant Qualifications ........................................................ 63
    The MetaVista Team ........................................................... 63
        Team Member Introductions ................................................ 63
        Team Member Roles ........................................................ 65
        Mandatory Qualifications ................................................. 66
        Desirable Qualifications ................................................. 68
Resumes .......................................................................... 70
    Charles A. Ritchie, PMP, MCTS ................................................ 70
    Alexander Doré ............................................................... 76
    Suresh Chellappa, PMP ........................................................ 84
    Nipesh Shah .................................................................. 91
    Errol Thomas, PMP, CBAP ...................................................... 97
IT Certifications ............................................................... 102


Page 1 of 107 MetaVista Consulting Group Updated: 2010-06-07

MetaVista MSA Contract Note: MetaVista’s MSA contract was originally issued under the name Public Sector Consultants, and was modified in 2009 to reflect our current company name.


Liability Insurance Certificate


Proof of Workers Compensation Insurance


Completed Rate Sheet (Exhibit B-1)

Service Description / Deliverable   Personnel Classification(1)   Est. Hours   Deliverable Price
---------------------------------   ---------------------------   ----------   -----------------
Del 1  - Plans                      Sr. Project Manager,               462.5         $48,370
                                    Sr. Technical Lead,
                                    Sr. Programmer,
                                    Technical Lead
Del 2  - Reqs & Rpts for P1         See Deliverable 1                  252.0         $26,355
Del 3  - Reqs & Rpts for P2         See Deliverable 1                  180.0         $18,825
Del 4  - Reqs & Rpts for P3         See Deliverable 1                  180.0         $18,825
Del 5  - Perf Metrics P1            See Deliverable 1                  296.0         $30,957
Del 6  - Perf Metrics P2            See Deliverable 1                  223.0         $23,322
Del 7  - Perf Metrics P3            See Deliverable 1                  223.0         $23,322
Del 8  - Gap Analyses               See Deliverable 1                  830.0         $86,804
Del 9  - Portal Design              See Deliverable 1                  390.0         $40,788
Del 10 - Reporting Design           See Deliverable 1                  260.0         $27,192
Del 11 - BI Design                  See Deliverable 1                1,007.5        $105,368
Del 12 - Data Marts Design          See Deliverable 1                1,040.0        $108,767
Del 13 - Enterprise DW Design       See Deliverable 1                1,040.0        $108,767
Del 14 - Data Design                See Deliverable 1                  650.0         $67,979
Del 15 - Software Design            See Deliverable 1                1,052.5        $110,074
Del 16 - Hardware Design            See Deliverable 1                  617.5         $64,580
Del 17 - Tech Roadmap P1            See Deliverable 1                  100.0         $10,458
Del 18 - Tech Roadmap P2            See Deliverable 1                  100.0         $10,458
Del 19 - Tech Roadmap P3            See Deliverable 1                  100.0         $10,458
Del 20 - Info / Data Arch           See Deliverable 1                  200.0         $20,917
Del 21 - Info / Data Policies       See Deliverable 1                  200.0         $20,917
Del 22 - Info / Data Stds           See Deliverable 1                  200.0         $20,917
Del 23 - Data Security Arch         See Deliverable 1                  160.0         $16,733
Del 24 - Data Security Prac         See Deliverable 1                   40.0          $4,183
Del 25 - Staffing Res Plan          See Deliverable 1                  200.0         $20,917
Del 26 - Transition Plan            See Deliverable 1                  400.0         $41,833
Del 27 - Implementation Plans       See Deliverable 1                  970.0        $101,446
Del 28 - Data Related Plans         See Deliverable 1                  480.0         $50,200
Del 29 - DW Governance Plan         See Deliverable 1                  120.0         $12,550
Del 30 - DW Solution DR Plan        See Deliverable 1                   60.0          $6,275
Del 31 - CDCR / CPHCS SLA           See Deliverable 1                   60.0          $6,275
Del 32 - EIS / OTech SLA            See Deliverable 1                   60.0          $6,275
Del 33 - Training / KT Plans        See Deliverable 1                   60.0          $6,275
Del 34 - End User Training P1       See Deliverable 1                   60.0         $10,458
Del 35 - End User Training P2       See Deliverable 1                   60.0         $10,458
Del 36 - End User Training P3       See Deliverable 1                   60.0         $10,458
Del 37 - Other Needs, Item 1        See Deliverable 1                   50.0          $5,229
Del 38 - Other Needs, Item 2        See Deliverable 1                   50.0          $5,229
Del 39 - Other Needs, Item 3        See Deliverable 1                   50.0          $5,229
Del 40 - Other Needs, Item 4        See Deliverable 1                   50.0          $5,229
Del 41 - Other Needs, Item 5        See Deliverable 1                   50.0          $5,229
Del 42 - Monthly Status Reports     See Deliverable 1                  800.0         $83,667
---------------------------------   ---------------------------   ----------   -----------------
TOTALS                                                               13,564      $1,418,568
Other Itemized Costs (if allowed)                                                        $0
Total Costs                                                                      $1,418,568

(1) MetaVista anticipates that more than one team member will participate in the development and/or review of virtually all project deliverables. The estimated hours and deliverable prices shown here are derived from a more detailed Cost Worksheet, not included in this proposal, which allocates the hours across each deliverable and role.


Offeror Declaration Form (GSPD-05-105), Attachment A


Small Business Certification


Payee Data Record (STD 204), Attachment B


MetaVista’s Data Warehouse Methodology

MetaVista proposes, and its consultants have extensive experience with, both traditional waterfall and iterative methodologies for developing enterprise data warehouses.

MetaVista’s approach is to consider the basic strategies that provide a business foundation for the development of a data warehouse fusing healthcare and incarceration environments. MetaVista will plan, document, and design the four levels of analytical processing that drive the evolution of the data warehousing process. MetaVista will tailor a mature, adaptive, and reusable data warehouse project methodology to this unique undertaking for the State of California’s corrections health management services. The methodology recognizes that data warehouses are not an end in themselves but a step on the path to enterprise-wide business intelligence.

MetaVista’s methodology will insulate the State from common issues that have historically impacted data warehouse projects. The introductory material in the following pages alerts the reader to how poor data quality erodes the integrity and accuracy of Business Intelligence (BI) results and reports, and to how uncontrolled BI “feature creep” places costly and time-consuming burdens on system performance. The result is often unintelligible and confusing business metrics that saturate users with more data than they can productively use.

MetaVista can greatly reduce typical data warehouse project risks. MetaVista builds pre-programmed alerts for scope creep and schedule slippage into its tailored SDLC methodology, which has the ability and flexibility to drive complex projects. MetaVista brings over fifteen years of data warehousing experience and “lessons learned” to the table.


Data Warehouse Project Steps

Step 1: Data Warehouse Planning (Project Initiation)

Deliverable Content and Approach

The planning activity is the most important aspect of the project: it is key to selecting the relevant tasks and activities. This step ensures the project is correctly classified and comprehensible to both technical and non-technical audiences. MetaVista suggests using three basic dimensions to classify the project, all of which have great impact on the architecture, development costs, and expected delivery. MetaVista recommends a lean, documented process that shifts the burden of process-intensive management toward a more Agile perspective while maintaining the accountability of a waterfall SDLC.

1. The project scope: this includes determining the scope of the data (single or multiple sources and the subject areas covered), the role of technology (such as the emergent use of web services and BI tools), and temporal considerations (whether reporting must be highly available and how frequently it must be refreshed to add the greatest value).

2. The business reason: this clarifies the business drivers and the business problems to be solved by linking the current operational data store to a new decision support system, one comprising a data warehouse, possibly some specialized data marts, a business intelligence portal running on web services, yet-to-be-defined middleware, and a paperless integrated reporting capability that might also be web based.

3. The strategic, tactical, and operational taxonomy: understanding the participants and their roles, the assumptions and constraints, and an overall architectural framework that is governable and maintainable over both the short and long term.


Step 2: Architecture Definition

This step entails developing the high-level blueprints (plans) that establish the technical and application infrastructure for a data warehouse and decision support system. MetaVista recommends following one of two approaches.

1. Develop Zachman’s architectural suite in cohesive Conceptual, Logical, and Physical views

2. Develop Kruchten’s 4+1 architectural suite in cohesive Logical, Process, Physical, and Development views

Either approach is excellent when an organization faces an overwhelming information need and a significant business reengineering exercise. The architecture definitions serve to divide the activities into manageable, high-business-value elements.

The architecture is meant to ensure that the various components work together. It is important to define and prepare for the ensuing technical environment early on by defining the following criteria:

Source: Where will the data come from, in terms of system?

Transport: What topology will the data be carried over?

Destination: What system is the data heading for?

Metadata: How will data definitions, relationships, business rules, and transformations be stored and accessed?

Access: How will the data be viewed by the end-user?

Transform: How will the data be transformed (ETL) and how will the metadata be built?
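The six criteria above can be captured as a simple structured record per data flow. The following Python sketch is purely illustrative; the field values (source system, transport, and so on) are hypothetical placeholders, not part of the proposed design.

```python
from dataclasses import dataclass

# A minimal record of the six technical-environment criteria described
# above. All example values below are hypothetical.
@dataclass
class DataFlowDefinition:
    source: str       # where the data comes from, in terms of system
    transport: str    # topology the data is carried over
    destination: str  # system the data is heading for
    metadata: str     # where definitions, rules, and transformations live
    access: str       # how end-users will view the data
    transform: str    # how the data is transformed (ETL)

flow = DataFlowDefinition(
    source="legacy clinical ODS",
    transport="nightly batch transfer",
    destination="enterprise data warehouse",
    metadata="central metadata repository",
    access="BI portal / scheduled reports",
    transform="staged ETL with cleansing rules",
)
```

Recording each planned feed in this form, one record per source-to-destination flow, gives the architecture team a checklist that no criterion has been left undefined.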

Step 3: Decision-Maker Needs

This step represents the conceptual, data-driven design of data access and information requirements. Measurements of the business drivers must be identified. These metrics can then be turned into requirements for information, for business processes, and for data access. The architecture sets the priorities that guide the requirements breakdown.

• Information requirements are generally derived from examination of the core primitive data elements and from whether the data is calculated or summarized.

• Business process requirements focus on the ability of a BI application or tool to manipulate the warehoused data to produce the right business answer.

• Access requirements describe the processes end-users will use to get at the data.

Step 4: Subject Area Analysis

Information requirements are refined in this step. Both the level of detail and the subject matter are identified, the content of the subject area is verified, and the data warehouse model is initiated. Refining the scope is critical to avoiding subject-area overlap. The scope and content of the required data are confirmed by validating the subject area against the information requirements. The data model can be normalized or denormalized. The level of detail drives the summarization, which determines the optimal data to be viewed. Development of preliminary summary tables is essential to allow the analysis to mature. Early measurements are usually expressed in dimensions and periodicity. Populating and maintaining domain values can occupy a major part of project resources.

Step 5: Source System Analysis

This step determines where the data will come from. The analysis in this step is fundamental to ETL architecture and data quality. Candidate data sources must be scrutinized for integrity, quality, and problematic characteristics. A preliminary mapping evolves in this step, and refinements are made to the master data model as appropriate. These activities must not be trivialized; they are often the cause of catastrophic project failure.

It is essential in this step to also identify all integrity issues, processing exceptions, and inherent operational problems. Due diligence must be exercised to evaluate and weigh the quality, accuracy, and timeliness of the candidates. Additional domains will emerge as source systems are explored for value, and transformation will be necessary to adapt to new codes as the data set grows. Metadata is also addressed in this step; the preliminary mapping gives insight into the extent to which metadata is to be used and stored.

Historical availability is paramount to this step and it must be determined if historical conversion is required at this time. The system of record is identified and documented for its legacy attributes. It is essential not to overlook control requirements, which ensure that all of the data goes to where it is supposed to reside.

Data quality consists of a number of indicators, and a governance program must be instituted and enforced to conserve that quality. Major required attributes include accuracy, managed types, integrity, consistency, table optimization, non-redundancy, adherence to business rules, correspondence to established domains, timeliness, being well understood, integration, satisfaction of business and user needs, completeness, absence of duplicates, flagged anomalies, and eradicated contamination. A purification process is essential to support these indicators. Surveys can help the development team assess whether end-users are satisfied.
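A few of these indicators (completeness, absence of duplicates, domain adherence) can be checked mechanically as part of the purification process. The sketch below is a minimal illustration in Python; the field names and facility codes are hypothetical, not drawn from the RFO.

```python
# Illustrative data-quality checks for three of the indicators above:
# completeness, no duplicates, and adherence to established domains.
# Field names and the domain of facility codes are hypothetical.
VALID_FACILITY_CODES = {"SAC", "FOL", "SOL"}

def quality_issues(rows):
    """Return (row_index, issue) pairs found in a batch of records."""
    issues = []
    seen_keys = set()
    for i, row in enumerate(rows):
        if not row.get("patient_id"):
            issues.append((i, "missing patient_id"))       # completeness
        if row.get("facility") not in VALID_FACILITY_CODES:
            issues.append((i, "facility outside domain"))  # domain adherence
        key = (row.get("patient_id"), row.get("visit_date"))
        if key in seen_keys:
            issues.append((i, "duplicate record"))         # no dupes
        seen_keys.add(key)
    return issues

sample = [
    {"patient_id": "P1", "visit_date": "2010-05-01", "facility": "SAC"},
    {"patient_id": "",   "visit_date": "2010-05-01", "facility": "XXX"},
    {"patient_id": "P1", "visit_date": "2010-05-01", "facility": "SAC"},
]
issues = quality_issues(sample)
```

In practice such checks run against every load, with the issue list feeding both the cleansing routines and the governance program's quality metrics.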

Step 6: Transform Design

This step produces the processes for getting data from the source to its destination while maintaining accuracy and integrity. In this stage, transformation routines can be determined from the preliminary mapping. Cleansing routines are often introduced in this step; cleansing here is more efficient than making costly repairs to legacy systems. It is important to maintain all documentation for audit purposes so the basis of analysis is not lost. The routines for transforming data are identified in a transformation specification. At this juncture it is important to choose a technology approach: change data capture (CDC) or a snapshot approach. This is not a trivial undertaking and can amount to 50% of the components that must be developed.

Use and maintenance of the code tables or association tables used for transformation must be specified in this phase. It often takes multiple segments to populate one specific subject area. Another major task in this step is to build control-design and audit routines. In the summarization process, business measurements are identified; summarization routines can then be applied internally or externally to the warehouse, generally where the metadata is captured. The complexity of the historical conversion process must be determined at this stage, as format changes and data definitions can seriously affect the economics of keeping or scrapping this requirement. This is also the point where developing a test data set can be best leveraged. A preliminary pass at summary table layout can be made by creating tables to hold the key business measures and their dimensions.
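The snapshot approach mentioned above can be illustrated with a small diff between two extracts: today's snapshot is compared against yesterday's to derive the inserts, updates, and deletes to apply to the warehouse. This Python sketch assumes list-of-dict extracts with a single hypothetical key field; it is illustrative only, not the proposed ETL design.

```python
# Sketch of snapshot-based change capture: classify differences between
# two extracts of the same source. The "id"/"status" fields are
# hypothetical examples.
def snapshot_diff(previous, current, key="id"):
    """Compare two snapshots (lists of dicts) and classify the changes."""
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    inserts = [r for k, r in curr.items() if k not in prev]
    deletes = [r for k, r in prev.items() if k not in curr]
    updates = [r for k, r in curr.items() if k in prev and r != prev[k]]
    return inserts, updates, deletes

yesterday = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
today     = [{"id": 1, "status": "closed"}, {"id": 3, "status": "open"}]
ins, upd, dele = snapshot_diff(yesterday, today)
```

CDC, by contrast, reads changes from the source's logs as they happen; the snapshot approach trades timeliness for simplicity and no dependence on source-system internals.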

Step 7: Physical Database Design

This step describes the physical specifications and set up of the data warehouse repository. This includes all control tables, granular tables, fact tables, fact-less fact tables, and domain tables. The star, snowflake or star-snowflake schema is rendered in an entity relationship model. The platform in all probability will be designated as Oracle Data Warehouse.

Whether to use a common (“one size fits all”) table or separate tables for each domain is decided based on what best suits the chosen platform and its performance criteria. Summarization tables by default improve performance. Star and snowflake joins and granular fact tables cause controversy and require expert-level DBA competence, as decisions made in the design severely affect runtime operations, performance, and maintenance.

Another important task is to specify and create cohesive indexes, another performance-related issue. Inverted indexing is an option that can cut large query times by as much as 80%. Backup and recovery guidelines are essential to specify at this juncture; mission-critical settings considerably complicate this process.
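The shape of a star schema — one granular fact table keyed to surrounding dimension tables, with indexes on the join keys — can be sketched concretely. The example below uses Python's built-in SQLite purely as a stand-in for the target platform; the table and column names are hypothetical illustrations, not the proposed physical design.

```python
import sqlite3

# Illustrative star schema: a fact table joined to two dimension tables,
# with a composite index on the fact table's foreign keys. SQLite stands
# in for the target DBMS; all names and figures are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_facility (facility_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, cal_date TEXT);
    CREATE TABLE fact_visit (
        facility_key INTEGER REFERENCES dim_facility(facility_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        visit_count  INTEGER
    );
    CREATE INDEX ix_fact_visit ON fact_visit (facility_key, date_key);
    INSERT INTO dim_facility VALUES (1, 'Sacramento'), (2, 'Folsom');
    INSERT INTO dim_date VALUES (20100601, '2010-06-01');
    INSERT INTO fact_visit VALUES (1, 20100601, 40), (2, 20100601, 25);
""")
# A typical star join: summarize the fact table by a dimension attribute.
rows = con.execute("""
    SELECT f.name, SUM(v.visit_count)
    FROM fact_visit v JOIN dim_facility f USING (facility_key)
    GROUP BY f.name ORDER BY f.name
""").fetchall()
```

A snowflake variant would further normalize the dimensions into sub-tables; the trade-off is more joins at query time in exchange for less redundancy.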

Step 8: End-User Needs Design

End-user access is considered a discipline within a discipline. It can safely be assumed that some type of user portal is always required for a data warehouse. Beyond the obvious array of methods and tools for rendering data, metadata access is often left out of design. When creating an efficient BI system, it should be considered mandatory to provide access to transformation rules, data and measurement definitions, summarization calculations, and data change history.

End-user access definition requires the design and development of stored procedures for specific technical components. The only unique component required at this stage addresses batch processing: designing the facilities for scheduled report production is often complex. End-user access development includes designing and unit-testing end-user access panels and reports. A mature design validates the use of the data in conjunction with accessing it; end-user access evolves along with the use of the data.

Step 9: Data Warehouse and Data Mart Detail Design Development

This step covers the development of the mechanism to build and maintain the target data warehouse and data marts. The goal of this step is to produce the processes that build the data warehouse and verify that the transformation rules and calculations are working accurately. Additionally metadata is loaded into the data warehouse repository and should be available. At this juncture the business aspects are still prime and exceed the technical aspects in importance.


Step 10: Data Warehouse Implementation and Test

This step involves the initial loading of source data. It is critical that user verification by Subject Matter Experts (SMEs) takes place as a proof of concept. Specialized code can then be developed or generated during transformation development, including unit testing for each component. Components that create summaries or aggregations are also developed, ideally after the granular transformations. This completes the development and assembly of the transformation process, including batch job controls, scripts, etc.

Step 11: Data Warehouse Deployment

The final step includes the activities required to bring the data warehouse online and make it available for production use: fully converted, transformed, accepted, and operational. The various rollout steps and training should be completed. The population run is the key activity in this phase; it may include a historical conversion or just the initial load for the new data warehouse. Control and audit verification occurs at this point, and the controls must be deemed effective enough to prevent data errors from occurring without detection. Data training and access training are essential at this juncture. The user acceptance test combines final verification of data content (detail and summary) with access to that data, and sign-off occurs here. Finally, control and audit verification confirms that the internal controls built into the data warehouse build process are adequate.
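The control and audit verification described above often takes the form of control totals: the same row counts and numeric sums are computed on the source extract and on the loaded warehouse rows, and the load passes only if they agree. The Python sketch below is a minimal illustration of that idea; the field name is hypothetical.

```python
# Minimal control-total reconciliation: a load is accepted only when the
# source and target agree on row count and a numeric checksum. The
# "amount" field is an illustrative placeholder.
def control_totals(rows, amount_field):
    return {"count": len(rows),
            "total": sum(r[amount_field] for r in rows)}

def reconcile(source_rows, loaded_rows, amount_field="amount"):
    """Return True when controls agree; raise ValueError on mismatch."""
    src = control_totals(source_rows, amount_field)
    tgt = control_totals(loaded_rows, amount_field)
    if src != tgt:
        raise ValueError(f"control mismatch: source={src} target={tgt}")
    return True

src_rows = [{"amount": 10}, {"amount": 20}]
ok = reconcile(src_rows, list(src_rows))
```

Running such a check automatically after every population run is what makes data errors detectable rather than silent.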

An Introduction to Next Generation Enterprise Data Warehouses

MetaVista’s enterprise data warehouse plans focus on how its clients can take advantage of next-generation platforms without diverging from the more traditional approach. One disadvantage of taking the technological and architectural leap is the lack of well-honed resources, and the advantages of the leading edge do not necessarily outweigh the drawbacks of adopting not only new technology but also a complete cultural change. MetaVista’s planning seeks to combine the best attributes of traditional approaches with the advantages of emerging platforms.

Many new options for data warehouse platforms have appeared this last decade. We’ve seen the emergence of new categories of data warehouse (DW) platforms, such as data warehouse appliances and software appliances. A new interest in columnar databases has led to several new vendor products and renewed interest in older ones. The Linux operating system is now common in data warehousing, and open source databases, data integration tools, and reporting platforms have now established a firm presence. In the hardware realm, 64-bit computing has enabled larger in-memory data caches, and more vendors now offer MPP architectures. Leading database vendors have added more features and products conducive to data warehousing.

Those are mostly features within the data warehouse platform, especially its database. There are also growing practices that are demanding support from the platform, including real-time integration between the data warehouse platform and operational applications, advanced analytics, and reusable interfaces exposed through Web services or service-oriented architecture (SOA). Furthermore, a number of data warehouse platforms and other business intelligence platforms are now readily available through software-as-a-service (SaaS) and cloud computing.

First Generation Business Intelligence Plans

MetaVista’s planning approach to first generation business intelligence objectives focuses on a step-by-step opening of information within an organization to end users. Enterprise data warehouses, data marts, data mining, content management, and data retrieval all evolved to solve the problem of poor knowledge sharing. While this is better than ad hoc reporting, first generation business intelligence fails to capture the knowledge life cycle of the business. It is not the data that holds the intelligence but the process; still, it is a start in the right direction.

First generation business intelligence requires the use of technology to gain control of knowledge. Strategies for the coding, transfer, and transformation of data form the foundation for bringing sense out of the chaos of data that exists in every business. Frequently, there is very little understanding of the human process that produced the data in the first place; that knowledge resides in legacy systems, the memories of users, the business process, and the manual life cycle of that process.

Knowledge Life Cycle Plans

Before embarking on second generation BI and its reliance on technology, MetaVista’s planning method is to document the human act of conducting business, where knowledge is socialized within the subject areas of the business. MetaVista will document claims of knowledge, rules, and actions that are maintained in both verbal and written form, where there is often no attempt to validate knowledge as true or false because there is no easy way to relate data to a manual process. MetaVista’s concepts rest on the premise that the knowledge life cycle recognizes the natural taxonomy of knowledge. Existing knowledge needs to be maintained in a development knowledge warehouse in a way that allows new knowledge claims to ultimately become integrated into the organization’s knowledge base. To bring this knowledge forward, a view of the subject life cycle can be built, documented for rules and claims, and then validated with data.

The Impact of the Knowledge Process on Data Warehousing Plans

Just as first and second generation BI developed step by step, the same applies to the data warehouse. First generation data warehousing and business intelligence does not address this vast amount of knowledge and has no formal process to manage it.

Data that is collected, transformed, mined, and reported in a first generation data warehouse is, by definition, historical, static, and, in today’s market, old news. The primary function of this data is to manage the business with after-the-fact information. One reason that data warehouse projects initially fail to live up to management expectations is the failure to understand the relationship of the data to the business process, and the fact that a data warehouse starts dumb and is only made intelligent by blending data types and leveraging new technology.

Data warehouse projects often try to mix operational data, in the form of an operational data store (ODS), with the data warehouse to gain access to real-time information. This mixing of time-variant data has generated volumes of discussion on how best to handle the situation. A different view is that this is really a mixing of the knowledge process.

MetaVista’s planning approach is to phase in the intelligent data warehouse (IDW), where the knowledge process is managed from within the knowledge warehouse using both historical and active data in near real time. There may still be a need for an ODS, depending on the process being managed. The IDW becomes a means of sending high-performance information to the operational systems and keeping data warehouse data tightly coupled to the business process life cycle. While the IDW still maintains the traditional definition of the warehouse within its boundaries, current operational data becomes integrated and usable.
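The blending of historical warehouse data with current operational data described above can be sketched as a merged view in which the fresher ODS value takes precedence. This is a toy illustration under assumed data shapes; the identifiers and "risk score" metric are hypothetical, not from the proposal.

```python
# Hedged sketch: an IDW-style view that unions historical warehouse rows with
# current ODS rows, preferring the ODS value when both exist. All names are
# illustrative, not from the proposal.
historical = {  # warehouse: patient_id -> last known score (static, loaded nightly)
    "P001": 0.42,
    "P002": 0.77,
}
operational = {  # ODS: near-real-time updates
    "P002": 0.81,
    "P003": 0.15,
}

def intelligent_view(dw, ods):
    """Merge history with current operational data; ODS values take precedence."""
    merged = dict(dw)
    merged.update(ods)
    return merged

view = intelligent_view(historical, operational)
print(view)  # P002 reflects the fresher operational value
```

The design point is that consumers query one view while the warehouse's historical boundary remains intact underneath.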

MetaVista’s approach is to expand the primitive data warehouse into an intelligent data warehouse in which the knowledge process components (process definitions, business rules, knowledge structures, and knowledge bases) are added and managed by the knowledge warehouse software. Merging these components advances the primitive data warehouse from mere data management to knowledge process management.

Second Generation Business Intelligence Plans

MetaVista’s planning approach is to develop an Intelligent Data Warehouse, which requires the effort to understand the business process as well as the business data. Linking historical data and employing data mining within operational systems can realize quantum gains in productivity and reductions in cost. The effort to map data to process is by no means small, but as with all data warehousing efforts the projects can be targeted and expanded.

The IDW Knowledge Warehouse knowledge bases, methods, and objects can themselves generate repeatable and reusable knowledge bases, producing organizational reporting assets that may prove more valuable than the processes themselves.

Second generation business intelligence is the linkage of knowledge production and knowledge integration with the business process environment. With this linkage comes an understanding of the data and the procedures that are used to create business objectives. Higher levels of information can be generated to produce better methods, shorter time to market, higher product quality and a reduction in overhead costs.

Second generation business intelligence combines what the business knows about itself with what the business knows how to do, and conforms that knowledge to the socialized structure, thus creating an Intelligent Data Warehouse environment.

Reference Methodologies, Processes and Standards

Data Warehouse Definition Framework [Zachman]


Data Warehouse/Marts Architecture
• W.H. Inmon, Prism Solutions (“decision making is the progressive resolution of uncertainty”)
• Dr. Ralph Kimball, Metaphor & Red Brick Systems
• Douglas Hackney, Data Warehouse Institute, the International Data Warehouse Association
• Dodge & Gorman, Oracle8 Data Warehouse Consulting Services
• Bischoff & Alexander, Bischoff Technical Services, IBM
• Dr. Barry Devlin, IBM Europe

System Architecture
• John A. Zachman, IBM, the Zachman Framework for Enterprise Architectures, Zachman Institute for Framework Advancement (ZIFA)
• Zachman 3 / RUP / UML Framework
• The Open Group Architecture Framework (TOGAF) 8.1, ADM (Architecture Development Method)
• The Federal Enterprise Architecture (FEA)
• Federal Enterprise Architecture Framework (FEAF)
• Capgemini Dynamic Architecture Framework (DyA)
• 4+1 View Model, Philippe B. Kruchten, IBM/Rational
• The Gartner Methodology
• Object Management Group (OMG), Model Driven Architecture (MDA)
• Object Management Group (OMG), Unified Modeling Language (UML) V1.1 to V2.1
• Object Management Group (OMG), CORBA
• Object Management Group (OMG), DoDAF (UPDM) 1.5 UML Metamodel

Systems Development Life Cycle (SDLC) and/or Software Development Life Cycle (SDLC)
• Waterfall
• Incremental
• Spiral
• Rapid Application Development
• Joint Application Development
• Aspect Oriented
• Enterprise Ontology
• Unified Process (UP)
• Rational Unified Process (RUP)
• Agile Unified Process (AUP)

Process Improvement Models
• SEI CMM Level 3
• SEI CMMI Level 3
• ISO/IEC 15504 (SPICE)
• ISO 9000-3

Governance
• ISACA, IT Governance and Control (COBIT); AS 8015

SEI Metrics-Based Scheduling [MBS], ISO/IEC 15939
• Department of Defense (DoD) Information Analysis Center (IAC), Defense Technical Information Center (DTIC), Air Force Research Laboratory - Information Directorate (AFRL/RI)
• ITT Corporation (DACS)
• Dr. Robert Kaplan (Harvard Business School) and Dr. David Norton, Balanced Scorecard (BSC), Balanced Scorecard Institute (BSI)

Sustainability
• Leadership in Energy and Environmental Design (LEED)
• Dr. Karl-Henrik Robèrt, The Natural Step (TNS) Framework, OCM Framework
• Sustainable Technologies, EPA, US Government

Standards
• IEEE Standard 1471-2000: IEEE Recommended Practice for Architectural Description of Software-Intensive Systems
• ANSI/IEEE 1028: Standard for Software Reviews and Audits
• ANSI/IEEE 1058: Standard for Software Project Management Plans
• ANSI/IEEE 1074: Standard for Software Life Cycle Processes
• SEI/CMMI Process Areas: DAR, IT, OPD, IPM, RD, REQM, PI, PMC, PP, PPQA, TS, VER & VAL

Legislation
• U.S. Department of Defense (DoD); Technical Architecture Framework for Information Management (TAFIM), Volumes 1-8, Version 2.0. Reston, VA: DISA Center for Architecture, 1994
• Clinger-Cohen Act of 1996
• The Chief Information Officers Council A04, Federal Enterprise Architecture Framework, Version 1.1, September 1999
• The Patriot Act, 2001: Electronic Privacy, Security, Commerce, and e-Government; Technology (Web Portals) and Records Privacy; US Gov, Homeland Security. Computer Security: Sections 105, 202, 210, 211, 216, 220, 808, 814, & 816; Critical Infrastructure Protection: Title VII and Section 1016; Electronic Government: Sections 361, 362, 403, 405, 413, 414, 702, 906, 1008, 1009, 1012, 1015; Internet Privacy: Sections 210, 211, 212, 216, 217, 220, 224
• Sarbanes-Oxley Act, 2002: SOX 404 Management Assessment of Internal Controls; Section 302 Corporate Responsibility for Financial Reports, including 302(a)(4)(C) and (D), 302(a)(5), 302(a)(6)
• Health Insurance Portability and Accountability Act (HIPAA), Part 164, Subpart C, Security Standards for the Protection of Electronic Protected Health Information: § 164.308 Administrative Safeguards, including 164.308(a)(3), 164.308(a)(4), and 164.308(a)(5)(ii)(C)
• Gramm-Leach-Bliley Act (GLBA), 1999, Section 501: protection of “nonpublic personal information”

Summary

The essence of building the data warehouse project plan requires the following:

• Data warehouse projects must be iterative

• Business needs must be constantly reflected

• Specific deliverables enable the development team to gauge progress and maintain focus, avoiding feature creep while still keeping the “big picture” in view


Approach for Completing Requested Services

Introduction

This section of our proposal describes our response to specific sections of the CPHCS Request for Offer, including our response to specific deliverables requested by CPHCS.

Since our methodology does not produce the same set of deliverables as those described in the RFO (in fact, our methodology calls for many additional deliverables), we will map our standard deliverables to those described in the RFO post-award, during the development of our planning documents (Deliverable 1).

Roles and responsibilities of our proposed team are detailed later in this proposal, under MetaVista’s Expertise and Experience.

Response to Scope of Services

MetaVista understands that the Enterprise Data Warehouse effort will take place within a complex and ever-changing information technology environment. Accordingly, our proposed project strategy and solution will reflect:

• In-flight technology initiatives for both CDCR and CPHCS

• Existing and emerging state-wide initiatives including, but not limited to, recent initiatives to consolidate data centers as well as promoting interagency sharing of common services where practical

• Existing and emerging CDCR and CPHCS enterprise architecture standards, including standards for many hardware, software, and network components that will effectively constrain our solution designs

• Data Warehouse, Business Intelligence (BI), and other related strategies as developed and approved during the course of the project

• The anticipated needs of relevant stakeholders, including those that might be identified after the start of the project

MetaVista understands that this is a deliverables-based project. Prior to initiating work on any given deliverable, we will produce a Description of Deliverables (DOD) document -- also known as a Deliverables Expectation Document or DED in some organizations -- which will describe our proposed approach to preparing the deliverable, including the methodology, format, content and level of detail. We will submit the DOD to the CIO or designee for approval.

We understand and acknowledge that our designs and recommendations will be subject to review and approval by relevant committees.

Finally, MetaVista acknowledges the State’s desire to work cooperatively to accommodate any potential changes in Scope. We will negotiate in good faith and will provide supporting information as required in terms of the schedule and cost impacts of any given change. We understand that all changes must be documented via a written amendment to our contract.


Specific Deliverables

The remainder of this section provides specific responses to the deliverables described in the RFO.

MetaVista’s standard methodology is described using a standard waterfall SDLC. However, we recognize and acknowledge that this particular project will be implemented in three phases, as described in Appendix A. The exact content and timing of each phase will be defined during the development of Deliverable 1.

Additionally, MetaVista acknowledges the requirement to provide implementation services for the approved solutions, as opposed to providing planning and design services only, as described in the Questions and Answers document provided by the State.

1. Project Management Plan

Deliverable 1: Enterprise Data Warehouse Plan, Business Intelligence Plan, and Reporting Project Management Plan (i.e., Project Management Plan) for all phases

MetaVista SDLC Artifacts:
• Planning documentation, consisting of:
  a. Readiness Assessment/Capability Assessment
  b. Decision Support Data Warehouse & Data Marts Business Drivers
  c. Strategic KPI Deployment Business Intelligence Reasons
  d. Tactical Metric Reporting Business Reasons
  e. Data Scope [Subject Areas]
  f. Governance Framework
  g. Technology Scope [Architecture]
  h. Temporal Scope [Delivery]
  i. Operational Access & Integration Business Drivers
  j. Operational Information Roles & Privileges Business Reasons
  k. Business User Roles, Assumptions & Constraints

Deliverable Content and Approach:

Enterprise Data Warehouse Plan

MetaVista’s approach [Fairhead 1995] is to define a two-level planning process that delivers the required infrastructure in stages, ensuring that each stage delivers visible business value. The first level of this planning approach is accomplished by segmenting the data warehouse into logical sections that can be delivered reasonably independently. Staged data warehouse implementation planning ensures maximum reuse and develops a close association with the future design of specific, prioritized informational BI applications. The enterprise data model (EDM) is the cornerstone and knowledge base of this approach.

To accomplish the next level, MetaVista will introduce its concept of planning for organic growth as a joint activity among all business areas and information system shops. The outcome of an organic growth pattern is that the data warehouse will be planned on the basis of business needs, enterprise modeling considerations, and infrastructure limitations. The concept of staged implementation will provide the basis for planning each of the individual segments in the overall enterprise data warehouse delivery process. Each segment will be designed to stand on its own merits from the point of view of the business benefit it delivers.


To provide cohesion and consistency, MetaVista will compile a list of critical success factors for the overall segmentation and staging process planning. Progressive segmentation planning down through the business layers will outline the distinctions between the two models that MetaVista uses to segment the data warehouse: the business data warehouse (BDW) and the business information warehouse (BIW).

MetaVista will accomplish segmentation in three steps. The first step is to define the high-level enterprise model, which can be an intensive one-to-two-month exercise. This exercise cannot be skipped and replaced by unplanned, tool-based ETL; to do so would add significant risk to the overall consistency of the data warehouse model.

The second step is to plan how to model various optional subsets of the BDW, which involves defining both the structure and its relationship to the operational applications that are the source of its data. This is by far the most complex planning step and can take as much as six to nine months, depending on the scope of the optimal BDW subset and the difficulty encountered in sourcing the required data. The generic business needs of the total warehouse drive this second planning step.

The third planning step is accomplished by understanding the elements required in building a model of the first version of a BIW, which involves defining its structure and its relationship to the BDW. This is relatively straightforward and should be accomplished in four to six weeks. This third step is driven by the specific data needed to achieve a known business objective that a specific generic application can address.

MetaVista will plan for a staged implementation process, also in three steps. The first step is to define a development plan for the infrastructure components that will be used broadly throughout the data warehouse and implemented in scheduled stages. The principal infrastructure areas that MetaVista envisions are a series of source database management systems (DBMS), a BDW population, a BIW population, a data warehouse catalog and business information guide, and a set of administrative components. The second planning step is to understand the scope and complexity of completing the enterprise modeling from the generic enterprise level down to the logical and physical levels [Zachman] of each bounded area of business. The third step is to plan for generic business applications that will support the specific business requirements of the users. MetaVista will also plan to implement sustainability methods to manage data warehouse costs, including storage and data and query processing, as well as uncontrolled repository growth and the failure to retire old data, versus implementing optimization and data attrition policies.
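A data attrition policy of the kind mentioned above can be sketched minimally as follows. This is an illustration under assumed data shapes (a flat list of dated rows and a seven-year retention window, neither of which comes from the RFO): detail older than the window is summarized and dropped, controlling repository growth while preserving aggregate history.

```python
# Hedged sketch of a simple data-attrition policy. The retention window and
# row layout are illustrative assumptions: detail rows older than the window
# are rolled up into a summary and removed from the detailed store.
from datetime import date

RETENTION_YEARS = 7

def apply_attrition(rows, today):
    """Split rows into retained detail and a summary of aged-out detail."""
    cutoff = date(today.year - RETENTION_YEARS, today.month, today.day)
    keep = [r for r in rows if r["as_of"] >= cutoff]
    aged = [r for r in rows if r["as_of"] < cutoff]
    summary = {"aged_rows": len(aged), "aged_total": sum(r["amount"] for r in aged)}
    return keep, summary

rows = [
    {"as_of": date(2001, 3, 1), "amount": 100.0},
    {"as_of": date(2009, 9, 1), "amount": 250.0},
]
keep, summary = apply_attrition(rows, today=date(2010, 6, 7))
print(len(keep), summary)
```

A production policy would archive rather than discard the aged detail, but the cost trade-off is the same.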

Business Intelligence Plan

MetaVista will leverage and extend the segmentation and staging approach for data warehouse planning described in the previous section. MetaVista’s planning process will discover the best use of a service-oriented architecture (SOA) to leverage services for applications that transform data into information, and transform that information into knowledge through discovery, producing what we call Business Intelligence, or BI.

MetaVista’s planning methods will discover the best business and technology capabilities needed to utilize data warehousing so that strategic, tactical, and operational decision makers can analyze, slice and dice, query, and generate reports.

An important approach MetaVista will incorporate in its BI planning, apart from the obvious advantages of OLAP methods, is to create a foundation for realistic capacity planning for BI applications, platforms, and support repositories. To better understand BI planning, MetaVista will first outline the extent and temporal limitations of the required data stores and their data sources. The second planning task is to understand the BI processing, including data processing and maintenance, data analysis and presentation, and data administration.

Data processing planning must map all the sources of data so that the relevant subset of raw data desired for analysis is extracted. The raw data will naturally come in many formats. MetaVista will document precise plans to transform, consolidate, and cleanse all the disparate data into a single format, both to maintain the highest data quality and to alleviate the need for complex applications that must work around dirty or missing data, issues that should be anticipated and addressed by upfront quality control before analysis.
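The consolidation of disparate formats into a single format can be illustrated with a small sketch. The source formats and sample values here are hypothetical assumptions, not drawn from CPHCS systems; the point is that each raw value is either normalized to one canonical form or flagged for review rather than loaded dirty.

```python
# Illustrative sketch, assuming dates arrive in several source formats; the
# transform consolidates them into a single ISO-8601 format and flags values
# that cannot be cleansed, rather than passing dirty data downstream.
from datetime import datetime

SOURCE_FORMATS = ["%m/%d/%Y", "%Y-%m-%d", "%d-%b-%Y"]  # assumed source layouts

def cleanse_date(raw):
    """Return an ISO date string, or None to quarantine the value for review."""
    for fmt in SOURCE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # quarantine instead of loading dirty data

records = ["06/07/2010", "2010-05-13", "13-May-2010", "not a date"]
cleansed = [cleanse_date(r) for r in records]
print(cleansed)
```

The same pattern (try known formats, normalize, quarantine the remainder) applies to codes, names, and units, not only dates.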

MetaVista’s approach to capacity planning is to have the means to estimate the computer resources required to meet an application’s service-level objectives over specific measurement periods. MetaVista’s planning method is realized through establishing Service Level Agreements (SLAs) and maintaining balanced systems to handle BI workloads, both critical requirements for IT operations. MetaVista’s plans will map the BI infrastructure, which is the fundamental building block upon which the BI system is constructed, and its capacity planning will determine whether the resources are in place to optimize the performance of the data warehouse. While MetaVista’s fundamental concepts of capacity planning for traditional workloads can be applied to BI systems, we note that there are significant differences between the characteristics of OLTP and BI workloads that must be taken into consideration when planning for the growth of the BI system.
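As a toy illustration of this kind of capacity estimate, the sketch below projects storage growth against available capacity. Every figure (current size, growth rate, capacity) is a hypothetical assumption, not a CPHCS number; the sketch only shows the shape of the calculation used to check a growth projection against an SLA-backed capacity limit.

```python
# A toy capacity-planning calculation with assumed workload figures; it
# projects compound storage growth so planned capacity can be sanity-checked.
def projected_storage_gb(current_gb, monthly_growth_rate, months):
    """Compound monthly data growth."""
    return current_gb * (1 + monthly_growth_rate) ** months

capacity_gb = 2048  # assumed provisioned capacity
needed = projected_storage_gb(current_gb=500, monthly_growth_rate=0.05, months=24)
print(f"Projected need after 24 months: {needed:.0f} GB")
print("Within capacity" if needed <= capacity_gb else "Plan expansion")
```

Real BI capacity planning would also model query concurrency and CPU/IO, since, as noted above, BI workloads behave very differently from OLTP.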

Reporting Project Management Plan

Immediately upon contract award, MetaVista will initiate development of the Project Management Plan and the Project Schedule. These are critical planning documents and will be developed in parallel with the other plans identified as part of Deliverable 1.

The Project Management Plan will describe our overall approach to this project and the specific project management processes that we will follow. The content of this Plan will be based on industry best practices, including the PMBOK Version 4 and the State of California PMM. Common project management processes already in place within CDCR and/or CPHCS will be leveraged to the extent possible, so that project management practices for this project are consistent with those already in use elsewhere in the organization.

Minimally, the Project Management Plan will address the overall project strategy and schedule, stakeholder roles and responsibilities, project communication between and among stakeholders, project risk and issue management, project quality, and project scope management – including how potential changes will be presented, reviewed and approved.

2. Reporting Requirements and Custom Reports

Deliverable 2: CDCR and CPHCS Requirements and Reports for Business Reports, Phase I
Deliverable 3: CDCR and CPHCS Requirements and Reports for Business Reports, Phase II
Deliverable 4: CDCR and CPHCS Requirements and Reports for Business Reports, Phase III

MetaVista SDLC Artifacts:
• MetaVista’s Business Requirement Specification (BRS)
• SDLC Business Gates Summary
• SDLC Business Gates Policy
• Requirements Management Plan
• Weighted Requirements Ranking
• Stakeholder Requests
• Control of Quality Records

Deliverable Content and Approach:

MetaVista will assess, document, and program the annual, biannual, and quarterly mandatory reporting requirements, taking into consideration the quarterly and monthly optional reporting periods. MetaVista will also assess the need for ad-hoc reporting that complies with State policy and, where possible, creates a paperless environment.

MetaVista will assess whether the reporting applications will need the capacity to transmit copies of forms and user guides, and to inform individual institutions of changes in reporting format and reporting frequency requirements using web services and email.

MetaVista envisions that BI applications will be able to store multiple versions of integrated virtual report forms using electronic imaging, automatically versioned and sent directly to institutional users, both to eliminate paper reproduction needs and to facilitate distribution to the primary health service prevention providers.

MetaVista suggests that electronic, web-based user guide forms could provide directions to users on how to correctly collect and report BI data. This concept is similar to a “knowledge tree” and would electronically represent the thoughts and ideas of numerous staff, as well as of the service providers who work to accurately represent health services prevention data [ref: PADS reporting requirements, CA State 2004].

MetaVista’s proposed user guide portal would collect thoughts and suggestions for enhancing the BI products from the user analysts, which would inherently make it an even more useful reporting document for the health services and incarceration control field users.

MetaVista is also considering distributing self-assessment and survey-engine web applications to users in order to stimulate improvements in the quality of data submissions to the warehouse. This would also allow the “knowledge tree” concept to improve BI and reporting, continually improving transparency and governance. MetaVista is aware that real-time and shorter-frequency reporting can be considerably disruptive to processes currently in place.

It is MetaVista’s aim to plan reporting frequencies to accommodate the most urgent and critical care needs as well as the current corrections administration methods and policies for managing incarcerated individuals.

MetaVista suggests incorporating and reusing some of the most compelling and relevant OSHPD BI concepts. MetaVista can create an integrated, web-based reporting capability for the corrections healthcare information division, with a high-level dashboard monitoring the financial health of California incarceration hospitals that would be interoperable with OSHPD BI data. Cost management is one of the greatest challenges for the State of California, and OSHPD has already created a State roll-up to its Federal reporting tool. MetaVista feels it makes sense to build in transparency where possible rather than starting from scratch.


Business Requirements Gathering

Report data and reporting process requirements gathering is a vital part of successful project management and application development and part of MetaVista’s core competency:

• Interviewing subject matter experts and relating needs
• Organizing complex information into understandable subject areas
• “Translating” technical language into business language and vice versa
• Ensuring stakeholder involvement at all levels
• Drafting clear and concise written documentation for users and technicians
• Working successfully with multidisciplinary teams

MetaVista proposes a methodology that captures the users’ requirements quickly, accurately, and completely, and that provides a flexible yet structured approach to producing a high-quality specification.

MetaVista’s methodology for specifying its clients’ business requirements ensures we can develop the highest quality system that meets their needs. This has been one of the driving objectives of our approach: a systematic application of the best interviewing and modeling practices that gives the analyst the knowledge of what they need to do and the confidence that it has been accurately captured.

MetaVista’s experts recommend Requirements Discovery Sessions (RDS) with business representatives and an experienced information management analyst, using a methodology grounded in an understanding of business data. Not a conventional requirements interview, an RDS is focused on the business and applies proven, easy-to-understand communication and modeling techniques.

Our experience has taught us that the biggest cause of changing requirements is not asking the right questions in the beginning. A simple, systematic approach leads the analyst and business users to discover all the information requirements, business rules and functionality, which results in a complete and accurate business specification the first time.

MetaVista realizes that there is no magic bullet for this issue, but key success factors involve applying an approach that delivers results and is focused on yielding rapid and visible results for business clients. To satisfy the client, we must know what the client wants, and then can show that we have addressed those requirements.


MetaVista’s Business Requirement Specification (BRS)

To be consistent, the business requirements specification should be accurate, complete, and clear. MetaVista will subject its business requirements specification to an internal quality review to ensure that the business requirements are clear, accurate, complete, and testable.

In the final analysis, using the methodological approach to business requirements gathering will enable an organization to collect, segregate, prioritize, analyze and document all the relevant informational and process needs for the application under design. MetaVista’s understanding of the business needs for data and process will result in useable, robust and sustainable systems that give an organization a competitive edge.

3. Performance Metrics – Key Performance Indicators

Deliverable 5: CDCR and CPHCS Performance Metrics, Phase I
Deliverable 6: CDCR and CPHCS Performance Metrics, Phase II
Deliverable 7: CDCR and CPHCS Performance Metrics, Phase III

MetaVista SDLC Artifacts:
• KPI Requirements Management Plan
• KPI Measurement Plan
• KPI Weighted Requirements Ranking
  o Fundamental Constitutional (concrete) KPIs governed by laws and policy
  o Strategic and Tactical (adaptable) KPIs governed by BPM best practices
  o Operational Concept (ad-hoc) KPIs governed by operational needs as they arise
• KPI Stakeholder Requests
• KPI Control of Quality Records

Deliverable Content and Approach:

Introduction

The core exercise of this project is the creation of KPIs that translate into metadata to support a BI platform. The KPIs must yield metrics that have actionable value and are not just statistics for the sake of statistics. An example of bad intelligence is when the KPIs report everything as close to OK, or at 100%; in reality, the probability of everything being close to 100% is low. KPIs must establish trueness, constantly pushing performance maturity and the ability to adapt and change direction on the fly when things do not work out as planned.
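The "everything reads 100%" warning above can be made concrete with a minimal plausibility check. The thresholds, KPI names, and sample values are illustrative assumptions, not CPHCS policy: a KPI series whose values cluster implausibly tightly near perfection is flagged for review rather than accepted at face value.

```python
# A minimal sketch of a KPI plausibility check: a series that is uniformly
# near-perfect is flagged as suspect. Thresholds are illustrative assumptions.
from statistics import mean, pstdev

def suspiciously_perfect(values, near=0.98, max_spread=0.01):
    """Flag KPI series that cluster tightly near 100%."""
    return mean(values) >= near and pstdev(values) <= max_spread

compliance_kpi = [0.99, 1.0, 0.995, 1.0, 0.99]   # too good to be true?
intake_kpi = [0.91, 0.84, 0.97, 0.78, 0.88]

print(suspiciously_perfect(compliance_kpi))  # True
print(suspiciously_perfect(intake_kpi))      # False
```

Checks like this keep a KPI family honest: a flagged series prompts a review of the measurement, not a celebration of the number.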

The development of KPIs is essential to extracting ROI and institutional value from a data warehouse environment. Without the expert crafting of a KPI family, the entire project and its cost will have little value: the data warehouse becomes a futile exercise in amassing massive amounts of hard-to-understand data. Queries, though useful, will not replace the science of well-structured KPIs.

An institution must grow its own ability to manage, mature, and develop KPIs, given the harsh effects of constantly changing economies and policies. Furthermore, KPIs lift huge burdens from management in tracking illicit behavior gleaned from patterns found in data. Good KPIs interrogate policy and report to governance, allowing decision makers to maximize accountability. Good KPIs force operations to raise the bar, as failures are detectable early on

Page 34: MetaVista Proposal 10-014-ITS Final

Formal Offer for RFO #10-014-ITS Enterprise Data Warehouse Services

Page 30 of 107 MetaVista Consulting Group Updated: 2010-06-07

before they become embedded in a culture and are almost impossible to eradicate. KPIs allow front-line managers to practice self-assessment, opening up opportunities for cost savings and streamlining.

Approach

MetaVista will develop and document a best-of-breed KPI development process and build a foundation for a flexible, dynamic, and intuitive platform. For KPIs to be living and adaptable resources, it is essential that they address the fluid changes of dynamic business process management (BPM) and that changes, upgrades, refinements, and replacements are within reach of the user in an easy-to-manage portal. KPIs can be hard-coded and preprogrammed in a library setting accessible through an application, or they can be built on the fly in a web browser, tested in real time with current data, then stored and retrieved as favorites.

However, before any technology gets involved, KPIs must be developed systematically, forcing the organization to manage the diverse implications of crafting a “Rosetta Stone” of intelligence calculations and metrics. A good practice, rooted in good forensics, is to establish a constitution of core KPIs as a baseline, preferably answerable to laws and governance. Additional, limited KPIs can then be crafted to manage operational performance issues that build accountability to the constitutional KPIs, similar to the relationship between federal and state governments.

These two areas manage the coarse-grain metrics; however, it is highly practical to also be able to engineer compliant KPIs that manage the fine-grain issues peculiar to individual sites and business processes. Because change is far more commonplace in the fine-grain management of institutions, where management should constantly be trying to improve performance, the ability to build KPIs ad hoc really serves the front-line management of care and incarceration.

Method A: Adoption of KPI (Metric) Development Process

• KPIs are developed from balanced scorecards that:

• Determine the business requirements, first

• Define the measurement architecture

• Specify strategic business objectives

• Elect strategic metrics and reporting strategy

• Develop KPI implementation guide on how they are to be used and applied

• KPIs should only be developed with existing information

• KPIs are a set of metrics that produce accompanying reportable scores

• KPIs measure desirable and unacceptable performance at regular intervals

• KPIs gauge the effectiveness and define problem areas

• KPIs form the basis to measure a variety of compelling business areas

• In most cases the middle score or passing score is equivalent to the actual expectation

Method B: Adoption of KPI Build Process

• Build KPI lists containing countable items, workflow items, date items, or task progress.

• Track KPI Issue Status, KPI Priority, KPI Task Status, KPI % Complete


• Calculate the goal of the KPI by selecting one of the following ways to calculate:

• Number of list items; a count of the total number of items in the list.

• Percentage of list items; the proportion of items in a selected view relative to the total number of items in the list

• Calculation using all list items (a computation of Total Sum, Average, Maximum, or Minimum of a numerical column in the list)

• Evaluate business data against business goals to reduce risk (using graphical scorecard)

• Choose visual indicators, e.g.: OK [Green], Warning [Yellow], or Problem [Red] to validate actual value vs. target value

• Display indicator for each individual item in KPI list using standard calculated formulas

• KPI result value is the number of items in the target list

• Keep track of totals

• Set rules by selecting whether higher or lower values indicate which range of numbers will be green (OK), for example
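The KPI build steps above (a goal, a calculation over list items, and a green/yellow/red indicator) can be sketched in a few lines. This is an illustrative sketch only; the `Kpi` class, its field names, and the example metric are assumptions, not a specific product API.

```python
from dataclasses import dataclass, field

@dataclass
class Kpi:
    name: str
    target: float            # the goal value
    warning: float           # threshold below which status turns Red
    values: list = field(default_factory=list)

    def actual(self) -> float:
        # "Calculation using all list items": here, the average of the list.
        return sum(self.values) / len(self.values)

    def status(self) -> str:
        # Validate actual value vs. target value with a visual indicator.
        actual = self.actual()
        if actual >= self.target:
            return "Green"    # OK: actual meets or exceeds target
        if actual >= self.warning:
            return "Yellow"   # Warning: between warning threshold and target
        return "Red"          # Problem

# Hypothetical example: percentage of intake health screenings completed on time.
screenings = Kpi("Timely intake screenings (%)", target=95.0, warning=85.0,
                 values=[97.0, 92.0, 88.0, 90.0])
print(screenings.actual(), screenings.status())   # 91.75 Yellow
```

The same structure accommodates the other goal calculations named above (item counts, percentages, sums, minima, and maxima) by swapping the body of `actual()`.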

4. Strategy Validation

Deliverable 8 CDCR and CPHCS Data Warehouse Gap Analysis, Business Intelligence Gap Analysis, and Reporting Strategy Gap Analysis

MetaVista SDLC Artifacts:

• Gap Analysis Worksheet

• Gap Risk List

• Gap Risk Management Plan

• Gap Configuration Management Plan

• Potential Target Application Benefits and Risks Matrix

• Intelligent Enterprise Data Warehousing (IEDW) Assessment

Deliverable Content and Approach:

Introduction

Strategic Gap Analysis is a forecasting technique in which the difference between the desired performance levels and the extrapolated results of current performance levels are measured and examined. This measurement indicates what needs to be done and what resources are required to achieve the goals of an organization's strategy.

Approach

MetaVista will perform a strategic and tactical gap analysis assessment to cover the data warehouse, business intelligence and reporting environment. An enterprise data warehousing assessment greatly assists in examining where an organization is today and where it should be, and recommends how to initiate the process of designing an enterprise data warehousing solution.

Some of the recommended components of this assessment include:


• Ascertain the current environment and existing business needs.

• Compare the above with industry best practices and benchmark the organization against other organizations along the enterprise data warehousing maturity curve.

• Perform a gap analysis between where the organization is and where it should be.

• Recommend a series of actions that will move the organization towards an enterprise data warehousing solution.

The enterprise data warehousing assessment helps the organization regardless of where it is along the data warehouse maturity stages. If the organization is new to enterprise data warehousing, the assessment helps determine the enterprise's readiness for an enterprise data warehouse and provides recommendations for how it can get started. If the organization has reached the initial stages of enterprise data warehousing, it should aid in determining the specific steps needed to increase the business value of existing data.

Data Warehouse, BI and Reporting Gap Analysis

MetaVista will incorporate gap analysis and strategic roadmap exercises similar to those facilitated by the Gartner architecture framework process. Gartner’s process methodology identifies proposed actions to better align HHS people, processes, and technologies in preparation for implementing health services enterprise data warehouse initiatives. The MetaVista gap analysis and roadmap will address each gap by defining a readiness assessment focused on CDCR and CPHCS guiding principles and objectives, and on the continued, phased implementation of next-step initiatives.

Robustness Gap Analysis Techniques

Robustness analysis involves analyzing the narrative text of use cases, identifying a first-guess set of objects that will participate in those use cases, and classifying those objects based on the roles they play; it also helps partition objects within a Model-View-Controller (MVC) paradigm. Robustness analysis enables the ongoing discovery of objects and helps ensure that most of the major domain classes have been identified before any additional design or development begins.
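The bookkeeping behind robustness analysis is simple: each first-guess object is assigned one of the three robustness stereotypes, which map onto the MVC roles mentioned above. The sketch below is illustrative; the object names and their classifications are hypothetical examples, not artifacts from this project.

```python
ROLES = {"boundary", "control", "entity"}   # roughly view / controller / model

def classify(objects: dict) -> dict:
    """Group first-guess objects by robustness stereotype; reject unknown roles."""
    grouped = {role: [] for role in ROLES}
    for name, role in objects.items():
        if role not in ROLES:
            raise ValueError(f"unknown stereotype for {name}: {role}")
        grouped[role].append(name)
    return grouped

# Hypothetical objects pulled from the narrative text of a reporting use case.
use_case_objects = {
    "ReportRequestScreen": "boundary",   # user-facing surface
    "KpiScheduler": "control",           # coordinates the flow
    "PatientRecord": "entity",           # domain data
    "TreatmentHistory": "entity",
}
grouped = classify(use_case_objects)
print(grouped["entity"])
```

A review of the `entity` bucket against the domain model is one quick way to confirm that the major domain classes have indeed been discovered.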

5. Solution Design

Deliverable 9 CDCR and CPHCS Portal Design

Deliverable 10 CDCR and CPHCS Reporting Design

Deliverable 11 CDCR and CPHCS Business Intelligence Design

Deliverable 12 CDCR and CPHCS Data Marts Design

Deliverable 13 CDCR and CPHCS Enterprise Data Warehouse Design

Deliverable 14 CDCR and CPHCS Data Design

Deliverable 15 CDCR and CPHCS Software Design

Deliverable 16 CDCR and CPHCS Hardware Design

MetaVista SDLC Artifacts:

• Design Guidelines

• Functional Decomposition Model

• Business Process Performance Framework


• Business Unit Performance Framework

• Target Application Design Support for Business Processes

• Target Application Design Support for Critical Success Factors

• Enterprise Data Warehouse/Data Mart Performance Framework

• SDLC Design Process Gates Summary

• SDLC Design Process Gates Policy

• Detailed Architecture Specification (with Architecture Blueprints)

• Business Requirements (with Use Cases)

• Functional Requirements (with Activity, Sequence and Collaboration Diagrams)

• Functional Process Specification (with Design Blueprints, Data Models, Class, and Component Diagrams)

• Detailed Design Implementation Specification (Tools, Frameworks, etc.)

Deliverable Content and Approach:

Introduction

A successful solution design is founded on extracting and refining the best blend of business models, then crafting the most adaptable and governable architecture. That architecture should cohesively leverage best-of-breed technology solutions and licensable products that reduce custom engineering, and it should fall within the strategic, tactical, and operational resource capabilities of the project, allowing completion in time to market and well within budget.

Pre-baked technology solutions should never be anticipated or decided upon before project planning has been completed, reviewed, and approved by a governance committee. The same caution applies to making critical technology choices before the architecture blueprints and frameworks have been designed in a platform- and technology-agnostic model.

Until the true capabilities and business requirements are examined, it is not advisable to commit large amounts of money to a platform that might be overkill, difficult to integrate, or facing near-term obsolescence. As in the auto industry, the sales and marketing divisions of large technology firms will do anything to get rid of old inventory, and as often as not the development support costs can become a budget buster before the project even gets off the ground.

Portal Design

MetaVista will provide portal designs and concepts to develop outwardly simple and yet inwardly sophisticated virtual integrated workbenches and tools to run on a shared platform. The design would be role and discipline based, providing potentially significant decreases in time and expense for enterprise management processes to be more sustainable and efficient.

The types of portal design MetaVista envisions will be specific to the unique services that will be incorporated into the new BI data warehouse environment. Some of the ideas that MetaVista will present have proved highly successful in other industries that have matured the BI data warehouse environment for the last fifteen years.

MetaVista envisions creating this knowledge-space by using real-time information access to accelerate time to value, increasing the pace of BI innovation while reducing development time and costs.


Leveraging this process-driven strategy, the state correctional healthcare organization will be able to:

• Introduce innovative new KPI BI products and product variations to more levels within corrections health services' many diverse organizations.

• Virtually design and validate products without a single prototype mock-up portal.

• Front-load critical decision-making to reduce downstream time and costs.

• Increase business agility and responsiveness to changing customer requirements.

• Maximize productivity of global resources (external emergency services, prisons, jails, emergency equipment, healthcare services supply chain, etc.).

• Optimize processes to streamline KPI product development and ensure compliance.

• Provide seamless design for value chain enrichment to accelerate health services performance cycles.

The following are some basic portal ideas that could be adopted easily in a web services BI framework:

• Mock-up Portals that build ad-hoc KPI test reporting sessions.

• Navigational Portals that share KPIs in workflows already built in graphic representation views.

• Metric Analysis Portals that create and test advanced metrics on test data.

• Quality Control Portals that flag non-performing, redundant, and overlapping KPIs.

• Dashboard Viewing Portals that display 2D and 3D data visualization graphs and charts

• Drill-Thru Portals that expose OLAP Cube surfaces [Cognos & BO use pre-baked cubes]

• Drill-Down Portals that build summarization data sets from raw data

• Alert Portals that monitor and disperse alerts through various channels

Reporting Design

MetaVista will offer an array of reporting devices, from static reports to dynamic reports that trigger web-based alerts. MetaVista believes that the sooner the project can start reporting, even on sample data, the more intelligence will be available to influence the data warehouse architecture and design. After all, the report is the ultimate product: its accuracy and usefulness will serve as a proof of concept for the massive data warehouse investment. The ability to improve and integrate reporting is a huge task in its own right; finding the data “apples and oranges” that will not work together without massive data design is itself daunting.

Depending on how frequent the BI reporting needs are, it is initially prudent, in terms of design, to leave as much source data in place as possible and build multidimensional views through a distributed caching strategy to deliver a subset of business solutions quickly. At the beginning of the project it will be hard to discern which data is redundant and which are the BI nuggets. Even though a user can query data efficiently by using advanced vendor tools against existing application stores, even across competing platforms, MetaVista warns that if the reports are not designed from the adaptive business needs, the user will end up trapped into using canned, utilitarian reports that do not find the edges, disparities, and anomalies.

Another design issue that MetaVista needs to analyze and solve is the probability that operational legacy systems in place will continue to be on line and cannot be replaced in the


short development time needed to bring the data warehouse on-line. The design initiative will explore whether users can still derive effective reporting even if the operational platform consists of incompatible vendor technologies. Another hurdle could come from the business perspective, where strong divisional boundaries exist between management units, each retaining its own degree of control over independent data stores, file formats, and hard-copy documents.

Even though it would be ideal to have all the data marts capitalizing on an enterprise data warehouse with a common architecture, thus making reporting centralized, in reality it will take a considerable amount of time (years), design maturity, and development to reach that goal. Creating a more loosely coupled, temporary reporting hub that is immune to the ongoing technology development, changes, and maturation will produce a far more efficient report delivery service, especially in terms of person-hour commitment, allowing reporting solutions to be delivered to the business units more quickly as the data warehouse development progresses. When the data warehouse system is more mature, the enterprise reporting hub can then be rebuilt into a more centralized and normalized state.

What is paramount is for the report design strategy to shorten the otherwise unpredictable, long wait before users can start working with primitive BI data and learning how to build more intelligible reports. MetaVista will explore using stop-gap tools such as SQL Server Integration Services (SSIS), which can build packages to retrieve, aggregate, and clean large quantities of data (even from non-Microsoft technology stores), build the dimensional relationships in a memory cache, and present them to requesting applications via Microsoft SQL Server Analysis Services (SSAS).

In the past this was practical only for relatively small quantities of data; however, database vendor optimizations are making this scenario a realistic choice. The more time users have to analyze and access the data, practicing building smart KPIs, the greater the measure of success the data warehouse will prove to be. One of the killers of most data warehouse projects is waiting for data until almost the end of the project.
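The interim caching strategy described above — leaving source data in place while serving reporting from small in-memory multidimensional views — can be sketched with plain data structures. The record layout, dimension names, and measure below are invented for illustration; a production version would sit behind SSIS/SSAS rather than a Python dictionary.

```python
from collections import defaultdict

def build_cube(rows, dimensions, measure):
    """Aggregate `measure` over every combination of the given dimensions,
    producing a small multidimensional view keyed by dimension values."""
    cube = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dimensions)
        cube[key] += row[measure]
    return dict(cube)

# Rows as they might arrive, unaltered, from two different source systems.
rows = [
    {"facility": "A", "quarter": "Q1", "visits": 120},
    {"facility": "A", "quarter": "Q2", "visits": 135},
    {"facility": "B", "quarter": "Q1", "visits": 80},
]
cube = build_cube(rows, dimensions=("facility", "quarter"), measure="visits")
print(cube[("A", "Q1")])   # 120.0
```

Because the cube is rebuilt from the sources on each refresh, reporting can begin against sample data long before the warehouse schema is final.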

Business Intelligence (BI) Design

MetaVista will address the following BI development and design challenges in its design:

• Migration to new versions of Web-intelligence and Business Objects is a constant strain on technical support;

• Adding users in excess of 1,000 to 1,500 adds significant stress to the architecture design and in some cases can be prohibitive;

• Management constantly desires extensibility [expression] and extendibility [reach] to answer to the marketplace business demands;

• Adding more applications also adds stress to the architectural framework;

• Management desires to embed organizational key performance indicators (KPI) in the data layer and project up-to-date performance against them;

• Users who are not advanced KPI analysts always desire to view more and more operational numbers in a dashboard-like presentation, compensating for a lack of data-access training and knowledge of how multi-dimensional data views are created; one of the most commonplace bad practices is using indexes as dimensions rather than fact data.

MetaVista will also address the following BI Modeling Challenges in its documentation:

• One of the prime focuses of BI projects is building system architecture blueprints, along with analysis and justification of viable and alternative system configurations;


• Lack of reliable documentation can cause unnecessary challenges, especially during troubleshooting of issues that require encapsulated business object and component support;

• Migrations, re-factoring of metadata, integration, and real-time maintenance can be hampered by sparse information on why some parameters were set up in a certain manner, as well as on the justification and implementation of customized front-end components;

• Adding scalability and new technology can also be a contentious task when there are no diagrams showing how system components were deployed and traced; such diagrams should at least show exactly what hardware tuning and health checks, SQL query tuning, OS tuning, performance-metric tuning, and web/application server tuning are required to make the system operational;

• Look for alternatives to a stock RDBMS, especially for small strategic data marts that do not require the massive power and operational complexity of a full-blown Oracle installation. Some of the design alternatives to explore are hosting applications on the same or different servers, deploying SQL partitions, building replication rather than relying on one massive instance, optimizing by physically partitioning the data onto different nodes, using SQL clustering to decrease the amount of fine-grain data to be scanned, upgrading to newer releases in a programmable fashion, exploring index and inverted-index management, and enforcing full ACID compliance.

Best Practices for Creating Business Intelligence Applications:

• The design of schemas—both relational and for Analysis Services.

• The implementation of a data extraction, transformation, and loading (ETL) process.

• The design and deployment of client front-end systems, both for reporting and for interactive analysis.

• The sizing of systems for production.

• The management and maintenance of the systems on an ongoing basis, including incremental updates to the data.
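The ETL process named in the best-practices list has the same three stages regardless of tooling. The deliberately simplified sketch below uses in-memory lists where a real implementation would use SSIS packages and warehouse tables; the field names and validation rules are examples, not the project's actual schema.

```python
def extract(source_rows):
    """Extract: read raw rows from a source system (a list stands in here)."""
    return list(source_rows)

def transform(rows):
    """Transform: standardize values and drop rows failing basic validation."""
    cleaned = []
    for row in rows:
        if not row.get("patient_id"):
            continue                       # reject rows missing the key
        cleaned.append({
            "patient_id": row["patient_id"].strip().upper(),
            "diagnosis": row.get("diagnosis", "UNKNOWN").strip(),
        })
    return cleaned

def load(rows, warehouse):
    """Load: append conformed rows to the target table."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = [{"patient_id": " p100 ", "diagnosis": "asthma"},
       {"patient_id": "", "diagnosis": "n/a"}]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse[0]["patient_id"])   # 1 P100
```

Keeping the reject logic inside `transform` makes the "incremental updates" in the last bullet routine: each refresh runs the same three stages over only the new source rows.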

Data Marts Design

MetaVista's data mart architecture and design approach covers both generally accepted types of data marts: the subset data mart and the incremental data mart. MetaVista's approach mitigates the commonplace risk factors involving data marts, including but not limited to resolving aggregation strategies, data throughput, adaptable and reconfigurable topologies, scalability, replication and distribution, metadata conundrums, sustainability, and new-skills adoption.

Data marts participate to a higher degree in data mining than enterprise data warehouses, depending on the type of mining and the intended results. A server-sized data mart is very useful when a mining tool can define and relate different age groupings while analyzing flat-file patient data and multi-dimensional diagnosis data. Hard-file periodic treatment records record different ages as time moves on, causing standard queries to produce confusing historic results that disagree with the actual current age of the incarcerated patient. Data mining forms different relationships between the data elements by assuming a relationship between ranges of age, gender, and diagnosis.


There are a variety of data mining techniques available: associations or link analysis, sequential patterns, clustering, and classification. The search process can be either bottom-up (data-driven), examining fine-grain detail and building a final elemental set, or top-down, pre-baking a search solution through a refinement process that casts out non-conforming elements until the result meets or exceeds the desired thresholds.

Search engines are far more productive when unleashed on data marts rather than massive enterprise data warehouses, since data marts are already strategically designed to manage a smaller, optimized, and specialized subset of data; the search is not burdened with a complete, exhaustive global scan of all the data. There are several approaches and strategies, used individually or in a blend, for producing useful mining results and arriving at accurate conclusions rapidly; the most widely used are search graph strategies, exhaustive strategies, irrevocable searches, tentative searches, and hill-climbing strategies. The data mart design should be tailored to the types of searches considered optimal for the resulting data sets required to drive the BI portal UI to render graphs and charts.
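The age-range relationship described above is the simplest kind of association a mining pass over a data mart might surface: bucket patient ages into ranges, then count diagnosis co-occurrence per bucket so results stay correct as recorded ages drift over time. The records and diagnoses below are invented for illustration.

```python
def age_bucket(age: int) -> str:
    """Map an exact age to a decade-wide range, e.g. 34 -> '30-39'."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def associations(records):
    """Count (age range, diagnosis) co-occurrences across patient records."""
    counts = {}
    for rec in records:
        key = (age_bucket(rec["age"]), rec["diagnosis"])
        counts[key] = counts.get(key, 0) + 1
    return counts

records = [{"age": 34, "diagnosis": "hypertension"},
           {"age": 38, "diagnosis": "hypertension"},
           {"age": 61, "diagnosis": "diabetes"}]
print(associations(records)[("30-39", "hypertension")])   # 2
```

Because the grouping is by range rather than exact age, historic treatment records taken at different ages still land in comparable buckets.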

• MetaVista will install prerequisites for subset data mart success by focusing its design expertise on parent elements only, enforcing bidirectional data flow and maintaining and building adaptable and flexible topology. MetaVista will concentrate on key success factors specifically controlling scope management and enforcing metadata containment.

• MetaVista will install prerequisites for incremental data mart success by focusing its design expertise on parent data warehouse strategy, the architectural environment, wide-scale and long-term view, integration strategy such as the data mart to data warehouse road map and designing for accomplishable scope. MetaVista will concentrate on key success and sustainability factors especially by instituting a design that fulfills the hypothetical mission statement or “elevator test,” limiting uncontrollable source feeds to one mart at the design level, designing for good performance that doesn’t degrade with unpredictable usage, managing feature creep through compromise or by change of scope and by documenting auditable data mart deliverables.

Enterprise Data Warehouse Design

MetaVista's enterprise data warehouse architecture and design approach encompasses the whole enterprise, including the historical record of the business and the forensic evaluation of the source of all current-state and target-state data that will be stored in the enterprise data warehouse or dispersed to the front-line data marts, which will have the ability to change in quality, structure, and content for distribution.

MetaVista's approach mitigates the commonplace risk factors involving enterprise data warehouses, including but not limited to resolving the enterprise data model (EDM) structure, which includes a focus on scope and delivery architecture, business data classifications, generic entity relationships (ERM), logical application views, and physical data design.

The scope delivery architecture will focus on the topography and dispersal of derived data, reconciled data, and real-time data. Another approach MetaVista will build into its design principles is the analysis of time in the data model, using timestamps to track time at the field level and record level, represented in snapshots and lifetime cardinalities that provide the static view and the temporal view, respectively.
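Record-level time tracking of this kind is typically implemented by carrying validity timestamps on every warehouse row, so that a snapshot query (the static view) and a history query (the temporal view) read the same table. The sketch below is illustrative; the table layout, field names, and the open-ended sentinel date are assumptions.

```python
from datetime import date

# Each row carries valid_from/valid_to; the far-future date marks the current row.
rows = [
    {"patient_id": "P1", "housing": "Unit A",
     "valid_from": date(2009, 1, 1), "valid_to": date(2009, 6, 30)},
    {"patient_id": "P1", "housing": "Unit C",
     "valid_from": date(2009, 7, 1), "valid_to": date(9999, 12, 31)},
]

def snapshot(rows, as_of):
    """Static view: the records in effect on a given date."""
    return [r for r in rows if r["valid_from"] <= as_of <= r["valid_to"]]

def lifetime(rows, patient_id):
    """Temporal view: the full change history for one patient."""
    return [r for r in rows if r["patient_id"] == patient_id]

print(snapshot(rows, date(2009, 3, 1))[0]["housing"])   # Unit A
print(len(lifetime(rows, "P1")))                        # 2
```

Closing the old row's `valid_to` and opening a new row on each change preserves history without overwriting it, which is what allows the two views to coexist.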

MetaVista’s enterprise data warehouse architecture and design approach will be singular and centralized, on a much larger scale than strategic data marts. The design will also focus heavily


on data summarization strategies, data throughput, adaptable and reconfigurable schema topologies, scalability, downsizing, obsolescence of low value data, replication and distribution to data marts, resolution of metadata maintenance conundrums, sustainability and new skills adoption.

Oracle data warehouse software architecture and control features are considerably more complex and fine-grained than those of Microsoft SQL Server. Oracle clearly defines the division of tasks between its external server architecture and its internal server architecture.

Data Design

In most data warehouse development efforts, 80% of the time is dedicated to locating, extracting, filtering, scrubbing, and loading data; where quality is hampered by dirty and missing data, even this is not sufficient. If this process is not managed, other important elements of the project suffer, resources are tapped, and time and money become exhausted. The generally accepted rule of thumb is to double the estimated time needed for data extracts, scrubbing, and loading, with an additional contingency for resolving further data problems.

The complexity of the integration process grows rapidly as the number of subject areas and source systems increase. This fact incrementally compounds as the development goes through the stages of analysis, design, implementation, deployment and run-time operations.

As discussed earlier in MetaVista's Methodology Step 5: Source System Analysis, the quality and accuracy of data shared between health services and incarceration institutions will be a priority, both to maintain the most appropriate human care possible and to manage the escalating costs of healthcare. Well-designed data can be an asset; data that is inaccurate, untimely, or improperly defined can be a tremendous liability.

It is important to build a system of review and quality control into the initial design of the data. The process must address the cleanliness and timeliness of the source data, the update and delete processes, and the quality controls imposed on those accessing the data. The design must allow validation of the data standards being used. There must also be a process in the design for cleaning up the data, along with controls on how the data and updates are shared and on how the data is queried.

A reality check must be built into the design to detect poor-quality data. Data design problems can be detected when programs abnormally terminate with data exceptions; when clients experience errors or anomalies in reports and transactions; when an inherent distrust of the data develops among users; when there is confusion as to what the data represents; when updated data fails to load and users are forced to use old data; when data does not integrate across departments and platforms; when consolidation is impossible; when programs do not balance their output; and when system merges do not work.
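A "reality check" of this kind is usually automated as a small set of data-quality rules run against incoming rows, each failure reported with the rule it broke. The rules, field names, and thresholds below are illustrative assumptions only.

```python
from datetime import date, timedelta

def quality_report(rows, today=date(2010, 6, 7)):
    """Run basic completeness, validity, and timeliness checks on each row."""
    failures = []
    for i, row in enumerate(rows):
        if not row.get("cdcr_number"):
            failures.append((i, "missing identifier"))      # completeness
        if not (0 < row.get("age", -1) < 120):
            failures.append((i, "age out of range"))        # validity
        if today - row.get("updated", date.min) > timedelta(days=365):
            failures.append((i, "stale record"))            # timeliness
    return failures

rows = [{"cdcr_number": "X123", "age": 45, "updated": date(2010, 5, 1)},
        {"cdcr_number": "", "age": 200, "updated": date(2008, 1, 1)}]
print(quality_report(rows))
```

Running such rules continuously, rather than once, is what keeps poor-quality data from becoming embedded before users lose trust in it.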

Data design improvement is similar to triage. The priority is to improve the quality of the data that brings the most important benefit to the user community (usually 80%). Other criteria can also be used to decide what to fix and clean: unimportant data can be ignored, and data approaching obsolescence can be ignored, bypassed, or replaced with better-designed data elements.

Data re-design and repair costs will vary immensely across the different files and databases currently in existence. A current-state view of the data must be created and mapped to the target state where the data needs to be to supply the envisioned data warehouse system. The purification process consists of an importance check, the user's perception of quality, prioritization of purification, establishing standards, incorporating those standards, and providing a feedback loop to promote quality.


Bad data design and stewardship teaches the following costly lessons: bad decisions are made due to incorrect data; opportunities are lost due to unavailability and lack of credibility; and time and effort are wasted redoing and restarting bad programs as well as fixing problems. Overall, the consequences of bad data can be disastrous. MetaVista's roadmap and technology education will focus on resolving and eradicating these issues, avoiding the obvious pitfalls before they become embedded systemically in the newly desired BI and data warehouse culture.

Directory/Catalog

The concept behind the Directory/Catalog is to create a “control-center” of the enterprise data warehouse that ties the components and pieces of the warehouse environment together. Data design is only as good as the ability to retrieve and build results from it. In order to understand design it is important to understand where to put it and how to find it again. An active metadata catalog monitors the flow of metadata through the enterprise-wide data warehouse providing integration and access for the user community.

The Directory/Catalog initially captures and stores business metadata and logical metadata. This metadata describes the real data in business terminology, with associated business rules governing the context of the real data. Business metadata is created in the logical design process and then moved, within its context, to an active directory/catalog. Technical metadata is also captured in this process, including data source and transformation details as well as the target warehouse, binding the logical and physical mapping and maintaining the relationship structure. Simultaneously, environmental metadata is captured, including user profiles, security and access information, and physical data about the data marts, hardware server capacity, server network location, and performance.
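The "control-center" idea above boils down to one registry holding business, technical, and environmental metadata about the same warehouse element, so a user can navigate from a business term to its physical location. The structure, element name, and field names in this toy sketch are assumptions for illustration.

```python
# A single catalog entry ties together the three metadata categories
# described above for one warehouse element.
catalog = {}

def register(element, business, technical, environmental):
    catalog[element] = {"business": business, "technical": technical,
                        "environmental": environmental}

register(
    "inmate_patient_census",
    business={"definition": "Daily count of inmate-patients by facility",
              "owner": "CPHCS Reporting"},
    technical={"source": "legacy census extract", "target_table": "F_CENSUS"},
    environmental={"server": "dw-node-01", "access_roles": ["analyst"]},
)

# A typical lookup: where does this business term physically live?
print(catalog["inmate_patient_census"]["technical"]["target_table"])
```

An "active" catalog additionally hooks the ETL and access layers so these entries update as metadata flows through the warehouse, rather than being maintained by hand.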

Metadata Design

Metadata is an extremely important component of a successful data warehouse: it is the data that describes and manages the structure of the real data stored in the warehouse. Oracle stores much of its metadata in the data dictionary, but the dictionary falls short of the metadata required to manage the rich features of today’s decision support systems. The metadata design lifecycle consists of three synergistic stages: Collection, where metadata is identified and captured into a central repository; Maintenance, where processes are put in place to automatically synchronize the metadata with a changing data architecture; and Deployment, where metadata is provided to users in the right form and in the right tools.

Software Design

MetaVista will incorporate many different types of software design, distributed among the many software layers of the envisioned web-based intelligent data warehouse complex. These elements will be needed to drive a service-oriented architecture (SOA), which MetaVista will design to leverage all the services needed to drive a fully realized distributed BI, data warehouse, and data mart system.

MetaVista will recommend and design the appropriate client platforms to render the GUI on either a web-based thin client or a thick client that runs its own application suites. MetaVista will also design and recommend the distribution of web services to fulfill the reporting and BI needs.

At some later point in time, MetaVista will be in a position to make an informed judgment as to which technology family, or blending of technologies, to employ for this project.


Hardware Design

In the event that Oracle is the data warehouse technology for all the repositories, the Oracle product family may be implemented on multiple systems with options for data storage, fault tolerance, backup, and recovery. Each of these systems can be categorized in terms of broader hardware architectures, depending on the complexity of the architecture and the extent of the build and operational budgets. It should be noted that Oracle’s products, in return for their excellent level of technology, demand highly skilled DBAs and systems administrators as well as costly maintenance and upgrade programming.

MetaVista can offer an option to Oracle, as efficient medium-sized data warehouses have also been successfully deployed on Microsoft technology, which requires much lower build and operational costs but still delivers the desired business intelligence in production-quality systems. Microsoft has recently launched its SQL Server 2008 R2 based BI stack, which will eventually support cloud computing. This opens the possibility of building a lighter virtual data warehouse for the initial versions; if the need arises for greater scalability, a more powerful repository engine such as Oracle can be bolted on. MetaVista’s approach is to make each layer independent and replaceable.

Until the planning and architecture phases are completed, it is hard to assess which of the broader hardware architectures will suit this project. Oracle, like IBM (DB2 and Informix), addresses with its own unique technology the largest databases in use today. Microsoft addresses the mid-market with a cost-to-performance mantra: do not buy beyond one’s needs or means.

Oracle recommends certain complex and costly hardware concepts for its enterprise-wide data warehouse environment, as its technology is designed and optimized to run on symmetric multiprocessing (SMP) systems using multiple high-cost CPUs to maintain CPU cache coherence, on massively parallel processing (MPP) systems, and on Oracle clustering technology. Oracle does not recommend uniprocessor or non-uniform memory access (NUMA) system architectures other than for small, non-complex data marts.

What is paramount, before defining the breadth and depth of the hardware architecture and design, and irrespective of the vendor technology, is to establish objective evaluation criteria covering cost and required resources, striking a balance between adequate and best-in-class reliability, availability, performance, scalability (and de-scalability), and manageability.
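One way to make such evaluation criteria objective is a simple weighted-scoring model over the attributes listed above. The sketch below is illustrative only; the weights, the candidate architecture names, and the 1-to-5 scores are placeholders, not a recommendation for any vendor.

```python
# Illustrative weighted-scoring model; weights and scores are placeholders.
CRITERIA_WEIGHTS = {
    "reliability":   0.25,
    "availability":  0.20,
    "performance":   0.20,
    "scalability":   0.15,  # includes de-scalability
    "manageability": 0.10,
    "cost":          0.10,  # higher score = lower cost
}

def weighted_score(criterion_scores):
    """Combine per-criterion scores (1-5) into a single comparable number."""
    assert set(criterion_scores) == set(CRITERIA_WEIGHTS)
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in criterion_scores.items())

# Two hypothetical candidate architectures scored against the criteria.
high_end_smp = {"reliability": 4, "availability": 4, "performance": 5,
                "scalability": 5, "manageability": 3, "cost": 2}
midrange     = {"reliability": 4, "availability": 4, "performance": 3,
                "scalability": 3, "manageability": 4, "cost": 5}

scores = {name: weighted_score(s)
          for name, s in [("high_end_smp", high_end_smp), ("midrange", midrange)]}
```

Making the weights explicit forces the trade-off discussion (e.g., how much performance is worth relative to cost) to happen before, rather than after, a vendor choice.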

6. Technical Roadmap

Deliverable 17 CDCR and CPHCS Technical Roadmap Phase I

Deliverable 18 CDCR and CPHCS Technical Roadmap Phase II

Deliverable 19 CDCR and CPHCS Technical Roadmap Phase III

MetaVista SDLC Artifacts:

• Information System Technical Strengths and Weaknesses Assessment

• Technical Functional Decomposition Model

• Technical Design Model

• Decision-Support Technical Roadmap

Deliverable Content and Approach:


For MetaVista, technology planning is important for many reasons. MetaVista will formalize the technology roadmap process within its project development process, and will identify, recommend, and where appropriate develop the technologies required to meet the mission. Once identified, technology enhancements or new technologies can be developed and integrated internally with State IT or collaboratively with external partners. MetaVista believes that for either approach, technology road-mapping is an effective tool for technology planning and coordination, and that it fits within the broader set of MetaVista planning activities for the data warehouse. MetaVista will develop and document the technology road-mapping process, which can be reused by the State and its other technology partners.

MetaVista will build a quality technology road-mapping process that provides information to make better technology investment decisions by identifying critical technologies and technology gaps.

MetaVista’s technology road-mapping process consists of three phases: preliminary activity, development of the technology roadmap, and follow-up activity.

Preliminary activity includes:

• Satisfy essential conditions

• Provide leadership/sponsorship

• Define the scope and boundaries for the technology roadmap

Development of the technology roadmap includes:

• Identify the “product” that will be the focus of the roadmap

• Identify the critical system requirements and their targets

• Specify the major technology areas

• Specify the technology drivers and their targets

• Identify technology alternatives and their time lines

• Recommend the technology alternatives that should be pursued

• Create the technology roadmap report

Follow-up activity includes:

• Critique and validate the roadmap

• Develop an implementation plan

• Review and update

MetaVista’s concept of a best-of-breed technical roadmap involves the mapping of:

• Goals that increase sustainability, improve transparency, and involve governance at every step

• Challenges that must be addressed, including cost, diversity impact, technology deployment and integration, permanence, and state government acceptance thresholds.

• Research Pathways that examine storage, capture, breakthrough concepts and intuitive delivery methods.


7. Information/Data Architecture, Policies, and Standards

Deliverable 20 CDCR and CPHCS Information / Data Architecture

Deliverable 21 CDCR and CPHCS Information / Data Policies

Deliverable 22 CDCR and CPHCS Information / Data Standards

MetaVista SDLC Artifacts:

• Information/data architecture policy and standards governance assessment

• Information/data architecture policy and standards enforcement assessment

Deliverable Content and Approach:

MetaVista will initially review current state information/data architecture policies and standards to ascertain whether they are appropriate for the envisioned enterprise data warehouse that will handle discrete healthcare and incarceration data. An information/data architecture policy and standards assessment will be released documenting the findings. In the event that the current policies are not appropriate and need expanding and updating to accommodate the requirements, a revised policy paper will be drafted for approval.

The information/data architecture assessment will address the governing authority that is responsible for the statewide stewardship of information/data architecture policies and standards. An information/data architecture legal framework must be agreed to in order to maintain accountability, oversight and enforcement of said information/data architecture policies and standards. The information/data architecture purpose, budget and scope sections will be recommended by MetaVista and ratified by the state authority.

Information/data architecture policy and standards will be addressed as follows:

• MetaVista shall direct how the state utilizes information/data architecture target technologies, methodologies, standards, and best practices to develop, acquire, and/or implement application systems that collect, modify, and store data and report information.

• MetaVista will address the scope of how information/data architecture focuses on the process of modeling the information that is needed to support the business processes and functions of budget units, and more strategically, of communities of interest.

• MetaVista will, where applicable, span organizational boundaries to address interoperability, integration, consolidation, and sharing of resources by correlating business processes to common government services through the identification and definition of data/information relationships and dependencies.

• MetaVista will document information/data architecture outcomes, expressed in the form of data models, information flows, and analyses of inputs/outputs and decision-making criteria for the activities of state government.

• MetaVista, as part of its SDLC, will build and submit data model blueprints, along with other useful graphical representations, to produce as accurate a model as possible of the information needs and business processes.

• MetaVista will develop the data model as a framework for business re-engineering and the development of new or enhanced applications to fulfill business requirements and processes.


• MetaVista’s data modeling will describe the types of interactions and information exchanges that occur within and between budget units and their various customers, constituencies, and business partners.

• MetaVista will also review current state information/data architecture general principles to ensure that the planning, design, and development of the data/information architecture are guided by general principles that support the state’s strategic business goals and objectives.

Information/data architecture target technologies will be addressed as follows:

• MetaVista will control outcomes based on approved data models of the target data/information architecture and ensure they are reviewed and refreshed in line with business requirements, applicable statutes, and federal mandates and regulations.

• MetaVista will also document, advise on, and monitor appropriate shifts in technology; the emergence and adoption of new technology-related industry or open standards will be addressed as appropriate during reviews.

Information/data architecture standards will be addressed as follows:

• MetaVista’s information/data architecture standards define data and business modeling methodologies designed to promote program interoperability and to increase the efficiency and effectiveness of government services; consistent classifications of data; and secure, business-rule-based database access through software application systems.

Information/data architecture implementation standards will be addressed as follows:

• MetaVista will develop designs to maximize current investments in technology, provide a workable transition path to targeted technologies, maintain flexibility, and to enhance interoperability and sharing.

• MetaVista will ensure that information/data architecture implementations shall adhere to implementation strategies described in current and future amended statewide policies.

• MetaVista will also ensure that the enterprise architecture encompassing the information/data architecture is implemented in accordance with statewide IT security policies and applicable statewide standards for security.

Information/data architecture new design elements will be addressed as follows:

• MetaVista will address new requirements for custom-developed, application software and re-engineering projects requiring custom-developed application software.

• MetaVista will address new requirements for commercial-off-the-shelf (COTS) or externally developed, government-off-the-shelf (GOTS), application software involving significant modification to the core application software product.

• MetaVista will design for significant modifications, which pertain to process and structural changes made to application software, such as changes to underlying programming code, files, records, and data elements; additions or customization of reports, views, queries, etc. are not considered significant modifications and will simply be documented.

• MetaVista will ensure conformance of IT investments and projects to the enterprise-standards-based architecture, verifying conformance with all statewide standards, project investment justifications, and associated statewide policies and standards, in order to ensure the integrity and interoperability of information technologies.


MetaVista will document and justify in the appropriate document any variances from the established statewide standards, project investment justifications, and associated statewide policies and standards.

8. Data Security Architecture, Policies, and Standards

Deliverable 23 CDCR and CPHCS Data Security Architecture

Deliverable 24 CDCR and CPHCS Data Security Policies

MetaVista SDLC Artifacts:

• Data security architecture policy and standards governance assessment

• Data security architecture policy and standards enforcement assessment

Deliverable Content and Approach:

MetaVista will initially review current state data security architecture policies and standards to ascertain whether they are appropriate for the envisioned enterprise data warehouse. A data security architecture policy and standards assessment will be released documenting the findings. In the event that the current policies are not appropriate, and need expanding and updating to accommodate the requirements, a revised policy paper will be drafted for approval.

The data security architecture assessment will address the governing authority that is responsible for the statewide stewardship of data security architecture policies and standards. A data security architecture legal framework must be agreed to in order to maintain accountability, oversight and enforcement of said data security architecture policies and standards. The data security architecture purpose, budget and scope sections will be recommended by MetaVista and ratified by the state authority.

Data security architecture policy and standards will be addressed as follows:

• MetaVista will ensure that the guiding principles require as flexible and comprehensive an enterprise data warehouse architecture model as possible, so that the state’s data and information are valued and protected as critical assets of the enterprise, and that secure access to data and information is provided.

• MetaVista will ensure that data and information are shared securely statewide; that interoperability standards facilitate and support community-of-interest government initiatives and other business solutions; and that government solutions maximizing target architectures achieve optimal efficiency and effectiveness for the delivery of services.

9. Resource and Transition Plans

Deliverable 25 CDCR and CPHCS Staffing Resource Plan (Phases I, II, and III)

Deliverable 26 CDCR and CPHCS Transition Plan (Phases I, II, and III)

MetaVista SDLC Artifacts:

• Deployment Plan

• SDLC Transition Gates Summary

• SDLC Transition Gates Policy

• Vision Performance Measures Targets Matrix


• Critical Performance Matrix

• Information System Audit Grid Matrix

• Product Acceptance Plan

• Release Notes

• Envisioned Cultural Elements Report

Deliverable Content and Approach:

Staffing Resource Plan

• MetaVista will produce a Staffing Resource Plan addressing all phases. The purpose of the Staffing Resource Plan is to capture how staff resources will be managed throughout the life of the project, and in particular the staffing required to maintain and support the entire technology stack for all three phases.

• This Staffing Resource Plan identifies the processes and procedures used to manage staff throughout the project’s life. The plan describes the planning and acquisition of both state staff and consulting staff, describes the responsibilities assigned to each staff member, and discusses the transition of staff to other assignments.

Transition Plan

• MetaVista will produce a Transition Plan that describes how the entire technology stack for all three phases will be transitioned from the project team to full operational status, and integrated into ongoing operations and maintenance.

• The Transition Plan will be developed with active participation by impacted stakeholder groups (data center operations, network operations, IT security, application support, business area management, end users, and the like). We will ensure that the Transition Plan addresses all contractual transition requirements and that its overall scope addresses all impacted stakeholder groups. For each stakeholder (e.g., data center operations), we will ensure that the Plan is clear, complete, and at a level of detail appropriate for each transition event.

10. Business and Technical Implementation Plan

Deliverable 27 CDCR and CPHCS Business and Technical Implementation Plan (all phases)

MetaVista SDLC Artifacts:

• Business Use Case Specification

• Business Use Case Realization Specification

• System Requirements Specification

• System Architecture Document

• SDLC Implementation Gates Summary

• SDLC Implementation Gates Policy

• Software Development Plan

• Software-Architecture Document

• Hardware Requirements Specification


• Hardware Architecture Document

• Test Guidelines

• Test Evaluation Summary

• Test Plan

• Implementation Development Workshop Plan

Deliverable Content and Approach:

MetaVista’s team approach requires the full-time involvement of the entire project team, including end users of the data warehouse, technical staff from the state’s data warehouse team, and any additional consultants and third-party vendors. It cannot be over-emphasized that one of the primary keys to success is the committed participation, in the various workshops, of designated end users from a wide variety of levels across the organization.

MetaVista Data Warehouse Development Approach Plan

• MetaVista will document a data warehouse development approach plan. All perspectives on the project will be explicitly included, from executives’ requirements to end users’ needs and computing environment, as well as the data processing staff’s knowledge of the systems environment and its challenges.

• MetaVista’s approach maintains focus on the critical success factors along the development path: involving committed users and technical staff from the beginning; defining a clear scope to prevent paralyzing scope creep; securing early executive review and buy-in to ensure priorities are met; carefully configuring a platform that will enable rapid query response times; intensely scrutinizing the data loading and cleansing process to ensure data integrity from source to data warehouse; and documenting and training technical and production staff and end users to guarantee active use, refinement, and custodianship of the data warehouse.

MetaVista Data Warehouse Design Workshops

• MetaVista will facilitate in-depth data warehouse requirements and design workshops that are fundamental to the success of the project. These workshops bring the most active data warehouse users and technical staff into the same room for mock interaction sessions among users from across the organization, technical staff, project management, and the MetaVista team, in order to develop a broader, deeper understanding of the business and technical requirements, ensure the best possible design for the data warehouse, and develop ownership of the data warehouse at all levels of the organization.

• Following on from the workshop sessions, the technical members of the data warehouse team will tackle the systems design aspects of the project with a detailed design workshop. The Team will build a mockup data warehouse model, analyze for data transfer issues, and lay the groundwork for software and hardware requirements, as well as addressing any anticipated barriers to the active and ongoing use of the data warehouse.

• The implementation workshops will also refine the design further using model-based prototype testing. The workshops with end users and technical staff are essential to a successful data warehouse project. Full participation in these workshops is required of all persons identified on the project team, including the end users who will be most active, and specifically the state project manager and the state technical lead.


11. Data Cleansing, Extraction, Transformation, and Loading Plan

Deliverable 28 CDCR and CPHCS Data Cleansing, Extraction, Transformation and Loading Plan (all phases)

MetaVista SDLC Artifacts:

• ETL Process Assessment, Traditional or Conformed

• ETL Architecture High-level Visionary Specification (with Architecture Blueprints)

• ETL Business Requirements (with Use Cases, Activity, Sequence and Collaboration Diagrams)

• ETL Functional Process Specification (with Design Blueprints, Data Models, Class, and Component Diagrams)

• ETL Design Implementation Specification (Tools, etc.)

Deliverable Content and Approach:

ETL Architecture

• MetaVista strongly recommends that the ETL architecture be created before ETL tools are acquired; otherwise, ad hoc source-specific and non-source-specific processes become established without fundamental oversight of the long-term issues.

• MetaVista ETL architecture will homogenize data into a standard and consistent format as source system data is inherently different.

• MetaVista’s ETL architecture approach is to eliminate proliferating redundant ETL actions by distinguishing where to apply specific and non-specific business rules during the ETL load processes.

• MetaVista will plan a blended migration from the Traditional ETL architecture, in which one massive ETL operation performs all the logic, to a Conformed ETL architecture, which creates intermediary data definitions and standardizes the formatting of convergent source systems. The Conformed ETL offers modularization, reusability of post-conform logic, and extensibility; however, it does cause more objects and processes to be created and maintained.

• MetaVista will document an assessment report to evaluate the additional development overhead. It is important that, as the IDW environment grows and richer elements and features are added, the assessment address not only the short term but also the long-term target state.
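The Traditional-versus-Conformed distinction can be illustrated with a minimal sketch: each source carries its own small conform step into a shared intermediary format, after which all downstream logic is written once. The source names and field names here are hypothetical, not actual CDCR or CPHCS systems.

```python
# Each source gets a small source-specific conform step...
def conform_ehr(row):
    """Hypothetical EHR source: the patient id arrives as 'patient_id'."""
    return {"patient": row["patient_id"], "event_date": row["visit_date"]}

def conform_pharmacy(row):
    """Hypothetical pharmacy source: the patient id arrives as 'pt_no'."""
    return {"patient": row["pt_no"], "event_date": row["fill_date"]}

# ...and everything after the conformed format is shared and reusable.
def post_conform_load(rows):
    """Single shared rule set, applied regardless of which source a row came from."""
    return [r for r in rows if r["patient"]]

conformed = [conform_ehr({"patient_id": "A123", "visit_date": "2010-06-01"}),
             conform_pharmacy({"pt_no": "A123", "fill_date": "2010-06-02"})]
loaded = post_conform_load(conformed)
```

Adding a third source means writing one more conform function rather than duplicating the load logic, which is the modularization and extensibility benefit noted above, paid for by the extra conform objects to maintain.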

Extract, Transform and Load, or Data Population for IDW and BI

• Data population includes all the steps necessary to add data to the business intelligence data store, starting with the extraction of data from operational systems; mapping, cleansing, and transforming it into a form that is suitable for the IDW and BI application; and finally loading the data into the IDW and BI environment. All these steps together are usually referred to as the ETL process: Extract, Transform and Load.

Extract

• In order to update the BI application’s data store, it is important to know what data is required from the operational system. Then, programs can capture any additions or changes to the data in the source system. There are many options for extracting data from operational sources, such as total extracts of all the operational data, incremental extraction of a subset of data, or propagation of changed data.

• These extraction runs might only be possible when the source system is not actively using its data. This window when the source application is offline may also need to accommodate other housekeeping functions for the source system as well, such as backups, reorganizations, or other necessary tasks; this can reduce the availability of the data for the extraction processes.

• The extraction processes execute on the system where the source data is located, which may be a separate system from the one housing the IDW BI solution. Understanding where these tasks run is important when conducting a capacity planning effort.
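The extraction options above can be sketched as follows, using a last-modified timestamp as a simple change indicator. The sample rows and the timestamp-based approach are illustrative assumptions only; real sources may instead expose change logs for changed-data propagation.

```python
# Illustrative source rows with a last-modified stamp (an assumption for
# this sketch; the real change-detection mechanism is source-dependent).
source_rows = [
    {"id": 1, "value": "a", "modified": "2010-06-01"},
    {"id": 2, "value": "b", "modified": "2010-06-05"},
    {"id": 3, "value": "c", "modified": "2010-06-06"},
]

def full_extract(rows):
    """Total extract: pull all operational data on every run."""
    return list(rows)

def incremental_extract(rows, since):
    """Incremental extract: only the subset changed after the last run.
    ISO-format date strings compare correctly as plain strings."""
    return [r for r in rows if r["modified"] > since]

nightly_delta = incremental_extract(source_rows, since="2010-06-04")
```

The incremental form is what makes a short source-system offline window workable: only the delta, not the full table, has to move during the batch window described above.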

Transformation and Cleansing

• Business intelligence applications combine data from many independently developed systems, so the data must be transformed and cleansed to ensure that the data warehouse will have valid, consistent, and meaningful information. In this process, data that has been extracted from the source systems is standardized to a common format with consistent content. The transformation process is often very I/O intensive, and involves tasks such as:

a. Converting data in different formats to a single consistent format.

b. Data consistency checking to correct or remove bad data, such as misspelled names, incorrect addresses, and other inaccuracies.

c. Detecting changes in the source data over time.

• These processes need to be performed each time data is extracted from the source system, and as the number of source systems increases, the complexity of this task multiplies. These processes may run on any system where there is available capacity, and are not required to run on the same image with the business intelligence solution.
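Tasks (a) through (c) above can each be sketched in a few lines; the date format, the validity rules, and the sample rows are illustrative assumptions only.

```python
import re

def to_iso_date(raw):
    """(a) Convert MM/DD/YYYY dates to a single consistent ISO format;
    values already in another format pass through unchanged here."""
    m = re.match(r"(\d{2})/(\d{2})/(\d{4})$", raw)
    return f"{m.group(3)}-{m.group(1)}-{m.group(2)}" if m else raw

def is_valid(row):
    """(b) Consistency check: reject rows with blank names or non-numeric ZIPs."""
    return bool(row.get("name", "").strip()) and row.get("zip", "").isdigit()

def has_changed(previous, current):
    """(c) Detect change in source data between extracts (field-by-field compare)."""
    return previous != current

cleansed = [r for r in [{"name": "Doe", "zip": "95818"},
                        {"name": " ",   "zip": "unknown"}] if is_valid(r)]
```

In practice each such rule is applied per source system, which is exactly why the complexity multiplies as the number of sources grows, as noted above.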

Load

• The load component encompasses those processes that directly add the data to the data warehouse. Loads are processed periodically to keep the data in the system up to date: nightly, weekly, monthly, or as often as necessary.

• Once a file is created from the records to be loaded, the records can be sorted into partition-key sequence and then split into multiple load files to exploit load parallelism. It is recommended to design the ETL to run multiple load utilities in parallel, to fully exploit the server farm and I/O subsystem bandwidth.
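The sort-and-split preparation described above can be sketched as follows; the partition key and file count are arbitrary placeholders for illustration.

```python
def split_for_parallel_load(records, key, n_files):
    """Sort into partition-key sequence, then cut into n_files contiguous
    load files so n_files load utilities can run in parallel."""
    ordered = sorted(records, key=key)
    size = -(-len(ordered) // n_files)  # ceiling division
    return [ordered[i * size:(i + 1) * size] for i in range(n_files)]

records = [{"pk": 5}, {"pk": 1}, {"pk": 3}, {"pk": 2}]
load_files = split_for_parallel_load(records, key=lambda r: r["pk"], n_files=2)
# load_files[0] and load_files[1] can now be loaded by two utilities concurrently
```

Contiguous cuts keep each file within a partition-key range, so each parallel load utility touches a distinct slice of the target table.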

12. Data Warehouse Governance Plan

Deliverable 29 CDCR and CPHCS Data Warehouse Governance Plan

MetaVista SDLC Artifacts:

• Assessment of Current State IT Governance

• Governance roll-out plan, including adherence to SDLC and TOGAF


Deliverable Content and Approach:

Key Elements of an IDW Governance Structure

• To execute a governance program, an organization must combine three key elements: sponsorship, organization, and process.

• Ensure executive sponsorship. Successful data warehousing programs feature thorough executive sponsorship, i.e., solid, enthusiastic senior management backing. Without that, governance programs, like most programs, fall flat. Senior management must support the program and (most important) provide funding and access to resources for data initiatives.

Structure an oversight organization

• Establishing the appropriate team is another key governance component. Start with a governance board comprising senior business and technology contributors. They will ensure that the right people provide direction and have a vested interest in the success of data initiatives.

• Data owners, data stewards, and data beneficiaries are also critical to a data warehouse organization. Owners include groups that provide data to benefit the organization; they own the content and the corresponding definition of quality. Stewards manage data on behalf of the organization. They execute processes that support the organization’s SLAs. Beneficiaries receive value by using information. The governance model must consider brokering dialogue among these constituencies. Then, data collection, management, and use can achieve optimal value.

Establish lasting processes

• Once the proper sponsorship and organization are in place, fundamental processes can also be established. These focus primarily on alignment, prioritization, funding allocation, measurement, arbitration, and program management. The directed execution of these ongoing processes allows the data warehouse to provide clear near-term and optimal lasting value. Most data warehouses that fall short of expectations lack the process discipline to ensure success beyond the first few releases. This shortcoming often reflects an insufficient focus on and discipline in process management.

Imperatives, initiatives, and requirements

• A governance model must, first and foremost, consider the needs of the business (e.g., profitability growth, compliance, and operational efficiency improvements) as its primary driver. Framing the business problem first, apart from specific data-centric activities, provides clarity and helps teams focus on the proper areas. This can be achieved by defining imperatives, initiatives, and requirements.

• Imperatives represent the organization’s business requirements and objectives. Initiatives are data warehousing activities that help achieve imperatives. Establishing a direct correlation between both and leading efforts accordingly clarifies a data warehouse’s value proposition. The priority of business imperatives, and the effort required to accomplish them, drives the initiative-based program, and therefore keeps data warehouse efforts in lockstep with business needs.

• Requirements are specific details of what is needed to support business imperatives. Initiatives group requirements into executable projects, which are managed and directed by the governance structure.


Maintain High Performance

Data is extremely valuable for companies able to exploit it, despite the time, expense, and complexity involved. However, when one considers all the effort required to establish a highly functional, robust business intelligence system, constant, diligent, and vigilant governance only makes sense. Otherwise, entirely preventable problems may arise, hindering performance and reducing value.

13. Disaster Recovery Plan

Deliverable 30 CDCR and CPHCS Data Warehouse Solution Disaster Recovery Plan

MetaVista SDLC Artifacts: • Level of Protection Assessment

• IDW Mission-critical Disaster Recovery (DR) Plan

Deliverable Content and Approach: • MetaVista’s approach to disaster recovery planning consists of two basic sets of activities:

1. Working with CDCR and CPHCS stakeholders to establish recovery time objectives (i.e., time to recover business functions) and recovery point objectives (the point in time from which data must be recovered) for the data warehouse system(s), and

2. Working with technology service providers (most likely OTech for this particular project) to define, implement, and (ideally) test disaster recovery plans that will reasonably result in the recovery time and recovery point objectives being met during an actual disaster recovery event.
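As a hypothetical sketch of the arithmetic behind these two objectives, the check below expresses the relationship between backup frequency, restore time, and the agreed targets in a few lines of Python. The intervals and thresholds shown are invented for illustration, not CPHCS or OTech figures:

```python
from datetime import timedelta

def meets_rpo(backup_interval: timedelta, rpo: timedelta) -> bool:
    """Worst-case data loss equals the interval between backups, so the
    backup schedule satisfies the RPO only if the interval is no longer
    than the RPO itself."""
    return backup_interval <= rpo

def meets_rto(restore_estimate: timedelta, rto: timedelta) -> bool:
    """The estimated end-to-end restore time (hardware, database, ETL
    re-pointing, validation) must fit within the RTO."""
    return restore_estimate <= rto

# Hypothetical example: nightly backups against a 24-hour RPO,
# and an 8-hour restore estimate against a 12-hour RTO.
print(meets_rpo(timedelta(hours=24), timedelta(hours=24)))  # True
print(meets_rto(timedelta(hours=8), timedelta(hours=12)))   # True
```

In practice these checks are run against each component of the warehouse environment (database, ETL engine, BI servers), since the system-level objectives are only met when every component's recovery fits within them.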

• MetaVista’s disaster recovery planning approach assumes that most data warehouses are mission-critical systems. Many such systems now capture and update transactions on a real-time basis and support dozens of run-the-business applications. More importantly, as a decision-making engine, an intelligent data warehouse is critical to helping organizations respond in an optimal fashion when crisis strikes.

• Disaster recovery for data warehouses should focus on high-quality, up-to-date, end-to-end metadata, something that few organizations have successfully implemented. Metadata is critical for performing impact assessments, especially when something in a source system changes; the challenge is tracing how that change will affect every other component in the system, down to the metrics within end-user reports that are stored as favorites. If favorites are stored as hyperlinks and those hyperlinks are not recovered, the links will be lost.

• During a disaster recovery event, the ability to meet recovery time objectives (i.e., time to recover business functions) and recovery point objectives (the point in time from which data must be recovered) is severely compromised without access to a dynamic, comprehensive metadata management system.
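To illustrate the kind of impact assessment described above, the sketch below traces downstream dependencies through a lineage graph using a breadth-first search. The component names and lineage entries are invented for illustration; they are not drawn from any actual CPHCS system:

```python
from collections import deque

# Hypothetical lineage metadata: each component maps to the
# downstream components that consume it.
lineage = {
    "source.patients": ["staging.patients"],
    "staging.patients": ["dw.dim_patient"],
    "dw.dim_patient": ["report.sick_call_summary", "report.census"],
}

def impacted_components(changed: str) -> set:
    """Breadth-first search over the lineage graph to find every
    component affected by a change to `changed`."""
    impacted, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for downstream in lineage.get(node, []):
            if downstream not in impacted:
                impacted.add(downstream)
                queue.append(downstream)
    return impacted

print(sorted(impacted_components("source.patients")))
# ['dw.dim_patient', 'report.census', 'report.sick_call_summary', 'staging.patients']
```

The same traversal answers the recovery question in reverse: restoring a source table is only useful if every downstream component it feeds, through to end-user reports, is also recoverable.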

Preparation Assessment

• The disaster recovery plan must include not only the data warehouse, but the servers it runs on, and the report metadata and applications it supports. Disaster recovery planning is insurance, and most organizations must insure what they need, not just what they can budget.

• MetaVista will assist stakeholders in prioritizing business processes and applications that are critical to their operations. Even though the data warehouse will presumably be a top priority, the extract, transform and load (ETL) engines that populate the data warehouse and the BI servers that generate and distribute critical reports are equally important. Unless the design is flexible enough to be loosely coupled, the data warehouse environment can't be fully restored until every one of its components is brought back online.

• MetaVista will define test scenarios for the data warehouse disaster recovery plan, such as recovering from a database failure and restoring clients, servers, networks, storage, applications, and databases, to fully simulate a recovery event.

14. Service Level Agreement(s)

Deliverable 31 CDCR and CPHCS Service Level Agreement

Deliverable 32 CDCR EIS and OTech Service Level Agreement

MetaVista SDLC Artifacts: • Service Level Agreement Worksheet

• Level of Protection Assessment

Deliverable Content and Approach: • MetaVista will work with all applicable stakeholder groups to develop and gain approval for relevant Service Level Agreements (SLAs).

• MetaVista is not aware that a “standard” SLA template exists within the State, particularly at OTech which reportedly has few formal SLAs in place despite its many State customers. Accordingly, MetaVista will work with stakeholders to determine the overall format and content of each SLA, in addition to assisting the parties in negotiating the actual service levels.

• We anticipate that each SLA will address topics such as service availability, performance (e.g., application response time), problem management (e.g., service desk response time expectations), disaster recovery (recovery time and recovery point objectives), and the like.
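A service availability commitment of this kind reduces to simple arithmetic at reporting time. The sketch below is illustrative only; the 99.5% target is a hypothetical figure, not a negotiated CPHCS/OTech service level:

```python
def availability_pct(total_minutes: float, downtime_minutes: float) -> float:
    """Measured availability over a reporting period, as a percentage."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

# A 30-day month has 43,200 minutes; a hypothetical 99.5% target
# therefore allows at most 216 minutes of unplanned downtime.
total = 30 * 24 * 60
target = 99.5
measured = availability_pct(total, downtime_minutes=180)
print(round(measured, 2), measured >= target)  # 99.58 True
```

Agreeing in the SLA on exactly this kind of formula, and on what counts as downtime (e.g., whether planned maintenance windows are excluded), avoids disputes when the monthly numbers are reported.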

15. Training and Knowledge Transfer

Deliverable 33 CDCR and CPHCS Solution Training and Knowledge Transfer Plans (All Phases)

Deliverable 34 CDCR and CPHCS End User Training Phase I

Deliverable 35 CDCR and CPHCS End User Training Phase II

Deliverable 36 CDCR and CPHCS End User Training Phase III

MetaVista SDLC Artifacts: • Target Organization Assessment Plan

Deliverable Content and Approach: Solution Training and Knowledge Transfer Plan

• MetaVista has over ten years of knowledge transfer and training experience. We define knowledge transfer and training as the process of converting tacit knowledge into usable explicit knowledge while acquiring new knowledge and skills to maintain the system.


• Over the years, our staff have developed and overseen knowledge transfer efforts as part of implementing various types of projects. This includes many IT System development and IT System deployment projects.

• For this project, MetaVista will develop Solution Training and Knowledge Transfer Plans that define who will be trained (e.g., technical staff or end users), the training objectives for each group, and training delivery options (training materials, training event durations, frequency, and the like). Training will be delivered in phases, coordinated with the overall Data Warehouse implementation schedule.

• While ordinarily MetaVista considers the development of training plans and the delivery of the actual training to be two separate deliverables, we understand from the Questions and Answers (Question 34) that CDCR wishes training for technical staff to be included in Deliverable 33. Accordingly, having completed and obtained approval for the overall training and knowledge transfer plans, MetaVista will develop and deliver training for the technical staff as part of Deliverable 33.

End User Training

• MetaVista will develop and deliver end user training according to the approved Solution Training and Knowledge Transfer Plan (Deliverable 33, above).

• MetaVista will present all training materials to appropriate stakeholders prior to any training events to confirm that the training materials are technically accurate, sufficient for the task, and are aligned with the learning skills and preferences of the identified audiences.

• We anticipate that for each phase, four hours of classroom training will be required along with appropriate written materials. We understand that there will be a minimum of thirty staff that will require training, and accordingly we assume that between two and three training events will be required during each phase in order to accommodate the work schedules of all participants.

16. Other Data Warehouse, Business Intelligence, and Reporting Needs

Deliverable 37 CDCR and CPHCS Other Data Warehouse, Business Intelligence, and Reporting Needs, Item 1

Deliverable 38 CDCR and CPHCS Other Data Warehouse, Business Intelligence, and Reporting Needs, Item 2

Deliverable 39 CDCR and CPHCS Other Data Warehouse, Business Intelligence, and Reporting Needs, Item 3

Deliverable 40 CDCR and CPHCS Other Data Warehouse, Business Intelligence, and Reporting Needs, Item 4

Deliverable 41 CDCR and CPHCS Other Data Warehouse, Business Intelligence, and Reporting Needs, Item 5

As described in the RFO, MetaVista is including the above five deliverables to accommodate unanticipated tasks. We understand that these deliverables will be requested and approved at the discretion of the CPHCS CIO or designee(s), and that 50 hours will be allocated for each of these deliverables in the Rate Sheet (Exhibit B-1).


17. Monthly Status Reports

Deliverable 42 Monthly Status Reports

MetaVista will provide a monthly status report containing a high-level summary of our status. The summary will describe our overall progress, with an emphasis on completed and upcoming activities and milestones, and a general discussion of whether the project is proceeding as planned. Additionally, the monthly status report will summarize key (high-priority) issues and risks.

MetaVista assumes that the primary target audience for these reports will be CDCR senior management, Project Managers for related projects, and, potentially, oversight agencies. Accordingly, our objective in these monthly reports will be to provide a brief overview of key status information (ideally, no more than two pages in length) so that interested stakeholders can quickly review and understand the status of MetaVista’s activities, deliverables, issues, and risks without having to read through excessive detail.


Assumptions

• State and other non-MetaVista subject matter experts will be available as needed to work with MetaVista in completing all project deliverables within the schedule constraints identified in the RFO.

• State staff will review and provide feedback on project deliverables within five business days of their being submitted for review.

• The approved solution design will reflect multiple data warehouse environments, including potentially a development environment, a test environment, a quality assurance environment, and a production environment.

• Testing of the disaster recovery solution is not within the scope of this project.

• The production environment, and possibly all environments, will be hosted in an OTech-managed facility.

• The State will procure all hardware and licensed software required to implement approved solution designs.

• State staff will perform the initial physical installation of all hardware and network components within the appropriate data center(s).

• State staff will implement any changes to shared server, network, and/or storage infrastructure components required by this project. This includes the design, installation, modification, and/or configuration of any routers, switches, firewalls, virtual server pools, SANs, backup devices, and the like that will support the data warehouse solution on a non-exclusive basis.


MetaVista’s Expertise and Experience

Since 1996, MetaVista has specialized in delivering high-quality, cost-effective professional consulting services for our public and private sector clients. Services include:

• Project and program management

• Strategic planning

• Business analysis

• Facilitation

• IV&V

• IPOC

• HIPAA security consulting

• Data center consolidation and relocation

• IT strategic planning

• Feasibility studies

MetaVista has a proven track record of managing projects that are mission critical, technically complex, severely schedule constrained, and multi-vendor in nature. These projects include many large-scale private sector data center infrastructure projects, and some of the largest IT projects for the State of California.

Over the past fourteen years, MetaVista has served dozens of California state agencies, departments, commissions, boards, and other organizations, including the Department of Corrections and Rehabilitation. Other clients include some of the largest private sector firms in the country, such as Hewlett Packard, J.C. Penney, Microsoft, and Wells Fargo Bank.

We are proud of our business history with the State of California and work hard to continually uphold the high standards we have established and applied throughout our contracting engagements.

Representative Projects

Current and recent projects include:

• Currently providing Project Management services to California Prison Health Care Services for the Access to Care project, Ten Thousand Beds, and the Strategic Offender Management System (SOMS). The services provided include, but are not limited to, producing project documentation, weekly time recording, status reports, risk management, procurement and contract management, transition management, communications management, and quality management.

MetaVista holds the following certifications and contracts with the State of California:

• Certified Small Business #17002
• Info Technology CMAS # 3-09-70-1454C
• Info Technology CMAS # 3-08-70-0471F
• Info Technology MSA # 5-06-70-105

MetaVista is also on the:
• Sacramento County Vendor List
• CalPERS Spring Fed Pool


• Provided project management and technology consulting services to Wells Fargo Bank in support of multiple projects, each of which required detailed IT asset inventory information describing 16,000+ servers and over 1,000 applications in more than 100 data centers and server rooms nationwide. MetaVista personnel worked with senior management, enterprise architects and others to define the structure and content for an enterprise-wide configuration management database (CMDB), and developed the tools and processes for consolidating inventory data from multiple sources and soliciting missing data from server and application owners.

• Provided project management and organizational change management services as the CalWIN Implementation Deputy responsible for overall management of the technical and application infrastructure rollout, data conversion, training and transition and change leadership. Acted as liaison to the on-site WCDS Project Team for Pilot County (Sacramento and Placer) issues.

• Provided business analysis, requirements gathering and data/systems modeling services for the California Department of Food and Agriculture (CDFA) for their Emerging Threats Project. This project involved analyzing and documenting existing applications, data stores, and business requirements for over thirty unique business processes, ranging from animal disease control to the inspection and licensing of dairy facilities.

• Providing software development and database administration services since 2006 to the California Energy Commission in support of the CEC’s Program Information Management System (PIMS), a multi-tier ASP .NET/SQL Server application. During most of this time, MetaVista also provided Project Management services until this service was taken over by State staff.

• Providing HIPAA IT Consultant Services since 2004 to support and assist CalOHII in coordinating and leading the State’s implementation of the IT components of HIPAA. Activities include project planning, technical analysis and recommendations on the impact of HIPAA on IT systems, strategic planning for statewide HIPAA IT and NPI implementation; analysis and evaluation of HIPAA changes to the Short-Doyle Medi-Cal system; analysis and summary of major issues and risks identified in State HIPAA project management reports, and analysis of HIPAA-related emerging issues.

• Provided project management, enterprise-wide business analysis, requirements gathering, and technical writing services for the California Department of Corrections and Rehabilitation (CDCR). In accordance with OCIO guidelines, MetaVista developed CDCR’s Information Technology Capital Plan (ITCP) and Agency Consolidation Plan (ACP). Additionally, MetaVista developed Implementation Plans for execution of the ITCP and the ACP, provided facilitation services for CDCR’s IT Strategic Planning session, and created the Executive Summary and mapping to the Business Strategic Plan.

Several of MetaVista’s employees and partners are recognized experts in their respective fields; they have published technical papers, presented at national and global conferences, taught at the university level, and actively contribute to industry standards. MetaVista has been actively involved with the local chapter of the Project Management Institute (PMI) as a corporate sponsor, as board members and volunteers.


Customer References

Reference 1: Access to Care Project (CPHCS)

Project Title: Access to Care – Sick Call Project and Emergency Response Initiative

Project Begin Date: November 3, 2008

Project End Date: Ongoing

Reference Customer
Agency/Company: CPHCS Healthcare Services
Street Address: 660 J Street, Suite 295
City, State and Zip Code: Sacramento, CA 95812-4038
Contact Person: Carl Block
Contact Phone Number: (916) 322-0087
Email: [email protected]

Project Description: Briefly describe the nature of this project.

This project was for a complete redesign of the Sick Call processes, forms, and staffing models.

Specific Services Provided

Emergency Response Initiative – Project Manager Activities

Overall management of the Emergency Response Initiative including scope, planning, scheduling, monitoring and control. Responsibilities of the project manager include:

1. Manage team of Nurse Consultants responsible for the deployment of the Emergency Response Policy & Procedures across 33 CDCR institutions spread across the state.

2. Interface with CPHCS and CDCR training organizations to ensure deployment of Advance Life Support and Basic Life Support certification programs for Medical and Non-Medical staff, respectively.

3. Interface with CPHCS procurement staff to implement standardization process for emergency response related medical equipment purchasing.

4. Interface with stakeholders at Statewide, Regional and Local institution levels.

5. Liaison between all appropriately involved CPHCS and CDCR staff to ensure successful implementation of the initiative.

6. Identify and support all activities required to implement the initiative.

7. Convene a multi-disciplinary committee to ensure proper representation, communication and direction for the project from the CPHCS/CDCR organization.

8. Communicate with CPHCS project staff to ensure efficient and effective exchange of information and that important and timely decisions are made.

9. Oversee the resolution of all project issues and manage project risks.

10. Meet all required policies of the Project Management Office including development of required project documentation (Project Charter, Project Management Plan, etc.), weekly time recording and status reporting through the Clarity Project Management application.

11. Interface with other project managers to communicate risks and project dependencies that potentially impact the portfolio of CPHCS projects.

Access to Care – Sick Call Project – Project Manager Activities

1. Conduct a clinical process redesign including implementation and monitoring of at least one pilot program.

2. Determination of a new sick call staffing model.

3. Clinical policy and forms revisions.

4. Introduction of Information Technology systems to provide proactive, planned care to inmate-patients.

5. Correctional adult institution facility improvements.

6. Statewide implementation of a new sick call program and staffing model that includes clinical training and change management activities.

7. Work and collaborate with other PMs and stakeholders supporting the Access to Care Initiative and other CPHCS related projects.

8. Any other requests made by CPHCS’ Chief Information Officer (CIO).


Reference #2: IT Consulting Services for FASIMS (CDFA)

Project Title: IT Consulting Services for the Food and Animal Safety Information Management System (FASIMS), also known as the Emerging Threats (ET) Project

Project Begin Date: July 2007

Project End Date: August 2008

Reference Customer
Agency/Company: CA Dept of Food and Agriculture
Street Address: 1220 N Street, Room A-114
City, State and Zip Code: Sacramento, CA 95814
Contact Person: Dr. Annette Whiteford
Contact Phone Number: 916-654-0881

Project Description: Briefly describe the nature of this project.

The purpose of this project was to document the requirements associated with a proposed new Food and Animal Safety Information Management System (FASIMS), more commonly known as the Emerging Threats (ET) system. The FASIMS/ET system was needed in order to enable the CDFA and other agencies to respond more quickly to any potential food safety threat. The CDFA and other agencies were constrained by existing systems and processes that were not integrated, and in most cases were partially or completely paper-based. This prevented information from being rapidly collected, integrated, analyzed, and distributed in response to actual food safety threats, or in support of any other investigation or process improvement effort.

Specific Services Provided

IT Requirements Analysis

MetaVista’s responsibility under this contract was to perform a thorough exploration of the proposed new FASIMS/ET system with the intention of discovering (and often inventing) the functionality and behavior of the system, and documenting the requirements for the system. MetaVista participated in numerous meetings with CDFA personnel over the course of nearly a year to document and analyze existing business processes, model the desired FASIMS/ET functionality via written Use Cases and other means, and ultimately document the requirements.

The primary deliverable from this effort was a Software Requirements Specification, a 177-page technical specification containing 384 software requirements and extensive supporting data. This document was used as the basis for a subsequent RFP and contract for development services for the FASIMS/ET system. Note that MetaVista was not the development vendor.


IT System Analysis

As a prerequisite to documenting the requirements for the new FASIMS/ET system, MetaVista performed and documented the results of an analysis of over thirty existing business processes, many of which were supported by multiple IT applications. Often, these existing systems were not fully documented, not integrated with other related applications, and partially duplicated the business functionality present in other applications. For example, many of the existing IT applications included information describing the names, addresses, and telephone numbers of California producers and/or veterinarians; each such contact database tended to have slightly different data, and each was generally created and maintained separately from the rest.

MetaVista’s analysis of these existing systems included a description of each system, including major inputs and outputs and data repositories, along with a description of the business use cases supported by each. This analysis made it possible for MetaVista and CDFA to identify business functionality not supported by existing applications, functionality that was duplicated in other applications, and in general to discover requirements for the new FASIMS system that might not otherwise have been captured.

IT Process Reengineering

MetaVista assisted CDFA staff in identifying opportunities to improve the efficiency and effectiveness of existing business processes within the scope of the project. Specifically, MetaVista devoted a significant amount of effort to reengineering common work processes and process-related data such that the processes were generic, and thus could be applied to multiple business processes. For example, existing business processes and data that supported multiple non-integrated repositories of information describing animal operations and contacts were generalized such that the new system could provide a single web-based repository with a common user interface, usable by many different processes.

Database Design

As part of the System Analysis and Requirements Analysis described above, MetaVista reviewed and documented existing data and records management practices for approximately thirty existing business processes and applications. Working with CDFA stakeholders, MetaVista documented Entity-Relationship Diagrams (ERDs) for existing applications (to the extent this information was available), and developed a proposed data model as part of the Software Requirements Specification. In addition, MetaVista documented data-related requirements, including those relating to system capacity, data security, data integrity, and data retention.


Reference 3 – Configuration Data Service (Wells Fargo Bank)

Project Title: Configuration Data Service

Project Begin Date: July 2005

Project End Date: December 2005

Reference Customer
Agency/Company: Wells Fargo Bank
Street Address: 1111 Main Street, Third Floor
City, State and Zip Code: Vancouver, WA 98660
Contact Person: Mr. William Tomko (retired)
Contact Phone Number: 503-759-4682

Project Description: Briefly describe the nature of this project.

The purpose of this project was to establish a temporary, near-term IT data center inventory data collection service in support of multiple projects, and also to serve as a pilot for the long-term System of Record / Configuration Management Data Base (CMDB) enterprise strategy.

Project benefits:

• Significantly reduce the overall time and cost of collecting, integrating and maintaining data center inventory information (servers, applications, etc., including physical locations, data owners)

• Reduce impacts to internal lines of business due to frequent requests for the same or similar data

• Ensure that the data collected is consistent, and compatible with the planned future enterprise CMDB product

• Provide “head start” for the future data center inventory Systems of Record; consistent and complete inventory data collected in advance

Specific Services Provided

IT Strategic Planning

MetaVista worked with senior management, the Enterprise Architect, and other enterprise stakeholders to define and document the need for an integrated strategy, as well as the proposed strategy itself, for: collecting, storing and maintaining information describing tens of thousands of IT assets located in more than 250 locations nationwide; transitioning from largely manual data collection processes to automated processes; and transitioning the data within approximately one year to a proposed new enterprise CMDB product.

IT Requirements Analysis

MetaVista worked with stakeholders from thirteen other projects in multiple locations nationwide, each with a need for current and accurate information describing data center assets; five projects in particular had significant overlap in data requirements. MetaVista worked with these stakeholders to define data requirements that addressed the needs of all projects, and to resolve conflicts in data requirements (e.g., different strategies for describing and storing information describing various assets). These integrated requirements were the basis for an enterprise inventory data model (see Database Design, below).

Database Design

MetaVista developed, in conjunction with the Enterprise Architect and his staff, a data model that incorporated the data requirements of all known projects with a need for information describing data center servers and applications. This model included extensive information describing the physical location of these assets, ownership, and the complex relationships between hardware assets, applications, virtual server pools, and the like. The model was designed to be consistent with both near-term project data requirements (see IT Requirements Analysis, above) and the long-term strategy to migrate to an enterprise CMDB product.

Migration Planning

As part of the Strategic Planning associated with this project, MetaVista worked with senior management, the Enterprise Architect, and others to plan the transition from multiple independent data repositories (with a wide variety of data models and manual data collection and maintenance processes) to a centralized, documented, partially automated, and sustainable process that would eventually transition to an ongoing organizational unit. As part of this effort, MetaVista assisted in planning and coordinating modifications to the existing Asset Management system to accommodate new data records and fields identified in the Requirements Analysis, in support of the eventual migration of data into the enterprise CMDB.
IT Data/Records Management

Early in the project, MetaVista provided data management services by soliciting data center inventory data from multiple teams that had already collected data for other purposes, or were in the process of collecting data, and integrating this data into a common data model. MetaVista performed quality reviews of the incoming data, and followed up with data owners on a recurring basis to solicit missing information and to correct invalid information that was defined as critical in the data requirements.

IT System Planning

As part of this project, MetaVista assisted client stakeholders in planning for and acquiring the IT resources (physical servers, software, staffing, and the like) to host the new enterprise inventory data repository and to keep the inventory data current over time. This effort included providing capacity estimates (estimated number of records) for the repository. MetaVista also coordinated and participated in planning modifications to the existing Asset Management system to include new data elements identified through the Requirements Analysis and Database Design efforts (see above), and planning for the installation of automated data gathering agents (server-based software tools).


Consultant Qualifications

The MetaVista Team

For this project, MetaVista offers:

Resource Name MSA Classification Hourly Rate

Charles Ritchie, PMP Senior Project Manager $120

Alexander Doré Senior Technical Lead $110

Suresh Chellappa, PMP Senior Technical Lead $110

Nipesh Shah Senior Programmer $90

Errol Thomas, PMP, CBAP Technical Lead $100

TBD2 Technical Lead $100

Each team member is introduced below, followed by a table that maps each team member’s role against the High-Level RUP Role from MetaVista’s proposed methodology.

Team Member Introductions

• Mr. Charles Ritchie, PMP will serve as MetaVista’s Project Manager and Engagement Manager for this effort, and will also assist in developing selected deliverables.

Mr. Ritchie is the founder and currently the CIO of MetaVista, and has extensive, relevant experience in project management, project oversight, software engineering, technology training, and large-scale technology deployment projects.

• Mr. Alexander Doré has more than fifteen years of experience in the IT industry, and has served in pivotal architectural, programming, and project development roles with some of the most prestigious organizations in technology development, such as CISCO and Boeing.

Mr. Doré’s services have ranged from architecting complex enterprise systems to developing individual components of n-tiered B2B and B2C architectures. He has supported Business Intelligence exploration from the user experience through system realization and real-time operations. He strives to employ best practices and principles to maintain accountability, and seeks optimization to meet stakeholder goals.

• Mr. Suresh Chellappa is a senior IT consultant with over eleven years of experience in diverse industries ranging from Financial, HealthCare, Government (State of California), Manufacturing, and IT to Telecommunications. This includes over seven years of Business Intelligence (BI) Data Architect experience coupled with hands-on BI leadership/PM experience handling multiple projects and go-to-market BI/DW projects, planning and executing both deliverable-based and time and materials (T&M) based projects.

2 The list of MetaVista team members shows TBD for one of the Technical Leads. This is a placeholder for our implementation engineering support (hardware and networking) candidate. MetaVista anticipates that implementation activities are sufficiently far in the future that we cannot reliably propose any specific candidate. MetaVista has access to engineers with a broad range of experience; we will select and present a candidate for CPHCS approval when needed.

• Mr. Nipesh Shah is a senior application developer with over eight years of experience in Microsoft technologies, spanning the analysis, design, development, and maintenance of business applications, including enterprise-level client/server applications, object-oriented programming, testing and process documentation, and Web/Win Forms.

• Mr. Errol Thomas, PMP, CBAP is a principal consultant with over thirty years of experience in the information technology field, where his roles have included Independent Project Oversight Consultant, Project Manager, Implementation Coordinator, Business Analyst, Lead Technical Analyst, and Trainer.

Mr. Thomas has participated in all phases of the project life cycle. He earned his Project Management Professional (PMP) certification in 2003 and his Certified Business Analysis Professional (CBAP) certification in 2007. He has also earned certifications as a Certified Enterprise Architect and Microsoft Certified System Engineer.

Additional Support from MetaVista MetaVista has the ability to rapidly adapt to changing project conditions and deliver needed services at the appropriate time, thus helping our consultants to achieve a successful completion to each project. We provide all our consultants and contractors with web-accessible groupware services, conference call services, and a library of templates and past project deliverables to help them to be as productive as possible. Furthermore, in addition to our own internal consultants and administrative support staff, MetaVista maintains an extensive network of independent consultants and peers in the local consulting community, so that we can provide anything from simple administrative support to additional consulting skills and experience on demand.


Team Member Roles Note: the table below shows TBD as one of the resources under Network / Security Engineer. This is because MetaVista anticipates that the network engineering activities are sufficiently far in the future that we cannot reliably propose any specific candidate. MetaVista does however have ready access to Network Engineers with a broad range of experience; we will select and present a candidate for CPHCS approval when needed.

RUP Role Candidate(s)

Project Manager Charles Ritchie

Chief Architect Alexander Doré

Web Services Developer Alexander Doré

ETL Architect / Data Engineer Suresh Chellappa

DW Database Configuration Test Eng Suresh Chellappa

Hardware Configuration Test Engineer Suresh Chellappa / Errol Thomas

Middleware SOA Framework Engineer Alexander Doré

GUI Portal Design/Implementation Eng Alexander Doré / Nipesh Shah

Network / Security Engineer Alexander Doré / Charles Ritchie / TBD

Business Integration Analyst Charles Ritchie / Errol Thomas

Workshops / JAD Sessions / SME Nipesh Shah / Errol Thomas

Systems Integration Analyst Nipesh Shah / Errol Thomas


Mandatory Qualifications Mandatory Qualifications Response

1. IT career certification acknowledging skills and competency in area of IT Project development and/or support activities.

Mr. Ritchie, Mr. Chellappa, and Mr. Thomas all hold the Project Management Professional (PMP) certification from the Project Management Institute. Mr. Ritchie holds or has held the following certifications: ITIL v2 and v3 Foundation, Microsoft Certified Technology Specialist (MCTS), and Certified Data Processor (CDP). He recently passed the Certified Information Security Manager (CISM) test, but the certification itself is pending. Mr. Doré has earned a variety of professional certifications over his long career, including Oracle7 and Oracle8 DBA certifications, ISO 9000 Program Collaborator, and SEI CMM Level 3 Auditor. Mr. Chellappa is Informatica certified. Mr. Thomas holds the Certified Business Analysis Professional (CBAP) certification, as well as the Certified Enterprise Architect (CEA) and the Microsoft Certified System Engineer (MCSE).

2. Minimum of five (5) years experience performing business intelligence analyses including development and maintenance of data warehouse for a large public and/or private organization. (For example, experience with online analytical processing, extraction, transformation and loading, performance management, predictive analytics, etc.).

Mr. Doré has over ten years of experience designing and implementing Business Intelligence (BI) projects in healthcare, aerospace flight test, and aircraft manufacturing OSS, serviced by massive data warehouses and specialized data marts.

Mr. Chellappa has over seven years of Business Intelligence (BI) Data Architect experience coupled with hands-on BI leadership/PM experience.

3. At least three (3) years of experience in a lead capacity supporting data modeling, system modeling, application system analysis, cubes and report testing.

Mr. Doré has at least eight years of experience in the specified areas. He has designed KPI frameworks for integrated delivery and complex cube designs; exposed drill-down and drill-through cube frameworks and Cognos cubes; and performed specialized schema modeling in star, star/snowflake, and starburst neural network data mining pattern recognition schemas.

Mr. Chellappa has over seven years of experience with all phases of Data Warehouse development including prototyping, data modeling, analysis, design, development, testing, documentation, and training end-users.
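As an illustration of the star-schema modeling and drill-down querying referenced in the responses above, the following sketch builds a minimal fact/dimension star schema in SQLite and runs a roll-up aggregate. The table and column names are hypothetical examples for illustration, not drawn from any client engagement.

```python
# Minimal star-schema sketch: one fact table joined to a dimension table,
# with a roll-up GROUP BY. A drill-down path would move from region down
# to facility and then month. All names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: facility attributes. Fact table: visit counts by facility/month.
cur.execute("CREATE TABLE dim_facility (facility_id INTEGER PRIMARY KEY, region TEXT, name TEXT)")
cur.execute("CREATE TABLE fact_visits (facility_id INTEGER, month TEXT, visits INTEGER)")

cur.executemany("INSERT INTO dim_facility VALUES (?, ?, ?)",
                [(1, "North", "Facility A"), (2, "North", "Facility B"), (3, "South", "Facility C")])
cur.executemany("INSERT INTO fact_visits VALUES (?, ?, ?)",
                [(1, "2010-05", 120), (2, "2010-05", 80), (3, "2010-05", 50)])

# Roll-up: total visits by region.
cur.execute("""
    SELECT d.region, SUM(f.visits)
    FROM fact_visits f JOIN dim_facility d ON f.facility_id = d.facility_id
    GROUP BY d.region ORDER BY d.region
""")
by_region = cur.fetchall()  # [("North", 200), ("South", 50)]
```

A snowflake variant would further normalize the dimension (for example, splitting region into its own table); the query pattern stays the same with an additional join.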

4. Experience with industry standard business intelligence (BI) platforms (e.g., SharePoint, Oracle BI suite, SAS BI, etc.)

Mr. Doré has worked with Oracle Data Warehouse, served as the requirements specialist for Oracle8 beta spatial data requirements, and has worked with MS .NET and SQL Server 7 implementing MOLAP for over 500 healthcare institutions.

Mr. Chellappa has more than a decade of experience with a wide variety of industry standard database servers, OLAP tools, CRM products, ETL tools, GUI development tools, and the like. These include tools and products from Oracle, Microsoft, Siebel (now Oracle), Cognos, and more.

5. Knowledge of SQL, TSQL and/or PL/SQL, and MDX.

Mr. Doré is a certified DBA in Oracle7 and Oracle8. He has at least four years of experience with Oracle products, and at least four years of Microsoft SQL Server experience building databases, designing queries, and the like.

Mr. Chellappa has been a database developer and architect for over a decade. His knowledge of scripting and markup languages includes SQL, PL/SQL, HTML, DHTML, ASP, VBScript, FrontPage 2000, and JavaScript.

Mr. Shah has approximately eight years of experience with a variety of databases and data access tools, including Microsoft SQL Server 2000/2005, Oracle 8.1/10G, MS Access, LINQ, ADO.NET, and ADO.

6. Knowledge of State IT policy and governance processes.

Mr. Ritchie has hands-on experience with a variety of projects for State of California clients, including CDFA, EDD, and CDCR. In addition, he has served as an internal MetaVista Program Manager for an extensive series of projects with the State since 2004.

Mr. Thomas is a former State employee, and has worked almost exclusively with the State as a technology consultant for nine years.

Mr. Doré recently crafted a Governance program for WellPoint covering multi-state integration of Anthem and BCBS systems and upgrades. Mr. Doré also worked with the State of California on OSHPD 2 Certification, mapping CA State reporting requirements to the Federal Government model.

Mr. Chellappa has worked on projects for OSHPD and Caltrans.

Desirable Qualifications The following responses are indicative of the wide variety of technical and project experience represented by the MetaVista Team.

Desirable Qualifications Response

1. Experience with architecture and design of data marts.

Mr. Doré has over ten years of experience optimizing data marts for BI. Mr. Chellappa has developed multiple data marts over the course of several years for the State of California, Barclays Global Investors, Agilent/HP, and others.

2. Experience in health care technical deployment initiatives.

Mr. Doré has been involved with various healthcare initiatives including HIPAA roll-out and Medicare part D. Mr. Chellappa has worked since July 2008 on a clinical data warehouse for Kaiser Permanente. Mr. Thomas has past project experience with Sacramento County Health and Human Services, serving as Project Manager and Lead Business Analyst for a large county-wide clinical practice management deployment.

3. Experience with projects supporting correctional environments and processes.

Mr. Ritchie and Mr. Thomas both have recent experience working on projects for CDCR. Additionally, MetaVista currently has three consultants working with CPHCS.

4. Experience working with State staff and State executives.

All proposed MetaVista team members have experience working with State staff on projects for the State of California or projects which involved State stakeholders. Mr. Ritchie and Mr. Thomas have experience working with State executives, including State CIOs and other senior managers. Mr. Doré has worked with many branches of government, from US Congressional and the Pentagon to Federal and non-California State agencies.

5. Ability to work in a team environment as well as independently.

All proposed MetaVista team members are experienced, senior-level consulting professionals, with at least ten years working with a wide variety of projects – including virtual teams and international teams. All team members are able to work productively as part of a team as well as independently.


Resumes

Charles A. Ritchie, PMP, MCTS Project and Program Manager with thirty years of data processing experience, primarily in the financial services and insurance industries as well as multiple California state agencies. Experienced in planning and leading major data center projects, establishing successful operating methodologies, designing and implementing new software systems, creating project proposals and specifications, and professional education. Broad scope of responsibilities includes project and program management, project oversight and IV&V, team building, motivating, supervising technical/production staff, testing, documentation, budgeting, scheduling, and training. Strong record of success in increasing efficiency, establishing standards, and improving profitability.

AREAS OF EXPERTISE

• Large-scale, multi-site, multi-vendor technology projects

• Enterprise data center strategies and architecture

• ITIL Service Delivery and Support
• Project Oversight and IV&V
• Software development project management, including software requirements specifications
• Data modeling, particularly in support of IT asset inventories and enterprise CMDBs
• Server consolidation
• Data center relocations and consolidations
• IBM mainframe products and technologies
• Business case development for IT projects

PROFESSIONAL ACCOMPLISHMENTS

METAVISTA CONSULTING GROUP, Sacramento, CA Dec 1996 – Present
CIO, Director and Principal Consultant (Jun 2009 – Present)
• Direct and manage computing and information technology strategic plans, policies, programs, and schedules for the firm’s internal computer services, voice and data communications, and management information systems to accomplish its corporate goals and objectives.

• Direct and manage business development and service delivery for customer contracts related to data center and software engineering projects.

CA Dept of Corrections and Rehabilitation (May 2009 – Present)

• Served as consultant and subject matter expert for data center consolidation projects in the development of the first annual Agency Consolidation Plan and related deliverables, including a detailed Implementation Plan for multiple consolidation initiatives.

President and Senior Consultant (Dec 1996 – May 2009)
• Provided overall management and leadership for MetaVista, including corporate strategy, financial management, marketing, proposal development, and the like.

• Served as internal Program Manager since 2005 for MetaVista’s portfolio of consulting engagements, generally six to ten concurrent projects for a variety of customers at any given time.

• Co-authored two public classes on Project Management topics: “Preparing for the PMP Exam” (a two-day class) and “Project Management Fundamentals” (a three-day class).


• Instructor for several Project Management courses sponsored by the Sacramento Chapter of the Project Management Institute (PMI).

CA Employment Development Dept (Mar 2008 – May 2009)

• Served as an Independent Project Oversight Consultant for the Automated Collection Enhancement Project (ACES) at the California Employment Development Department (EDD), providing oversight activities structured to determine whether formal project management processes are in place and being followed, and whether project objectives are being achieved as defined in the project plan.

CA Dept of Food and Agriculture (July 2007 – Aug 2008)

• Led a team of four consultants performing research consulting services for the purpose of documenting as-is business processes and documenting software requirements in support of the Emerging Threats project. Requirements for the proposed statewide, web-enabled application addressed a wide variety of business functions including contact management, case management, processing veterinary lab test results, licensing and certification, inspections, training, and more.

CA Public Utilities Commission (May 2007 – Oct 2007)

• Served as a member of a project team developing a Feasibility Study Request (FSR) for the Consumer Protection and Safety Division. Activities include performing quality reviews on key deliverables, including all business process reengineering deliverables as well as the final FSR.

Wells Fargo Bank (Nov 2002 – Dec 2006)

• Provided project management and data modeling support for an enterprise-wide IT asset inventory effort that consolidated and validated existing inventory data from multiple legacy repositories into a single database, and coordinated ongoing data collection activities among multiple independent projects to ensure data consistency and to reduce redundant effort. The scope of this effort covered more than 1,000 applications and 15,000+ servers, located in dozens of facilities around the United States. This effort was planned in conjunction with, and later transferred to, a new ITIL-based enterprise CMDB team.

• Provided project management and technical consulting services in the areas of Enterprise Architecture and technical process improvement in support of a strategic initiative to reduce costs and improve operational efficiencies at a large national bank. The project involved developing new hardware and system software standards, and related lifecycle management processes for implementing and maintaining these standards over time.

• Provided project management and technical consulting services in support of an enterprise-wide server consolidation effort. Services included representing the client as the overall Project Manager on a team that included a major hardware vendor and other consultants. Worked with senior executives to define objectives, schedule estimates, technical strategies, and other key deliverables. Assisted in defining and resolving strategic issues at the executive level.

• Provided project and program management as well as technical consulting services in support of a long-term strategic alignment of three large-scale mainframe data centers. Worked with senior executives (CIO and others) to define objectives and priorities, then define a thirty-month strategy for completing the project. Assisted in developing Statements of Work, high-level schedules, budget estimates, technical strategies, and other key project deliverables. Assisted in defining and resolving strategic issues at the executive level.

• Served as Project Manager for one of the initiatives resulting from this strategic alignment program, and provided ongoing project oversight services to the customer Program Manager.

CA Dept of Social Services (Oct 2003 – Jun 2004)

• Provided project oversight and independent verification and validation (IV&V) services for the California Department of Social Services in support of the multi-year CMIPS and CMIPS II projects. Worked as part of a team to assess performance variances, risk factors, cost issues, schedule issues, discrepancies in project documentation, and the like.

CPS Human Resource Services (Sep 2002 – Oct 2003)

• Performed a Needs Assessment of existing business functions. Documented business requirements, developed an RFP for software development services, and assisted in vendor selection activities for a complex, mission-critical software system for a non-profit government agency based in California. This project won a “Best of California” award in December 2006.

Et Cetera Group, Inc (Jan 2001 – May 2003)

• Developed and presented “Marketing for Capitol Projects”, a class on marketing and communication concepts and techniques for Project Managers, for the California Department of Transportation (Caltrans).

• Presented other Project Management courses at multiple locations statewide for Caltrans.

Output Technology Solutions (Jul 2001 – Nov 2001)

• Provided project planning and management services for a fast-paced software development team in a C / Unix environment. The client firm provides billing and statement processing outsourcing services for customers across the United States and Canada. OTS West, based in El Dorado Hills, produced at that time over 2% of the total domestic mail in the United States.

• Completed a Project Analysis Initiative which involved reviewing existing project management and software development processes, benchmarking OTS against industry statistics and best practices with special emphasis on project management process maturity and software development practices, and presenting recommendations for improvement.

Barclays Global Investors (Feb 2000 – Jan 2001)

• Provided data center design input and project planning services for a proposed business office relocation, which included the client’s backup data center. As part of this project, identified and documented numerous opportunities for reducing the initial project budget by hundreds of thousands of dollars.

• Provided project management services in support of a complete power shutdown of a multi-tenant office building in San Francisco. The client operated a high-volume securities trading floor and data center within the building that required a structured shutdown, restart, and checkout.

Wells Fargo Bank (Nov 1996 – Jan 2000)

• Project Manager for two Year 2000 projects for a major national bank. One of these focused on “environment certification”, which involved establishing standards for documenting Y2K readiness for infrastructure components (hardware, operating systems, system software products, etc.) and then performing IV&V on these Y2K readiness deliverables to ensure that all applicable standards were met.

• Served as Program Manager for a complex data center relocation from Concord, California to another site in Northern California.

• Project Manager for a corporate-wide COBOL upgrade project involving hundreds of banking applications and thousands of developers.

UNIVERSITY OF CALIFORNIA, DAVIS / University Extension Sep 2000 – Dec 2001
Project Management Instructor
• Instructor for the Certificate Program in Project Management for the University Extension at UC Davis.

• Taught “Preparing for the PMP Exam” in Winter 2001

• Taught “Project Planning and Management Overview” in Fall 2000 and Fall 2001.

• Designed and taught “Advanced Project Management” in Summer 2001.

• Taught “Project Planning, Design and Implementation” in Winter 2000.

SCIENCE APPLICATIONS INT’L CORP (SAIC), Sacramento, CA Jun 1994 – Dec 1996
Project Manager
• Program Manager under contract with Wells Fargo Bank for a multi-site, multi-state data center consolidation project during the Wells Fargo / First Interstate Bank merger. This project included a 20,000 square foot expansion of the receiving data center, and was successfully completed under extreme schedule constraints -- the facilities upgrade and the mainframe system relocations were both completed in just over four months.

• Served as Program Manager for a data center relocation project for a large California bank.

• Served as Master Architect for a multi-system, multi-site mainframe consolidation project for the Defense Information Systems Agency (DISA). Duties included performing project planning and scheduling, as well as developing the overall methodology and approach for the project, which took place simultaneously at six sites around the United States.

• Assisted in developing bids and proposals, including project specifications, statements of work, and resource estimates.

DELTANET, INC, Rancho Cordova, CA Apr 1991 – Jun 1994
Senior Systems Programmer
• Provided general MVS-related technical services for the data center, performing product evaluations and creating project specifications. Installed, maintained, and supported MVS/ESA and JES2 software and related components, including software compilers.

• Established methodology for creating, installing, storing, and documenting modifications to the operating system and other related software.

• Installed and customized a complete replacement for the operating system environment for manufacturing client. The existing system was obsolete and undocumented.

• Evaluated and recommended a new software product for providing system security. Financial analysis revealed product would save approximately $300K over a five-year period.


FIRST SECURITY BANK, Salt Lake City, UT Jan 1977 – Apr 1991
Multiple Positions
• Served in multiple positions in Computer Operations, Technical Support, and Systems Programming, including several positions that involved supervising technical teams.

• Responsibilities included operating multiple IBM mainframe computer systems, and providing technical support primarily in the areas of storage management, capacity management, change management and disaster recovery testing.

• Served as an MVS (OS/390) Systems Programmer (including limited CICS Systems Programming support) for five years. Responsible for installing, maintaining and supporting the overall systems software environment as part of the Systems Programming team.

FORMAL EDUCATION

• MBA, Management, Golden Gate University, Sacramento, CA, 1995

• BS, Business Administration, University of Phoenix, Salt Lake City, UT, 1990

PROFESSIONAL CERTIFICATIONS

• ITIL Foundation v3 Certification, EXIN International, 2009

• Microsoft Certified Technology Specialist – Microsoft Office Project 2007, Managing Projects, Microsoft, 2008

• ITIL Foundation Certification, EXIN International, 2007

• Project Management Professional (PMP), Project Management Institute, 1995-Present

• Certified Data Processor (CDP), Institute for the Certification of Computer Professionals, 1989-1995

PUBLICATIONS

• “Large-Scale Data Center Relocations: The Road to Success”. SAIC internal white paper, 1995.

TECHNICAL SKILLS

• Extensive experience with MS-Project (all versions)

• Extensive experience installing, customizing and maintaining Microsoft Windows, Linux (Red Hat, SUSE, Mandriva and others), Apple OS X, and OS/2.

• Experience designing, implementing, and maintaining several large internal project web sites using Dreamweaver and other web development tools

• Network and system administration skills include designing, implementing, and supporting office networks based on Linux and Mac OS X servers, including firewalls, routers, DNS servers, Apache web servers, groupware servers, mail servers, and networked storage.

• Workstation software experience includes word processing, spreadsheet, technical drawing, presentation graphics, and software development applications.

• Mainframe Systems (1977 – 1995)

• Extensive experience with the MVS/XA and MVS/ESA operating systems, including operating system installation, customization and support.


• Programmed using IBM 370/assembler for twelve years. Other programming languages include CLIST, REXX, SAS, BASIC, COBOL, FORTRAN, C, and Python.

• Installed and/or used SMP/E, RACF, CA-Scheduler, FDR, and many other system software products. Familiar with mainframe telecommunications concepts.

TRAINING
Selected examples of professional training received:

• Agile Overview, Project Management Institute, Sacramento Valley Chapter, 2010

• ITIL V3 Foundation eLearning Program, The Art of Service, 2009

• Effective Negotiating, Karrass USA Ltd, 2007

• Demystifying the FSR Process, Project Management Institute (Sacramento Valley Chapter), 2007

• Aligning Projects to Strategic Initiatives, Project Management Institute, 2006

• Project Management Institute Global Congress North America, 2006

• Network and Telecom Principles for Project Managers, ESI International, 2004

• Managing Information Projects for the State of California, University of California, 2002

• Annual Seminar and Symposium, Project Management Institute, 2001

• Red Hat Linux Networking and Security Administration, Red Hat, Inc., 2000

• Red Hat Linux System Administration I and II, Red Hat, Inc., 2000

• Annual Seminar and Symposium, Project Management Institute, 1999

• Annual Seminar and Symposium, Project Management Institute, 1998

• From Stakeholder Requirements Document to Work Breakdown Structure, Project Management Institute, 1997

• Annual Seminar and Symposium, Project Management Institute, 1997

• Superior Results Through Streamlined Project Management, Project Management Institute, 1997

• Software Estimating, Science Applications International Corporation, 1996

• Software Requirements Engineering and Management Course, Science Applications International Corporation, 1995

PROFESSIONAL AFFILIATIONS

• Information Systems Audit and Control Association (ISACA), 2009 – Present

• IEEE and IEEE Computer Society, 2002 – Present

• Project Management Institute (PMI), 1995 – Present

• Software Process Improvement Network (SPIN) 2002 - 2003

• Institute of Management Consultants (IMC), 1998 – 2001

• Member, Toastmasters International, 1992 – 1993


Alexander Doré
Alexander Doré has more than fifteen years of experience in the IT industry, and has served in pivotal architectural, programming, and project development roles with some of the most prestigious organizations in technology development, such as CISCO and Boeing.

Mr. Doré’s services have ranged from architecting complex enterprise systems to developing individual components of n-tiered B2B and B2C architectures. He has supported Business Intelligence exploration from the user experience through system realization and real-time operations. He strives to employ best practices and principles to maintain accountability, and seeks optimization to meet stakeholder goals.

Mr. Doré’s architectural experience spans GUI design, wireframes, SaaS portals, UI design, SOA pattern design, MVC framework, MS .NET framework, JEE framework, J2EE Spring & Globus framework, state machines, middleware, messaging, networks, kernel engines, APIs, logical interfaces, frameworks, databases, virtual and intelligent data warehouses, metadata, and data.

RECENT CAREER ACHIEVEMENTS
• CISCO: Designed groundbreaking streaming media SOA-light middleware application

• Swiss Re: Designed new conceptual architecture to deliver catastrophic event simulations on a massive data warehouse, supporting core underwriting and executive governance

• Delta Dental: Designed proprietary architecture for hyper-speed extension applications supporting business rules data agents midstream to the EDS core application

AREAS OF EXPERTISE

• Enterprise-wide, multi-site, multi-vendor technology software/hardware projects

• Massive Data Warehouse/Mart, On-Line Storage, BI dashboard and OLAP support

• SOA software development project management, requirements, functional specifications, design and implementation

• UNIX, VMS, AS400, Microsoft Server and secure network consolidation

• Model Driven Architect specializing in UML

• IBM mainframe products and technologies

• Enterprise data center strategies and ETL architecture

• Oracle and SQL Server ER data modeling, specializing in schemas and 1NF-5NF

• Project Oversight

• Mission-critical data center security, relocations and consolidations

• Business case development for IT projects

• Developing Governance and SDLC programs

PROFESSIONAL ACCOMPLISHMENTS

Note: gaps in date continuity represent smaller projects; only best-in-breed engagements are included in this resume.

BOOZ & CO, Indianapolis, IN Dec 2009 – Jan 2010
Consultant Technical Architect
Project: SOW covering external WellPoint B2B architecture support across three areas:

Responsibilities:

• Review and strategize on the Platform Consolidation Program (a.k.a. “IT Strategy”), covering potential business/integration services including but not limited to STAR, Red Brick, WLP, NASCO, FACETS & CS 90.

• Discovery Process Architecture Support


• Develop Enterprise Architecture Governance Package Definition

• Develop Enterprise Architecture integration roadmap to expand proposed 6-TB Health Management Data Warehouse to 30-TB to support BI predictive modeling, rapid growth and enterprise-grade ETL.

• Develop the Enterprise level blueprints, Domain level blueprints and High-level solution designs

SAIC, McLean, VA Feb 2009 – Sep 2009
Consultant Technical Architect

• Performed discovery work on prospective projects. Was brought in to develop Mobil-Netsphere M2M architecture models from network design documentation, working hands-on at a technical level with complex wireless IP networks, network management systems and SOA network architectures.

CISCO SYSTEMS, San Jose, CA Dec 2007 – Dec 2008
Consultant Chief Architect/Agile Practices Project Manager
Project: Formulate new B2B, B2C, B2E architectural footprint for high-speed distribution of interactive streaming media products on a conceptual TelePresence MediaNet, replacing the current Internet with a high-speed MediaNet.

Responsibilities:

• Integrated and Implemented tool-driven Agile (Version-One) PMO practices, SCRUM & Sprint managed

• Articulated and drove next generation conceptual and system architectural vision

o Redefined JEE framework

o EDA/SOA-Agile

o Business Intelligence Brain “MILOS” to monitor Global Resources Traffic and High Availability

• Created a single visionary abstract model view for the entire project lifecycle

• Built openness and choice into the design of the prototype conceptual architecture

• Authored a blend of SOA solution architectures to attain future business goals

• Adapted business capabilities into a sustainable enterprise technology roadmap

• Mentored Agile implementation techniques to on and offshore development teams

• Developed user experience (UX) capability case models to define stakeholder product

SERVPATH, San Francisco, CA Dec 2007 – Mar 2008
Consultant Agile Architect
Project: Streamline organization, reevaluate principal roles, redesign product development process, improve on/offshore model, implement Agile methods into current project management

Responsibilities:

• Audited lead engineering leadership and authored balanced scorecard appraisal report

• Developed tailored organizational regenerative initiative program to implement Agility

Swiss Re, Zurich, Switzerland Jul 2007 – Dec 2007


Consultant Principal Solutions Architect/Agile PMO
Project: Replace manual portfolio input process, automate calculated unit (CU) calibration, integrate catastrophic event data and design integrated pricing solution

Responsibilities:

• Developed fundamentals for tool-driven Agile (Sparx EA) PMO practices, SCRUM & Sprint managed

• Developed fundamental logical and conceptual B2B, B2C, B2E architectural visionary document based on SOA and WCF API in MS .NET

• Developed groundbreaking data visualization API layer to map Business Intelligence (BI), integrating catastrophic event simulation data utilizing MOLAP & RTOLAP Data Warehouse, based on Katrina multi-event model

• Developed architecture supporting reinsurance portfolio submission auto loading

• Developed portal concepts to build accurate pricing to support underwriter decisions

• Mentored and certified project management and mathematical experts on OOAD best practices

DELTA DENTAL, San Francisco, CA Jan 2007 – Jun 2007
Consultant Sr. System Architect/SDLC Project Manager
Project: Build fault-tolerant, hyper-performing core BI extension for B2B, B2E Coding Audit, Rules Agent, and Fraud Detection applications to effect data changes, deletions and additions to core EDS dental claims; MS .NET/Citrix Portal

Responsibilities:

• Created SDLC architectural vision document, authored use cases, functional requirements, functional specifications and detail design documents

• Managed JAD sessions to develop business requirements for extending business rule functionality of EDS core application

• Managed SME sessions to develop GUI client interface to manage business objects in the functional areas of benefit audit, membership, billing, eligibility and collections

• Created multidimensional “Rosetta Stone” matrix defining claims’ coding audit business rules BI metadata

SAN MATEO COUNTY COMMUNITY COLLEGE DISTRICT, San Mateo, CA Oct 2006 – Dec 2006
Consultant Project Manager in the Planning Dept
• Managed design-build project and bridging phase of architectural sustainability programming; planned disbursement of a $468 million portion of a General Obligation Bond; LEED Certified.

SAN FRANCISCO HEALTH PLAN, San Francisco, CA Sep 2006 – Oct 2006
Consultant Architect/Project Director
• Co-authored detailed architecture and PMO proposal for claims processing application feature enhancements; co-authored step-by-step plan to execute code remediation


KAISER PERMANENTE, Oakland, CA Jan 2006 – Mar 2006
Consultant Solutions Architect
Project: Implement Medicare Part D CSS B2B, B2C, B2E MS .NET portal application in the functional areas of benefits, membership, billing, eligibility and collections

Responsibilities:

• Gathered stakeholder dependencies, expectations and calculation requirements impacts

• Created user storyboards, use cases, business and functional requirements gap analysis

MEDEANALYTICS (formerly MedeFinance), Emeryville, CA Feb 2004 – Oct 2005
Full-Time Sr. Director, On-Demand Prototyping
Project: Direct fusion of business and technical aspects of a highly adaptable and customizable B2B MS .NET web analytics platform for healthcare providers and insurance payers

High-level Responsibilities:

• Embedded Agile/SCRUM methodologies, SPiCE self-assessment CMM compliance

• Planned and implemented iterative product development roadmaps

• Scaled architecture to meet company growth and market demands

• Managed US onshore and Ukrainian offshore development

Hands-on Responsibilities:

• Developed MS .NET web application BI products for predictive analytics suite: Balanced Score Card (BSC); Quality Factoring Dashboard design in MS .NET Graphics, AVS Data Visualization API; Benchmarking; Revenue Cycle Analytics; Denial Analytics; Predictive Analytics; Eligibility Screening by Equifax Scoring; Scoring and Measuring Self-Pay Analytics; Survey Manager; Adjudication Ombudsman Manager (HIPAA Mediation); Alert Management System (IEEE).

• Developed MS OLE DB for OLAP services and MDX query analytics on SQL Server supporting MOLAP: 0.6 TB, 200+ fact tables, 10 billion rows, 5 GB+ dimension tables, 12 million rows, 24 MDX query structures, 14 tokens, 1NF & non-1NF

• Developed BI alternative technology prototypes; Cognos Svcs 7 & Cognos 8 beta; Cube Design

• Developed SaaS proprietary portlets running on the Salesforce framework

VIRTUAL PRO, INC, San Francisco, CA Nov 2003 – Jan 2004
Chief Technical Officer (CTO)
• Evaluated US representation prospects of LUXOFT as an outsourcing and offshore strategic resource-management partner; developed SEI Technology Transition Practices (TTP)

WORLD BANK/IMF, Washington DC Jul 2003 – Sep 2003
Consultant Solutions Architect / USAID Donor Initiative
• Conducted onsite financial/regulatory ISO 17799 security audit of the Government of Mozambique’s Finance Ministry, Maputo, Mozambique, East Africa

STATE OF WASHINGTON, Olympia, WA Aug 2002 – Nov 2002
Consultant UML Mentor
• Delivered and authored RUP/UML training/mentoring plan for RFQ


SIEMENS, Philadelphia, PA Jun 2001 – Apr 2002
Consultant Architect and Project Lead
• Created SOA incorporating B2B, B2C, B2E architectural vision for next-generation SOARIAN global healthcare EIM system, MS .NET Beta

BOEING MILITARY, Seattle, WA Jan 2000 – Dec 2000
Consultant Architect and Project Lead
Project: Develop Single Source Product Data (SSPD) EPDM B2B, B2C, B2E portal architecture to replace the WIRES mainframe, integrating DCAC/MRM PDM and supporting the Joint Strike Fighter (JSF) program

Responsibilities:

• Developed Tailored Business Stream (TBS) BI workflow process standardization

• Designed logical interface kernel for ENOVIA/CATIA (VPM) portal integration

BIO-RAD, Hercules, CA Apr 1999 – May 1999
Consultant Planning Architect
• Created in-depth planning documents to validate the analysis, design, evaluation, functionality, testing and implementation of a new-generation B2B integration data exploration warehouse; Cognos 5.1 Finance/PowerPlay

PACIFIC BELL WIRELESS, Pleasanton, CA Jun 1998 – Apr 1999
Consultant Architect
• Built eCommerce web application platform and SQL Server OLAP data warehouse integrating the Amdocs Telligence Billing System, Telesales and other reporting systems; Oracle9i

SPRINT, Overland Park/Kansas City, MO Feb 1998 – Apr 1998
Consultant PMO, Architecture Lead
• Upgraded BITS organization from CMM Level 1 to Level 3 compliance

BOEING/MCDONNELL DOUGLAS, MTA, CA Sep 1996 – Feb 1998
Senior Principal Engineer of Technology / Flight Test Technology Group
• Multiple projects: Built FAA-certified Flight Test Information Management System (FTIMS/2000); massive data warehouse with SONY robotics storage for 20 years of flight test information history; neural networks BI portal.

• Additional experience provided upon request

PRIOR EXPERIENCE

• Mr. Doré has over twenty years in finance, having taken the CPA and CMA, working as a Comptroller, Controller and Sr. Auditor, and configuring large enterprise accounting software packages; Solomon, MAS90

• Previously, Mr. Doré worked for twelve years as a Sr. Cost Engineer and Statistician developing BI reporting on IBM mainframes and AS400 for major construction engineering companies; Morrison-Knudsen, BP, Brown & Root.

FORMAL EDUCATION

• Attended University of Washington, Seattle, WA, Computer Science & Engineering Dept


o Boeing Chair, Unaccredited Post-Masters, 98% distinction in OOAD Using UML

o Thesis I: Migrate Design Patterns from the Smalltalk language to the Java language; source: Design Patterns: Elements of Reusable Object-Oriented Software, by Gamma, Helm, Johnson & Vlissides

• National Diploma (BS equivalent) in Economics and Business Administration, University of Surrey, NESCOT/UNIS, Ewell, Surrey, United Kingdom

o Major in Finance, Minor in Computerization

PROFESSIONAL CERTIFICATIONS

• HIPAA Certified: Siemens Health Services, Kaiser Permanente and Delta Dental

• Oracle8 DBA, Oracle Corporation, 1998

• Malcolm Baldrige NQA BSC, ISO 9000 Program Collaborator, 1998

• Software Engineering Institute (SEI) CMM Level 3 Auditor, 1997

• Total Quality Management (TQM) Six Sigma Programmer, 1997

• Oracle7 DBA, Oracle Corporation, 1996

• Hoshin Kanri Facilitation Workshops (TQM) (1994) Facilitator, 1995

• International Standards Organization (ISO) 9000, 9001-3, 1996 -1998

• Zenger Miller Certificate in Frontline Leadership Core, 1998

• National Association of Accountants: CMA and CPA, 1978 -1979

AWARDS

• Boeing Certificate of Achievement 2000 and Letter of Recognition for Outstanding Service

PUBLICATIONS

• Co-authored hfm (HFMA) September 2005 issue feature article about claim denials

• Authored groundbreaking Siemens 2002 white paper, Achieving Component Service Provider (CSP) with Component Based Development (CBD), now relevant to contemporary SaaS and cloud computing.

CORE TECHNICAL SKILLS

• Experience designing, implementing, and evangelizing LEED-certifiable Sustainable Development (SD) technology projects, implementing The Natural Step Framework (TNS), improving transparency, governance and quality.

• Experience designing, configuring, and implementing massive data centers, telemetry satellite data, NS DGPS data, batch processing and ETL processing, including mission critical NS USF flight-test data and USN DARPA new naval technology sea-trials data.

• Experience designing, implementing, Business Intelligence BI projects in healthcare, aerospace flight test and aircraft manufacturing OSS serviced by massive data warehouses, and specialized data marts.

• Experience designing, implementing, and maintaining projects that employ standard waterfall gated SDLC, as well as tailoring the Unified Process for RUP and AUP for individual enterprise projects, and other forms of Agile, Scrum and XP methodologies for smaller organizations.

• Experience designing, implementing, and maintaining projects that employ multiple SDLC profiles including waterfall, iterative, spiral for onshore and offshore development models; including participation in the creation of the DoD SDLC from the DOJ abstracts, DoD Commonality Initiative, DoD Re-Use Initiative and DoD COTS initiative.

• Experience designing, configuring, and implementing extended-SOA and SOA-Light, SOA Governance, SOA-Test-Harness, SOA Stack (considered a SOA design expert at CISCO)

• Experience designing and implementing middleware framework integration including Loral-JEE/JOS, Sun J2EE, Spring J2EE Spring Enterprise Bus and Message, Globus Resource J2EE in Eclipse development tools, and MS .NET

• Experience designing, implementing Web Services including CORBA ORBs, SOAP/XML, WSDL Client/Server, IBM WebSphere, and BEA WebLogic

• Experience designing, configuring, and implementing RDBMS Oracle 6.1, Oracle7, Oracle8, Oracle9i, Oracle 10g, Oracle 11g, Forms, Data Model, Data Warehouse, MS SQL Server, DB2, Informix and other ODBMS and RTRDBMS specialized repositories.

• Experience designing, configuring, and implementing BI in MS .NET SQL server OLAP/MDX SQL BIS, Cognos and Business Objects

• Experience designing, configuring, and implementing compliance frameworks for SOX 401, and HIPAA

HANDS-ON TECHNICAL SKILLS

• Over 15 years of experience analyzing, designing, configuring, and implementing static and dynamic software architecture, frameworks, MVC, user experience, GUI, wireframes, systems and hardware in CASE tools.

• Over 15 years using the Rational Rose Enterprise suite since its beta release; 9-plus years using RequisitePro, ClearCase, and more recently ClearQuest; 8 years to present using Sparx EA modeling tools.

• Other tools used are TogetherJ, Erwin Data Modeling, Gane & Sarson ER modeling tool, MAGIC RAD developer, Oracle Developer/2000, SQL Server diagrams (schemas), Visual Studio VB, Eclipse IDE for J2EE, and other plug-ins.

• 15-plus years using MS Project, Visio and PowerPoint; expert in building smart Excel spreadsheets.

TRAINING

• International Certified CASE UML RAD EA Instructor, UML 1.0 to 2.1.2 Modeler

• Certified Hoshin Kanri (TQMS) Facilitator

PROFESSIONAL AFFILIATIONS

• Object Management Group (OMG): represented McDonnell Douglas Military for CORBA, MDA and UML integration

• International Standards Organization (ISO): represented McDonnell Douglas Military and Boeing in ISO 9000, 9001 standards and policies


• International Standards Organization (ISO): represented World Bank/IMF in ISO 17799 field implementation

• SEI - represented McDonnell Douglas Military, Boeing, Sprint, Siemens as CMM auditor

• Leadership in Energy and Environmental Design (LEED)


Suresh Chellappa, PMP

Mr. Suresh Chellappa is a senior IT consultant with over eleven years of experience in diverse industries ranging from financial, healthcare, government (State of California) and manufacturing to IT and telecommunications. This includes over seven years of Business Intelligence (BI) Data Architect experience coupled with hands-on BI leadership/PM experience handling multiple projects, focused go-to-market BI/DW projects, and planning and executing both deliverable-based and time-and-materials (T&M) based projects.

Mr. Chellappa has extensive experience with all phases of Data Warehouse development including prototyping, data modeling, analysis, design, development, testing, documentation, and training end users. He has implemented several strategic and tactical BI initiatives.

AREAS OF EXPERTISE

• Business Intelligence data architecture

• Data Warehouse and Data Mart development

• Software development project management

• Requirements elicitation and documentation

• Enterprise architecture

• Data modeling and migration

• Information management strategies

• Business planning and needs analysis

PROFESSIONAL ACCOMPLISHMENTS

KAISER PERMANENTE, CA Jul 2008 – Present
Project Name: CA Start-Up Services/Clinical Data Warehouse

Role: BI/DW Solutions Consultant/Hands-on BI Project Manager

Lead BI Data Architect Responsibilities:

• Created and advised client on an assortment of data modeling techniques, including star schemas, Kimball methodologies and ETL strategies.

• Provided development expertise in bridging domain requirements to developer/QA-ready functional specifications

• Ensured traceability of requirements throughout the functional specification.

CA Office of Statewide Health Planning and Development (OSHPD) / CA Dept of Transportation (Caltrans) Dec 2007 – Jun 2008
Project Name: Utilization Data Mart/Financial Data Mart

Role: Lead BI Data Architect/Hands-on BI Project Manager

Lead BI Data Architect
• Responsible for gathering functional specifications from SMEs (Subject Matter Experts) and from other sources, in addition to ensuring the accuracy, completeness and simplification of functional specifications.

• Presented functional specifications to internal stakeholders for review and acceptance to ensure compliance with the requirements laid out in the RFO.

• Revised functional specifications as necessary throughout the development lifecycle.

• Led the Business Objects transition initiative for Caltrans E-FIS Accounting project by providing a strategic and a tactical approach to conversion of various legacy reports.


• Provided guidance and oversaw the BO Configuration, Deployment plan, BO development environments, Upgrades, Report bursting, and distribution of the reports.

• Actively involved in ETL processes, data modeling, data quality measures and delivering reports.

• Developed logical and physical data models for a complex, multi-application environment to integrate with external data sources.

• Created & advised client on an assortment of data modeling techniques, including star schemas, Kimball methodologies and ETL strategies.

• Provided development expertise in bridging domain requirements to developer/QA-ready functional specifications.

• Created visual models, charts, flow diagrams and other specification components that facilitate simplification and abstraction.

• Created initial business object (entity) diagrams that can be used to initiate database design.

• Created user interface mockups and associated behavioral rules to define navigation, validation, data relationships and trigger events.

• Defined business rules that are complete and unambiguous including supporting formulas, algorithms, data queries, process steps and decision points.

• Identified exception conditions and alternate use case paths not identified in requirements documentation.

• Provided report layouts, filters, groupings, subtotals and data mapping rules.

• Defined impact of enhancement/project to other integrated applications.

• Defined glossary of significant terms as a component part of the functional specification.

• Ensured traceability of requirements throughout the functional specification.

Hands-on BI Project Manager
• Responsible for providing project status updates to the Executive Steering Committee, state CIO office and other senior executive management staff on a periodic basis.

• Facilitated weekly status meetings and monthly Risk & Issues meetings; conducted lessons-learned sessions and shared the results with management.

• Development team management

• Coordination of test plan execution with both consultants and State staff

• Responsible for managing project scope, budget, deadlines, schedules, risks, and change requests.

• Authored technical and functional requirements project deliverables complying with industry-standard formats

• Responsible for facilitating lessons-learned and PIER (Post Implementation Evaluation Report) documents

Environment: Erwin, Business Objects, Informatica 8.1.1, Oracle 10g, MS-SQL Server, MS-reporting services, Pro Clarity.
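As an illustration of the star-schema/Kimball-style modeling referenced in the responsibilities above, the sketch below shows the basic pattern in plain Python rather than any specific tool: fact rows carry surrogate keys into dimension tables, and a report is a star join plus a grouped aggregate. The facility and date dimensions, the visit counts, and all names are hypothetical examples, not data from this engagement.

```python
from collections import defaultdict

# Hypothetical dimension tables, keyed by surrogate key
dim_facility = {1: {"facility": "Folsom"}, 2: {"facility": "Sacramento"}}
dim_date = {20100601: {"month": "2010-06"}, 20100502: {"month": "2010-05"}}

# Hypothetical fact table: one row per encounter, foreign keys into dimensions
fact_visits = [
    {"date_key": 20100601, "facility_key": 1, "visits": 3},
    {"date_key": 20100601, "facility_key": 2, "visits": 5},
    {"date_key": 20100502, "facility_key": 1, "visits": 2},
]

def visits_by(dim, key_col, label):
    """Star join: resolve each fact row's key through a dimension, then group."""
    totals = defaultdict(int)
    for row in fact_visits:
        totals[dim[row[key_col]][label]] += row["visits"]
    return dict(totals)

print(visits_by(dim_facility, "facility_key", "facility"))
print(visits_by(dim_date, "date_key", "month"))
```

Any conformed dimension (date, facility, provider) can be swapped in as the grouping axis, which is what makes the star layout convenient for the kinds of utilization and financial reports described above.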


Barclays Global Investors, San Francisco, CA Dec 2006 – Dec 2007
Role: Lead BI Data Architect/Hands-on BI Project Manager

Lead BI Data Architect
• Translated business needs into data requirements

• Analyzed, cleansed and enhanced source system data for population in the ODS, data marts, and the enterprise data warehouse.

• Worked with the business users and systems analysts to identify and understand source systems.

• Designed data marts and supporting tables for ETL process in conjunction with Business Intelligence Developers

• Migrated several Access databases to SQL Server. Actively involved the client’s team, comprising users and the tech team, to acclimate them to the migration process.

• Designed Data Marts in conjunction with Business Intelligence Developers. Worked with the DBA team to optimize system performance.

• Documented and presented business requirements for a variety of audiences including client executives, functional managers, end users and developers.

• Participated in creating end state functional business intelligence vision and implementation plan.

• Participated in designing and developing the architecture for all business intelligence solution components.

• Contributed to the design of business intelligence application solutions including database design, ETL, report design and analytics integration

• Performed initial analysis of the data and helped the development team understand the intricacies of the incoming data; developed Informatica high-level design, and developed mappings and workflows for the Thomson Reuters Xpress feed project and the Worldscope project

• Developed financial derivatives reports using Brio, Cognos

• Created Type 1 and Type 2 Informatica mappings.

Project Management
• Fully responsible for defining the project; planning; interviewing and hiring developers; managing 3 developers (consultants and hired staff); and training, deployment and support of the developed software applications

• Conducted daily stand-up meetings with the development team to determine the velocity of the team, adding/removing stories, tracking iterations, burn-down charts, etc.

• Ensured that project objectives are accomplished within the prescribed time frame, methodology and funding parameters, utilizing project development life cycle principles.

• Built consensus; coordinated planning and management to ensure project activities are executed in accordance with established requirements, defined scope, budget and schedule.


• Accountable for compliance with PMO structure and methodology: project information/artifact management, monitoring of the project budget, project contracts, design and development, resource management, project planning and tracking, estimating, risk analysis, sponsor relations, status communication, issue identification, tracking, and resolution, and served as the “go to” contact for all project activities.

Environment: Erwin, MS-Project, Informatica Power Center 7.1.4, Windows NT, Solaris, Informatica version control, Oracle8i/9i/10g, Sybase IQ, Sybase ASE, SQL Server, Rapid SQL.

Hewlett-Packard, CA Apr 2004 – Dec 2006
Project: Sales & Marketing/Financial Data Warehouse

Role: BI Data Architect/Informatica lead/DW Project Manager

BI Data Architect
• Planned, designed, developed and maintained the overall data strategy, data architecture, data models, principles, and standards for the PNB data warehouse.

• Ensured new features and subject areas are modeled to integrate with existing structures and provide a consistent view.

• Developed and maintained documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences.

Informatica Administration / Tech Lead
• Responsible for administration and maintenance of the Informatica 7.1.4 environment, deployed in a three-server (HP-UX) Dev, Test/DRP and Production configuration.

• Installed SQL Server ODBC drivers on HP-UX 11i to extract SQL Server data to the Oracle 9i data warehouse. Installed & configured Informatica PowerConnect for Web Services on HP-UX.

• Installed and configured Informatica 7.1 and successfully migrated mappings/workflows from 6.2 to 7.1. Responsible for creating and managing users and user group privileges in Informatica.

• Designed and implemented documentation templates, Informatica version control and standards for using it, Informatica upgrade checklists, ETL naming standards and conventions, Informatica security, etc.

• Performed ETL Change Management code migrations for developers as part of an established internal CM process

• Participated in Informatica peer code-review and provided suggestions to improve performance, overall readability of the mappings, reducing development cycle by leveraging existing code, thereby increasing the response to time-to-market.

• Responsible for assisting and mentoring ETL development staff.

• Implemented a Change Request Process to address issues that were not part of a Project or a Service Request

• Created SCD (Slowly Changing Dimension) Informatica mappings – Type 1, Type 2 and Type 3. Modified several existing Informatica mappings, workflows to improve performance and throughput.


• Successfully designed and implemented the Product Registration project from start to finish (https://my.procurve.com). Created logical & physical data model (star schema), Oracle DB triggers, views for reporting needs, Informatica mappings (Type 1, Type 2, using Web Service and Transaction Control transforms), Informatica workflows and Maestro schedules to extract data from staging tables and load it onto the ODS layer.

• Responsible for gathering requirements from Sales and Marketing personnel (end users)/Business Analysts to design and develop logical and physical models/Informatica mappings, estimating development efforts and translating those requirements into Key Performance Indicators (KPIs).

Project Management
• Created Project Management plan, MS-Project schedule, WBS, project presentations, PSS (Project Scope Statement), C/B analysis, design document, quality assurance reports, testing reports, risk management reports, lessons learned, maintenance document, support manuals, and project closing reports

• Responsible for M&M (Margin Management) application project management & development, a financial component of the Data Warehouse that deals with Gross Margins, Quota, Orders, Shipments & Revenue data.

• Managed Sales Analytics project from requirements gathering through analysis, design, development, testing and deployment. Developed project plans, deliverables, detailed work plans, schedules, project estimates, resource plans and status reports. Collaborated with cross-functional teams and business unit leadership in developing project objectives, change requests and timelines.

• Proactively managed scope to ensure that only what was agreed upon was delivered to the business, unless changes were approved through scope management. Also proactively informed business and IT managers of project risks and provided mitigation strategies.

• Responsible for assigning tasks to local and EMEA ETL developers on a periodic basis and to the Cap Gemini consulting team for PLM agile project and reviewed their Informatica deliverables.

Environment: Informatica Power Center 7.1.3/7.1.4, Windows NT, HP-UX 11i , Informatica version control, Oracle8i/9i, XML, XML Spy, WSDL, Informatica Web service, Business Objects, Siebel Analytics 7.7, Erwin, SQL Server, Oracle Pl/Sql, TOAD, Maestro Scheduler, Brio aka Hyperion 8.5.
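The SCD (Slowly Changing Dimension) mappings mentioned above were built in Informatica; as a tool-neutral illustration of what Type 1 versus Type 2 handling means, here is a minimal Python sketch over a hypothetical customer dimension. The function, records and attribute names are invented for illustration and are not taken from the actual project.

```python
from datetime import date

def apply_scd(dim_rows, incoming, scd_type, today):
    """Apply one incoming source record to an in-memory dimension table.

    Type 1 overwrites the changed attribute in place (history is lost).
    Type 2 expires the current row and inserts a new, current version.
    """
    current = next((r for r in dim_rows
                    if r["natural_key"] == incoming["natural_key"]
                    and r["is_current"]), None)
    if current is None:  # new member: insert as the current version
        dim_rows.append({**incoming, "valid_from": today,
                         "valid_to": None, "is_current": True})
    elif current["city"] != incoming["city"]:  # change detected
        if scd_type == 1:
            current["city"] = incoming["city"]  # overwrite, no history kept
        else:  # Type 2: expire the old version, insert the new one
            current["valid_to"] = today
            current["is_current"] = False
            dim_rows.append({**incoming, "valid_from": today,
                             "valid_to": None, "is_current": True})
    return dim_rows

dim = [{"natural_key": "C1", "city": "Sacramento",
        "valid_from": date(2009, 1, 1), "valid_to": None, "is_current": True}]
apply_scd(dim, {"natural_key": "C1", "city": "Oakland"}, 2, date(2010, 6, 7))
# dim now holds the expired Sacramento row plus a current Oakland row
```

A Type 3 mapping, also listed above, would instead keep a single row with "current" and "previous" attribute columns, trading full history for a fixed table size.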

Agilent Technologies, San Jose, CA Aug 2003 – Mar 2004
Project: CSS Siebel e-business Data Warehouse (a Deloitte Consulting project)


Role: DW Architect/Senior ETL developer

DW Architect
• Defining data warehouse architecture standards, best practices and design methodology. Reviewing current Data Warehouses / Data Marts and making recommendations specific to ETL, database, and data delivery.

• Providing data warehouse expertise and consultancy to the agency in areas of data warehouse design, dimensional model design and optimization.


• Designing the architecture for all data warehousing components (e.g. tool integration strategy; source system data ETL strategy, data staging, movement and aggregation; information and analytics delivery; and data quality strategy).

• Provided oversight of tool evaluation and selection

Senior ETL developer
• Extensively involved in improving throughput of Informatica mappings/sessions/workflows, eliminating performance bottlenecks.

• Created new data model/star schema and developed new Informatica mappings to implement the design.

• Extensively involved in interacting with project coordinators, Business Analysts, and end users to gather requirements.

• Created/Installed new Informatica Repositories/ Repository Server. Configured Informatica Server 6.1 for development and test environments. Also, responsible for taking back-ups of the repository and restoring it to dev/test repositories.

• Modified Siebel vanilla ETL mappings including configuration of SDEs/SILs.

• Responsible for migrating Informatica mappings from Development to Production by coordinating with the Off-shore/Global Development Team.

• Created OLAP reports and dashboards using Siebel Analytics 7.5 (Siebel Answers, Siebel Dashboard, Catalog Manager, etc.). Created logical tables using the Physical, Business, and Presentation layers.

• Responsible for moving data to the Siebel ETL Engine by extracting data from Siebel and non-Siebel OLTP databases using Informatica Power Mart 6.1 OEM for Siebel, then loading the Siebel e-business data warehouse to create meaningful data for Siebel Analytics (OLAP).

• Created new Informatica mappings and modified existing mappings to populate the Star Schema. Created new parameter files and variables.

• Used SysAdmiral scheduling software to kick off sessions/workflows. Created batch files to run the SQL for populating staging tables.

• Used various transformations in the mappings, viz. Filter, Stored Procedure, Sequence Generator, Expression, Lookup (connected and unconnected), and Aggregator, thereby eliminating redundant and inconsistent dependencies.

• Involved in mentoring developers, assigning work to developers and reviewing team members’ work.

Environment: IBM UDB DB2 7.5, Business Objects, Oracle8i/9i, SQL Server, Oracle PL/SQL, TOAD, SysAdmiral, Teradata, Informatica Power Center/Power Mart 6.1 OEM for Siebel, Siebel Analytics 7.5, Windows 2000/NT.
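The lookup-and-aggregate transformation pattern described above can be sketched conceptually. The Python below is illustrative only (the actual work used Informatica transformations, not hand-written code); all table, column, and key names are hypothetical.

```python
# Conceptual sketch of a Lookup + Aggregator flow feeding a star-schema fact
# table. Hypothetical names; not Informatica itself.
from collections import defaultdict

def lookup_dimension(dim_rows, natural_key):
    """Build a natural-key -> surrogate-key map, as a Lookup transformation would."""
    return {row[natural_key]: row["surrogate_key"] for row in dim_rows}

def load_fact(source_rows, product_dim):
    """Resolve surrogate keys, then aggregate amounts per product (Aggregator)."""
    lookup = lookup_dimension(product_dim, "product_code")
    totals = defaultdict(float)
    for row in source_rows:
        sk = lookup.get(row["product_code"])  # unresolved keys could be routed to an error flow
        if sk is not None:
            totals[sk] += row["amount"]
    return [{"product_key": sk, "total_amount": amt} for sk, amt in totals.items()]

dim = [{"product_code": "A1", "surrogate_key": 101}]
src = [{"product_code": "A1", "amount": 40.0}, {"product_code": "A1", "amount": 2.5}]
print(load_fact(src, dim))  # one aggregated fact row per resolved product
```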

CIBC Oppenheimer, NYC DW Data Architect / Jr. Project Manager

Gary Job Corps (administered by the U.S. Department of Labor), TX Senior ETL Developer/Data Architect Project: Student Pay, Allotment and Management Information System (SPAMIS)


Vision Software, India Informix 4gl Developer/Junior Project Manager

FORMAL EDUCATION • Master of Business Administration (MBA), ITM, India

• Bachelor's degree in Business, University of Madras, India

Both of the above are prestigious and well-known institutions.

PROFESSIONAL CERTIFICATIONS • Project Management Professional (PMP), Project Management Institute

• Informatica certified

TECHNICAL SKILLS

ETL Informatica Power Center 8.1.1/7.1.2/6.2/6.1/5.x, Informatica Power Mart 6.2/6.1/5.1, Informatica Power Connect, and Teradata

Scripting Languages

SQL, PL/SQL, HTML, DHTML, ASP, VB script, Front Page 2000, JavaScript

Web Servers IIS, Personal Web Server

Languages ESQL/C, C, Perl, C++, Unix shell scripts, Informix-4GL

Database Server Oracle 10g/ 9i/8i/8.0/7.x, SQL Server 2000/7.0/6.5, Sybase IQ, DB2, MS Access 7.0, Informix Dynamic Server 9.x/7.x

Administrative / Design Tools

Microsoft Office Suite of applications, Visio, Project, Erwin data modeling tool

Operating System Windows 2000, Windows NT, Windows 98/95, Novell NetWare, Sun Solaris, AIX 4.0, HP-UX, SCO Unix 3.2, MS-DOS

N-Tier Jaguar CTS

Version Control SCCS, PVCS, RCS, Informatica Version Control

GUI Oracle Developer 2000, Visual Basic, PowerBuilder 7.x

Hardware IBM RS6000, HP 9000

Development Tools DB2 Command Center, TOAD, Embarcadero Rapid SQL 7.1, ISQL, DBACCESS, RDS, ACE Report Writer, Visual Inter Dev 6.0

OLAP Siebel Analytics 7.5, Business Objects, Brio/Hyperion, Cognos

CRM Siebel OLTP, Siebel Analytics (OBIEE), and Upshot

Scheduling Software SysAdmiral, Maestro


Nipesh Shah Seasoned software development professional with more than nine years of programming and enterprise-level application development experience in the public and private sectors, and five years of experience designing and developing applications on the Microsoft .NET platform.

Mr. Shah has extensive involvement in all stages of the Software Development Life Cycle (SDLC), including requirements gathering, logical and physical database modeling, application architecture design, development, implementation, and production support. Additionally, his experience includes Project Execution, User Acceptance Testing, End User Training, Quality Assurance and Quality Control, Audits, and Documentation.

Mr. Shah is experienced in leading small teams, and is a proven problem-solving leader with an analytical bent of mind. Due in part to his years of experience with an exceptionally wide variety of software development concepts, methodologies, and tools, he is able to master new technologies quickly.

AREAS OF EXPERTISE

• Enterprise-level application development

• .NET technologies, including C#

• Web application design and development, including GUI design

• Process documentation

• N-tier application design and development

• Object-oriented programming

• Microsoft SQL Server, including SQL Server Reporting Services

• Database design and tuning

PROFESSIONAL ACCOMPLISHMENTS Employment Development Dept (EDD), Sacramento, CA Dec 2009 to Mar 2010 Application Developer/Analyst Project Description: e-Apply Project

e-Apply provides an online interface for users who want to apply for unemployment benefits. The application is designed so that state analysts can intervene at the middle tier before the completed application data is sent to the mainframe database for case processing.

Responsibilities:

• Set up the pre-production web server for the e-Apply Central application

• Applied client-side show/hide functionality using jQuery and wrote custom validators for controls in C#/ASP.NET. Used Web Forms and various controls for a better UI look and feel.

• Consumed Web Services for Authentication at the middle-tier for lookup and authentication

• Used performance optimization methods to improve SQL Server database efficiency

• Used TFS to assign tasks/bugs for efficient coordination between the DEV and QA teams. Added Sprint Log functionality to TFS for increased developer productivity.

Environment: Windows XP, Web Forms, .NET 3.5, ASP.NET, C#.NET, SQL Server 2005, jQuery, AJAX Toolkit, XML, Web services, Team Foundation Server

HP-EDS, Rancho Cordova, CA May 2007 – Oct 2009 Senior Programmer Analyst Denti-CAL Project (May 2007 – Oct 2009)

Project Description: Fair Hearing Case Management Information System (FH)


The FH System was a technology refresh of a legacy application written in VB 6, MS Access, and SQL Server 2000. The application utilizes an N-tier architecture consisting of a UI (User Interface) tier, Business tier, and Data Access tier. Highlights of the system include case document management, automated email notification generation, workflow routing, a fine-grained role-based activity and security system, and configurable menus with interface-driven configurability.

Responsibilities

• Identified High Level Requirements and Detailed Business Requirements and prepared high level flow diagrams.

• Used ERwin Data Modeler to create the logical ERD and convert it to a physical database model. Created Use Case models and diagrams in UML.

• Prepared detailed SFD/TSD/TST documents for submission to MDSB (state client).

• Prepared detailed test case scenarios and cases.

• Devised mapping and business rules for data migration from the old application residing on SQL Server 2000 to the new application on SQL Server 2005.

• Wrote SSIS packages for ETL processing of data from tables in old schema to tables in new schema.

• Designed and Developed Control Flow, Data Flow and Error modules for SSIS packages.

• Set up pre-production web and database servers for the application.

• Created ancestor UI elements for State Hearing screens, modal and search screens.

• Designed the Data Access tier, which consists of C# classes that encapsulate all calls to the database back end.

• Prepared Report layouts and mock screens for front-end for presentation to client during TSD approval phase.

• Wrote stored procedures, triggers, views, cursors for the data access layer.

• Utilized Web Services for Authentication with design and development of SOAP interface.

• The application's design base is .NET 3.0.

Environment: Windows XP/Server 2003, Visual Studio .NET 2008/2005, ASP.NET 2.0, C#.NET, SQL Server 2000/2005, Crystal Reports XI, XML, MS Team System 2008, Telerik UI Controls, ERWIN r7, Enterprise Architect

DELTA DENTAL OF CALIFORNIA, Rancho Cordova, CA May 2005 – Apr 2007 Programmer Analyst Project Description: SURS PC-Audit Reports and Data Download Enhancement (PC Audit)

PC Audit was designed and developed to track narratives against the Claim Detail Report downloaded from the mainframe system. Modified the application and added new functionality, such as direct download of data from the SQL Server as a CSV file or copying to a local Access database table. Designed new reports, including a Crosstab summary report, and wrote stored procedures with cursors and complex business logic for that purpose.

Responsibilities


• Conducted Joint Application Development (JAD) sessions with SMEs and lead users for requirements gathering and documentation.

• Modified the application's design base in .NET 2.0.

• Used Crystal Reports XI for custom reporting development.

• Designed, developed and deployed new module for data download.

Environment: Windows XP, Visual Studio .NET 2005, ASP.NET 2.0, VB.NET, Java Script, AJAX, SQL Server 2000, Crystal Reports XI, COM Wrappers in .Net, COM+, XML

HP-EDS, Rancho Cordova, CA Feb 2005 – Apr 2005 Information Analyst CA Dept of Health Care Services (Feb 2005 – Apr 2005)

Project Description: Medi-CAL Beneficiary Identification Card Extension

FAME is a web-based beneficiary logging and help system developed for users of the Medi-Cal program. Features included uploading files, checking the status of a beneficiary application, recording miscellaneous details, and allowing real-time claim submission.

Responsibilities

• Requirement Analysis and interaction with end users for the process and project documentation.

• Designed and modified business and data access layer using C#.

• UI was developed using ASP.NET with C#.

• Changed the data access layer modules as required.

Environment: Windows 2003, Visual Studio .NET 2005, ASP.NET 2.0, C# .NET, Oracle 8, VSS

DESERT RESEARCH INSTITUTE, Reno, NV Jul 2004 - Dec 2004 Software Developer Project Title: ENSEQ (Biological Sequence Storage Database)

ENSEQ is a web-based application for adding data to master tables, with a report module and user configuration management. The application helped researchers store and track experimental data within their labs. Reports are generated based on different filtering criteria.

Responsibilities:

Involved in the analysis, design, and development of the application. Used XML files as input data for the device application and generated XML files to be synchronized with the master database. Developed a User Module for registering new users and assigning rights. Implemented user-group-based security for the application with forms-based authentication.

Environment: Visual Studio .NET 2005, ASP.NET(C# .NET), ADO.NET, SQL Server 2000, Crystal Reports, VSS, CSS, Windows 2003
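The XML round-trip described for ENSEQ (device-generated XML in, synchronization XML out) can be illustrated with a minimal sketch. This is not the ENSEQ code itself; the element and attribute names below are hypothetical.

```python
# Hedged sketch of an XML round-trip: parse records from an input document,
# then emit an XML payload for synchronization with a master database.
import xml.etree.ElementTree as ET

def parse_records(xml_text):
    """Extract (id, value) pairs from a device-generated XML document."""
    root = ET.fromstring(xml_text)
    return [(rec.get("id"), rec.findtext("value")) for rec in root.findall("record")]

def build_sync_document(records):
    """Generate the XML payload the master database would ingest."""
    root = ET.Element("sync")
    for rec_id, value in records:
        item = ET.SubElement(root, "record", id=rec_id)
        ET.SubElement(item, "value").text = value
    return ET.tostring(root, encoding="unicode")

incoming = '<records><record id="7"><value>AGCT</value></record></records>'
print(build_sync_document(parse_records(incoming)))
# prints <sync><record id="7"><value>AGCT</value></record></sync>
```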

POLYPHASER CORPORATION, Minden, NV Feb 2004 – Jun 2004 Lead Developer Project Description: Knowledge Management System


The Knowledge Management System is a SharePoint portal developed for use by employees of the company. Resources were hired from one of Microsoft's partners for training and implementation of the project.

Responsibilities:

• Performed project planning, scope definition, and project reviews.

• Monitored the development of the required deliverables.

• Risk Management, Project Documentation, Configuration Management, Quality Assurance.

• Used ADO.NET for data interaction between the DAL and the Database using Data Reader, Datasets and Data Adapters.

Environment: Windows 2003, SharePoint Server 2003, Microsoft Report Services, SQL Server, and VSS

RDP Labs, Michigan State University, MI May 2002, Jun-Aug 2003 Programmer Project Description: Lab Workflow Management System (LIMS)

LIMS is a customized web-based repository for upstream PCR expression data. Conceptualized, designed, and implemented changes to the framework of a proprietary multi-tier LIMS. Changed the metadata architecture of an existing application module. Improved the software's usability, allowing more scientists to use the LIMS.

Responsibilities:

• Designed and implemented object-oriented components, interfaces and classes.

• Defined the Software Component Model that involved Components identification.

• Developed DB classes to store and retrieve data from database.

• Performed system analysis and fine-tuned queries for performance improvement of web-based database.

• Created use-cases, sequence diagrams and class diagrams.

Environment: WebSphere 4.0, JBoss, WSAD 5.0, DB2, UML, JSP, J2EE, JDBC, Servlets, JavaBeans, AIX, PG-SQL

DEPT OF MINE ENGINEERING at U.N.R., Reno, NV Sep 2000- May 2003 Software Developer Project Description: DOE Multi-Flux Modeling

The project was part of DOE/LLNL’s Yucca Mountain Rock Mechanics Simulation Program.

Responsibilities:

• Identified, devised, and developed a new and robust geometrical algorithm with 70% greater efficiency. The work was published in two professional papers at an international conference.

• Debugged systems software written in C++.

• Developed new software modules as a member of a team.

• Enforced quality assurance tasks as per the Dept. of Energy QA policies.


• Analyzed simulation runs, validated results in MATLAB and documented test cases.

• Automated graph plots of the results to ensure easy interpretation.

Environment: VC++, C++, C, Sun Solaris, MATLAB, Linux, MS Office, Windows 2000

COMPUTER CENTRE, M. S. University of Baroda, India Aug 1999 – Jun 2000 Software Developer Project Description: Examination Planning and Scheduling System

The Planning and Scheduling System uses a 3-tier architecture built in Visual Basic 6 with Oracle 8i as the database server. It tracked student enrollment and the workflow of the examination and results process. Highly efficient scheduling algorithms written in PL/SQL generated the work schedule for the resources. In addition to scheduling, there were other modules such as Student Enrollment.
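The resource-scheduling idea described above can be illustrated with a minimal greedy sketch. The actual implementation was in PL/SQL; the Python below is a conceptual stand-in, and all names are hypothetical.

```python
# Minimal greedy scheduling sketch: assign each exam session to the
# least-loaded proctor. Hypothetical names; not the original PL/SQL.
def schedule(sessions, proctors):
    """Return a mapping of proctor -> list of assigned sessions."""
    load = {p: [] for p in proctors}
    for session in sessions:
        # pick the proctor with the fewest sessions so far (ties go to the first)
        least_loaded = min(load, key=lambda p: len(load[p]))
        load[least_loaded].append(session)
    return load

plan = schedule(["Math", "Physics", "History"], ["Rao", "Iyer"])
print(plan)  # prints {'Rao': ['Math', 'History'], 'Iyer': ['Physics']}
```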

Responsibilities:

• Involved in Functional and Technical discussions.

• Worked as Module Lead for one of the modules.

• Developed SQL procedures for Scheduling.

• Developed Data Access Layer for Operators Module.

• Documented the modules, developed front-end screens.

• Developed views for Generating Reports.

• Unit tested developed modules and helped in integration testing. Used Microsoft VSS for source management.

Environment: Windows 2000, Visual Basic 6, ASP, COM, ActiveX, Oracle 8i, VSS

BANK OF BARODA, Baroda, India Jan 1999 – Jul 1999 Project Software Engineer Project Description: Leave Management System

A client-server architecture was used for automatic processing of personnel leave applications, integrated with the payroll system. Used Visual Basic 5.0 on Windows NT as the front end and an Oracle database as the back end. Developed features such as configurable form and report layouts using custom-developed ActiveX controls.

Responsibilities:

• System, database, modules design and development.

• Socket programming using a COM control. System integration, integration testing, and application rollout at the site.

• Project planning and tracking, user training

Environment: Windows NT, Visual Basic 6.0, Oracle 7.3, Crystal Reports, COM, COM+ and VSS

FORMAL EDUCATION • M.S. Computer Science from University of Nevada at Reno


TECHNICAL SKILLS Microsoft Technologies C#, VB.NET, ASP.NET, Visual Studio 6.0, Business Intelligence Development Studio

Scripting and Markups ASP, JavaScript, VBScript, HTML, DHTML, XHTML, CSS, XML, XSLT, XPATH, AJAX, JQuery, Ajax Control Toolkit, SSIS

Databases / Data Access Tools

Microsoft SQL Server 2000/2005, Oracle 8.1/10G, MS Access, LINQ, ADO.NET, ADO

Web Technologies ASP.NET, ADO.NET, XML Web Services, .NET Remoting Services, SOAP, WCF Services, SharePoint Services

Reporting Tools SQL Server Reporting Services, Crystal Reports, BO Data Reports

Protocols HTTP, WSDL, SOAP, XML, TCP/IP

Application Servers IIS, iPLANET

Operating System Windows 9X/NT/2000/XP, Windows Server 2000/2003, DOS, Unix

Source Control Visual Source Safe, Team Foundation Server, Dimensions

Project Management MS Project, EPM, Visio

Design / Modeling Tools Microsoft Visio, UML, ERWIN

Patterns/ Architecture/ Methodology

MVC Pattern, SOA Architecture, Client Server, N-Tiered Application, Waterfall Model, Scrum, RUP, Agile Methodology


Errol Thomas, PMP, CBAP A Project Management Professional (PMP), Certified Business Analysis Professional (CBAP), Certified Enterprise Architect (CEA), and Microsoft Certified Systems Engineer (MCSE) with over 25 years of experience as an Independent Project Oversight Consultant, Project Manager, Implementation Coordinator, Business Analyst, Lead Technical Analyst, and Trainer.

AREAS OF EXPERTISE
• The IT Project Oversight Framework
• The Post-Implementation Evaluation Report
• IEEE Standard for Software Verification & Validation
• IEEE Standard for IT – Software Life-Cycle Processes
• Various Enterprise Architecture frameworks
• The Customer Service Life-Cycle
• The Process Life-Cycle
• The ITIL & ITSM framework
• Capability Maturity Model Integration
• Various information technology platforms

PROFESSIONAL ACCOMPLISHMENTS METAVISTA CONSULTING GROUP, Sacramento, CA Jun 2001 – Present Principal Consultant As a Principal Consultant for MetaVista, responsible for:

• Providing consulting services to clients and maintaining a broad range of industry experience and knowledge.

• Successfully working with multiple layers of an organization, from executives, to business stakeholders and technical teams.

• Communicating ideas to all levels of an organization and obtaining buy-in from both the business and technical stakeholders.

• Developing a working understanding of the business and technical domains to solve a variety of business problems.

• Active participation in all phases of the project life-cycle (e.g. problem definition, planning, requirements elicitation, solution design, build, test, deploy, organizational change).

CA Dept of Corrections and Rehabilitation (Oct 2009 – Dec 2009)

• Business Analyst (BA) to the CDCR

• A project to develop an Executive Summary, introductory paragraphs for each of the six goals and a graphic mapping each of the objectives to the CDCR enterprise architecture model (based on industry standards) for the CDCR Enterprise Information Services Information Technology Strategic Plan

• Responsible for assisting with CDCR executive interviews

• Responsible for developing the Enterprise Architecture mapping of plan objectives

CA State Teachers’ Retirement System (Oct 2009 – May 2010)

• Senior Project Manager (PM) to CalSTRS

• A knowledge transfer project to develop Business Analyst training curriculum for the CalSTRS Continuous-Improvement Team


• Leader of the Knowledge Transfer Project team consisting of two consultants accountable for all planning and implementation activities of the Business Analyst training

• Product lead for all curriculum development efforts

• Develop a Project Management Plan based on the Project Management Institute (PMI) Project Management Body of Knowledge (PMBOK) that includes project charter, budget, schedule, communication, risk, issue, scope change, implementation, transition and closeout plans

• Facilitate:

• Sponsor meetings and status reporting activities

• Team meetings, develop and maintain agendas, minutes, and action items

Global Knowledge Training LLC (May 2007 – Present)

• Training consultant to GK

• Responsible for developing and delivering formal classroom training at GK nationwide training sites and contracted on-site delivery of the Global Knowledge Business Analysis curriculum based on the:

• International Institute of Business Analysis (IIBA) Business Analysis Body of Knowledge (BABOK);

• Project Management Institute (PMI) Project Management Body of Knowledge (PMBOK);

• IEEE Computer Society (IEEE) Software Engineering Body of Knowledge (SWEBOK);

• MITRE Corporation Enterprise Architecture Body of Knowledge (EABOK); and

• Other industry standards for IT solution development

• Training includes the following:

• Certified Business Analysis Professional (CBAP) Exam Prep Boot Camp

• Business Analysis Essentials

• Requirements Development, Documentation and Management

• Writing Effective Requirements

• Formal training materials and informal handouts used to relate key concepts to real world experiences are developed using Microsoft Office tools

CA Dept of Corrections and Rehabilitation (Apr 2009 – Aug 2009)

• Business Analyst (BA) to the CDCR

• A project to develop the five-year Information Technology Capital Plan (ITCP) and Agency Consolidation Plan (ACP) for the CDCR based on Office of the State Chief Information Officer (OCIO) Preparation Instructions and Consolidation Plan Reference Manual and submit them to the OCIO

• Responsible for discovering and documenting planned CDCR projects to be included in their five-year ITCP

• Responsible for documenting current project status for projects reported in the previous year's ITCP


• Assist with discovering potential IT related CDCR consolidation efforts and documenting them in their five-year ACP

• Develop implementation plans for each consolidation effort identified based on CDCR business environment and industry best practices

CA Dept of General Services (Aug 2008 – Dec 2008)

• Business Analyst (BA) to DGS

• A project to develop the Feasibility Study Report (FSR), Information Technology Procurement Plan (ITPP) and Request for Offer (RFO) for the data aggregation, trend analysis and reporting needs of the Pharmaceuticals Information Management System (PIMS) for the Procurement Division of DGS

• Lead member of the team that developed a Project Management Plan based on the Project Management Institute (PMI) Project Management Body of Knowledge (PMBOK) methodology that included project charter, budget, schedule, communication, risk, issue, requirement development and closeout plans

• Assisted with:

• Developing requirements and alternative solutions based on the client’s business need

• Identifying requirement risks

• Recommending a proposed solution selected from developed alternatives

• Developing an ITPP and RFO

CA Dept of Public Health (May 2008 – Jul 2008)

• Business Analyst (BA) to DPH

• A project to develop the Feasibility Study Report (FSR) and Information Technology Procurement Plan (ITPP) for the data aggregation, trend analysis and reporting needs of the Statewide Immunization Information System (SIIS) for the Immunization Branch of the DPH

• Lead member of the team that developed a Project Management Plan based on the Project Management Institute (PMI) Project Management Body of Knowledge (PMBOK) methodology that included project charter, budget, schedule, communication, risk, issue, requirement development and closeout plans

• Assisted with:

• Developing requirements and alternative solutions based on the client’s business need

• Identifying requirement risks

• Recommending a proposed solution selected from developed alternatives

• Developing an ITPP

Sac Co Dept of Health & Human Services (Oct 2003 – Apr 2005)

• Senior Project Manager (PM) & Lead Business Analyst (BA) to DHHS

• An outsourced Application Service Provider (ASP) modified off-the-shelf (MOTS) Microsoft .NET framework development project to implement a county-wide customized patient information management system (PIMS) that integrates clinical practice management, electronic medical records, fiscal accounting, and comprehensive reporting


• Required system interfaces included the State’s Medi-Cal Eligibility Data System (MEDS) and data integration with the County’s Mental Health Billing, Authorization and Reporting (MHBAR) system

• Leader of the California County Information Systems (CalCiS) Patient Information Management System (PIMS) project team consisting of four County staff and four consultants accountable for all planning and implementation activities of the PIMS business solution that included an Electronic Records Management (ERM) subproject

• Developed a Project Management Plan based on Sacramento County’s standard methodology (based on the Project Management Institute’s PMBOK), IEEE Standard for IT – Software Life-Cycle Processes, and IEEE Standard for Software Verification and Validation that included project methodology, charter, scope, quality management, communication, change management, schedule, requirements management, risk management, interfaces, data cleanup/conversion, testing, training, implementation, maintenance and operations and closeout plans

• Delivered monthly project oversight reports to the organization’s financial manager, PMO and steering committee reporting on the progress of the ASP and their MOTS solution

• Identified and facilitated various customer representative focus groups as a part of a Business Process Reengineering (BPR) effort that resulted in the ‘To-Be’ Activity Diagrams used to develop system requirements

• Assisted the customer with identifying key ‘As-Is’ business processes that would benefit from process improvement and ERM

• Facilitated

• Sponsor meetings and status reporting activities

• Team meetings, developed and maintained agendas, minutes, and action items

• Stakeholder interactions for requirements elicitation activities

FORMAL EDUCATION • California State University Sacramento, CA, 1984

• Course work towards Management Information Systems

• Sacramento City College, CA, 1983

• Course work towards Business Administration

• American River College, CA, 1983

• Course work towards Business Administration

PROFESSIONAL CERTIFICATIONS • International Institute of Business Analysis (IIBA)

• Certified Business Analysis Professional (CBAP), 2007 – Present

• Project Management Institute (PMI)

• Project Management Professional (PMP), 2003 – Present

• DCI eUniversity


• Certified Enterprise Architect (CEA), 2000

• Microsoft Corporation

• Microsoft Certified Systems Engineer (MCSE), 2000

PROFESSIONAL AFFILIATIONS • International Institute of Business Analysis – Sacramento Valley Chapter, 2007 – Present

• IEEE – Sacramento Valley Section – Region 6, 2004 – Present

• Project Management Institute – Sacramento Valley Chapter, 2003 – Present


IT Certifications • Copies of certifications for Mr. Chellappa were not available in time to be included in this proposal. Please see his resume earlier in this document for a description of his certifications.

• In order to limit the overall size of this proposal, only selected certifications are included here. Additional certifications will be made available upon request.
