

BUSINESS INTELLIGENCE | PERFORMANCE MANAGEMENT | CONSULTING | TECHNOLOGY

EXPERT OPINION

WHICH DATA QUALITY ASSURANCE FOR DECISION-MAKING ENVIRONMENTS?

In addition to large enterprise programs, which are important but long and expensive to implement, effective and immediate action can be taken to improve data quality and consistency in decision-making platforms. What approaches are possible, and what are the benefits?

Keyrus® 2012 - All rights reserved

50% of enterprise data managers estimate that within three years they will have a “Real-Time Data Quality” system. This technology is in second position after MDM systems. Source: “Next Generation Data Integration” report, TDWI, Second Quarter 2011.

Over the last ten years, many studies have revealed what poor data quality costs companies. Most of these studies focus on individual data items (wrong customer addresses, duplicates, etc.), underlining the consequences when sales and marketing teams work with incomplete, erroneous or inconsistent data. But the question is not limited to such details, which are only the tip of the iceberg. Data quality problems occur wherever there are databases, transactional systems and repositories, which is to say at absolutely every level of an enterprise. To improve data quality, therefore, action must be continuous across all of these systems. Even though effective software tools exist today, companies that have launched projects in this field (data quality, data governance, master data management, etc.) can testify to the difficulty of such an approach and to the time required to obtain tangible, durable improvements, particularly at the level of decision-making tools.
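The record-level defects these studies measure can be made concrete with a small profiling routine. The following is an illustrative sketch only (not any specific product's check): it counts incomplete records and duplicated keys in a batch of customer records, the two defect types cited above.

```python
# Illustrative sketch: record-level quality profiling of the kind these
# studies measure -- missing fields and duplicate keys in customer data.
# Field names ("customer_id", etc.) are invented for the example.
from collections import Counter

def profile_records(records, required_fields):
    """Count incomplete records and duplicated keys in a list of dicts."""
    incomplete = sum(
        1 for r in records
        if any(not r.get(f) for f in required_fields)
    )
    key_counts = Counter(r.get("customer_id") for r in records)
    duplicates = sum(c - 1 for c in key_counts.values() if c > 1)
    return {"total": len(records), "incomplete": incomplete, "duplicates": duplicates}

customers = [
    {"customer_id": "C1", "name": "Ada", "address": "12 rue X"},
    {"customer_id": "C2", "name": "Ben", "address": ""},         # missing address
    {"customer_id": "C1", "name": "Ada", "address": "12 rue X"}, # duplicate of C1
]
print(profile_records(customers, ["name", "address"]))
# -> {'total': 3, 'incomplete': 1, 'duplicates': 1}
```

Checks like this cover the "tip of the iceberg"; the sections below deal with quality assurance across the whole decision-making chain.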

TREATING THE PROBLEM ON THE DECISION-MAKING LEVEL

Faced with company managers’ urgent need for quality assurance, the most rapidly effective solution consists of concentrating efforts not on the transactional and operational systems upstream but on the decision-making environment itself, because of the central place it now occupies in running a company. Why? Because all key company data (ERP, CRM, back-office, etc.) now converges on the decision-making platform. There it is not only consolidated, but also transformed and processed to supply the reporting and dashboards on which directors and operational managers base their decisions. Consequently, it is essential to ensure at this level, first, that the input data is valid (complete, accurate and up-to-date) and, second, that the information delivered by the decision-making applications is internally consistent and complies with business standards.
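The input-side half of this check can be sketched as a gate at the warehouse boundary: each incoming row is tested for completeness, plausibility and freshness before it feeds any report. Field names and thresholds below are illustrative assumptions, not a specific product's API.

```python
# Hedged sketch of input validation at the decision-making platform's
# boundary: complete, plausible (accuracy), and up-to-date. The field
# names and the 1-day freshness threshold are assumptions for illustration.
from datetime import date, timedelta

def validate_input_row(row, today):
    """Return a list of validation errors for one incoming row (empty = valid)."""
    errors = []
    if row.get("amount") is None:
        errors.append("missing amount")           # completeness
    elif not (0 <= row["amount"] <= 1_000_000):
        errors.append("amount out of range")      # accuracy / plausibility
    if today - row["load_date"] > timedelta(days=1):
        errors.append("stale data")               # up-to-date
    return errors

row = {"amount": 2_500_000, "load_date": date(2012, 1, 1)}
print(validate_input_row(row, date(2012, 1, 10)))
# -> ['amount out of range', 'stale data']
```

The output-side half (consistency of the delivered information with business standards) is covered by the rule-based controls discussed later in the article.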

By Cyril COHEN-SOLAL | Country Manager, Keyrus




MEASURING THE CONSEQUENCES

If the information supplied by the decision-making system is wrong, incomplete or inconsistent, two related types of problem arise. The first is the users’ loss of confidence in the data, possibly even the outright rejection of decision-making tools. Typically, if the figures on the general manager’s dashboard do not match those on the sales manager’s report, which are right? Who decides? What caused the discrepancy? Did it arise in the source systems, in the data-loading process, or in the processing of the data? These very frequent situations lead users to mistrust the data supplied by business intelligence (BI) systems and to perform repeated checks, wasting a great deal of time without definitively resolving the problems detected. In addition, the BI teams are discredited.

The second problem is that defective data quality leads to unfortunate, possibly even detrimental decisions. For example, in the telecommunications sector, a company was preparing to terminate a product offering that, according to its reporting, was unprofitable. It realized in time that the opposite was true: the offering was profitable, but part of the revenue it generated had been wrongly assigned, which explained the negative margin appearing in the reporting.
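The dashboard-versus-report situation above is, at bottom, a reconciliation test: compute the same indicator along two reporting paths and flag any discrepancy beyond a tolerance. A minimal sketch, with invented figures:

```python
# Minimal sketch of a cross-report reconciliation check: the same indicator
# computed from two reporting paths should agree within a tolerance.
# The revenue figures below are invented for illustration.
def reconcile(label, value_a, value_b, tolerance=0.001):
    """Compare two versions of one figure; report the gap and whether it is acceptable."""
    gap = abs(value_a - value_b)
    rel = gap / max(abs(value_a), abs(value_b), 1e-9)
    return {"indicator": label, "gap": gap, "consistent": rel <= tolerance}

gm_dashboard_revenue = 12_450_000
sales_report_revenue = 12_610_000
print(reconcile("monthly revenue", gm_dashboard_revenue, sales_report_revenue))
# -> {'indicator': 'monthly revenue', 'gap': 160000, 'consistent': False}
```

Run automatically after each load, such a check turns the "which figure is right?" argument into an alert raised before anyone sees the dashboards.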

In another example, a factory’s data-entry error on a unit (grams instead of milligrams) in the use of toxic products could not be detected at the unit level, but at the consolidated level it triggered a serious, and unjustified, statutory inquiry. In both examples, the teams responsible for the decision-making tools had no means of monitoring the quality of the data entering and leaving the decision-making system, and thus could not warn users about possible inconsistencies or aberrations.
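A unit error like grams-for-milligrams is invisible row by row, but it multiplies the aggregate by a factor of a thousand, which a simple consolidated-level plausibility check would catch. The sketch below compares a total against its historical baseline; the data and the threshold are assumptions for illustration.

```python
# Sketch of a consolidated-level plausibility check like the one missing in
# the g-vs-mg example: a unit error is invisible per row, but it makes the
# aggregate jump by orders of magnitude against history. Data is invented.
def consolidated_anomaly(current_total, history, max_ratio=10.0):
    """Flag a total that departs from the historical mean by more than max_ratio."""
    baseline = sum(history) / len(history)
    ratio = current_total / baseline
    return ratio > max_ratio or ratio < 1 / max_ratio

history_kg = [4.8, 5.1, 5.0, 4.9]               # monthly toxic-product usage
print(consolidated_anomaly(5.2, history_kg))     # normal month -> False
print(consolidated_anomaly(5000.0, history_kg))  # g entered instead of mg -> True
```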

A MECHANISM FOR CONTINUOUS CONTROL AND VALIDATION

More than ever, the issue for decision-making systems is to provide users with reliable data. Deploying a quality assurance logic in the decision-making environment itself amounts to setting up a permanent mechanism for data control and validation, comparable to antivirus software. An antivirus protects a computer against 90% of problems by systematically blocking suspicious software and data, and by continuously checking the installed software for suspicious behavior. Even so, it does not cover every risk, and must be complemented by collective governance rules and individual vigilance.

A data quality policy must likewise combine two components: on the one hand, systematic controls and alerts under the responsibility of the BI team; on the other, vigilance and governance under the joint responsibility of the BI and operating teams.

INTEGRATED MANAGERS IN THE BI QUALITY PROCESS

On account of the variety and the ever-increasing number of data streams entering and leaving decision-making systems, it has become practically impossible to manually develop and maintain all the control scripts needed to guarantee data quality in every field critical to the company. For example, a major French bank chose to develop a script-based test mechanism; after two years of work, this solution covered only a third of its control and validation needs. Now, with the arrival of new configurable tools, it is possible to automate control and validation tests. The tests are launched every time data is loaded into the data warehouse, checking the accuracy and consistency of the data and guaranteeing the integrity of the repositories by completing any missing data where necessary. The other major contribution of this type of solution is that it allows managers to define the consistency rules for the key data and indicators of their company, business unit or department, for example: “the per-product margin rate cannot be negative”, “the risk per contract cannot exceed this threshold level”, “vehicle policyholders cannot be under the age of 18”, “revenues cannot double from one month to the next”, etc.

THE THREE GOLDEN RULES OF BI QUALITY ASSURANCE
1. Integrate business players and their business rules
2. Install real-time monitoring, from the source to reporting
3. Monitor, correct, and improve continuously

In the event of non-compliance with these rules, an alert is immediately issued, and additional tests are launched to pinpoint the origin of the problem. The managers concerned are alerted, and the problem can be corrected before the defective, abnormal or atypical data spreads into the decision-making environment and across the company.
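The configurable rules described above can be sketched as data rather than hand-written scripts: each rule is a name plus a predicate, evaluated on every load, and any violation becomes an alert. This is a minimal illustration using the article's own example rules, not the API of the "Quality Gates" product it discusses.

```python
# Minimal sketch of manager-defined consistency rules, using the article's
# own examples. Each rule is declarative data (name + predicate), so the
# rule set can grow without hand-maintaining control scripts. Not a real
# product API; field names are assumptions for illustration.
RULES = [
    ("margin rate cannot be negative",
     lambda row: row["margin_rate"] >= 0),
    ("policyholder must be 18 or older",
     lambda row: row["policyholder_age"] >= 18),
    ("revenue cannot double month over month",
     lambda row: row["revenue"] <= 2 * row["prev_revenue"]),
]

def run_rules(row, rules=RULES):
    """Return the names of all rules the row violates (empty list = clean load)."""
    return [name for name, check in rules if not check(row)]

row = {"margin_rate": -0.08, "policyholder_age": 34,
       "revenue": 150_000, "prev_revenue": 140_000}
print(run_rules(row))
# -> ['margin rate cannot be negative']
```

Because the rule names come from the business side, the resulting alerts read in business terms, which is what lets the managers concerned react before the data spreads.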

AN APPROACH THAT REINFORCES THE VALUE OF BI

Setting up a data quality assurance solution at the level of the decision-making platform offers many advantages. The first is being able to progress quickly, independently of any enterprise-level data quality projects. The second is that it forces the parties involved (technical and operating teams) to think about the level of quality that is required or acceptable; in fact, it is not always relevant to aim for total quality, and the focus can be placed on only the most critical or problematic aspects. A third advantage is the involvement of the managers, which balances responsibility between those who ensure the technical production of data and those who consume it, and who are therefore in a position to judge its relevance, validity and consistency from the business point of view. The logical effect of this shared, concerted responsibility is users’ greater trust in the data supplied to them. BI tools, now considered more reliable, are used more, strengthening their return on investment. Finally, this type of quality approach is becoming almost essential in a context where collaborative BI and personal BI tools are increasingly popular. Bringing these tools into the controlled scope helps stop the proliferation within the company of unverified data, a potential source of decisions that are, at best, inappropriate and, at worst, highly detrimental.



C.C-S.

ABOUT KEYRUS

As a major player in consulting and the integration of Business Intelligence and e-Business solutions for corporate customers, and of ERP/CRM solutions for the mid-market, Keyrus now has some 1,600 employees in eleven countries, helping its customers optimize their performance through a full range of services in the following areas:

• Management Consulting

• Business Intelligence - Performance Management

• e-Business – Web Performance

• Corporate Management Solutions (ERP/CRM)

The Keyrus Group is listed on Eurolist, Euronext Paris (Compartment C / Small caps; ISIN: FR0004029411; Reuters: KEYR.LN; Bloomberg: KEYP FP).

For more information, see: www.keyrus.com

Marketing Communication Department, 155 rue Anatole France, 92593 Levallois-Perret Cedex. Tel.: +33 (0)1 41 34 10 46 - [email protected]

Holding a Management Data Processing degree from Paris Dauphine University, Cyril Cohen-Solal has 15 years of expertise in the Business Intelligence field, during which he has supported major corporations in France and abroad in the design and implementation of their decision-making systems. Within the Keyrus Group since its 2011 acquisition of the software publisher Vision.bi, he is in charge of distributing the “Quality Gates” range of solutions for improving the quality of decision-making environments. In parallel with these activities, he has been teaching for eight years in the Master in Decision-Making Data Processing program at Paris Dauphine.