Class Note: Total Quality Management


Pareto analysis is done in a TQM company to evaluate the reliability of the quality system and to sort out the few most significant factors responsible for the most significant effects on the system. The principle of TQM implementation in a business activity is to set the core values first, and then the techniques and tools. The core values adopted in a TQM company must not be based on the tools and techniques available in the company; rather, the tools and techniques must be chosen to support the core values. Likewise, the core values must not be based on the existing skills and ethics of the employees; in an efficient quality company it is the employees' skills and ethics that are developed to support the core values. Of the different techniques used under the TQM principle, Pareto analysis is combined with a few other important and frequently used techniques such as FMEA (Failure Mode and Effects Analysis) and DMAIC (Define, Measure, Analyse, Improve and Control).

TQM is based on certain core values of an organisation interested in implementing Total Quality Management. The different core values of a TQM company, along with the tools and techniques generally employed in the system, are discussed below.

Objectives of Pareto Analysis: Use this approach to identify which challenges you should tackle first.

Imagine that you've just stepped into a new role as head of department. Unsurprisingly, you've inherited a whole host of problems that need your attention.

Ideally, you want to focus your attention on fixing the most important problems. But how do you decide which problems you need to deal with first? And are some problems caused by the same underlying issue?

Pareto Analysis is a simple technique for prioritizing possible changes by identifying the problems that will be resolved by making these changes. By using this approach, you can prioritize the individual changes that will most improve the situation.

Pareto Analysis uses the Pareto Principle – also known as the "80/20 Rule" – which is the idea that 20 percent of causes generate 80 percent of results. With this tool, we're trying to find the 20 percent of work that will generate 80 percent of the results that doing all of the work would deliver.

Note:

The figures 80 and 20 are illustrative – the Pareto Principle illustrates the lack of symmetry that often appears between work put in and results achieved. For example, 13 percent of work could generate 87 percent of returns. Or 70 percent of problems could be resolved by dealing with 30 percent of the causes.


How to Use the Tool

Step 1: Identify and List Problems

Firstly, write a list of all of the problems that you need to resolve. Where possible, talk to clients and team members to get their input, and draw on surveys, helpdesk logs and suchlike, where these are available.

Step 2: Identify the Root Cause of Each Problem

For each problem, identify its fundamental cause. (Techniques such as Brainstorming, the 5 Whys, Cause and Effect Analysis, and Root Cause Analysis will help with this.)

Step 3: Score Problems

Now you need to score each problem. The scoring method you use depends on the sort of problem you're trying to solve.

For example, if you're trying to improve profits, you might score problems on the basis of how much they are costing you. Alternatively, if you're trying to improve customer satisfaction, you might score them on the basis of the number of complaints eliminated by solving the problem.

Step 4: Group Problems Together By Root Cause

Next, group the problems together by cause. For example, if three of your problems are caused by lack of staff, put these in the same group.

Step 5: Add up the Scores for Each Group

You can now add up the scores for each cause group. The group with the top score is your highest priority, and the group with the lowest score is your lowest priority.

Step 6: Take Action

Now you need to deal with the causes of your problems, dealing with your top-priority problem, or group of problems, first.

Keep in mind that low-scoring problems may not even be worth bothering with; solving them may cost you more than the solutions are worth.
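Steps 3 to 5 are easy to mechanize. Below is a minimal Python sketch of the scoring, grouping and ranking; the problems, causes and scores are purely illustrative, not data from any real project.

from collections import defaultdict

# (problem, root cause, score) -- here the score is, say, the number
# of customer complaints attributable to each problem (Step 3).
problems = [
    ("Phones not answered quickly", "Too few staff", 15),
    ("Staff distracted at peak times", "Too few staff", 6),
    ("Engineers arrive late", "Poor scheduling", 4),
    ("Quotes not followed up", "Poor scheduling", 3),
    ("Website hard to navigate", "Outdated website", 2),
]

# Step 4: group the problems together by root cause.
totals = defaultdict(int)
for _problem, cause, score in problems:
    totals[cause] += score

# Step 5: the cause group with the highest total score is the top priority.
for cause, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cause}: {total}")

Running this ranks "Too few staff" first, which is exactly the priority ordering that Step 6 then acts on.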

Note:

While this approach is great for identifying the most important root cause to deal with, it doesn't take into account the cost of doing so. Where costs are significant, you'll need to use techniques such as Cost/Benefit Analysis, and use IRRs and NPVs to determine which changes you should implement.


Pareto Analysis is a simple technique for prioritizing problem-solving work so that the first piece of work you do resolves the greatest number of problems. It's based on the Pareto Principle (also known as the 80/20 Rule) – the idea that 80 percent of problems may be caused by as few as 20 percent of causes.

Pareto Analysis involves identifying and listing problems and their causes, scoring each problem, grouping the problems together by their cause, and adding up the scores for each group. Work then begins on a solution to the cause of the problems in the group with the highest score.

Pareto Analysis not only shows you the most important problem to solve, it also gives you a score showing how severe the problem is.

Pareto Analysis Example:

Analyse the following data table using the Pareto method, presenting the information derived from the data pictorially to identify the vital 20% of causes that need to be taken care of to bring about an 80% overall improvement.

Cause                    Frequency (%)    Cumulative frequency (%)
Technical failure             42                   42
Workforce problems            35                   77
Environmental factors         12                   89
Shortage of resources          8                   97
Government approval            3                  100

A graphical representation of the analysis provides a more powerful tool, which is easily comprehensible as well, to identify the relative importance of all the enlisted causes, and come to a final decision on what are the few key causes significantly affecting the project. Here’s how Pareto Analysis data can be pictorially presented to identify the vital 20% causes that need to be taken care of to bring about an 80% overall improvement.

1. Make a list of all the probable causes in one column and in the adjacent column fill in the frequency of occurrence, in percentage form, for each of the causes. Sort the data, based on the frequency of occurrence, in a descending order.

2. In the next, adjacent column calculate the cumulative frequency of occurrence, which by default will appear in percentages.

3. Finally, make a bar graph using the data, plotting the frequency along the y-axis and the causes along the x-axis. The cumulative frequency can be represented as a line. The resulting bar chart will make it clear which key causes are responsible for 80% of the problems related to the project.
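Assuming Python with matplotlib is available, the three steps above can be sketched as follows, using the data from the example table:

import matplotlib.pyplot as plt

# Step 1: causes with their frequency of occurrence (%), sorted descending.
causes = ["Technical failure", "Workforce problems", "Environmental factors",
          "Shortage of resources", "Government approval"]
freq = [42, 35, 12, 8, 3]

# Step 2: cumulative frequency (%) -- gives [42, 77, 89, 97, 100].
cum = [sum(freq[:i + 1]) for i in range(len(freq))]

# Step 3: bar graph of frequencies, with the cumulative frequency as a line.
fig, ax = plt.subplots()
ax.bar(causes, freq)
ax.plot(causes, cum, marker="o", color="red")
ax.axhline(80, linestyle="--", color="gray")  # 80% reference line
ax.set_xlabel("Causes")
ax.set_ylabel("Frequency (%)")
plt.xticks(rotation=30, ha="right")
plt.tight_layout()
plt.show()

Reading the chart, the cumulative line crosses the 80% reference between the second and third causes, so technical failure, workforce problems and environmental factors together account for roughly 80-90% of the problems.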

Pareto Analysis Can Be Used Universally:

The Pareto problem-solving technique is an extremely powerful tool: simple to use yet very effective in finding solutions to problems. Although the technique is mostly based on finding the 20% of vital causes that lead to 80% of the problems, it can still be used effectively in situations where the 80:20 rule does not apply cleanly. The main objective of this approach to problem solving is to identify the chief causes and respond to them immediately, so as to improve the overall output of a project. While this note has concentrated mainly on how Pareto analysis can be used in a problem-solving project, today's managers use it for decision making as well as for problem solving in other spheres of business.



Additional Information:

Pareto analysis:

Pareto analysis is a formal technique useful where many possible courses of action are competing for attention. In essence, the problem-solving process estimates the benefit delivered by each action, then selects a number of the most effective actions that deliver a total benefit reasonably close to the maximal possible one. However, it can be limited by its exclusion of possibly important problems which may be small initially, but which grow with time. It should therefore be combined with other analytical tools such as failure mode and effects analysis and fault tree analysis.


This technique helps to identify the top portion of causes that need to be addressed to resolve the majority of problems. Once the predominant causes are identified, tools like the Ishikawa diagram or fish-bone analysis can be used to identify the root causes of the problems. While it is common to refer to Pareto as the "80/20" rule, under the assumption that, in all situations, 20% of causes determine 80% of problems, this ratio is merely a convenient rule of thumb and is not, nor should it be considered, an immutable law of nature.

The application of the Pareto analysis in risk management allows management to focus on those risks that have the most impact on the project.

Steps to identify the important causes using the 80/20 rule

1. Form an explicit table listing the causes and their frequency as a percentage.
2. Arrange the rows in decreasing order of importance of the causes (i.e., the most important cause first).
3. Add a cumulative percentage column to the table.
4. Plot the causes on the x-axis and the cumulative percentage on the y-axis.
5. Join the above points to form a curve.
6. Plot (on the same graph) a bar graph with the causes on the x-axis and the percent frequency on the y-axis.
7. Draw a line at 80% on the y-axis parallel to the x-axis. Then drop a line from the point of intersection of this line with the curve down to the x-axis. This point on the x-axis separates the important causes (on the left) from the trivial causes (on the right).
8. Explicitly review the chart to ensure that causes for at least 80% of the problems are captured.
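As a rough illustration of steps 7 and 8, the cut-off point can also be found directly from the cumulative percentages. The Python sketch below reuses the example data from earlier; the data and the 80% threshold are illustrative.

causes = ["Technical failure", "Workforce problems", "Environmental factors",
          "Shortage of resources", "Government approval"]
freq = [42, 35, 12, 8, 3]  # percent frequency, sorted descending

# Cumulative percentage column (step 3).
cum, running = [], 0
for f in freq:
    running += f
    cum.append(running)

# The first index at which the cumulative percentage reaches at least 80%
# separates the important causes (left) from the trivial ones (right).
cut = next(i for i, c in enumerate(cum) if c >= 80)
print("Important causes:", causes[:cut + 1])
print("Trivial causes:", causes[cut + 1:])

With this data the important causes are the first three, which together cover 89% of the problems, satisfying the step-8 check that at least 80% is captured.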

The Pareto principle (also known as the 80–20 rule, the law of the vital few, and the principle of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of the causes. Management consultant Joseph M. Juran suggested the principle and named it after Italian economist Vilfredo Pareto, who, while at the University of Lausanne in 1896, published his first work, "Cours d'économie politique." In it, Pareto showed that approximately 80% of the land in Italy was owned by 20% of the population; he is also said to have developed the principle by observing that 20% of the peapods in his garden contained 80% of the peas.

It is a common rule of thumb in business; e.g., "80% of your sales come from 20% of your clients." Mathematically, the 80–20 rule is roughly followed by a power law distribution (also known as a Pareto distribution) for a particular set of parameters, and many natural phenomena have been shown empirically to exhibit such a distribution.
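This claim is easy to probe numerically. The sketch below (illustrative, using only the Python standard library) draws samples from a Pareto distribution with index alpha = log 5 / log 4 ≈ 1.16, the value discussed in the mathematical notes further down, and measures the share of the total held by the top 20% of samples:

import math
import random

alpha = math.log(5) / math.log(4)  # ~1.16, the "80/20" Pareto index
random.seed(0)
samples = sorted((random.paretovariate(alpha) for _ in range(100_000)),
                 reverse=True)

top20 = samples[: len(samples) // 5]   # top 20% of causes
share = sum(top20) / sum(samples)      # their share of total effects
print(f"Share of total from top 20%: {share:.2f}")  # roughly 0.8

Because the distribution is heavy-tailed, individual runs fluctuate, but the share hovers around 0.8, as the principle predicts.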

The Pareto principle is only tangentially related to Pareto efficiency. Pareto developed both concepts in the context of the distribution of income and wealth among the population.

The distribution is claimed to appear in several different aspects relevant to entrepreneurs and business managers. For example:

80% of a company's profits come from 20% of its customers
80% of a company's complaints come from 20% of its customers
80% of a company's profits come from 20% of the time its staff spend
80% of a company's sales come from 20% of its products
80% of a company's sales are made by 20% of its sales staff

Therefore, many businesses have easy access to dramatic improvements in profitability by focusing on the most effective areas and eliminating, ignoring, automating, delegating or retraining the rest, as appropriate.

Limited applicability to science

The more unified a theory is, the more predictions it makes, and the greater the chance that some of them will be cheaply testable. Modifications of existing theories make far fewer new and unique predictions, increasing the risk that the few they do make will all be very expensive to test. If the Pareto principle, or any other kind of increased cost, were the cause of stagnation in unification (especially in physics), the modification of existing theories would have been slowed even more severely than unification by breakthroughs.

In software

In computer science and engineering control theory, such as for electromechanical energy converters, the Pareto principle can be applied to optimization efforts.

For example, Microsoft noted that by fixing the top 20% of the most-reported bugs, 80% of the related errors and crashes in a given system would be eliminated.

In load testing, it is common practice to estimate that 80% of the traffic occurs during 20% of the time. In software engineering, Lowell Arthur expressed a corollary principle: "20 percent of the code has 80 percent of the errors. Find them, fix them!"

Occupational health and safety:

The Pareto principle is used in occupational health and safety to underline the importance of hazard prioritization. Assuming that 20% of the hazards account for 80% of the injuries, safety professionals can categorize hazards and then target the 20% of hazards that cause 80% of the injuries or accidents. Alternatively, if hazards are addressed in random order, a safety professional is more likely to fix one of the 80% of hazards that account only for the remaining 20% of injuries.

Aside from ensuring efficient accident prevention practices, the Pareto principle ensures that hazards are addressed in an economical order, since the technique directs resources toward preventing the most accidents.


Other applications:

In the systems science discipline, Epstein and Axtell created an agent-based simulation model called SugarScape, from a decentralized modeling approach, based on individual behaviour rules defined for each agent in the economy. Wealth distribution and Pareto's 80/20 principle became emergent in their results, which suggests the principle is a natural phenomenon.

The Pareto principle has many applications in quality control. It is the basis for the Pareto chart, one of the key tools used in total quality control and six sigma. The Pareto principle serves as a baseline for ABC-analysis and XYZ-analysis, widely used in logistics and procurement for the purpose of optimizing stock of goods, as well as costs of keeping and replenishing that stock.

The Pareto principle was also mentioned in the book 24/8 - The Secret for being Mega-Effective by Achieving More in Less Time by Amit Offir. Offir claims that if you want to function as a one-stop shop, simply focus on the 20% of what is important in a project and that way you will save a lot of time and energy.

In health care in the United States, 20% of patients have been found to use 80% of health care resources. Several criminology studies have found 80% of crimes are committed by 20% of criminals. This statistic is used to support both stop-and-frisk policies and broken windows policing, as catching those criminals committing minor crimes will likely net many criminals wanted for (or who would normally commit) larger ones.

In the financial services industry, this concept is known as profit risk, where 20% or fewer of a company's customers are generating positive income, while 80% or more are costing the company money.

Mathematical notes

The idea has rule of thumb application in many places, but it is commonly misused. For example, it is a misuse to state a solution to a problem "fits the 80–20 rule" just because it fits 80% of the cases; it must also be that the solution requires only 20% of the resources that would be needed to solve all cases. Additionally, it is a misuse of the 80–20 rule to interpret data with a small number of categories or observations.

This is a special case of the wider phenomenon of Pareto distributions. If the Pareto index α, which is one of the parameters characterizing a Pareto distribution, is chosen as α = log₄ 5 = log 5 / log 4 ≈ 1.16, then one has 80% of effects coming from 20% of causes.
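As a short sketch of where this value comes from (assuming the standard Pareto density with minimum x_m and index α, consistent with the definition above):

\[
  f(x) = \frac{\alpha\, x_m^{\alpha}}{x^{\alpha+1}}, \qquad
  p = \Pr(X > x) = \left(\frac{x_m}{x}\right)^{\alpha}, \qquad
  w = \left(\frac{x_m}{x}\right)^{\alpha-1} = p^{(\alpha-1)/\alpha},
\]

where p is the top fraction of causes above a threshold x and w is their share of the effects. Requiring the top \(p = 0.2\) of causes to carry \(w = 0.8\) of the effects gives

\[
  0.8 = 0.2^{(\alpha-1)/\alpha}
  \;\Longrightarrow\;
  1 - \frac{1}{\alpha} = \frac{\ln 0.8}{\ln 0.2} = 1 - \frac{\ln 4}{\ln 5}
  \;\Longrightarrow\;
  \alpha = \frac{\ln 5}{\ln 4} = \log_4 5 \approx 1.16.
\]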

It follows that one also has 80% of that top 80% of effects coming from 20% of that top 20% of causes, and so on. Eighty percent of 80% is 64%, and 20% of 20% is 4%, so this implies a "64-4" law; iterating once more similarly implies a "51.2-0.8" law. Similarly, for the bottom 80% of causes and the bottom 20% of effects, the bottom 80% of the bottom 80% causes only 20% of the remaining 20%. This is broadly in line with world population and wealth data, in which the bottom 60% of people own about 5.5% of the wealth.


The 64-4 correlation also implies a 32% 'fair' area between the 4% and the 64%, where the lower 80% of the top 20% (16%) and the upper 20% of the bottom 80% (also 16%) relate to the corresponding lower-top and upper-bottom shares of effects (32%). This is also broadly in line with world population and wealth data, in which the second 20% control about 12% of the wealth, and the bottom of the top 20% (presumably) control about 16% of the wealth.

The term 80–20 is only shorthand for the general principle at work. In individual cases, the distribution could just as well be, say, 80–10 or 80–30. There is no need for the two numbers to add up to 100, as they are measures of different things (e.g., 'number of customers' vs. 'amount spent'). However, each case in which they do not add up to 100% is equivalent to one in which they do; for example, as noted above, the "64-4 law" (in which the two numbers do not add up to 100%) is equivalent to the "80–20 law" (in which they do). Thus, specifying two percentages independently does not lead to a broader class of distributions than specifying the larger one and letting the smaller one be its complement relative to 100%; there is only one degree of freedom in the choice of that parameter.

Adding up to 100 leads to a nice symmetry. For example, if 80% of effects come from the top 20% of sources, then the remaining 20% of effects come from the lower 80% of sources. This is called the "joint ratio", and can be used to measure the degree of imbalance: a joint ratio of 96:4 is very imbalanced, 80:20 is significantly imbalanced (Gini index: 60%), 70:30 is moderately imbalanced (Gini index: 40%), and 55:45 is just slightly imbalanced.

The Pareto principle is an illustration of a "power law" relationship, which also occurs in phenomena such as brush fires and earthquakes. Because it is self-similar over a wide range of magnitudes, it produces outcomes completely different from Gaussian distribution phenomena. This fact explains the frequent breakdowns of sophisticated financial instruments, which are modeled on the assumption that a Gaussian relationship is appropriate to, for example, stock price movements.

Equality measures

Gini coefficient and Hoover index

Using the "A : B" notation (for example, 0.8:0.2) and with A + B = 1, inequality measures like the Gini index (G) and the Hoover index (H) can be computed. In this case both are the same.

Theil index

The Theil index is an entropy measure used to quantify inequalities. The measure is 0 for 50:50 distributions and reaches 1 at a Pareto distribution of 82:18. Higher inequalities yield Theil indices above 1.
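One way to reproduce the quoted endpoints, under the simplifying assumption of the same two-group A:B split, is the symmetrized two-point form

\[
  T \;=\; (A - B)\,\ln\frac{A}{B},
\]

which gives \(T = 0\) for a 50:50 split and \(T = 0.64 \ln(82/18) \approx 0.97 \approx 1\) for 82:18.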


TQM-Tools & Techniques

Total Quality Management (TQM) is a management strategy aimed at embedding awareness of quality in all organizational processes (Siddiqui, Haleem, & Wadhwa, 2009). TQM is defined by the Deming Prize Committee of the Union of Japanese Scientists and Engineers (JUSE, 2010) as: “a set of systematic activities carried out by the entire organization to effectively and efficiently achieve the organization’s objectives so as to provide products and services with a level of quality that satisfies customers, at the appropriate time and price”.

There are many proposed tools and techniques for achieving the TQM promises. Generally, a technique can be considered as a number of activities performed in a certain order to reach the values (Hellsten & Klefsjö, 2000). Tools, on the other hand, sometimes have a statistical basis to support decision making or to facilitate the analysis of data.

Bunney and Dale (1997) reported that on the subject of quality management, there are many studies that agree on the vital role of the use and selection of quality management tools and techniques to support and develop the quality improvement process. However, they emphasized that organizations do encounter a range of difficulties in their use and application of quality management tools and techniques.

Most of the studies of TQM implementation focus on the concept of TQM. There are very few studies in the literature that directly suggest an implementation roadmap for TQM tools and techniques, and usually they are not complete roadmaps. Therefore, in this research, the literature on the currently proposed tools and techniques for TQM implementation is first reviewed; then, based on the results of this review, a comprehensive roadmap for TQM implementation is proposed that covers all the cited tools and techniques.

Bunney and Dale (1997) stated that the introduction of quality management tools and techniques depends, to a certain extent, on the phase of the improvement process. The introduction of TQM can take place in three phases: (I) diagnosis and preparation, (II) management focus and commitment, and (III) intensive improvement. The diagnostic and preparation phase of TQM requires the introduction of a number of fact-finding tools, some of which are “cost of quality” and “Departmental Purpose Analysis (DPA)”. The management focus and commitment phase requires the use of data analysis tools (e.g. cause & effect analysis, flow charts and Pareto analysis) to identify problem areas, quantify their effects and prioritize the need for solutions. During the intensive improvement phase, the introduction of more complex tools (e.g. statistical process control (SPC) and failure mode and effects analysis (FMEA)) helps to facilitate company-wide improvement.

Bunney and Dale (1997) also categorized TQM tools and techniques in two different ways: first into five categories according to their application, and second into seven categories according to the function in which they can be used. Table 1 shows the TQM tools according to their application.


Table 1: Analysis of Application of Tools and Techniques (Bunney & Dale, 1997)

Table 2: Analysis of Tools and Techniques Used within each Function (Bunney & Dale, 1997)

SPC---Statistical Process Control
FMEA---Failure Mode and Effect (C---Criticality) Analysis
QFD---Quality Function Deployment

In another study, Hellsten and Klefsjö (2000) suggested a new definition of TQM. Their study indicates that TQM can be defined as a management system which consists of three interdependent units, namely core values, techniques and tools. The idea is that the core values must be supported by techniques, such as process management, benchmarking, customer-focused planning, or improvement teams, and by tools, such as control charts, the quality house or Ishikawa diagrams, in order to be part of a culture. They emphasized that this systematic definition will make it easier for organizations to understand and implement TQM. Therefore, the implementation work should begin with the acceptance of the core values that characterize the culture of the organization. The next step is to continuously choose techniques that are suitable for supporting the selected values. Ultimately, suitable tools have to be identified and used in an efficient way in order to support the chosen techniques.

Figure 1: TQM as a Management System Consists of Values, Techniques and Tools (Hellsten & Klefsjö, 2000)

Figure 1 illustrates this definition (The techniques and tools in the figure are just examples and not a complete list).

According to Hellsten and Klefsjö (2000), the basis for the culture of the organization is its core values. Another component is techniques, i.e. ways of working within the organization to reach the values. A technique consists of a number of activities performed in a certain order. The important concept here is that TQM really should be looked on as a system: the values are supported by techniques and tools to form a whole. We have to start with the core values and ask: which core values should characterize our organization? When this is decided, we have to identify techniques that are suitable for our organization to use and that support our values. Finally, from that decision the suitable tools have to be identified and used in an efficient way to support the techniques (see Figure 2).

Figure 2: TQM Implementation Steps (Hellsten & Klefsjö, 2000)

Hellsten & Klefsjö (2000) indicated, as an example, that “Benchmarking” should not be used without seeing the reason for using that technique, and that an organization should not use control charts without seeing the core value behind the choice and a systematic implementation roadmap for the techniques and tools. It is, of course, important to note that a particular technique can support different core values, and that the same tool can be useful within many techniques.

In another work, Scheuermann, Zhu, & Scheuermann (1997) looked at 15 frequently used TQM tools and classified them into qualitative TQM tools and quantitative TQM tools. Qualitative tools consist mainly of subjective inputs, which often do not intend to measure something of a numerical nature. Quantitative tools, on the other hand, involve either the extension of historical data or the analysis of objective data, and so usually avoid the personal biases that sometimes contaminate qualitative tools. They categorized the TQM tools as below:

Qualitative tools: flow charts; cause-and-effect diagrams; multi-voting; affinity diagram; process action teams; brainstorming; election grids; task lists.

Quantitative tools: Shewhart cycle (PDCA); control charts; scatter diagrams; Pareto charts; sampling; run charts; histograms.

In another study, Ahmed and Hassan (2003) introduced a different method. They indicated that, from the point of view of presenting data on a process, tools can be classified as graphical tools and flow diagrams. In the graphical tools class there are histograms, stem-and-leaf diagrams, line charts, bar charts, pie charts, run or time-series charts, control charts, and Pareto diagrams, and in the flow diagrams class there are flow diagrams, process flow charts, cause-and-effect diagrams, and tree diagrams. Tools like check sheets, location plots, and data tables can be used to facilitate data collection and summarization.

For analyzing quality management aspects, these basic quality control tools are powerful and acceptable. Force-field analysis, the nominal group technique, affinity diagrams, interrelationship diagrams, tree diagrams, matrix diagrams, prioritization matrices, process decision program charts (PDPC), and activity network diagrams (PERT, CPM, arrow diagram, AoN) are some of the relevant management tools associated with quality management that can be applied to generate and treat soft data. Dale (2003) listed these under the seven management tools (“M7”), including affinity diagrams, relation diagrams, systematic diagrams, matrix data analysis, PDPCs, and arrow diagrams.

Rao et al. (1996) categorized brainstorming, affinity diagrams (or structured brainstorming), process potential index Cp, process performance index Cpk, Taguchi's loss function, and design of experiment (DoE) as advanced tools.

Ahmed and Hassan (2003) indicated that a systematic approach can produce very significant benefits in the long run. Deming's plan-do-study/check-act (PDSA/PDCA) cycle is an excellent technique for monitoring and problem solving in continuous quality improvement, within which the bright ideas of individuals can be accommodated. However, a good number of other tools and techniques have to be brought in to apply it properly; in other words, it integrates a few essential tools and techniques. In fact, without a strategic disposition, no tool or technique should be used in isolation. Figure 3 depicts the systematic use of various tools in different operational stages.


In another study in this area, Fazel & Salegna (1996) tried to group the major TQM tools and techniques into six major categories as determined by their primary area of implementation focus:

1) customer-based;
2) management-based;
3) employee-based;
4) supplier-based;
5) process-based; and
6) product-based.

Table 3 shows this classification. The above-mentioned categories are described below (Salegna & Fazel, 1996):

1) Customer-based strategies should be the focal point of every TQM programme, around which all other strategies are formulated. Customer satisfaction is only likely to be achieved and maintained when the customer plays an active role in the organization's process of quality improvement. Major techniques used to accomplish this are customer needs analysis, customer surveys and quality function deployment.

2) Management-based strategies are also extremely important for the successful implementation of TQM. TQM initiatives are not likely to succeed without strong leadership and support from top management. The goals and the benefits of implementing TQM must be clearly communicated by top management to the workforce. The alignment of the reward structure with the goals of the organization is also vital to the organization's success in achieving these goals.


Table 3: TQM Tools and Techniques Categories (Salegna & Fazel, 1996)

Implementation Strategy    Tools and Techniques

1) Customer-based          Customer surveys; customer needs analysis; quality function deployment
2) Management-based        Reward structure; communication; leadership
3) Employee-based          Teamwork; empowerment; cross-training; quality circles; quality teams; brainstorming; nominal group technique
4) Supplier-based          Supplier research; supplier training; supplier documentation; supplier certification
5) Process-based           Statistical process control; quality improvement process; just-in-time; lead time reduction; benchmarking; quality cost analysis; quality audits; quality assessment; process documentation; ISO 9000; work flow analysis
6) Product-based           Standardization; benchmarking; design of experiments; concurrent engineering; product flow analysis

3) Employee-based strategies provide a means of increasing the participatory role of workers. Strategies such as empowerment, teamwork and cross-training may result in employees having increased decision making authority, greater job responsibilities, and increased motivation and sense of pride in their work. Quality programmes may also benefit from employee suggestions resulting from other group activities including quality teams, quality circles, the nominal group technique and brainstorming.

4) Supplier-based strategies provide a means of increasing an organization’s likelihood of having suppliers who are reliable and willing to work towards the organization’s goals of providing a quality product. Given the trend towards companies reducing the number of suppliers and cultivating long-term relationships with the remaining ones, these strategies are particularly important today.


5) Process-based strategies focus on improving processes by reducing waste, defect rates and cycle time, and by providing feedback on the performance of the process. Benchmarking, SPC and JIT are some of the most popular techniques employed by companies to achieve these goals.

6) Product-based strategies are directly focused on the quality of the product, its physical characteristics and its manufacturability.

THE PROPOSED ROADMAP FOR TQM IMPLEMENTATION

According to Hammer & Goding (2001), the DMAIC methodology provides a structured framework for solving business problems by assuring correct and effective process execution. The methodology has five phases in which, in the case of Six Sigma, teams take total-employee-involvement approaches to complete the cycle of process management and use self-diagnosis skills to fulfil the goals of each phase. The business will naturally reach Six Sigma quality when all key processes within the business are completed for each of these five phases (Byrne, 2003). Figure 4 shows the DMAIC methodology.

Figure 4: DMAIC Methodology (Hammer & Goding, 2001)

However, in the case of TQM, the factors affecting the selection of TQM tools are many, and they should be considered before any implementation plan. These factors are:

· the availability of resources within the company to facilitate the successful introduction of tools;
· the objective of using quality management tools, such as solving a simple problem or reaching a high level of quality;
· the product characteristics; and
· current product and process improvement or new product introduction.


Some tools or techniques are simpler than others in their development and interpretation. The purpose of each is distinct and problem specific; certainly, not all tools or techniques are required in one firm. SPC tools are very basic and can be applied to both short- and long-term goals. Some of the tools and techniques are commonly (even frequently) used, for example the Pareto chart, cause & effect diagram, histogram or quality control charts for quality performance monitoring and improvement, while others are used less frequently (such as benchmarking and QFD). Some of the techniques, for example QFD, FMEA, and design for manufacturability (DFM), are used in the design and development processes. Different control charts and process capability indices are used in controlling manufacturing processes (Ahmed & Hassan, 2003).

The fundamental slogan of TQM is "Do-it-right-the-first-time (DIRFT)". For this, quality is required to be introduced at the design level. The tools that have direct linkage with the introduction of new products are DoE, QFD, FMEA, and fault tree analysis (Spring, Mc Quater, Swift, Dale, & Booker, 1998).

Regarding the functions and activities of a manufacturing firm, the following tools and techniques are suitable for implementing TQM (Ahmed & Hassan, 2003):

a) in new product introduction - DoE, brainstorming, cause & effect diagram, QFD, fault tree analysis;
b) in the production stage - process flow diagrams, Pareto chart, control chart;
c) in assessing the process or product - histogram, pie chart, scatter diagram, bar chart, etc.; and
d) in every stage of data collection - capability indices, check sheet or check list, etc.

Therefore, the need for a systematic approach in the selection stage of tools and techniques, and then in the implementation phases, calls for a more comprehensive methodology. In response, an extended model based on the DMAIC methodology is proposed here. This model is called the TQM implementation roadmap, to reflect its developmental sequence toward the implementation of TQM tools and techniques. Figure 5 shows the proposed roadmap.


Figure 5: TQM Tools and Techniques Implementation Roadmap

Each step of this roadmap is described here. The first step is documenting the process. A key reason for documenting a process is to allow it to be analyzed for improvement opportunities, and the key to understanding a process is to be able to document it by listing the steps involved. According to Ehresman (1995) the best tools for documenting are process flow diagrams: operating process flow diagrams, functional process flow diagrams and layout process flow diagrams. Process flow diagrams are excellent process documentation tools; they allow everyone involved to view the process in the same way. The next step is to collect the required data and measure the process. Collecting and analyzing data are key steps in process improvement. Collecting process performance data helps answer questions such as (Ehresman, 1995):

What is happening?
How is the process performing?
Is the process improving?
Is the process satisfying customer requirements?

The best tools for this purpose are the check sheet, Pareto chart, histogram, scatter diagram, run chart and statistical process control (SPC). The data collected with these tools can be used to measure the process. According to Ehresman (1995), process measurements serve as a means of listening to a process. When appropriately analyzed and interpreted, the measurements provide accurate, meaningful and timely process performance feedback, and they can tell you a lot about the process. Process measurements are useful for many purposes, including those listed below:

understanding what is happening;
providing objective performance feedback;
evaluating the need for improvement;
evaluating the impact of changes; and
setting schedules and performance targets.


The third step in the proposed roadmap is to use problem finding and solving tools. Problem solving requires a structured approach. Without such an approach, efforts often are random and/or misguided. The five steps listed below can be applied to any type of problem, regardless of its complexity (Ehresman, 1995):

1) Define the problem;
2) Identify the root cause;
3) Select the best solution;
4) Develop an action plan; and
5) Verify the plan's results.

The proper techniques for this goal are the cause & effect diagram, relationship diagram, brainstorming, reversal and characteristic changing. These techniques can be applied to determine and eliminate the root cause of a problem. The key elements of any problem-solving effort are identifying the root cause and generating a list of possible solution ideas.

Continuous quality improvement is the next step. In this step all the above mentioned steps will be repeated and more tools and techniques will be used to ensure continuous improvement. The most common tools and techniques in this step are control charts, FMEA, and fault tree analysis.

After the above steps, quality stabilization within the company and making quality everyone's job are necessary. Quality is not a part-time element of everyone's job: people do not "do quality" for a while and then get back to their real jobs. Quality is everyone's job (Ehresman, 1995). The best tool that can assist every organization in its effort to make quality everyone's job is the ISO 9000 series. A key aim of TQM is customer satisfaction; the focus of customer satisfaction, in turn, is to understand and meet or exceed customer requirements. A system is required to ensure that the requirements are continually and consistently adhered to. This system contains documentation, so that anyone assigned to perform a job can be trained in the same precise manner as the previous person on that job. There can be no decline in quality just because the person performing the process has changed. The existence of, or need for, quality systems certainly is not new; companies around the world have been developing their own quality systems for years. ISO 9000 is a series of international quality standards that is widely accepted and is the best tool for ensuring that quality is everyone's job (Martínez-Costa, Choi, Martínez, & Martínez-Lorente, 2009; Van der Wiele, Dale, & Williams, 1997).

However, in addition to the above-mentioned tools and techniques, there are numerous techniques which are technical and engineering related and have specific applications to product design and development. Usually these tools and techniques cover all functions within the organization, require more effort to implement, and lead the organization to a high level of quality. Six Sigma, 5S housekeeping, total productive maintenance (TPM), reward systems, suggestion systems, electronic data interchange (EDI), computer-aided (product/process) design (CAD), computer-aided manufacturing (CAM), design for manufacture and assembly (DFMA), finite element analysis (FEA), computer numerical control (CNC), computer integrated manufacturing (CIM) and just-in-time (JIT) are within this category.

CONCLUSION


The correct selection and use of tools and techniques is a vital component of any successful TQM implementation plan. TQM tools and techniques can be divided into simple tools for solving a specific problem and complex ones that cover all functions within the company. Before any implementation, the availability of resources within the company, the usage and scope of each tool and technique, and the product characteristics should be considered carefully. In response to the quest for a comprehensive methodology for implementing TQM tools and techniques, a roadmap with six steps is proposed in this research. The developmental sequence of this roadmap starts with process documentation and leads to the accomplishment of more complex and modern quality tools and techniques that guide the organization to a high level of quality. This roadmap can help any organization intending to reach a high level of institutionalized quality.

Quality vs. Profit

Impact of Quality on profitability and efficiency:

Recently US manufacturers have shifted their focus from short-term measures to measures based on quality.

This change in focus was due to the fact that companies focusing on quality were more profitable in the long run.

Commitment to high quality demands a focus on issues related to routine operations, such as reductions in: a) customer complaints, b) machine breakdowns, c) defects, d) scrap, e) cycle time, f) late delivery rate, and g) new product introduction time.

Japanese managements' focus on, and commitment to, incremental and continuous improvement placed them ahead of American manufacturers despite the Americans' superior technology.

The loss of market share of American products, in the US itself, to Japanese products is attributed to the high quality of the Japanese products.

The very act of reducing scrap and defects, and improving performance and customer satisfaction, should lead to increased profitability.

Technology and management practices diffuse over time, and what was the competitive advantage of a few companies later becomes the standard practice of everybody.

Capabilities related to quality and efficiency were once considered 'order winners'; now they have become 'qualifiers'.

Of course, before being able to construct a quality management program, the company must first organize a system of quality that will determine how it will judge its products. This quality system will contain the characteristics to be judged in order to define the quality of the particular product or service to be produced for the market.

The first characteristic of quality is the quality of the product's blueprint or plan. This is the engineering involved in the scheme of the product or service: how good it is, and how well it answers the needs and wants of the target market. This first step is concerned with planning the product.

The second characteristic in defining quality is how strictly the finished product is seen to have adhered to the specifications set by the initial plans and designs. Manufacturing processes do not always yield a product constructed exactly according to plan (in certain manufacturing situations there are concerns that force compromises, such as the cost of materials, production time, availability of resources, etc.), and that is why determining whether a given finished product measures up to the initial plan is an important factor in determining quality.

The third characteristic for determining quality is satisfaction based on good customer service. After the product has been planned and designed, and the blueprint and the process have been laid out, next comes the actual construction of the product; after the manufacturing processes, the process of selling the product comes into play. Measuring performance in this third phase is very important, because it is here that actual contact with the target customers takes place, and this is where customer trust, satisfaction and loyalty may or may not be firmly established. It is important to remember that no matter how good a product may be, if it is not presented to the customers accordingly, and if the customers' subsequent concerns are not given proper attention and appropriate action, then satisfaction, trust and loyalty may suffer, and the viability and profitability of the business will follow.

In all these aspects of quality, customer satisfaction is the topmost focus. These three characteristics of quality are of equal importance, because if a certain level of incompetence is present in any of these quality specifiers, the fate of the product and the business will surely suffer the critical consequences.