MB0044 Assignments Answers Sem 2 (Both Sets)



Name : ///DEEP SINGH

Registration / Roll No. : 331630152

Course : Master of Business Administration (MBA)

Subject : Production & Operations Management

Semester : Semester II

Subject Number : MB0044 ( SET-I )

_______________ _______________ ______________ Sign of Center Head Sign of Evaluator Sign of Coordinator

PRODUCTION & OPERATIONS MANAGEMENT


SET-1

Q.1 ANSWER: In computing, just-in-time compilation (JIT), also known as dynamic translation, is a method to improve the runtime performance of computer programs. Traditionally, computer programs had two modes of runtime operation: interpretation or static (ahead-of-time) compilation. Interpreted code is translated from a high-level language to machine code continuously during every execution, whereas statically compiled code is translated into machine code before execution, and only requires this translation once.

JIT compilers represent a hybrid approach, with translation occurring continuously, as with interpreters, but with caching of translated code to minimize performance degradation. It also offers other advantages over statically compiled code at development time, such as handling of late-bound data types and the ability to enforce security guarantees.

JIT builds upon two earlier ideas in run-time environments: byte code compilation and dynamic compilation. It converts code at runtime prior to executing it natively, for example byte code into native machine code.

Several modern runtime environments, such as Microsoft's .NET Framework and most implementations of Java, rely on JIT compilation for high-speed code execution.

Overview: In a bytecode-compiled system, source code is translated to an intermediate representation known as bytecode. Bytecode is not the machine code for any particular computer, and may be portable among computer architectures. The byte code may then be interpreted by, or run on, a virtual machine. A just-in-time compiler can be used as a way to speed up execution of byte code. At the time a piece of code is to be executed, the just-in-time compiler will compile some or all of it to native machine code for better performance. This can be done per-file, per-function or even on any arbitrary code fragment; the code can be compiled when it is about to be executed (hence the name "just-in-time"), and then cached and reused later without needing to be recompiled.
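The translate-once-and-cache idea described above can be made concrete with a small sketch. The following Python snippet is a hypothetical illustration only (a real JIT emits native machine code, not Python closures); the fragment names and the `compile_fragment`/`execute` helpers are invented for the example.

```python
# Minimal sketch of the JIT "translate once, cache, reuse" idea.
# Hypothetical example; real JITs generate native machine code.

translation_cache = {}  # fragment id -> "compiled" callable

def compile_fragment(fragment_id, bytecode):
    """Pretend to translate a bytecode fragment into an executable form."""
    print(f"compiling fragment {fragment_id} ...")   # happens only on first use
    def native_like(*args):
        # stand-in for executing the translated native code
        return bytecode(*args)
    return native_like

def execute(fragment_id, bytecode, *args):
    """Execute a fragment, compiling it just in time on first call."""
    compiled = translation_cache.get(fragment_id)
    if compiled is None:                            # not yet translated
        compiled = compile_fragment(fragment_id, bytecode)
        translation_cache[fragment_id] = compiled   # cache for reuse
    return compiled(*args)

# Usage: the second call reuses the cached translation without recompiling.
square = lambda x: x * x
print(execute("square", square, 4))   # compiles, then runs -> 16
print(execute("square", square, 5))   # cache hit, runs directly -> 25
```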

In contrast, a traditional interpreted virtual machine will simply interpret the bytecode, generally with much lower performance. Some interpreters even interpret source code, without the step of first compiling to bytecode, with even worse performance. Statically compiled code or native code is compiled prior to deployment. A dynamic compilation environment is one in which the compiler can be used during execution. For instance, most Common Lisp systems have a compile function which can compile new functions created during the run. This provides many of the advantages of JIT, but the programmer, rather than the runtime, is in control of what parts of the code are compiled. This can also compile dynamically generated code, which can, in many scenarios, provide substantial performance advantages over statically compiled code, as well as over most JIT systems.

A common goal of using JIT techniques is to reach or surpass the performance of static compilation, while maintaining the advantages of bytecode interpretation:

• Much of the "heavy lifting" of parsing the original source code and performing basic optimization is often handled at compile time, prior to deployment: compilation from bytecode to machine code is much faster than compiling from source.
• The deployed bytecode is portable, unlike native code.
• Since the runtime has control over the compilation, like interpreted bytecode, it can run in a secure sandbox.
• Compilers from bytecode to machine code are easier to write, because the portable bytecode compiler has already done much of the work.

JIT code generally offers far better performance than interpreters. In addition, it can in some cases offer better performance than static compilation, as many optimizations are only feasible at run-time:

• The JIT compiler translates bytecodes into native machine code. This compilation is done only once, and a link is created between the bytecode and the corresponding compiled code.
• The compilation can be optimized to the targeted CPU and the operating system model where the application runs. For example, JIT can choose SSE2 CPU instructions when it detects that the CPU supports them. To obtain this level of optimization specificity with a static compiler, one must either compile a binary for each intended platform/architecture, or else include multiple versions of portions of the code within a single binary.
• The system is able to collect statistics about how the program is actually running in the environment it is in, and it can rearrange and recompile for optimum performance. However, some static compilers can also take profile information as input.
• The system can do global code optimizations (e.g. inlining of library functions) without losing the advantages of dynamic linking and without the overheads inherent to static compilers and linkers. Specifically, when doing global inline substitutions, a static compilation process may need run-time checks to ensure that a virtual call would occur if the actual class of the object overrides the inlined method, and boundary condition checks on array accesses may need to be processed within loops. With just-in-time compilation, in many cases this processing can be moved out of loops, often giving large increases in speed.

Although this is possible with statically compiled garbage collected languages, a bytecode system can more easily rearrange executed code for better cache utilization.

Startup delay and optimizations: JIT typically causes a slight delay in initial execution of an application, due to the time taken to load and compile the bytecode. Sometimes this delay is called "startup time delay". In general, the more optimization JIT performs, the better the code it will generate, but the initial delay will also increase. A JIT compiler therefore has to make a trade-off between the compilation time and the quality of the code it hopes to generate. However, it seems that much of the startup time is sometimes due to IO-bound operations rather than JIT compilation (for example, the rt.jar class data file for the Java Virtual Machine is 40 MB and the JVM must seek a lot of data in this contextually huge file).

One possible optimization, used by Sun's HotSpot Java Virtual Machine, is to combine interpretation and JIT compilation. The application code is initially interpreted, but the JVM monitors which sequences of bytecode are frequently executed and translates them to machine code for direct execution on the hardware. For bytecode which is executed only a few times, this saves the compilation time and reduces the initial latency; for frequently executed bytecode, JIT compilation is used to run at high speed, after an initial phase of slow interpretation. Additionally, since a program spends most time executing a minority of its code, the reduced compilation time is significant. Finally, during the initial


code interpretation, execution statistics can be collected before compilation, which helps to perform better optimization.

The correct tradeoff can vary due to circumstances. For example, Sun's Java Virtual Machine has two major modes - client and server. In client mode, minimal compilation and optimization is performed, to reduce startup time. In server mode, extensive compilation and optimization is performed, to maximize performance once the application is running by sacrificing startup time. Other Java just-in-time compilers have used a runtime measurement of the number of times a method has executed combined with the bytecode size of a method as a heuristic to decide when to compile.[3] Still another uses the number of times executed combined with the detection of loops.[4] In general, it is much harder to accurately predict which methods to optimize in short-running applications than in long-running ones.
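The invocation-count heuristic mentioned above can be sketched as follows. This is a simplified, hypothetical model: the threshold value and the `interpret`/`compile_method` stand-ins are illustrative, not taken from any real JVM.

```python
# Sketch of a "compile when hot" heuristic, using a made-up threshold.
HOT_THRESHOLD = 3          # illustrative; real JVMs use much larger, tunable values

call_counts = {}           # method name -> number of invocations so far
compiled_methods = {}      # method name -> "compiled" callable

def interpret(method, *args):
    return method(*args)   # stand-in for slow bytecode interpretation

def compile_method(method):
    print(f"JIT-compiling {method.__name__}")
    return method           # stand-in for generating fast native code

def invoke(method, *args):
    name = method.__name__
    if name in compiled_methods:                 # already hot and compiled
        return compiled_methods[name](*args)
    call_counts[name] = call_counts.get(name, 0) + 1
    if call_counts[name] >= HOT_THRESHOLD:       # method just became "hot"
        compiled_methods[name] = compile_method(method)
    return interpret(method, *args)

def add(a, b):
    return a + b

for i in range(5):
    invoke(add, i, i)       # compiled after the third call, then reused
```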

Native Image Generator (NGen) by Microsoft is another approach at reducing the initial delay.[6] Ngen pre-compiles (or "pre-jits") bytecode in a Common Intermediate Language image into machine native code. As a result, no runtime compilation is needed. .NET framework 2.0 shipped with Visual Studio 2005 runs Ngen on all of the Microsoft library DLLs right after the installation. Pre-jitting provides a way to improve the startup time. However, the quality of code it generates might not be as good as the one that is jitted, for the same reasons why code compiled statically, without profile-guided optimization, cannot be as good as JIT compiled code in the extreme case: the lack of profiling data to drive, for instance, inline caching.

There also exist Java implementations that combine an AOT (ahead-of-time) compiler with either a JIT compiler (Excelsior JET) or an interpreter (GNU Compiler for Java).

History: The earliest published JIT compiler is generally attributed to work on LISP by McCarthy in 1960. In his seminal paper Recursive functions of symbolic expressions and their computation by machine, Part I, he mentions functions that are translated during runtime, thereby sparing the need to save the compiler output to punch cards. In 1968, Thompson presented a method to automatically compile regular expressions to machine code, which is then executed in order to perform the matching on an input text. An influential technique for deriving compiled code from interpretation was pioneered by Mitchell in 1970, which he implemented for the experimental language LC².

Smalltalk pioneered new aspects of JIT compilations. For example, translation to machine code was done on demand, and the result was cached for later use. When memory became sparse, the system would delete some of this code and regenerate it when it was needed again. Sun's Self language improved these techniques extensively and was at one point the fastest Smalltalk system in the world; achieving up to half the speed of optimized C[13] but with a fully object-oriented language.

Self was abandoned by Sun, but the research went into the Java language, and currently it is used by most implementations of the Java Virtual Machine, as HotSpot builds on, and extensively uses, this research base.

The HP project Dynamo was an experimental JIT compiler where the 'bytecode' format and the machine code format were the same; the system turned HPA-8000 machine code into HPA-8000 machine code. Counterintuitively, this resulted in speedups, in some


cases of 30% since doing this permitted optimizations at the machine code level, for example, inlining code for better cache usage and optimizations of calls to dynamic libraries and many other run-time optimizations which conventional compilers are not able to attempt.

Q.2 ANSWER: Value engineering (VE) or value analysis (VA) is a systematic method to improve the "value" of goods or products and services by using an examination of function. Value, as defined, is the ratio of function to cost. Value can therefore be increased by either improving the function or reducing the cost. It is a primary tenet of value engineering that basic functions be preserved and not be reduced as a consequence of pursuing value improvements.

In the United States, value engineering is specifically spelled out in Public Law 104-106, which states “Each executive agency shall establish and maintain cost-effective value engineering procedures and processes." Value engineering is sometimes taught within the project management or industrial engineering body of knowledge as a technique in which the value of a system’s outputs is optimized by crafting a mix of performance (function) and costs. In most cases this practice identifies and removes unnecessary expenditures, thereby increasing the value for the manufacturer and/or their customers.

VE follows a structured thought process that is based exclusively on "function", i.e. what something "does", not what it is. For example, a screwdriver that is being used to stir a can of paint has a "function" of mixing the contents of a paint can, and not the original connotation of securing a screw into a screw-hole. In value engineering, "functions" are always described in a two-word abridgment consisting of an active verb and a measurable noun (what is being done - the verb - and what it is being done to - the noun), and this is done in the most non-prescriptive way possible. In the screwdriver and can of paint example, the most basic function would be "blend liquid", which is less prescriptive than "stir paint", which can be seen to limit the action (by stirring) and to limit the application (only considers paint). This is the basis of what value engineering refers to as "function analysis". Value engineering uses rational logic (a unique "how" - "why" questioning technique) and the analysis of function to identify relationships that increase value. It is considered a quantitative method similar to the scientific method, which focuses on hypothesis-conclusion approaches to test relationships, and operations research, which uses model building to identify predictive relationships.

Value engineering is also referred to as "value management" or "value methodology" (VM), and "value analysis" (VA). VE is above all a structured problem solving process based on function analysis—understanding something with such clarity that it can be described in two words, the active verb and measurable noun abridgement. For example, the function of a pencil is to "make marks". This then facilitates considering what else can make marks. From a spray can, lipstick, a diamond on glass to a stick in the sand, one can then clearly decide upon which alternative solution is most appropriate.


Q.3 ANSWER

Quantitative Models

There is a wide range of quantitative models, of varying degrees of sophistication and complication. In this pack, we will only cover those that I think you are likely to encounter in systems studies or could use to good effect. The techniques available subdivide broadly into two major classes, static models and dynamic models. The distinction between these will become clearer as you look at some detailed examples. Essentially, dynamic models are those where the set of calculations comprising the model is repeated a number of times. The initial values of the variables in the current set of calculations are taken from the results of the previous set of calculations, and this process is repeated time and time again. In static models, the calculations are executed once to obtain a result (or set of results). Even where the calculations are repeated, as with stochastic models, the values in each set of calculations are not determined by the previous calculation.

The table below lists these techniques under the two broad headings.

Main category | Sub-types                        | Examples
Static        | spreadsheet                      | business planning
              | stochastic spreadsheet           | financial models
              | forecasting                      | electricity industry
              | linear programming               | diet formulation
              | decision analysis                | decision trees
              | preference models                | Kepner Tregoe
Dynamic       | time-stepped                     | queue models
              | event-driven                     | process models
              | stock and flow (system dynamics) | population models
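The static/dynamic distinction above can be illustrated with a short sketch. The population model below is dynamic in the sense described earlier (each year's calculation starts from the previous year's result), while the profit function is static (a one-shot calculation). The growth rate, horizon, and profit figures are assumed values chosen only for illustration.

```python
# Time-stepped (dynamic) model: each step starts from the previous step's result.
# All parameter values are illustrative assumptions, not data from the text.

def project_population(initial_population, annual_growth_rate, years):
    population = initial_population
    history = [population]
    for _ in range(years):
        population = population * (1 + annual_growth_rate)  # feed result forward
        history.append(round(population))
    return history

# Static model: a single calculation, not driven by previous results.
def annual_profit(units_sold, price, unit_cost, fixed_cost):
    return units_sold * (price - unit_cost) - fixed_cost

print(project_population(10_000, 0.02, 5))   # dynamic: state carried between steps
print(annual_profit(1_000, 60, 40, 1_500))   # static: one-shot business-planning figure
```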

Difference between work study and motion study

Work practices are ways of doing work which have been in vogue and found to be useful. These are determined by motion and time studies conducted over the years and found to be efficient when practiced. Any method improvement that is conducted may be adopted to change the practice, but only after trials have shown that it increases the comfort of the worker and gets the job done faster.

Work study: We say that work study is being conducted when the analysis of work methods is carried out during the period when a job is done on a machine or equipment. The study helps in designing the optimum work method and standardization of the work method. This study enables the methods engineer to search for better methods for higher utilization of man and machine and accomplishment of higher productivity. The study gives an opportunity to the workmen to learn the process of study, thus making them able to offer suggestions for improved methods. This encourages workmen participation, and they can be permitted to make changes and report the advantages that can be derived from those. This course is in alignment with the principle of continuous improvement and helps the organization in


the long run. Reward systems may be implemented for recognizing contributions from the workmen. Work study comprises work measurement and method study. Work measurement focuses on the time element of work, while method study focuses on the methods deployed and the development of better methods.

Work measurement: Work measurement can be defined as a systematic application of various techniques that are designed to establish the content of work involved in performing a specific task. The task is performed by a qualified worker. With this we arrive at the standard time for a task. This will be used to fix the performance rating of other workers. It forms the basis of incentives, promotion, and training for workmen and assessment of capacity for the plant.

Hence, training the workers is very important.

ILO defines a qualified worker as “one who is accepted as having the necessary physical attributes, possessing the required intelligence and education, and having acquired the necessary skill and knowledge to carry out the work in hand to satisfactory standards of safety, quantity, and quality.”
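As a simple illustration of how a standard time might be built up from time-study observations, the sketch below follows the conventional rating-and-allowance approach. The observed times, performance rating, and allowance percentage are assumed values chosen only for the example, not figures from the text.

```python
# Sketch of a standard-time calculation from time-study observations.
# All input values below are assumptions for illustration.

observed_times = [0.52, 0.48, 0.50, 0.55, 0.49]  # minutes per cycle, from a time study
performance_rating = 110                          # worker observed at 110% of normal pace
allowance_percent = 15                            # rest / personal / contingency allowances

average_observed = sum(observed_times) / len(observed_times)
normal_time = average_observed * performance_rating / 100      # adjust to a qualified worker's pace
standard_time = normal_time * (1 + allowance_percent / 100)    # add allowances

print(f"Normal time:   {normal_time:.3f} min per cycle")
print(f"Standard time: {standard_time:.3f} min per cycle")
```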

Method study: The focus of method study is on studying the method currently being used and developing a new method of performing the task in a better way. Operation flow charts, motion charts, and flow process charts, which map the elements of the task, are studied to find the purpose of each activity, the sequence in which the activities are done, and their effect on the work. The study may help in changing some of these activities, and even eliminating some of them, to effect improvements. The new method should result in saving of time, reduced motions, and simpler activities.

Machine worker interaction: Machine worker interaction study consists of studying the amount of time an operator spends on the machine before it is activated and the time during which he has nothing to do. In many modern manufacturing centres, where we have automated systems of manufacturing, the role of the worker is limited to observing various screens, dials, and indicator lamps to see that the process is going on smoothly. In some cases, his job may be to load the jobs on the machines and check the settings. What is of concern to us is to see whether the operations permit an operator to look after two or three machines, without affecting the performance of the machine or the man.

Ergonomics: Ergonomics is the study of physical human factors and their functioning. We study the movements, the amount of energy that is required for certain activities, and the coordination among them. In operations management, we use these factors at two places. The first is when we design the machines which are operated and the way the operator does the tasks on the machine using different controls. Levers, wheels, switches, and pedals (see figure) have to be positioned so that the operators have maximum comfort for long working hours.

[Figure: Equipment positioned to enable maximum comfort]


The other factor is the consideration given to the type of loads the body can take at various positions. When doing jobs like lifting, clamping, moving, and holding, energy is expended by different organs, for which racks, tables, and pallets are positioned and designed to suit workers’ physical features.

Q.4 ANSWER

Rapid Prototyping (RP) can be defined as a group of techniques used to quickly fabricate a scale model of a part or assembly using three-dimensional computer aided design (CAD) data. What is commonly considered to be the first RP technique, Stereolithography, was developed by 3D Systems of Valencia, CA, USA. The company was founded in 1986, and since then, a number of different RP techniques have become available.

Rapid Prototyping has also been referred to as solid free-form manufacturing; computer automated manufacturing, and layered manufacturing. RP has obvious use as a vehicle for visualization. In addition, RP models can be used for testing, such as when an airfoil shape is put into a wind tunnel. RP models can be used to create male models for tooling, such as silicone rubber molds and investment casts. In some cases, the RP part can be the final part, but typically the RP material is not strong or accurate enough. When the RP material is suitable, highly convoluted shapes (including parts nested within parts) can be produced because of the nature of RP.

There is a multitude of experimental RP methodologies either in development or used by small groups of individuals. This section will focus on RP techniques that are currently commercially available, including Stereolithography (SLA), Selective Laser Sintering (SLS), Laminated Object Manufacturing (LOM™), Fused Deposition Modeling (FDM), Solid Ground Curing (SGC), and Ink Jet printing techniques.

Rapid Prototyping is used to:
• Increase effective communication.
• Decrease development time.
• Decrease costly mistakes.
• Minimize sustaining engineering changes.
• Extend product lifetime by adding necessary features and eliminating redundant features early in the design.

Rapid Prototyping decreases development time by allowing corrections to a product to be made early in the process. By giving engineering, manufacturing, marketing, and purchasing a look at the product early in the design process, mistakes can be corrected and changes can be made while they are still inexpensive. The trends in manufacturing industries continue to emphasize the following:

• Increasing number of variants of products.
• Increasing product complexity.
• Decreasing product lifetime before obsolescence.
• Decreasing delivery time.

Rapid Prototyping improves product development by enabling better communication in a concurrent engineering environment.

The basic methodology for all current rapid prototyping techniques can be summarized as follows:

1. A CAD model is constructed, and then converted to STL format. The resolution can be set to minimize stair stepping.
2. The RP machine processes the .STL file by creating sliced layers of the model.
3. The first layer of the physical model is created. The model is then lowered by the thickness of the next layer, and the process is repeated until completion of the model.
4. The model and any supports are removed. The surface of the model is then finished and cleaned.
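As a rough illustration of the slice-and-build loop in steps 2 and 3, the sketch below assumes a hypothetical part described only by its total height and a chosen layer thickness; no real STL processing or machine control is shown.

```python
# Sketch of the layer-by-layer RP build loop; part height and layer thickness are assumed.
import math

def build_layers(part_height_mm, layer_thickness_mm):
    num_layers = math.ceil(part_height_mm / layer_thickness_mm)   # "slicing" the model
    for layer in range(1, num_layers + 1):
        z = min(layer * layer_thickness_mm, part_height_mm)
        # In a real machine: deposit/cure material for this slice, then lower the platform.
        print(f"Layer {layer}/{num_layers}: build cross-section at z = {z:.2f} mm")
    print("Remove supports, then finish and clean the surface.")

build_layers(part_height_mm=25.0, layer_thickness_mm=0.2)
```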

Q.5 ANSWER

BREAK-EVEN ANALYSIS

Learning Objectives
• Utility of break-even analysis
• Performance analysis, sensitivity analysis, probability analysis

Introduction: The determination of the break-even point of a firm is an important factor in assessing its profitability. It is a valuable control technique and a planning device in any business enterprise. It depicts the relation between total cost and total revenue at the level of a particular output. Ordinarily, the profit of an industrial unit depends upon the selling price of the product (revenue), the volume of business (which depends on price), and the cost price of the product.

If an entrepreneur is aware of the product cost and its selling price, he can plan the volume of his sales in order to achieve a certain level of profit. The break-even point is determined as that point of sales volume at which the total cost and total revenue are identical.


Break-Even Point: The break-even point is an important measure used by the proponents of a project and by banks in deciding the viability of a new project, especially in respect of manufacturing activities. This technique is useful in dealing with a new project or a new activity of an existing unit. The break-even point (BEP) establishes the level of output/production which evenly breaks the costs and revenues. It is the level of production at which the turnover just covers the fixed overheads and the unit starts making profits.

From the banker’s point of view, the project should achieve a break-even position within a reasonable time from the start of production. The project which reaches the break-even point earlier is considered a viable project by bankers. They can not only expect earlier repayment of their advances in the case of such projects, but can also be assured that the project can fairly adapt itself to the day-to-day developing technology. Projects which are unlikely to reach the break-even point by the third or fourth year of commencement of production will not be viable proposals for the bankers. The break-even analysis also determines the margin of safety, i.e., the excess of budgeted or actual sales over the break-even sales, so that the bankers would know how sensitive a project is to recession. This is an important factor in determining the feasibility of the project and its ability to absorb the ups and downs in the economy. The bankers, as lenders of funds, insist upon a reasonable margin of safety so that fixed costs are met at a fairly early stage.

Algebraic Formulae of Break-Even Analysis

A. Break-even point (BEP) in terms of sales:

Total sales - Total variable costs = Total contribution
BEP (units) = Fixed cost / Contribution per unit

Example:
Sales: 1000 units
Selling price per unit: Rs. 60
Variable cost per unit: Rs. 40
Fixed cost: Rs. 1500

Contribution per unit = Rs. 60 - Rs. 40 = Rs. 20
BEP (unit volume) = 1500 / 20 = 75 units
BEP in terms of sales = 75 units x Rs. 60 = Rs. 4500

B. BEP (in terms of capacity utilization)


In this method, the BEP in terms of capacity utilization is calculated with reference to the capacity utilization in the normal year of production. For instance, if the unit is expected to achieve a capacity utilization of 40%, 45%, 60% and 80% of the installed capacity in the first five years, the BEP computation will be with reference to 80%.

Calculation of BEP: The break-even point can be calculated in terms of physical units and in terms of sales turnover.

i. In terms of physical units: The number of units required to be sold to achieve the break-even point can be calculated using the following formula:

BEP (units) = FC / (SP - VC) = FC / C

where
FC = fixed cost
VC = variable cost per unit
SP = selling price per unit
C = contribution per unit (C = SP - VC)

Example:
FC = Rs. 1,00,000
VC = Rs. 2 per unit
SP = Rs. 4 per unit, and
Maximum production capacity = 1,00,000 units per year

BEP (units) = 1,00,000 / (4 - 2) = 50,000 units, i.e. 50% of the maximum production capacity.
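The formula above can be checked with a short calculation. The sketch below simply implements BEP = FC / (SP - VC) and applies it to the two examples given in the text; the function name and the optional capacity parameter are my own additions for illustration.

```python
# Break-even point: fixed cost divided by contribution per unit (SP - VC).

def break_even(fixed_cost, selling_price, variable_cost, capacity=None):
    contribution = selling_price - variable_cost          # contribution per unit
    bep_units = fixed_cost / contribution
    result = {"units": bep_units, "sales_value": bep_units * selling_price}
    if capacity:
        result["capacity_utilisation_%"] = 100 * bep_units / capacity
    return result

# Example A from the text: FC = Rs. 1500, SP = Rs. 60, VC = Rs. 40
print(break_even(1500, 60, 40))                         # 75 units, Rs. 4500 of sales

# Example B from the text: FC = Rs. 1,00,000, SP = Rs. 4, VC = Rs. 2, capacity 1,00,000 units
print(break_even(100_000, 4, 2, capacity=100_000))      # 50,000 units, 50% of capacity
```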

A comparison of product layout and process layout is given below:

Criterion | Product layout | Process layout
1. Investment | Needs high investment in machines/equipment | Comparatively low investment needed
2. Duration of production | Needs less manufacturing time, as the economy in time can be planned at the beginning | Production time cannot be economized, due to frequent movement of men and material
3. Immobilization due to breakdown | Breakdown of any unit/component immobilizes the whole system | Breakdown of any machine does not immobilize the whole system
4. Adjustability to changes | Inflexible, as each machine can perform the pre-designed operation only | Flexible, as different sections can adjust the operation as required
5. Floor space | Requires less space | Requires more space
6. Men/equipment utilization | Not utilized to full capacity | Comparatively better utilization
7. Material handling | Lesser amount of material handling, requiring comparatively less time, money and effort | Involves greater handling of material, requiring more time, money and effort
8. Demand and supply relationship | Proper coordination between demand and supply, as products are made to stock | Coordination between demand and supply is likely to be difficult, as products are made to order
9. Control and inspection | Specialized and expert control is required, thus increasing supervision costs | Comparatively lesser effort on control is needed

Thus both layouts have their own advantages and disadvantages; in fact, the merits of one are the demerits of the other. The choice and suitability of the layout mainly depend on the nature of the manufacturing system and the type of product to be produced. In general, process layout is suitable for intermittent systems and product layout is appropriate for continuous systems.

Quality management for macro processes is carried out by use of the Juran Trilogy, which basically consists of three steps: Quality Planning, Quality Control and Quality Improvement. Let us understand the main activities and the relation between the three phases of the Juran Trilogy.

• Quality Planning: The quality planning phase is the activity of developing products and processes to meet customers' needs. It deals with setting goals and establishing the means required to reach the goals.


• Quality Control: This process deals with the execution of plans, and it includes monitoring operations so as to detect differences between actual performance and goals. It consists of three steps:
1. Evaluate actual quality performance
2. Compare actual performance to quality goals
3. Act on the difference

• Quality Improvement: This process is for obtaining breakthrough in quality performance, and it consists of several steps:
1. Establish the infrastructure needed to secure annual quality improvement
2. Identify the specific needs for improvement - the improvement projects
3. Establish project teams with clear responsibility for bringing the project to a successful conclusion
4. Provide the resources, motivation, and training needed by the teams to diagnose the causes, stimulate establishment of remedies, and establish controls to hold the gains.

Q.6 ANSWER: In most organizations there is a great focus on the Quality Control process, with little or no emphasis on the other two processes; however, well-established and customer-focused organizations do have clearly defined and robust processes for all aspects of the Juran Trilogy. In my previous article on "Quality of Design" we discussed the importance of Quality Planning and its significance in the development of products and processes. Quality Control is an operational activity, and the control part becomes easy if the planning process is robust; else the control process will remain only a firefighting exercise. In the control phase, statistical tools can be used to monitor and improve the processes involved. Some examples of control items are defects in products, response time to customers, billing accuracy, etc. The Improvement process may typically call for cross-functional teams at a macro process level and departmental teams at the micro level. The improvements could be reduction of rework, cycle time reduction, or elimination of any chronic quality issues.

The Juran Trilogy Diagram: The three processes of the trilogy are indicated in the diagram, which is a graph with time on the horizontal axis and cost of poor quality on the vertical axis. The planners are responsible for the product and process design to meet the customer needs, and the job of the operating forces is to run the process and produce the products. We will see that the process cannot achieve 100 percent quality and that 20 percent rework has to be carried out. Quality control prevents the situation from getting worse and also puts off the fires such as the sporadic spike. In due course we will see


that the chronic problems have come down by the application of the quality improvement process.

The Alligator Analogy: The distinction between Quality Planning and Quality Improvement is brought out by the alligator analogy. This is a fable of a manager who is up to his waist in alligators, and each live alligator is a metaphor for chronic waste. Each completed quality improvement project results in a dead alligator, and when all alligators are terminated the quality improvement is considered complete for the moment; but that does not happen as long as the quality planning process has not changed. Only a changed and improved planning process will help complete the improvement and sustain it. From the trilogy diagram and the alligator analogy it is clear that quality improvement reduces quality issues, but to sustain the new level there has to be improvement in the quality planning process.

Crosby's Fourteen Steps

Phil Crosby (1979) developed a fourteen-step program for quality improvement:
1. MANAGEMENT COMMITMENT - Top-level view on quality shown to all employees.
2. THE QUALITY IMPROVEMENT TEAM - To pursue the quality regime throughout the business.
3. QUALITY MEASUREMENT - Analysis of business quality performance in a meaningful manner, for example late deliveries, budgeted to actual sales/deliveries/costs, etc. Keep it simple for all to understand.
4. THE COST OF QUALITY - Make sure everyone in the business understands the need for a quality system, and the costs to the business if there is no quality system in place.
5. QUALITY AWARENESS - Again, make everyone in the business aware of the impact of quality systems.
6. CORRECTIVE ACTION - Ensure a system is in place for analyzing defects in the system and applying simple cause-and-effect analysis to prevent re-occurrence.
7. ZERO DEFECTS PLANNING - Look for business activities to which zero-defect logic should be applied.
8. SUPERVISOR TRAINING - Get your supervisors trained in both quality logic and zero-defect appreciation, which they can apply to their business activities.
9. ZERO DEFECTS DAY - A quality event by which all members of the affected section become aware that a change has taken place.
10. GOAL SETTING - Once a change has been implemented in a section of the business, the next step is to get the employees and supervisors in that section to set goals for improvement to bring about continuous improvement.
11. ERROR CAUSE REMOVAL - A communication process by which management is made aware that set goals are difficult to achieve, in order for either the goals to be reset or help to be given by management to achieve the goals.
12. RECOGNITION - Management must recognize the employees who participate in the quality schemes.
13. QUALITY COUNCILS - Using both specialist knowledge and employee experiences to bring about a focused approach to the business quality regime.


14. DO IT OVER AGAIN - Continuous improvement means starting from the beginning again and again.

To design an application to provide optimum benefits, the architect, designer and programmers must thoroughly understand the data and processes needed. An excellent way to gain this understanding and prepare to implement software is to carefully and completely model business processes and the relevant data. Business processes represent the flow of data through a series of tasks that are designed to result in specific business outcomes. This article reviews the concepts of business processes and logical process modeling. It is a useful place to start understanding the concepts of business processes and the benefits of modeling processes as well as data.

Name : ///DEEP SINGH

Registration / Roll No. : 331630152

Course : Master of Business Administration (MBA)

Subject : Production & Operations Management

Semester : Semester II

Subject Number : MB0044 ( SET-II )


_______________ _______________ ______________ Sign of Center Head Sign of Evaluator Sign of Coordinator

PRODUCTION & OPERATIONS MANAGEMENT SET-2

Q.1 ANSWER

What Is a Business Process? A process is a coordinated set of activities designed to produce a specific outcome. There are processes for saving a file, constructing a building, and cooking a meal. In fact, there is a process for almost everything we do. A business process is a type of process designed to achieve a particular business objective. Business processes consist of many components, including:

• The data needed to accomplish the desired business objective
• Individual work tasks that manipulate, review, or act upon the data in some way
• Decisions that affect the data in the process or the manner in which the process is conducted
• The movement of data between tasks in the process
• Individuals and groups which perform tasks

Processes can be manual or automated, fully documented or simply knowledge in the minds of one or more people. They can be simple or complex. They can be formal, requiring exact adherence to all details; or flexible, provided the desired outcome is achieved.

Logical Process Modeling

Logical Process Modeling is the representation of a business process, detailing all the activities in the process from gathering the initial data to reaching the desired outcome. These are the kinds of activities described in a logical process model:

• Gathering the data to be acted upon
• Controlling access to the data during the process execution
• Determining which work task in the process should be accomplished next
• Delivering the appropriate subset of the data to the corresponding work task
• Assuring that all necessary data exists and all required actions have been performed at each task
• Providing a mechanism to indicate acceptance of the results of the process, such as electronic "signatures"

All business processes are made up of these actions. The most complex of processes can be broken down into these concepts. The complexity comes in the manner in which the process activities are connected together. Some activities may occur in sequential order, while some may be performed in parallel. There may be circular paths in the process (a re-work loop, for example). It is likely there will be some combination of these. The movement of data and the decisions made determining the paths the data follow during the process comprise the process model. The model contains only business activities, uses business terminology (not software acronyms, technical jargon, etc.), completely describes the activities of the business area being modeled, and is independent of any individual or position working in the organization. Like its sibling, Logical Data Modeling, Logical Process Modeling does not include redundant activities, technology-dependent activities, physical limitations or requirements, or current systems limitations or requirements. The process model is a representation of the business view of the set of activities under analysis. Heretofore, many applications and systems were built without a logical process model or a rigorous examination of the processes needed to accomplish the business goals. This resulted in applications that did not meet the needs of the users and/or were difficult to maintain and enhance. Problems with an unmodeled system include the following:

• Not knowing who is in possession of the data at any point in time
• Lack of control over access to the data at any point in the process
• Inability to determine quickly where in the process the data resides and how long it has been there
• Difficulties in making adjustments to a specific execution of a business process
• Inconsistent process execution

Logical Process Modeling Primer

Modeling methods can be grouped into Logical and Physical types. Using a combination of these methodologies can produce the most complete model, and no single method is sufficient to adequately define your processes.

Logical Process Modeling: Logical process modeling methods provide a description of the logical flow of data through a business process. They do not necessarily provide details about how decisions are made or how tasks are chosen during the process execution. They may be either manual or electronic, or a combination of methods. Some of the logical modeling formats are:

• Written process descriptions
• Flow charts
• Data flow diagrams
• Function hierarchies
• Real-time models or state machines
• Functional dependency diagrams

A function is a high-level activity of an organization; a process is an activity of a business area; a sequential process is the lowest-level activity. Therefore:


• Functions consist of Processes. Functions are usually identified at the planning stage of development, and can be decomposed into other functions or into processes. Some examples of Functions would include: Human Resource Management, Marketing, and Claims Processing.
• Processes consist of Sequential Processes. Processes are activities that have a beginning and an end; they transform data and are more detailed than functions. They can be decomposed into other processes or into Sequential Processes. Some examples of Processes would be: Make Payment, Produce Statement of Account, and Verify Employment.
• Sequential Processes are specific tasks performed by the business area and, like a process, transform data. They cannot be further decomposed. Examples of Sequential Processes are: Record Customer Information, Validate Social Security Number, and Calculate Amount Due.

Each business activity in a logical process model is included in a decomposition diagram, given a meaningful name and described in detail with text. As in Logical Data Modeling, naming conventions are quite important in process modeling. Names for processes begin with a verb and should be as unique as possible while retaining meaning to the business users. Nouns used in the activity name should be defined and used consistently. In a decomposition diagram, each level completely describes the level above it and should be understandable to all appropriate business users.
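One simple way to picture the function / process / sequential-process decomposition and the verb-plus-noun naming convention is a nested structure. The sketch below reuses the example names given above; the data structure, the `check_naming` helper, and its small verb list are my own illustrative assumptions, not part of any modeling tool.

```python
# Decomposition of business activities, reusing the examples from the text.
# Keys are Functions; each maps Processes to their Sequential Processes.
decomposition = {
    "Claims Processing": {                       # Function (planning-level activity)
        "Make Payment": [                        # Process (has a beginning and an end)
            "Record Customer Information",       # Sequential Processes (lowest level)
            "Validate Social Security Number",
            "Calculate Amount Due",
        ],
        "Produce Statement of Account": [],
    },
}

def check_naming(activity_name):
    """Naming-convention check: an activity name should begin with a verb (verb + noun)."""
    verbs = {"make", "produce", "verify", "record", "validate", "calculate"}  # illustrative list
    return activity_name.split()[0].lower() in verbs

for function, processes in decomposition.items():
    for process, steps in processes.items():
        assert check_naming(process), f"{process} should start with a verb"
        for step in steps:
            assert check_naming(step), f"{step} should start with a verb"
print("All activity names follow the verb + noun convention.")
```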

Physical Process Modeling

Physical modeling methods specify the topology (connectivity), data, roles, and rules of a business process. This model describes items such as:

• Work tasks to be completed during the process
• The order in which the tasks should be executed
• Data needed to start the process execution
• Data required to start and finish each work task
• Rules needed to determine routing through the process
• Exception handling techniques
• At least one defined business outcome
• Roles and permissions of each process participant

The physical model may not closely resemble the logical model, but they produce the same outcomes.

Data-Driven Approach to Process Definition

This approach, most commonly used in relational and object-oriented analysis efforts, analyzes the life cycle of each major data entity type. The approach defines a process for each phase or change the data undergoes: the method by which the data is created, the reasons for the change, and the event that causes the data to achieve its terminal state. This method assures that all data actions are accounted for and that there are meaningful associations between the data and its processes. However, in a data-driven method, the logical data model must be completed before the process modeling and analysis can begin. Major points of interest in constructing a Logical Process Model are:

• The purpose of the process. Writing the purpose and referring to it frequently enables the analyst to recognize a step in the process that does not make sense in the context of the process.
• Who will participate in the process? The participants may be people, groups of people, or electronic applications.
• The order in which the steps of the process are done.
• The data you expect to be included in the process. There is an initial set of expected data, plus you should know what data you expect to be modified or added during the process. Part of this step is deciding which subset of the data is appropriate at each task in the process.
• Decisions that will be made during the execution of the process. These include decisions about which path the process should take, and whether all the required data is present at any given point in the process.
• The rules you will use to define the various parts of the process. Also, note any naming conventions that are important for the business.
• The disposition of the data at the end of the process. That is, will the data be retained or deleted? If you plan to store the data, where and in what form will the data be kept? Do future process-related reports need to access the data?

There may be other elements in the business processes that need to be included in the model. The more complete the model, the easier it will be to implement the software, and the more successful the processes will be in producing the desired output. Process definition also helps you know when a process should be broken into smaller, sequential processes. If the definition of a process is ambiguous or lengthy, it is usually a candidate for decomposing into sequential processes. All functions are decomposed to processes, and all processes are ultimately decomposed into sequential processes.

Constructing the Process Model Diagrams

Once the functions, processes and sequential processes have been identified and defined, the analyst uses process modeling software to construct a set of diagrams to graphically represent the business processes under scrutiny. In drawing the diagrams, consider including the following items:

• The starting point of the process. There could be more than one starting point, depending on the purpose and the operation of the process. If a process contains more than one starting point, include all of them.
• All tasks to be performed during the execution of the process.
• The order in which the tasks should be accomplished, including any tasks that may be performed in parallel.
• All decision points, both those having to do with choosing a path through the process and those that determine whether or not the process should continue.
• Any points at which the process path may split or merge.
• The completion point of the process. As a process may have multiple starting points, it can also have multiple completion points.

You should also develop a means of identifying the data you expect at each point in the process. Be mindful of areas in the process where more than one task may be performed simultaneously. In these areas, you may need to show data being shared among participants, or different subsets of the data being made available to each participant. Finally, include the ending point(s) of the process. This indicates that the process has been completed and that all the data generated by the process can be identified.

Reviewing the Model

As in Logical Data Modeling, plan to spend a significant portion of modeling time reviewing the model. Validate your assumptions by reviewing them with the people who


are involved in executing the process to be certain your assumptions are correct and complete. Verify all data requirements to ensure that all the data needed has been identified, while using what data is needed at each step in the process. It is a good practice to perform this verification at each sequential process defined. A good check of the accuracy of any model is to simulate it by walking through the process manually. This allows the analyst to locate any points in the processes that are not valid before system construction. Once the process has been successfully simulated, review the results with the people who understand the expected results from each function and process. This verification step allows the process experts to understand the model you have created and point out any potential problems with the model before beginning the deployment of the model.

Summary

Like Logical Data Modeling, Logical Process Modeling is one of the primary techniques for analyzing and managing the information needed to achieve business goals. It is important that analysts understand the concepts of process modeling and the methods used in process discovery and definition, and perfect the analytical skills for relating and explaining the data and processes used by a business area. Properly performed, logical process modeling can greatly assist the system architects and developers in their efforts, producing functional and scalable applications.

Q.2 ANSWER: The project manager must have a clear understanding of the processes, activities and deliverables involved in managing a project. This includes knowledge of how to use specific tools to bring about the expected product of each project management process. Here are the PMBoK and Prince2 definitions of the project management knowledge areas:

The PMBoK Definition:

1. Management of Integration describes the processes and activities that integrate the various elements of project management.
2. Management of Scope describes the processes involved in ascertaining that the project includes all the work required, and only the work required, to complete the project successfully.
3. Management of Time describes the processes concerning the timely completion of the project.
4. Management of Cost describes the processes involved in planning, estimating, budgeting, and controlling costs so that the project is completed within the approved budget.
5. Management of Quality describes the processes involved in assuring that the project will satisfy the objectives for which it was undertaken.
6. Management of Human Resource describes the processes that organize and manage the project team.
7. Management of Project Communications describes the processes concerning the timely and appropriate generation, collection, dissemination, storage and ultimate disposition of project information.


8. Management of Risk describes the processes concerned with conducting risk management on a project.
9. Management of Procurement describes the processes that purchase or acquire products, services or results, as well as contract management processes.

The Prince 2 Definition:
1. Business Case - the justification behind the project.
2. Organization - the way in which the personnel involved in the project are structured.
3. Plans - documents describing what the project should accomplish, how the work should be carried out, when it should be carried out and by whom.
4. Control - the way in which the project manager and project board should exercise control over the project.
5. Management of Risk - the way in which the project should approach and manage risk. PRINCE2 defines a risk as uncertainty of outcome, which may be either a positive opportunity or a negative threat. Once analyzed, risks are managed, where appropriate, to remove or reduce the effect of a negative threat and to take advantage of positive opportunities.
6. Quality in a Project Environment - the way in which the project should ensure that a quality product is delivered.

7. Configuration Management - the way in which the project tracks and controls the versions of the products it produces.
8. Change Control - the way in which the project manages any changes to the specification or scope of its products.

Q.3 ANSWER: Project Life Cycle - Project Cycle Management

The Project Life Cycle refers to a logical sequence of activities to accomplish the project's goals or objectives. Regardless of scope or complexity, any project goes through a series of stages during its life. There is first an Initiation or Birth phase, in which the outputs and critical success factors are defined, followed by a Planning phase, characterized by breaking down the project into smaller parts/tasks, an Execution phase, in which the project plan is executed, and lastly a Closure or Exit phase, which marks the completion of the project. Project activities must be grouped into phases because, by doing so, the project manager and the core team can efficiently plan and organize resources for each activity, and also objectively measure achievement of goals and justify their decisions to move ahead, correct, or terminate. It is of great importance to organize project phases into industry-specific project cycles. Why? Not only because each industry sector involves specific requirements, tasks, and procedures when it comes to projects, but also because different industry sectors have different needs for life cycle management methodology. Paying close attention to such details is the difference between doing things well and doing them poorly. Diverse project management tools and methodologies prevail in the different project cycle phases. Let's take a closer look at what's important in each one of these stages:

1) Initiation: In this first stage, the scope of the project is defined along with the approach to be taken


to deliver the desired outputs. The project manager is appointed and in turn, he selects the team members based on their skills and experience. The most common tools or methodologies used in the initiation stage are Project Charter, Business Plan, Project Framework (or Overview), Business Case Justification, and Milestones Reviews.

2) Planning: The second phase should include a detailed identification and assignment of each task until the end of the project. It should also include a risk analysis and a definition of the criteria for the successful completion of each deliverable. The governance process is defined, stakeholders are identified, and reporting frequency and channels are agreed. The most common tools or methodologies used in the planning stage are Business Plans.

3) Execution and Controlling: The most important issue in this phase is to ensure project activities are properly executed and controlled. During the execution phase, the planned solution is implemented to solve the problem specified in the project's requirements. In product and system development, a design resulting in a specific set of product requirements is created. This convergence is measured by prototypes, testing, and reviews. As the execution phase progresses, groups across the organization become more deeply involved in planning for the final testing, production, and support. The most common tools or methodologies used in the execution phase are an update of Risk Analysis and Score Cards, in addition to Business Plan and Milestones Reviews.

4) Closure: In this last stage, the project manager must ensure that the project is brought to its proper completion. The closure phase is characterized by a written formal project review report containing the following components: a formal acceptance of the final product by the client, Weighted Critical Measurements (matching the initial requirements specified by the client with the final delivered product), rewarding the team, a list of lessons learned, releasing project resources, and a formal project closure notification to higher management. No special tool or methodology is needed during the closure phase.

The Project Management Body of Knowledge (PMBOK)

The Project Management Body of Knowledge (PMBOK) is a collection of processes and knowledge areas generally accepted as best practice within the project management discipline.

As an internationally recognized standard (IEEE Std 1490-2003) it provides the fundamentals of project management, irrespective of the type of project be it construction, software, engineering, automotive etc.


PMBOK recognizes 5 basic process groups and 9 knowledge areas typical of almost all projects. The basic concepts are applicable to projects, programs and operations. The five basic process groups are:

1. Initiating
2. Planning
3. Executing
4. Monitoring and Controlling
5. Closing

Processes overlap and interact throughout a project or phase. Processes are described in terms of:

• Inputs (documents, plans, designs, etc.)
• Tools and Techniques (mechanisms applied to inputs)
• Outputs (documents, products, etc.)
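
This inputs/tools/outputs pattern can be pictured as a simple data structure. The sketch below is purely illustrative: the class name, field names, and the example values are our own and are not official PMBOK terminology.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Process:
    """Toy sketch of how a PMBOK-style process description could be held in code.
    Class and field names are illustrative, not official PMBOK terminology."""
    name: str
    process_group: str                         # one of the five process groups
    knowledge_area: str                        # one of the nine knowledge areas
    inputs: List[str] = field(default_factory=list)                # documents, plans, designs, ...
    tools_and_techniques: List[str] = field(default_factory=list)  # mechanisms applied to inputs
    outputs: List[str] = field(default_factory=list)               # documents, products, ...

# Hypothetical example based on one of the procurement processes mentioned further below
procurement_planning = Process(
    name="Procurement Planning",
    process_group="Planning",
    knowledge_area="Project Procurement Management",
    inputs=["scope statement", "product description"],
    tools_and_techniques=["make-or-buy analysis", "expert judgement"],
    outputs=["procurement management plan", "statement of work"],
)
print(procurement_planning.outputs)
```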

The nine knowledge areas are:

1. Project Integration Management
2. Project Scope Management
3. Project Time Management
4. Project Cost Management
5. Project Quality Management
6. Project Human Resource Management
7. Project Communications Management
8. Project Risk Management
9. Project Procurement Management

Each knowledge area contains some or all of the project management processes. For example, Project Procurement Management includes:

• Procurement Planning
• Solicitation Planning
• Solicitation
• Source Selection
• Contract Administration
• Contract Closeout

Much of PMBOK is unique to project management e.g. critical path and work breakdown structure (WBS). Some areas overlap with other management disciplines. General management also includes planning, organizing, staffing, executing and controlling the operations of an organization. Financial forecasting, organizational behaviour and planning techniques are also similar.

CAPM and PMP

The Project Management Institute (PMI) is the publisher of PMBOK (now in its fourth edition) and offers two levels of certification:

A Certified Associate in Project Management (CAPM) has demonstrated a common base of knowledge and terms in the field of project management. It requires either 1,500 hours of work on a project team or 23 contact hours of formal education in project management.

A Project Management Professional (PMP) has met specific education and experience requirements, has agreed to adhere to a code of professional conduct, and has passed an examination designed to objectively assess and measure project management knowledge.


In addition, a PMP must satisfy continuing certification requirements or lose the certification. As of 2006, PMI reported over 220,000 members and over 50,000 Project Management Professionals (PMPs) in 175 countries. Over 44,000 PMP certifications expire annually; a PMP must document ongoing project management experience and education every three years to keep the certification current.

Q.4 ANSWER

A Project Management Information System (PMIS) is a part of Management Information Systems (MIS) and manages the information of a project-centric organization. These electronic systems "help [to] plan, execute, and close project management goals."[1]

PMIS systems differ in scope, design, and features depending upon an organisation's operational requirements.

Relationship between a PMS and PMIS: A Project Management System (PMS) could be a part of a PMIS, or sometimes an external tool used alongside the project management information system. What a PMIS does is manage the information of all stakeholders in a project, such as the project owner, client, contractors, sub-contractors, company personnel, workers, and managers.

PMBOK (4th edition) definition: Project Management Information System (PMIS) [Tool]. An information system consisting of the tools and techniques used to gather, integrate, and disseminate the outputs of project management processes. It is used to support all aspects of the project from initiating through closing, and can include both manual and automated systems.

Key Success Factors

Introduction: Making the transition to becoming a firm that manages all aspects of knowledge well is clearly going to be difficult; the emphasis is on commitment and resolve, and on finding the resources to get Knowledge Management off to a good start. Measures will depend on the concept of knowledge. Thus the various approaches, popular theories, and practices in vogue that have been successful are dealt with below.

Popular Theory & Practice

A key success factor is a performance area of critical importance in achieving consistently high productivity. There are at least two broad categories of key success factors that are common to virtually all organizations: business processes and human processes. Both are crucial to building great companies. Our focus is on the human process areas.


When the success factors are studied, the focus falls on the human aspects, and a strong academic majority raises concerns around this area. All agree that the intellectual assets of the employees are the foremost critical success factor. "Usually people begin a KM project by focusing on the technology needs. But the key is people and process." (Shir Nir, 2002). The key to successful knowledge management (KM) projects is focusing on people first, not cutting-edge technology. "The biggest misconception that IT leaders make is that knowledge management is about technology," says Shir Nir. There is no "cookie-cutter approach" to adopting knowledge management. Every organization and company has its own definition of knowledge and how it should be gathered, categorized, and made available to employees. What works for one company will not work for another because organizational knowledge is so subjective. The one-size-fits-all mentality, coupled with the tendency to focus on technology rather than people and process, has obscured the real benefits that KM can bring, according to Nir (2002). It does not help that knowledge management means different things and often involves different kinds of technologies at different organizations.

Bixler (2002) developed a four pillar model to describe success factors for a KM implementation. To achieve a basic entry-level KM program, it has been determined that all "four pillars" must be addressed. The four enterprise engineering pillars are leadership, organization, technology and learning in support of enterprise-wide knowledge management initiatives. Leadership means that managers develop business and operational strategies to survive and position for success in today's dynamic environment. Those strategies determine vision, and must align knowledge management with business tactics to drive the value of KM throughout the enterprise. Focus must be placed on building executive support and KM champions.

The "organization" success factor describes how the value of knowledge creation and collaboration should be intertwined throughout an enterprise. Operational processes must align with the KM framework and strategy, including all performance metrics and objectives. While operational needs dictate organizational alignment, a KM system must be designed to facilitate KM throughout the organization. Technology enables and provides the entire infrastructure and tools to support KM within an enterprise.

The Gartner Group defines 10 technologies that collectively make up full-function KM. The functional requirements that enterprises can select and use to build a KM solution include: "capture and store", "search and retrieve", "send critical information to individuals or groups", "structure and navigate", "share and collaborate", "synthesize, profile and personalize", "solve or recommend", "integrate with business applications", and "maintenance".

Summary

Based on the above study, it is considered that the most relevant factors for the successful implementation and sustenance of momentum for the KM initiatives are:

(1) Culture: A culture of pervasive knowledge sharing needs to be nurtured within, and aligned with, organizational objectives. The underlying concern is that employees do not want to share information. Successful organizations empower employees to want to share and contribute intellectual information by rewarding them for such actions, and with organizational leaders acting as role models of information sharing, interfacing regularly with staff, teams and stakeholders in review sessions and openly talking about successes and failures.

(2) KM Organization: The first important variable is leadership, with a vision, a strategy and the ability to promote a change of management towards compelling knowledge management, actively promoted by the Chief Executive, that clearly articulates how knowledge management contributes to achieving organizational objectives. A specialist team is needed to aggressively manage knowledge property, i.e., to manage intellectual assets as routine processes, with appropriate technology and infrastructure for "social" and electronic networking to allow for innovation and to leverage organizational knowledge.

(3) Effective & Systematic Processes: Creating a "knowledge environment" with processes to capture the knowledge assets of the organization is important, but it will probably be most successful once most of the technologies of electronic commerce have been implemented.

(4) Strategy, Systems & Infrastructure: This establishes a clear definition of all required KM elements and an overall system approach and integration.

(5) Measures: Finally, the success of knowledge management can be measured against pragmatic milestones, such as the creation of products, the development of new clients, and an increase in sales revenue.

Q.5 ANS:

The 7 Principles of Supply Chain Management

The most requested article in the 10-year history of Supply Chain Management Review was one that appeared in our very first issue in the spring of 1997. Written by experts from the respected Logistics practice of Andersen Consulting (now Accenture), "The Seven Principles of Supply Chain Management" laid out a clear and compelling case for excellence in supply chain management. The insights provided here remain remarkably fresh ten years later.

• Principle 1: Segment customers based on the service needs of distinct groups and adapt the supply chain to serve these segments profitably.

• Principle 2: Customize the logistics network to the service requirements and profitability of customer segments.

• Principle 3: Listen to market signals and align demand planning accordingly across the supply chain, ensuring consistent forecasts and optimal resource allocation.

• Principle 4: Differentiate product closer to the customer and speed conversion across the supply chain.

• Principle 5: Manage sources of supply strategically to reduce the total cost of owning materials and services.

• Principle 6: Develop a supply chain-wide technology strategy that supports multiple levels of decision making and gives a clear view of the flow of products, services, and information.

• Principle 7: Adopt channel-spanning performance measures to gauge collective success in reaching the end-user effectively and efficiently.


Managers increasingly find themselves assigned the role of the rope in a very real tug of war—pulled one way by customers' mounting demands and the opposite way by the company's need for growth and profitability. Many have discovered that they can keep the rope from snapping and, in fact, achieve profitable growth by treating supply chain management as a strategic variable. These savvy managers recognize two important things:

1. They think about the supply chain as a whole—all the links involved in managing the flow of products, services, and information from their suppliers' suppliers to their customers' customers (that is, channel customers, such as distributors and retailers).

2. They pursue tangible outcomes—focused on revenue growth, asset utilization, and cost. Rejecting the traditional view of a company and its component parts as distinct functional entities, these managers realize that the real measure of success is how well activities coordinate across the supply chain to create value for customers, while increasing the profitability of every link in the chain.

Our analysis of initiatives to improve supply chain management by more than 100 manufacturers, distributors, and retailers shows many making great progress, while others fail dismally. The successful initiatives that have contributed to profitable growth share several themes. They are typically broad efforts, combining both strategic and tactical change. They also reflect a holistic approach, viewing the supply chain from end to end and orchestrating efforts so that the whole improvement achieved—in revenue, costs, and asset utilization—is greater than the sum of its parts.

Unsuccessful efforts likewise have a consistent profile. They tend to be functionally defined and narrowly focused, and they lack sustaining infrastructure. Uncoordinated change activity erupts in every department and function and puts the company in grave danger of "dying the death of a thousand initiatives." The source of failure is seldom management's difficulty identifying what needs fixing. The issue is determining how to develop and execute a supply chain transformation plan that can move multiple, complex operating entities (both internal and external) in the same direction. To help managers decide how to proceed, we revisited the supply chain initiatives undertaken by the most successful manufacturers and distilled from their experience seven fundamental principles of supply chain management.


The Bullwhip Effect (or Whiplash Effect) is an observed phenomenon in forecast-driven distribution channels. The concept has its roots in Jay Forrester's Industrial Dynamics (1961) and thus it is also known as the Forrester Effect. Because the oscillating magnification of demand as one moves upstream in a supply chain is reminiscent of a cracking whip, it became famous as the Bullwhip Effect.

Because customer demand is rarely perfectly stable, businesses must forecast demand to properly position inventory and other resources. Forecasts are based on statistics, and they are rarely perfectly accurate. Because forecast errors are a given, companies often carry an inventory buffer called "safety stock".
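
As a worked illustration (a standard textbook sizing rule, not something stated in the sources quoted here), the buffer is often set from the variability of demand and the replenishment lead time:

$$\text{Safety stock} = z \cdot \sigma_d \cdot \sqrt{L}$$

where $z$ is the service-level factor (roughly 1.65 for a 95% cycle-service level), $\sigma_d$ is the standard deviation of demand per period, and $L$ is the lead time in periods, assuming roughly normal, independent demand and a constant lead time. For example, with $\sigma_d = 20$ units per week and $L = 4$ weeks, the buffer is about $1.65 \times 20 \times \sqrt{4} = 66$ units; the more variable the demand a tier observes, the larger the buffer it must hold, which is exactly what drives the amplification described next.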

Moving up the supply chain from end-consumer to raw materials supplier, each supply chain participant observes greater variation in demand and thus has a greater need for safety stock. In periods of rising demand, downstream participants increase their orders; in periods of falling demand, orders fall or stop in order to reduce inventory. The effect is that variations are amplified as one moves upstream in the supply chain, further from the customer (a minimal simulation illustrating this amplification appears after the list of causes below). This sequence of events is well simulated by the Beer Distribution Game, developed at the MIT Sloan School of Management in the 1960s. The causes can further be divided into behavioral and operational causes:

Behavioral causes:

• misuse of base-stock policies
• misperceptions of feedback and time delays
• panic ordering reactions after unmet demand
• perceived risk of other players' bounded rationality

Operational causes:

• dependent demand processing
  o forecast errors
  o adjustment of inventory control parameters with each demand observation
• lead time variability (forecast error during the replenishment lead time)
• lot-sizing / order synchronization
  o consolidation of demands
  o transaction motive
  o quantity discounts
• trade promotion and forward buying
• anticipation of shortages
  o allocation rules of suppliers
  o shortage gaming
  o Lean and JIT style management of inventories and a chase production strategy
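
To make the amplification concrete, the following is a minimal simulation sketch (our own illustration, not taken from the sources above). Each tier forecasts demand with a short moving average, keeps an order-up-to level covering expected lead-time demand plus safety stock (sized with the formula shown earlier), and passes its orders to the tier above; the function name and parameter values are arbitrary choices for the example.

```python
import random
import statistics

def tier_orders(incoming_demand, lead_time=2, window=4, z=1.65):
    """Orders one tier places upstream, given the demand it observes,
    using a moving-average forecast and an order-up-to policy."""
    history, orders = [], []
    # Start the inventory position near steady state to avoid a start-up spike.
    position = statistics.mean(incoming_demand) * lead_time
    for d in incoming_demand:
        history.append(d)
        recent = history[-window:]
        forecast = sum(recent) / len(recent)
        sigma = statistics.pstdev(recent) if len(recent) > 1 else 0.0
        # Order-up-to level = expected lead-time demand + safety stock (z * sigma * sqrt(L)).
        target = forecast * lead_time + z * sigma * lead_time ** 0.5
        position -= d                      # demand depletes the inventory position
        order = max(0.0, target - position)
        position += order                  # ordering restores the position to the target
        orders.append(order)
    return orders

random.seed(1)
consumer_demand = [random.gauss(100, 10) for _ in range(200)]
print(f"{'consumer':12s} demand std dev: {statistics.pstdev(consumer_demand):6.1f}")

signal = consumer_demand
for tier in ("retailer", "wholesaler", "distributor", "factory"):
    signal = tier_orders(signal)           # this tier's orders become the next tier's demand
    print(f"{tier:12s} order  std dev: {statistics.pstdev(signal):6.1f}")
# The standard deviation typically grows at every tier -- the bullwhip effect.
```

Running the sketch typically shows the order standard deviation growing tier by tier even though end-customer demand is fairly stable; giving every tier visibility of actual consumer demand, as in the demand-driven approach discussed below, removes most of that growth.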

Theoretically, the Bullwhip Effect does not occur if all orders exactly meet the demand of each period. This is consistent with the findings of supply chain experts who have recognized that the Bullwhip Effect is a problem in forecast-driven supply chains, and careful management of the effect is an important goal for supply chain managers. It is therefore necessary to extend visibility of customer demand as far upstream as possible. One way to achieve this is to establish a demand-driven supply chain which reacts to actual customer orders; in manufacturing, this concept is called Kanban. This model has been most successfully implemented in Wal-Mart's distribution system. Individual Wal-Mart stores transmit point-of-sale (POS) data from the cash register back to corporate headquarters several times a day. This demand information is used to queue shipments from the Wal-Mart distribution center to the store and from the supplier to the Wal-Mart distribution center. The result is near-perfect visibility of customer demand and inventory movement throughout the supply chain. Better information leads to better inventory positioning and lower costs throughout the supply chain. Barriers to the implementation of a demand-driven supply chain include the necessary investment in information technology and the creation of a corporate culture of flexibility and focus on customer demand. Another prerequisite is that all members of a supply chain recognize that they can gain more if they act as a whole, which requires trustful collaboration and information sharing. Methods intended to reduce uncertainty, variability, and lead time include:

• Vendor Managed Inventory (VMI)
• Just-In-Time replenishment (JIT)
• Strategic partnerships
• Information sharing
• Smoothing the flow of products
  o coordinate with retailers to spread deliveries evenly
  o reduce minimum batch sizes
  o smaller and more frequent replenishments
• Eliminating pathological incentives
  o every-day low price policies
  o restricting returns and order cancellations
  o order allocation based on past sales instead of current order size in case of shortage