TRANSCRIPT
1
MSC Security Kaizen™ Phase I Review and Phase II Planning
7 May 2007 Jeffrey W. Elliott, Principal Consultant
Security Kaizen
2
Server Order
Order Approval
Request for Configuration
HW and SW Assy
New Reqts
Marked for Shipment
Shipped and Racked
Server Procurement
Senior Design Team
IT Control
IT Mgmt
HW Spec / SW Spec
App Owners
SDH Owners
Server Goes Live
Exposure Issues: 1. Time Delay; 2. Missed Patches; 3. Non-Data Center
The BID – As Developed in the Working Group 1-4-06
Roles: SDH Owners; App Owners; Business Owners; ETG/Infrastructure; Senior Design Team; IT Control; IT Management; IT Audit (Audit Rules); IT Security
Steps: Specification; Order (config data); Approve; Assemble; Tailor; Operate; Audit; Remediate; Decommission
Data elements: Policy; SDH; Chesspiece DB; "active" state
Each element of the diagram is annotated with Who, Data, and Step; several owners and the Translate step were still marked "???" (unresolved) in the working-group draft.
The Systems Engineering Perspective
Final Business Interaction Diagram
Three Project Domains:
1) Translating security policies and architecture into server settings and configuration specifications
2) Providing standards for builds, controlling access, and auditing the process
3) Compliance scanning and remediation
SUPPLIERS / PROCESS / CUSTOMERS / SUPPORT (swim lanes)
Server Order
Order Approval
Request for Configuration
Security Architecture & Policy
HW and SW Assy
Connect, Ship, Rack, Reconnect
Operate
Tailor / Modify
Audit Judgement
Sample
SDH Specification
Chesspiece DB
Other / Unknown
Assess Conformance & Evaluate Urgency
Remediate As Necessary
Report
AD/GPO Settings
Tailor / Modify
Dispose
SMS Patches / Upgrades
Audit / Validate
Audit / Validate
Project Name: Microsoft Security Configuration (MSC) – Security Kaizen
Resource Requirements
Project Leader: Pete Shepker; Charter Ratified: 19 January 2006; Project Champion: Mike Thomas; Project Sponsors: Roy Thetford, Jerry Winchell
Core Team and Process Experts (contacts): Cory Shaffer 440.395.0929; Damir Brescic 440.395.0191; Shari Cox 440.395.0732; Matt Lynn 440.395.0729; Joe Welhouse 440.395.1536; Lee Wiegman 440.395.1493; Denise Klco 440.395.9662; Tim Anderson 440.395.9132; Scott Devine 440.395.0707; Christopher Smith 440.395.0140; Mike Orlandi 440.396.2403
Black Belts: Jeffrey Elliott, CSS 425.802.6829; Joe Kovara, CSS 425.503.2753. Progressive Project Management: Valerie Paul 440.395.0131
Strategic Alignment
Relevant Company Objectives
Integrity – We revere honesty. We adhere to the highest ethical standards, provide timely, accurate and complete financial reporting, encourage disclosing bad news, and welcome disagreement.
Objectives – We strive to communicate clearly Progressive's ambitious objectives and our people's personal and team objectives. We evaluate performance against all these objectives.
Excellence – We strive constantly to improve in order to meet and exceed the highest expectations of our customers, shareholders, and people. We teach and encourage our people to improve performance and to reduce the costs of what they do for customers. We base their rewards on results and promotion on ability.
Relevant Organizational Objectives
Performance is paramount, and ensuring good systems performance is everyone's job. Our systems will have documented service level agreements and be constructed to meet or exceed those commitments. Quality processes and metrics must be included in every aspect of software development. Security and privacy must be designed into applications as they are being built. Wintel servers, the .NET framework, and SQL Server databases are the primary building blocks of our evolving computing environment.
Problem Statement: What pain is the organization experiencing that this project can reduce or eliminate?
Available and capable IT server and client assets are essential to operating the company's business. To keep up with demand, new servers are deployed at the rate of 1,000 per year and clients at the rate of 1,000 per month. Each server and client asset has a large variety of potential hardware and software configurations that affect functionality. While some configurations and settings provide users with needed capabilities, some restrict system capabilities in order to protect the confidentiality, integrity, and availability of information assets. As currently deployed, a formal system of configuration management controls server and client asset settings only during the initial build stage of their lifecycles, and network tools are used to deploy upgrades and patches. At this time, the system doesn't translate all of the formal security policies into technical specifications for deployment. Further, once servers and clients are operational, their configurations are often tailored, modified, or simply become outdated, which renders them non-compliant with IT and Security policy. Currently, no formal and documented audit and remediation process ensures complete security compliance for servers or clients. As a result, even where the process is well controlled, it doesn't fully implement current security policy, and where it is uncontrolled it exposes our business and our customer information assets to risks of unknown magnitude and criticality.
Goal Statement: What does success look like?
The goal of the Microsoft Security Configuration Project is to design a standardized process for deploying security policy into the server deployment lifecycle, including an audit and remediation cycle, and to conduct a controlled pilot implementation to establish process capability. This will be accomplished by establishing and maintaining an accurate repository of asset configuration attributes and baselines, and by auditing and remediating the operational environment to ensure compliance. The process will be driven by the evolving Security Architecture and Security Technical Standards, which will operationally define compliant security settings for the build and operational stages in the server lifecycle. A formal audit and process performance measurement system will be developed and piloted. The goal is a process that is repeatable, i.e., it consistently produces compliant configurations and remediates non-compliant systems, and reproducible, i.e., the process serves multiple IT assets (servers and clients) and is technology/platform agnostic.
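The charter's "repository of asset configuration attributes and baselines" can be pictured with a minimal sketch like the one below, written in Python purely for illustration. The asset, baseline, and setting names are assumptions invented for the example; this is not the project's actual data model or tooling.

from dataclasses import dataclass, field

@dataclass
class Baseline:
    name: str        # a published Security Configuration spec
    settings: dict   # expected setting name -> expected value

@dataclass
class Asset:
    hostname: str
    lifecycle_stage: str                              # "build" or "operate"
    settings: dict = field(default_factory=dict)      # observed setting name -> value

def nonconforming_settings(asset: Asset, baseline: Baseline) -> dict:
    """Return settings where the asset deviates from its baseline as (actual, expected)."""
    return {
        key: (asset.settings.get(key), expected)
        for key, expected in baseline.settings.items()
        if asset.settings.get(key) != expected
    }

# Example: one server audited against a hypothetical member-server baseline.
baseline = Baseline("win-member-v1", {"MinimumPasswordLength": 8, "AuditLogonEvents": "Success,Failure"})
server = Asset("app-srv-01", "operate", {"MinimumPasswordLength": 6, "AuditLogonEvents": "Success,Failure"})
print(nonconforming_settings(server, baseline))   # {'MinimumPasswordLength': (6, 8)}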
“If I had one hour to save the world, I would spend the first 55 minutes defining the problem” -- Albert Einstein
3
Process Control - Documentation
Documenting in ISO provides standardization
We need to know that changes to servers happen and the docs are up to date (The SDH was supposed to cure that) -- J Winchell
ISO documentation helps SOX
We need a single repository for server config documentation -- in the past, the SDH was that central source of data -- It needs some clean-up to match up automatic install processes to manual processes -- It's not as effective as it used to be because there is not a single owner of the SDH -- No one has control over the "Big Picture" of docs. Someone used to test all docs in SDH as a whole -- Currently SDH owners only know their piece of the overall SDH (K. Santacesaria)
MSC Security Kaizen Server Configuration
Customer Voice
Category | In | Out | In (weighted) | Out (weighted)
Business Priorities | 0 | 4 | 0.0 | 36.0
Server Configuration Documentation | 3 | 1 | 27.0 | 9.0
Change Strategies | 2 | 2 | 18.0 | 18.0
Process Control - Methods | 4 | 0 | 36.0 | 0.0
Reporting Performance | 1 | 3 | 9.0 | 27.0
Access Control | 0 | 0 | 0.0 | 0.0
(The diagram numbers the categories 1 through 5 in the sequence described below.)
Our project customers are saying, “Business priorities must drive performance reporting which will help design and implement change strategies to develop and improve server configuration documentation and help determine appropriate process control methods (which include access control).”
Interview team (interviewers and notetakers): Cory Shaffer, Denise Klco, Eva Prince, Joe Welhouse, Matt Lynn, Mike Orlandi, Shari Cox, Scott Devine, Pete Shepker, Tim Anderson, Valerie Paul
MSC Security Kaizen Voice of the Customer Interviews
Auditors: Cristina Beck, Senior IT Auditor I N; Joe Olexa, Director - Control and Analysis I N; Dave Todd, IT Audit Manager I N
IT Asset Mgmt: Jeff Wilde, Manager of Asset Team I N; Mike Hanna and/or Todd Starr I N
Ops Techs/Mgrs: Jeff Holkovic, Director of Operations I N; Idressa Davis I N; Mark McVicker, Manager of NTOPS I N; Vince Miller, Manager of Tier 3 I N; Victor Weinmann I N; Rock Adeen, Manager NTOC II N I
Application Services - DBA: Damian Hennessey, Manager SQL DBA I N
Application Services - Directory Services: Mike Thayer, Manager, Directory Services, MSWG Member I N
Application Services - Production Engnrg: Andy Say and Peter Jaffee, Architects I N
IT Security: Mike Thomas, Director, IT Security I N; Roy Thetford, Manager, IT Security I N; Jerry Winchell, IT Controller I N
App Owner - Business Apps: Tony DiPaolo, Claims I N; Howard Grandon, Manager Account Team N I; Josh Taranowski N I
App Owner - Direct: Brian Shura, Direct IT Security Liaison, MSWG Member I N; Gino DiFranco, Manager, Direct IT Infrastructure N I
App Owner - Drive: Brian Garvin I N; Phillip Howell, Manager I N; Geoff Fiedler I N; Steve Rothenberg, Senior Developer, MSWG Member I N
Bunker West Operational Services: Kimberly Santacesaria, Project Lead I N
Corporate Systems: John Krescic, Architect, Peoplesoft N I
DSE: Scott Hollowell, Manager, DSE N I; Mark Onders, DSE Architect I N; Patty Griffin, DSE IT Security Liaison, MSWG Member I N; Mike Clark, Manager, DSE IT Security Aligned I N
CS & A: Joe Self, Director CSE N I; Gary Shultz and Ron Kerensky, Managers I N
I = Interviewer; N = Notetaker
4
This slide maps MSC Process customer verbatims to CTQ (Critical to Quality) requirements.
Customer verbatims (Roy Thetford, Mike Thomas):
"We need to limit the number of people who have Admin Status to change server configs."
"We need to communicate audit findings with the Apps guys who create the demand."
"We need to track specific server configuration changes."
"We need to ensure that we're not exposed to vulnerabilities we thought we'd remediated."
"We need to figure out the configuration up front and manage the drift. We know they drift out of compliance."
"We need to implement Best Practices 101: Least privilege, managing changing people, tracking legit changes."
"We need to know the real state of the environment after servers leave the Config Center."
CTQ requirements: Current state and history of device config changes; Audit and remediation; Control over access to config settings.
5
VERBATIM
"We need to limit the number of people who have Admin Status to change server configs."
"We need to communicate audit findings with the Apps guys who create the demand for servers."
"We need to track specific server configuration changes."
"We need to ensure that we're not exposed to vulnerabilities we thought we'd remediated."
"We need to figure out the configuration up front and manage the drift. We know they drift out of compliance."
"We need to test the Config Center on a sample basis."
"We need to know that change to server control documents happens and that the docs are up to date."
"We need some form of triggering to warn us before a problem occurs."
"We need to achieve a state in which people have only the access they need to do their jobs."
"We need to know the frequency of change to server configs."
MSC Metrics (CTQ):
Availability Incidents Involving Device Configuration Troubleshooting
Breakage Incidents Attributable to Security Configuration
Mean Time to Repair (As Reported for This Device)
Number of Users Authorized to Change Security Configuration
Pareto of Authorized Users by Organization
Number of Devices Found Nonconforming to the Current Security Configuration
Time to Report Closed Remediation
Percentage of Devices Requiring Repeat Remediation
Pareto of Repeat Remediation Settings
Number of Revised Configurations
Pareto of Causes for Configuration Revisions
Number of Authorized Deviations
Number of Security Configurations Defined in the MSC Process
Number of Devices Built to Security Configuration Spec
Percentage of Controlled Builds That Fail Compliance Check
Percentage of Devices in the MSC Control Program
(The "!" marks in the original matrix indicate which metrics trace to each verbatim.)
Voice of the Customer Relationship to Critical-to-Quality (CTQ) Elements
Voice of the Customer Critical-to-Quality (CTQ) Elements
MSC Process Customers: Mike Thomas, Roy Thetford, Jerry Winchell
Verbatims:
"We need to know the real state of the environment after servers leave the Config Center."
"We need to implement Best Practices 101: Least privilege, managing changing people, tracking legit changes."
CTQ elements: Current state and history of device config changes; Audit and remediation; Control over access to config settings; Quality control of the build process; Safeguards to prevent system performance degradation due to config changes; Standardized configs; System to standardize security practices; Timely communication of changes; Communication of nonconformance
Dashboards: Management Dashboard -- Conformance; Management Dashboard -- Availability; Implementation Dashboard; Process Dashboard
(The "!" marks in the original matrix indicate which CTQ elements trace to each verbatim.)
6
MSC METRICS WORKSHEET
The worksheet groups metrics under 1. Quality of Design, 2. Quality of Conformance, and 3. Quality of Performance, and marks each as a Progress, Process, or Performance measure (the "X" columns). Columns shown here: Metric | Purpose | Used By | Desirability (L,M,H) | Ease of Acquisition (L,M,H).
# of Security Config deviations authorized | Track uncontrolled devices | ITS | H | H
# of Security Config deviations as a % of control population | Track uncontrolled devices | ITS / Audit | H | H
% of total devices in configuration controlled population | Track control set devices and project progress | ITS / ETG / Audit | H | M
# of users authorized to change Security Configuration parameters | Track number of risk opportunities | ITS / ETG | M | L
Pareto of Users by Orgn authorized to change Security Configuration parameters | Identify orgn location of risk opportunities | ITS | M | L
# of Configurations revised following Impact Assessment by ETG
# of Configurations revised following functional and core regression tests | Track ability to develop acceptable standards and configurations | ITS | M | H
# of Security Configurations Published | Measure progress of MSC project | ITS | M | H
# of devices in control population not conforming to current Security Config | Measure known uncontrolled risk | ITS / Audit | H | H
# of days to report closed remediation | Measure time to remediate known uncontrolled risk | ITS / ETG / Audit | H | H
# of devices requiring repeat remediation | Measure persistent non-compliance | ITS / ETG | H | H
Pareto of Security Config settings requiring repeat remediation | Identify persistent non-compliant settings | ITS | H | H
Accuracy of server configuration control documents | Measure state of control documentation and ability to control process | ETG | L | L
# of Security Tech Stds revised following ITS Mgmt Review | Measure ability to specify acceptable standards and configurations | ITS | L | H
First pass quality of Configurations for functional and core regression tests
Catalog of key issues resolved following App Team Review
# of devices in control population built to spec | Identify size of control set and MSC project progress | ITS / ETG / Audit | H | H
% of devices failing factory post-build check | Evaluate ability to control build process | ITS / ETG / Audit | H | H
# of availability incidents involving server configuration troubleshooting | Identify area of opportunity to increase availability | ITS | H | H
Mean time to repair servers involving configuration troubleshooting | Identify area of opportunity to increase availability | ITS | H | H
# of breakage incidents attributable to Security Configuration | Identify risks to availability attributable to the MSC project | ITS | H | H
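To make a few of the worksheet rows concrete, here is a hedged sketch of how the nonconforming-device count, the authorized-user count, and the Pareto of users by organization might be computed once scan results and an access list are available. The record fields and example values are assumptions for illustration, not the pilot's actual data feeds.

from collections import Counter

scan_results = [
    {"device": "srv-01", "conforming": True},
    {"device": "srv-02", "conforming": False},
    {"device": "srv-03", "conforming": False},
]
authorized_users = [
    {"user": "jdoe", "orgn": "ETG"},
    {"user": "asmith", "orgn": "ETG"},
    {"user": "blee", "orgn": "App Team - Drive"},
]

# "# of devices in control population not conforming to current Security Config"
nonconforming = sum(1 for r in scan_results if not r["conforming"])

# "# of users authorized to change Security Configuration parameters"
authorized_count = len(authorized_users)

# "Pareto of Users by Orgn authorized to change Security Configuration parameters"
pareto_by_orgn = Counter(u["orgn"] for u in authorized_users).most_common()

print(nonconforming)     # 2
print(authorized_count)  # 3
print(pareto_by_orgn)    # [('ETG', 2), ('App Team - Drive', 1)]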
WHATs / HOWs
QFD A1 Matrix: 504 design evaluations at four levels
8
Subteam 1 - Policy Deployment SIPOC
Supplier | Input | Process | Output | Customer | Time Duration | Man Hours
ITS | Stretch: Collaborative ad-hoc opinion; Step: Program planning or Policy/Architecture change; Leap: Formal Risk Assessment | Determine scope or focus of the process | Documented portion of the environment to focus on | ITS, ETG, C&A | 3 days | 16 hrs
ITS | Non-competitor partners with environments like ours | Leap: Determine strategic benchmark partners to team up and create joint configurations | Partner Contact | ITS, ETG, C&A | 0 | 0
ITS | Published ITS policies, standards and procedures | Pick ITS policies for input based upon scope (Step: Manually; Leap: Automated) | List of applicable policies/standards/procedures to implement | ITS, ETG, C&A | 1 day | 12 hrs
Vendor, ETG, MSWG | Published platform-specific Security Architecture | Determine input architectures based upon scope | List of applicable architecture sections to implement | ITS Engineers, ETG Engineers, C&A | 1 day | 8 hrs
ITS, ETG | Applicable Policies, Architectures and current environment information | Translate Policies & Architecture into technical standards | Written Technical Standards | ITS Engineers, ETG Engineers | 10 days | 40 hrs
ITS, ETG | Technical Standards & current environment information | Risk Assessment of Technical Standard (Stretch: Ad-hoc review; Leap: Formal risk assessment process) | Impact of technical specification (might determine scope of communication) & communication | ETG/ITS Management | 1 day | 20 hrs
Review Board, ITS | Impact Assessment & Technical Standards | Agree on Technical Standards (go/no-go) | Communicate change | ITS, ETG, C&A | 1 day | 20 hrs
ITS | Applicable architectures & Technical Standards | Engineer Creates Configuration Design | Proposed Design | Testers | 10 days | 40 hrs
Review Board, ETG | Impact Assessment & Proposed Design | Design Test | Test Plan & Test Script | Testers | 3 days | 16 hrs
Testers | Test Plan & Test Script | Test the Design | Test Results | ETG/ITS Management | 5 days | 40 hrs
Review Board, Testers | Impact Assessment & Test Results | Approve Design | ISO documented configuration & Communication | ETG | 2 days | 20 hrs
Final steps: Deploy Configuration Standard; Pilot
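As a concrete illustration of the "Translate Policies & Architecture into technical standards" row above, the sketch below maps one policy statement to platform-specific settings and renders a build-time configuration spec. The policy text, platform, and setting names are assumptions made for the example; they are not the published ITS policies or the actual technical standards.

POLICY = "Passwords must be at least 8 characters and expire within 90 days."

TECHNICAL_STANDARD = {
    "platform": "Windows Server 2003",   # assumed platform for the example
    "derived_from": POLICY,
    "settings": {
        "MinimumPasswordLength": 8,
        "MaximumPasswordAge": 90,        # days
    },
}

def to_configuration_spec(standard: dict) -> list:
    """Render the technical standard as build-time configuration lines."""
    return [f"{name}={value}" for name, value in standard["settings"].items()]

print(to_configuration_spec(TECHNICAL_STANDARD))
# ['MinimumPasswordLength=8', 'MaximumPasswordAge=90']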
MSC process flow (swim lanes): ITS; ETG / DSE; VENDOR / MSWG; REVIEW BOARD (ETG Mgmt / ITS Mgmt); TESTERS; APP TEAM; IT / CHANGE MGMT; AUDIT
Process boxes (from START to End): Schedule Policy Implementation; Determine Input Architectures; Translate Policies & Arch into Tech Stds; Conduct Risk Assessment of Tech Stds; ITS Mgmt Approve Tech Std; Create Config; ETG Mgmt Approve Config; Publish Config; Design Test of Config; Conduct Functional and Core Regression Test of Config; Conduct App Test; Provide Feedback and Resolve Issues; Deploy Config; Build Device; Tailor/Modify Device; Maintain Device; Design Compliance Scan; Conduct Compliance Scan; Analyze Scan Results and Report; Remediate; Administer Deviation Process; Modify Policy
Subteam 2 - Standard Config SIPOC
Supplier | Input | Process | Output | Customer | Duration | Man Hours
User | PSC Request Submitted | Server Procurement | Approved Request Document | Requestor | 1-2 weeks | 40
ITS/ETG | Server Builds Standards | Build Server (Stretch: All setup documented; Step: All documents in 1 place; Leap: All setup automated) | Server Built to Std Config | Builder | 2-5 days | 3
App Team | Custom Config Specs | Tailor/Modify Server | Server with Custom Config | Infrastructure/App Team | 2 days | 6
IT / Change Mgmt | Server Support | Care & Feeding (Stretch: Document all changes; Step: Package all changes; Leap: Automate all changes) | Production Ready Server; Retired Server | Requestor / Change Mgmt / App Team / DSE / Builder | ongoing | ongoing
Change Mgmt / DSE | Process Audit Stds | Process Audit | Process Audit Report | Builder / Automation / Change Mgmt | TBD | TBD
- | - | Remediation | - | - | TBD | TBD
Note: the process design goal is to drive improvement and tailoring into the process-controlled build.
Sub-Team 2 - Server Build Standards and Process Control
Subteam - Compliance Scan SIPOC
Revised 3/10/06
Supplier | Input | Process | Output | Customer
ITS, ETG | Technical Configuration Standard | Scanning for Good/Bad (Step: Perform scan at config center, cut Remedy tickets) | Non-Compliant Results, Documented Exceptions | ITS, ETG
- | Non-Compliant Results, Documented Exceptions | Classify Non-compliant Results | Report with all items that need attention | ETG
- | Classification | Prioritize Fixes | Remediation Plan, Document exception | ETG
ETG | Remediation Plan | Remediation (Step: Automate remediation) | - | ETG
- | Exceptions | Feedback to Policy | Updated Configuration | Config Owners
Sub-Team 3 - Compliance Scanning and Remediation SIPOC (Step)
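A minimal sketch of the Sub-Team 3 flow: scan against the technical configuration standard, classify non-compliant results against documented exceptions, and prioritize fixes into a remediation plan. The setting names and severity weights are assumptions for illustration; the actual scanning tooling and Remedy ticketing are not shown.

def scan(observed: dict, standard: dict) -> list:
    """Compare observed settings to the technical configuration standard."""
    return [
        {"setting": name, "expected": expected, "actual": observed.get(name)}
        for name, expected in standard.items()
        if observed.get(name) != expected
    ]

def classify(findings: list, documented_exceptions: set) -> list:
    """Flag findings that are covered by a documented exception."""
    return [dict(f, exception=f["setting"] in documented_exceptions) for f in findings]

def prioritize(classified: list, severity: dict) -> list:
    """Order actionable findings into a remediation plan, highest assumed severity first."""
    actionable = [f for f in classified if not f["exception"]]
    return sorted(actionable, key=lambda f: severity.get(f["setting"], 0), reverse=True)

standard = {"MinimumPasswordLength": 8, "GuestAccountEnabled": False}
observed = {"MinimumPasswordLength": 6, "GuestAccountEnabled": True}
plan = prioritize(classify(scan(observed, standard), set()), {"GuestAccountEnabled": 3})
print([f["setting"] for f in plan])   # ['GuestAccountEnabled', 'MinimumPasswordLength']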
9
Project scope
Scope elements: Design; Build; Maintain; Retrofit; Operate; Recover; Configuration; Security; Policy; check; fix
Maximize the "-ilities": Installability, Adaptability, Checkability, Documentability, Affordability, Learnability, Reliability, Testability, Continual Improvement
Minimize the "-ings": Varying, Exacting, Complicating, Sorting and Sensing, Starting something new, Complying, Demanding new skills
The Kano Model of Customer Needs
BASIC QUALITY ("MUST-BE"); SPECIFIED QUALITY ("PERFORMANCE"); EXCITING QUALITY ("ATTRACTIVE")
Axes: CUSTOMER DELIGHTED to CUSTOMER REPULSED; BASIC NEEDS NOT MET to BASIC NEEDS FULLY MET
Design Tools: Process Archetype; Cust Sat Model; Derived CTQs; Design Criteria
10
MSC Security Kaizen™ – RACI (R = Responsible; C = Must Coordinate; I = Must Be Informed) Updated 21 April 2006 by Jeffrey Elliott
For each pilot process step, the matrix assigns R, C, or I to ITS, ETG, AUDIT, MSWG, APP-T, and IT-CM.

TECH STD DEVELOPMENT
1. Determine Scope (area of focus) and Schedule of Implementation
2. Create Proposed Technical Standard(s)
3. Conduct Impact Assessment of Proposed Technical Standard
4. ITS Mgmt Review of Proposed Tech Std
5. Publish Proposed Tech Std

CONFIGURATION STANDARD DEVELOPMENT
6. Create Proposed Configuration
7. Implement Proposed Configuration Basic Test
8. Implement Preliminary Compliance Check
9. Conduct Proposed Configuration Preliminary Impact Assessment
10. Conduct Functional and Core Regression Test of the Proposed Configuration
11. ETG Mgmt Review of Proposed Configuration
12. Communicate Proposed Configuration
13. Conduct App Team Review
14. Resolve App Team Issues
15. Publish Technical Standard and Configuration

CONFIGURATION DEPLOYMENT
16. Deploy Configuration

CONFIGURATION COMPLIANCE CHECK & DEVIATION PROCESS
17. Implement Compliance Check Procedure
18. Conduct Compliance Check
19. Analyze Compliance Check Results
20. Administer Deviation Process
The process is designed to provide a phased approach to implementing security policy into server configurations and provide a system of configuration management to promote compliance.
1. The first step is for ITS to select specific security policies, standards, procedures and architectures for deployment in the MSC process.
2. A technical standard defines technology-specific configuration guidelines and defines the population for compliance to the standard.
3. ITS is responsible for conducting a high-level assessment of the benefits and costs of deploying a technical standard. The Impact Assessment is coordinated with ETG (who represents the App Teams) and Audit is informed of the assessment details and outcome through email notification.
4. ITS Management is responsible for assembling a review board to approve or recommend revision of the proposed technical standard.
5. ITS is responsible for publishing the draft technical standard for coordination with ETG. The MSWG receives a copy of the Draft Technical Standard.
6. ETG is responsible for translating the Draft Technical Standard into a specific Configuration coordinated with ITS.
7. ETG is responsible for testing the proposed configuration for compatibility with the technology platform.
8. ITS is responsible for designing and documenting the compliance check and the schedule for evaluating the control set. The Compliance Check Design Specification is coordinated with ETG and Audit.
9. ETG is responsible for evaluating the benefits and costs of implementing a proposed configuration. ITS is informed of the assessment details and outcome.
10. An ad hoc team of Testers is responsible for conducting both functional and core regression testing on the proposed configuration. These tests are coordinated with ETG.
11. ETG Management convenes a formal board of review to evaluate the proposed configuration, Impact Assessment, and the outcome of the Test Results. The Configuration is either approved or recommended for revision. This step is coordinated with ITS.
12. Upon approval by ETG Management, ETG provides email notification of the Proposed Configuration to ITS, MSWG, and the appropriate App Teams.
13. ETG is responsible for reviewing the Proposed Configuration and raising issues with performance or compatibility with specific applications. This step is coordinated with applicable App Teams.
14. ETG is responsible for resolving issues raised by the App Team(s) and coordinating issue resolution with ITS and the App Team(s).
15. Upon resolution of App Team issues, ITS is responsible for publishing the Technical Standard and Configuration Specification and informing ETG, Audit, MSWG, Testers, App Team(s), and IT/Change Mgmt.
16. ETG is responsible for formally deploying the Configuration in the Build instructions, Retrofit instructions, or the Maintain instructions. All relevant control documentation (including training) is updated and formal notification of the change is sent to all relevant parties. Deployment is coordinated with ETS, the App Team(s), and IT/Change Mgmt. Audit is informed of the Deployment plan.
17. TBD
18. Based on the schedule determined in the Compliance Check Design Specification (Step 8), the Compliance Scan is conducted by Audit and coordinated with ITS and ETG.
19. Audit is responsible for analyzing the Compliance Check results and publishing the Compliance Check Report and email notification of non-compliance. This step is coordinated with ETG and ITS. ITS updates the MSC performance dashboards.
20. ETG is responsible for administering any formal compliance deviations and coordinating them with ITS, Audit, and the App Team(s).
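The R/C/I assignments described in the numbered notes can be treated as data and checked mechanically, for example that every step has exactly one Responsible organization. The sketch below encodes only the handful of steps whose assignments are spelled out above; it is an illustration, not the project's full matrix.

ORGS = ["ITS", "ETG", "AUDIT", "MSWG", "APP-T", "IT-CM"]

RACI = {
    # step number: {org: "R" (Responsible), "C" (Must Coordinate), or "I" (Must Be Informed)}
    1:  {"ITS": "R"},
    3:  {"ITS": "R", "ETG": "C", "AUDIT": "I"},
    5:  {"ITS": "R", "ETG": "C", "MSWG": "I"},
    18: {"AUDIT": "R", "ITS": "C", "ETG": "C"},
    20: {"ETG": "R", "ITS": "C", "AUDIT": "C", "APP-T": "C"},
}

# Every organization referenced must be one of the matrix columns.
assert all(org in ORGS for assignments in RACI.values() for org in assignments)

def steps_without_single_owner(raci: dict) -> list:
    """Return step numbers that do not have exactly one 'R' assignment."""
    return [
        step for step, assignments in raci.items()
        if list(assignments.values()).count("R") != 1
    ]

print(steps_without_single_owner(RACI))   # [] -> every listed step has one owner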
[RACI assignment matrix: R/C/I marks for ITS, ETG, AUDIT, MSWG, APP-T, and IT-CM against each of the 20 steps above; the individual assignments are described in the numbered notes.]
Note: “Responsible” means the identified organization must initiate and ensure closure on the process step. “Coordinate” means the identified organization must provide input.
“Inform” means that a formal communication is sent by the responsible organization to the designated recipient.
The goal of the Microsoft Security Configuration Project is to design a standardized process for deploying security policy into the server deployment lifecycle, including an audit and remediation cycle, and to conduct a controlled pilot implementation to establish process capability. This will be accomplished by establishing and maintaining an accurate repository of asset configuration attributes and baselines, and by auditing and remediating the operational environment to ensure compliance.
The process will be driven by the evolving Security Architecture and Security Technical Standards, which will operationally define compliant security settings for the build and operational stages in the server lifecycle. A formal audit and process performance measurement system will be developed and piloted.
The goal is a process that is repeatable, i.e., it consistently produces compliant configurations and remediates non-compliant systems, and reproducible, i.e., the process serves multiple IT assets (servers and clients) and is technology/platform agnostic.
11
Web Page Info Architecture
Title Page
Management Dashboard
  - Availability: 1. Availability Incidents Involving Device Config Troubleshooting; 2. Breakage Incidents Attributable to Security Configuration; 3. Mean Time to Repair Devices
  - MSC Conformance: 1. Users Authorized; 2. Pareto of Users; 3. Nonconforming Devices; 4. Time to Close Remediation
Process Dashboard
  - MSC Process Performance: 1. Authorized Deviations; 2. Revised Configs; 3. Pareto of Revised Causes; 4. Repeat Remediation; 5. Pareto of Repeat Rem Settings
Pilot Team Dashboard
  - MSC Project Implementation: 1. Security Configs Published; 2. Devices Built to Spec; 3. Devices Failing Build; 4. Devices in MSC Pgm
What's it all mean?
MSC Security Kaizen MANAGEMENT DASHBOARD: Availability
AVAILABILITY INCIDENTS INVOLVING DEVICE CONFIG TROUBLESHOOTING: "How many availability incidents in the last reporting period had issues with device configuration?"
BREAKAGE INCIDENTS ATTRIBUTABLE TO SECURITY CONFIG: "How many breakage incidents in the last reporting period had issues with Security Configuration?"
MEAN TIME TO REPAIR: "What is the average time to repair devices involving configuration troubleshooting? Is this time decreasing?"
[XmR charts of example data for each panel.]
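The dashboard panels are XmR (individuals and moving range) control charts. Below is a generic sketch of how XmR limits are conventionally computed (center line at the mean of the individual values, natural process limits at plus or minus 2.66 times the average moving range); it assumes each metric is a simple per-period series, and the numbers are made up rather than project data.

def xmr_limits(values: list) -> dict:
    """Compute individuals-chart limits using the usual XmR constants (2.66, 3.268)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    x_bar = sum(values) / len(values)
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return {
        "center": x_bar,
        "ucl": x_bar + 2.66 * mr_bar,   # upper natural process limit
        "lcl": x_bar - 2.66 * mr_bar,   # lower natural process limit
        "mr_ucl": 3.268 * mr_bar,       # upper limit for the moving-range chart
    }

# Example: availability incidents per reporting period (illustrative numbers only).
print(xmr_limits([23, 28, 25, 33, 30, 27, 35, 29, 31, 26, 28]))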
MSC Security Kaizen MANAGEMENT DASHBOARD: MSC Conformance
NUMBER OF USERS AUTHORIZED TO CHANGE SECURITY CONFIG: "How many users are authorized to change Security Configuration Settings? Is this number increasing or decreasing?"
PARETO OF AUTHORIZED USERS: "Which organizations have users authorized to change Security Configurations?"
DEVICES NONCONFORMING TO SECURITY CONFIGURATIONS: "What percentage of devices currently deployed fail to conform to the current Security Configuration?"
[XmR charts of example data for each panel.]
[Pareto chart of example data, n = 27.]
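Several panels are Pareto charts (authorized users by organization, configuration revision causes, repeat remediation settings). Here is a minimal sketch of the ranking and cumulative-percentage calculation behind such a chart; the category names and counts are invented for illustration.

from collections import Counter

def pareto(counts: Counter) -> list:
    """Return (category, count, cumulative %) tuples sorted by descending count."""
    total = sum(counts.values())
    running, rows = 0, []
    for category, count in counts.most_common():
        running += count
        rows.append((category, count, 100.0 * running / total))
    return rows

settings = Counter({"PasswordPolicy": 12, "AuditPolicy": 4, "GuestAccount": 2, "Other": 1})
for category, count, cum_pct in pareto(settings):
    print(f"{category:15s} {count:3d} {cum_pct:5.1f}%")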
TIME TO REPORT CLOSED REMEDIATION: "How long is it taking to remediate noncompliant devices?"
[Histogram of example data, n = 190, bin range 230 to 725: Target = 336; Median = 348.5; Standard Deviation = 92.4; Max Value = 689; Min Value = 231; Long-Term Process Capability Ratio = 0.14.]
NUMBER OF AUTHORIZED DEVIATIONS
“How many devices are authorized to be noncompliant with the current Security Configuration? Is this number increasing or decreasing?”
PARETO OF CONFIGURATION REVISION CAUSES: "What are the causes for revising Security Configurations?"
[XmR charts of example data.]
MSC Security Kaizen PROCESS DASHBOARD
[Pareto chart of example data, n = 27.]
PARETO OF REPEAT REMEDIATION SETTINGS: "Which device settings require repeat remediation?"
NUMBER OF REVISED CONFIGURATIONS: "How many proposed configurations require revision prior to deployment? Is this number increasing or decreasing?"
[XmR charts of example data.]
PERCENTAGE OF DEVICES REQUIRING REPEAT REMEDIATION: "How many devices fail Compliance Check following reported remediation? Is this number increasing or decreasing?"
[XmR charts of example data.]
[Pareto chart of example data, n = 27.]