Capability Maturity Model

The Capability Maturity Model (CMM) is a collection of practices an organization can follow in order to gain better control over its software development process.

The CMM ranks software development organizations in a hierarchy of five levels, each with a progressively greater capability of producing quality software. Each level is described as a level of maturity and comprises a progressively larger set of practices to follow. An organization at level 1 (an estimated 75% of software development organizations as of 1998, best described as chaotic) follows only a few of the CMM's practices; an organization at level 5 follows all of them.

The CMM was developed by the Software Engineering Institute (SEI) at Carnegie Mellon University in Pittsburgh. It has been used extensively for avionics software and for government projects since it was created in the mid-1980s.

Maturity model

A maturity model is a structured collection of elements that describe characteristics of effective processes. A maturity model provides:

  • a place to start
  • the benefit of a community’s prior experiences
  • a common language and a shared vision
  • a framework for prioritizing actions
  • a way to define what improvement means for your organization

A maturity model can be used as a benchmark for assessing different organizations on a comparable basis.

The SEI has subsequently released a revised version known as the Capability Maturity Model Integration (CMMI).

History

Like best practices, the Capability Maturity Model was initially funded by military research, but its method of process improvement could not be more different. Where the best practices approach is "bottom up" and quite informal, the Capability Maturity Model is rigid, "top down", and prescriptive.

The United States Air Force funded a study at the Carnegie-Mellon Software Engineering Institute to create a model for the military to use as an objective evaluation of software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM has since been revised and updated; version 1.1 is now in print and the entire text is available on-line at the SEI's Web site.

Context

The term software originates from the idea that software is easy to change ("soft") compared with hardware, which is more difficult to change ("hard"). Another theory is that software is "soft" in the sense that it is not tangible, unlike hardware, which can be touched and physically replaced. In the 1970s, the field of software development saw significant growth as more organizations began to move to computerized information systems. With this growth, two events began unfolding.

The first event was that computerized information systems became commonplace and improved computer hardware allowed for more ambitious information system projects. Along with the improved computer hardware, new technologies and manufacturing processes resulted in cheaper, more reliable, and more flexible computer platforms and peripherals, which in turn encouraged the use of information systems in more diverse applications.

The second event was that the explosion in the number of computer information systems, driven by the increased application of computers to organizational problems, created a need for many more people to develop software. This meant that people with little experience in the art of developing computer software moved into that area of work. Not only was there increased demand for people to design and write software, there was also increased demand for people to manage these projects.

Many software projects failed due to inadequate processes and project management. This had two primary causes. The first was that software development, both the design and writing of software and the management of software projects, did not have a large body of published work, and what work did exist was not used by industry to any great extent.

The second cause was that, as information systems became more commonplace, people became more ambitious in applying computer systems to organizational problems. Projects moved from well-known areas such as accounting or inventory systems, which primarily involved numbers and the embedding of an abstract model into a computing platform, to applications that involved the movement of physical objects in the real world. In addition, software development teams ran into the problem of attempting to model complex systems, such as the complete information flows of an enterprise, within information systems. The sheer complexity of the problem led to project failures.

During the 1970s there were a number of proponents of a more scientific and professional practice of software development. People such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas published articles and books presenting research results in an attempt to professionalize the software development community.

Watts Humphrey's Capability Maturity Model (CMM) was described in his book Managing the Software Process (1989). The CMM as conceived by Humphrey was based on the earlier work of Philip Crosby. Active development of the model by the SEI (the US Department of Defense-sponsored Software Engineering Institute) began in 1986.

The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been and continues to be widely applied as a general model of the maturity of processes (e.g., ITIL service management processes) in IS/IT (and other) organisations.

The model identifies five levels of process maturity for an organisation:

  1. Initial (chaotic, ad hoc, heroic): the starting point for use of a new process.
  2. Repeatable (project management, process discipline): the process is used repeatedly.
  3. Defined (institutionalised): the process is defined and confirmed as a standard business process.
  4. Managed (quantified): process management and measurement take place.
  5. Optimising (process improvement): process management includes deliberate process optimisation and improvement.

Within each of these maturity levels are KPAs (Key Process Areas) which characterise that level, and for each KPA there are five definitions identified:

  1. Goals
  2. Commitment
  3. Ability
  4. Measurement
  5. Verification

The KPAs are not necessarily unique to the CMM; they represent the stages that organisations must go through on the way to becoming mature.
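
As an illustration of this structure (a minimal sketch, not part of the CMM itself; the class names, fields, and example KPA wording below are assumptions made for clarity), the five levels and the five definitions attached to each KPA can be modelled as simple data objects:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyProcessArea:
    """A Key Process Area with the five definitions the CMM attaches to each KPA."""
    name: str
    goals: List[str] = field(default_factory=list)
    commitment: List[str] = field(default_factory=list)
    ability: List[str] = field(default_factory=list)
    measurement: List[str] = field(default_factory=list)
    verification: List[str] = field(default_factory=list)

@dataclass
class MaturityLevel:
    number: int
    name: str
    focus: str
    kpas: List[KeyProcessArea] = field(default_factory=list)

# The five levels as described above.
LEVELS = [
    MaturityLevel(1, "Initial", "chaotic, ad hoc, heroic"),
    MaturityLevel(2, "Repeatable", "project management, process discipline"),
    MaturityLevel(3, "Defined", "institutionalised standard processes"),
    MaturityLevel(4, "Managed", "quantified process management and measurement"),
    MaturityLevel(5, "Optimising", "deliberate process optimisation and improvement"),
]

# Example: Requirements Management is a real level 2 KPA in the SW-CMM;
# the goal wording here is paraphrased for illustration only.
LEVELS[1].kpas.append(KeyProcessArea(
    name="Requirements Management",
    goals=["Software requirements are baselined and controlled."],
))

for level in LEVELS:
    print(f"Level {level.number} - {level.name}: {level.focus} ({len(level.kpas)} KPAs listed)")
```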

The SEI has defined a rigorous process assessment method to appraise how well a software development organisation meets the criteria for each level.

The assessment is supposed to be led by an authorised lead assessor. One way in which companies are supposed to use the model is first to assess their maturity level and then form a specific plan to get to the next level. Skipping levels is not allowed.

NB: The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project, and it may be well suited to that purpose. When it became a general model for software process improvement, however, it attracted many critics.

Shrinkwrap companies, also called commercial off-the-shelf (COTS) firms or software package firms, included Borland, Claris, Apple, Symantec, Microsoft, and Lotus, amongst others. Many such companies rarely, if ever, managed their requirements documents as formally as the CMM described. Formal requirements management is a requirement for level 2, so all of these companies would probably fall into level 1 of the model.

Timeline

  • 1987: SEI-87-TR-24 (SW-CMM questionnaire), released.
  • 1989: Managing the Software Process, published.
  • 1991: SW-CMM v1.0, released.
  • 1993: SW-CMM v1.1, released.
  • 1997: SW-CMM revisions halted in favor of the CMMI.
  • 2000: CMMI v1.02, released.
  • 2002: CMMI v1.1, released.

Current state

Although these models have proved useful to many organizations, the use of multiple models has been problematic. Further, applying multiple models that are not integrated within and across an organization is costly in terms of training, appraisals, and improvement activities. The CMM Integration project was formed to sort out the problem of using multiple CMMs. The CMMI Product Team's mission was to combine three source models:

  1. The Capability Maturity Model for Software (SW-CMM) v2.0 draft C
  2. The Systems Engineering Capability Model (SECM)
  3. The Integrated Product Development Capability Maturity Model (IPD-CMM) v0.98

In addition, supplier sourcing was incorporated into CMMI as a separate discipline.

CMMI is the designated successor of the three source models. The SEI has released a policy to sunset the Software CMM, and the SECM and IPD-CMM are likewise expected to be superseded by CMMI.

Future direction

Suggestions for improving CMMI are welcomed by the SEI. For information on how to provide feedback, see the CMMI Web site.

Levels of the CMM

(See chapter 2, page 11, of the March 2002 edition of the CMMI from the SEI.)

There are five levels of the CMM. According to the SEI,

"Predictability, effectiveness, and control of an organization's software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief."

Level 1 - Initial

At maturity level 1, processes are usually ad hoc and the organization usually does not provide a stable environment. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this ad hoc, chaotic environment, maturity level 1 organizations often produce products and services that work; however, they frequently exceed the budget and schedule of their projects.

Maturity level 1 organizations are characterized by a tendency to overcommit, to abandon processes in times of crisis, and to be unable to repeat their past successes.

Level 2 - Repeatable

At maturity level 2, software development successes are repeatable. The organization may use some basic project management to track cost and schedule.

Process discipline helps ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.

Project status and the delivery of services are visible to management at defined points (for example, at major milestones and at the completion of major tasks).

Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
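
As a rough illustration of this kind of basic tracking (a minimal sketch only; the milestone names, dates, and costs are invented and not taken from any CMM document), a level 2 organization's plan-versus-actual reporting at milestones might look like:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Milestone:
    name: str
    planned_finish: date
    actual_finish: Optional[date]  # None while the milestone is still open
    planned_cost: float
    actual_cost: float

# Invented example data for two milestones of a hypothetical project.
milestones = [
    Milestone("Requirements baselined", date(2005, 2, 1), date(2005, 2, 8), 40_000, 46_000),
    Milestone("Design complete", date(2005, 4, 1), None, 90_000, 55_000),
]

for m in milestones:
    cost_variance = m.actual_cost - m.planned_cost
    if m.actual_finish is not None:
        slip = (m.actual_finish - m.planned_finish).days
        schedule = f"{slip:+d} days against plan"
    else:
        schedule = "in progress"
    print(f"{m.name}: {schedule}; cost variance to date {cost_variance:+,.0f}")
```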

Level 3 - Defined

At maturity level 3, processes are well characterized and understood, and are described in standards, procedures, tools, and methods.

The organization’s set of standard processes, which is the basis for level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization’s set of standard processes according to tailoring guidelines.

The organization’s management establishes process objectives based on the organization’s set of standard processes and ensures that these objectives are appropriately addressed.

A critical distinction between level 2 and level 3 is the scope of standards, process descriptions, and procedures. At level 2, the standards, process descriptions, and procedures may be quite different in each specific instance of the process (for example, on a particular project). At level 3, the standards, process descriptions, and procedures for a project are tailored from the organization’s set of standard processes to suit a particular project or organizational unit.

Level 4 - Managed

Using precise measurements, management can effectively control the software development effort. In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications.

Subprocesses are selected that significantly contribute to overall process performance. These selected subprocesses are controlled using statistical and other quantitative techniques.

A critical distinction between maturity level 3 and maturity level 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques, and is quantitatively predictable. At maturity level 3, processes are only qualitatively predictable.
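
To make the distinction concrete, the following minimal sketch shows one common quantitative technique of the kind a level 4 organization might apply: deriving three-sigma control limits for a project metric from a stable baseline and flagging new observations that fall outside them. The metric, figures, and thresholds are invented for illustration and are not prescribed by the CMM or CMMI.

```python
import statistics

# Invented baseline: defect density (defects per KLOC) for past builds of a
# project whose process is assumed to be stable.
baseline = [2.1, 1.8, 2.4, 2.0, 2.2, 1.9, 2.3, 2.0, 2.1, 1.7]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Conventional three-sigma control limits around the baseline mean.
ucl = mean + 3 * stdev
lcl = max(0.0, mean - 3 * stdev)
print(f"baseline mean {mean:.2f}, control limits [{lcl:.2f}, {ucl:.2f}]")

# New observations: a value outside the limits suggests a special cause of
# variation that the project would investigate.
new_builds = {"build 11": 2.2, "build 12": 4.1}
for name, value in new_builds.items():
    status = "within limits" if lcl <= value <= ucl else "outside limits: investigate special cause"
    print(f"{name}: {value:.1f} defects/KLOC ({status})")
```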

Level 5 - Optimizing

Maturity level 5 focuses on continually improving process performance through both incremental and innovative technological improvements. Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives. Both the defined processes and the organization’s set of standard processes are targets of measurable improvement activities.

Process improvements to address common causes of process variation and measurably improve the organization’s processes are identified, evaluated, and deployed.

Optimizing processes that are nimble, adaptable and innovative depends on the participation of an empowered workforce aligned with the business values and objectives of the organization. The organization’s ability to rapidly respond to changes and opportunities is enhanced by finding ways to accelerate and share learning.

A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed. At maturity level 4, processes are concerned with addressing special causes of process variation and providing statistical predictability of the results. Though processes may produce predictable results, the results may be insufficient to achieve the established objectives. At maturity level 5, processes are concerned with addressing common causes of process variation and changing the process (that is, shifting the mean of the process performance) to improve process performance (while maintaining statistical predictability) to achieve the established quantitative process-improvement objectives.
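
A small numerical illustration of this distinction (assumed figures only, continuing the invented defect-density example above): a level 4 organization may have a statistically predictable process whose mean is still too high to meet a quantitative objective, while a level 5 improvement shifts that mean.

```python
from statistics import NormalDist

# Invented quantitative objective: a build should have at most 2.0 defects/KLOC.
objective = 2.0

# Level 4 situation: the process is predictable (stable mean and variation),
# yet it frequently misses the objective.
current = NormalDist(mu=2.05, sigma=0.22)

# Level 5 improvement: a process change shifts the mean while keeping the
# process statistically predictable.
improved = NormalDist(mu=1.60, sigma=0.22)

print(f"chance of meeting the objective before the change: {current.cdf(objective):.0%}")
print(f"chance of meeting the objective after shifting the mean: {improved.cdf(objective):.0%}")
```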

Extensions

Recent versions of the CMMI from the SEI indicate a "level 0", characterized as "Incomplete". Many observers leave this level out as redundant or unimportant, but Pressman and others make note of it. See page 18 of the August 2002 edition of the CMMI from the SEI.

Anthony Finkelstein[1] extrapolated that negative levels are necessary to represent environments that are not only indifferent but actively counterproductive; this idea was refined by Tom Schorsch[2] as the Capability Immaturity Model.

Process areas

The CMMI contains several key process areas indicating the aspects of product development that are to be covered by company processes.

Key Process Areas of the Capability Maturity Model Integration (CMMI)

Abbreviation | Name | Area | Maturity Level
CAR | Causal Analysis and Resolution | Support | 5
CM | Configuration Management | Support | 2
DAR | Decision Analysis and Resolution | Support | 3
IPM | Integrated Project Management | Project Management | 3
ISM | Integrated Supplier Management | Project Management | 3
IT | Integrated Teaming | Project Management | 3
MA | Measurement and Analysis | Support | 2
OEI | Organizational Environment for Integration | Support | 3
OID | Organizational Innovation and Deployment | Process Management | 5
OPD | Organizational Process Definition | Process Management | 3
OPF | Organizational Process Focus | Process Management | 3
OPP | Organizational Process Performance | Process Management | 4
OT | Organizational Training | Process Management | 3
PI | Product Integration | Engineering | 3
PMC | Project Monitoring and Control | Project Management | 2
PP | Project Planning | Project Management | 2
PPQA | Process and Product Quality Assurance | Support | 2
QPM | Quantitative Project Management | Project Management | 4
RD | Requirements Development | Engineering | 3
REQM | Requirements Management | Engineering | 2
RSKM | Risk Management | Project Management | 3
SAM | Supplier Agreement Management | Project Management | 2
TS | Technical Solution | Engineering | 3
VAL | Validation | Engineering | 3
VER | Verification | Engineering | 3
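
The table can also be read as a simple lookup structure. The sketch below is only an illustrative encoding (not an SEI artifact): it records each process area's category and maturity level and lists the areas a staged-representation rating at a given level requires, since a level N rating covers the process areas assigned to levels 2 through N.

```python
from typing import List

# Illustrative encoding of the table above: abbreviation -> (name, area, maturity level).
PROCESS_AREAS = {
    "CAR": ("Causal Analysis and Resolution", "Support", 5),
    "CM": ("Configuration Management", "Support", 2),
    "DAR": ("Decision Analysis and Resolution", "Support", 3),
    "IPM": ("Integrated Project Management", "Project Management", 3),
    "ISM": ("Integrated Supplier Management", "Project Management", 3),
    "IT": ("Integrated Teaming", "Project Management", 3),
    "MA": ("Measurement and Analysis", "Support", 2),
    "OEI": ("Organizational Environment for Integration", "Support", 3),
    "OID": ("Organizational Innovation and Deployment", "Process Management", 5),
    "OPD": ("Organizational Process Definition", "Process Management", 3),
    "OPF": ("Organizational Process Focus", "Process Management", 3),
    "OPP": ("Organizational Process Performance", "Process Management", 4),
    "OT": ("Organizational Training", "Process Management", 3),
    "PI": ("Product Integration", "Engineering", 3),
    "PMC": ("Project Monitoring and Control", "Project Management", 2),
    "PP": ("Project Planning", "Project Management", 2),
    "PPQA": ("Process and Product Quality Assurance", "Support", 2),
    "QPM": ("Quantitative Project Management", "Project Management", 4),
    "RD": ("Requirements Development", "Engineering", 3),
    "REQM": ("Requirements Management", "Engineering", 2),
    "RSKM": ("Risk Management", "Project Management", 3),
    "SAM": ("Supplier Agreement Management", "Project Management", 2),
    "TS": ("Technical Solution", "Engineering", 3),
    "VAL": ("Validation", "Engineering", 3),
    "VER": ("Verification", "Engineering", 3),
}

def required_for(maturity_level: int) -> List[str]:
    """Process areas required for a staged maturity level rating (levels 2..N)."""
    return sorted(abbr for abbr, (_name, _area, level) in PROCESS_AREAS.items()
                  if 2 <= level <= maturity_level)

print("Level 2 process areas:", ", ".join(required_for(2)))
print("Level 3 adds:", ", ".join(sorted(set(required_for(3)) - set(required_for(2)))))
```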

Controversial aspects

The software industry is diverse and volatile. All methodologies for creating software have supporters and critics, and the CMM is no exception.

Praise

  • The CMM was developed to give Defense organizations a yardstick to assess and describe the capability of software contractors to provide software on time, within budget, and to acceptable standards. It has arguably been successful in this role, even reputedly causing some software sales people to clamour for their organizations' software engineers/developers to "implement CMM."
  • The CMM is intended to enable an assessment of an organization's maturity for software development. It is an important tool for outsourcing and exporting software development work. Economic development agencies in India, Ireland, Egypt, and elsewhere have praised the CMM for enabling them to compete for US outsourcing contracts on an even footing.
  • The CMM provides a good framework for organizational improvement. It allows companies to prioritize their process improvement initiatives.

Criticism

  • The CMM has failed to take over the world. It is hard to tell exactly how widespread it is, as the SEI only publishes the names and achieved levels of compliance of companies that have requested this information to be listed[3]. The most current Maturity Profile for CMMI is available online[4].
  • The CMM is well suited to bureaucratic organizations such as government agencies, large corporations, and regulated monopolies. If the organizations deploying the CMM are large enough, they may employ a team of CMM auditors reporting their results directly to the executive level (a practice encouraged by the SEI). The use of auditors and executive reports may lead the entire IT organization to focus on perfectly completed forms rather than on application development, client needs, or the marketplace. If a project is driven by a due date, the CMM's intensive reliance on process and forms may become a hindrance to meeting that date in cases where time to market with some kind of product is more important than high quality and functionality.
  • Suggestions of scientifically managing the software process with metrics only appear at the fourth level and above. There is little validation of the process's cost savings to business other than a vague reference to empirical evidence. One would expect a large body of evidence to show that adding all the business overhead demanded by the CMM somehow reduces IT headcount, business cost, and time to market without sacrificing client needs.
  • No external body actually certifies a software development center as being CMM compliant. It is supposed to be an honest self-assessment ([5] and [6]).
  • The CMM does not describe how to create an effective software development organization. The CMM contains behaviors or best practices that successful projects have demonstrated. Being CMM compliant is not a guarantee that a project will be successful; however, being compliant can increase a project's chances of being successful.
  • The CMM can seem to be overly bureaucratic, promoting process over substance, for example by emphasizing predictability over the service provided to end users. More commercially successful methodologies (for example, the Rational Unified Process) have focused not on the capability of an organization to produce software satisfying some other organization or a collectively produced specification, but on the capability of organizations to satisfy specific end-user "use cases", as in the Object Management Group's UML (Unified Modeling Language) approach[7].

The most beneficial elements of CMM Levels 2 and 3

  • Creation of Software Specifications, stating what it is that is going to be developed, combined with formal sign-off, an executive sponsor, and an approval mechanism. This is NOT a living document; additions are placed in a deferred or out-of-scope section for later incorporation into the next cycle of software development.
  • A Technical Specification, stating precisely how the thing described in the Software Specifications is to be developed. This is a living document.
  • Peer Review of Code (Code Review) with metrics, which allows developers to walk through an implementation and to suggest improvements or changes. Note: this is problematic because the code has already been developed and a bad design cannot be fixed by "tweaking"; the Code Review gives complete code a formal approval mechanism.
  • Version Control - a very large number of organizations have no formal revision control mechanism or release mechanism in place.
  • The idea that there is a "right way" to build software, that it is a scientific process involving engineering design and that groups of developers are not there to simply work on the problem du jour.
