Recent updates to NPR 8735.2, Hardware Quality Assurance Program Requirements for Programs and Projects, consolidate Office of Safety and Mission Assurance (OSMA) policy into a hierarchy, make life cycle Quality Assurance (QA) more visible without increasing prescription, and present approaches and solutions for a set of mission assurance objectives.
“A clean sheet of paper was used to write NPR 8735.2 revision C — it’s a complete rewrite,” said Jeannette Plante, NASA Quality program executive. “The prior version of the NPR [NASA Procedural Requirements] was a pretty prescriptive guide for oversight of external mission hardware suppliers. Much of the content derived from work QA would do to satisfy Federal Acquisition Regulation Part 46, which is essentially second-party Quality oversight for contracts. The real meat of the Quality policy was in the NPD [NASA Policy Directive]. This content was moved into the NPR.”
The major changes address challenges of the previous revision, including a life-cycle view so high level that programs and projects couldn’t clearly see what was available in the QA toolbox, especially in early project phases. Without proactive measures early in the life cycle, a lot of emphasis ended up attached to inspecting-in Quality, which increases programmatic risk in the form of schedule delays and technical risk in the form of unknown or reduced reliability. At later stages, the risk mitigation options are reduced to replace, rework, repair or use-as-is. In addition, NASA’s hybrid development/production environment increases the exposure to escapes in design-for-manufacturing, supplier risk and manufacturing readiness. By making the QA life cycle clearer in revision C, projects can be more proactive and less exposed to defects that may need to be accepted to preserve schedule.
Read the following Q&A for a more in-depth look at the changes in NPR 8735.2C:
How is the new NPR structured and does it place more emphasis on processes earlier in the life cycle?
The following changes reflect the new structure:
- The organization of the document now mimics the Systems Engineering V life cycle found in the Phase A through Phase E structure built into NPR 7120.5, NASA Space Flight Program and Project Management Requirements.
- The document steps through the life cycle in 10 domains. An 11th domain is Risk Management, which is parallel for all phases of the life cycle.
- The requirement that each program or project must have a formally defined and managed QA program is continued in the new revision for NPR 7120.5 programs and projects. It is recognized as optional for NPR 7120.8, NASA Research and Technology Program and Project Management Requirements efforts.
- Phase A activities include strategizing and planning, including high-level procurement approaches and how criticality management provides QA its focus. Emphasis is placed on the collaborative work QA does with the procurement activity to define and review both the contract clauses and the applicable technical standards that will be used to satisfy the “higher-level requirements” mandate in the Federal Acquisition Regulation (FAR).
- Requirements for design for manufacturability, leveraging of manufacturing and Quality technical standards, and Supply Chain Risk Management (SCRM) are used in Phase B, when many lower-integration products are manufactured.
- In-plant assessments of manufacturing readiness, including Quality Management System (QMS) robustness, are used in Phases B and C.
- Phases B and C are where production surveillance is performed. This is the best-known QA program activity and, for many types of products, is critical for capturing objective evidence of product conformance. Second-party oversight of suppliers is part of this work.
- Product acceptance is placed after product assurance because many NASA projects perform Integration and Test (I&T) operations in-house. Product acceptance is a collaboration between QA and the procurement activity.
- Requirements are defined for assuring that I&T procedures, whether by design or by execution, do not change the conforming status of the accepted hardware and are followed as intended at the end of Phase C and into Phase D. The same is done for launch preparations, launch and mission operations.
- The Risk Management requirements describe how programs and projects capture and address nonconformances and risks throughout the life cycle.
Previously, these domains were addressed primarily in the NPD, so the language was pretty high level and not in an order that walked through how QA plays out over time on a project. Also, the NPD-type language created openings to apply it beyond hardware used in an NPR 7120.5 mission. Without any Technical Authority (TA) structure, it’s really not possible to formally establish and manage a QA program. The new version invites NPR 7120.8 or other users to take advantage of the “tools in the toolbox” but doesn’t require those types of projects to stand up and execute a formal QA program.
Lastly, the prior policy had a strong undercurrent of attention on achieving safety objectives. That is not a bad thing, but mission assurance is concerned with more than crew safety. For example, as shown by some of NASA’s flagship robotic missions, Quality problems can threaten mission success through cost and schedule impacts. For some Class D projects, the rubric for determining criticality is very different from that used for a human-rated mission. This is the key to successful requirements tailoring.
What is the difference between the formal QA program and the project’s QA function?
On the one hand, QA thrives when it’s standardized. It provides good predictability, it can be iterated for continuous improvement or to accommodate lessons learned, and users benefit from economies of scale. Most centers have built organizations, processes and even infrastructure — like a centerwide Problem Report/Problem Failure Report database — that create a repeatable approach to QA. The QMS provides an organizing platform for this. Projects don’t have to think much about how QA is done when, for every mission, the center can just apply these repeatable systems. In this case, QA might look more like a retail service provided by the Safety and Mission Assurance (SMA) Directorate, and projects could view this as a QA function.
On the other hand, this model has downsides. It is likely to be intolerant of tailoring or customization. Tension arises between the authorities and responsibilities of the QMS managers, who seek standardization, and the project’s risk managers, who seek risk-informed optimization. The project manager is ultimately responsible for mission success and cannot delegate QA risk acceptance. This is why the policy expects the project manager to establish and “own” the QA activities from cradle to grave. The articulation of how QA will be done for a particular mission is the QA Plan, and every activity to prepare and execute that plan makes up the QA program. The QA Plan should be finalized as part of the SMA Plan by the Systems Requirements Review (SRR).
NASA still wants to leverage the center’s QMS to the greatest extent possible to realize the benefits named earlier. The policy continues to require centers to sustain a QMS to NASA’s adopted standard, AS9100, Quality Systems - Aerospace - Model for Quality Assurance in Design, Development, Production, Installation and Servicing, and uses a control gate via the SMA director’s QA Plan review that confirms that discontinuities between the project’s QA Plan and the center’s QMS are known and managed. This is expected to be integrated with delivery of the QA Plan at the SRR.
If the project manager holds risk authority for QA, then what is the SMA TA’s role when the QA Plan heavily tailors the requirements in the policy?
This is a topic of great interest to QA personnel. They might wonder if projects can change industry-wide, NASA-adopted Quality criteria or how a project can choose to tailor requirements that are baselined in the NASA Quality policy. They see OSMA and its local SMA Directorates as the seat of QA expertise and the best authority for how QA is used for mission success.
This is both true and false. The new policy recognizes the authority of the Subject Matter Experts (SME) to create the tools used by the QA discipline. These experts will be in Engineering and SMA Directorates. But which tools will be used, and in what measure, to meet mission success objectives is ultimately the project’s decision. QA experts should match a QA program strategy with the project’s mission objectives and risk posture.
The SMA TA is expected to provide oversight to ensure all SMA-related risk-based decisions continue to support mission success. These distinctions get difficult to see when the SMA TA, or Chief SMA Officer (CSO), is also performing the duties of the SMA team manager (aka “dual hat”). The project’s QA leadership should make a case for their risk-informed recommendation and coordinate that with both project management and the SMA TA (CSO). Both the CSO and the QA personnel can formally dissent if a project seeks to take a risk to which QA has a serious objection. The formal dissent process facilitates a deep dive into the concern, ensuring the risk case is compelling and determining whether higher-level review is necessary. The new revision points to other OSMA policies for understanding how tailoring and risk acceptance are managed, as these policies and processes are not unique to the QA discipline.
The dual-hatted CSO model can be difficult to avoid for resource-constrained missions; however, it is not recommended, except in the most risk-tolerant cases, that CSOs also act as the QA SME lead or that an SMA manager without background in QA act as the QA lead. NPR 8735.2C is written with the assumption that a QA SME is used for QA program leadership throughout the life of the project. This personnel management decision should be reported at the SRR.
The QA lead is expected to have a deep understanding of the tools in the QA toolbox, the order and manner in which they are intended to be used, their cost and benefit to the project, the risks associated with using or not using one or some of those tools, how QA program data accumulates, and the leading indicators of QA program success or emerging problems.
It is often counterintuitive that with more risk tolerance, more expertise is needed to craft an effective QA program strategy. Investment in criticality assessment can help less experienced QA teams put their resources in the right places.
It sounds like some of the hardware will be critical and some will not. How can some of the hardware not be critical to mission success?
The notion of criticality assumes that not everything can be treated with the same high level of scrutiny; for QA, it’s about hardware items and processes. It’s easy to understand that if it fails and an astronaut or pilot is injured, that hardware should be considered critical. But what if the mission doesn’t carry any crew? Does that mean nothing is critical for the mission’s success?
The answer is no: there are other categories of mission success objectives in addition to protecting a flight crew. These include satisfying regulations, ranging from worker protection to procurement rules to crime reporting (e.g., counterfeits, fraud) to treaty compliance for Planetary Protection. Programmatic objectives relate to meeting cost and schedule commitments. Other stakeholders’ objectives include not harming their assets, such as commercial launch facilities or the International Space Station. A discussion of the types of mission success objectives that drive criticality is in the preface section of the updated NPR.
The QA program strategy will need to respond to the project’s view of what is critical, how that is determined and what type of objective is threatened. The type of objective and the risk owner for that type (e.g., crew themselves, project managers, Office of Procurement Assistant Administrator) will influence risk mitigation approaches.
The Quality policy has always recognized that QA program elements are responsive to hardware and process criticality. The new policy promotes formalizing how criticality will be determined and making sure that information finds its way to the QA team so that it can strategize and plan accordingly. Understanding and communicating criticality is fundamental to effective QA program tailoring.
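The flow described above, from the type of mission success objective an item threatens to the QA emphasis it receives, can be sketched as a simple mapping. This is an illustrative sketch only: the objective categories, criticality labels and emphasis levels below are assumptions for demonstration, not definitions taken from NPR 8735.2C.

```python
# Illustrative sketch only: the categories and mappings below are assumptions
# for demonstration, not definitions taken from NPR 8735.2C.
CRITICALITY_BY_OBJECTIVE = {
    "crew_safety": "critical",
    "stakeholder_assets": "critical",     # e.g., ISS, commercial launch facilities
    "regulatory_compliance": "critical",  # e.g., Planetary Protection treaties
    "cost_and_schedule": "significant",   # programmatic objectives
}

def qa_emphasis(threatened_objectives: list[str]) -> str:
    """Map the objectives a hardware item threatens to a QA emphasis level."""
    levels = {CRITICALITY_BY_OBJECTIVE.get(o, "low") for o in threatened_objectives}
    if "critical" in levels:
        return "full surveillance and inspection"
    if "significant" in levels:
        return "targeted surveillance"
    return "minimal oversight"
```

In practice the determination is richer than a lookup table, but the point stands: criticality must be formally determined and communicated so the QA team can allocate its resources accordingly.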
Beyond engaging a QA SME to put together the QA program plan and factor in criticality, what QA activities occur during program or project formulation?
Acquisition strategy is a significant activity that involves QA during project formulation. If a prime contractor will be engaged for major hardware deliveries (or in the case of commercial vehicle services), the QA lead should be engaged with the contract officer(s) to understand the acquisition strategies and how they will drive the project’s regulatory commitments with respect to contract Quality oversight.
QA will need to ensure that certain FAR and NASA FAR Supplement (NFS) clauses are included in the contract. Some of those clauses will need considerable additional text. While NASA’s use of technical standards in mission hardware contracts is robust, significant gaps have been found related to a lack of effective QA clauses that communicate full life cycle QA program needs.
The policy requires some minimum supplier risk screening, via existing database records, prior to letting a contract. If there is no recent history with the supplier, an audit, assessment or survey is required to acquire the needed insight. This requirement was a response to an Inspector General finding and should not be tailored out. The Office of Procurement is also pursuing new supplier vetting procedures to address supplier risk from a fiduciary perspective.
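The screening requirement above amounts to a decision rule: recent history in existing database records satisfies the screen; without it, an audit, assessment or survey is required before letting the contract. A minimal sketch of that rule follows; the record fields and the freshness threshold are assumptions for illustration, not values from the NPR.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class SupplierRecord:
    name: str
    last_assessment: Optional[date]  # most recent audit, assessment or survey on file

def screening_action(record: SupplierRecord, max_age_days: int = 3 * 365) -> str:
    """Return the minimum screening step required before letting a contract.

    Hypothetical rule: recent history in the database satisfies the screen;
    without it, an audit, assessment or survey is required. The three-year
    freshness window is an assumed value, not taken from the policy.
    """
    if record.last_assessment is None:
        return "audit, assessment or survey required"
    if (date.today() - record.last_assessment).days > max_age_days:
        return "audit, assessment or survey required"
    return "database records review sufficient"
```

A supplier with no record, or only a stale one, triggers the on-site insight-gathering step the policy calls for.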
The new policy sustains the existing position that the QA program should acquire data and analyze that data as a basis for decision-making. This is also a contemporary priority for NASA as a whole. The project will have to leverage existing data management systems and may need to augment them by creating its own to manage the Quality data. QA metrics are expected to be reportable throughout the life of the project, for self-awareness at both the project and NASA Headquarters levels.
The project leadership will need a staffing and budget plan that supports execution of the QA plan. This budget and staffing plan should be reported initially at the SRR and updated throughout the project life cycle, as needs for QA may vary if issues with a technology, with quality defects or with a supplier exceed prior forecasts.
The new NPR 8705.4, Risk Classification for NASA Payloads, provides tailoring guidance based on mission risk class for robotic missions. OSMA provides an abbreviated template in that NPR that the QA SME can use to walk through the various strategy and tailoring decisions that result in a finished QA program plan. A QA Plan-generating tool, based on NPR 8735.2C with consideration of mission risk class, is in development.
Beyond a new emphasis on early life-cycle QA Program activities, are there other significant changes to the policy in NPR 8735.2C?
As noted previously, the coverage of the domains of the QA discipline remains the same, though the articulation of the activities is fleshed out to a greater degree. Previously, QA supplier surveillance was a singular concept: ensure that the supplier is meeting requirements. In revision C, a differentiation is made between production readiness and product verification. More detail is provided for QA activities that prevent problems rather than relying on finding defects. A section is dedicated to the processes NASA uses to ensure QA inspectors receive robust QA leadership when planning and executing their surveillance tasks, so that their work does not create inefficiencies in the production workflow and so that the volume of government contract Quality Assurance work is in line with criticality management strategies, is responsive to SCRM intelligence, and reflects the project’s risk tolerance. A new chapter reflects the most recent protocol agreements and budget processes NASA negotiated with the Defense Contract Management Agency, which performs QA surveillance on NASA’s behalf for some contracts. NPR 8735.2C is current with changes to the FAR and NFS that occurred in 2019 and 2020 related to counterfeit avoidance and controls.
Questions about the policy update can be directed to Plante.