Introduction
This analysis grew out of self-study. It was motivated by a curiosity to understand how various governance and management frameworks (e.g. COSO, COCO and others) are similar or different, and whether different frameworks might inform each other.
It is also the result of having conducted diverse performance and environmental audits of many federal and provincial government departments and agencies over a span of 40 years. Over that time I encountered numerous mantras, concepts and management models. Looking back, each had its own context, constituency and purpose, and each has had some lasting effect.
Overall, my personal view is that all of them point to the same conclusion: the best any entity can do is have a meaningful purpose, engaged people, and useful tools for achieving desired results and avoiding undesirable ones. The art to master is being in control without controlling. Among the ingredients, values and ethics are the most important. Above all – integrity. That and a great cup of coffee.
And, the only thing permanent is change.
But as Deming said: “It is not necessary
to change. Survival is not mandatory.”
Sources of
Information
Publicly available information has been used, as obtained from the Treasury Board of Canada website and from publications purchased from the American Society for Quality (i.e. ASQ/ANSI/ISO 9001:2015 and the book ISO 9001:2015 Explained, fourth edition, by Cianfrani and West).
Other materials have also been considered, such as those obtained from taking the BSI CQA audit leader course and from presentations about ISO 9001:2015 (i.e. a June 2015 presentation at the NAC sponsored by the Canadian Public Service Excellence Network and the Canadian General Standards Board, and the ASQ presentation “How will you audit a risk assessment?: key attribute for auditing ISO 9001:2015” by Angelo Scangas in January 2016).
Any observations and conclusions are
strictly my own.
Summary
This analysis was done using publicly
available information. An in-depth comparison of MAF and ISO 9001:2015 was not possible from the information available on
the TBS website. This argues for better
transparency for MAF. Apologies to those who
would have liked probing beyond public
information.
There is a parallel between the two models inasmuch as they are similar in overall purpose: both seek quality in management and strong-performing organizations, and both seek continuous improvement as well as greater efficiency and stakeholder satisfaction. The two are aligned in some key elements. However, MAF is not formally a quality management system (QMS) and there are notable differences between the two.
In my estimation, there is potential for ISO
9001:2015 to enhance MAF. There is also
likely value in fathoming the contrast between MAF “acceptable” performance ratings and the “shortcomings” reported by Auditors General of Canada and Chief Audit Executives of federal departments and agencies, as well as the recent non-payment of wages to thousands of public servants upon the introduction of a new federal payroll system.
The opportunity could be taken by TBS
(and/or other federal entities) to consider
QMS frameworks in any future development
and/or application of MAF. There is
potential for MAF to be more comprehensive
(robust). At the same time, MAF could
inform ISO 9001:2015 with respect to the
importance of values and accountability.
Also, it seems both could be enhanced by
addressing sustainable development as a
condition for strong organizational
performance.
Finally, the two organizations responsible
for these two frameworks seek to improve
them. And, perhaps more importantly, this little exercise illustrates that frameworks can inform each other. That leads to another curiosity: beyond their own borders and membership, how do various independent professional bodies/institutions seek to learn and adapt from each other?
About MAF
The Treasury Board of Canada is a Cabinet
committee of ministers. It oversees the
government’s financial, human resources and administrative responsibilities, and
establishes policies that govern each of
these areas. It has four main roles:
- Management Board
- Expenditure Management Board
- Employer (by managing compensation and labour relations)
- Regulatory oversight (to advise the Governor General on the approval of Governor in Council regulations and Orders in Council).
The Treasury Board Secretariat (TBS) is the
administrative arm of the Treasury Board
(TB). TBS supports the TB by making
recommendations and providing advice while
respecting the primary responsibility of
deputy heads in managing their
organizations, and their roles as accounting
officers before the Parliament of Canada (re
Federal Accountability Act).
MAF is a process in support of the role of
TB as the “government-wide” management board
of the federal government. It has been in
place for about 10 years and appears to
scope in (apply to) federal departments and
agencies as scheduled in the Financial
Administration Act. It operates on a three-year cycle.
MAF is represented as a “powerful tool” that
plays an important role in the improvement
of management practices in federal
departments and agencies. It is to identify
the key elements needed for sound management
and to ensure the federal public service
continues to focus on management excellence
and the delivery of effective programs and
services. The objectives of MAF are to:
- Obtain an organizational and government-wide view of the state of management practices and performance;
- Inform deputy ministers and heads of agencies about their organizations’ management capacity;
- Inform the TBS about the state of policy implementation and practices (compliance);
- Identify areas of management strength and any areas that require attention;
- Communicate and track progress on government-wide management priorities; and
- Continuously improve management capabilities, effectiveness and efficiency government-wide.
MAF is represented as a framework for a
well-managed organization. It sets out the
“conditions” that are required to achieve
strong organizational performance. These conditions/elements are represented in the MAF framework diagram and are described/defined as follows:
Public Sector Values:
Respect for people and democracy, serving
with integrity and demonstrating stewardship
and excellence.
Leadership and Strategic Direction:
Vision, mandate and strategic
priorities that guide the organization while
supporting policies, programs, and services
to Canadians.
Governance and Strategic
Management: Maintains effective
governance that integrates and aligns
priorities, plans, accountabilities and risk
management to ensure that internal
management functions support and enable high
performing policies, programs and services.
People Management:
Optimizes the work force and work
environment to enable high productivity and
performance, effective use of human
resources and increased employee engagement.
Financial and Asset Management:
Provides an effective and
sustainable financial management function
founded on sound internal controls, timely
and reliable reporting, and fairness and
transparency in the management of assets and
acquired services.
Information Management: Safeguards
and manages information and systems as a
public trust and a strategic asset that
supports effective decision-making and
efficient operations to maximize value in
the service to Canadians.
Management of Policy and Programs:
Designs and manages policies and
programs to ensure value for money in
achieving results.
Management of Service Delivery:
Delivers client-centred services
while optimizing partnerships and technology
to meet the needs of stakeholders.
Results and Accountability:
Uses performance results to ensure
accountability and drive ongoing
improvements and efficiencies to policies,
programs, and services to Canadians.
Continuous Learning and Innovation:
Manages through continuous
innovation and transformation, to promote
organizational learning and improve
performance.
MAF assessments provide observations on
where performance meets expectations on the
performance indicators that are reviewed,
and where there may be opportunity to
improve. MAF information is for use by
departmental managers to understand the
management capacity that exists in their
organizations and to identify areas that may
require attention. Assessments also give
deputy heads information to benchmark their
organizations’ performance.
MAF has evolved over time. In 2013-14 TBS
reviewed the assessment process and renewed
the MAF tool to ensure compliance with key
TB policies and directives. A new online
portal was also established for departments
to report into TBS. The framework also
changed. Previously MAF had 14 Areas of
Management. It now has seven.
The MAF assessment process sets out the
expectations of public sector managers and
deputy heads in specific “Areas of
Management” and measures organization
performance against expectations. Each Area
is said to represent key internal business
functions critical to strong performing
organizations.
There are seven Areas: four core and three optional (applied where entity operations align).
The four core Areas are:
- Financial Management
- Information Management and Information Technology Management (IM/IT)
- Management of Integrated Risk, Planning and Performance
- People Management
The three optional Areas are:
- Management of Acquired Services and Assets
- Security Management
- Service Management
These areas do not directly map or align
with the “conditions” (elements) of the MAF
framework. The seven areas are assumed to be encompassed within the framework and linked as part of the assessment process within TBS.
MAF assessment appears to be based on self-assessment by entities through use of
questionnaires developed by TBS functional
specialists with resulting information
reported to, and reviewed by, TBS. TBS
assessment and “ratings” are reviewed with
individual entities before being finalized.
According to the latest TBS Departmental
Performance Report (DPR), TBS seeks
continuous improvement in the “quality of government-wide public service management” (emphasis added).
Some performance indicators for TBS come
from MAF results. For example, TBS sets a performance target of 75% (actual: 100%) of federal organizations obtaining an “acceptable” MAF rating for citizen-focussed service, management of security, integrated risk management, and information and information technology. Another target of 80% (actual: 96%) is set for federal organizations obtaining an “acceptable” MAF rating for use of information for decision making.
The fact that MAF is referenced in DPRs indicates that MAF assessment matters to organizational performance, is thus likely a factor in evaluating the performance of senior management (including Deputy Ministers), and may affect performance pay. In which case, a lot probably goes on behind the scenes to ensure MAF assessment is complete and fair.
Further details for the purpose of this exercise could not be conveniently found regarding
MAF elements, methodology and criteria
(standards, conditions, expectations to be
met, performance measurement used and
reviewed etc.). After many search attempts,
further details could not be isolated about
sub-elements of the framework, criteria,
questionnaires used, and the
rating/assessment scale and how ratings are
calculated/derived. MAF web site
information is high level and diffuse when
“googled”.
Also, annual MAF assessment reports appear
not to be made public. However, MAF is oft
referenced in some way in a variety of
published reports by federal departments and
agencies (including but not limited to
Departmental Performance Reports, Reports on
Plans and Priorities, audit & evaluation
reports).
This limits comparison of MAF with ISO
9001:2015 (or other frameworks or models)
through use of publicly available
information. And what constitutes quality of, or in, public service management is not defined in specifics made public.
All the foregoing argues for improved transparency for MAF, in keeping with a current priority of the federal government.
A Conundrum
In doing this work, a contrast or
contradiction is noted; a puzzle if you
like. This is expressed as: How does one
reconcile high “acceptable” MAF ratings with
the frequent (and often serious)
observations reported by the Auditors
General of Canada and by Chief Audit
Executives (internal audit and program
evaluations conducted by individual
departments and agencies)?
And, how might acceptable ratings square
with the current problem of not paying
thousands of federal public service
employees after the introduction of a new
payroll system? Why did this happen? How is such a breakdown possible with a mature MAF process in place? Would it be reasonable to expect MAF to have prevented it, or at least to have seen such a risk elevated up the chain of command to Ministers of the Crown, including those of the Treasury Board? Maybe it was.
That said, significant problems with IT
projects are not that uncommon in private
and public sectors. Why so – what are the
root causes?
One could speculate the federal payroll
system problem had to do with something
amiss, not only in technical design, but
also in the change management or integrated
risk management process. Maybe there was a
miscalculation in deciding to move and
consolidate the payroll function and cut
payroll administrative staff before changing
over to the new Phoenix system and achieving
stability in new system performance. Maybe the project was rushed and the system not sufficiently tested, especially for its ability to handle non-routine pay transactions such as those for term or casual employees.
But I digress and speculate; a topic for another day. I am sure something will be learned from this event to inform the ongoing quality of public service management. Perhaps there is no better time for the “5 Whys” technique used in quality management and auditing. Hopefully there will be a full public accounting, given the impacts and the millions in additional costs to remedy the problem.
One could also ask how acceptable is
“acceptable” in MAF ratings; is there a
higher standard to be set and met?
Exploring such a conundrum would take much more time and effort than is possible here. But it would, in my opinion, be worth the effort to reconcile and understand the “why” in some depth so that the quality of public service management might be improved through MAF and/or by improving the MAF framework and process itself.
That said, MAF appears to be endeavouring to
focus increasingly on performance and
service standards. If so, there is
potential for QMS frameworks/models such as
ISO to inform MAF and/or the processes
within individual departments and agencies.
About ISO
9001:2015
The International Organization for
Standardization is a worldwide federation of
national standards bodies (members). ISO
9001 was first published in 1987 and has since been revised several times; the latest (fifth) edition is ISO 9001:2015. It replaces ISO 9001:2008, and certification under the 2008 standard will no longer be valid after September 2018.
A few general things regarding ISO
9001:2015:
- The Standard is based on the quality management principles described in ISO 9000 (QMS – fundamentals and vocabulary) and also relates to ISO 9004 (managing for the sustained success of an organization).
- Requirements are generic and intended to be applicable to any organization regardless of its type and size, or the products and services it provides. This would include governments.
- The Standard specifies requirements for a quality management system when an organization needs to demonstrate its ability to consistently provide products and services that meet customer and applicable statutory and regulatory requirements, and aims to enhance customer satisfaction through the effective application of the system, including processes for improvement of the system and the assurance of conformity to customer and applicable statutory and regulatory requirements.
- The adoption of a quality management system is a strategic decision for an organization that can help to improve its performance and provide a sound basis for sustainable development initiatives.
- The Standard can be used by internal or external parties.
- The Standard employs the process approach, which incorporates the classic Plan-Do-Check-Act (PDCA) cycle and risk-based thinking. The PDCA cycle enables an organization to ensure its processes are adequately resourced and managed, and that opportunities for improvement are determined and acted on.
- The Standard makes subtle but important changes to the previous (2008) edition and can be seen as a broader or more general management framework.
Quality Management Systems (QMS)
requirements are set out in 10 sections
(elements) and 67 clauses/sub-clauses (sub-elements). These are set out in Table 1.
Table 1 - Structure of ISO 9001:2015
1  Scope
2  Normative references
3  Terms and definitions – adopts ISO 9000:2015
4  Context of the organization
   4.1  Understanding the organization and its context
   4.2  Understanding the needs and expectations of interested parties
   4.3  Determining the scope of the quality management system
   4.4  Quality management system and its processes
5  Leadership
   5.1  Leadership and commitment
        5.1.1  General
        5.1.2  Customer focus
   5.2  Policy
        5.2.1  Establishing the quality policy
        5.2.2  Communicating the quality policy
   5.3  Organizational roles, responsibilities and authorities
6  Planning
   6.1  Actions to address risks and opportunities
   6.2  Quality objectives and planning to achieve them
   6.3  Planning of changes
7  Support
   7.1  Resources
        7.1.1  General
        7.1.2  People
        7.1.3  Infrastructure
        7.1.4  Environment for the operation of processes
        7.1.5  Monitoring and measuring resources
        7.1.6  Organizational knowledge
   7.2  Competence
   7.3  Awareness
   7.4  Communication
   7.5  Documented information
        7.5.1  General
        7.5.2  Creating and updating
        7.5.3  Control of documented information
8  Operation
   8.1  Operational planning and control
   8.2  Requirements for products and services
        8.2.1  Customer communication
        8.2.2  Determining the requirements for products and services
        8.2.3  Review of the requirements for products and services
        8.2.4  Changes to requirements for products and services
   8.3  Design and development of products and services
        8.3.1  General
        8.3.2  Design and development planning
        8.3.3  Design and development inputs
        8.3.4  Design and development controls
        8.3.5  Design and development outputs
        8.3.6  Design and development changes
   8.4  Control of externally provided processes, products and services
        8.4.1  General
        8.4.2  Type and extent of control
        8.4.3  Information for external providers
   8.5  Production and service provision
        8.5.1  Control of production and service provision
        8.5.2  Identification and traceability
        8.5.3  Property belonging to customers or external providers
        8.5.4  Preservation
        8.5.5  Post-delivery activities
        8.5.6  Control of changes
   8.6  Release of products and services
   8.7  Control of nonconforming outputs
9  Performance evaluation
   9.1  Monitoring, measurement, analysis and evaluation
        9.1.1  General
        9.1.2  Customer satisfaction
        9.1.3  Analysis and evaluation
   9.2  Internal audit
   9.3  Management review
        9.3.1  General
        9.3.2  Management review inputs
        9.3.3  Management review outputs
10 Improvement
   10.1  General
   10.2  Nonconformity and corrective action
   10.3  Continual improvement

This numbering system is used to map ISO 9001:2015 to MAF.
Comparing MAF With ISO 9001:2015
We now take a look at how MAF and ISO 9001:2015 align/compare.
Table 2 - Comparing MAF & ISO 9001:2015
(In the MAF Areas column, (C) denotes a core Area and (O) an optional Area.)

MAF Element: Public Sector Values
MAF Areas: –
ISO 9001:2015 Requirements: –
Notes: Values are not a distinct element in the ISO Standard.

MAF Element: Leadership and Strategic Direction
MAF Areas: Possible interrelationship with MAF areas below.
ISO 9001:2015 Requirements: 5 – Leadership.
Notes: Possible further link/alignment with ISO element 4 – Context of the organization, in particular understanding the needs and expectations of interested parties.

MAF Element: Governance and Strategic Management
MAF Areas: Possible interrelationship with MAF areas below.
ISO 9001:2015 Requirements: 6 – Planning.
Notes: Possible further link/alignment with ISO element 4 – Context of the organization, in particular the 4.1 requirement to determine external and internal issues relevant to an organization’s purpose and strategic direction and that affect its ability to achieve the intended result(s) of its quality management system. Not clear in MAF what level or type of planning is covered (strategic, operational, etc.). Not clear if and to what extent MAF includes/encompasses action to address both risks and opportunities similar to 6.1 of the ISO Standard (which also links to 4.1 and 4.2 of the Standard).

MAF Element: People Management
MAF Areas: (C) People Management.
ISO 9001:2015 Requirements: 7 – Support: 7.1.2 People.
Notes: –

MAF Element: Financial and Asset Management
MAF Areas: (C) Financial Management; (O) Management of Acquired Services (procurement) and Assets.
ISO 9001:2015 Requirements: 7 – Support: 7.1.3 Infrastructure; 7.1.5 Monitoring and measuring resources.
Notes: Possible link with ISO sub-element 5.3 Organizational roles, responsibilities and authorities.

MAF Element: Information Management
MAF Areas: (C) Information Management and Information Technology Management (IM/IT).
ISO 9001:2015 Requirements: 7.5 Documented information.
Notes: IM/IT is not a named element or subject/terminology used in the ISO Standard. However, the “Support” requirements under 7.5 on documented information in effect cover IM/IT by stating requirements for creating, updating and controlling information required by the QMS. Also, 5.1.1 General leadership requirements, item e), requires top management to ensure resources for the QMS are available; this presumably would or could encompass IM/IT.

MAF Element: Management of Policy and Programs
MAF Areas: Possible relationship with the MAF core area starred below*.
ISO 9001:2015 Requirements: 5.2 Policy; 5.3 Organizational roles, responsibilities and authorities.
Notes: –

MAF Element: Management of Service Delivery
MAF Areas: (O) Service Management.
ISO 9001:2015 Requirements: 8 – Operation.
Notes: Not able to tell if elements align in particular detail and to what level. Possible that all of ISO section 8 (the largest section) requirements would or could be relevant to this MAF element. If the MAF element of Management of Service Delivery happens to include the quality management system (QMS) of a federal government organization subject to MAF, then all of ISO 9001:2015 could be said to be relevant and potentially applicable.

MAF Element: Results and Accountability
MAF Areas: Possible relationship with the MAF core area starred below*.
ISO 9001:2015 Requirements: 4.4.1 item c) – the organization shall determine and apply the criteria and methods (including monitoring, measurements and related performance indicators) needed to ensure the effective operation and control of processes; 7.1.5 Monitoring and measuring resources; 9 – Performance evaluation.
Notes: Accountability is not a named element or item in the ISO Standard. Unlike ISO Standard element 9 (clauses 9.1 and 9.2), MAF does not specify monitoring in its elements or areas of management. Internal audit and program evaluation are not mentioned but are perhaps included as part of the MAF assessment process within and between TBS and individual departments and agencies.

MAF Element: Continuous Learning and Innovation
MAF Areas: –
ISO 9001:2015 Requirements: 4.4.1 item g) – the organization shall evaluate QMS processes and implement any changes needed to ensure that these processes achieve intended results; 4.4.1 item h) – shall improve the processes and the quality management system; 10 – Improvement.
Notes: Uncertain how lessons are learned and fed back (looped) into the MAF process as part of continuous improvement (similar to the quality PDCA cycle model).

MAF Element: –
MAF Areas: *(C) Management of Integrated Risk, Planning and Performance.
ISO 9001:2015 Requirements: 6 – Planning; 9 – Performance evaluation.
Notes: Not clear if and to what extent MAF covers or explicitly requires action to address risks and opportunities similar to 6.1 of the ISO Standard. Possible that MAF assesses use of performance information by departments to identify risks and establish priorities.

MAF Element: –
MAF Areas: (O) Security Management.
ISO 9001:2015 Requirements: –
Notes: Security is not a distinct subject in the ISO Standard. However, 5.1.1 General leadership requirements, item e), requires top management to ensure resources for the QMS are available; this presumably would or could encompass security.
Analysis
From the research and preceding comparison
it is observed:
- An in-depth comparison of MAF and ISO 9001:2015 is not possible from publicly available information. This indicates potential for improved transparency for MAF.
- There is a parallel between the two inasmuch as they both seek quality in management and strong-performing organizations. Their purposes are similar. Both seek continuous improvement as well as greater efficiency and stakeholder satisfaction.
- The two align in some elements, and cross-over links are possible.
- While there are parallels, it is up for discussion whether MAF can be considered a quality-driven system (one that is focussed on the quality of products and services and the satisfaction of the customer/client).
- MAF speaks to quality in a general
way but is not formally a quality
management system. It does not speak to
operations and processes in the same way
ISO does. It mentions internal
controls but does not speak to “control” and controls in the same way or to the same extent as ISO. Perhaps this is because TBS sees operational control as the exclusive domain of individual departments and agencies, as part of “respecting the primary responsibility of deputy heads in managing their organizations, and their roles as accounting officers before the Parliament of Canada”. This would be a challenging tension to manage – the balance of being responsible and accountable for the entity as a whole while having hundreds of highly autonomous and independent parts.
A complex orchestration is to be
appreciated. This may be a factor in the
Phoenix pay implementation problems.
- If the MAF element of Management of
Service Delivery includes the QMS of
federal entities, then all of ISO
9001:2015 could be said applicable or
useful to MAF.
- It is not clear whether MAF looks for and assesses the QMS of the entities subject to assessment, or whether it factors in whether or not a federal entity has been ISO certified in a way relevant to its operations.
- Unlike ISO Standard element 9, MAF
does not specify monitoring in its
elements or areas of management.
Internal audit or program evaluation are
not specifically named but perhaps are
included as part of the MAF assessment
process within and between TBS and
individual departments and agencies. At
the same time, the MAF assessment
process itself is a monitoring tool of
TBS.
- On its face, it is uncertain how lessons are learned and fed back (looped) into the MAF process as part of continuous improvement (similar to the quality PDCA cycle model). TBS (and/or individual entities) may well do something like this as part of the continuous learning and innovation rubric of MAF or as part of the MAF assessment process. From a
quality management perspective, it would
be important to know how well
departments set service standards
(internal and external), address
performance results and client feedback.
Perhaps there is opportunity for
improving service metrics.
- It is not clear/evident if and to
what extent MAF explicitly requires
action by federal departments and
agencies to address both risks and
opportunities similar to 6.1 of the ISO
Standard. It is possible, however,
that MAF assesses use of performance
information by departments to identify
risks and establish priorities (part of
use of information for decision making).
And there are indications that individual departments may take MAF assessment into account as part of their own enterprise-wide/integrated risk management system.
- It appears MAF could inform ISO
9001:2015 with respect to values and
accountability.
- Finally, it is not clear how MAF
takes into consideration sustainable
development. At the same time, while ISO 9001:2015 mentions sustainable development, it has no particular details or requirements either. ISO has other standards relating to sustainable development. Perhaps the federal government has similar instruments; in any case, MAF could integrate sustainable development into the requirements/conditions for strong organizational performance.
Conclusion
An in-depth comparison of MAF and ISO
9001:2015 was not possible from publicly
available information. This argues for
improved transparency for MAF.
There is a parallel between the two frameworks inasmuch as they both seek quality in management and strong-performing
organizations. Both seek continuous
improvement as well as greater efficiencies
and stakeholder satisfaction. The two are
aligned in some key elements. However, MAF
is not formally a quality management system
(QMS) and there are notable differences
between the two.
There is potential for ISO 9001:2015 to
enhance MAF. At the same time, MAF could
inform ISO 9001:2015 with respect to the
importance of values and accountability.
Also, both could be enhanced by addressing
sustainable development in a more concrete
and explicit way.
Finally, the two organizations responsible
for these two frameworks seek to
continuously improve them. And, perhaps
most importantly, frameworks can inform each
other.
End Note: Survival is not mandatory
As an extension of the foregoing, accounting for quality or results, at what price and to whom, is central to serving the common good. An
emerging global shift is towards socially
responsible and accountable enterprise where
profit is not the singular imperative for
measuring success and value of an
organization. Social Responsibility and
Sustainability are key to the future. This
is where governments and not-for-profits can
inform private enterprise and lead the
way. Again, another topic for another day.
A big thank you to my colleagues and friends
who took the time to comment on this paper. |