The document discusses an agile governance framework for large-scale agile projects at a government agency. It covers measuring the right metrics like balanced scorecards; avoiding common failures like using velocity for long-range forecasting; adopting better practices like Kanban for dependency management and Monte Carlo simulations for forecasting; and maintaining a repository of reference data on past projects. The framework provides controls, techniques, and procedures for auditing agile projects in a way that reduces risk for large software development efforts.
DHS Agile Governance Framework Provides Oversight at Scale
1. DHS Agile Center of Excellence (COE)
February 5, 2020
Agile Governance at Scale
Craeg K. Strong, CTO, Savant Financial Technologies
d/b/a Ariel Partners
2. Agenda
About me
Context
Genesis of the effort
What Is Governance
Traditional vs Agile Governance
Agile Audit Framework
Usage
Structure
Agile Governance
Measure the Right Things
Avoid Common Failure Modes
Adopt Better Practices
Keep Track
Key Takeaways
Q & A
3. Craeg Strong Software Development since 1988
Large Commercial & Government Projects
Agile Coach / DevOps Engineer
Kanban Trainer / SpecFlow Trainer
Performance & Scalability Architect
Certified Ethical Hacker
New York & Washington DC Area
CTO, Ariel Partners
AKT, KCP, KMP, CSM, CSP, CSPO,
ITILv3, PMI-ACP, PMP, LeSS, SAFe
www.arielpartners.com
cstrong@arielpartners.com
@ckstrong1
6. Genesis of this Effort
Creating Agile Auditing Framework for US Agency
Agency Context
Lots of Oversight, including House Ways and Means
Large $100M+ Agile Efforts
Relatively New to Agile
Large, Diverse Group of Stakeholders
All 50 states
Significant Legacy Component
7. Governance and Oversight: Audit Effectiveness
Progress and Status
Legislative, Regulatory
Organizational Goals
Program Value
Procurement Practices
Costs and Results
Productivity Increases
Program Performance, Measures and Effectiveness
Access & Distribution of Public Resources
Reliable Budgeting
Cost Decreases
Duplication, Overlap and Conflict with Other Programs
8. Agile Governance and Oversight
Scheduling and Budgeting Challenges
Lack of Detailed Plans Up-Front
Quality Assurance Challenges
Reduced Emphasis on Documentation
Measurement Challenges
Lack of Traditional Metrics such as Earned Value
Unfamiliar, Subjective Metrics such as “Story Points”
Management Challenges
Plethora of Agile methods and practices
Diversity of Approaches
Conflicting advice
Rapidly evolving ecosystem
Traditional sources such as PMBOK have not kept pace
Why is oversight of an Agile project more difficult?
9. Traditional Versus Agile
Traditional | Agile
1. Planning with Accuracy and Precision | Planning with Accuracy and Adaptability
2. Predictive: Forecasting via Estimation | Empirical: Forecasting via Statistics and Probability
3. Adherence to Plan | Flexibility: Welcome Changes/Clarifications
4. Up-Front Requirements Gathering, Baselining | Direct, Continuous Customer Involvement
5. Documentation First | Automation First
6. Measure Major Milestones | Measure Continuous Flow of Value
7. Handoffs Between Defined Roles and Duties | Collaboration Within and Across Teams, Cross-Training
8. Post-Mortem Lessons Learned | Inspection and Adaptation via Continual Retrospectives
9. Comprehensive Analysis, Design, Documentation | Lean Analysis, Design, Documentation
10. If something is risky and difficult, measure twice, cut once. Make sure to get it right the first time! | If something is risky and difficult, then do it constantly: constant integration, constant refactoring, etc.
What makes agile so different?
11. Sources
Government Auditing Standards (“Yellow Book”)
Effective Practices and Federal Challenges in Applying Agile Methods
Technology Assessment Design Handbook
Organizational Transformation: A Framework for Assessing and Improving Enterprise Architecture
A Framework for Assessing and Improving Process Maturity
10+ Congressional Reports
Digital Services Playbook
Management and Oversight of Federal Information Technology (FITARA)
TechFAR Handbook
Biannual FITARA Scorecard
12. Sources
Agile Development & Delivery for Information Technology Instruction Manual
PMBOK 6th Edition Agile Practice Guide
CMMI v2.0 Model At-A-Glance
How to Truly Scale Agile Development In the Enterprise – with CMMI
Using Agile with Scrum and CMMI
DIB Guide: Detecting Agile BS
DIB Ten Commandments of Software
DIB Metrics for Software Development
14. Control Areas & Critical Elements
Agile Methods and Frameworks (AMF) Controls
Critical Element AMF-1. Ensure Project Team uses an Appropriate Team-Level Agile Method
Critical Element AMF-2. Implement Effective Team-Level Agile Controls
Critical Element AMF-3. Ensure Project Team Uses an Appropriate Scaled Agile Method
Critical Element AMF-4. Implement Effective Scaled Agile Controls
Agile Requirements (AR) Controls
Critical Element AR-1. Ensure Project Team Has a Documented Vision and Overall Strategy
Critical Element AR-2. Implement Effective High-Level Scope Controls
Critical Element AR-3. Implement Effective Roadmap Controls
Critical Element AR-4. Implement Effective Requirements Elaboration & Maintenance Controls
Critical Element AR-5. Implement Effective Scope Management & Reduction Controls
Forecasting, Scheduling, & Planning (FSP) Controls
Critical Element FSP-1. Ensure Project Team Establishes Initial Project Forecast
Critical Element FSP-2. Implement Effective Progress Tracking Controls
…
15. Control Activities, Control Techniques, Audit Procedures
Control Activities → Control Techniques → Audit Procedures

Control Activity AR-1.1: Project has a documented vision

AR-1.1.1 Overall Objectives and Goals for the Project Have Been Documented
Project Vision artifact exists
Appropriate level of detail
All team members are aware
Evidence of being actively maintained

AR-1.1.2 Business Drivers for Project Have Been Documented
Vision describes business drivers, goals, and objectives
Goals include reasoning and justification
Goals are ordered
Target dates and cost, any budgetary or time considerations
Evidence of trade-off decisions informed by drivers

AR-1.1.3 High Level Functions for Project Have Been Documented
Vision includes high-level business functions with context
Functions out of scope are listed
Reasonable cardinality (dozens / hundreds, not thousands)

AR-1.1.4 Technical & Business Constraints Have Been Documented
Major integrations, platform requirements, standards listed
Business constraints are listed (e.g. data center, place of performance)
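Control techniques like AR-1.1.1 through AR-1.1.4 lend themselves to partial automation in a compliance-as-code style. The sketch below shows what such a check might look like; the vision record and its field names are hypothetical, since real evidence would come from the team's wiki or ALM tool:

```python
def audit_vision(vision):
    """Run simplified AR-1.1 audit procedures against a project-vision record.

    `vision` is a hypothetical dict; each missing element yields a finding.
    """
    findings = []
    if not vision.get("objectives"):
        findings.append("AR-1.1.1: no documented objectives/goals")
    if not vision.get("business_drivers"):
        findings.append("AR-1.1.2: business drivers not documented")
    functions = vision.get("functions", [])
    if not functions:
        findings.append("AR-1.1.3: high-level functions missing")
    elif len(functions) > 1000:
        findings.append("AR-1.1.3: unreasonable cardinality (thousands of functions)")
    if not vision.get("constraints"):
        findings.append("AR-1.1.4: technical/business constraints not documented")
    return findings

vision = {"objectives": ["Modernize case intake"], "functions": ["Intake", "Review"]}
print(audit_vision(vision))
# Missing drivers and constraints are flagged; objectives and functions pass
```

An empty findings list means the AR-1.1 evidence is at least present; human auditors still judge its quality.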
17. Highlights
1. Measure the Right Things
Broken Windows Strategy
Balanced Metrics
Human Centered Design
Testable Architecture
2. Avoid Common Failure Modes
Agile Methodology Failures
Using Velocity for Long-Range Forecasting
3. Adopt Better Practices
Kanban Flight Levels for Dependency Coordination
Monte Carlo Simulation Based Forecasts
4. Keep Track
We Need a High-Quality Repository of Reference Data
19. Broken Windows Strategy
Code Does Not Meet Style Guide | Linting / Code Static Analysis
Huge Product Backlog | Query: Find Old & Untouched Stories
Orphan / Unlinked Items in ALM Tool | Query: Find Badguys (“Lint” for ALM)
ALM Has Poor Usability / Friction | Count # Clicks, # Defaultable Items without Default Value
Rubberstamp Reviews | Count Short / Blank Peer Reviews
Duplicative / Copy Paste Code | Detect Duplicates, Watch for Trend of Code Size to Flatten
Sweat the Small Stuff
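The "find old & untouched stories" query above can be sketched in a few lines of Python; the story fields and the 180-day threshold are illustrative assumptions, since a real check would go through the ALM tool's API:

```python
from datetime import datetime, timedelta

def find_stale_stories(stories, max_age_days=180, today=None):
    """Flag backlog items untouched for longer than max_age_days.

    `stories` is assumed to be a list of dicts with 'id', 'status',
    and 'updated' (datetime) fields -- a simplified stand-in for
    whatever the ALM tool's API returns.
    """
    today = today or datetime.now()
    cutoff = today - timedelta(days=max_age_days)
    return [s["id"] for s in stories
            if s["status"] == "Backlog" and s["updated"] < cutoff]

stories = [
    {"id": "ST-1", "status": "Backlog", "updated": datetime(2019, 1, 10)},
    {"id": "ST-2", "status": "Backlog", "updated": datetime(2020, 1, 20)},
    {"id": "ST-3", "status": "Done",    "updated": datetime(2018, 6, 1)},
]
print(find_stale_stories(stories, today=datetime(2020, 2, 5)))  # ['ST-1']
```

Run periodically, a query like this keeps the backlog small enough to stay meaningful, which is the point of sweating the small stuff.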
20. Balanced Metrics
Sprint Velocity & Sprint Story Delivery Count | Undelivered Story Points & Undelivered Story Count
Average Throughput | Average Lead Time
Number of Unit Tests | Build Time
Code Coverage | Functional Coverage
Deployment Frequency | Failed Deployment Down-Time
Balance Armor & Mobility
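As a rough illustration of pairing a speed metric with its balancing counterpart, the sketch below computes average throughput alongside average lead time from the same set of completed work items (the dates are made up):

```python
from datetime import date

# Hypothetical completed work items: (started, finished)
items = [
    (date(2020, 1, 6),  date(2020, 1, 10)),
    (date(2020, 1, 7),  date(2020, 1, 24)),
    (date(2020, 1, 13), date(2020, 1, 17)),
    (date(2020, 1, 20), date(2020, 1, 31)),
]

weeks = 4
throughput = len(items) / weeks                      # items finished per week
lead_times = [(done - start).days for start, done in items]
avg_lead_time = sum(lead_times) / len(lead_times)    # days from start to done

print(f"Throughput: {throughput:.1f}/week, avg lead time: {avg_lead_time:.1f} days")
```

Reporting the two together resists gaming: pushing throughput up by starting everything at once shows up immediately as longer lead times.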
21. Human Centered Design
How Many Team Members Have Participated in In-Context Immersion?
Know Your Customer
22. Testable Architecture: APIs
Testability Brings
Instrumentation
Scalability
Pluggability
Performance Tunability
Tests-as-Documentation
The More Testable an Architecture Is, The Better It Is
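One way to see how testability and pluggability go together: if the architecture exposes its integrations through an interface, a test double can plug in where a heavyweight backend would, and the test itself documents the API contract. A minimal Python sketch (all names are hypothetical):

```python
from __future__ import annotations
from typing import Protocol

class CaseRepository(Protocol):
    """Port: any storage backend the application talks through."""
    def find_open_cases(self) -> list[str]: ...

class InMemoryCaseRepository:
    """Test double -- plugs in where a real mainframe adapter would."""
    def __init__(self, cases):
        self._cases = cases
    def find_open_cases(self):
        return [cid for cid, status in self._cases.items() if status == "open"]

def count_open_cases(repo: CaseRepository) -> int:
    return len(repo.find_open_cases())

# The test doubles as documentation of the API contract:
repo = InMemoryCaseRepository({"C-1": "open", "C-2": "closed", "C-3": "open"})
assert count_open_cases(repo) == 2
```

The same seam that makes the fake possible is what later enables instrumentation and performance tuning without touching business logic.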
28. Dependency Handling The Hard Way
Opportunity Cost of Large Meeting
Tough to Detect All Dependencies Up Front
Significant Planning & Management Overhead
Method One: Giant Up-Front Meeting
29. Easier Dependency Handling: Kanban Coordination Board
[Kanban coordination board: three team boards (Modern Mainframers, Backend Beatniks, Mobile Mod Squad), each with Backlog, Analysis, Dev, Test, (Deploy,) and Done columns under explicit WIP limits, feeding a shared Next Up / In Progress / Integration Test / Done swimlane]
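The explicit WIP limits on each column are what make this pull-based coordination work: a team cannot start new work until it finishes (or unblocks) what is already in flight. A minimal sketch of enforcing one, with an illustrative Column class:

```python
class Column:
    """One column on a Kanban board with an explicit WIP limit."""
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.cards = []

    def pull(self, card):
        """Pull a card in, refusing if the column is at its WIP limit."""
        if len(self.cards) >= self.wip_limit:
            raise RuntimeError(
                f"WIP limit {self.wip_limit} reached in '{self.name}' -- "
                "finish existing work (or swarm on blockers) before pulling more")
        self.cards.append(card)

dev = Column("Dev", wip_limit=2)
dev.pull("ST-1")
dev.pull("ST-2")
try:
    dev.pull("ST-3")   # third pull is refused
except RuntimeError as e:
    print(e)
```

In practice the limit is enforced by team policy or ALM tool configuration rather than code, but the rule is exactly this simple.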
30. Forecasting
Use Reference Class Forecasting
Address Outliers, Increase Predictability
Use Monte Carlo Simulations to Predict Likelihood of Hitting Target
Proactive: Few Surprises!
Modern Methods Provide Improved Predictive Power
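A Monte Carlo forecast of this kind can be sketched by resampling historical weekly throughput. The history and backlog size below are made up, and real tools also model scope growth and story splitting:

```python
import random

def monte_carlo_forecast(weekly_throughputs, backlog_size, trials=10_000, seed=42):
    """Estimate weeks needed to finish `backlog_size` items by repeatedly
    resampling historical weekly throughput, then report percentiles."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        remaining, weeks = backlog_size, 0
        while remaining > 0:
            remaining -= rng.choice(weekly_throughputs)
            weeks += 1
        results.append(weeks)
    results.sort()
    return {p: results[int(trials * p / 100)] for p in (50, 85, 95)}

# Hypothetical history: items completed in each of the last 8 weeks
history = [3, 5, 2, 6, 4, 4, 1, 5]
print(monte_carlo_forecast(history, backlog_size=60))
```

The output is a range ("85% likely by week N") rather than a single date, which is exactly the probabilistic answer an oversight body should be asking for.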
32. Keeping Track: Reference Class Forecasting
Degree of Complexity: Stakeholders, Interfaces, Legacy
Agile Methods & Practices Used
Retrospective Notes
Rate of “Dark Matter” Expansion
Cumulative Flow Diagram
Test Coverage Curve
Other Metrics
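Reference class forecasting uses a repository like this as the "outside view": find past projects similar to the one being estimated and adjust the inside-view estimate by the uplift observed at the desired confidence level. A simplified sketch with invented reference data:

```python
# Hypothetical repository of past projects: (complexity tier, actual/forecast ratio)
reference_data = [
    ("high", 1.8), ("high", 2.2), ("high", 1.5), ("high", 2.6),
    ("medium", 1.2), ("medium", 1.4), ("low", 1.0), ("low", 1.1),
]

def reference_class_uplift(data, tier, percentile=85):
    """Pick past projects like ours (same complexity tier) and return
    the cost/schedule uplift factor at the requested percentile."""
    ratios = sorted(r for t, r in data if t == tier)
    index = min(len(ratios) - 1, int(len(ratios) * percentile / 100))
    return ratios[index]

# An inside-view estimate of 12 months, adjusted by the outside view:
uplift = reference_class_uplift(reference_data, "high")
print(f"Adjusted forecast: {12 * uplift:.1f} months (uplift x{uplift})")
```

The method only works if the repository is honest and well curated, which is why "Keep Track" is a pillar of the framework.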
33. Key Takeaways
Current State of Large-Scale Agile Governance is Woefully Inadequate
Fixing This Requires A New Approach
Measure the Right Things
Avoid Common Failure Modes
Adopt Better Practices
Keep Track
Benefits: Lower Risk for Large-Scale Software Development
34. About Ariel
Other Offerings
Digital Transformation
Cloud Native App Development
Agile / Kanban Coaching
DevOps Jumpstart
Compliance As Code
Test Automation Jumpstart
Legacy Modernization
JIRA Jumpstart
Training Offerings
Fundamentals of Agile
Agile for Leaders & Executives
Kanban Management Professional
Professional Scrum
Human Centered Design
BDD With Cucumber Acceptance Testing
Agile Estimation, Forecasting, & Metrics
Agile Requirements
www.arielpartners.com
sales@arielpartners.com
Editor's Notes
1
First I will tell you a little bit about myself and what we do at Ariel Partners.
Then I’ll describe how this came about
Next we’ll focus on the auditing framework we are putting together: what it looks like and how it’s used.
Then I want to get into some of the implications and highlights of the governance approach espoused by the audit framework, and what it means for an organization. This is where things may get a little bit spicy, even controversial.
Finally I will describe next steps and key takeaways
Then we should have some time for Q&A at the end.
But please feel free to ask questions during the talk
I have been in IT for a long time, after studying computer science at MIT
I own a small consulting and training company
We are located in the heart of times square
And we do business with federal and NYC local government as well as large commercial organizations.
I worked in many different roles, but now I work as an Agile coach and Trainer – but I still keep my hands dirty.
We were well positioned to take on this challenge.
We have a long history working with Agile methods over the last 20 years
In different settings
About six months ago we signed up to a difficult mission
We are working with the Inspector General’s office of a well-known US agency. The agency develops some very large mission-critical software programs and has started to use Agile. They need a framework that is compatible with Generally Accepted Government Auditing Standards, otherwise known as GAGAS.
7
We still have release readiness reviews in either case. We have up-front work in either case, just less of it for Agile.
Instead of phase gates, we have regular sprint reviews or Kanban Service Delivery Reviews.
We did our homework
Federal Information Technology Acquisition Reform provisions (FITARA) ,
Making Electronic Government Accountable By Yielding Tangible Efficiencies Act of 2016
(MEGABYTE),
Modernizing Government Technology (MGT) act (NEW AREA),
and Federal Information Security Modernization Act of 2014 (FISMA).
A total of 7 related areas are graded and combined to yield an overall A through F grade.
Looks like FITARA Scorecard
Here is one example of how the audit framework would work: in this case talking about capturing the vision. It does not matter if we call this a CONOPS, or Capabilities and Constraints, or Charter. It could be a paragraph for a small project or it could be 30 pages for a $100M effort.
It is essential for all projects to document the vision, business goals, and assumptions up front. This concept is known as “commander’s intent” in military contexts and was codified by the Prussian army in 1807 after they conducted investigations to find out why Napoleon’s armies had consistently beaten them despite having smaller troop sizes. It enables decision-making to be pushed down to the lowest levels of a project organization while maintaining alignment
I met Alistair Cockburn in Zurich in 2014 when I was speaking at the Lean Agile Scrum conference there.
Alistair proposed the hexagonal design pattern in 2005.
Conceptually it makes sense to select best practices from a number of disciplines, but for beginners this is not a good idea. Can you imagine how confused a beginner would get learning a little bit from a bunch of different martial arts? Of course it’s different if you are Bruce Lee.
Five years ago we did not have a lot to choose from when it came to scaling agile, but now we have a wealth of options
People who commute in to NYC from New Jersey tell me that if they start their commute before 6:30am they can predict their commuting time pretty well. But if they start after 7 or 7:30 the commuting times are all over the place.