SlideShare une entreprise Scribd logo
1  sur  20
Télécharger pour lire hors ligne
DHS/ST/TSL-11/192
Human Factors Design-
Specification Process for the
Integrated Checkpoint
Program
Joshua Rubinstein, Ph.D.
John P. Chin, Ph.D
Mark W. Smith
U.S. Department of Homeland Security
Science and Technology Directorate
William J. Hughes Technical Center
Atlantic City International Airport, NJ 08405
June 2010
Final Report
This report is approved for public release and is on file at
the Transportation Security Laboratory Library, Atlantic City
International Airport, New Jersey, 08405.
This document is also available to the U.S. public through
the National Technical Information Service (NTIS),
Springfield, Virginia, 22161.
U.S. Department of Homeland Security
Science and Technology Directorate
NOTICE
This document is disseminated under the sponsorship of the U.S. Department of Homeland
Security in the interest of information exchange. The United States Government assumes no
liability for the contents or use thereof. The United States Government does not endorse
products or manufacturers. Trade or manufacturers names appear herein solely because they are
considered essential to the objective of this report.
Technical Report Documentation Page
1. Report No.
DHS/STD/TSL-11/192
2. Government Accession No. 3. Recipient’s Catalog No.
4. Title and Subtitle
Human Factor Design-Specification Process for the Integrated Checkpoint
Program
5. Report Date
6. Performing Organization Code
7. Author(s)
Joshua Rubinstein, Ph.D.
John P. Chin, Ph.D.
Mark W. Smith
8. Performing Organization Report No.
DHS/STD/TSL-11/192
9. Performing Organization Name and Address
U.S. Department of Homeland Security
Science and Technology Directorate
Transportation Security Laboratory, TSL-1
William J. Hughes Technical Center
Atlantic City International Airport, NJ 08405
10. Work Unit No. (TRAIS)
11. Contract or Grant No.
12. Sponsoring Agency Name and Address
U.S. Department of Homeland Security
Science and Technology Directorate
Transportation Security Laboratory, TSL-1
William J. Hughes Technical Center
Atlantic City International Airport, NJ 08405
13. Type of Report and Period Covered
14. Sponsoring Agency Code
S&T
15. Supplementary Notes
John Chin and Mark Smith are support contractors provided by Hi-Tec Systems, Inc. Egg Harbor Township, NJ
16. Abstract
This document outlines the process that will be used for developing user interface design
specifications for the Integrated Checkpoint Project (ICP) at the Transportation Security
Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will be
followed in compliance with ISO Standard 13407. This standard specifies five major
activities: 1) Plan the human centered process, 2) Specify the context of use, 3) Specify new
user and organizational concept of operations, 4) Produce design solutions, and 5) Evaluate
designs against user requirements. This document describes the method and approach taken
within these five areas.
17. Key Words
Human factors, integrated checkpoint
18. Distribution Statement
This report is approved for public release and is on file at the
Transportation Security Laboratory Library, Atlantic City
International Airport, New Jersey, 08405.
This document is also available to the U.S. public through the
National Technical Information Service (NTIS), Springfield,
Virginia, 22161.
19. Security Classif. (of this report)
Unclassified
20. Security Classif. (of this page)
Unclassified
21. No. of Pages
20
22. Price
Reproduction of completed page authorized
DOCUMENT CHANGE HISTORY
Version Description/Author Date(s) TSL Approval
0.1 Tech edits/PB 07/08/10
iii
TABLE OF CONTENTS
Page
EXECUTIVE SUMMARY iv
ACRONYMS vi
1. Introduction 1
2. Plan the Human Centered Process 2
3. Specify Context of Use 3
3.1 Establishing Context based on Current Checkpoint Configuration 3
3.2 Proposed System and Concepts Under Development 6
3.2.1 Automated Target Recognition 7
3.3 Task Flow Analysis 7
4. Specify New User and Organizational CONOPS 7
4.1 User Interface and new screening CONOPS 8
4.2 Human Factors Design Specifications 9
5. Produce Design Solutions 9
5.1 Iterative Prototyping 9
6. Evaluate Design against user requirements 11
6.1 Human Factors Performance Testing 11
6.2 Design Artifacts 11
7. Reference 11
LIST OF FIGURES
Figure 1. Design specification development processes adapted from ISO 13407. 1
Figure 2. Activities to specify the context of use 3
Figure 3. Primary AIT checkpoint layout 5
Figure 4. Conceptual illustration of an integrated display 6
Figure 5. Activities for generating user specifications 8
Figure 6. Iterative design process for the Integrated Display Design 10
iv
EXECUTIVE SUMMARY
This document outlines the process that will be used for developing user interface design
specifications for the Integrated Checkpoint Project (ICP) at the Transportation Security
Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will be
followed in compliance with ISO standard 13407. This standard specifies five major activities:
1) Plan the human centered process, 2) Specify the context of use, 3) Specify new user and
organizational concept of operations (CONOPS), 4) Produce design solutions, and 5) Evaluate
designs against user requirements. This document describes the method and a tailored approach
taken within these five areas.
Although a large array of activities can be performed under the auspices of ISO 13407, the
process defined for the Integrated Checkpoint Project will be tailored to a subset of activities due
to relevance, project timelines and resource constraints. The approach to the five areas is
described below.
1. Plan the human centered process. The team will prepare an overall schedule that outlines the
specifications development process for the Integrated Checkpoint project (ICP). Initially,
planning will involve stakeholders who will help to define project scope and direction. Once key
questions are answered, a follow-up kick-off team meeting should initiate the activities that will
develop into a project schedule.
2. Specify Context of Use. Observational site visits will help to establish the context of use of a
current implementation. This activity will help uncover hidden or implicit operational and
architectural practices that constitute de facto system requirements, and thereby may influence
the definition of a context for scenarios and use cases. For example, actual practice in the field
may differ from written Standard Operating Procedures (SOPs) and CONOPS. Unforeseen tasks
and unanticipated outcomes may be captured during field observations. Contingencies for
unexpected outcomes should be understood in terms of use cases.
A task flow will be established to understand the operators’ constraints and limits. Baseline
information about how current tasks are accomplished helps to determine how future tasks may
be redefined within new system implementations. Analysis software will be used to build and
run a simulation model of tasks and task networks. Task definitions, flows, and times collected
during the site visit to a Transportation Security Administration (TSA) Checkpoint will be used
to build the simulation. Multiple simulation runs will yield metrics of times and task branching
and will represent a quantitative model of the baseline (existing) system.
3. Specify New User and Organizational CONOPS. Introducing new technology, technology
integration, or operator interfaces will often necessitate new procedures and tasks. Definition of
both user procedures and organizational CONOPS will need to describe how the new
implementation should work. An integrated display design will therefore be co-developed with a
proposed CONOPS specific to that new technology integration. There will be no suggested
modification to the existing CONOPS that have no immediate or direct relationship to the
integrated display. Based on user scenarios, tasks, and proposed display design, a proposed new
CONOPS and user interface design will define the following:
v
• System sensor information presentation
• Organization of display and definition of display controls
• Operator responses and inputs
• Alarm resolution passenger-routing for each outcome (i.e., when suspected threat diverts
passenger to secondary screening)
• Changes to passenger traffic and control methods used at the checkpoint
4. Produce Design Solutions. The design solutions and associated user interface specifications
will be developed into working prototypes, which then will be iteratively refined though
repeating design-build-evaluate cycles. These cycles may be adjusted to be shorter or longer
based on what is most effective. This incremental process of prototype development will allow
for feedback through: a) expert evaluations, b) pilot testing and c) informal analysis and testing.
5. Evaluate Design against User Requirements. Once a high fidelity prototype operating in a
simulated environment (passenger flow) is developed, screening operators will be recruited to
participate in realistic checkpoint exercises, during which the user interface designs will be
evaluated by basic measurements such as the speed and accuracy of executing tasks. Key
metrics such as throughput will be calculated or projected based on testing results, then fed into
the simulation analysis.
A final draft of a Human Factors Design Specifications document will provide the specifics of a
final recommended integrated design and include:
• Static GUI & UI specifications of the display design,
• Task definitions and logical flows to define the specific dynamics of the GUI and UI,
• UML diagrams generated from source code of the high-fidelity simulation, and
• Screen snapshots of the high-fidelity simulation.
vi
ACRONYMS
AIT Advanced Image Technology whole body scanner
ATR Automatic Target Recognition
CONOPS Concept of Operations
IO Image Operator
SO Screening Operator
SOPs Standard Operating Procedures
SSD Shoe Scanning Device
TSA Transportation Security Administration
TSL Transportation Security Laboratory
TSO Transportation Security Officer
UCD User Centered Design
WTMD Walk Through Metal Detector
1
1. Introduction
This document outlines the specifications and design development process that will be used for
developing the user interface for the Integrated Checkpoint Project (ICP) at the Transportation
Security Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will
be followed in compliance with ISO standard 13407 (Figure 1). This standard specifies five
major activities: 1) Plan the human centered process, 2) Specify the context of use, 3) Specify
new user and organizational concept of operations (CONOPS), 4) Produce design solutions, and
5) Evaluate designs against user requirements. This document describes the tailored method and
approach taken within these five areas.
1. Plan the human
centered process
5. Evaluate designs
against user
requirements
4. Produce design
solutions
2. Specify the
context of Use
3. Specify user and
organizational
requirements
Meets Requirements
Human Centered Design Process for Interactive Systems ISO 13407
Figure 1. Design specification development processes adapted from ISO 13407.
The ISO 13407 process defined for the Integrated Checkpoint project will be tailored to a subset
of activities due to relevance, project timelines, and resource constraints.
Significant tasks within each of the five major activities include:
2
a. Plan the Human Centered Process
• Stakeholder Meetings
• Kickoff Meeting
• Create Schedule
b. Specify Context of Use
• Observational Site Visits
• Describe Workflow
c. Specify New User and Organizational CONOPS
• Describe CONOPS Changes
• Describe Screening Procedure Changes
• Brainstorming Design Alternatives
• Low Fidelity Prototypes
• Modeling and Simulation
d. Produce Design Solutions
• Iterative Prototyping
• Hi Fidelity Prototyping
e. Evaluate Design Against User Requirements
• Performance Testing
2. Plan the Human Centered Process
The team will prepare an overall schedule that outlines the human factors design specification
development process for the Integrated Checkpoint project. Initially, planning will involve
stakeholders who will help to define project requirements, scope and direction. Once key
questions are answered, a follow-up kick-off team meeting will initiate the activities that will
develop into a project schedule. The schedule will be modified and updated continually to
incorporate progress and status on project tasks. Project requirements, scope and direction are
obtained from multiple sources.
1. Project Initiator (funding agency, customer, stakeholder) specifies
a. New technology features and capabilities
b. New performance objectives
c. New operational objectives (e.g., staffing, CONOPS)
2. End users, customers and stakeholders can provide requirements through
a. Intended-user feedback and interviews
b. Management feedback, briefings, and meetings
c. Inner-agency operational and technical groups (e.g., integrated product teams)
3
3. For modifications of legacy systems, requirements can be derived from
a. Existing agency documents (e.g., CONOPS, SOPs, Functional Requirements,
Operational Requirements, Procurement Specifications)
b. Site visits to evaluate legacy systems currently in use
c. Laboratory evaluations of legacy systems
d. Subject matter experts (e.g., security screeners)
3. Specify Context of Use
3.1 Establishing Context based on Current Checkpoint Configuration
Observational site visits will help to establish the context of use of a current implementation.
This activity will help uncover hidden or implicit requirements towards the definition of a
context for scenarios and use cases. Actual practice in the field may differ from written Standard
Operating Procedures (SOPs) and CONOPS. Unforeseen tasks and unanticipated outcomes may
be captured during field observations. A review of existing SOPs with stakeholders will help to
clarify possible gaps found in scenarios and tasks, illustrated in Figure 2. Observations will be
used to develop an understanding of the current checkpoint configuration CONOPS and use of
the Advanced Image Technology (AIT) whole body scanner and the Walk-Through Metal
Detector (WTMD).
• Existing Concepts of
Operations
• TSA Standard Operating
Procedures
• Field Observations
• Stakeholders Involvement
Scenarios and
Use Cases
• Operator Constraints and
Limits
• Distinct Tasks
• Decision Points
• Order of Operations
Task Flow Analysis
Specify Context of Use
Figure 2. Activities to specify the context of use
Passenger flow at a checkpoint (see Figure 3) will be captured in use cases. A possible “no-go”
outcome may occur when a suspected threat has been found, i.e., an “alarm result”. For example,
when using the WTMD, passengers may be asked to divest additional metal items that they may
4
have forgotten to divest initially when WTMD alarms. Another example is when a remote Image
Operator (IO) finds anomalies on the AIT image; passengers may be patted down while standing
at their assigned spot (See Figure 3, Location G). Those passengers requesting or requiring a
private pat down may be relocated to a private screening location.
5
Line
Check
Document
Baggage to X-RAY
TSO Baggage Screening
Primary AIT
Screening
T1
Line
TSO
PAX
T0
T2
AIT
T3
WTMD
T8
T5
Passenger B
T4
Bench
Recomposure
UpUp
PAX
Pat
Down
ETD
Secondary AIT
Screening
ETD
T5
PAX
TSO
TSO
TSO
T3
Entrance PAX
T2
T8
Bench
TSO
TSO
Remote AIT Imaging Room
T6
T6
T7
A
B
C
D
E
F
G
H
TSO
TSO
Figure 3. Primary AIT checkpoint layout
6
3.2 Proposed System and Concepts Under Development
A future integrated checkpoint display concept (see Figure 4) will be generated and reviewed
with the TSA during stakeholder meetings. Figure 4 represents a high level conceptual
illustration of the functional components of the integrated display. Specific design alternatives
will be developed during brainstorming sessions based on field observations and input from
subject matter experts.
Design considerations will include:
• Passenger Privacy
• Physical configuration/layout of the checkpoint
• User Interface of the Integrated Display
• Manning of stations at the checkpoint
• Operator tasks and communication
• Dynamics of passenger flow
Figure 4. Conceptual illustration of an integrated display
After brainstorming initial high-level designs, candidate approaches will be analyzed and
compared to the current task flows and timing of tasks. Design parameters may include changes
in manning of stations at the checkpoint and the passenger flow guided by control points or
methods. The result of the design analysis will be a preliminary high-level task flow and user
interface design(s).
7
3.2.1 Automated Target Recognition
Automated Target Recognition (ATR) software has continued to improve and evolve and
represents a technology that can potentially reduce checkpoint manpower requirements.
Proposed design solutions will be examined using ATR technology in place of an Image
Operator. Since ATR technologies may result in a higher alarm rate than the current human
operator, the design challenge will be to generate solutions that will provide technological
CONOPS and SOPs with False Alarm (FA) rates consistent with maintaining acceptable
checkpoint through-put rates.
3.3 Task Flow Analysis
Following field observations and subsequent study of the tasks involved, a task flow will be
established to understand the operator’s constraints and limits. For example, the task of
processing an image and rendering a decision regarding the presence of a threat might be
decomposed into subtasks, each with a typical duration, likelihood of occurrence, entry/exit
criteria, and dependencies on other subtasks. This analysis will clarify the variables impacting
throughput and where bottlenecks may occur. The goal will be to understand and document what
separate tasks should be performed in what order, and under what conditions, by the operator.
Baseline information about how current tasks are accomplished helps to determine how future
tasks may be redefined within new system implementations.
Analysis software will be used to build and run a simulation model of tasks and task networks.
Task definitions, flow, and times collected during the site visit to a TSA Checkpoint will be used
to build the simulation. Multiple simulation runs will yield metrics of times and task branching
and will represent a quantitative model of the baseline (existing) system.
Once the context of use is understood and established by examining the existing CONOPS,
SOPs, and task flow analysis, the next phase of specifying user requirements for the new
implementation will begin.
A second simulation model will be constructed to represent a proposed task network of
alternative designs as work progresses towards a final recommended integrated display design.
Simulation model runs will yield metrics that will be compared to the baseline model in order to
provide a preliminary comparison of task completion times and branching.
4. Specify New User and Organizational CONOPS
Introducing new technology, technology integration, or operator interfaces will often necessitate
new procedures and tasks. Definition of both user procedures and organizational CONOPS will
need to describe how the new implementation should work. An integrated display design will
therefore be co-developed with a proposed CONOPS specific to that new technology integration.
The work performed within this activity is illustrated in Figure 5.
8
4.1 User Interface and new screening CONOPS
There are several different possible definitions and uses of the term “CONOPS.” The main focus
of this analysis will be the impact of the new technology integration and how that integration will
necessitate modified screener staffing, technology layout, and passenger control procedures.
There will be no suggested modification to the existing CONOPS that has no immediate or direct
relationship to the integrated display. For example, document checking or recomposure will not
be directly impacted by the user interface design of the integrated display. On the other hand,
the functionality afforded to users via the integrated display may potentially affect
communications between the AIT Screening Operator (SO) and AIT Image Operator (IO). If so,
then CONOPS related to the operations controlled by the SO and IO may be impacted. Based on
user scenarios, tasks, and proposed display design, a proposed new CONOPS and user interface
design will define the following:
• System sensor information presentation
• Organization of display and definition of display controls
• Operator responses and inputs
• Alarm resolution passenger-routing for each outcome (i.e. suspected threat diverts
passenger to secondary screening)
• Changes to passenger traffic and control methods used at the checkpoint
Many of the initial design concepts and solutions will be generated through brainstorming
sessions. These solutions will be further refined through the development of low fidelity
prototypes.
• Define Sensor Information
• Define Display
• Define Operator responses
• Define Alarm Resolution for
each outcome
• Define Tasks of Operators
• Define Passenger Control
Methods
Define New ConOps
And Procedures
• Comparative Task
completion time and
throughput
• Accuracy of
Communications for
Anomaly location
• Task interactions
Map Task Functions
to User Interface
Design
Specify New User and Organizational CONOPS
Figure 5. Activities for generating user specifications
9
4.2 Human Factors Design Specifications
Human Factors design specifications will be defined from Use Case Definitions and Task Flow
Analysis. Design specifications will be mapped to specific tasks and associated metrics that may
be multi-dimensional. For example, a single operator of the integrated display shall control
passenger flow through three different sensors (AIT, WTMD and Shoe Scanning Device: SSD),
resulting in a reduction in manpower requirements, equal or increased passenger through-put,
and task completion times comparable to a non-integrated checkpoint with a dedicated operator
for each sensor.
The final set of Human Factors design specifications will be validated with stakeholders to
ensure that the right set of features is being developed.
5. Produce Design Solutions
Context and task analysis work, performed at the earlier stages of the overall effort, is used in
this stage to explore and mature new designs. Candidate design solutions are evolved from early
“brainstorming” sessions into low fidelity prototypes and associated task definitions and
structures. Designs are matured through the analysis and design activities that drive the
Integrated Display Design, illustrated in Figure 6.
5.1 Iterative Prototyping
Design solutions and associated user interface specifications will be developed iteratively into a
prototype (see Figure 6). Working prototypes will be built and evaluated on a cycle of several
days or a few weeks. This incremental and iterative process of prototype development will allow
for feedback through: a) expert evaluations, b) pilot testing and c) informal analysis and testing.
Testing of prototypes and simulations will involve expert evaluations and small pilot testing
based on the availability of Transportation Security Officers (TSOs).
10
Figure 6. Iterative design process for the Integrated Display Design
Each iteration cycle will be used to design, build, and review both Integrated Display design
alternatives and associated CONOPS. Early paper prototypes will allow stakeholders and other
key team members to participate in a design-evaluation cycle. Paper prototypes will lead to
simple static screen mockups and simple dynamic software simulations. Feedback from
evaluators on the team and from end users at the TSA will drive changes to the prototypes.
The Integrated Checkpoint project places the iterative development of high fidelity prototypes
and usability testing in a second phase separate from the initial low fidelity prototyping
activities. Since access to end users (TSOs) is limited, testing prototypes with both TSOs and
pseudo-passengers together may be required. Therefore, the test bed will simulate the flow of
passengers during the high fidelity prototype testing of the user interface. A full-scale airport
security checkpoint mockup will form the “test bed” of Integrated Display design and evaluation
work.
11
6. Evaluate Design against user requirements
6.1 Human Factors Performance Testing
Once a high fidelity prototype operating in a simulated environment (passenger flow) is
developed, screening operators will be used to test the user interface. The design will be
evaluated by basic measurements such as the speed and accuracy of executing tasks by operators.
Key metrics such as throughput will be calculated or projected based on testing results, and will
then be fed into the simulation analysis tool. A test plan will be developed to structure and guide
the performance testing activity.
6.2 Design Artifacts
Software tools will be used to generate Unified Modeling Language (UML) diagrams from the
prototype source code. UML is a widely accepted method to specify and document object-
oriented software systems. UML offers a standard way to visualize a system's architecture,
including elements such as:
• actors
• business processes
• (logical) components
• activities
• programming language statements
• database schemas, and
• reusable software components
A final draft of a Human Factors Design Specifications document will provide the specifics of a
final recommended integrated design:
• Static GUI & UI specifications of the display design,
• Task definitions and logical flows to define the specific dynamics of the GUI and UI,
• UML diagrams generated from source code of the high-fidelity simulation, and
• Screen snapshots of the high-fidelity simulation.
7. Reference
ISO 13407: Human Centred Design Process for Interactive Systems. 1999.
12
THIS PAGE INTENTIONALLY LEFT BLANK

Contenu connexe

Tendances

OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES
OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGESOS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES
OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGESIJCSES Journal
 
Bio-Inspired Modelling of Software Verification by Modified Moran Processes
Bio-Inspired Modelling of Software Verification by Modified Moran ProcessesBio-Inspired Modelling of Software Verification by Modified Moran Processes
Bio-Inspired Modelling of Software Verification by Modified Moran ProcessesIJCSEA Journal
 
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSES
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSESBIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSES
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSESIJCSEA Journal
 
Regression test selection model: a comparison between ReTSE and pythia
Regression test selection model: a comparison between ReTSE and pythiaRegression test selection model: a comparison between ReTSE and pythia
Regression test selection model: a comparison between ReTSE and pythiaTELKOMNIKA JOURNAL
 
AUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWAUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWcscpconf
 
Discreate eventsimulation idef
Discreate eventsimulation idefDiscreate eventsimulation idef
Discreate eventsimulation idefMandar Trivedi
 
Test case prioritization using firefly algorithm for software testing
Test case prioritization using firefly algorithm for software testingTest case prioritization using firefly algorithm for software testing
Test case prioritization using firefly algorithm for software testingJournal Papers
 
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...RAKESH RANA
 
Software testing quiz questions and answers
Software testing quiz questions and answersSoftware testing quiz questions and answers
Software testing quiz questions and answersRajendraG
 
Industrial perspective on static analysis
Industrial perspective on static analysisIndustrial perspective on static analysis
Industrial perspective on static analysisChirag Thumar
 
IRJET - Neural Network based Leaf Disease Detection and Remedy Recommenda...
IRJET -  	  Neural Network based Leaf Disease Detection and Remedy Recommenda...IRJET -  	  Neural Network based Leaf Disease Detection and Remedy Recommenda...
IRJET - Neural Network based Leaf Disease Detection and Remedy Recommenda...IRJET Journal
 
Automated exam question set generator using utility based agent and learning ...
Automated exam question set generator using utility based agent and learning ...Automated exam question set generator using utility based agent and learning ...
Automated exam question set generator using utility based agent and learning ...Journal Papers
 

Tendances (18)

OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES
OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGESOS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES
OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES
 
Bio-Inspired Modelling of Software Verification by Modified Moran Processes
Bio-Inspired Modelling of Software Verification by Modified Moran ProcessesBio-Inspired Modelling of Software Verification by Modified Moran Processes
Bio-Inspired Modelling of Software Verification by Modified Moran Processes
 
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSES
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSESBIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSES
BIO-INSPIRED MODELLING OF SOFTWARE VERIFICATION BY MODIFIED MORAN PROCESSES
 
Sta unit 2(abimanyu)
Sta unit 2(abimanyu)Sta unit 2(abimanyu)
Sta unit 2(abimanyu)
 
Zander diss
Zander dissZander diss
Zander diss
 
Regression test selection model: a comparison between ReTSE and pythia
Regression test selection model: a comparison between ReTSE and pythiaRegression test selection model: a comparison between ReTSE and pythia
Regression test selection model: a comparison between ReTSE and pythia
 
Ijetcas14 468
Ijetcas14 468Ijetcas14 468
Ijetcas14 468
 
AUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEWAUTOMATED PENETRATION TESTING: AN OVERVIEW
AUTOMATED PENETRATION TESTING: AN OVERVIEW
 
Ijetr011834
Ijetr011834Ijetr011834
Ijetr011834
 
Discreate eventsimulation idef
Discreate eventsimulation idefDiscreate eventsimulation idef
Discreate eventsimulation idef
 
Test case prioritization using firefly algorithm for software testing
Test case prioritization using firefly algorithm for software testingTest case prioritization using firefly algorithm for software testing
Test case prioritization using firefly algorithm for software testing
 
Hh3512801283
Hh3512801283Hh3512801283
Hh3512801283
 
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...
Software Defect Prediction Techniques in the Automotive Domain: Evaluation, S...
 
Software testing quiz questions and answers
Software testing quiz questions and answersSoftware testing quiz questions and answers
Software testing quiz questions and answers
 
Industrial perspective on static analysis
Industrial perspective on static analysisIndustrial perspective on static analysis
Industrial perspective on static analysis
 
IRJET - Neural Network based Leaf Disease Detection and Remedy Recommenda...
IRJET -  	  Neural Network based Leaf Disease Detection and Remedy Recommenda...IRJET -  	  Neural Network based Leaf Disease Detection and Remedy Recommenda...
IRJET - Neural Network based Leaf Disease Detection and Remedy Recommenda...
 
50120140507007
5012014050700750120140507007
50120140507007
 
Automated exam question set generator using utility based agent and learning ...
Automated exam question set generator using utility based agent and learning ...Automated exam question set generator using utility based agent and learning ...
Automated exam question set generator using utility based agent and learning ...
 

En vedette

PML Certification 01 HIGI
PML Certification 01 HIGIPML Certification 01 HIGI
PML Certification 01 HIGIPaulin Lu
 
PartialCapstoneforLinkedIn
PartialCapstoneforLinkedInPartialCapstoneforLinkedIn
PartialCapstoneforLinkedInJK Costello
 
M.Phalguna Satish 1+ exp .net
M.Phalguna Satish 1+ exp .netM.Phalguna Satish 1+ exp .net
M.Phalguna Satish 1+ exp .netPhalguna Satish
 
Laenergiarenovable 170129215236
Laenergiarenovable 170129215236Laenergiarenovable 170129215236
Laenergiarenovable 170129215236daniana9559
 

En vedette (8)

PML Certification 01 HIGI
PML Certification 01 HIGIPML Certification 01 HIGI
PML Certification 01 HIGI
 
Неделя математики
Неделя математикиНеделя математики
Неделя математики
 
104-355-1-PB
104-355-1-PB104-355-1-PB
104-355-1-PB
 
Prezentacja1
Prezentacja1Prezentacja1
Prezentacja1
 
Resume
ResumeResume
Resume
 
PartialCapstoneforLinkedIn
PartialCapstoneforLinkedInPartialCapstoneforLinkedIn
PartialCapstoneforLinkedIn
 
M.Phalguna Satish 1+ exp .net
M.Phalguna Satish 1+ exp .netM.Phalguna Satish 1+ exp .net
M.Phalguna Satish 1+ exp .net
 
Laenergiarenovable 170129215236
Laenergiarenovable 170129215236Laenergiarenovable 170129215236
Laenergiarenovable 170129215236
 

Similaire à DHSSTTSL11192.HSI Process (1)

DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)Glen Alleman
 
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...ijaia
 
IP05_USA AI. 11 - Human Factors.pdf
IP05_USA AI. 11 - Human Factors.pdfIP05_USA AI. 11 - Human Factors.pdf
IP05_USA AI. 11 - Human Factors.pdfToto Subagyo
 
WR Based Opinion Mining on Traffic Sentiment Analysis on Social Media
WR Based Opinion Mining on Traffic Sentiment Analysis on Social MediaWR Based Opinion Mining on Traffic Sentiment Analysis on Social Media
WR Based Opinion Mining on Traffic Sentiment Analysis on Social MediaIRJET Journal
 
Software development life cycle
Software development life cycle Software development life cycle
Software development life cycle shefali mishra
 
An Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionAn Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionIRJET Journal
 
Monitoring and Visualisation Approach for Collaboration Production Line Envir...
Monitoring and Visualisation Approach for Collaboration Production Line Envir...Monitoring and Visualisation Approach for Collaboration Production Line Envir...
Monitoring and Visualisation Approach for Collaboration Production Line Envir...Waqas Tariq
 
Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Simon Sweeney
 
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...AM Publications
 
The tasks You are assumed to be one of the software consultants .docx
The tasks You are assumed to be one of the software consultants .docxThe tasks You are assumed to be one of the software consultants .docx
The tasks You are assumed to be one of the software consultants .docxsarah98765
 
A CASE Lab Report - Project File on "ATM - Banking System"
A CASE Lab Report - Project File on  "ATM - Banking System"A CASE Lab Report - Project File on  "ATM - Banking System"
A CASE Lab Report - Project File on "ATM - Banking System"joyousbharat
 
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...IJNSA Journal
 
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...IJNSA Journal
 
Advanced Verification Methodology for Complex System on Chip Verification
Advanced Verification Methodology for Complex System on Chip VerificationAdvanced Verification Methodology for Complex System on Chip Verification
Advanced Verification Methodology for Complex System on Chip VerificationVLSICS Design
 
Airline ticket reservation system
Airline ticket reservation systemAirline ticket reservation system
Airline ticket reservation systemSH Rajøn
 
Smart Traffic Monitoring System Report
Smart Traffic Monitoring System ReportSmart Traffic Monitoring System Report
Smart Traffic Monitoring System ReportALi Baker
 
Hospital E-Token Management(outdoor)
Hospital E-Token Management(outdoor)Hospital E-Token Management(outdoor)
Hospital E-Token Management(outdoor)ANISUR RAHMAN
 
IRJET- Speech and Hearing
IRJET- Speech and HearingIRJET- Speech and Hearing
IRJET- Speech and HearingIRJET Journal
 
system development life cycle
system development life cyclesystem development life cycle
system development life cycleSuhleemAhmd
 

Similaire à DHSSTTSL11192.HSI Process (1) (20)

DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)
 
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...
 
IP05_USA AI. 11 - Human Factors.pdf
IP05_USA AI. 11 - Human Factors.pdfIP05_USA AI. 11 - Human Factors.pdf
IP05_USA AI. 11 - Human Factors.pdf
 
WR Based Opinion Mining on Traffic Sentiment Analysis on Social Media
WR Based Opinion Mining on Traffic Sentiment Analysis on Social MediaWR Based Opinion Mining on Traffic Sentiment Analysis on Social Media
WR Based Opinion Mining on Traffic Sentiment Analysis on Social Media
 
Software development life cycle
Software development life cycle Software development life cycle
Software development life cycle
 
An Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionAn Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps Adoption
 
Monitoring and Visualisation Approach for Collaboration Production Line Envir...
Monitoring and Visualisation Approach for Collaboration Production Line Envir...Monitoring and Visualisation Approach for Collaboration Production Line Envir...
Monitoring and Visualisation Approach for Collaboration Production Line Envir...
 
Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...Investigating Geographic Information System Technologies A Global Positioning...
Investigating Geographic Information System Technologies A Global Positioning...
 
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...
 
The tasks You are assumed to be one of the software consultants .docx
The tasks You are assumed to be one of the software consultants .docxThe tasks You are assumed to be one of the software consultants .docx
The tasks You are assumed to be one of the software consultants .docx
 
A CASE Lab Report - Project File on "ATM - Banking System"
A CASE Lab Report - Project File on  "ATM - Banking System"A CASE Lab Report - Project File on  "ATM - Banking System"
A CASE Lab Report - Project File on "ATM - Banking System"
 
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
 
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...
 
Advanced Verification Methodology for Complex System on Chip Verification
Advanced Verification Methodology for Complex System on Chip VerificationAdvanced Verification Methodology for Complex System on Chip Verification
Advanced Verification Methodology for Complex System on Chip Verification
 
Airline ticket reservation system
Airline ticket reservation systemAirline ticket reservation system
Airline ticket reservation system
 
Smart Traffic Monitoring System Report
Smart Traffic Monitoring System ReportSmart Traffic Monitoring System Report
Smart Traffic Monitoring System Report
 
Hospital E-Token Management(outdoor)
Hospital E-Token Management(outdoor)Hospital E-Token Management(outdoor)
Hospital E-Token Management(outdoor)
 
software engineering
software engineering software engineering
software engineering
 
IRJET- Speech and Hearing
IRJET- Speech and HearingIRJET- Speech and Hearing
IRJET- Speech and Hearing
 
system development life cycle
system development life cyclesystem development life cycle
system development life cycle
 

DHSSTTSL11192.HSI Process (1)

  • 1. DHS/ST/TSL-11/192 Human Factors Design- Specification Process for the Integrated Checkpoint Program Joshua Rubinstein, Ph.D. John P. Chin, Ph.D Mark W. Smith U.S. Department of Homeland Security Science and Technology Directorate William J. Hughes Technical Center Atlantic City International Airport, NJ 08405 June 2010 Final Report This report is approved for public release and is on file at the Transportation Security Laboratory Library, Atlantic City International Airport, New Jersey, 08405. This document is also available to the U.S. public through the National Technical Information Service (NTIS), Springfield, Virginia, 22161. U.S. Department of Homeland Security Science and Technology Directorate
  • 2. NOTICE This document is disseminated under the sponsorship of the U.S. Department of Homeland Security in the interest of information exchange. The United States Government assumes no liability for the contents or use thereof. The United States Government does not endorse products or manufacturers. Trade or manufacturers names appear herein solely because they are considered essential to the objective of this report.
  • 3. Technical Report Documentation Page 1. Report No. DHS/STD/TSL-11/192 2. Government Accession No. 3. Recipient’s Catalog No. 4. Title and Subtitle Human Factor Design-Specification Process for the Integrated Checkpoint Program 5. Report Date 6. Performing Organization Code 7. Author(s) Joshua Rubinstein, Ph.D. John P. Chin, Ph.D. Mark W. Smith 8. Performing Organization Report No. DHS/STD/TSL-11/192 9. Performing Organization Name and Address U.S. Department of Homeland Security Science and Technology Directorate Transportation Security Laboratory, TSL-1 William J. Hughes Technical Center Atlantic City International Airport, NJ 08405 10. Work Unit No. (TRAIS) 11. Contract or Grant No. 12. Sponsoring Agency Name and Address U.S. Department of Homeland Security Science and Technology Directorate Transportation Security Laboratory, TSL-1 William J. Hughes Technical Center Atlantic City International Airport, NJ 08405 13. Type of Report and Period Covered 14. Sponsoring Agency Code S&T 15. Supplementary Notes John Chin and Mark Smith are support contractors provided by Hi-Tec Systems, Inc. Egg Harbor Township, NJ 16. Abstract This document outlines the process that will be used for developing user interface design specifications for the Integrated Checkpoint Project (ICP) at the Transportation Security Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will be followed in compliance with ISO Standard 13407. This standard specifies five major activities: 1) Plan the human centered process, 2) Specify the context of use, 3) Specify new user and organizational concept of operations, 4) Produce design solutions, and 5) Evaluate designs against user requirements. This document describes the method and approach taken within these five areas. 17. Key Words Human factors, integrated checkpoint 18. Distribution Statement This report is approved for public release and is on file at the Transportation Security Laboratory Library, Atlantic City International Airport, New Jersey, 08405. This document is also available to the U.S. public through the National Technical Information Service (NTIS), Springfield, Virginia, 22161. 19. Security Classif. (of this report) Unclassified 20. Security Classif. (of this page) Unclassified 21. No. of Pages 20 22. Price Reproduction of completed page authorized
  • 4. DOCUMENT CHANGE HISTORY Version Description/Author Date(s) TSL Approval 0.1 Tech edits/PB 07/08/10
  • 5. iii TABLE OF CONTENTS Page EXECUTIVE SUMMARY iv ACRONYMS vi 1. Introduction 1 2. Plan the Human Centered Process 2 3. Specify Context of Use 3 3.1 Establishing Context based on Current Checkpoint Configuration 3 3.2 Proposed System and Concepts Under Development 6 3.2.1 Automated Target Recognition 7 3.3 Task Flow Analysis 7 4. Specify New User and Organizational CONOPS 7 4.1 User Interface and new screening CONOPS 8 4.2 Human Factors Design Specifications 9 5. Produce Design Solutions 9 5.1 Iterative Prototyping 9 6. Evaluate Design against user requirements 11 6.1 Human Factors Performance Testing 11 6.2 Design Artifacts 11 7. Reference 11 LIST OF FIGURES Figure 1. Design specification development processes adapted from ISO 13407. 1 Figure 2. Activities to specify the context of use 3 Figure 3. Primary AIT checkpoint layout 5 Figure 4. Conceptual illustration of an integrated display 6 Figure 5. Activities for generating user specifications 8 Figure 6. Iterative design process for the Integrated Display Design 10
  • 6. iv EXECUTIVE SUMMARY This document outlines the process that will be used for developing user interface design specifications for the Integrated Checkpoint Project (ICP) at the Transportation Security Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will be followed in compliance with ISO standard 13407. This standard specifies five major activities: 1) Plan the human centered process, 2) Specify the context of use, 3) Specify new user and organizational concept of operations (CONOPS), 4) Produce design solutions, and 5) Evaluate designs against user requirements. This document describes the method and a tailored approach taken within these five areas. Although a large array of activities can be performed under the auspices of ISO 13407, the process defined for the Integrated Checkpoint Project will be tailored to a subset of activities due to relevance, project timelines and resource constraints. The approach to the five areas is described below. 1. Plan the human centered process. The team will prepare an overall schedule that outlines the specifications development process for the Integrated Checkpoint project (ICP). Initially, planning will involve stakeholders who will help to define project scope and direction. Once key questions are answered, a follow-up kick-off team meeting should initiate the activities that will develop into a project schedule. 2. Specify Context of Use. Observational site visits will help to establish the context of use of a current implementation. This activity will help uncover hidden or implicit operational and architectural practices that constitute de facto system requirements, and thereby may influence the definition of a context for scenarios and use cases. For example, actual practice in the field may differ from written Standard Operating Procedures (SOPs) and CONOPS. Unforeseen tasks and unanticipated outcomes may be captured during field observations. Contingencies for unexpected outcomes should be understood in terms of use cases. A task flow will be established to understand the operators’ constraints and limits. Baseline information about how current tasks are accomplished helps to determine how future tasks may be redefined within new system implementations. Analysis software will be used to build and run a simulation model of tasks and task networks. Task definitions, flows, and times collected during the site visit to a Transportation Security Administration (TSA) Checkpoint will be used to build the simulation. Multiple simulation runs will yield metrics of times and task branching and will represent a quantitative model of the baseline (existing) system. 3. Specify New User and Organizational CONOPS. Introducing new technology, technology integration, or operator interfaces will often necessitate new procedures and tasks. Definition of both user procedures and organizational CONOPS will need to describe how the new implementation should work. An integrated display design will therefore be co-developed with a proposed CONOPS specific to that new technology integration. There will be no suggested modification to the existing CONOPS that have no immediate or direct relationship to the integrated display. Based on user scenarios, tasks, and proposed display design, a proposed new CONOPS and user interface design will define the following:
  • 7. v • System sensor information presentation • Organization of display and definition of display controls • Operator responses and inputs • Alarm resolution passenger-routing for each outcome (i.e., when suspected threat diverts passenger to secondary screening) • Changes to passenger traffic and control methods used at the checkpoint 4. Produce Design Solutions. The design solutions and associated user interface specifications will be developed into working prototypes, which then will be iteratively refined though repeating design-build-evaluate cycles. These cycles may be adjusted to be shorter or longer based on what is most effective. This incremental process of prototype development will allow for feedback through: a) expert evaluations, b) pilot testing and c) informal analysis and testing. 5. Evaluate Design against User Requirements. Once a high fidelity prototype operating in a simulated environment (passenger flow) is developed, screening operators will be recruited to participate in realistic checkpoint exercises, during which the user interface designs will be evaluated by basic measurements such as the speed and accuracy of executing tasks. Key metrics such as throughput will be calculated or projected based on testing results, then fed into the simulation analysis. A final draft of a Human Factors Design Specifications document will provide the specifics of a final recommended integrated design and include: • Static GUI & UI specifications of the display design, • Task definitions and logical flows to define the specific dynamics of the GUI and UI, • UML diagrams generated from source code of the high-fidelity simulation, and • Screen snapshots of the high-fidelity simulation.
  • 8. vi ACRONYMS AIT Advanced Image Technology whole body scanner ATR Automatic Target Recognition CONOPS Concept of Operations IO Image Operator SO Screening Operator SOPs Standard Operating Procedures SSD Shoe Scanning Device TSA Transportation Security Administration TSL Transportation Security Laboratory TSO Transportation Security Officer UCD User Centered Design WTMD Walk Through Metal Detector
  • 9. 1 1. Introduction This document outlines the specifications and design development process that will be used for developing the user interface for the Integrated Checkpoint Project (ICP) at the Transportation Security Laboratory’s (TSL) Human Factors Lab. A User Centered Design (UCD) approach will be followed in compliance with ISO standard 13407 (Figure 1). This standard specifies five major activities: 1) Plan the human centered process, 2) Specify the context of use, 3) Specify new user and organizational concept of operations (CONOPS), 4) Produce design solutions, and 5) Evaluate designs against user requirements. This document describes the tailored method and approach taken within these five areas. 1. Plan the human centered process 5. Evaluate designs against user requirements 4. Produce design solutions 2. Specify the context of Use 3. Specify user and organizational requirements Meets Requirements Human Centered Design Process for Interactive Systems ISO 13407 Figure 1. Design specification development processes adapted from ISO 13407. The ISO 13407 process defined for the Integrated Checkpoint project will be tailored to a subset of activities due to relevance, project timelines, and resource constraints. Significant tasks within each of the five major activities include:
  • 10. 2 a. Plan the Human Centered Process • Stakeholder Meetings • Kickoff Meeting • Create Schedule b. Specify Context of Use • Observational Site Visits • Describe Workflow c. Specify New User and Organizational CONOPS • Describe CONOPS Changes • Describe Screening Procedure Changes • Brainstorming Design Alternatives • Low Fidelity Prototypes • Modeling and Simulation d. Produce Design Solutions • Iterative Prototyping • Hi Fidelity Prototyping e. Evaluate Design Against User Requirements • Performance Testing 2. Plan the Human Centered Process The team will prepare an overall schedule that outlines the human factors design specification development process for the Integrated Checkpoint project. Initially, planning will involve stakeholders who will help to define project requirements, scope and direction. Once key questions are answered, a follow-up kick-off team meeting will initiate the activities that will develop into a project schedule. The schedule will be modified and updated continually to incorporate progress and status on project tasks. Project requirements, scope and direction are obtained from multiple sources. 1. Project Initiator (funding agency, customer, stakeholder) specifies a. New technology features and capabilities b. New performance objectives c. New operational objectives (e.g., staffing, CONOPS) 2. End users, customers and stakeholders can provide requirements through a. Intended-user feedback and interviews b. Management feedback, briefings, and meetings c. Inner-agency operational and technical groups (e.g., integrated product teams)
  • 11. 3 3. For modifications of legacy systems, requirements can be derived from a. Existing agency documents (e.g., CONOPS, SOPs, Functional Requirements, Operational Requirements, Procurement Specifications) b. Site visits to evaluate legacy systems currently in use c. Laboratory evaluations of legacy systems d. Subject matter experts (e.g., security screeners) 3. Specify Context of Use 3.1 Establishing Context based on Current Checkpoint Configuration Observational site visits will help to establish the context of use of a current implementation. This activity will help uncover hidden or implicit requirements towards the definition of a context for scenarios and use cases. Actual practice in the field may differ from written Standard Operating Procedures (SOPs) and CONOPS. Unforeseen tasks and unanticipated outcomes may be captured during field observations. A review of existing SOPs with stakeholders will help to clarify possible gaps found in scenarios and tasks, illustrated in Figure 2. Observations will be used to develop an understanding of the current checkpoint configuration CONOPS and use of the Advanced Image Technology (AIT) whole body scanner and the Walk-Through Metal Detector (WTMD). • Existing Concepts of Operations • TSA Standard Operating Procedures • Field Observations • Stakeholders Involvement Scenarios and Use Cases • Operator Constraints and Limits • Distinct Tasks • Decision Points • Order of Operations Task Flow Analysis Specify Context of Use Figure 2. Activities to specify the context of use Passenger flow at a checkpoint (see Figure 3) will be captured in use cases. A possible “no-go” outcome may occur when a suspected threat has been found, i.e., an “alarm result”. For example, when using the WTMD, passengers may be asked to divest additional metal items that they may
  • 12. 4 have forgotten to divest initially when WTMD alarms. Another example is when a remote Image Operator (IO) finds anomalies on the AIT image; passengers may be patted down while standing at their assigned spot (See Figure 3, Location G). Those passengers requesting or requiring a private pat down may be relocated to a private screening location.
  • 13. 5 Line Check Document Baggage to X-RAY TSO Baggage Screening Primary AIT Screening T1 Line TSO PAX T0 T2 AIT T3 WTMD T8 T5 Passenger B T4 Bench Recomposure UpUp PAX Pat Down ETD Secondary AIT Screening ETD T5 PAX TSO TSO TSO T3 Entrance PAX T2 T8 Bench TSO TSO Remote AIT Imaging Room T6 T6 T7 A B C D E F G H TSO TSO Figure 3. Primary AIT checkpoint layout
  • 14. 6 3.2 Proposed System and Concepts Under Development A future integrated checkpoint display concept (see Figure 4) will be generated and reviewed with the TSA during stakeholder meetings. Figure 4 represents a high level conceptual illustration of the functional components of the integrated display. Specific design alternatives will be developed during brainstorming sessions based on field observations and input from subject matter experts. Design considerations will include: • Passenger Privacy • Physical configuration/layout of the checkpoint • User Interface of the Integrated Display • Manning of stations at the checkpoint • Operator tasks and communication • Dynamics of passenger flow Figure 4. Conceptual illustration of an integrated display After brainstorming initial high-level designs, candidate approaches will be analyzed and compared to the current task flows and timing of tasks. Design parameters may include changes in manning of stations at the checkpoint and the passenger flow guided by control points or methods. The result of the design analysis will be a preliminary high-level task flow and user interface design(s).
  • 15. 7 3.2.1 Automated Target Recognition Automated Target Recognition (ATR) software has continued to improve and evolve and represents a technology that can potentially reduce checkpoint manpower requirements. Proposed design solutions will be examined using ATR technology in place of an Image Operator. Since ATR technologies may result in a higher alarm rate than the current human operator, the design challenge will be to generate solutions that will provide technological CONOPS and SOPs with False Alarm (FA) rates consistent with maintaining acceptable checkpoint through-put rates. 3.3 Task Flow Analysis Following field observations and subsequent study of the tasks involved, a task flow will be established to understand the operator’s constraints and limits. For example, the task of processing an image and rendering a decision regarding the presence of a threat might be decomposed into subtasks, each with a typical duration, likelihood of occurrence, entry/exit criteria, and dependencies on other subtasks. This analysis will clarify the variables impacting throughput and where bottlenecks may occur. The goal will be to understand and document what separate tasks should be performed in what order, and under what conditions, by the operator. Baseline information about how current tasks are accomplished helps to determine how future tasks may be redefined within new system implementations. Analysis software will be used to build and run a simulation model of tasks and task networks. Task definitions, flow, and times collected during the site visit to a TSA Checkpoint will be used to build the simulation. Multiple simulation runs will yield metrics of times and task branching and will represent a quantitative model of the baseline (existing) system. Once the context of use is understood and established by examining the existing CONOPS, SOPs, and task flow analysis, the next phase of specifying user requirements for the new implementation will begin. A second simulation model will be constructed to represent a proposed task network of alternative designs as work progresses towards a final recommended integrated display design. Simulation model runs will yield metrics that will be compared to the baseline model in order to provide a preliminary comparison of task completion times and branching. 4. Specify New User and Organizational CONOPS Introducing new technology, technology integration, or operator interfaces will often necessitate new procedures and tasks. Definition of both user procedures and organizational CONOPS will need to describe how the new implementation should work. An integrated display design will therefore be co-developed with a proposed CONOPS specific to that new technology integration. The work performed within this activity is illustrated in Figure 5.
  • 16. 8 4.1 User Interface and new screening CONOPS There are several different possible definitions and uses of the term “CONOPS.” The main focus of this analysis will be the impact of the new technology integration and how that integration will necessitate modified screener staffing, technology layout, and passenger control procedures. There will be no suggested modification to the existing CONOPS that has no immediate or direct relationship to the integrated display. For example, document checking or recomposure will not be directly impacted by the user interface design of the integrated display. On the other hand, the functionality afforded to users via the integrated display may potentially affect communications between the AIT Screening Operator (SO) and AIT Image Operator (IO). If so, then CONOPS related to the operations controlled by the SO and IO may be impacted. Based on user scenarios, tasks, and proposed display design, a proposed new CONOPS and user interface design will define the following: • System sensor information presentation • Organization of display and definition of display controls • Operator responses and inputs • Alarm resolution passenger-routing for each outcome (i.e. suspected threat diverts passenger to secondary screening) • Changes to passenger traffic and control methods used at the checkpoint Many of the initial design concepts and solutions will be generated through brainstorming sessions. These solutions will be further refined through the development of low fidelity prototypes. • Define Sensor Information • Define Display • Define Operator responses • Define Alarm Resolution for each outcome • Define Tasks of Operators • Define Passenger Control Methods Define New ConOps And Procedures • Comparative Task completion time and throughput • Accuracy of Communications for Anomaly location • Task interactions Map Task Functions to User Interface Design Specify New User and Organizational CONOPS Figure 5. Activities for generating user specifications
  • 17. 9 4.2 Human Factors Design Specifications Human Factors design specifications will be defined from Use Case Definitions and Task Flow Analysis. Design specifications will be mapped to specific tasks and associated metrics that may be multi-dimensional. For example, a single operator of the integrated display shall control passenger flow through three different sensors (AIT, WTMD and Shoe Scanning Device: SSD), resulting in a reduction in manpower requirements, equal or increased passenger through-put, and task completion times comparable to a non-integrated checkpoint with a dedicated operator for each sensor. The final set of Human Factors design specifications will be validated with stakeholders to ensure that the right set of features is being developed. 5. Produce Design Solutions Context and task analysis work, performed at the earlier stages of the overall effort, is used in this stage to explore and mature new designs. Candidate design solutions are evolved from early “brainstorming” sessions into low fidelity prototypes and associated task definitions and structures. Designs are matured through the analysis and design activities that drive the Integrated Display Design, illustrated in Figure 6. 5.1 Iterative Prototyping Design solutions and associated user interface specifications will be developed iteratively into a prototype (see Figure 6). Working prototypes will be built and evaluated on a cycle of several days or a few weeks. This incremental and iterative process of prototype development will allow for feedback through: a) expert evaluations, b) pilot testing and c) informal analysis and testing. Testing of prototypes and simulations will involve expert evaluations and small pilot testing based on the availability of Transportation Security Officers (TSOs).
  • 18. 10 Figure 6. Iterative design process for the Integrated Display Design Each iteration cycle will be used to design, build, and review both Integrated Display design alternatives and associated CONOPS. Early paper prototypes will allow stakeholders and other key team members to participate in a design-evaluation cycle. Paper prototypes will lead to simple static screen mockups and simple dynamic software simulations. Feedback from evaluators on the team and from end users at the TSA will drive changes to the prototypes. The Integrated Checkpoint project places the iterative development of high fidelity prototypes and usability testing in a second phase separate from the initial low fidelity prototyping activities. Since access to end users (TSOs) is limited, testing prototypes with both TSOs and pseudo-passengers together may be required. Therefore, the test bed will simulate the flow of passengers during the high fidelity prototype testing of the user interface. A full-scale airport security checkpoint mockup will form the “test bed” of Integrated Display design and evaluation work.
  • 19. 11 6. Evaluate Design against user requirements 6.1 Human Factors Performance Testing Once a high fidelity prototype operating in a simulated environment (passenger flow) is developed, screening operators will be used to test the user interface. The design will be evaluated by basic measurements such as the speed and accuracy of executing tasks by operators. Key metrics such as throughput will be calculated or projected based on testing results, and will then be fed into the simulation analysis tool. A test plan will be developed to structure and guide the performance testing activity. 6.2 Design Artifacts Software tools will be used to generate Unified Modeling Language (UML) diagrams from the prototype source code. UML is a widely accepted method to specify and document object- oriented software systems. UML offers a standard way to visualize a system's architecture, including elements such as: • actors • business processes • (logical) components • activities • programming language statements • database schemas, and • reusable software components A final draft of a Human Factors Design Specifications document will provide the specifics of a final recommended integrated design: • Static GUI & UI specifications of the display design, • Task definitions and logical flows to define the specific dynamics of the GUI and UI, • UML diagrams generated from source code of the high-fidelity simulation, and • Screen snapshots of the high-fidelity simulation. 7. Reference ISO 13407: Human Centred Design Process for Interactive Systems. 1999.