The document discusses test administrator error in biometric data collection. It notes that test administrator error is not currently included in the Human-Biometric Sensor Interaction model. The literature review found that test administrator training and performance metrics are needed to reduce errors and ensure data quality. The methodology section outlines a plan to identify sources of error, conduct test administrator surveys and focus groups, and implement procedure improvements to reduce errors in a biometric study.
The Role of Test Administrator and Error (proposal, 2012)
1. BIOMETRICS LAB
Biometric Standards, Performance and Assurance Laboratory
Department of Technology, Leadership and Innovation
THE ROLE OF TEST ADMINISTRATOR AND ERROR
MICHAEL BROCKLY
MARCH 6, 2013
2. STATEMENT OF THE PROBLEM
• Test administrator error is not currently
included in the Human-Biometric Sensor
Interaction model, thereby potentially
attributing data collection errors to the
wrong metric
3. SIGNIFICANCE
• The test administrator has been ignored
in the Human-Biometric Sensor
Interaction (HBSI) model
• A portion of biometric data collection
error is due to the test administrator
• Test methodology needs to take test
administrator errors into account
• Taking additional performance issues
into account will help to meet the criteria
of data collection best practices
4. REVIEW OF LITERATURE
5. QUALITY OF BIOMETRIC DATA
• “Data quality [is] one of the most
important factors in the effectiveness of a
biometric system” (Hicklin & Khanna, 2006)
• “Poor data quality is responsible for
many or even most matching errors in
biometric systems” (Hicklin &
Khanna, 2006)
6. QUALITY OF METADATA
• Very important in biometric data
collections
• Connects biometric sample with the
variables that affect the sample
• Examples include:
– Gender
– Fingerprint characteristics such as moisture
– Number of attempts needed
7. TEST ADMINISTRATOR
• Critical to the biometric acquisition
process
• Takes various roles in data collection
• Used to reduce the amount of poor
quality data in a system
9. BIOMETRIC PERFORMANCE
• Many factors affect system performance
• Human factors and usability
• Studies have shown that the subject has
a direct impact on the performance of the
system
11. TEST ADMINISTRATOR ERROR
• Can occur in biometric data and in
metadata
• Adversely affects the quality of biometric
data
• Literature has documented the need for
test administrator performance metrics
(Hicklin & Khanna, 2006)
12. TRAINING
• One method to reduce test administrator
error
• Prevent poor quality from the source
• Adhere to ISO 17025
– Internal auditing checklist
13. QUALITIES OF THE TEST ADMINISTRATOR
• Knowledge
– Understanding of the test
– To follow correct procedures
• Leadership
– To instruct the test subjects
– To provide assistance if necessary
14. WORKLOAD
• Test administrators will have multiple
responsibilities
• Workload needs to be balanced
• Use automation when possible
– Reduce unwanted workload
– Prevent mental calculations
15. FATIGUE
• Fatigue, stress and distractions will affect
test administrator performance
• The ability to maintain vigilance and
attention declines over time (Graves et al., 2011)
16. STRESS
• Additional errors and quality problems
increase with test administrator workload
and stress (Hicklin & Khanna, 2006)
• Throughput times
– Time constraints
17. DESIGNING THE DATA COLLECTION
• System is designed to provide
functionality along with ease of use
• Cognitively engineered system
• Usability testing
18. SYSTEM EASE OF USE
• Well-made Graphical User Interface
(GUI)
– Free of extraneous information
• Ease of use for both test administrator
and subject
20. IMPACT ON THE SYSTEM
• Errors carry associated costs
• If errors remain unresolved they can
jeopardize data quality
• Impact on HBSI
21. SUMMARY OF RELATED WORK
• Literature has mentioned the need for a
test administrator (Graves et al., 2011)
(Theofanos et al., 2007)
• There is a need for test administrator
performance metrics
• The test administrator is not included in
the HBSI model
29. EXPERIMENTAL SETUP
• Survey data is used to establish the
significance of the project
• Data from DHS Aging Study visit 1 is
analyzed
• System changes are put into effect for
DHS Aging Study visit 2
30. PROCEDURE IMPROVEMENTS
• Based on test administrator error
frequencies
• Recommendations from literature and
test administrator surveys
• Improvements in:
– Consent (Demographic)
– Driver’s License Capture (Demographic)
– Fingerprint Statistics Capture (Metadata)
– Face Capture (Biometric data)
31. CONSENT
• Creating electronic consent form
• Eliminates need for paper documents
• Documents signed electronically
• Records saved to database
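The electronic consent workflow above can be sketched as a small script. This is a minimal sketch, not the study's actual system: the table name, columns, and use of `sqlite3` as a stand-in for the study database are all assumptions for illustration.

```python
# Hypothetical sketch: store a signed electronic consent form in a
# database, replacing paper documents. Schema is illustrative only.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # stand-in for the study database
conn.execute("""CREATE TABLE consent (
    subject_id TEXT PRIMARY KEY,
    signed_at  TEXT NOT NULL,
    signature  TEXT NOT NULL)""")

def record_consent(subject_id: str, signature: str) -> None:
    """Save a signed consent record with a UTC timestamp."""
    conn.execute("INSERT INTO consent VALUES (?, ?, ?)",
                 (subject_id, datetime.now(timezone.utc).isoformat(), signature))
    conn.commit()

record_consent("S001", "Jane Doe")
```

Storing consent directly in the database removes the transcription step where paper forms are a known source of administrator error.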
32. DRIVER’S LICENSE
• Introduce a procedure to check and
enter data directly into the database
• Subjects with missing or incorrect data
are automatically flagged for verification
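The automatic flagging rule above might look like the following sketch. The field names and date format are hypothetical, chosen only to illustrate the check-and-flag procedure, not taken from the study database.

```python
# Hypothetical sketch: flag a subject's driver's license record for
# administrator verification when fields are missing or malformed.
import re

REQUIRED_FIELDS = ["license_number", "name", "date_of_birth", "expiration"]
DATE_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # assumed ISO 8601 dates

def flag_for_verification(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = [f"missing: {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    for field in ("date_of_birth", "expiration"):
        value = record.get(field)
        if value and not DATE_PATTERN.match(value):
            problems.append(f"malformed date: {field}")
    return problems

record = {"license_number": "D123", "name": "Jane Doe",
          "date_of_birth": "1985-03-06", "expiration": "03/2015"}
print(flag_for_verification(record))  # prints ['malformed date: expiration']
```

A non-empty result would queue the subject for the verification step described above rather than silently entering bad data.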
33. FINGERPRINT STATISTICS
• Introduce procedure to enter data
directly into the database
– Mandatory that all fields are entered
• Corrected method for collecting oiliness
(sebum)
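The "all fields mandatory" rule for fingerprint statistics entry can be enforced at submission time, as in this sketch. The field list (moisture, sebum, temperature, attempts) is an illustrative assumption, not the study's actual schema.

```python
# Hypothetical sketch: reject a fingerprint-statistics entry until every
# mandatory field has a value, naming the blank fields for the
# test administrator. Field names are illustrative.
MANDATORY = ("moisture", "sebum", "temperature", "attempts")

def validate_entry(entry: dict) -> None:
    """Raise ValueError listing every blank mandatory field."""
    empty = [f for f in MANDATORY if entry.get(f) in (None, "")]
    if empty:
        raise ValueError(f"mandatory fields left blank: {', '.join(empty)}")

# A complete entry passes silently; an incomplete one is rejected.
validate_entry({"moisture": 0.42, "sebum": 0.10,
                "temperature": 33.1, "attempts": 2})
```

Failing fast at entry time prevents the metadata gaps that otherwise surface only during post-collection analysis.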
34. FACE COLLECTION
• Create standardized camera settings
• Correct the test administrator’s challenge
of checking an external portrait template
for a standard distance
– Integrated portrait template on the device
itself
35. AFTER APPROVAL
• Put all system changes into effect
• Collect data in visit 2
• Analyze data for old and new errors
• Conduct post-collection survey for test
administrators
• Recommend further changes if
necessary
37. REFERENCES
• Braun, D. (1998). The role of funding agencies in the cognitive development of
science. Research Policy, 27(8), 807–821. doi:10.1016/S0048-7333(98)00092-4
• Campbell, J., & Madden, M. (2009). ILO Seafarers’ Identity Documents Biometric
Interoperability Test (ISBIT-4) Report. ILO (Vol. 2003, pp. 1–162)
• Database. (n.d.). Merriam-Webster dictionary. Retrieved from http://www.merriam-
webster.com/dictionary/database
• Druckman, J. N., Green, D. P., Kuklinski, J. H., & Lupia, A. (2011).
Cambridge Handbook of Experimental Political Science. Cambridge University
Press.
• Dumas, J., & Loring, B. (2008). Moderating Usability Tests. Elsevier.
ISBN 978-0-12-373933-9
• Elliott, S., Kukula, E., & Modi, S. (2007). Issues Involving the Human Biometric
Sensor Interface. In S. Yanushkevich, P. Wang, M. Gavrilova & S. Srihari
(Eds.), Image Pattern Recognition: Synthesis and Analysis in Biometrics (Vol.
67, pp. 339-363). Singapore: World Scientific
• Elliott, S. J., & Kukula, E. P. (2010). A Definitional Framework for the Human-
Biometric Sensor Interaction Model. doi:10.1117/12.850595
• Ernst, A., Jiang, H., Krishnamoorthy, M., & Sier, D. (2004). Staff scheduling and
rostering: A review of applications, methods and models. European Journal of
Operational Research, 153(1), 3–27. doi:10.1016/S0377-2217(03)00095-X
38. REFERENCES
• Hicklin, A., & Khanna, R. (2006). The Role of Data Quality in Biometric Systems.
White Paper. Mitretek Systems (February 2006), 1–77. Retrieved from
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.110.4351&rep=rep1
&type=pdf
• International Ergonomics Association (IEA). (2006). The Discipline of Ergonomics.
Retrieved February 23, 2011 from
http://www.iea.cc/01_what/What%20is%20Ergonomics.html
• International Organization for Standardization (ISO). (2005). Biometric
Performance Testing and Reporting – Part 1: Principles and Framework. ISO/IEC
FCD 19795-1
• International Organization for Standardization (ISO). (2006). Software engineering – Software
product Quality Requirements and Evaluation (SQuaRE) – Common Industry
Format (CIF) for usability test reports (No. ISO/IEC 25062:2006(E)). Geneva:
ISO/IEC.
• International Organization for Standardization (ISO). (2010). Information
processing systems – Vocabulary – Part 37: Harmonized Biometric Vocabulary.
ISO/IEC 2382-37
• International Organization for Standardization (ISO). (2011). Information
technology – Biometric performance testing and reporting – Part 6: Testing
methodologies for operational evaluation. ISO/IEC FCD 19795-6.2
39. REFERENCES
• Kushniruk, A. W., Patel, V. L., & Cimino, J. J. (1997). Usability testing in medical
informatics: cognitive approaches to evaluation of information systems and user
interfaces. Proceedings : a conference of the American Medical Informatics
Association / ... AMIA Annual Fall Symposium. AMIA Fall Symposium, 218–22.
Retrieved from
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2233486&tool=pmcentre
z&rendertype=abstract
• Kukula, E., & Elliott, S. (2006). Implementing Ergonomic Principles in a Biometric
System: A Look at the Human Biometric Sensor Interaction (HBSI). Proceedings
40th Annual 2006 International Carnahan Conference on Security Technology (pp.
86–91). Lexington, KY: IEEE. doi:10.1109/CCST.2006.313434
• Kukula, E. P., & Elliott, S. J. (2009). Ergonomic Design for Biometric Systems.
Encyclopedia of Biometrics.
• Kukula, E., & Proctor, R. (2009). Human-Biometric Sensor Interaction: Impact of
Training on Biometric System and User Performance. In M. J. Smith & G.
Salvendy (Eds.), Human Interface, Part II, HCII 2009 (pp. 168–177). Berlin /
Heidelberg: Springer. doi:10.1007/978-3-642-02559-4_19
• Mansfield, T., Kelly, G., David, C., & Jan, K. (2001). Biometric Product Testing
Final Report (pp. 1–22). Teddington. Retrieved from
http://www.lgiris.com/download/brochure/uk_report.pdf
40. REFERENCES
• Murata, A., & Iwase, H. (1998). Effectiveness of cognitively engineered human
interface design, 20(5), 7–10. doi:0-7803-5164-9/98
• Norman, D. A. (1986). Cognitive engineering. In D.A. Norman & S.W. Draper
(Eds.), User centered system design. Hillsdale, NJ: Erlbaum.
• Plan For Biometric Qualified Product List (QPL). (2005).
• Redman, T. C. (1998). Poor Data Quality on the Typical Enterprise.
Communications of the ACM, 41(2), 79–82.
• Ruthruff, E. (1996). A test of the deadline model for speed-accuracy tradeoffs.
Perception & Psychophysics, 58(1), 56–64.
• Sekaran, U. (2003) Research methods for business: A skill building approach.
• Senjaya, Benny. M.S., Purdue University, December 2010. The Impact of
Instructional Training Methods on the Biometric Data Collection Agent. Major
Professor: Stephen Elliott.
• Theofanos, M., Stanton, B., Micheals, R., & Orandi, S. (2007). Biometric
Systematic Uncertainty and the User. IEEE Conference on Biometrics:
Theory, Applications and Systems (pp. 1–6). doi:978-1-4244-1597-7/07
41. REFERENCES
• Wayman, J. (1997). A generalized biometric identification system model.
Conference Record or the Thirty-First Asilomar Conference on Signals, Systems
and Computers, 1, 291-295. Pacific Grove, California: IEEE.
doi:10.1109/ACSSC.1997.6802
• Wickens, C. D., Lee, J. D., Liu, Y., & Gordon-Becker, S. E. (2004). An Introduction
to Human Factors Engineering. 2nd Edition, Prentice Hall, Upper Saddle
River, NJ.