Copyright
by
Douglas Orellana
2018
EXTENDING SYSTEMS ARCHITECTING FOR HUMAN
CONSIDERATIONS THROUGH MODEL-BASED SYSTEMS
ENGINEERING
by
Douglas W. Orellana
M.S. Systems Engineering, Johns Hopkins University
B.S. Electrical Engineering, Johns Hopkins University
Dissertation
Presented to the Faculty of the Graduate School of
The University of Southern California,
Viterbi School of Engineering
Department of Astronautical Engineering,
Systems Architecting and Engineering Program
in Partial Fulfillment
of the Requirements
for the Degree of
Doctor of Philosophy in Astronautical Engineering
The University of Southern California
May 2018
ABSTRACT
From early hieroglyphs to modern-day alphabets, humans have communicated with one another using combinations of symbols. As groups gathered and began using the same symbols, formal languages developed within cultural boundaries. What was common to each group and language was the set of building blocks that allowed people to express themselves and communicate with each other. Today, engineers have developed their own vocabulary and symbols to communicate with one another.
The vocabulary and symbols are used in models and documents to represent a
system under development. As systems have evolved into more complex entities, the
need to increase and formalize modeling semantics has garnered greater importance.
When system architects and engineers saw the power of the Object Management Group’s
(OMG) Unified Modeling Language (UML) within the software engineering community,
they began to use UML for system development. When creating descriptive system
models with UML, the system engineering community recognized a gap in UML for
systems engineering. UML did not provide the necessary terminology that the system
community was accustomed to. In order to evolve the language, the system engineering
community decided to extend UML to meet their needs. Evolving UML with common
terminology frequently used within the system engineering community led to the creation
of the OMG System Modeling Language (SysML). Since its inception in 2007, SysML
has become the de facto language for system architects and engineers for descriptive
system models. Most of the research dedicated to system modeling has been focused on
upfront conceptual design and architecture in the traditional system engineering
discipline. As we move forward to integrate descriptive models into analytical models,
there is a key opportunity to integrate other viewpoints into the system model by
extending current semantics and adding other non-traditional systems engineering
disciplines.
Today, with the role of the human changing from operator of a system to an agent within the system (Madni, 2011a), and with the need for greater system adaptability, great
importance is being placed on system architects and engineers to integrate the human into
the systems, to facilitate the interactions between the human and machine, as well as the
interfaces between them. In order to do this, the human element needs to be taken into
account and appropriately modeled to support human-machine system tradeoffs. Current
systems engineering practices address human-system integration as an afterthought (i.e.
only after the system has been architected). In this situation, when changes to the system
accumulate, redesign costs can spiral out of control. The issue is that people not trained in
the human factors engineering discipline are unable to communicate with those that are,
due to differences in understanding human characteristics, terminology and language.
Even within the human system integration and human factors communities, there is no
general agreement on terminology and language. In order to better integrate humans into
systems, new semantics are needed to extend current system modeling semantics (e.g.
those associated with current model-based methods). The integration of the new
semantics will allow for human elements to be analyzed within a holistic view of the
system and the integration of the human element analysis at the system architecting
phase.
“We cannot seek achievement for ourselves and forget about progress and prosperity for
our community… Our ambitions must be broad enough to include the aspirations and
needs of others, for their sakes and for our own.”
Cesar Chavez
For all the immigrants and children of immigrants looking to find the American Dream,
Si Se Puede!
Acknowledgements
First and foremost I would like to thank my parents. Without their tireless
support, prayers, and love, I would not have been able to accomplish anything that I have
achieved thus far in my career. I would also like to thank my wife, without whose encouragement I would never have started this doctorate, let alone finished the final leg.
She is a true inspiration to me, and she is always pushing me to be the best that I can be.
To my daughter, I hope our efforts to finish our doctorates give you the inspiration to
dream big and not be afraid to take risks. Your heritage and your roots are a legacy that
you will carry with you for the rest of your life.
A special thanks to Dr. Azad Madni for his help and guidance during this doctoral process; I could not have asked for a better advisor, one who understood the delicate balance
between research and practice. He is a role model and I hope to one day achieve even 1%
of what he has done in my career. His help and dedication to the underrepresented
minorities will help change the face of engineering in a time when we will need diversity
of thought to solve the next big challenge.
There are two professors, Dr. Ilene Busch-Vishniac and Dr. James West, who started me on this path of academic research. Words cannot describe how thankful I am
for the life changing opportunity they gave me while at Hopkins. As an undergraduate
research assistant, they provided the guidance and inspiration to dream big, and to tell
myself I could do it even if I did not see others like me. They have been a constant
inspiration to continuously break barriers, to make it easier for the next generation, and
hopefully I am doing them proud.
Lastly, I would like to thank my committee Dr. Daniel Erwin and Dr. James
Moore for their support in my final push to complete this process.
Table of Contents
Acknowledgements ..................................................................................................6
List of Tables .........................................................................................................12
List of Figures ........................................................................................................13
NOMENCLATURE ..............................................................................................17
Chapter 1: Introduction .........................................................................................19
1.1 Research Motivation and Scope..............................................................19
1.2 Summary of Research .............................................................................20
1.2.1 Research Questions .....................................................................20
1.2.2 Research Contributions ...............................................................21
1.3 Structure of Dissertation .........................................................................22
Chapter 2: Literature Review ................................................................................24
2.1 Terminology ............................................................................................24
2.2 Characterizing Human Capabilities, Limitations and Challenges ..........26
2.2.1 Human Performance ...................................................................26
2.2.2 Human Error ...............................................................................27
2.2.3 User Acceptance .........................................................................29
2.2.4 Risk Perception and Behavior .....................................................30
2.2.5 Decision Making .........................................................................30
2.2.6 Human Adaptability ....................................................................32
2.3 Characterizing Human System Interactions, Interfaces, and
Integration Analysis Challenges ..........................................................32
2.4 Traditional System Engineering Process ................................................37
2.5 The Human Centered System Engineering Process................................43
2.6 Model-Based Systems Engineering ........................................................45
2.7 Human Centered Model-Based System Engineering .............................49
2.8 Gap Identification ...................................................................................60
Chapter 3: Research Methodology........................................................................62
3.1 Research Question and Hypotheses ........................................................62
3.1.1 Research Question and Hypothesis .............................................62
3.2 Research Approach and Design ..............................................................63
3.2.1 Method and Process Development..............................................63
3.2.2 Tool and Model Based Environment Development and
Implementation ...........................................................................67
3.2.2.1 Architectural Models ......................................................67
3.2.2.2 Analytical Models ...........................................................69
3.2.2.3 HSI Integration Tool .......................................................69
3.2.3 Case Study Illustration ................................................................70
3.3 HSI Ontology and Human Centered MBSE Methodology, Process
and Tool Validation .............................................................................71
Chapter 4: Human System Integration Ontology ..................................................73
4.1 HSI Ontology: What is It? ......................................................................74
4.1.1 Mechanisms ................................................................................76
4.1.2 Requirements ..............................................................................77
4.1.3 Human Agent ..............................................................................78
4.1.4 Behavior ......................................................................................78
4.1.5 Structure ......................................................................................79
4.1.6 Parametrics ..................................................................................80
4.2 Detailed HSI Ontology ...........................................................................80
4.2.1 HSI Ontology – Mechanisms Branch .........................................80
4.2.1.1 Governance .....................................................................81
4.2.1.2 Human Centered Architecture Procedure .......................81
4.2.1.3 Human System Verification ............................................83
4.2.1.4 Limitations ......................................................................83
4.2.2 HSI Ontology – Requirement Branch .........................................84
4.2.2.1 Human System Interaction Requirement ........................85
4.2.2.2 Human System Interaction Performance Requirement ...85
4.2.2.3 Human System Interface Requirements ..........................85
4.2.2.4 Human Agent Definition.................................................86
4.2.2.5 Training Requirement .....................................................86
4.2.3 HSI Ontology – Human Agent Branch .......................................87
4.2.3.1 Human Agent ..................................................................87
4.2.3.2 Role .................................................................................88
4.2.3.3 Competencies ..................................................................89
4.2.3.4 Anthropometric Features ................................................89
4.2.3.5 Resources ........................................................................89
4.2.3.6 Stressors ..........................................................................89
4.2.3.7 Workload.........................................................................90
4.2.3.8 Workload Management Strategy ....................................90
4.2.4 HSI Ontology – Behavior Branch ...............................................91
4.2.4.1 Storyboards .....................................................................92
4.2.4.2 Human Task Network .....................................................92
4.2.4.3 Human Agent Function ...................................................92
4.2.4.4 Human Agent Task .........................................................92
4.2.4.5 Component Maintenance Task .......................................93
4.2.4.6 Human Agent Decision ...................................................93
4.2.4.7 Human Interface Interaction Diagram ............................94
4.2.4.8 Visual Operation .............................................................94
4.2.4.9 Motor Operation..............................................................94
4.2.4.10 Auditory Operation .......................................................94
4.2.4.11 Speech Operation ..........................................................95
4.2.4.12 Cognitive Operation ......................................................95
4.2.4.13 Human Agent State Machine ........................................95
4.2.4.14 Stressors ........................................................................95
4.2.4.15 Workload.......................................................................95
4.2.5 HSI Ontology – Structure Branch ...............................................96
4.2.5.1 Human System Interface .................................................97
4.2.5.2 Human System Interface Mock Up.................................97
4.2.6 HSI Ontology – Parametrics Branch...........................................98
4.2.6.1 Human System Interface Analysis ..................................98
4.2.6.2 Workload Analysis..........................................................98
4.2.6.3 Cognitive Analysis ..........................................................99
4.3 HSI Ontology Reasoning and Metrics ....................................................99
Chapter 5: Implementing a Human Centered Model-Based System
Architecting and Engineering Environment ...............................................101
5.1 The Human System Integration Profile ................................................102
5.2 Integrating Descriptive Models with Analytical Models ......................107
Chapter 6: Human Centered Model Based Systems Engineering Case Study ...110
6.1 Case Study System Overview ...............................................................110
6.2 Architecture and Analysis .....................................................................110
6.2.1 The Concept of Operations .......................................................110
6.2.2 The Operational Concept and Systems Architecture ................113
6.2.3 The Human System Integration Analysis .................................120
6.3 Human Centered Model Based Systems Engineering Results .............126
6.4 Case Study Summary ............................................................................131
Chapter 7: Summary and Future Research Implications ....................................132
7.1 Summary & Research Implications ......................................................132
7.2 Research Contributions .........................................................................135
7.3 Summary of Accomplishments ...............................................................136
7.4 Future Research ....................................................................................137
References ............................................................................................................140
Appendix A: Geospatial Intelligence System Architecture ................................151
A.1 GEOINT System Architecture (Manual) .............................................151
A.2 GEOINT System Architecture (Changes with AutoAssist) .................163
Appendix B: Human System Integration Analysis ..............................................164
B.1 GEOINT Human System Architecture ................................................164
B.2 GEOINT Human-System Analysis (Manual) ......................................171
B.3 GEOINT Human-System Analysis (Changes with AutoAssist) ..........191
List of Tables
Table 4-1. HSI Ontology Metrics ........................................................................100
Table 5-1. Human System Integration Concepts Extensions...............................105
Table 5-2. IMPRINT-Magic Draw Metamodel Mapping....................................108
Table 6-1. System Function to Operational Traceability Matrix .........................116
Table 6-2. Workload Overload Task Drivers ......................................................130
Table B-1. GEOINT System Primary Image Analyst Workload Trace ..............171
Table B-2. GEOINT System Primary Image Analyst Workload Detail ..............187
Table B-3. GEOINT System Primary Image Analyst Workload Trace with
Auto Assist ......................................................................................191
Table B-4. GEOINT System Primary Image Analyst Workload Detail with
Auto Assist ......................................................................................207
List of Figures
Figure 2-1. Evolution of Systems Engineering Process (International Council
on Systems Engineering, 2011; Systems Management College,
2001) .................................................................................................38
Figure 2-2. ANSI/EIA 632 Processes for Engineering Systems (American
National Standards Institute & Electronic Industries Alliance,
1998) .................................................................................................40
Figure 2-3. IEEE 1220 Application and Management of Systems Engineer
Process (Institute of Electrical and Electronics Engineers, 2005) ....41
Figure 2-4. ISO/IEC 15288:2008 Systems and Software Engineering: System
Life-cycle Processes (International Standards Organization et al.,
2008) .................................................................................................42
Figure 2-5. Systems Engineering Vee ...................................................................43
Figure 2-6. System Model (Wickens & Hollands, 1999) .....................................50
Figure 2-7. Human as a Subsystem (Chapanis, 1996) ...........................................51
Figure 2-8. The activities and ISO/IEC perspectives in Arnold et al. (2002)
Human System Model.......................................................................59
Figure 2-9. Gap Identification Matrix ....................................................................61
Figure 3-1. Extending MBSE for HSI ...................................................................65
Figure 3-2. Model Based Environment Implementation .......................................67
Figure 4-1. Ontology Domain Diagram. ................................................................74
Figure 4-2. HSI Top Level Ontology .....................................................................75
Figure 4-3. HSI Ontology – Mechanisms ..............................................................80
Figure 4-4. System Architecture Process ...............................................................81
Figure 4-5. HSI Ontology – Requirements ............................................................84
Figure 4-6. HSI Ontology – Human Agent ............................................................87
Figure 4-7. HSI Ontology –Behavior.....................................................................91
Figure 4-8. HSI Ontology – Structure....................................................................96
Figure 4-9. HSI Ontology – Parametrics ...............................................................98
Figure 5-1. Model Based Environment Implementation .....................................101
Figure 5-2. Human System Integration Profile Stereotypes ................................103
Figure 5-3. Human System Integration Profile Enumerated Data Types ............104
Figure 5-4. Human System Integration Profile Stereotype Customizations ........107
Figure 6-1. Perform Geospatial Intelligence Concept of Operations ..................112
Figure 6-2. Perform Counter Improvised Explosive Device Operations .............113
Figure 6-3. Monitor Direct Imagery Feeds OPSCON .........................................114
Figure 6-4. GEOINT Control Station Display Mock Up .....................................117
Figure 6-5. GEOINT UAV System Hierarchy ....................................................118
Figure 6-6. Human Requirements Traceability to System Model .......................118
Figure 6-7. Exploit Full Motion Video Imagery Task Network .........................119
Figure 6-8. Conduct Aerial Route Reconnaissance Task Network .....................120
Figure 6-9. Monitor Direct Imagery Feeds IMPRINT Mission ...........................121
Figure 6-10. Exploit Full Motion Video Imagery IMPRINT Task Network ......122
Figure 6-11. IMPRINT VACP Scale Values (Hunn & Heuckeroth, 2006).........123
Figure 6-12. Original Primary Analyst Workload Analysis Results (Hunn et
al., 2008) .........................................................................................124
Figure 6-13. Replicated Image Analyst Workload Analysis Results ..................125
Figure 6-14. Monitor Direct Imagery Feeds with Automated Assist Functions..127
Figure 6-15. Image Analyst Workload Analysis Results with Auto Assist
Functions .........................................................................................128
Figure 6-16. Architecture Workload Comparison ...............................................129
Figure A-1. Perform Geospatial Intelligence Concept of Operations ..................151
Figure A-2. Perform Counter Improvised Explosive Device Operations ............152
Figure A-3. Monitor Direct Imagery Feeds Operational Concept .......................153
Figure A-4. Exploit Full Motion Imagery............................................................154
Figure A-5. Maintain Voice and Chat with AVO ................................................154
Figure A-6. Direct AUV Sensor Employment .....................................................155
Figure A-7. Conduct Aerial Route Reconnaissance ............................................155
Figure A-8. Pick IMINT Sensor to Satisfy GEOINT ..........................................156
Figure A-9. ID Roadways on Imagery .................................................................156
Figure A-10. Plot Coordinates on a Map Image ..................................................157
Figure A-11. ID Unconventional Act on Imagery ...............................................157
Figure A-12. Provide Chip, Image, or Video Clip ...............................................158
Figure A-13. ID Vehicle Types on Imagery ........................................................158
Figure A-14. ID Roadways on Imagery ...............................................................159
Figure A-15. ID on Video and Object or Event ...................................................159
Figure A-16. Respond to Request for Imagery ....................................................159
Figure A-17. GEOINT System Hierarchy ..........................................................160
Figure A-18. Primary Image Analyst Definition .................................................160
Figure A-19. GEOINT Requirements ..................................................................161
Figure A-20. Primary Image Analyst GUI Mock Up ..........................................161
Figure A-21. Perform any Audio/Visual Capture VACP Sequence Diagram .....162
Figure A-22. Monitor Direct Imagery Feeds with Auto Assist ...........................163
Figure B-1. Monitor Direct Imagery Feeds IMPRINT Model ............................164
Figure B-2. Maintain Voice and Chat with AVO IMPRINT Model ...................164
Figure B-3. Exploit Full Motion Video IMPRINT Model ..................................165
Figure B-4. Conduct Aerial Route Reconnaissance IMPRINT Model ................166
Figure B-5. Pick IMINT Sensors to Satisfy GEOINT .........................................166
Figure B-6. ID Roadways on Imagery .................................................................167
Figure B-7. Plot Coordinates on a Map, Image or Geospatial .............................167
Figure B-8. ID Unconventional Act on Imagery .................................................168
Figure B-9. Provide Chip, Image or Video clip ...................................................168
Figure B-10. ID Vehicle Types on Imagery ........................................................169
Figure B-11. ID on Video an Object or Event .....................................................169
Figure B-12. Respond to Request for Imagery ....................................................169
Figure B-13. Direct AUV Sensor Employment ...................................................170
NOMENCLATURE
Acronyms and Abbreviations
ABS Automatic Braking System
ARL Army Research Laboratory
AUV Aerial Unmanned Vehicle
CPA Critical Path Analysis
DOD Department of Defense
DODAF Department of Defense Architectural Framework
GOMS Goals, Operators, Methods, and Selection Rules
HFACS Human Factor Analysis and Classification System
HLA High Level Architecture
HMI Human Machine Integration
HSI Human System Integration
HSI3 Human System Interactions, Interfaces, and Integration
HTA Hierarchical Task Analysis
HV Human View
IMPRINT Improved Performance Research Integration Tool
IDEF Integration Definitions
IEC International Electrotechnical Commission
ISO International Standards Organization
JASS Job Assessment Software System
MABA-MABA Men are better at, machines are better at
MBE Model Based Engineering
MBSE Model Based Systems Engineering
MDE Model Driven Engineering
MIDAS Man-Machine Integration Design and Analysis System
MODAF Ministry of Defence Architectural Framework
MOE Measure of Effectiveness
MOP Measure of Performance
NAF NATO Architectural Framework
NASA National Aeronautics and Space Administration
NATO North Atlantic Treaty Organization
OMG Object Management Group
OODA Observe, Orient, Decide, and Act
SAGAT Situation Awareness Global Assessment Technique
SART Situation Awareness Rating Technique
SE Systems Engineering
SEAPRINT System Engineering, Acquisition and Personnel Integration
SEP System Engineering Process
SMI Soldier-machine interface
SysML System Modeling Language
TACOM Tank-Automotive and Armaments Command
THERP Technique for Human Error Rate Prediction
TLX Task Load Index
UAV Unmanned Aerial Vehicle
UML Unified Modeling Language
UPDM Unified Profile for DODAF and MODAF
US United States of America
VTT Vetronics Technologies Test Bed
Chapter 1: Introduction
1.1 RESEARCH MOTIVATION AND SCOPE
Model-Based Systems Engineering (MBSE) is playing an increasingly important
role in the development of complex systems. Although engineers have used models as a way to describe aspects of reality for centuries, and humans use mental models on a daily basis, the use of descriptive and analytical models in systems architecting and systems engineering has only come to the forefront of the conversation within the last decade.
As systems have evolved into more complex entities, the need to increase and
formalize modeling semantics has garnered greater importance. When system architects
and engineers saw the power of the Object Management Group’s (OMG) Unified
Modeling Language (UML) within the software engineering community, they began to
use UML for system development. When creating descriptive system models with UML,
the system engineering community recognized a gap in UML for systems engineering.
UML did not provide the necessary terminology that the system community was
accustomed to. In order to evolve the language, the system engineering community
decided to extend UML to meet their needs. Evolving UML with common terminology
frequently used within the system engineering community led to the creation of the OMG
System Modeling Language (SysML). Since its inception in 2007, SysML has become
the de facto language for system architects and system engineers for descriptive system
models. Most of the research dedicated to system modeling has been focused on upfront
conceptual design and architecture in the traditional system engineering discipline. As we
move forward to integrate these descriptive models into analytical models, there is a key
opportunity to integrate other viewpoints into the system model by extending current
semantics and adding other non-traditional systems engineering disciplines.
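To make the extension mechanism concrete, the sketch below uses plain Python data structures (purely illustrative, not any modeling tool's API) to show the general idea behind such a language extension: new domain terminology is defined as stereotypes that specialize existing base-language metaclasses and add tagged properties. The specific classes and tags shown are illustrative assumptions, not the SysML specification itself.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a toy representation of how a profile layers new
# terminology (stereotypes) on top of an existing modeling language's metaclasses,
# in the spirit of SysML extending UML. Names and tags are assumptions.

@dataclass
class Metaclass:
    name: str                                   # the base-language element being reused

@dataclass
class Stereotype:
    name: str                                   # new domain term familiar to systems engineers
    extends: Metaclass                          # the base-language metaclass it specializes
    tags: dict = field(default_factory=dict)    # extra properties the domain needs

uml_class = Metaclass("Class")                  # reused UML metaclass
block = Stereotype("Block", extends=uml_class)
requirement = Stereotype("Requirement", extends=uml_class,
                         tags={"id": str, "text": str})

for st in (block, requirement):
    print(f"«{st.name}» extends UML {st.extends.name} with tags {list(st.tags)}")
```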
Today, with the role of the human changing from that of a system operator to that of an agent within the system (Madni, 2011a), and with the need for greater system adaptability, great importance is being placed on how to integrate the human into systems, how to facilitate human-machine interactions, and how to design human-machine interfaces. In order to do this, the human element needs to be taken into account and appropriately modeled to support human-machine tradeoff analyses. Current systems
engineering practices address human-system integration as an afterthought (i.e. after
architectures have been specified). In this situation, when changes to the system
accumulate, redesign costs can spiral out of control. The issue is that people not trained in
the human factors engineering discipline are unable to communicate with those who are, due to differences in understanding of the human considerations that affect total system design, terminology, and language. Even within the human system integration and human factors communities, there is no general agreement with respect to the latter. In order to better
integrate humans into systems, new semantics are needed to extend current system
engineering modeling semantics. The integration of the new semantics will allow for
human elements to be analyzed in the holistic view of the system, in particular during the
architecting phase of system development, when up to 70% of the systems costs will be
allocated (International Council on Systems Engineering, 2011).
1.2 SUMMARY OF RESEARCH
1.2.1 Research Questions
The research problem I address is the integration of human considerations into
the architecting process and the development of an MBSE extension to consider the
human element. The motivating real world example is the interaction between human
agents and unmanned aerial vehicles (UAV).
With the increasing use of UAVs, the need to manage cyber-social operations has multiplied in order to increase safety and balance workload between multiple human agents and the UAVs they are responsible for. Processes are constantly being questioned as to whether they should be automated or performed by the human agent in order to provide enough cognitive bandwidth to minimize the number of UAV pilots needed for operations. In order to succeed in this endeavor, human considerations will need to be
brought upfront in the development process to better understand the functional tradeoffs
between the human and the machine, as well as analyze for metrics to ensure better
integration between the human and the machine. My research hypothesis is that by
extending Model-Based Systems Engineering methods, processes, and tools through key
viewpoints and factors, human performance can be increased. These factors and
attributes can be used to improve the evaluation of interactions, interfaces, and
integration of the human element into complex systems.
1.2.2 Research Contributions
Motivated by the evolving human role, this research identifies a gap in MBSE methods, processes, and tools for supporting an integrative modeling environment to study human system interactions, interfaces, and integration during the system architecting process. The
goal of this research is to address this subject from a perspective of extending MBSE to
address human considerations during the architecting process. In particular, the research
looks at creating and validating a Human System Integration (HSI) ontology and using these semantics to assist in integrating multiple system models and viewpoints to explore and address HSI issues, in particular human workload, during systems architecting. It
adds modeling elements to current viewpoints and creates new viewpoints with new
modeling elements, attributes, and constructs that will allow system architects and
engineers to take descriptive models and flow system information into analytical
modeling techniques towards the human element viewpoint. As a result of this research,
the HSI ontology provides the meta-model building blocks to standardize a profile for
system engineering modeling languages and a starting point for further extensions and
tailoring to fit different technology domains.
1.3 STRUCTURE OF DISSERTATION
This research is concerned with establishing the foundation for extending model-
based systems engineering for HSI. Applying the resultant methods, processes, and tools
to an exemplar system will demonstrate the capabilities of the framework and research
contribution.
Chapter one has introduced the need for new semantics and provided the rationale
and motivation for advancing the research in model based systems engineering for human
system interactions, interfaces, and integration design and assessment.
Chapter two reviews the current literature and research findings in HSI, defines important terminology, surveys model-based systems engineering, current HSI modeling efforts, and HSI extension pathways, and identifies the gaps that this research aims to fill.
Chapter three outlines the key technical building blocks required for the research
approach. This chapter presents the research question and hypothesis for this research.
The approach describes how I designed the verification of the research and the steps to
complete the research. The research design goes into specifics about the target
application and how it helps verify the research.
Chapter four outlines and describes the HSI ontology and its elements. It defines each element and its usage in modeling human considerations in systems.
Chapter five outlines the implementation of the HSI ontology into an HSI profile and the model-based environment that was built.
Chapter six describes and analyzes the efforts to model a real world example and
how the new process, methodology, and tools have helped in the development of this
system model.
Chapter seven summarizes the research and highlights key findings and future research opportunities based on the findings from this research.
Chapter 2: Literature Review
The literature review for human system integration (HSI) research consists of
examining HSI methods from a systems architecting and engineering perspective, and
evaluating model-based systems engineering (MBSE) methods, processes, and tools from
an extension perspective. To this end, Chapter 2 identifies the gaps in current methods, processes, and tools that must be filled to enable human considerations through model-based systems engineering for architecting complex systems.
2.1 TERMINOLOGY
According to the Institute of Electrical and Electronics Engineers (IEEE) standard 15288, a system is “man-made, created and utilized to provide products and/or services
in defined environments for the benefit of users and other stakeholders. These systems
may be configured with one or more of the following system elements: hardware,
software, data, humans, processes (e.g., processes for providing service to users),
procedures (e.g., operator instructions), facilities, material and naturally occurring
entities” (International Standards Organization, International Electrotechnical
Commission, & Institute of Electrical and Electronics Engineers, 2008).
An architecture is a set of “fundamental concepts or properties of a system in its
environment embodied in its elements, relationships, and in the principles of its design
and evolution” (International Standards Organization, International Electrotechnical
Commission, & Institute of Electrical and Electronics Engineers, 2011). Consequently, a
system architecture can be defined through a multiple-architecture approach: functional
architecture, physical architecture, and operational architecture (Buede, 2000).
In constructing the multiple architecture descriptions, an architectural viewpoint (view) addresses one or more concerns from one or more of the system’s stakeholders.
Architectural Frameworks attempt to standardize notations for standard views to describe
the system architecture (functional, physical, or operational).
The architecture and views can be represented through models. “A model is any
incomplete representation of reality, an abstraction … [where a] question or set of
questions … can reliably [be] answer[ed]” (Buede, 2000). In developing models, a
simulation adds the dimension of time and non-linear dynamic behaviors in a virtual
environment, which can be supplemented through real-world asset integration. An environment here refers to the external entities and conditions at the boundary of the system scope.
The system and environment can be influenced by their interactions and interfaces.
Analysis refers to the interpretation of measures of effectiveness (MOE) and
measures of performance (MOP) gathered from the model data. The evaluation compares
the viability and practicality of MOEs and MOPs between candidate architectures.
Human System Integration (HSI) is a “technical and managerial concept … [that]
significantly and positively influence the complex relationships among: (1) people as
designers, customers, users, and repairers of technology; (2) government and industrial
organizations that regulate, acquire, design, manufacture, and/or operate technology; and
(3) methods and processes for design, production, and operation of systems and
equipment” (Booher, 2003). HSI methods and processes attempt to design for “normal”
human use, but they must also take into account possible misuse, unintended use, and
abuses of the system (Chapanis, 1996). Humans can be classified as a system user or
system agent. A system user “is outside the system and uses the system as a tool to
accomplish a goal” (Buie & Vallone, 1995). A system agent performs functions that
support both the agent’s and the system’s goals (Buie & Vallone, 1995).
2.2 CHARACTERIZING HUMAN CAPABILITIES, LIMITATIONS AND CHALLENGES
Humans have used tools and built systems to enhance human capabilities for ages.
In the beginning the systems were built to enhance and extend the physical performance
of humans. With the age of computers, the use of cybernetics has increased the
complexity of systems. Systems no longer just enhance and extend the physical
performance of humans, but now, are extending and enhancing the cognitive performance
of humans (Hollnagel & Woods, 1999). As this complexity has increased, humans have
transitioned from operator of the system to a system agent, making them integral to the
overall performance and optimization of systems. To truly optimize system performance,
human capabilities and limitations must be considered. If one tries to dehumanize the
human agent within the total system architecture, the system will lose the value of human
capabilities, and only maintain the limitations (Kurstedt, 2000a, 2000b, Madni, 2011a,
2012).
2.2.1 Human Performance
Human performance is affected by many factors: fatigue, boredom, stress,
attitudes, motivation, and personality (Chapanis, 1996). One of the challenges that this
presents is that each human has their own threshold levels for when these factors may
affect physical and cognitive performance. It is necessary to provide the correct level of
stimulation so the human operator or agent can be engaged in their role and functionality.
For example, the Yerkes-Dodson law (1908) defines the optimal regime of human
performance. Outside of the inverted U-curve, factors such as fatigue, boredom, stress,
and motivation can have an adverse impact on human performance. Also, architects and
engineers need to be aware of human recovery time, where humans can restore expended
physical and mental energy (Chapanis, 1996).
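For illustration only, the sketch below models the inverted-U relationship just described with a simple Gaussian curve; the functional form and the parameter values are assumptions chosen to make the shape visible, not an empirical human performance model.

```python
import math

# Toy inverted-U (Yerkes-Dodson-style) curve: performance peaks at a moderate
# arousal level and falls off when the human is under-stimulated or over-stressed.
# The Gaussian form and the parameters below are illustrative assumptions.

def performance(arousal: float, optimum: float = 0.5, width: float = 0.2) -> float:
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

for a in (0.1, 0.5, 0.9):   # under-stimulated, near-optimal, over-stressed
    print(f"arousal={a:.1f} -> relative performance={performance(a):.2f}")
```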
Under cognitive overload conditions, humans typically employ various cognitive strategies to relieve stress and fatigue: degrading task performance and shedding tasks
(Madni, 2010, 2011a). When humans receive information, the brain goes through an
attending process. This process filters out information that is not needed for current tasks
(Chapanis, 1996). Through the attending process humans internally develop a task
hierarchy where humans classify, organize, and prioritize the information received
(Sheridan & Ferrell, 1974). The prioritization causes less important tasks’ performance to
degrade due to the low priority of processing information needed for those tasks. If these
tasks are still impeding higher priority tasks from being degraded, in order to keep the
level of performance needed at the higher priority tasks, lower level tasks get shed.
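The following toy sketch illustrates only the shedding step of this behavior (the graceful-degradation step is omitted for brevity): tasks are handled in priority order until processing capacity is exhausted, and whatever remains is shed. Task names, priorities, demand values, and the capacity figure are all hypothetical.

```python
# Illustrative sketch of priority-driven task shedding under overload.
# All task names, priorities, demand values, and the capacity are hypothetical.

def triage(tasks, capacity):
    """tasks: iterable of (name, priority, demand); higher priority is handled first."""
    performed, shed = [], []
    remaining = capacity
    for name, priority, demand in sorted(tasks, key=lambda t: -t[1]):
        if demand <= remaining:
            performed.append(name)
            remaining -= demand
        else:
            shed.append(name)   # lower-priority work is dropped when capacity runs out
    return performed, shed

tasks = [("track target", 3, 5), ("log status", 1, 2), ("respond to chat", 2, 4)]
print(triage(tasks, capacity=9))
# -> (['track target', 'respond to chat'], ['log status'])
```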
For rarely occurring events, humans are unlikely to respond because they select outcomes by the degree to which they are represented in the evidence at hand (Kahneman & Tversky, 1973). For this reason, rare-event monitoring should be assigned to automation, with facilities to issue the necessary alerts and provide feedback so humans can intervene if needed.
2.2.2 Human Error
Human error has been attributed as a key factor to system failures (Three Mile
Island, Chernobyl, Metrolink train accident, etc.). Human error is hard to design for because failure is measured by both explicit and implicit criteria that are arbitrarily chosen in different contexts (Sheridan, 2008). Some believe that training is the key to
minimize human errors, while others look to the human system interface as the root of the
issue: not allocating resources and tasks appropriately between the system and the human
(Madni, 2010). In order to overcome weaknesses in training and in the human system
interface, systems need to be architected as expert consultation systems: systems that
either teach or assist in the system’s functions (Madni, 1988c). But even with
attempting to integrate HSI with traditional system architecting and engineering domains,
HSI domains are fragmented themselves; there are no standard metrics to measure human
performance, and there are no driving principles that can guide human performance
(Madni, 2010).
Human operators and agents, although usually trained to use or be part of the system, may lack full knowledge of the system’s ins and outs. This gap in knowledge of the system’s operations can contribute to system failures and errors (Madni, 2010, 2011a). The human agent’s or operator’s actions may not match the actions that the system requires of them. As seen in the Three Mile Island findings, one of the causes of failure was a deficiency in procedures that caused confusion and incorrect actions (Three Mile Island, 1979). Due to lack of knowledge of the system, operators did not know how to react to the conditions they were seeing. This lack of knowledge of the internal workings of the plant and the supporting systems led to events that would ultimately disrupt the plant. If the operators had had intimate knowledge of the system, they might have been able to prevent the Three Mile Island accident by counteracting early warnings that were disregarded by the procedures (Three Mile Island, 1979).
Stress and errors are usually tied to one another when humans explain why errors
were made. Humans naturally tend to make more errors under stress. Stressors can be
both physical and psychological factors that can degrade or influence information
processing and cognitive performance (Wickens & Hollands, 1999). As such, stressors must be managed when architecting systems with humans. Architects need to understand that human-allocated functionality will add certain stressors to the human, and must consider how those stressors may affect the human’s physical or psychological performance over time.
will introduce new requirements into the system for human required capabilities and
limitations. For example, in a very intensive task it might be advantageous to create a
requirement on the system that would require there to be two human agents, where the
agents alternate break times in order to not degrade performance of the system.
Even with full knowledge of the system and minimized stressors, system failures that have been attributed to human error truly come down to poor system architectures and designs rather than to the human reaction (Wickens & Hollands, 1999). It usually requires more than one breakdown in the system, and in some cases the perfect combination of system degradations, to cause a system failure. In the well-known example of the Metrolink train accident outside of Los Angeles, the collision occurred when the operator did not react quickly enough as the indicator lights changed; this failure mode was not considered in the architecture or design. If the system had been architected from the beginning with the recognition that humans have lapses and slips, countermeasures for possible fault occurrences could have been architected into the system, preventing a collision on the stretch of single-track line.
2.2.3 User Acceptance
As technology has progressed and has been made available to the masses, human trust has come to depend on the consistency of machine responses to the human (Madni, 2017). Today humans use their smartphones (iPhone, Android phones, etc.) as another part of their body, quite often having more intimate relationships with the device than with the people around them. This trust between humans and devices can make or break a system. Although the average person does not know the inner workings of their smartphone, they trust that it is safe and secure to use because of its familiarity to them (Madni, 2010, 2011a). This familiarity with phones and the technology integrated into them has improved the confidence of humans using these devices. As such, architects must be able to build trust through the transparency and consistency of the human system interface (Madni, 2015; Madni & Freedy, 1986). The simpler the design, the better, as overly complex designs can be rejected by those who do not understand them (Madni, 2010, 2011a).
2.2.4 Risk Perception and Behavior
Risk is evaluated through the probability of occurrence and the severity of
consequences if the event were to occur. Each person has their own tolerance for risks
(Madni, 2010, 2011a), although in certain scenarios humans tend to react in a similar
manner even though their thresholds may be different. Humans tend to predict not based on probabilities but on intuitive outcomes (Tversky & Kahneman, 1998), which affects the risk one is willing to take. In tests run by Slovic and Tversky (1974), students showed contradictions of the independence principle (the sure-thing principle): students did not pick the gamble that is inherently better according to the principle. This deviation from what would be considered the norm is why architects need to better understand human risk perception while architecting systems. On the other hand, if the consequence of an action cannot be perceived by a human, the uncertainty falls under the probabilities of the environment, to which the human can only compare (Wickens & Hollands, 1999).
2.2.5 Decision Making
In complex systems, human operators or agents must make vital decisions quickly
and under high stress. Humans gather information from the five senses. The stimulus is
then stored in one of three areas of memory: sensory memory, which is short-term storage received from the senses; working memory, which stores a limited amount of information for conscious thinking; and long-term memory, where information is stored permanently (Axelsson, 2002). When making decisions, due to our reduced ability to process all information in working memory, humans tend to simplify and to disregard or under-weigh factors that contribute to decisions, especially under stress (Axelsson, 2002; Madni, 2011a).
While stress is an environmental force that can alter a human’s decision-making process, humans are already internally at odds, balancing intuitive and analytic cognition (Hammond, Hamm, Grassia, & Pearson, 1987). Intuitive cognition is judgment; it is based on feeling or past experience. Analytical cognition is calculated; it is based on a normative approach. Because humans balance these modes of cognition, biases are formed when making decisions that could impair the outcome of the decision process. Madni (1988) and Vasarhelyi (1977) examined decision making through cognitive aspects of man-machine systems. Tversky and Kahneman (1998) present cognitive heuristics that affect system performance through the human decision-making process and that should be considered when architecting: anchoring, availability, representativeness, internal coherence, and consistency. Gerhardt-Powals (1996) presents ten cognitive principles to help architects design systems for human interactions. These heuristics and principles are needed because humans tend to make decisions based on the confidence they have in the information that is available, which can cause systematic errors (Tversky & Kahneman, 1998).
Even once a decision has been made, humans are more confident in their answers than they should be. Fischhoff, Slovic, and Lichtenstein (1977) set up experiments showing that people over-judge their confidence in their answers; in Experiment 3, subjects who estimated odds of 50:1 should have estimated 3:1, and subjects who estimated 1000:1 should have estimated 5:1. This overconfidence in decision making can lead humans to commit to actions that lead to failures. To combat this, the information needed to assess the decision-making process needs to be readily available, and the architect must plan processes and procedures for the human agent or operator to get the necessary training on the system and its operations.
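As a purely arithmetic illustration of the calibration gap reported above, the sketch below converts the stated and warranted odds into the implied probabilities of being correct.

```python
# Converting odds to implied probabilities for the Fischhoff, Slovic, and
# Lichtenstein (1977) calibration figures cited above. Odds of n:1 in favor of
# being correct imply a probability of n / (n + 1).

def odds_to_probability(odds: float) -> float:
    return odds / (odds + 1)

for stated, warranted in [(50, 3), (1000, 5)]:
    print(f"stated {stated}:1 (p={odds_to_probability(stated):.3f}) "
          f"vs. warranted {warranted}:1 (p={odds_to_probability(warranted):.3f})")
# stated 50:1 (p=0.980) vs. warranted 3:1 (p=0.750)
# stated 1000:1 (p=0.999) vs. warranted 5:1 (p=0.833)
```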
2.2.6 Human Adaptability
Although it seems that humans can accomplish many tasks at once, humans do
not multitask well (Madni, 2011a). If tasks require intensive processing, humans switch tasks in a sequential manner, multiplexing their processing capacity between the tasks, limited by a natural switching rate (Sheridan & Ferrell, 1974). The switching rate can become more burdensome as the task contexts become more distinct from one another
(Monsell, 2003). The power of a human is the ability to learn tasks; as tasks are learned,
the processing capacity grows, and the amount of multiplexing is reduced (Sheridan &
Ferrell, 1974). Although humans can adapt to new conditions with added stress, this
adaptability process is slow and is neither absolute nor perfect, and will constrain the
adaptability of systems (Madni, 2011a).
Humans bring a level of intelligence to systems that allows them to respond to unforeseen events in real time (Arnold, Earthy, & Sherwood-Jones, 2002; Madni, 2010, 2011a). This benefit of integrating humans into a system is great, but it comes with all the other capabilities and limitations humans can have. Humans adapt at a slow pace, and although they can problem-solve unforeseen events, they may not do so at the rate the system needs.
2.3 CHARACTERIZING HUMAN SYSTEM INTERACTIONS, INTERFACES, AND
INTEGRATION ANALYSIS CHALLENGES
“The longer one waits to implement HSI, the more negative impact will be shown on the total [lifecycle cost]” (U.S. Air Force, 2009). As with other aspects of system development, decisions made early in the development process drive the costs of fixing those decisions later on. The later decisions about the system are made, the more the costs to integrate the resulting changes increase, becoming up to one hundred times more expensive once the system is in production. Because architectural decisions and changes are cheapest during the conceptual design and architecture process, HSI must be considered during conceptual design and architecting and carried through system development in order to reduce costs and integration issues from the beginning. Anywhere from 40 to 60 percent of system lifecycle costs can be attributed to HSI (Hardman & Colombi, 2012). By addressing HSI up front in the development process, the system can be made more responsive to human capabilities and limitations.
HSI integrates the domains of manpower, personnel, training, survivability,
safety, occupational health, environment, habitability, and human factors engineering
(Booher, 2003). Throughout the HSI domains, there are no standardized semantics or
language that can be used throughout the community. When HSI is added into other aspects of system architecting and engineering, even less information is shared. ISO/IEC
15288 attempts to solve the discontinuity between systems architecting, engineering and
element design within specialty engineering disciplines, by providing a framework of
common system level thinking (Arnold et al., 2002; International Standards Organization
et al., 2008). EIA 632 has also been mapped to system engineering processes to better
help integrate the HSI processes (Ruault, 2004).
In examining HSI, there are questions that need to be answered in order to fully
comprehend the role of the human within the system (Chua & Feigh, 2011):
What should the human-system do?
How should the human-system achieve these goals?
Who should operate the system?
When and where might the system operate?
In order to answer these questions, HSI needs to be concerned with making the right
tradeoffs between HSI and other system aspects, whether the human-machine system
exhibits the desired behavior, whether any system software is perceived to be usable, and
whether the system tries to reduce the likelihood of human error and take advantage of
human capabilities (Hobbs, 2008; Horn & Barnaba, 1988; Madni, 2010; Milner &
Wheeler, 2001; Rizvi, Singh, & Bhardwaj, 2009).
HSI must become a core consideration within system architecting and engineering
for it to affect the system architecting process. By concurrently developing a system with the system engineering process (SEP) and tying in HSI methods, the human will become an
important component of the architecture and design (Buie & Vallone, 1995).
The need for more integration has been caused by the complexity of systems and
the amount of automation systems have been designed with. Automation is used to extend functions humans do not wish to perform or cannot perform (Parasuraman, Sheridan,
& Wickens, 2000). As technology keeps improving, automated systems are more
responsive to the humans they interact with (Dolan & Narkevicius, 2005). Managing
these interactions requires architects to decide the correct level of automation to minimize
human performance costs (Parasuraman et al., 2000). In order to choose the correct level
of automation and the desired functions for humans during the requirement analysis and
functional analysis, the architect must survey the functions and capabilities the system
must do and begin to allocate these functions between the system and the human. As the
allocation takes shape, and the system progresses through the development process,
system and human performance must be analyzed to ensure that decisions in the
allocation will not degrade the overall performance.
When allocating functions, architects must decide which functions are better
performed by humans and which by machines. Sheridan (2000) recommends that architects
go beyond the men-are-better-at–machines-are-better-at (MABA-MABA) lists (Fitts, 1951) to optimize
the functional partition. Due to advances in technology and layers of control and
command in systems, the solution is not as simple as MABA-MABA. Other tools that
could help with partitioning are Task Analysis (Hobbs, 2008; Polson, 1992), Cognitive
Function Analysis (Boy, 1998; Roth, Patterson, & Mumaw, 2002), Hierarchical Task
Analysis (HTA) (Shepherd, 1998), Goals, Operators, Methods, and Selection Rules
(GOMS) (Card, Moran, & Newell, 1983), and Critical Path Analysis (CPA) (Baber,
2004). After allocating functions between the system and human, it is important to relate
the functions in a greater context by not only viewing the information needed for the
functions, but also viewing how operations will be completed by diagramming the
interactions (McMaster & Baber, 2005; Meilich, 2007).
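For illustration, the following Python sketch shows one way such an allocation trade could be screened computationally; the criteria, weights, and scores are hypothetical and are not drawn from any of the methods cited above.

# Illustrative sketch: weighted screening of function allocation candidates.
# Criteria, weights, and scores are hypothetical placeholders.

CRITERIA_WEIGHTS = {          # relative importance of each allocation criterion
    "response_time": 0.3,
    "judgment_under_uncertainty": 0.3,
    "endurance": 0.2,
    "error_consequence": 0.2,
}

def allocation_score(scores: dict) -> float:
    """Weighted sum of criterion scores (0.0-1.0) for one candidate allocation."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Candidate allocations of a notional "track target" function.
human_alloc   = {"response_time": 0.4, "judgment_under_uncertainty": 0.9,
                 "endurance": 0.3, "error_consequence": 0.7}
machine_alloc = {"response_time": 0.9, "judgment_under_uncertainty": 0.4,
                 "endurance": 0.9, "error_consequence": 0.6}

for name, alloc in [("human", human_alloc), ("machine", machine_alloc)]:
    print(f"{name}: {allocation_score(alloc):.2f}")

In practice the criteria, weights, and scores would be derived from the task and cognitive analyses listed above rather than from fixed constants.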
In order to optimize a system and make it resistant and resilient to failures,
techniques are needed that examine the processes and tasks being performed for
inconsistencies and deviations (Cugola, Di Nitto, Fuggetta, & Ghezzi, 1996). Where
humans are concerned, they are not treated any differently than any other agent or
subsystem of the system. Techniques have been developed for identifying and reducing the
impact and likelihood of human errors that affect the system, such as the Technique for
Human Error Rate Prediction (THERP) (Swain, 1964).
Once functions have been allocated, analysis of human performance must follow.
To determine the effectiveness of the functional allocation, metrics and measures for
mental and physical workload, situational awareness, complacency and skill degradation
will need to be determined (Parasuraman & Riley, 1997; Parasuraman et al., 2000;
Parasuraman, Sheridan, & Wickens, 2008). Concept maps allow architects to analyze the
relationships of concepts to better understand how they all fit together (Novak & Cañas,
2006). The job assessment software system (JASS) allows the architect to design for forty
basic human abilities and aptitudes that underlie performance of tasks given to the human
operator or agent (Rossmeissl, Tillman, Rigg, & Best, 1983). NASA's Task Load Index
(TLX) allows architects to quantify mental and physical workload (Hart, 2006; Hart & Staveland,
1988).
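As an example of how such a measure is operationalized, the following Python sketch computes an overall NASA-TLX workload score as the weighted mean of the six subscale ratings, with weights taken from the standard fifteen pairwise comparisons; the ratings and comparison tallies shown are invented for illustration.

# Minimal sketch of a NASA-TLX overall workload computation.
# Ratings and pairwise-comparison tallies below are invented for illustration.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_workload(ratings: dict, weights: dict) -> float:
    """Weighted TLX score: ratings are 0-100, weights are tallies from the
    15 pairwise comparisons and therefore sum to 15."""
    assert sum(weights.values()) == 15, "pairwise-comparison tallies must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 80, "physical": 20, "temporal": 70,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 3, "frustration": 1}

print(f"Overall TLX workload: {tlx_workload(ratings, weights):.1f}")  # 0-100 scale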
Situational awareness is an area where architects need to control the number of
observe, orient, decide, and act (OODA) loops required to assess system operations. The
speed with which a human can move through the OODA loop for his or her tasks directly
affects system performance and can contribute to system failure. That speed correlates with
the workload being expended, which in turn consumes the human resources needed to execute the
OODA loops and perform tasks.
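A notional sketch of this timing argument is shown below; the phase durations and task time budgets are invented for illustration and simply flag tasks whose available time is shorter than one OODA cycle.

# Notional sketch: comparing an operator's OODA cycle time against the time
# available for each task; durations and tasks are invented for illustration.

ooda_phase_seconds = {"observe": 2.0, "orient": 3.5, "decide": 1.5, "act": 1.0}

def ooda_cycle_time(phases: dict) -> float:
    return sum(phases.values())

tasks = [("confirm target", 10.0), ("re-plan route", 6.0)]  # (task, seconds available)

for task, available in tasks:
    cycle = ooda_cycle_time(ooda_phase_seconds)
    status = "OK" if cycle <= available else "OVERLOADED"
    print(f"{task}: cycle {cycle:.1f}s vs available {available:.1f}s -> {status}")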
Even with optimizing the functional allocation, training can affect the human
system performance. Some believe that what cannot be accomplished through the development
process can be compensated for by a good training program (Madni, 2010; Madni, Madni, &
Garcia, 2005a, 2005b; Madni, Madni, Garcia, & Sorensen, 2005) or assisting the operator
or human agent through the design of aiding (Madni, 2011b). In the design phases,
ergonomics plays a key role, whether it concerns where a human sits or the displays with which
the human has to operate (Booher, 2003; Hartson & Hix, 1989). Then there is the
role of human agents interacting with other agents and trying to analyze the social and
emotional aspects of humans in regard to the system. Examining the interpersonal cycle
(Kurstedt, 2000a, 2000b) reveals the synergy between human agents and the relationships
among their thoughts, feelings, and actions.
Cognitive engineering attempts to integrate HSI aspects to obtain a holistic view of
the human within the system. The infusion of cognitive architectures into system architectures
begins to depict the cognitive building blocks (Madni, Sage, & Madni, 2005), leading to a total
cognitive system approach and a better understanding of the human-system solution (Roth et
al., 2002; Sharples et al., 2002; Wood & Roth, 1988).
As seen in this section, there are many methods, tools, processes, and facets of
HSI that can be analyzed. Unfortunately, none of these tools, methods, and processes
integrates all facets of HSI analysis, nor do they integrate into other aspects of system
architecting and engineering. Model-based system architecting and engineering can attempt
to integrate HSI into the larger role of system architecting and engineering and bring HSI up
front when architectural and design decisions are being made.
2.4 TRADITIONAL SYSTEM ENGINEERING PROCESS
There are multiple standards and processes and levels of compliance that help
address system development and human system integration (HSI); these standards and
processes are managed by varying bodies such as the International Electro-technical
Commission (IEC), American National Standards Institute (ANSI), and International
Standardization Organization (ISO). Similarly, the United States Government manages
standards for military system development. All the current standards are sets of
agreements between enterprise bodies, but there is no single universal agreement. For my
research, I have focused on the better-known standards as discussed below.
Figure 2-1. Evolution of Systems Engineering Process (International Council on Systems
Engineering, 2011; Systems Management College, 2001)
The systems engineering process has evolved over time. Systems engineering was
conceived in the defense industry as early defense systems advanced technology. Many
of the modern luxuries we use today had their inception in early technological advances
seen during World War II and beyond. From the arms race to the space race, these leaps
of technology required new methods and processes for developing more complex
systems. Although they might not have called it systems engineering, the principles
those engineers established and the heuristics they followed laid the foundation
upon which we continue to evolve technology. While they developed the basic principles of
systems science, systems theory, and systems engineering, the complexity of today's social
cyber-physical systems requires newer methods that can be integrated into those
foundational processes to bring up front the topics not emphasized in the past.
As discussed thus far, the human element needs to be considered and as seen in
the following processes, human considerations are usually integrated into the
implementation aspect of the systems engineering process and are not reflected in the
early definition stages. The Institute of Electrical and Electronics Engineers (2005) states
that “The human elements are not identified in the system hierarchy since the intent of
the hierarchy is to identify the system element for which the system is being defined, and
the human/system integration issues should be addressed in terms of the human’s role in
operating, producing, supporting, etc.”
Figure 2-2. ANSI/EIA 632 Processes for Engineering Systems (American National
Standards Institute & Electronic Industries Alliance, 1998)
Figure 2-3. IEEE 1220 Application and Management of the Systems Engineering Process
(Institute of Electrical and Electronics Engineers, 2005)
Figure 2-4. ISO/IEC 15288:2008 Systems and Software Engineering: System Life-cycle
Processes (International Standards Organization et al., 2008)
Figure 2-5. Systems Engineering Vee
2.5 THE HUMAN CENTERED SYSTEM ENGINEERING PROCESS
The traditional system engineering process takes requirements elicited from
stakeholders, as well as operational understanding of the environment and mission to
better understand the required behavior of the system. Systems engineers explore the
functions and how they are best allocated to logical groupings that enable low coupling
and high cohesion between the functions. The logical architecture is then mapped to a
physical architecture. Current architecture processes gloss over the human element by
merely identifying its interfaces, without going deeper into an understanding of the human
element itself.
In a human centered system architecting process, addressing the human element
means addressing many more considerations than just the identification of the human-
machine interface. During behavioral analysis, human functions are explored in light of
operations and missions. Success of the system equates to a balance of functionality
between the machine and the human. Functions get allocated to logical groupings and
human roles. The allocation of functions must account for human considerations, in particular
how the allocation relates to the cognitive, sensory, and physical traits of the human.
During behavioral analysis, human elements are allocated tasks (activities) for
which the element is responsible. In current systems engineering methods, architects tend
to identify very generic high-level functions for these elements, which tend to be set
aside until human system integration professionals become involved in the
implementation. This point in the architecture is crucial, as an architect weighs the level
of automation and data the system may need to have, which in turn, can affect the
effectiveness of the human element. With the ability to explicitly include the human
element as part of the architecture, architects can begin to balance human element tasking
and the necessary situational awareness that the human needs to have during operations.
The logical architecture is then expanded to a physical architecture. One particular
area of interest is the implementation of the interface and how it integrates with the
cognitive, sensory, and physical traits of the human. Human factors and ergonomics play
a big role in whether an interface will be a success because humans have limitations on
what can be processed and tolerated over time.
In order to support the process explained above, a human system integration
ontology can provide the necessary semantics needed to better integrate the human
considerations into the system architecting process. The ontology provides the necessary
semantics and rules that align human considerations with traditional system architecting
considerations within current model based systems engineering constructs. It provides an
avenue for system architecture to align itself with human system analyses that will
disposition system architecture gaps in performance due to the human element. With the
ontology, the integration of descriptive and analytical models becomes possible, as does a
framework and set of rules with which to analyze system architectures with human
considerations.
2.6 MODEL-BASED SYSTEMS ENGINEERING
The use of models is a standard practice in engineering. System architecting and
engineering employ models to represent complex systems. Whether descriptive,
analytical, or dynamic, models are used to simplify a problem, to ensure consistency
among various parts, and to communicate the solution trade space for system design
synthesis. The use of descriptive and analytical models allows engineers to ensure
consistency and increase communication effectiveness. Historically, model-based
engineering practices and methodologies have been used in the traditional engineering
disciplines: in mechanical engineering to study the effect of stress on structures, or in
electrical engineering to study the behavior of circuits.
MBSE integrates system methods and processes to create modeling artifacts, as
opposed to merely documents, as process outputs. MBSE stores all system information in
an easy-to-access form, in a central repository. The full context of the problem and
multiple levels of system architecture and design are captured in the repository. The
system model is captured using various viewpoints (lenses) that allow for multiple types
of analyses to be conducted. The integration of viewpoints and analyses results enables
the creation of a total system picture (Madni, 2015).
Even without using a standard tool or pen and paper, engineers tend to use
cognitive models to better understand the problem at hand. Cognitive models are
extremely powerful tools for understanding an approach to solving analytical problems
and building the knowledge base of the problem (Madni, 1988a, 1988c). These primary
cognitive models must be translated to more formal and standardized models as system
complexity increases to simplify and capture more information about the system. By
modeling a complex system, architects and engineers are able to better capture the four
facets of complexity discussed by Axelsson (2002): scale, diversity, connectivity, and
optimization. In particular the architect can track elegant system characteristics and
metrics (Madni, 2012) to better trade system alternatives (Bahill & Madni, 2017; Neches
& Madni, 2012).
MBSE elevates models in the engineering process to a central and governing role
in the specification, design, integration, validation, and operation of a system (Estefan,
2008). Since the 1950s different techniques have been used to model systems, from
functional block diagrams to integration definitions (IDEF) (Mayer, Crump, Fernandes,
Keen, & Painter, 1995; Mayer, Painter, & DeWitte, 1992) to system modeling language
(SysML) (Object Management Group, 2010). All the languages are built on semantics
and have a syntax that defines proper ways of combining the symbols to form thoughts
and concepts (Buede, 2009).
Since the release of the Object Management Group (OMG) SysML in 2007,
MBSE has been in the forefront of discussion in the aerospace and defense industry.
According to Memmel, Gundelsweiler, & Reiterer (2007), SysML is one of the most
powerful modeling languages for system engineering. Due to its popularity within the
system engineering community, SysML has become the de-facto modeling language for
developing descriptive models. There is growing awareness that the complexity of system
development can no longer be managed using traditional document-centric methods. System
artifacts can no longer stay static: changes need to propagate through the various documents
and need to be managed and communicated to all stakeholders.
In 2008, Estefan completed a survey on leading model based system engineering
(MBSE) methodologies used in the industry: Object-Oriented System Engineering
Method (Booch, 1986; Cloutier & Griego, 2008; Krikorian, 2003), IBM Rational
Telelogic Harmony – System Engineering, IBM Rational Unified Process for System
Engineering (Nolan, Brown, Balmelli, Bohn, & Ueli, 2008), Vitech MBSE, NASA Jet
Propulsion Lab State Analysis, Dori Object Process Methodology, System Modeling
Process, and Process Pipelines in Object Oriented Architectures. These methodologies
are expected to be adopted and tailored by the engineering community, but they are all
focused on upfront architecture; they neither integrate specialty engineering aspects nor
support integration of the descriptive models with analytical models.
Extending semantics and syntax to fit new domains has been the cornerstone of
the modeling community to increase capabilities. SysML itself is an extension to the
OMG’s Unified Modeling Language (UML) (Object Management Group, 2011). Before
SysML was developed, system architects and engineers attempted to use UML for system
definition (Hoffmann, 2005). Because UML was designed for software architecture, there
were many constructs that needed to be improved to satisfy system definition. SysML was
developed to provide system engineers with a useful set of modeling constructs to capture
system structure, behavior, requirements and parametric relationships for systems
comprised of both software and hardware elements. By selectively extending UML to
reflect system semantics, SysML made it easier for system architects and engineers to
model systems. SysML diagrams allow the system architect and engineer to capture and
allocate functions and tasks to the system elements. SysML is a good foundation for the
system engineering community to build upon and extend (Herzog & Pandikow, 2005). In
the same vein, today there are groups who have begun to look at creating semantics that
could potentially help extend UML/SysML to include constructs related to HSI
(Bruseberg, 2008).
SysML has enabled new ways to integrate descriptive and analytical models.
These approaches allow SysML diagrams to be used for design analysis by integrating a
variety of modeling and simulation tools. Examining all the different kinds
of models that are developed for a system, the need to integrate these models becomes
apparent as they each add different information that is useful for the system definition
(Neches & Madni, 2012).
Obtaining the various viewpoints begins with integrating requirements and models
(Dick & Chard, 2004). Requirements are the agreed-upon criteria against which the system is sold off. By
linking the requirements to the models, architects and engineers may be more assured that
the models are to the level of fidelity that has been agreed upon. As such, different layers
of models will need to be not only linked to requirements, but also to each other as well
as to all the engineering disciplines. SysML helps with the integration of descriptive models
and continuous system dynamics models by partnering SysML and UML with the
Modelica modeling language (Johnson, Paredis, & Burkhart, 2012; Johnson, Paredis,
Jobe, & Burkhart, 2007). This partnership allows the descriptive models to be combined
with analytical models and simulations for better understanding of system behavior and
metrics.
Vanderperren and Dehaene (2006) discuss approaches to couple UML/SysML
with Matlab and Simulink. Their approach allows them to synchronize SysML models
with Simulink models to better understand metrics and aspects of the system. Peak et al.
(2007) use the SysML parametric diagrams, in conjunction with Modelica modeling
language, and are able to model and analyze mechanical dynamics as well as other
system dynamics. In an effort to make model integration easier for engineers and architects,
tool vendors are creating the next-generation system architecting and engineering workspace
for complex systems (Bajaj et al., 2011).
This new workspace eases the architecture process by allowing models of different
disciplines and domains to integrate into one. The International Council on Systems
Engineering space systems working group has modeled and analyzed a cube satellite
using this new workspace (Spangelo, 2013; Spangelo et al., 2012).
“In every case, the essence of a model is the question or set of questions that the
model can reliably answer for us” (Buede, 2009). In Vision Systems 2020, Boehm et al.
(2010) and Booz Allen Hamilton (2010) have seen that MBSE is an enabler for creating
more affordable and resilient systems, but it needs to be extended to include more
engineering domains as well as be tailored for mission domains. When it comes to using
MBSE for HSI analysis, MBSE must be extended to support answering questions for
integrating a human into the overall system and optimizing system performance. Just as
extending SysML has helped expand the use of modeling to the system architecting and
engineering field, extending SysML for the specialty systems engineering field,
specifically for HSI, will further allow for increased communication leading to system
performance enhancement.
2.7 HUMAN CENTERED MODEL-BASED SYSTEM ENGINEERING
The objective of human centered model-based system engineering is to consider
human actions in multiple viewpoints. This multifold consideration of human actions
should increase functional effectiveness and it should allow us to apply information about
human characteristics and behavior in a more systematic way (Balakrishnan, 2002).
Landsburg et al. (2008) presents various cases in which different aspects of HSI have
been applied to different programs; however, it can be seen that there is not one holistic
approach to consider HSI. Modeling and simulation can provide an excellent workspace
to achieve the trade-offs necessary across the HSI and system domains (Narkevicius,
2008).
Figure 2-6. System Model (Wickens & Hollands, 1999)
To understand HSI, there has to be a holistic view of the system. At the highest
level of abstraction, both the machine and the human subsystems will take in inputs and
will be affected by stressors and processes. This combination of inputs leads to the
system output, whether it be positive or negative responses to the environment, as seen in
Figure 2-7.
Figure 2-7. Human as a Subsystem (Chapanis, 1996)
Within the system, the balance between the human and machine must be
considered. In the past humans were usually modeled as external entities (Arnold et al.,
2002), but as seen in Figure 2-6 and Figure 2-7, in accordance with ISO 15288, humans
are now treated as agents and must be considered like any other subsystem, with the
interactions between the sub-elements analyzed as a whole. As seen in Figure 2-7, the human
agent senses the outputs of the machine and the human response is the input to the
machine. Both the machine and the human have a set of required capabilities and functionality
to meet the system vision and objectives.
In order to better understand the required functionality, systems concept mapping tools
(Novak & Cañas, 2006) can be used to explore it. Once functions and tasks
are developed, there is a need to allocate those functions and tasks to the humans and
machines. Currently, modeling approaches for human factors analysis and human-machine
interface assessment revolve around human performance modeling using task
networks.
Micro Saint Sharp, a task analysis modeling tool, was developed for psychologists
and human factors engineers to analyze human and system activities. The developed task
networks include an array of attributes such as timing, probabilities, conditions, etc.
(Bloechle & Schunk, 2003; Hood, Laughery, & Dahl, 1993). Micro Saint has been
opened to other tools using the Department of Defense (DoD) High Level Architecture
(HLA) for inter-model communications (Bloechle & Laughery Jr., 1999), allowing
multiple tools and teams to analyze process and human workload with a larger tool set.
To ease such integration, common semantics would allow information from various
tools to be understood in a common way and would simplify data exchange between
models.
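To make the task-network idea concrete, the following Python sketch simulates a small network in the spirit of the tools described above; it does not use the Micro Saint or IMPRINT APIs, and the tasks, mean durations, and branch probabilities are invented for illustration.

# Minimal sketch of a stochastic task-network simulation with timing and
# branch-probability attributes; all values are invented for illustration.
import random

# Each task: mean duration (s) and next-task branches with probabilities.
TASKS = {
    "detect":   {"mean_s": 5.0, "next": [("classify", 1.0)]},
    "classify": {"mean_s": 8.0, "next": [("report", 0.8), ("detect", 0.2)]},  # 20% re-look
    "report":   {"mean_s": 4.0, "next": []},
}

def run_mission(start: str = "detect") -> float:
    """One stochastic pass through the task network; returns total time."""
    total, task = 0.0, start
    while task:
        total += random.expovariate(1.0 / TASKS[task]["mean_s"])
        branches = TASKS[task]["next"]
        task = None
        if branches:
            r, cum = random.random(), 0.0
            for nxt, p in branches:
                cum += p
                if r <= cum:
                    task = nxt
                    break
    return total

times = [run_mission() for _ in range(10_000)]
print(f"mean mission time: {sum(times) / len(times):.1f} s")

Repeated runs of such a network give the distributions of completion time and workload that the tools above report with far richer attribute sets.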
Over the years, the Army Research Laboratory (ARL) has used the building
blocks of Micro Saint and developed the Improved Performance Research Integration
Tool (IMPRINT). IMPRINT is a modeling tool for human-system task networks with
specialized analytic capabilities to analyze human versus system function allocation,
mission effectiveness modeling, maintenance manpower determination, mental workload
estimation, prediction of human performance under extreme conditions, and assessment
of performance as a function of varying personnel skills and abilities (Allender, 2000).
IMPRINT has been used to develop models to describe crew workload levels in the
Shadow unmanned aerial vehicle (Hunn & Heuckeroth, 2006) and the US Army Tank-
Automotive and Armaments Command (TACOM) Vetronics Technologies Test Bed
(VTT) soldier-machine interface (SMI) simulator (Mitchell, 2003). Both these models
studied workload allocation to different crew roles to better analyze their mental loads
during the mission. During the analysis, areas of high mental load could call for
investigating whether or not system automation is a viable option for the task or if there
are possible system characteristics that could alleviate some of the human mental load.
Madni (1988a & 1988b) and Allender (2000) have described various tools to structure
and analyze the impact of different functional allocation schemes on cognitive workload.
ACT-R helps with the study of the cognitive capabilities of the human agent.
ACT-R is a multi-agent simulation environment that supports developing cognitive
models in an attempt to predict and better understand human behavior (Taatgen, Lebiere,
& Anderson, 2006). ACT-R analyzes the cognitive capabilities and limitations by
assessing the interaction of declarative and procedural memory for decision making using
symbols and rules that affect chunks of memory (Lebiere & Anderson, 1993).
Cognitecture™ is a next generation unified agent architecture (NGUAA) developed by
Intelligent Systems Technology, Inc. under DARPA-sponsored research. This approach
overcame the limitations of existing agent-based architectures by incorporating cognitive
and behavioral considerations to ensure that human behavior did not come across as too
robotic. This agent architecture, which was successfully employed in human behavioral
simulation models for distributed training simulations, became a DARPA success story
(Madni, Moini, & Madni, 2006).
Man-Machine Integration Design and Analysis System (MIDAS) identifies and
models human and automation interactions by representing human-machine functions,
designed for crew stations (Corker & Smith, 1993). MIDAS attempts to take multiple
views considering physical representation, perception and attention, human cognitive
load, physical world, relationships, rules, and decision activities to build an integrated
model. MIDAS includes many agents, each responsible for one of these aspects. The Jack
tool models human size and joint articulation in a virtual mannequin to simulate physical
activities and human behavior (Badler, Phillips, & Webber, 1993). MIDAS attempts to bring in
multiple HSI facets together, but like the task analysis tools it does not integrate well with
other specialty engineering disciplines and still leaves HSI until after the system architecture
has been built.
Another key area that has been investigated is human performance enhancement
through aiding and training (Madni, 2011b). With respect to HSI, models and
simulations allow humans to familiarize themselves with system controls and system
behaviors by interacting with the system without potentially causing harm to the system.
The models and simulations also offer the opportunity to train on low probability events
that can potentially cause off nominal operation that humans do not expect to experience.
In these situations, training can be provided to speed up the response time, were such an
event to occur. Advisor 3.5 allows architects to examine which delivery method for these
tasks would be best suited for learning purposes, as well as potential uses in design
(BNH Expert Software Inc, 2000).
Like the other current HSI modeling tools, these tools are independent of the
architecture process and of the decision-making in the conceptual design of the system.
Therefore, engineers using these types of tools and methodologies usually tend to insert
themselves into the process after the architecture is set and any major design decision
may have huge monetary implications. These tools are usually only used for design after
the fact. In order to truly integrate HSI into the architecture process and the rest of the
engineering process, semantics must be developed to ease the integration of engineering
methodologies, processes, and tools, as well as to open up communication between
engineering disciplines.
Building common semantics for HSI has not had much traction within the HSI
community or the systems engineering community. There have been two attempts at
building partial constructs to be used in the architecting and engineering of systems, but
neither has come to fruition or been fully implemented or used.
In an attempt to fill the gap in architecting humans into systems, the IDEF
community developed IDEF 8, the Human-System Interaction Design Method.
IDEF 8 was created to look at three different levels of human system interaction: 1)
overall system operation, 2) role-centered scenario of system use, and 3) design
objectives implemented through a library of metaphors used as best practices for detailed
design (Mayer et al., 1995). IDEF 8 employed interaction diagrams (activity based
diagrams) to facilitate allocation of functions between a user and a system. The functions
described had to deal with actions detailing interactions with physical controls and
displays. Then, with the use of the library of metaphors, the designers are able to use the
metaphors to design the controls and displays of the system. Although IDEF 8 was a
good attempt to bring human system interaction up front in the lifecycle, it never took off
and it had limited coverage of HSI aspects.
Recognizing the need for human viewpoints of the system, the North Atlantic
Treaty Organization (NATO) undertook an effort to examine ways to better evaluate
human system compatibility, and created the NATO RTO HFM 155 Human View
Workshop. The Human Views were intended to expand the NATO Architectural
Framework (NAF) by documenting the unique implications of humans for system design
(Handley & Smillie, 2008). With the emergence of SysML, the United States
Department of Defense (DoD) and the United Kingdom Ministry of Defence (MoD) with
industry partners developed a profile extending both UML and SysML to depict the DoD
Architecture Framework (DoDAF) and the MoD Architecture Framework (MoDAF)
constructs. The Unified Profile for DODAF and MODAF (UPDM) emerged as a way to
use the UML/SysML tools to capture system architecture using these frameworks and
their respective meta models (Hause, 2010).
With the introduction of the human viewpoint in NAF and the potential
introduction into DoDAF and MoDAF, these constructs will be required during
acquisition of defense systems for the respective organizations. Systems currently
being developed using these frameworks have slowly been transitioning to model-based
architecture tools to capture and manage complex architectures. The tools and the
modeling languages they support will need to be extended to include the human
constructs in accord with the suggested human view meta-models. The change of
paradigm to bring human factors up front, instead of having them remain an afterthought,
will allow architectures to be more aligned with human requirements and limitations, as
well as with the changing role of humans as agents rather than simply system users or
operators.
In an attempt to better understand the role of the human agent, the NATO
proposed Human Views look at various aspects of humans that may affect system
performance. The NATO Human Views are HV-A Concept, HV-B Constraints, HV-C
Functions, HV-D Roles, HV-E Human Network, HV-F Training, HV-G Metrics, and
HV-H Human Dynamics (Handley & Smillie, 2008). In particular, the following subset of
views is intended to allow architects to better understand the human-machine
interactions, in accordance with the NATO definitions (Handley & Smillie, 2008):
HV-A View - similar to an OV-1 in DoDAF, the view is a high level pictorial and
textual description focused on where the human system interactions occur. This high
level understanding is necessary to model the context in which the human machine
interaction will occur; conditions and assumptions captured will communicate how the
interfaces will evolve throughout the system life cycle.
HV-B Constraints View - these views express human limitations and capabilities,
and the expected roles and functions of the overall system. Predicting manpower and
human characteristics specifies the types of roles needed and the number of humans
needed to fill each role. The view also depicts the evolution of the system, showing the
dynamics of how these roles affect the system as human agents are added or removed.
HV-C Functions View – these views are intended to define the partitioning and
allocation of functions to the human versus the system. They allow functions to be further
decomposed into workable tasks that a human would accomplish in a process or work
instruction. By decomposing the functions, performance criteria can then be allocated to
each task in accordance with the role that will complete the task. By defining the
function, there will be traceability to the justification of why a function is allocated to a
human instead of a machine.
HV-D Roles View – this view allows the characterization of roles by defining
competencies, responsibilities, and authority. Each characterized role bounds and
constrains what the human can and cannot do to the system.
HV-E Human Network View – this view looks at capturing human-to-human
interactions. As agents of a complex system, there may be a need to capture these
interactions between human agents, as they might affect how the system will perform.
HV-G Metrics View – this view looks at defining performance metrics for
assessing human agents in accordance with their role and required competencies.
HV-H Human Dynamics View – this view takes the previous views and adds a
time perspective to show the evolution of the defined parameters and how they may change
during operations and as the system evolves.
Handley and Smillie (2009) present a preliminary mapping to a subset of human
views. This mapping can relate to artifacts in IMPRINT. However, the human views are
only focused on acquisition aspects of HSI rather than system architecting and system
development. The human views present an opportunity for the architecture process by
adding new views as constructs to architecture modeling languages. The modeling
languages not only allow architects to document and communicate the human viewpoint,
but also provide traceability and integration of these constructs with system architecture
artifacts. This capability allows for a consistent language and set of constructs between
architects and human factors engineers to relate the human agents' interactions with the
machine elements of the complex system. It also provides a suitable mechanism for
decision making to ensure that this significant cost driver is addressed up front (Baker,
Stewart, Pogue, & Ramotar, 2008).
The human constructs not only provide the ability to integrate the human system
integration factors to the architecture process, but the same semantics will allow the
human system integration requirements to be tested, verified, and validated. The same
constructs used for architecting complex systems will be able to be used to design the test
systems and test cases. The common semantics will provide the traceability capabilities
already built in to system constructs to ensure that human agent activities are being tested
under varying scenarios that are based on the use case scenarios developed in the
architecture. The semantics will also allow attribute values, including test data and human
attributes, to be captured from testing activities. That data can then be used to improve the
workload analysis by adding real data to the task network analyses.
In order to align common HSI semantics for interdisciplinary use and full
lifecycle coverage, the common semantics should cover the varying system aspects
discussed in standards such as ISO/IEC 15288. Arnold et al. (2002) discuss a human
system model with four views that cover the ISO/IEC activities and perspectives, as
seen in Figure 2-8: human factors in the lifecycle, human factors integration, human-
centered design, and human resource processes. In the same vein, common semantics
should expand these four views for full coverage.
Figure 2-8. The activities and ISO/IEC perspectives in Arnold et al. (2002) Human
System Model
As conveyed in the following chapter, this research filled in several key gaps
presented in this literature review. Specifically, a common ontology for HSI was
developed to cover the system and other aspects of HSI. As a result of this ontology, the
constructs enabled optimization of human centered model based system architecting and
engineering of human-machine systems.
2.8 GAP IDENTIFICATION
This literature review presents ongoing research and current practices in human
tasking, human performance analysis, human modeling, system modeling, modeling
systems in accordance with architectural frameworks, and integrative system models. As
seen in Figure 2-9, none of the research has covered human element consideration in the
system engineering process using model-based systems engineering methods, processes,
and tools. As seen in this literature review, the human element is usually only considered
in the detailed design phases of the system lifecycle, after major architectural decisions
have already been made.
Creating system engineering semantics for human system interactions, interfaces,
and integration aids in the system architecting and engineering decision-making during
the system development. These semantics ease communications between system
engineers and human factors and human system integration engineers by opening up
overall system analysis with the human element and carrying those semantics through the
system architecture to detailed design and throughout the system development.
Figure 2-9. Gap Identification Matrix
Chapter 3: Research Methodology
The purpose of this research is to determine the extent to which Model-Based
Systems Engineering (MBSE) can be extended to integrate, modify, and create methods,
processes and tools to architect and engineer the human element into complex systems.
The purpose of this chapter is to 1) identify the questions that this research is trying to
answer and explain the hypothesis of this research, 2) describe the approach to answering
and proving the hypothesis, 3) describe the design of the procedures used in architecting
and engineering the human element, and 4) provide the verification methodology for
methods and processes used for analyzing the human element.
3.1 RESEARCH QUESTION AND HYPOTHESES
As seen in Section 2.6, Model-Based Systems Engineering, MBSE methods,
processes, and tools have been focused on the conceptual design and architecting of
complex systems through the Object Management Group System Modeling Language
(SysML) and the Unified Profile for DoDAF and MoDAF (UPDM). Little work has been
done to represent the human element in these methods, processes, and tools, in particular
to enable their full lifecycle coverage through an integrated model based environment.
3.1.1 Research Question and Hypothesis
This research attempts to answer the question: "How can human performance be increased
through Model-Based Systems Engineering extensions that support human centered
architecting and the evaluation of human performance in complex systems?"
The research hypothesis is that, by extending system semantics with human-system
factors and attributes, Model-Based Systems Engineering can be used to evaluate the
interactions, interfaces, and integration of the human element.
The second hypothesis is that by extending Model-Based Systems Engineering
methods, processes, and tools through key viewpoints and factors, the ability to
understand and analyze human performance can be increased.
3.2 RESEARCH APPROACH AND DESIGN
The research takes a three-phase approach: method and process development, tool
and model based environment development and implementation, and case study
illustration.
3.2.1 Method and Process Development
As seen in Chapter two, there are many methods and processes in the human
system integration (HSI) community and model based system architecting and
engineering community. These methods and processes will be taken into consideration to
develop an integrative methodology and process for evaluating a human agent and its
effect on total system performance.
Looking at the various tools and analyses that have been done relative to humans
as agents within complex systems, most tools have been user-centric or system-centric
(Price, 1994). We need to go beyond that. Advances in computing and cybernetics are
changing the focus of human-machine behavior. As such, performance analyses tools
need to address the cognitive aspect of behavior, since systems are doing more than
enhancing human physical traits. They are enhancing cognitive capabilities and mental
capacities.
As seen in Chapter two, there are several tools, methodologies, and frameworks
that have been developed by each discipline to answer and analyze their respective areas,
whether it is human factors or system architecting and engineering. The need to truly
view systems as a cognitive extension to humans requires integrating these models to
analyze interaction behavior and cognitive workload over time, as well as analyzing
alternative function allocation schemes to assess how system behavior and human
cognitive load change based on the allocation scheme. The allocation of functions to
balance human agent cognitive load and machine workload is critical because both
humans and machines each have different strengths and limitations (Madni, 1988b).
These strengths and limitations to some extent and in some cases can be captured in
SysML models for systems. By integrating these models, it becomes possible to explore
functions that either the human or the machine could perform, and then analyze the
effects each would have on the other.
With the rise of automation, there is a tendency to automate anything and
everything to the extent possible. In reality, there exists the need to further analyze
autonomy levels to ensure a proper balance of safety and performance optimization of
humans as agents. It is important to note that humans detect failures better when they are
active components of the overall system (Hollnagel & Woods, 1999). If too much is
automated, there is a risk of losing a skill or of experiencing skill degradation
(Parasuraman et al., 2000). A simple and pervasive example is to look at handwriting.
Since computers became a readily available consumer product, more and more schools
require the use of computers for typing essays and reports. Children are being exposed to
the keyboard earlier than they are to using a pen for writing. Script writing, once a
required skill, is now being replaced with typing. Another example of risking
degradation of skill is the automatic flight systems on airplanes. A US advisory board is
recommending pilots fly more manual hours because with current conditions, pilots are
increasing their proficiency in computer-aided flying, but degrading their manual flight
abilities (Todd & McConnell, 2011). Due to these risks, the proper level of system
automation needs to be re-examined.
Parasuraman et al. (2000) describes a framework for automation design that could
be used to integrate the structural, behavioral, requirement, and parametric models of
human agents and systems to seek a balance between automation and manual functions.
The framework presented is one way this research can approach extending model based
system architecting and engineering, to create new methods, processes and tools to
integrate humans into the system using a model based engineering approach.
Figure 3-1. Extending MBSE for HSI
In order to fill the current gap in modeling HSI throughout system architecting in
a multidisciplinary team, this research establishes and employs the methods, processes,
and tools for architects and engineers to evaluate HSI for effective decision-making
and system development. To enable the methods, processes, and tools, an HSI ontology
was developed. The HSI ontology is an innovative approach for enabling the extension of
MBSE and creating the blueprint to standardizing common HSI and system engineering
semantics. The HSI ontology describes an HSI meta-model that enables integrating current
SysML and UPDM model elements, attributes, and viewpoints with HSI analysis techniques,
as well as other system profiles, into a common modeling environment. The common
modeling environment is a key element for modeling HSI throughout the system lifecycle
with the support of a multidisciplinary team of engineers. The resulting meta-model has
evolved into a modeling language profile that can be further developed for greater
modeling fidelity and tailored for future system model development to include new
domains and any new viewpoints that may be needed in the future.
With the ontology established, an HSI profile allows the case study system to
be modeled with explicit HSI considerations. The case study system is discussed in
section 3.2.2. Current modeling research findings provide the building blocks for the
modeling practices, relationships, and metrics. These findings are leveraged both for
developing the ontology and the case study system model. Any gaps and limitations with
the building blocks have been filled or improved upon. The HSI ontology is a necessary
development because HSI is currently being ignored in system architecting. The ontology
provides a comprehensive assessment of HSI.
With the system model developed, the next step is to integrate the system model
and its HSI considerations into an overall model based environment. This integrates the
descriptive system models with the analytical system models used to optimize
system performance metrics. The research exploits existing research and explores
alternative options to provide the most precise results.
3.2.2 Tool and Model Based Environment Development and Implementation
In order to be able to use human centered model based architecting and
engineering methods and processes, a model-based environment was created to support
system modeling for HSI. Without this environment, the HSI ontology would not be able
to enable the study of human agents within systems.
Figure 3-2. Model Based Environment Implementation
As seen in Figure 3-2, the model-based environment consists of architectural
models, analytical models, and an HSI integration tool. The model-based environment is
implemented in a particular tool configuration, but the focus is on the underlying theory.
The theory is tool agnostic, so other tools with similar capabilities can implement the
environment to support the HSI ontology and modeling constructs.
3.2.2.1 Architectural Models
The standardized modeling languages are the basis for descriptive modeling, and
have been extended by the HSI ontology. Due to their large following within the systems
engineering community, the architectural model segment uses the de-facto system
modeling languages of SysML and UPDM. SysML provides semantics and modeling
constructs to describe a system through requirement, behavioral, structural, and
parametric diagrams. Through these diagrams an architect and engineer can describe
systems through varying viewpoints. UPDM provides semantics and modeling constructs
to describe systems and system of systems through standardized architectural
frameworks, in particular the Department of Defense Architectural Framework 2.0.
Between SysML and UPDM it provides the building blocks for the HSI Profile, which
improved and extended the SysML and UPDM semantics and modeling constructs to fit
HSI analysis.
In order to use SysML and UPDM, an architectural modeling tool was used to
capture all system-engineering artifacts in a common modeling database for the model-
based environment. For this research, No Magic’s Cameo EA v18.2 was used. Although
other architectural tools can also be used, this research study chose Cameo EA due to
its current successes in system modeling research, and in particular because it is a tool of
choice of the standards body OMG and of the International Council on Systems
Engineering. Cameo EA is an architecture and system modeling tool that can be used by
system engineers, business and system analysts, architects, software engineers, business
and IT consultants, and project leaders. It supports UML, SysML, and UPDM, as well as
other standard modeling languages.
Cameo EA allows for simple descriptive modeling simulations based on
architectural artifacts developed in UML and SysML. The Cameo EA simulation toolkit
executes activity models as well as state machines. In addition to behavioral models, the
simulation toolkit has the ability to use parametric diagrams to solve parametric
equations modeled within the tool. It also provides an extensive application programming
interface (API) for incorporating other expression solvers.
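As a plain-language analogue of what such a parametric evaluation does, the following Python sketch binds value properties to a constraint equation and evaluates it; this is not the Cameo EA API, and the property names and values are illustrative only.

# Plain-Python analogue of a SysML parametric evaluation: bind value
# properties to constraint parameters and evaluate the constraint equation.
# Not the Cameo EA API; names and values are illustrative.

value_properties = {
    "tasks_per_hour": 12.0,      # demanded image-analysis tasks
    "seconds_per_task": 240.0,   # analyst time needed per task
}

def utilization(tasks_per_hour: float, seconds_per_task: float) -> float:
    """Constraint: fraction of the hour the analyst is busy."""
    return tasks_per_hour * seconds_per_task / 3600.0

u = utilization(**value_properties)
print(f"analyst utilization: {u:.0%}")   # > 100% would indicate an infeasible allocation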
3.2.2.2 Analytical Models
In the development of systems a variety of models, simulations and analytical
data is captured to help optimize and analyze the system for performance. Typical tools
used in this arena range from Microsoft Excel to Matlab, or custom-made tools. In
chapter two, many HSI tools were presented that are focused on HSI analytics. In
particular this research uses the Army Research Laboratory Improved Performance
Research Integration Tool (IMPRINT).
IMPRINT is a modeling tool for human-system task networks with specialized
analytic capabilities to analyze human versus system function allocation, mission
effectiveness modeling, maintenance manpower determination, mental workload
estimation, prediction of human performance under extreme conditions, and assessment
of performance as a function of varying personnel skills and abilities (Allender, 2000).
3.2.2.3 HSI Integration Tool
The HSI Integration tool is the key to linking system descriptive models with
HSI analytical models. This tool was built from the ground up to serve as a gateway
between Cameo EA and IMPRINT. The integration tool analyzes the metamodels of both
tools and maps them to one another in order to convert data from one tool to the other. The
integration tool uses the defined HSI ontology as the driver for the mapping, so additional
tools can be integrated to fit within the framework this research built. The
conversion of data is solely focused on items of interest in accordance with the HSI
ontology that drives the necessary HSI analysis.
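The following Python sketch conveys the concept of an ontology-driven mapping; it is not the actual integration tool, and the mapping table, element kinds, and attribute names are illustrative rather than the real Cameo EA or IMPRINT metamodels.

# Conceptual sketch of an ontology-driven metamodel mapping (not the actual
# integration tool, and not the real Cameo EA or IMPRINT metamodels).

# Ontology-driven mapping: descriptive-model element kind -> analytical-model entity.
ONTOLOGY_MAP = {
    "HumanTask":  "imprint.Task",
    "HumanAgent": "imprint.Operator",
    "Workload":   "imprint.WorkloadChannel",
}

def export_elements(sysml_elements):
    """Convert only the element kinds the HSI ontology declares to be of interest."""
    exported = []
    for element in sysml_elements:
        target_kind = ONTOLOGY_MAP.get(element["kind"])
        if target_kind is None:
            continue  # not an HSI item of interest; skip
        exported.append({"kind": target_kind,
                         "name": element["name"],
                         "attributes": element.get("attributes", {})})
    return exported

model = [
    {"kind": "HumanTask", "name": "Classify image", "attributes": {"mean_s": 8.0}},
    {"kind": "Block", "name": "Mission Sensor Suite"},  # ignored by the mapping
]
print(export_elements(model))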
3.2.3 Case Study Illustration
In order to illustrate the human centered model based architecting and engineering
methods, processes, and tools, a model focused on the interaction between an Image
Analyst and an Unmanned Aerial Vehicle was developed.
The UAV system models and its interactions with an Image Analyst were
developed in accordance with publicly available literature from the Army Research
Laboratory (ARL) in Aberdeen, Maryland (Hunn & Heuckeroth, 2006; Hunn,
Schweitzer, Cahir, & Finch, 2008). This model was built from the viewpoint of
architecting the human agent into the system, focusing on the Image Analyst's interface
with the Image Analyst Control System to provide geospatial information to the
intelligence community and to cue UAV operators to needed mission requirements.
The Geospatial UAV system model is not a complete system model; it shows a
typical high-level systems engineering modeling package and then dives deep into one
scenario that demonstrates the use of the HSI Ontology. This scenario tracks the image
analyst's tasks as a UAV delivers full motion video and images from its electro-optical and
infrared mission sensor suite. The task described in this scenario is a replication of the
study done at ARL (Hunn et al., 2008). This study was chosen due to the appropriate
level of detail and fidelity needed to drive detailed design analysis in IMPRINT to
showcase that the ontology can ease the transition of human element design into the
overall system, as well as increase communication of system parameters.
The Geospatial UAV system was proposed as a case study system because 1) it
offers a viable candidate for conceptualizing human centered model based architecting and
engineering methods, processes, and tools, 2) it is a complex system, 3) current
UAV systems must consider a variety of HSI metrics to reduce current manning
requirements, and 4) in an effort to move to UAV swarms, the current UAV systems in
development must consider balancing automation with operator tasking. The HSI
ontology drove the metrics that were considered. These metrics are focused on the four
pillars of the descriptive modeling: requirements, behavior, structure, and parametric.
3.3 HSI ONTOLOGY AND HUMAN CENTERED MBSE METHODOLOGY, PROCESS AND
TOOL VALIDATION
The HSI ontology provides a set of terminology and domain specific semantics
that drive the overall human element consideration. Validation of the HSI ontology
consists of answering the following questions:
What are key HSI factors that will drive HSI analysis from conceptual
architecture through detailed design?
What are the HSI impacts to requirements, behavior, structure, parametric,
and system engineering process mechanisms?
What is the correlation between system attributes and HSI attributes?
What are the dependencies between HSI attributes and factors?
Using the HSI ontology to build a meta-model, and in turn the SysML profile,
enabled the use of the HSI ontology within the system model to consider the human
element. This integration of terminology and modeling constructs allowed the listed
questions to be answered. The responses to these questions resulted in the ability to
determine the human element's impact on the desired system, which drives the
performance analysis that was further validated against data from the operational systems.
The case study illustration serves as the validation medium of the HSI ontology
and the human centered MBSE methodology, process, and tools. In order to validate the
system model, data from current Unmanned Aerial Vehicle design was used; in
particular, the control system’s interface with image analyst. By using current UAV
design and operating procedures, I was able to use performance data of human analysis to
72
compare results of the scenarios and use cases modeled using the HSI ontology and
human centered MBSE methods, processes, and tools. Similar performance results
demonstrate that the ontology and human centered MBSE methods, processes, and tools add
benefit to the system architecting process and overall system engineering body of
knowledge.
Chapter 4: Human System Integration Ontology
From early hieroglyphs to the modern day alphabets, humans have communicated
with one another by using a combination of symbols. As groups gathered and began
using the same symbols, formal languages were developed within cultural boundaries.
Common to each group and language were the building blocks that allowed people to
express and communicate with each other. Today engineers have developed their own
vocabulary and symbols to communicate with one another. One of the methods for
capturing and standardizing communications is creating an ontology. An ontology is a
structure of concepts (entities) and logical relationships. It captures the vocabulary being
used by giving consistent definitions to entities (semantics), by expressing how the
relationships between entities are formed (syntax), and by describing the appropriate use of
the entities in different contexts (context sensitivity). Ontologies are a big part of establishing
knowledge systems and lend themselves to use in model based environments due to their
inherent focus on entities, which translate into individual pieces of data. Models must be
driven by strict axioms to be useful, and ontologies provide a method for those axioms to
be explored and defined, whether for mathematical concepts such as differential equations
(which are driven by known physical rule sets) or for modeling languages such as SysML
(which provides a very simplistic view of meta-object relationships within systems
engineering, e.g., a block is composed of parts).
In this chapter, new semantics are explored to better integrate humans into
systems and extend current system modeling semantics. The integration of the new
semantics allows for human elements to be analyzed in the holistic view of the system.
4.1 HSI ONTOLOGY: WHAT IS IT?
Figure 4-1. Ontology Domain Diagram.
An absolute and integral part of any design is proper communication among all
stakeholders in the development process (Balakrishnan, 2002). The HSI ontology
becomes an integral part of better communication among all engineers, in particular
system architects, systems engineers, and human specialty engineers. Figure 4-1
shows how the HSI ontology defines a meta-model and constrains an extension profile to
extend SysML and MBSE for HSI considerations.
Figure 4-2. HSI Top Level Ontology
There are several areas that affect HSI; the HSI ontology looks at these areas within the framework of the system modeling pillars and other considerations that give a more holistic system view from the perspective of the human. Collectively, these factors provide the semantic underpinnings for defining and managing the human element within the mission and system context. A unified view of these factors is presented in the top level HSI ontology (Figure 4-2). The HSI ontology offers a unifying means of capturing the concerns and expectations of the human element. Considering the various factors in these areas can increase communication between system architects, engineers, and human factors/human system specialists. Specifically, the HSI ontology provides the building blocks to bring human element considerations upfront rather than relegating them to the detailed design phase. The HSI ontology informs us that HSI is composed of mechanisms, requirements, human agents, behavior, structure, and parametrics. The HSI ontology can guide the HSI processes and facilitate communication among stakeholders by allowing a better representation of the man-machine system in a descriptive model that will drive analytical models.
The key concepts represented in Figure 4-2 are discussed in the next subsections.
As seen in Figure 4-2, elements in green represent the pillars of MBSE, elements in blue
represent SE semantics considered, and the elements in tan are the HSI concepts being
integrated together for in depth understanding of human consideration into the system
architecture.
4.1.1 Mechanisms
The mechanism portion of the HSI ontology is focused on the human system integration processes, procedures, tests, and verification required. This viewpoint of the ontology focuses, at a basic level, on assuring that appropriate precautions have been taken in integrating the human element into the overall system. The integration procedures attempt to circumvent adverse effects and failures between components (Madni & Sievers, 2013), with a specific focus on effects produced by the human agent as well as effects that impact the human agent.
4.1.2 Requirements
During requirements exploration, one of the key steps to understanding the human requirements is to interview the potential users and agents to better understand the roles, including the skill requirements for each role, which allows for better selection of personnel (Balakrishnan, 2002).
Requirements serve as the contractual guidance for the acceptance criteria of any
system under development. Requirements range from functional to performance and are
detailed at different levels of abstraction. At the top level, requirements specify intent rather than implementation. As requirements are refined and derived, they become more precise until they begin to specify the implemented configuration of the system. In order to integrate the human agent into the system, the human agent and HSI must be specified with greater attention at all levels of abstraction.
The ontology will attempt to explicitly highlight these human centered requirements in
the modeling environment as you would highlight any other system functional and
performance requirement. By explicitly highlighting these requirements, it becomes easier to trace them to the other aspects of the modeling pillars: behavior, structure, and parametrics. The written requirements should complement the system model, so as to overcome limitations on inferring what is not explicitly modeled in the system model
(Leveson, 2000).
4.1.3 Human Agent
While the human agent should be treated as any other element within the
system, the machine should complement the human agent and match human
characteristics to the agent functions and performance needs (Miller, Crowson Jr, &
Narkevicius, 2003). It is equally important to specify human agent characteristics as well
as other system agents. The human agent characteristics should include, but not be
limited to, physical traits, cognitive limitations, sensory performance, and social factors.
The human agent will extend the block and actor objects in SysML to better specify the human agent in the system under development. In this area, the human agent will play a certain role in the system operations and system capabilities. Along with
this role, a set of constraints will be specified to understand the limits on the strengths
and weaknesses of the human agent through specifying a skill set that is the minimum
requirement for the role to be played by the human agent. These two areas should help
the system architect better match the human agent to the role it is expected to play in
overall system performance.
4.1.4 Behavior
Modeling behavior using SysML and other object oriented modeling languages is
based on use cases and use case scenarios. Both concepts attempt to capture system usage
through high-level interactions of system stakeholders and actors with the system. These
use cases and use case scenarios will enhance written requirements by refining the
requirements to create a descriptive model. Not only does the requirement refinement
describe the interactions, but it also shows external visible exchanges, explores user
expectations, and defines intended purpose of system usage.
The use cases are further refined through activity diagrams and sequence
diagrams. Activity diagrams give the general flow of action (functions), while sequence
diagrams give a step-by-step description of operations and exchanges. In both of these
artifacts, it is important to explicitly detail which functions, operations, and
accompanying attributes can potentially enhance the analysis of the human element. In
particular, these functions, operations, and attributes need to allow for ease of transition
from conceptual architecture to detailed design. The parallelism between the aspects that
are analyzed in detailed design should be considered upfront to account for the human
element impact on the overall architecture, not just the performance of the system.
State machines are used to describe system/subsystem behavior in event driven
form. The events identified in this artifact can occur in one of the system states or can
drive a transition from one state to another. As in the case with systems, humans can be
described using state machines. These states can transition under certain events that affect
human cognitive state or performance. By creating specific state machines for the human
element, the architect can specify certain behavior, which the human element must
exhibit in response to certain events. This formalism could also drive the study of the
human element/role and limitations that may affect system performance due to state
changes in the human.
4.1.5 Structure
The structural diagrams describe the system structure through blocks and parts.
Within the framework, any system object can be defined using the block object. In a
similar manner, the HSI ontology will be able to extend the semantics used in the
structural diagrams to describe the human agent as well as human system interfaces.
These extended semantics will allow these two concepts to be considered upfront and
closely tied to the top-level requirements. The ontology will also extend the attributes and
parameters looked at in this context.
4.1.6 Parametrics
The parametrics diagrams are intended to support engineering analysis of critical system parameters (often measures of effectiveness and measures of performance). The evaluation of these metrics pertains to performance, physical characteristics, and "ilities." The parametrics pillar has not been used much until recently; the parametrics artifacts are now beginning to be used to trace and link mission and performance metrics of CubeSats to analytical models (Bajaj et al., 2011; Spangelo et al., 2012). Similarly, workload analysis, task analysis, and other HSI analyses could be traced and
linked to the human aspects of the system model. The ontology can be extended with the
necessary semantics and mechanisms needed to ensure that traceability and links to the
analytical models can be established and maintained in a model driven environment.
4.2 DETAILED HSI ONTOLOGY
4.2.1 HSI Ontology – Mechanisms Branch
Figure 4-3. HSI Ontology – Mechanisms
The mechanisms put into place will allow for the ontology to be used through
methodology and process.
4.2.1.1 Governance
The HSI ontology creates a framework for integrating system-engineering
concepts with the HSI concepts. This integration maps the relationships of these concepts
and is the basis for developing rules of modeling objects and how they can be used.
Although the ontology can be implemented in many ways, the goal is to use this ontology
to create a specialized HSI profile or schema that helps engineers model HSI concepts in
conjunction with the system architecture. By identifying the HSI concepts in the system
architecture, engineers are able to optimize the system with the human agent instead of
optimizing the system and then considering the human agent’s use of the system.
The profile and schema should give engineers a bridge for integrating system architecture parameters and content and for driving the development of HSI models. The ontology has been developed to allow a mapping of tool meta-models into the ontology to develop point-to-point solutions or other tool integrations. The ontology provides a standard set of required concepts that the system architecture must capture, driving the definition of the system architecture.
4.2.1.2 Human Centered Architecture Procedure
Figure 4-4. System Architecture Process
As seen in Figure 4-4, the typical system architecture process involves defining
the operational, functional, and physical architectures. Each architecture phase gives a different viewpoint of the system and reveals a different level of fidelity and
complexity of the system. The HSI ontology allows an architect to follow this well-
known process and still consider the human agent as an element of the system.
During the definition of the operational architecture, operational activities are
defined. This is where consideration of the human agent begins. As you will see in
section 4.2.4, the operational activity can be composed of human functions and is the
beginning of understanding the interaction of the human agent within an operational
construct. As the process continues into defining the functional architecture, we see
human tasks being traded between the human agent and the machine.
Both the operational architecture and functional architecture are more focused on
the interactions of the human agent with the machine rather than the interface. As such, the
ontology looks at various behavioral attributes a human agent may have as well as
defining possible roles a human agent can have within the different operational scenarios
a human agent could experience, as defined by the operational architecture.
In the definition of the physical architecture, we are more focused on the
interfaces between system agents, whether human or machine. The HSI ontology
accounts for this by adding HSI design constructs to help with the overall system
architecture. Mocking up graphical user interfaces and better understanding the role of that interface on the human agent supports better analysis for integrating the human agent into the system, whether the analysis looks at cognitive or physical features.
Throughout all of these architecture definitions, the requirements analysis process is also occurring concurrently. The ontology builds on this process by providing more explicit requirement call-outs that allow for better requirement visibility and tracking. As part of requirement tracking, requirements can more easily be traced to the architecture because these explicit HSI requirements are considered early, before the implementation and detailed design phases.
4.2.1.3 Human System Verification
By having the system architecture drive the content in the latter stages of the
engineering lifecycle, verification of human system interfaces and interactions can be
planned by using the same behavioral and structural content of the model to seed test cases and test scripts. This can integrate the use of model based test profiles (Object Management Group, 2013) for a higher level of fidelity and traceability between the
system data and products.
4.2.1.4 Limitations
The HSI ontology is focused on three types of HSI analyses that will set a
foundation for the integration of HSI concepts into the system architecture processes.
These analyses are discussed in section 4.2.6, HSI Ontology – Parametrics.
4.2.2 HSI Ontology – Requirement Branch
Figure 4-5. HSI Ontology – Requirements
4.2.2.1 Human System Interaction Requirement
Human System Interaction Requirements are functional requirements that
describe an interaction between a human agent and a machine. Requirements that use this
concept describe a human agent function or task.
4.2.2.2 Human System Interaction Performance Requirement
Human System Interaction Performance Requirements are performance requirements that describe the performance of an interaction between a human agent and a machine. The interaction performance requirement is composed, at a minimum, of definitions of interaction timing, accuracy, and success criteria. The timing requirement specifies the amount of time within which a human agent function or task must be completed. The accuracy requirement specifies the accuracy to which the human agent can complete a human function or task. The success criteria requirement specifies the percentage of time a human function or task is completed.
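One hypothetical way to capture such a performance requirement as structured data is sketched below; the field names and threshold values are illustrative assumptions rather than part of the profile described in Chapter 5.

```python
# Illustrative sketch only; field names and example values are assumptions.
from dataclasses import dataclass


@dataclass
class InteractionPerformanceRequirement:
    name: str
    max_time_s: float        # timing: time within which the function or task must be completed
    min_accuracy: float      # accuracy: required accuracy of task completion (0.0 to 1.0)
    min_success_rate: float  # success criteria: required fraction of completed attempts

    def satisfied_by(self, time_s: float, accuracy: float, success_rate: float) -> bool:
        """Check observed performance against the three criteria."""
        return (time_s <= self.max_time_s
                and accuracy >= self.min_accuracy
                and success_rate >= self.min_success_rate)


# Hypothetical example values for an image-tagging interaction.
req = InteractionPerformanceRequirement("Tag area of interest", 10.0, 0.95, 0.98)
print(req.satisfied_by(time_s=8.2, accuracy=0.97, success_rate=0.99))   # True
```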
4.2.2.3 Human System Interface Requirements
Human System Interface Requirements specify sensory and physical characteristics that can affect the effectiveness of the human-system interface. These requirements can apply to graphical user interfaces, virtual interfaces, and physical interfaces. Graphical user interfaces are usually designed using the organization's standards for colors and layout. The amount of data being presented can affect human reaction, as can the frequency of data updates. Requirements to reduce human agent fatigue and maintain human agent arousal are key to reducing the risk of error. By identifying human system interface requirements, engineers ensure that they are considered earlier in system trades.
4.2.2.4 Human Agent Definition
“You are only as strong as your weakest link.” When it comes to having the
human agent be part of the system, it is the one element that can be unpredictable and
may be considered the weakest link. In order to optimize the system, the human agent
needs to be defined through personal aptitude and skill level. Personal aptitude describes
innate or acquired abilities that correspond to system operations. Skill level defines the
level of expertise that the human agent may need in order to be able to complete allocated
functions and tasks within the appropriate accuracy and success completion rate.
Other aspects of the human definition that can limit the system effectiveness are
stressors, either through the system environment (temperature, humidity, etc.) or physical
effects (hours of sleep, hours on the job, etc.). By identifying stressors early on, you can make decisions on how to reduce them, either through functional allocation or environmental controls. Fatigue can also be attributed to a natural limit for work, and
knowing the overload threshold for the human agent can allow engineers to make
decisions on how many agents are needed or if there are other ways to reduce load.
4.2.2.5 Training Requirement
Training requirements specify the needed training a human agent would be
required to have in order to be part of the system. This may include the frequency of the
training and the standards for evaluating how effective the human agent is, as well as the
training itself. Training requirements detail knowledge areas and skills areas for a human
agent, as well as the training methods needed to reach the goal of preparing the human
agent to those specifications.
4.2.3 HSI Ontology – Human Agent Branch
Figure 4-6. HSI Ontology – Human Agent
4.2.3.1 Human Agent
The human agent concept represents the human element of the system. As part of
the human element definition, a minimum set of attributes defines the limitations of the human agent: skill level, specialty, training frequency, and interface usage.
The skill level attribute is a measure of whether the agent is a novice or a journeyman. The skill level is related to the specialty the agent is assigned. The specialty attribute is the assigned mission role the agent should be serving. For each specialty, the human agent should keep a training frequency that contributes to the skill level the human agent has.
Because the human agent must interface with a machine in some manner, it is necessary to know how often the human agent uses the interface in order for the human to be more efficient. The amount of interface usage also contributes to the skill level the human agent will have.
The human agent has the following relationships to other concepts to be discussed
below:
The human agent has a Role
The human agent has Resources
The human agent has a Workload Management Strategy
The human agent experiences Stressors
The human agent has Competencies
The human agent experiences Workload
The human agent is characterized by Anthropometric Features
The human agent acts in an Operational Activity
The human agent completes Human Agent Task
The human agent uses Human System Interface
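A minimal sketch of how the attributes and relationships listed above could be captured for downstream analysis is shown below; the enumerations, field names, and example values are assumptions for illustration and do not reproduce the profile implementation described in Chapter 5.

```python
# Illustrative sketch only; names and values are assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Resource(Enum):
    VISUAL = "visual"
    AUDITORY = "auditory"
    COGNITIVE = "cognitive"
    SPEECH = "speech"
    MOTOR = "motor"


@dataclass
class Role:
    specialty: str
    required_skill_level: str            # e.g., "novice" or "journeyman"


@dataclass
class HumanAgent:
    # Minimum attribute set from Section 4.2.3.1
    skill_level: str                     # novice through journeyman
    specialty: str                       # assigned mission role specialty
    training_frequency_per_year: int
    interface_usage_hours_per_week: float
    # A subset of the relationships listed above
    roles: List[Role] = field(default_factory=list)
    resources: List[Resource] = field(default_factory=list)
    stressors: List[str] = field(default_factory=list)

    def fits(self, role: Role) -> bool:
        """Rough illustrative check that the agent's specialty matches a role."""
        return self.specialty == role.specialty


analyst = HumanAgent("journeyman", "Image Analyst", 4, 30.0,
                     resources=[Resource.VISUAL, Resource.COGNITIVE])
print(analyst.fits(Role("Image Analyst", "journeyman")))   # True
```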
4.2.3.2 Role
A Role is a type of agent that the system needs for its operation. The difference between the role and the human agent is that a human agent could serve many roles, although, in an ideal world, to reduce the amount of context switching, one role will be played by one human agent with the same specialty for better efficiency. A Role is defined similarly to a Human Agent, with specialty, skill level, training frequency, and interface usage attributes.
Some highly used roles in HSI analyses are the Operator, Maintainer, and Supply
and Support Personnel. The Operator role uses a system and typically is defined by the
mission owning organization. The Maintainer role maintains a system. The Supply and
Support Personnel are typically related to defense systems and are in charge of ensuring
systems are supplied with the necessary equipment to complete the mission.
4.2.3.3 Competencies
The human element has competencies that define the human element's knowledge, skills, and abilities. The set of competencies that a human agent has gives insight into whether or not the person can fit a given role.
4.2.3.4 Anthropometric Features
The human agent is characterized by Anthropometric Features. Anthropometric
Features are the measurements of the human agent’s body: height, weight, reach, etc.
4.2.3.5 Resources
The human agent has resources that it uses to complete any task. The human
agent resources correspond to human abilities, whether visual, speech, cognitive, auditory, or motor. The resources are consumed during Human Agent Task execution (this concept relationship can be seen in Figure 4-6).
4.2.3.6 Stressors
A human agent experiences stressors that contribute to the human agent
performance and correlate to the success of the overall system. Environmental conditions
such as temperature, vibrations, noise, wind, and humidity contribute towards stressors
experienced by the human agent.
4.2.3.7 Workload
The human agent experiences workload, which can be decomposed into four
categories of workload: perceptual, cognitive, motor, and communication.
4.2.3.8 Workload Management Strategy
Human Agents have a workload management strategy to help with how to deal
with overloaded situations. The workload management strategies available are no
strategy, shed task, sequential, interrupt task, reassign task, and interleave tasks.
4.2.4 HSI Ontology – Behavior Branch
Figure 4-7. HSI Ontology –Behavior
4.2.4.1 Storyboards
Storyboards are an ideal platform for capturing human agent stories. They
graphically capture what the human agent must do in order to interact with the machine
portion of the system.
4.2.4.2 Human Task Network
A Human Task Network captures the decomposition of human agent functions into lower level human agent functions and human agent tasks. The functions/tasks are
structured to show the flow between the lower level functions and tasks. Human task
networks can define aspects of the storyboard.
4.2.4.3 Human Agent Function
A Human Agent Function relates to an operational activity as part of an
operational activity model. The human agent function describes the human agent
behavior in the context of the operational activity. In most cases, human agent functions can be decomposed into human agent tasks and lower level human agent functions. The flows of these decomposed functions/tasks are captured in a human task network.
4.2.4.4 Human Agent Task
A human task is the lowest level of behavior a human agent completes. A human
task consumes resources, as defined in Section 4.2.3.5, and is used to complete an
interaction with a human system interface. The human agent task contributes to the
workload a human agent experiences, as well as contributes to the stressors the human
agent experiences.
4.2.4.5 Component Maintenance Task
A Component Maintenance Task is a type of Human Agent Task that is
specifically focused on maintaining a system. Some common maintenance tasks include
adjust and repair, inspection, remove and replace, test and check, and troubleshoot. These
maintenance tasks have attributes that describe the maintenance type, support level,
maintainer specialty, maintenance skill level, number of maintainers, and mean time to
repair.
The Component Maintenance Task can be a preventative, scheduled, or corrective maintenance type. A Component Maintenance Task support level corresponds to the location where maintenance can be done. Organizational support maintenance is done at the unit level, on the system itself. Direct support is maintenance that needs to be done at a local facility; maintenance can be done from the unit level all the way to the system level at these direct support facilities. General support maintenance is done remotely and can only be done at the unit level. Contact team support brings maintenance to the system. Each component maintenance task is done by a human agent specialty with a specific skill level. In some cases, more than one maintainer is needed to complete a component maintenance task. Like human agent tasks, component maintenance tasks have a time to complete, which in this case is captured as the mean time to repair (MTTR).
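The sketch below shows one hypothetical way to record these maintenance attributes; the enumeration values follow the text above, while the field names and example figures are illustrative assumptions.

```python
# Illustrative sketch only; field names and example values are assumptions.
from dataclasses import dataclass
from enum import Enum


class MaintenanceType(Enum):
    PREVENTATIVE = "preventative"
    SCHEDULED = "scheduled"
    CORRECTIVE = "corrective"


class SupportLevel(Enum):
    ORGANIZATIONAL = "organizational"   # done at the unit level, on the system itself
    DIRECT = "direct"                   # done at a local support facility
    GENERAL = "general"                 # done remotely
    CONTACT_TEAM = "contact team"       # maintenance brought to the system


@dataclass
class ComponentMaintenanceTask:
    name: str
    maintenance_type: MaintenanceType
    support_level: SupportLevel
    maintainer_specialty: str
    maintainer_skill_level: str
    number_of_maintainers: int
    mttr_hours: float                   # mean time to repair


# Hypothetical example: a corrective task on a sensor component.
task = ComponentMaintenanceTask("Remove and replace sensor", MaintenanceType.CORRECTIVE,
                                SupportLevel.DIRECT, "avionics maintainer", "journeyman", 2, 3.5)
print(f"{task.name}: about {task.mttr_hours * task.number_of_maintainers} labor-hours")
```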
4.2.4.6 Human Agent Decision
In order to gain more insight into human decision-making, a modeler builds the
operational model or human task networks. For human interaction analysis, it is
necessary to understand how often a human decision is being made and the type of
memory it needs (working, short term, and long term memory). The type of memory
being used may affect timing constraints, as well as allow adjustments to be made on
whether a human agent should be making the decision or if the decision should be
automated.
4.2.4.7 Human Interface Interaction Diagram
The human interface interaction diagram focuses on the human agent interaction
with the human system interface. Each operation described in this diagram identifies a human resource that is used in interacting with the interface for a given scenario.
4.2.4.8 Visual Operation
A visual operation is a visual function that must be done by a human agent to
interact with a human system interface. This can be defined in an interaction diagram
describing lower level interaction between the human agent and the human system
interface.
4.2.4.9 Motor Operation
A motor operation is a motor function that must be done by a human agent to
interact with a human system interface. Motor functions can be tactile, fine or gross
motor functions. This can be defined in an interaction diagram describing lower level
interaction between the human agent and the human system interface.
4.2.4.10 Auditory Operation
An auditory operation is an auditory function that must be done by a human agent
to interact with a human system interface. This can be defined in an interaction diagram
describing lower level interaction between the human agent and the human system
interface.
4.2.4.11 Speech Operation
A speech operation is a speech function that must be done by a human agent to
interact with another human agent. This can be defined in an interaction diagram
describing lower level interaction between human agents.
4.2.4.12 Cognitive Operation
A cognitive operation is a cognitive function that must be done by a human agent
to interact with a human system interface. This can be defined in an interaction diagram
describing lower level interaction between the human agent and the human system
interface.
4.2.4.13 Human Agent State Machine
The Human Agent State Machine attempts to capture the dynamic state of the
human agent in accordance with conditions of the system and the environment.
4.2.4.14 Stressors
In Section 4.2.3.6, stressors are defined as entities that are experienced by human
agents. In this part of the ontology, human agent tasks contribute to building stressors that
affect the human agent. Stressors can be categorized as physical or mental stressors, some of which are directly affected by the human agent task and others that are affected by human behavior outside of the system.
4.2.4.15 Workload
A Human Agent Task contributes to workload a human agent may experience.
Workload can be a mixture of cognitive, motor, communication, and perceptual
workload. The workload can directly be attributed to the visual, auditory, motor,
cognitive, and speech operations.
4.2.5 HSI Ontology – Structure Branch
Figure 4-8. HSI Ontology – Structure
4.2.5.1 Human System Interface
A Human System Interface is part of a subsystem, which is part of a system. This
interface is what incorporates the human agent into the system. Human system interfaces
can be physical and/or digital. Physical interfaces can include levers, buttons, etc. Digital
interfaces are usually through a display and require a graphical user interface. The driving
aspect of designing a human system interface is the agent's experience with the interface. If the interface is too cumbersome, the human agent will not be able to stay engaged for system success. System success is driven by how well these interfaces are designed.
4.2.5.2 Human System Interface Mock Up
In order to help with the success of graphical interfaces, the Human System
Interface Mock Up is a visual depiction of the graphical interfaces that the human agent
will be interfacing with. These mock-ups can be used in combination with storyboards to
better understand the human machine interactions through the human system interface,
decreasing uncertainty of interface use.
4.2.6 HSI Ontology – Parametrics Branch
Figure 4-9. HSI Ontology – Parametrics
4.2.6.1 Human System Interface Analysis
As seen in Sections 4.2.2 through 4.2.5, the ontology has many concepts related to analyzing the human system interface. With the use of parametrics, engineers can outline trade studies to improve the quality of the interface being developed. The analysis can be a combination of behavioral and structural attributes contributing to the study of the interface's effectiveness.
4.2.6.2 Workload Analysis
The majority of the ontology is focused on bringing definition to workload
analysis parameters upfront in the system architecture definition. The majority of the
concepts described in 4.2.3 and 4.2.4 feed workload analysis. The workload analysis can
be focused on operational analysis or on maintenance analysis. The results of these
analyses can alter the allocation of activities from the human agent to the machine, or
discover new manning requirements.
4.2.6.3 Cognitive Analysis
Similar to the workload analysis, the cognitive analysis focuses on the behavior of the human agent. Cognitive analysis is a more in-depth analysis of human system interactions. Engineers can identify areas of high cognition from interactions that require a high degree of skill, memory use, or perception, and from whether or not there is context switching.
4.3 HSI ONTOLOGY REASONING AND METRICS
Ontologies provide a great method for communities to agree on semantics and the relationships between them. The data model itself, however, does not provide enough rigor for modeling, and it is necessary to use reasoners to provide rule sets built on the ontologies. The HSI ontology provides reasoning through causality and allows rules of man and machine interactions to be set up for system performance. The following table identifies the ontology metrics that define the HSI reasoning when the ontology is used.
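As a simplified illustration of how such rule sets can be layered on top of the ontology's concepts, the sketch below checks two hypothetical rules against a toy model; both the rules and the model content are assumptions chosen for illustration and are not the reasoner actually used with the ontology.

```python
# Toy illustration of rules layered on ontology concepts; rules and model data
# are assumptions, not the actual HSI reasoner.
model = {
    "human_agents": [{"name": "Image Analyst", "roles": ["Image Analyst"]}],
    "human_agent_tasks": [
        {"name": "Conduct analysis of data frame by frame", "consumes": ["visual", "cognitive"]},
        {"name": "Acknowledge alert", "consumes": []},   # violates the second rule below
    ],
}

rules = [
    ("Every human agent has at least one role",
     lambda m: all(agent["roles"] for agent in m["human_agents"])),
    ("Every human agent task consumes at least one resource",
     lambda m: all(task["consumes"] for task in m["human_agent_tasks"])),
]

for description, check in rules:
    print(("PASS" if check(model) else "FAIL") + ": " + description)
```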
Table 4-1. HSI Ontology Metrics
Metrics
Axiom 559
Logical Axiom Count 365
Declaration Axioms Count 194
Class Count 99
Object Property Count 41
Data Property Count 41
DL Expressivity ALCRIF(D)
Class Axioms
SubClassOf 76
DisjointClasses 14
Object Property Axioms
SubObjectPropertyOf 29
DisjointObjectProperties 3
InverseFunctionalObjectProperty 14
AsymmetricObjectProperty 1
ObjectPropertyDomain 39
ObjectPropertyRange 39
Data Property Axioms
SubDataPropertyOf 39
EquivalentDataProperties 3
FunctionalDataProperty 1
DataPropertyDomain 43
DataPropertyRange 39
The HSI ontology metrics are focused on the HSI considerations and their relationship to the system considerations as described in the ontology. As seen in Table 4-1, there are 559 axioms described in the ontology. From that alone, the gap of semantic relationships that were not previously represented in the system architecting process becomes evident. These axioms lay the foundational constructs and rules with which human elements can be modeled.
Chapter 5: Implementing a Human Centered Model-Based System
Architecting and Engineering Environment
Methods and processes can only get you so far without supporting tools. As
system developments turn to modeling for efficiencies and simplifying complex
developments, the tools to support these types of methods and processes must also catch
up. With the process discussed in Chapter 2 and the HSI ontology discussed in Chapter 4,
the tools supporting both system modeling and HSI analysis need to be integrated in
order to take full advantage of architecting systems with human considerations.
Figure 5-1. Model Based Environment Implementation
The tools used in the human centered model-based system architecting and engineering environment are Cameo EA, an HSI integration tool, and IMPRINT. Cameo EA, a systems architecture tool, was used to develop the descriptive models using multiple language profiles: Unified Modeling Language (UML), System Modeling Language (SysML), and the Unified Profile for DODAF and MODAF (UPDM). Descriptive models are graphically based models describing real-world events and the relationships between the factors responsible for them. In order to bring the human considerations into
the descriptive modeling efforts, the HSI ontology was implemented within a Cameo EA
profile that could be used in conjunction with the main language profiles. The
implementation of the Human System Integration Profile is described in Section 5.1.
Once the HSI profile was implemented and the system model was developed
using the semantics and axioms of the ontology, the modeling efforts from traditional
systems engineering and human system integration efforts needed to be integrated. In
order to integrate two traditional tools, the HSI ontology was leveraged to do the
mapping between the tools, allowing for the right information captured in Cameo EA to
seed the human system integration analysis (an analytical model). Analytical models are
static or dynamic mathematical models expressing changes of a system.
5.1 THE HUMAN SYSTEM INTEGRATION PROFILE
The Human System Integration Profile is one implementation of the HSI Ontology, which is composed of mechanisms, requirements, human agents, behavior, structure, and parametrics. The Human System Integration Profile focuses on developing the modeling constructs for requirements, human agents, behavior, and structure and is governed by the mechanisms identified in the ontology. The following discusses in greater detail the
various constructs that were developed to extend Unified Modeling Language based
tools, in particular Cameo EA.
Figure 5-2. Human System Integration Profile Stereotypes
Figure 5-3. Human System Integration Profile Enumerated Data Types
Figure 5-2 shows the HSI Profile stereotypes. As shown in this figure, the profile
stereotypes include various considerations associated with a human agent that need to be
taken into account in the system architectural extensions. These considerations are the
key to HSI. Figure 5-3 shows any enumerated types associated with attributes allocated to
the stereotypes that represent the classes from the HSI ontology.
Table 5-1 presents the HSI concept extensions. The table specifically shows the
HSI concept, the UML primitives, and the SysML categories associated with the concept.
Table 5-1. Human System Integration Concepts Extensions
Concept | UML | SysML
Human System Interaction Requirement | Class | Requirement
Human System Interaction Performance Requirement | Class | Requirement
Human System Interface Requirement | Class | Requirement
Human Agent Definition | Class | Requirement
Training Requirement | Class | Requirement
Human Agent | Class, Actor | Block, Actor
Role | Class | Block
Human Task Network | Activity Diagram | Activity Diagram
Human Agent Function | Activity | Activity
Human Agent Task | Action | Action
Component Maintenance Task | Action | Action
Human Agent Decision | Decision Node | Decision Node
Human Interface Interaction Diagram | Sequence Diagram | Sequence Diagram
Visual Operation | Operation | Operation
Motor Operation | Operation | Operation
Auditory Operation | Operation | Operation
Speech Operation | Operation | Operation
Cognitive Operation | Operation | Operation
Human Agent State Machine | State Machine Diagram | State Machine Diagram
Human System Interface | Interface | Interface
Storyboards | Class Diagram | Graphically captures what the human agent must do in order to interact with the machine portion of the system
Human System Interface Mock-Up Diagram | Class Diagram | A visual depiction of the graphical interfaces that the human agent will be interfacing with
For usability purposes, the stereotypes used were customized in accordance with the axioms of the HSI ontology. Each customization in Figure 5-4 shows which stereotype the customization is for, who can own that object, whether there are any derivative diagrams associated with the object, and what types the object can own.
Once the ontology was implemented in a profile and packaged, the profile could be reused by other architects using the same tool set. Although Cameo EA was chosen for testing purposes, the same approach can be implemented in other tools as long as it follows the HSI ontology and its axioms.
Figure 5-4. Human System Integration Profile Stereotype Customizations
5.2 INTEGRATING DESCRIPTIVE MODELS WITH ANALYTICAL MODELS
To ensure data interoperability between Cameo EA and IMPRINT, the HSI ontology was leveraged through the Cameo HSI profile to map between the tools and translate the data for the parts of the model that were of interest. Using Cameo Workbench, the native data in Cameo EA could be translated and converted to an XML file, which could be packaged into a native IMPRINT file. In order to leverage this technology, a set of conversion scripts was created to read in the Cameo model and search it for human agents, human functions, human agent tasks, and the system of interest.
As seen in Table 5-2, these items were then converted into native items in IMPRINT.
Table 5-2. IMPRINT-Magic Draw Metamodel Mapping
IMPRINT Object Type | Magic Draw HSI Profile Stereotype
Warfighter | Human Agent
Mission | N/A
Function | Human Function
Tasks | Human Agent Task
System | System, System of Interest
Interface | Human System Interface
The human agents in Cameo EA were directly translated into warfighter objects in IMPRINT. If any of the human agent attributes were filled in, the conversion replaced those fields with the specified values instead of the default values.
For the behavioral portion of the model, the scripts searched the architecture for any activity using a human function or human agent task. If it was a top-level system function, the scripts translated that function over as a mission object in IMPRINT. Once the mission object was identified, the scripts built out the lower level structure of the mission into functions and tasks, pulling specified values into these objects' attributes where populated and otherwise using the default values.
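The actual conversion was implemented with Cameo Workbench scripts against the tools' native formats; the standalone fragment below only sketches the general pattern of walking an exported XML model, picking out elements that carry HSI profile stereotypes, and emitting the Table 5-2 mapping. All element and attribute names in the fragment are assumptions, not the real Cameo or IMPRINT schemas.

```python
# Illustrative only: the real conversion used Cameo Workbench against native tool
# formats. The element/attribute names below are assumptions, not actual schemas.
import xml.etree.ElementTree as ET

EXPORTED_MODEL = """
<model>
  <element stereotype="HumanAgent" name="Image Analyst"/>
  <element stereotype="HumanFunction" name="Exploit Full Motion Video Imagery"/>
  <element stereotype="HumanAgentTask" name="Conduct analysis of data frame by frame"/>
</model>
"""

# Mirrors the spirit of Table 5-2: HSI profile stereotype -> IMPRINT object type.
STEREOTYPE_TO_IMPRINT = {
    "HumanAgent": "Warfighter",
    "HumanFunction": "Function",
    "HumanAgentTask": "Task",
}

root = ET.fromstring(EXPORTED_MODEL)
for element in root.iter("element"):
    imprint_type = STEREOTYPE_TO_IMPRINT.get(element.get("stereotype"))
    if imprint_type:
        print(f"{imprint_type}: {element.get('name')}")
```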
The final step is the structural translation of the system and its composition, including subsystems, components, human system interfaces, and component maintenance tasks. Although the analysis was originally meant to focus more on operational analysis, the scripts were extended to include maintenance analysis as a way to show that this method can be extended. The extensions could be for other analyses, as well as to establish that a similar method can be used with other tools, if the right application programming interfaces are available to read and write the data in native tool formats.
Chapter 6: Human Centered Model Based Systems Engineering Case
Study
6.1 CASE STUDY SYSTEM OVERVIEW
The U.S. Department of Defense has been moving toward greater use of unmanned capabilities due to more hazardous missions, smaller numbers of personnel, and greater demands on resources. As such, it is slowly integrating unmanned platforms to operate cooperatively within standard forces to assist in mission objectives.
For this case study, a Geospatial mission was chosen to study the interactions
between the Unmanned Aerial Vehicle (UAV) and the Image Analyst. All the data used
for this case study was gathered from publicly available literature from the Army
Research Laboratory (ARL) in Aberdeen, Maryland (Hunn & Heuckeroth, 2006; Hunn et
al., 2008). The models built were focused through the lens of the human agent, in particular the Image Analyst and the analyst's interface with the UAV.
The case study partially replicates the Geospatial UAV study done at ARL (Hunn et al., 2008) due to data availability, but an appropriate level of detail and fidelity was extracted to drive the architecture and analysis from the system architecture to the
detailed human system design.
6.2 ARCHITECTURE AND ANALYSIS
6.2.1 The Concept of Operations
To replicate the systems engineering process as closely as possible, the Geospatial UAV system model started with developing the overarching concept of operations
(CONOPS) through adaptation of the UAV studies presented by ARL (Hunn &
Heuckeroth, 2006; Hunn, Schweitzer, Cahir, & Finch, 2008). Figure 6-1 shows that the
UAV and its supporting personnel receive the orders and plan the mission. Once the
mission is planned, the UAV goes through its launch procedures and eventually is
launched and travels to the area of interest. While in the area of interest, it Performs
Counter Improvised Explosive Device Operations. Once the mission is complete, the
UAV goes back to base and performs landing and post mission procedures.
Figure 6-1. Perform Geospatial Intelligence Concept of Operations
Figure 6-2. Perform Counter Improvised Explosive Device Operations
For the purposes of this case study, the Perform Counter Improvised Explosive
Device Operations operational activity was decomposed to better understand the
operational functions needed to complete the operations use case. This lower level of operational detail was needed as the system views were created and the model honed in on the system functions that support the task.
6.2.2 The Operational Concept and Systems Architecture
Once the CONOPS is well understood, usually an operational concept (OPSCON)
is created to better understand the system to system interactions, or in this case, the
system to human interaction. Figure 6-3 shows the high level OPSCON for Performing
Counter Improvised Explosive Device Operations.
Figure 6-3. Monitor Direct Imagery Feeds OPSCON
As Figure 6-3 describes, the mission scenario is focused on the Image Analyst and his or her interactions with the systems to coordinate and execute the mission. In this OPSCON, there are five agents: the UAV, UAV Operator, UAV Mission Control Station, Geospatial Intelligence (GEOINT) Analyst Control Station, and the Image Analyst. The scenario modeled starts with geospatial information from the UAV being sent down to the GEOINT control station, where the Image Analyst can exploit the video and images being captured. As the feed comes in, the analyst must process the feed for features in potential areas of interest. Once features are identified, intelligence is forwarded to other analysts for further processing and data fusion with other intelligence sources. If the analyst decides that he or she needs more information in a certain geographical location, he or she can cue the UAV operators with new mission requirements.
Typically, to ensure system functions cover the operational activities, a traceability table is created. For this case study, the human functions were explicitly examined and traced to the operational activities they supported.
Table 6-1. System Function to Operational Traceability Matrix
Human Functions
Identify
HVIs
Identify IED
Ambush
Sites
Identify IEDs
Production Sites
and Weapons
Cache
Locate
IEDs
Ambush
Sites
Locate IEDs
Production Sites
and Weapons
Cache
Track
HVIs
96D-1050 Plot Coordinates on a Map Image v3 X X X
96D-1050 Plot Coordinates on a Map Image v4 X X X
96D-1050 Plot Coordinates on a Map Image v5 X X X
96D-1050 Plot Coordinates on a Map Image v6 X X X
96D-1204 ID Roadways on Imagery v2 X X
96D-1215 ID Vehicles Types on Imagery v2 X
96D-1232 Exploit Full Motion Video Imagery v2 X X X X X X
96D-1237 ID Unconventional Act on Imagery v2 X X
96D-1301 Pick IMINT Sensors to Satisfy GEOINT
v2
X X X X X X
Conduct Aerial Route Reconnaissance v2 X X X X X X
Direct AUV Sensor Employment v2 X X X X X X
Direct AUV Sensor Employment v3 X X X X X X
ID on Video an Object or Event v2 X X X
Maintain Voice and Chat with AVO v2 X X X
Maintain Voice and Chat with AVO v3 X X X
Provide Chip, Image or Video Clip v2 X
Provide Chip, Image or Video Clip v3 X
Provide Chip, Image or Video Clip v4 X
Respond to Request for Imagery v2 X X X
As part of this OPSCON, early mockups of displays and critical information
related to the agents can be produced for review and further refinement and development
as seen in Figure 6-4.
Figure 6-4. GEOINT Control Station Display Mock Up
Figure 6-5 shows a typical UAV hierarchy of the types of subsystems and components a UAV has, which will dictate the allocation of the system functions to it, while the human functions are solely allocated to the human agent and its agent resources (visual, auditory, cognitive, and psychomotor).
Figure 6-5. GEOINT UAV System Hierarchy
As the architecture gets fleshed out, requirements both explicit and implicit can be tracked through the system model. Figure 6-6 shows how human agent requirements get traced to the system elements as the system is developed and better understood.
Figure 6-6. Human Requirements Traceability to System Model
Typically at this point in the systems engineering process, the system architects focus on decomposing the functions relative to the system of interest and rarely focus on human agent tasking decomposition. Due to the HSI extension, not only was a mock-up of a human system interface captured in the system model, but human functions were also decomposed using task networks in the same system model that decomposed the system functions down to further components. The following figures are examples of some of the human functions that were decomposed into task networks. This may or may not happen at this stage, depending on the level of information available to the architects and the understanding of the human system integration engineer. On some occasions, the high level human function will be identified and left for further definition in the human system analysis, and brought back up for integration with the rest of the system model when better understood.
Figure 6-7. Exploit Full Motion Video Imagery Task Network
Figure 6-8. Conduct Aerial Route Reconnaissance Task Network
6.2.3 The Human System Integration Analysis
Whether some or all of the human functions are defined in the system architecture, starting the human system integration analysis upfront in the architecting process not only strengthens the architecture but also allows the architecture to be adapted and the system to be optimized to the human agents.
For the case study, the human agent functions and tasks were identified in the system architecture, as defined in the figures above and as can be seen in Appendix A. Once the functions and tasks necessary to define the GEOINT scenario were completed, the system architecture model was converted into the inputs of an IMPRINT model to begin the human system integration analysis. The elements created in the IMPRINT model mirrored the human elements of the system architecture in Cameo EA, with all attributes defined according to the Human System Integration Profile. As seen in Figures 6-9 and 6-10 and in Appendix B, the converted human elements can be constructed with the same relationships defined in the system architecture.
Figure 6-9. Monitor Direct Imagery Feeds IMPRINT Mission
Figure 6-10. Exploit Full Motion Video Imagery IMPRINT Task Network
As discussed previously, the human functions and tasks were extracted from an
ARL study (Hunn et al., 2008). In this study, the tasking for the Imagery Analyst was
surveyed for task time and workload in accordance with visual (V), auditory (A), cognitive
(C), and psychomotor (P) resources. In the ARL study, they ran the model and analyzed
for overall workload (Ow). Ow in IMPRINT is defined as Ow = V + A + C + P. Overloads occur when any single resource goes above a seven or the Ow goes above 40. Figure 6-11 shows the value scales for each resource that is attached to each task described in the model.
Figure 6-11. IMPRINT VACP Scale Values (Hunn & Heuckeroth, 2006)
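The fragment below sketches that overload check for a single moment in time; the concurrent task names are taken from Table 6-2, but their VACP values are illustrative placeholders rather than the surveyed values used in the study, and applying the single-resource threshold to summed channel totals is an assumption.

```python
# Ow = V + A + C + P, summed here over the tasks active at one moment.
# Overload if Ow exceeds 40 or any single resource channel exceeds 7
# (applying the single-resource check to summed channels is an assumption).
# VACP values are illustrative placeholders, not the surveyed study values.
concurrent_tasks = {
    "Conduct analysis of data frame by frame": {"V": 7.0, "A": 0.0, "C": 6.8, "P": 2.2},
    "Provide direction/guidance to UAV sensor operator": {"V": 3.0, "A": 4.3, "C": 5.3, "P": 2.6},
    "Verify the grid zone designator on the map": {"V": 5.9, "A": 0.0, "C": 1.2, "P": 2.2},
}

resource_totals = {r: sum(task[r] for task in concurrent_tasks.values()) for r in "VACP"}
ow = sum(resource_totals.values())
overloaded = ow > 40 or any(total > 7 for total in resource_totals.values())
print(resource_totals)
print(f"Ow = {ow:.1f}, overloaded = {overloaded}")
```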
In order to replicate the ARL study, the case study ensured that the workload analysis produced results similar to those of the original study. Figure 6-12 shows the original workload analysis results.
Consistent with the pattern of the original results, the replicated study showed a similar shape and similar workload spikes, as seen in Figure 6-13.
Figure 6-13. Replicated Image Analyst Workload Analysis Results
Unlike the previous study, in which these results were kept separate from the system architecture, the replicated study allowed for deeper analysis and highlights which human functions need attention and how they integrate with system functions. As described by Hunn et al. (2008), the analysis was done by having the Image Analyst conduct all functions manually and without the use of any assistive aids or automation. Unfortunately, Hunn et al.'s (2008) study was not tightly integrated with the system architecture and was done separately from the architecting effort. Traditionally, this has been common, as the architecture has lacked integration with the rest of the detailed design and analysis. With the human centered model based systems engineering approach developed in this research, these results can be tied back to the architecture through a very tightly coupled relationship between the human system integration analysis and the overall system architecture.
6.3 HUMAN CENTERED MODEL BASED SYSTEMS ENGINEERING RESULTS
The three areas of workload overload identified in Figure 6-13 were analyzed to see what aspects of the human tasking contributed to the overload. Then, each human function was analyzed for how it integrated with the system functions. As the functional picture emerged, system functions that could help the Image Analyst reduce his or her workload were explored. The workload analysis was able to identify those areas and trace them to the system architecture due to the tight integration that the Human System Integration Profile provided.
With the workload results in hand, it was obvious that the human functions that could be improved were the identification functions. By allocating automated identification functions to the GEOINT Analyst Control System to assist the human agent, the human agent's VACP resource workload would be reduced, in accordance with the scale values used to measure those resources in IMPRINT.
As seen in Figure 6-14, the light yellow functions in the GEOINT Analyst Control System are the system functions that have been added to assist the human agent and reduce workload. As part of these added features, instead of presenting the raw feed, the system functions provide cues that allow the human agent to filter out areas that are not of concern. With these new semantics added into the system architecture, knowing exactly what data the human agent needs for situational awareness increases the compatibility of the human-system interfaces.
Figure 6-14. Monitor Direct Imagery Feeds with Automated Assist Functions
Once the changes to the functional architecture had been made, the new VACP values were entered into the IMPRINT model to re-run the discrete event VACP analysis over a 100-run Monte Carlo simulation. Figure 6-15 shows the results of the new architecture with the automated assist functions in place.
Figure 6-15. Image Analyst Workload Analysis Results with Auto Assist Functions
Figure 6-16. Architecture Workload Comparison
Even with the new automated assistance, the Image Analyst workload still reaches two peaks that are over the workload threshold of 40. The first peak occurs very early on and is slightly over the threshold at 43.6. Because this is a slight overload for a short period of time (about 2 minutes), the architecture can tolerate it. The second peak, though, lasts a little over 3 minutes with a peak overload of 58. In Figure 6-16, the Image Analyst workload demands of both functional architectures are compared. Overall, with the new architecture there was a 13% average reduction in workload throughout the operational concept, with a maximum workload reduction of 43% at some moments in time. This architecture was not able to reduce the workload for the tasks described in Table 6-2.
(Figure 6-16 chart: Image Analyst workload over mission time, comparing the Manual and AutoAssist architectures.)
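To make the comparison metrics concrete, the short fragment below computes the average and maximum percentage workload reduction between two time-aligned traces; the sample values are invented for illustration and are not the case study data.

```python
# Average and maximum percentage workload reduction between two time-aligned
# workload traces (manual vs. auto-assist); the values are illustrative only.
manual = [22.0, 35.0, 50.9, 58.0, 31.0, 18.0]
auto_assist = [20.0, 30.0, 43.6, 58.0, 27.0, 10.3]

reductions = [(m - a) / m * 100.0 for m, a in zip(manual, auto_assist) if m > 0]
print(f"average reduction: {sum(reductions) / len(reductions):.1f}%")
print(f"maximum reduction: {max(reductions):.1f}%")
```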
Table 6-2. Workload Overload Task Drivers
Time Lapsed | Human Function | Human Task | Ow
01:28:18.75 | 96D-1232 Exploit Full Motion Video Imagery v2 | Conduct analysis of data frame by frame v2 | 50.90
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
 | 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 | Verify the grid zone designator on the map |
01:28:30.18 | 96D-1232 Exploit Full Motion Video Imagery v2 | Conduct analysis of data frame by frame v2 | 58.00
 | 96D-1237 ID Unconventional Act on Imagery v2 | Determine the requirement by examining the exploitation requirement |
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
01:30:04.46 | 96D-1237 ID Unconventional Act on Imagery v2 | Determine the requirement by examining the exploitation requirement | 57.30
 | 96D-1232 Exploit Full Motion Video Imagery v2 | Prepare any SPOT or SALUTE reports v2 |
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
01:30:47.55 | 96D-1237 ID Unconventional Act on Imagery v2 | Locate the unconventional activity on the imagery / map sheet | 46.80
 | 96D-1232 Exploit Full Motion Video Imagery v2 | Prepare any SPOT or SALUTE reports v2 |
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
01:31:05.19 | 96D-1237 ID Unconventional Act on Imagery v2 | Locate the unconventional activity on the imagery / map sheet | 40.10
 | 96D-1232 Exploit Full Motion Video Imagery v2 | Perform any object recognition and identification v2 |
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
01:31:41.86 | 96D-1237 ID Unconventional Act on Imagery v2 | Locate the unconventional activity on the imagery / map sheet | 47.00
 | 96D-1232 Exploit Full Motion Video Imagery v2 | Perform any audio/video capture v2 |
 | Direct AUV Sensor Employment v3 | Provide direction/guidance to UAV sensor operator |
Since the automated aids did not reduce the workload for these tasks, the architecture was re-evaluated to see if other architectural changes could be made. Revisiting the original study, Hunn et al. (2008) studied other human agents as part of a larger concept of operations that could affect architecture results through team dynamics. Given the right conditions in a team environment, an analysis could examine where other human agents can pick up tasks through team dynamics. Because the focus of this research is on the individual's impact on a system, this is a shortfall of the human system integration ontology that will need to be researched and added at a future date. This ontology will, however, facilitate that future research, since the core items are already part of the ontology. Further, it will allow architects to identify key items needed for human agent situational awareness that are critical for human-to-human task distribution or for cases in which a human is a secondary option to the machine.
6.4 CASE STUDY SUMMARY
As seen in this case study, the implementation of the human system integration ontology as a system modeling profile enabled human considerations to be more accurately integrated into the system architecture. With this tight integration, traditional human system analysis could be done, with the added benefit of in-depth analysis of how those considerations directly affect human performance. By later trading different functional architectures, the new human centered architecting method established an architecture closer to the optimal architecture needed to execute system operations, decreasing workload across the scenario by 13% on average.
Chapter 7: Summary and Future Research Implications
7.1 SUMMARY & RESEARCH IMPLICATIONS
Model-Based Systems Engineering (MBSE) is the current paradigm for system
engineering. System Modeling Language (SysML) is the notation for implementing the
various models and views required in the systems engineering lifecycle. Thus far, MBSE
has focused on the front end of system engineering. Today, MBSE is being extended to
address other lifecycle phases. As part of this extended MBSE scope, the integration of
humans with and within systems has been addressed. Inspired by the transition of humans from operators to system agents, it was necessary to define several human views to
provide the basis for addressing human system integration concerns, and for specifying
human system interaction design and human machine interfaces. This research addressed
the subject from the perspective of extending MBSE to address HSI concerns. In
particular, the research looked at creating an HSI ontology and using these semantics to
assist in integrating multiple system models to explore and address HSI issues during
system architecting.
As a result of these modeling advances, system architects and system engineers
are able to analyze for the human agent using a model-based systems engineering
approach. This extension serves as a framework for future work in extending model-
based systems engineering to other specialty engineering disciplines and to the system
“ilities”. In the HSI community, it advances current practices by expanding the HSI
consideration beyond the system design phases to the entire system lifecycle, enabling HSI considerations within system architecting and systems engineering methods and processes.
The new human centered model based systems engineering methods, processes, and tools established in this research have changed how architectures explore human interactions, interfaces, and integration. As seen in the UAV Image Analyst case study in Chapter 6, the new methods, processes, and tools allow the architect and human systems integration engineer to evaluate the impact of different levels of automation on human performance.
Analyst had to receive the direct full motion video or still images from the UAV and
evaluate for various factors that would get cued for further processing and data fusion to
other systems and personnel. With the new methods, processes, and tools, a new
architecture was developed that included image identification and cueing automation.
With these changes the Image Analyst was still responsible for reviewing the full motion
video and still images but had the luxury of the full motion video and still imagery being
prepopulated with areas of interest that would then be corrected as needed by the image
analyst. The image analyst moved from a worker role to a supervisor role, which reduced
overall workload on the Image Analyst. In fact, the workload analysis showed that the
architecture with automation reduced workload on average 13% throughout the scenario
analyzed.
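As an illustration of how such a comparison can be computed, the sketch below (Python) time-weights a workload trace of the kind shown in Appendix B and reports the percent reduction between two architectures. The numeric traces are made-up placeholders, not the case-study data.

def time_weighted_average(trace):
    """Average workload over a trace of (time_seconds, workload) samples,
    weighting each workload level by how long it was held."""
    total_time = trace[-1][0] - trace[0][0]
    if total_time <= 0:
        return trace[0][1]
    weighted = 0.0
    for (t0, w0), (t1, _) in zip(trace, trace[1:]):
        weighted += w0 * (t1 - t0)
    return weighted / total_time

# Placeholder traces for a manual architecture and an auto-assist architecture.
manual_trace = [(0, 49.4), (60, 35.6), (180, 53.8), (400, 34.0), (600, 17.0)]
assist_trace = [(0, 42.0), (60, 31.0), (180, 46.0), (400, 30.0), (600, 15.0)]

manual_avg = time_weighted_average(manual_trace)
assist_avg = time_weighted_average(assist_trace)
reduction = 100.0 * (manual_avg - assist_avg) / manual_avg
print(f"average workload reduction: {reduction:.1f}%")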
Along with direct comparison of architectures optimized for the human agent,
the new methods, processes, and tools enable sensitivity analysis, exploring how sensitive
human performance is to changes in the machine. Another analysis that could have been
run with them is how each added automation function changed human performance until
the identification functions were fully automated. Although human error analysis was not
a primary research focus, a byproduct of the integrations presented in this research is the
ability to include human error as another factor. The sensitivity analysis could explore the
level of failure or error introduced by changes in how the machine handles human
interaction, interfaces, and integration.
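A minimal sketch of that kind of sweep follows: every combination of automation functions is evaluated and the resulting workload recorded. The function names and the evaluate_workload stand-in are invented for illustration; a real study would run the human performance analysis tool for each configuration.

from itertools import combinations

AUTOMATION_FUNCTIONS = ["image_identification", "cueing", "change_detection"]

def evaluate_workload(enabled):
    # Placeholder: pretend each enabled automation function removes a fixed
    # share of a nominal workload. A real sweep would invoke the analysis tool.
    baseline = 50.0
    return baseline * (1.0 - 0.05 * len(enabled))

results = {}
for k in range(len(AUTOMATION_FUNCTIONS) + 1):
    for subset in combinations(AUTOMATION_FUNCTIONS, k):
        results[subset] = evaluate_workload(subset)

for subset, workload in sorted(results.items(), key=lambda kv: kv[1]):
    label = ", ".join(subset) if subset else "fully manual"
    print(f"{label:45s} -> mean workload {workload:.1f}")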
Another aspect shown in the case study was the new set of tools, which made it
easier to ascertain what information the human needed for situational awareness.
Situational awareness becomes a key constraint when the architecture moves the human
agent from a worker role to a supervisor role. In the case study, these considerations were
not as critical, since the system's purpose was to designate areas of interest that could be
used for intelligence through data fusion. For other systems, where the supervisory role is
critical to mission success or can affect life, better knowledge of what the human agent
must be able to react to could mean the difference between success and failure, with the
worst consequence being loss of life. For example, in the UAV market today there is a
push to put more artificial intelligence onboard the UAVs to reduce the human agent's
control burden and to allow one agent to control multiple aircraft. The problem with this
scenario is that if a failure occurs, the human agent must first recognize that a failure has
happened, then determine the system's status and the actions he or she can take to correct
the situation. The level of situational awareness on the ground is usually not as high as
onboard the UAV, since UAVs tend to carry multiple sensors and, in some cases,
functionally duplicated measurements. Much of this data never reaches the ground
because of bandwidth limitations on communication systems or mission constraints. The
new methods, processes, and tools, however, give a better picture of the information a
human agent needs in order to make decisions. By understanding more explicitly what
the vehicle could be using to make decisions and what data actually reaches the human
agent, the architect can analyze gaps in situational awareness early in the development
lifecycle rather than in test and evaluation or, worse yet, in operations and sustainment.
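The core of such a gap analysis reduces to comparing two sets of data items, as in the sketch below; the item names are invented for illustration and stand in for the information elements captured in the system model.

# What the vehicle uses on board to make decisions versus what reaches the operator.
onboard_inputs = {"gps_position", "imu_attitude", "engine_health",
                  "link_margin", "sensor_status", "fuel_state"}
downlinked_to_ground = {"gps_position", "sensor_status", "fuel_state"}

sa_gaps = onboard_inputs - downlinked_to_ground
if sa_gaps:
    print("Data the vehicle acts on but the human agent never sees:")
    for item in sorted(sa_gaps):
        print(f"  - {item}")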
Although this research focused on increasing human performance by extending
MBSE for human considerations, one of its outcomes has been to expand the use of
ontologies to drive model integration, making them the foundation for integrating MBSE
methods, processes, and tools through a more data-centric approach. The human model-
based environment built as part of this research was not based on point-to-point solutions
but on data integration through a data broker. This approach changes how models are
integrated and allows MBSE environments to scale: the driving force is the ontology
rather than the tools, keeping the methods and processes tool agnostic. The focus shifts
away from tool-to-tool integration and onto the data broker, which requests the correct
data from each tool in accordance with the ontology axioms.
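The sketch below illustrates the data-broker pattern described above under assumed interfaces: each tool registers a producer or consumer adapter keyed by an ontology concept, and the broker moves data between tools by concept rather than through point-to-point links. The concept key and the toy adapters are placeholders, not the actual environment built in this research.

from typing import Any, Callable, Dict, List


class DataBroker:
    """Moves data between tool adapters according to ontology concepts."""

    def __init__(self) -> None:
        self._producers: Dict[str, Callable[[], Any]] = {}
        self._consumers: Dict[str, List[Callable[[Any], None]]] = {}

    def register_producer(self, concept: str, read: Callable[[], Any]) -> None:
        # A tool that can supply data for a given ontology concept.
        self._producers[concept] = read

    def register_consumer(self, concept: str, write: Callable[[Any], None]) -> None:
        # A tool that needs data for a given ontology concept.
        self._consumers.setdefault(concept, []).append(write)

    def synchronize(self, concept: str) -> None:
        # Pull from the producing tool and push to every consuming tool.
        data = self._producers[concept]()
        for write in self._consumers.get(concept, []):
            write(data)


# Toy adapters standing in for the architecture tool and the HSI analysis tool.
broker = DataBroker()
broker.register_producer("hsi:Task", lambda: ["Exploit FMV", "Plot coordinates"])
broker.register_consumer("hsi:Task", lambda tasks: print("seeding analysis with", tasks))
broker.synchronize("hsi:Task")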
7.2 RESEARCH CONTRIBUTIONS
This research contributes to the advancement of the systems engineering body of
knowledge by extending traditional system architecting methods, processes, and tools to
integrate human considerations for a broader perspective on system optimization. The
primary beneficiaries of this research are systems engineering practitioners who focus on
architecting and human system integration. The new methods, processes, and tools
developed through this research offer them new opportunities for system integration.
Secondary beneficiaries are systems engineering researchers working to expand MBSE.
The primary contributions in Systems Engineering and Architecting, Human
System Integration, and Model-Based Systems Engineering are:
1. The development of a Human System Integration Ontology to integrate
traditional systems engineering semantics with HSI semantics
a. Evaluated the HSI ontology for conformance and consistency via a
model-based reasoner
b. Changed how architectures address human interactions, interfaces,
and integration
c. Enabled an in-depth look at the change of the human role from
worker to supervisor
2. Expanded the system architecting process to include HSI considerations
up front
3. Created the capability for in-depth sensitivity analysis exploring the
effects of changes to the machine on human performance
4. Developed a new set of tools for human-system definition and analysis
a. An HSI Profile was created, in accordance with the HSI ontology,
to extend the Systems Modeling Language
b. An architecture conversion tool was developed and implemented to
seed the integrated human considerations from the architecture to
an HSI analysis tool
5. Created a framework for future MBSE extensions to other specialty
engineering disciplines
6. Changed how tool integration is approached, from a tool perspective to a
data-centric perspective
7.3 SUMMARY OF ACCOMPLISHMENTS
The two hypotheses stated in Section 3.1.1 were tested to determine their validity:
1. The first hypothesis was that, by extending system semantics with human-
system factors and attributes, Model-Based Systems Engineering can be
used to evaluate the interactions, interfaces, and integration of the human
element. By integrating systems engineering semantics with HSI semantics,
architects are now better suited to evaluate human-system interactions,
interfaces, and overall integration. Architects are now able to seed HSI
analysis tools directly from the architecture, letting the design engineers
carry out detailed design and evaluate for inconsistencies or degradation in
performance (a minimal export sketch follows this list).
2. The second hypothesis was that, by extending Model-Based Systems
Engineering methods, processes, and tools through key viewpoints and
factors, human performance can be increased. As seen in the case study,
using the newly developed processes, methods, and tools, the overall
workload for the human agent was reduced by 13% over the scenario.
This reduction in workload was due to re-architecting the Image Analyst
Control System to include more automated functions that reduce the
human agent's workload. With integrated semantics, architects obtain
faster results on where to focus and can account more explicitly for the
resources the human agent needs and the type of information
the human agent may need.
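As referenced under hypothesis 1, seeding an analysis tool from the architecture amounts to exporting the functions, tasks, and agent assignments held in the system model into a form the analysis tool can ingest. The sketch below writes such a seed file; the in-memory stand-in for the model, the agent assignment, and the CSV column layout are hypothetical and do not reflect the actual input format of IMPRINT or any other tool.

import csv

# Stand-in for functions and tasks read from the architecture model.
architecture_model = {
    "Exploit Full Motion Video Imagery": [
        "Determine the requirement",
        "Obtain supporting data",
        "Conduct analysis of data",
    ],
    "Conduct Aerial Route Reconnaissance": [
        "Monitor terrain",
        "Monitor control measures",
    ],
}

# Write a flat seed file that an HSI analysis tool could import.
with open("hsi_analysis_seed.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["function", "task", "assigned_agent"])
    for function, tasks in architecture_model.items():
        for task in tasks:
            writer.writerow([function, task, "Primary Image Analyst"])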
7.4 FUTURE RESEARCH
Although this research makes great strides toward providing better integration
between traditional systems architecting and human considerations, it has only focused
on individual interactions and is limited to workload and cognitive analysis. The
following are areas that would improve the work completed in this research:
1. The HSI ontology is focused on cognitive analysis, workload task
analysis, and human-system interface analysis. It does not cover analysis
of errors and fault defenses arising from human interactions, nor does it
cover analysis of team interactions or organizational analysis. Another
emerging area of systems engineering toward which this ontology can
evolve is how humans integrate with adaptable systems. To fill these
gaps in the future, more in-depth analysis of the underlying concepts in
each area must be done to find the causality of the relationships between
the human agent and the machine, each area being, in itself, a large field
of study as new topics emerge. The HSI ontology is far from a complete
work. Just as language evolves with time, the HSI ontology will evolve
with the maturity of new technologies and with the new relationships
being built within the systems engineering and human system integration
domains.
2. The HSI profile must be refined from a prototype to a more polished plug-
in, with automation and usability scripts for ease of use and adoption
within the engineering community. As the ontology changes, the HSI
profile must be updated to stay in sync with it.
3. The integration tool must be refined from a prototype to a more versatile
plug-in, allowing new analyses and tool integrations to be included in
accordance with new additions to the ontology.
In addition to the integration between the systems engineering and human system
integration disciplines, the framework presented in this research could be applied to
other areas:
1. The ontology can scale to other domains using the framework laid out in
this research, in areas such as resilience, system security, and safety.
Ontologies for these specialty engineering disciplines should be created to
better understand how their considerations affect the larger architecture.
In some cases, strides are already being made to include these types of
semantics in future architecture frameworks, but they have yet to be
thoroughly researched for optimization.
2. Ontologies should not be used only to expand the semantics between
engineering domains; they should also be used to develop mission-domain-
specific ontologies.
3. One of the biggest findings that needs further investigation is the use of
ontologies for tool-to-tool integration. By taking a data-centric approach,
emerging big data methods for model and architecture analysis, governed
by the ontological axioms, and smarter engineering aids can be created to
automate and verify the model data, as illustrated in the sketch below.
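A small sketch of that idea, under an assumed vocabulary: model data exported as RDF is checked against one simple axiom ("every task performed by a human agent must declare at least one required information item") with a SPARQL query in rdflib. The classes, properties, instance names, and the axiom itself are invented for illustration.

from rdflib import Graph, Namespace
from rdflib.namespace import RDF

HSI = Namespace("http://example.org/hsi#")   # assumed namespace
g = Graph()
g.bind("hsi", HSI)

# Instance data as it might be exported from the system model.
g.add((HSI.imageAnalyst, RDF.type, HSI.HumanAgent))
g.add((HSI.exploitFMV, RDF.type, HSI.Task))
g.add((HSI.imageAnalyst, HSI.performs, HSI.exploitFMV))
# No hsi:requiresInformation triple for exploitFMV, so the check should flag it.

violations = g.query("""
    PREFIX hsi: <http://example.org/hsi#>
    SELECT ?task WHERE {
        ?agent a hsi:HumanAgent ;
               hsi:performs ?task .
        FILTER NOT EXISTS { ?task hsi:requiresInformation ?info . }
    }
""")
for row in violations:
    print(f"Task {row.task} has no declared information needs")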
References
Allender, L. (2000). Modeling Human Performance: Impacting System Design,
Performance, and Cost. In Proceedings of the Military, Government and
Aerospace Simulation Symposium, 2000 Advanced Simulation Technologies
Conference (Vol. 32, pp. 139–144). Washington, DC: Society for Computer
Simulation; 1999. Retrieved from
http://www.arl.army.mil/www/pages/447/Astc2000-Allender.pdf
American National Standards Institute, & Electronic Industries Alliance. (1998).
Processes for Engineering a System.
Arnold, S., Earthy, J., & Sherwood-Jones, B. (2002). Addressing the People Problem -
ISO/IEC 15288 and the Human-System Life Cycle. In INCOSE International
Symposium (pp. 1–7).
Axelsson, J. (2002). Towards an Improved Understanding of Humans as the Components
that Implement Systems Engineering. In Proceedings 12th Symposium of the
International Council on System Engineering (pp. 1–6). Las Vegas,. Retrieved
from http://www.mrtc.mdh.se/wfcs2002/index.html/html/jakob/incose02.pdf
Baber, C. (2004). Critical Path Analysis. In Handbook of Human Factors and
Ergonomics Methods. Boca Raton, FL: CLC Press.
Badler, N. I., Phillips, C. B., & Webber, B. L. (1993). Simulating humans: Computer
graphics, animation, and control. Center for Human …. New York, New York,
USA: Oxford University Press. Retrieved from
http://www.cis.upenn.edu/~badler/book/book.html
Bahill, T., & Madni, A. M. (2017). Trade-off Decisions in System Design. Springer.
Bajaj, M., Zwemer, D., Peak, R., Phung, A., Scott, A. G., & Wilson, M. W. (2011).
SLIM: Collaborative Model-Based Systems Engineering Workspace for Next-
Generation Complex Systems. In IEEE Aerospace Conference (pp. 1–15).
Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5747539
Baker, K., Stewart, A., Pogue, C., & Ramotar, R. (2008). Human Views: Extensions to
the Department of Defense Architecture Framework. Contract.
Balakrishnan, P. K. (2002). Analysis of Human Factors in Specific Aspects of System
Design. In INCOSE International Symposium (pp. 1–9).
Bloechle, W. K., & Laughery Jr., K. R. (1999). Simulation Interoperability using Micro
Saint Simulation Software. In Proceedings of the Winter Simulation Conference
(pp. 286–288).
Bloechle, W. K., & Schunk, D. W. (2003). Micro Saint Sharp Simulation Software. In
Proceedings of the Winter Simulation Conference (pp. 182–187).
BNH Expert Software Inc. (2000). Advisor 3.5 User Guide.
Boehm, B., Bayuk, J., Desmukh, A., Graybill, R., Lane, J., Levin, A., … Wade, J. (2010).
Systems 2020 Strategic Initiative.
Booch, G. (1986). Object-Oriented Development. IEEE Transactions on Software
Engineering, SE-12(2), 211–221. https://doi.org/10.1109/TSE.1986.6312937
Booher, H. R. (2003). Handbook of Human System Integration. Hoboken, NJ: Wiley-
Interscience. Retrieved from
http://onlinelibrary.wiley.com/doi/10.1002/0471721174.fmatter/summary
Booz Allen Hamilton. (2010). Systems-2020 Study Final Report.
Boy, G. A. (1998). Cognitive Function Analysis for Human-Centered Automation of
Safety-Critical Systems. In Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems (pp. 265–272). New York, New York, USA: ACM
Press. https://doi.org/10.1145/274644.274682
Bruseberg, A. (2008). Human Views for MODAF as a Bridge Between Human Factors
Integration and Systems Engineering. Journal of Cognitive Engineering and
Decision Making, 2(3), 220–248. https://doi.org/10.1518/155534308X377090.
Buede, D. M. (2000). The Engineering Design of Systems: Models and Methods (First
Edition). New York, New York, USA: John Wiley & Sons, Inc.
Buede, D. M. (2009). The Engineering Design of Systems: Models and Methods. (A. P.
Sage, Ed.) (Second Edition). Hoboken, NJ: John Wiley & Sons, Inc.
Buie, E., & Vallone, A. (1995). Human-System Interaction Development: An Integral
Part of the Systems Engineering Process. In Proceedings of the 5th Annual
International Symposium of the National Council on Systems Engineering (pp.
927–930). Retrieved from
http://libsys.uah.edu/library/incose/Contents/Papers/95/9592.pdf
Card, S., Moran, T. P., & Newell, A. (1983). The Psychology of Human Computer
Interaction. Hillside, NJ: Lawrence Erlbaum Associates.
Chapanis, A. (1996). Human Factors in Systems Engineering. New York, New York,
USA: John Wiley & Sons, Inc.
Chua, Z. K., & Feigh, K. M. (2011). Integrating Human Factors Principles into Systems
Engineering. In IEEE/AIAA 30th Digital Avionics Systems Conferences (DASC)
(pp. 240–245). Retrieved from
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6096089
Cloutier, R., & Griego, R. (2008). Applying Object Oriented Systems Engineering to
Complex Systems. In IEEE International Systems Conference (SysCon).
Corker, K. M., & Smith, B. R. (1993). An Architecture and Model For Cognitive
Engineering Simulation Analysis: Application to Advanced Aviation Automation.
In AIAA Computing in Aerospace.
Cugola, G., Di Nitto, E., Fuggetta, A., & Ghezzi, C. (1996). A Framework for
Formalizing Inconsistencies and Deviations in Human-Centered Systems. ACM
Transactions on Software Engineering and Methodology (TOSEM), 5(3), 191–
230. https://doi.org/10.1145/234426.234427
Dick, J., & Chard, J. (2004). The Systems Engineering Sandwich: Combining
Requirements, Models and Design. White Paper, Telelogic. Retrieved from
http://scholar.google.com/scholar?hl=en&btnG=Search&q=intitle:The+Systems+
Engineering+Sandwich+:+Combining+requirements+,+models+and+design#0
Dolan, N., & Narkevicius, J. M. (2005). Systems Engineering, Acquisition and Personnel
Integration (SEAPRINT): Achieving the Promise of Human Systems Integration.
In Meeting Proceedings RTO-MP-HFM-124 (pp. 1–6). Neuilly-sur-Seine, France.
Estefan, J. A. (2008). Survey of Model-Based Systems Engineering (MBSE)
Methodologies. Jet Propulsion Lab, 25, 1–70. Retrieved from
http://www.omgsysml.org/MBSE_Methodology_Survey_RevB.pdf
Fischhoff, B., Slovic, P., & Lichtenstein, S. (1977). Knowing with Certainty: The
Appropriateness of Extreme Confidence. Journal of Experimental Psychology:
Human Perception and Performance, 3(4), 552–564.
Fitts, P. M. (1951). Human Engineering for an Effective Air Navigation and Traffic
Control System. Columbus, OH.
Gerhardt-Powals, J. (1996). Cognitive Engineering Principles for Enhancing Human-
Computer Performance. International Journal of Human-Computer Interaction,
8(2), 189–211.
Hammond, K. R., Hamm, R. M., Grassia, J., & Pearson, T. (1987). Direct Comparison of
the Efficacy of Intuitive and Analytical Cognition in Expert Judgment. IEEE
Transactions on Systems, Man, and Cybernetics, 17(5), 753–770.
Handley, H. A. H., & Smillie, R. J. (2008). Architecture Framework Human View: The
NATO Approach. Journal Systems Engineering, 11(2), 156–164.
https://doi.org/10.1002/sys.v11:2
Handley, H. A. H., & Smillie, R. J. (2009). Human View Dynamics — The NATO
Approach. Systems Engineering, 72–79. https://doi.org/10.1002/sys
Hardman, N., & Colombi, J. (2012). An Empirical Methodology for Human Integration
in the SE Technical Processes. Systems Engineering, 15(2), 172–190.
https://doi.org/10.1002/sys
Hart, S. G. (2006). Nasa-Task Load Index (NASA-TLX); 20 Years Later. In Proceedings
of the Human Factors and Ergonomics Society Annual Meeting (pp. 904–908).
Santa Monica, CA. https://doi.org/10.1037/e577632012-009
Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load index):
Result of Empirical and Theoretical Research. In Human Mental Workload (P. A.
Hanc). Amsterdam: North Holland Press. Retrieved from
http://humanfactors.arc.nasa.gov/groups/TLX/tlxpublications.html
Hartson, H. R., & Hix, D. (1989). Human-Computer Interface Development: Concepts
and Systems for its Management. ACM Computing Surveys, 21(1), 5–92.
https://doi.org/10.1145/62029.62031
Hause, M. (2010). The Unified Profile for DoDAF/MODAF (UPDM) Enabling Systems
of Systems on Many Levels. In IEEE International Systems Conference (pp. 426–
431). Ieee. https://doi.org/10.1109/SYSTEMS.2010.5482450
Herzog, E., & Pandikow, A. (2005). SysML – An Assessment. In Syntell AB, SE 100.
Retrieved from
http://pdf.aminer.org/000/259/725/the_assessment_of_object_oriented_modelling
_elements_of_the_uml.pdf
Hobbs, A. (2008). Three Principles of Human-System Integration. In Proceedings of the
8th Australian Psychology Symposium (pp. 1–8). Retrieved from
http://humansystems.arc.nasa.gov/publications/3Principles_HSI.pdf
Hoffmann, H.-P. (2005). UML 2.0-Based Systems Engineering Using a Model-Driven
Development Approach. CrossTalk The Journal of Defense Software
Engineering, 1–18.
Hollnagel, E., & Woods, D. D. (1999). Cognitive Systems Engineering: New Wine in
New Bottles. International Journal of Human-Computer Studies, 51(2), 339–56.
Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/11543350
Hood, L., Laughery, K. R., & Dahl, S. (1993). Fundamentals of Simulation using Micro
Saint. In Proceedings of the Winter Simulation Conference (pp. 218–222).
Horn, J. A., & Barnaba, J. M. (1988). Successful Integration of Human Engineering
Principles in System Design: A Discussion and Recommendations. In
Proceedings of the IEEE National Aerospace and Electronics Conference
(NAECON) (pp. 904–906). Dayton, OH. Retrieved from
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=195115
Hunn, B. P., & Heuckeroth, O. H. (2006). A Shadow Unmanned Aerial Vehicle (UAV)
Improved Performance Research Integration Tool (IMPRINT) Model Supporting
Future Combat Systems. Aberdeen Proving Ground, MD. Retrieved from
http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&ident
ifier=ADA443567
Hunn, B. P., Schweitzer, K. M., Cahir, J. A., & Finch, M. M. (2008). IMPRINT Analysis
of an Unmanned Air System Geospatial Information Process. Aberdeen Proving
Ground, MD.
Institute of Electrical and Electronics Engineers. (2005). Application and Management of
the Systems Engineer Process (Vol. 2007).
International Council on Systems Engineering. (2011). Systems Engineering Handbook.
(C. Haskins, Ed.) (3.2.2). San Diego, CA.
International Standards Organization, International Electrotechnical Commission, &
Institute of Electrical and Electronics Engineers. (2008). ISO/IEC/IEEE Standard
15288 Systems and Software Engineering - System Life Cycle Processes.
International Standards Organization, International Electrotechnical Commission, &
Institute of Electrical and Electronics Engineers. (2011). ISO/IEC/IEEE Standard
42010 Systems and Software Engineering - Architecture Description.
Johnson, T. A., Paredis, C. J. J., & Burkhart, R. (2012). Integrating Models and
Simulations of Continuous Dynamics Into SysML. In Journal of Computing and
Information Science in Engineering (Vol. 12). https://doi.org/10.1115/1.4005452
Johnson, T. A., Paredis, C. J. J., Jobe, J. M., & Burkhart, R. (2007). Modeling
Continuous System Dynamics in SysML. In Proceedings of ASME International
Mechanical Engineering Congress and Exposition (pp. 1–9). Seattle, WA, USA.
Retrieved from http://srl.gatech.edu/publications/2007/JohnsonParedis-
IMECE2007_DRAFT.pdf
Kahneman, D., & Tversky, A. (1973). On the Psychology of Prediction. Psychological
Review, 80(4), 237–251.
Krikorian, H. F. (2003). Introduction to Object-Oriented Systems Engineering, Part 1. IT
Professional, (March|April), 38–42. Retrieved from
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1191791
Kurstedt, H. (2000a). A Systems Engineer’s Guide to Human Subsystems. In
Proceedings of EUSEC. Retrieved from
http://books.google.com/books?hl=en&lr=&id=0RvIMfcILm8C&oi=fnd&pg=PA
317&dq=A+Systems+Engineer’s+Guide+to+Human+Subsystems&ots=wWxpqR
mX3J&sig=0sx_w8PWhZIGC1HdjEM6sU4tcFc
Kurstedt, H. (2000b). Applying Engineering Principles to Human Components in
Complex Systems. In INCOSE International Symposium. Retrieved from
http://scholar.google.com/scholar?hl=en&btnG=Search&q=intitle:Applying+Engi
neering+Principles+to+Human+Components+in+Complex+Systems#2
Landsburg, A. C., Avery, L., Beaton, R., Bost, J. R., Comperatore, C., Khandpur, R., …
Sheridan, T. B. (2008). The Art of Successfully Applying Human Systems
Integration. American Society of Naval Engineers Journal, 120(1), 77–107.
https://doi.org/10.1111/j.1559-3584.2008.00113.x
Lebiere, C., & Anderson, J. R. (1993). A Connectionist Implementation of the ACT-R
Production System. In Proceedings of the Fifteenth Annual Conference of the
Cognitive Science Society (pp. 635–640). Retrieved from http://act-
r.psy.cmu.edu/papers/234/lebiere_and_anderson_93.pdf
Leveson, N. G. (2000). Intent Specifications: An Approach to Building Human-Centered
Specifications. IEEE Transactions on Software Engineering, 26(1), 15–35.
https://doi.org/10.1109/32.825764
Madni, A. M. (1988a). HUMANE: A Designer’s Assistant for Modeling and Evaluating
Function Allocation Options. In Proceedings of Ergonomics of Advanced
Manufacturing and Automated Systems Conference (pp. 291–302). Louisville,
KY.
Madni, A. M. (1988b). HUMANE: A Knowledge-Based Simulation Environment for
Human-Machine Function Allocation. In Proceedings of the IEEE National
Aerospace and Electronics Conference (pp. 860–866).
Madni, A. M. (1988c). The Role of Human Factors in Expert Systems Design and
Acceptance. Human Factors Journal, 30(4), 395–414.
https://doi.org/10.1177/001872088803000403
Madni, A. M. (2010). Integrating Humans with Software and Systems: Technical
Challenges and a Research Agenda. Systems Engineering, 13(3), 232–245.
https://doi.org/10.1002/sys
Madni, A. M. (2011a). Integrating Humans With and Within Complex Systems.
CrossTalk, May/June, 4–8.
Madni, A. M. (2011b). Towards a Generalizable Aiding-Training Continuum for Human
Performance Enhancement. Systems Engineering, 14(2), 129–140.
https://doi.org/10.1002/sys.20166
Madni, A. M. (2012). Elegant Systems Design: Creative Fusion of Simplicity and Power.
Systems Engineering, 15(3), 347–354. https://doi.org/10.1002/sys
Madni, A. M. (2015). Expanding Stakeholder Participation in Upfront System
Engineering Through Storytelling in Virtual Worlds. Systems Engineering (Vol.
18).
Madni, A. M. (2017). Mutual Adaptation in Human-Machine Teams. ISTI-WP-02-
012017.
Madni, A. M., & Freedy, A. (1986). Intelligent Interfaces for Human Control of
Advanced Automation and Smart Systems in Human Productivity Enhancement.
Training and Human Factors in Systems Design, 1(J. Zeidner), 318–331.
Madni, A. M., Madni, C. C., & Garcia, S. (2005a). Cognitive Model-enabled Simulation-
based Training of Aerospace Operations Center Operators. In Proceedings of the
11th International Conference on Human-Computer Interaction. Las Vegas, NV.
Madni, A. M., Madni, C. C., & Garcia, S. (2005b). Cognitive Model-enabled Simulation-
based Training of AOC Operators. In Proceedings of AIAA Infotech@Aerospace.
Arlington, VA.
Madni, A. M., Madni, C. C., Garcia, S., & Sorensen, H. B. (2005). Intelligent Agents for
Individual and Team Training Applications. In Proceedings of AIAA
Infotech@Aerospace. Arlington, VA.
Madni, A. M., Moini, A., & Madni, C. C. (2006). Cognitecture™: Cognition-Based
Interactive Architecture for Next Generation Intelligent Agents. Phase II Final
Technical Report, ISTI-FTR-567-8/06, Contract #W31P4Q-04-C-R222.
Madni, A. M., Sage, A. P., & Madni, C. C. (2005). Infusion of Cognitive Engineering
into Systems Engineering Processes and Practices. In IEEE International
Conference on Systems, Man and Cybernetics (Vol. 1, pp. 960–965). Ieee.
https://doi.org/10.1109/ICSMC.2005.1571270
Madni, A. M., & Sievers, M. (2013). Systems Integration: Key Perspectives,
Experiences, and Challenge. Systems Engineering, 16(4), 1–23.
Mayer, R. J., Crump, J. W., Fernandes, R., Keen, A., & Painter, M. K. (1995).
Information Integration for Concurrent Engineering (IICE) Compendium of
Methods Report. Wright-Patterson Air Force Base, Ohio. Retrieved from
http://www.idef.com/pdf/idef3_fn.pdf
Mayer, R. J., Painter, M. K., & DeWitte, P. S. (1992). IDEF Family of Methods for
Concurrent Engineering and Business Re-engineering Applications.
McMaster, R., & Baber, C. (2005). Integrating Human Factors into Systems Engineering
through a Distributed Cognition Notation. In The FIEE and MOD HFI DTC
Symposium on People and Systems - Who are we Designing for? (pp. 77–83). Iee.
https://doi.org/10.1049/ic:20050454
Meilich, A. (2007). Human Systems Integration - A Systems of Systems Engineering
Challenge. In IEEE International Conference on System of Systems Engineering
(pp. 1–6). San Antonio, TX. https://doi.org/10.1109/SYSOSE.2007.4304216
Memmel, T., Gundelsweiler, F., & Reiterer, H. (2007). Agile Human-Centered Software
Engineering. In Proceedings of the 21st British HCI Group Conference on People
and Computers: HCI ... But Not As We Know It (Vol. 1, pp. 167–175). Retrieved
from http://dl.acm.org/citation.cfm?id=1531317
Miller, N. L., Crowson Jr, J. J., & Narkevicius, J. M. (2003). Human Characteristics and
Measures in Systems Design. In H. R. Booher (Ed.), Handbook of Human
Systems Integration (pp. 699–742). John Wiley & Sons, Inc.
Milner, S., & Wheeler, P. (2001). Integrating Human Factors and Systems Engineering.
In The Second International Conference on Human Interfaces in Controls Rooms,
Cockpits and Command Centres (pp. 240–245). Retrieved from http://digital-
library.theiet.org/content/conferences/10.1049/cp_20010469
Mitchell, D. K. (2003). Advanced Improved Performance Research Integration Tool
(IMPRINT) Vetronics Technology Test Bed Model Development. Aberdeen
Proving Ground, MD. Retrieved from http://www.dtic.mil/cgi-
bin/GetTRDoc?AD=ADA417350&Location=U2&doc=GetTRDoc.pdf
Monsell, S. (2003). Task Switching. Trends in Cognitive Sciences, 7(3), 134–140.
https://doi.org/10.1016/S1364-6613(03)00028-7
Narkevicius, J. M. (2008). Human Factors and Systems Engineering - Integrating for
Successful Systems Development. In Proceedings of the Human Factors and
Ergonomics Society Annual Meeting (Vol. 52, pp. 1961–1963).
https://doi.org/10.1177/154193120805202409
Neches, R., & Madni, A. M. (2012). Towards Affordably Adaptable and Effective
Systems. Systems Engineering, 16(2), 224–234. https://doi.org/10.1002/sys
Nolan, B., Brown, B., Balmelli, L., Bohn, T., & Ueli, W. (2008). Model Driven Systems
Development with Rational Products.
Novak, J. D., & Cañas, A. J. (2006). The Theory Underlying Concept Maps and How to
Construct Them, Technical Report IHMC CmapTools 2006-01. Florida Institute
for Human and Machine Cognition, 1–33. Retrieved from
http://cmap.ihmc.us/Publications/ResearchPapers/
TheoryUnderlyingConceptMaps.pdf
Object Management Group. (2010). Systems Modeling Language v1.2. Needham, MA,
USA.
Object Management Group. (2011). Unified Modeling Language, Infrastructure v2.4.1.
Object Management Group. (2013). UML Testing Profile v1.2. Retrieved from
http://www.omg.org/spec/UTP/1.2
Parasuraman, R., & Riley, V. (1997). Humans and Automation: Use, Misuse, Disuse,
Abuse. Human Factors: The Journal of the Human Factors and Ergonomics
Society, 39(2), 230–253. https://doi.org/10.1518/001872097778543886
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A Model for Types and
Levels of Human Interaction with Automation. IEEE Transactions on Systems,
Man, and Cybernetics. Part A, Systems and Humans : A Publication of the IEEE
Systems, Man, and Cybernetics Society, 30(3), 286–97. Retrieved from
http://www.ncbi.nlm.nih.gov/pubmed/11760769
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2008). Situation Awareness, Mental
Workload, and Trust in Automation: Viable, Empirically Supported Cognitive
Engineering Constructs. Journal of Cognitive Engineering and Decision Making,
2(2), 140–160. https://doi.org/10.1518/155534308X284417.
Peak, R. S., Burkhart, R. M., Friedenthal, S. A., Wilson, M. W., Bajaj, M., & Kim, I.
(2007). Simulation-Based Design Using SysML Part 1: A Parametrics Primer. In
INCOSE International Symposium (pp. 1–20).
Polson, P. G. (1992). Cognitive Walkthroughs: A Method for Theory-Based Evaluation
of User Interfaces. International Journal of Man-Machine Studies, 36(5), 741–
773. https://doi.org/10.1016/0020-7373(92)90039-N
Price, S. W. (1994). An Improved Interface Simulation Architecture. Auburn University.
Retrieved from
http://www.eng.auburn.edu/files/acad_depts/csse/csse_technical_reports/CSSE94-
13.pdf
Rizvi, A. H., Singh, L. P., & Bhardwaj, A. (2009). Human Integration Matrix: An
Adaptive Framework to Enhance Human Systems Integration in New Product
Development. In Portland International Conference on Management of
Engineering & Technology (PICMET) (pp. 1989–1997). Portland, OR.
https://doi.org/10.1109/PICMET.2009.5261937
Rossmeissl, P. G., Tillman, B. W., Rigg, K. E., & Best, P. R. (1983). Job Assessment
Software System (JASS) for Analysis of Weapon Systems Personnel Requirements.
Alexandria, VA.
Roth, E. M., Patterson, E. S., & Mumaw, R. J. (2002). Cognitive Engineering: Issues in
User-Centered System Design. In Encyclopedia of Software Engineering, 2nd
Edition (pp. 163–179). New York, New York, USA: John Wiley & Sons, Inc.
Ruault, J. R. (2004). Bridging System Engineering and Human Factors Engineering: A
Step Forward. In INCOSE International Symposium.
Sharples, M., Jeffery, N., du Boulay, J. B. H., Teather, D., Teather, B., & du Boulay, G.
H. (2002). Socio-Cognitive Engineering: A Methodology for the Design of
Human-Centred Technology. European Journal of Operational Research, 136,
310–323. Retrieved from
http://www.sciencedirect.com/science/article/pii/S0377221701001187
Shepherd, A. (1998). HTA As A Framework For Task Analysis. Ergonomics, 41(11),
1537–1552. https://doi.org/10.1080/001401398186063
Sheridan, T. B. (2000). Function allocation: algorithm, alchemy or apostasy?
International Journal of Human-Computer Studies, 52(2), 203–216.
https://doi.org/10.1006/ijhc.1999.0285
Sheridan, T. B. (2008). Risk, Human Error, and System Resilience: Fundamental Ideas.
Human Factors: The Journal of the Human Factors and Ergonomics Society,
50(3), 418–426. https://doi.org/10.1518/001872008X250773
Sheridan, T. B., & Ferrell, W. R. (1974). Man-Machine Systems: Information, Control,
and Decision Models of Human Performance. Cambridge, MA: MIT Press.
Slovic, P., & Tversky, A. (1974). Who Accepts Savage’s Axiom? Behavioral Science,
19(6), 368–373.
Spangelo, S. C. (2013). Model Based Systems Engineering (MBSE) Applied to Radio
Aurora Explorer (RAX) CubeSat Mission Operational Scenarios. In IEEE
Aerospace Conference (pp. 1–18). Big Sky, MT.
https://doi.org/http://dx.doi.org.libproxy.usc.edu/10.1109/AERO.2013.6496894
Spangelo, S. C., Kaslow, D., Delp, C., Cole, B., Anderson, L., Fosse, E., … Cutler, J.
(2012). Applying Model Based Systems Engineering (MBSE) to a Standard
CubeSat. In 2012 IEEE Aerospace Conference (pp. 1–20). Ieee.
https://doi.org/10.1109/AERO.2012.6187339
Swain, A. D. (1964). THERP, SC-R-64-1338. Albuquerque, NM.
Systems Management College. (2001). Systems Engineering Fundamentals. Defense
Acquisition University Press.
Taatgen, N. A., Lebiere, C., & Anderson, J. R. (2006). Modeling Paradigms in ACT-R. In
R. Sun (Ed.), Cognition and Multi-Agent Interaction: From Cognitive Modeling
to Social Simulation (pp. 29–52). Cambridge University Press. Retrieved from
http://act-r.psy.cmu.edu/papers/570/SDOC4697.pdf
Three Mile Island. (1979). Three Mile Island. Technology and Society, 7(28), 795–795.
https://doi.org/10.1126/science.795-a
Todd, B., & McConnell, D. (2011). Autopilots May Dull Skills of Pilots, Committee
Says - CNN. Retrieved November 1, 2011, from http://articles.cnn.com/2011-09-
01/travel/airlines.autopilot_1_colgan-air-airline-pilots-association-
autopilot?_s=PM:TRAVEL
Tversky, A., & Kahneman, D. (1998). Judgment under Uncertainty: Heuristics and
Biases. In G. Mather, F. Verstraten, & S. Anstis (Eds.), The Motion After Effect
(Vol. 185, pp. 585–600). MIT Press.
U.S. Air Force. (2009). Air Force Human Systems Integration Handbook. Brooks City-
Base, TX.
Vanderperren, Y., & Dehaene, W. (2006). From UML/SysML to Matlab/Simulink:
Current State and Future Perspectives. In Proceedings of the Design Automation
& Test in Europe Conference. Ieee. https://doi.org/10.1109/DATE.2006.244002
Vasarhelyi, M. A. (1977). Man-Machine Planning Systems: A Cognitive Style
Examination of Interactive Decision Making. Journal of Accounting Research,
15(1), 138–153. https://doi.org/10.2307/2490560
Wickens, C. D., & Hollands, J. G. (1999). Engineering Psychology and Human
Performance. (N. Roberts, Ed.). Upper Saddle River, New Jersey: Prentice Hall.
Retrieved from
http://webfiles.ita.chalmers.se/~mys/HumanAspects/WickensHollands/0_Wicken
s_Index_Preface.pdf
Wood, D. D., & Roth, E. M. (1988). Cognitive Engineering: Human Problem Solving
with Tools. Human Factors Journal, 30(4), 415–430. Retrieved from
http://hfs.sagepub.com/content/30/4/415.short
Yerkes, R. M., & Dodson, J. D. (1908). The Relation of Strength of Stimulus to Rapidity
of Habit-Formation. Journal of Comparative Neurology and Psychology, 18(5),
459–482. Retrieved from
http://onlinelibrary.wiley.com/doi/10.1002/cne.920180503/full
Appendix A: Geospatial Intelligence System Architecture
A.1 GEOINT SYSTEM ARCHITECTURE (MANUAL)
Figure A-1. Perform Geospatial Intelligence Concept of Operations
Figure A-2. Perform Counter Improvised Explosive Device Operations
Figure A-3. Monitor Direct Imagery Feeds Operational Concept
Figure A-4. Exploit Full Motion Imagery
Figure A-5. Maintain Voice and Chat with AVO
Figure A-6. Direct AUV Sensor Employment
Figure A-7. Conduct Aerial Route Reconnaissance
Figure A-8. Pick IMINT Sensor to Satisfy GEOINT
Figure A-9. ID Roadways on Imagery
Figure A-10. Plot Coordinates on a Map Image
Figure A-11. ID Unconventional Act on Imagery
Figure A-12. Provide Chip, Image, or Video Clip
Figure A-13. ID Vehicle Types on Imagery
Figure A-14. ID Roadways on Imagery
Figure A-15. ID on Video and Object or Event
Figure A-16. Respond to Request for Imagery
Figure A-17. GEOINT System Hierarchy
Figure A-18. Primary Image Analyst Definition
Figure A-19. GEOINT Requirements
Figure A-20. Primary Image Analyst GUI Mock Up
Figure A-21. Perform any Audio/Visual Capture VACP Sequence Diagram
A.2 GEOINT SYSTEM ARCHITECTURE (CHANGES WITH AUTOASSIST)
Figure A-22. Monitor Direct Imagery Feeds with Auto Assist
Appendix B: Human System Integration Analysis
B.1 GEOINT HUMAN SYSTEM ARCHITECTURE
Figure B-1. Monitor Direct Imagery Feeds IMPRINT Model
Figure B-2. Maintain Voice and Chat with AVO IMPRINT Model
Figure B-3. Exploit Full Motion Video IMPRINT Model
Figure B-4. Conduct Aerial Route Reconnaissance IMPRINT Model
Figure B-5. Pick IMINT Sensors to Satisfy GEOINT
Figure B-6. ID Roadways on Imagery
Figure B-7. Plot Coordinates on a Map, Image or Geospatial
Figure B-8. ID Unconventional Act on Imagery
Figure B-9. Provide Chip, Image or Video clip
Figure B-10. ID Vehicle Types on Imagery
Figure B-11. ID on Video an Object or Event
Figure B-12. Respond to Request for Imagery
Figure B-13. Direct AUV Sensor Employment
B.2 GEOINT HUMAN-SYSTEM ANALYSIS (MANUAL)
Table B-1. GEOINT System Primary Image Analyst Workload Trace
Clock Function Name Task Name Workload
00:00:00.00 (Root) Operator One H to H2 Start 0.00
00:00:00.00 (Root) Operator One H to H2 Start 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 0.00
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 0.00
00:00:00.00 Direct AUV Sensor Employment v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 0.00
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 0.00
00:00:00.00 Direct AUV Sensor Employment v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 17.50
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 17.50
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
31.30
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 31.30
00:00:00.00 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence
the route
49.40
00:00:00.00 Direct AUV Sensor Employment v2 START 49.40
00:00:00.00 Direct AUV Sensor Employment v2 Break 49.40
00:00:00.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 49.40
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
49.40
00:00:00.00 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence 49.40
the route
00:00:00.00 Direct AUV Sensor Employment v2 Break 49.40
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
35.60
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 52.60
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 52.60
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 35.60
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 53.80
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 53.80
00:06:16.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 36.30
00:06:16.00 Maintain Voice and Chat with AVO v2 Break 36.30
00:06:16.00 Maintain Voice and Chat with AVO v2 Break 36.30
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 18.10
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 35.30
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 35.30
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 18.10
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 35.60
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 35.60
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 18.10
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 18.10
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 18.10
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 18.10
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 34.60
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 34.60
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence
the route
16.50
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 34.60
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 34.60
00:25:17.77 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 16.50
00:25:17.77 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 33.50
00:25:17.77 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 33.50
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 17.00
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 32.80
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 32.80
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 17.00
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 34.00
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 34.00
00:40:15.52 Direct AUV Sensor Employment v2 Break 34.00
00:40:15.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 53.50
00:40:15.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 53.50
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 36.50
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 53.50
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 53.50
00:45:00.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 34.00
00:45:00.52 Direct AUV Sensor Employment v2 END 34.00
00:45:00.52 Direct AUV Sensor Employment v2 END 34.00
00:45:00.52 Direct AUV Sensor Employment v2 END 34.00
00:45:00.52 Direct AUV Sensor Employment v3 START 34.00
00:45:00.52 Direct AUV Sensor Employment v3 START 34.00
00:45:00.52 Direct AUV Sensor Employment v3 START 34.00
00:45:00.52 Direct AUV Sensor Employment v3 Break 34.00
00:45:00.52 Direct AUV Sensor Employment v3 Break 34.00
00:46:38.84 Maintain Voice and Chat with AVO v2 Break 34.00
00:46:38.84 Maintain Voice and Chat with AVO v2 END 34.00
00:46:38.84 Maintain Voice and Chat with AVO v2 END 34.00
00:46:38.84 Maintain Voice and Chat with AVO v2 END 34.00
00:46:38.84 Maintain voice and chat with AVO v3 START 34.00
00:46:38.84 Maintain voice and chat with AVO v3 START 34.00
00:46:38.84 Maintain voice and chat with AVO v3 START 34.00
00:46:38.84 Maintain voice and chat with AVO v3 TaskProvide direction/guidance to UAV 51.50
00:46:38.84 Maintain voice and chat with AVO v3 TaskProvide direction/guidance to UAV 51.50
00:50:35.80 Maintain voice and chat with AVO v3 TaskProvide direction/guidance to UAV 34.00
00:50:35.80 Maintain voice and chat with AVO v3 Break 34.00
00:50:35.80 Maintain voice and chat with AVO v3 Break 34.00
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 17.00
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 34.70
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 34.70
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 17.00
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 34.20
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 34.20
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 17.00
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 34.00
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 34.00
00:59:40.50 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 17.00
00:59:40.50 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and
adjacent units
33.50
00:59:40.50 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and
adjacent units
33.50
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 16.50
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 31.40
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 31.40
01:00:50.15 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and
adjacent units
14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the
imagery sensors
27.10
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the
imagery sensors
27.10
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the
imagery sensors
14.90
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the
requirement
27.10
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the
requirement
27.10
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the
requirement
14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
30.70
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
30.70
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 15.80
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 31.60
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 31.60
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the
exploitation requirement(s)
15.80
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 31.60
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 31.60
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 15.80
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 32.30
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 32.30
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 15.80
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 32.30
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 32.30
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 15.80
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 32.30
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 32.30
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 15.80
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 32.30
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 32.30
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 16.50
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 33.50
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 33.50
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 17.00
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 34.00
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 34.00
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 17.00
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 34.70
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 34.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Determine the scale of the map sheet in use v1 30.30
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Determine the scale of the map sheet in use v1 30.30
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Determine the scale of the map sheet in use v1 17.70
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot given geographic coordinates 34.00
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot given geographic coordinates 34.00
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot given geographic coordinates 17.70
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Create lines of latitude and longitude by connecting the
grid tick marks
34.00
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Create lines of latitude and longitude by connecting the
grid tick marks
34.00
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Create lines of latitude and longitude by connecting the
grid tick marks
17.70
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot any given MGRS coordinates 34.20
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot any given MGRS coordinates 34.20
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Plot any given MGRS coordinates 17.70
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Verify the grid zone designator on the map 31.40
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Verify the grid zone designator on the map 31.40
01:28:02.46 Maintain voice and chat with AVO v3 Break 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:18.75 Direct AUV Sensor Employment v3 Break 31.40
01:28:18.75 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 50.90
01:28:18.75 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 50.90
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
Verify the grid zone designator on the map 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
END 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
END 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or
Geospatial v3
END 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the
exploitation requirement
58.00
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the
exploitation requirement
58.00
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 40.30
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 57.30
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 57.30
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the
exploitation requirement
36.50
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery /
map sheet
53.00
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery /
map sheet
53.00
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 36.00
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 53.30
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 53.30
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 36.00
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 53.20
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 53.20
01:33:41.53 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 33.70
01:33:41.53 Direct AUV Sensor Employment v3 END 33.70
01:33:41.53 Direct AUV Sensor Employment v3 END 33.70
01:33:41.53 Direct AUV Sensor Employment v3 END 33.70
01:33:41.53 (Root) Model END 33.70
01:33:41.53 (Root) Model END 33.70
01:33:41.53 (Root) Model END 33.70
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery / map sheet 17.20
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 33.70
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 33.70
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 17.20
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 33.30
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 33.30
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 16.10
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 32.20
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 32.20
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 16.10
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 31.00
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 31.00
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 16.10
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 31.00
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 31.00
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 16.10
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 31.00
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 31.00
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 27.50
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 27.50
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 14.90
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 31.20
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 31.20
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 14.90
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 31.20
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 31.20
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 14.90
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 31.40
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 31.40
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 14.90
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 28.60
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 28.60
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 29.60
01:43:13.83 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 29.60
01:46:21.94 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 28.70
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 28.70
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 14.90
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 31.90
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 31.90
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 17.00
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 33.90
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 33.90
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 16.90
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 33.90
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 33.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 33.20
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 33.20
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 16.90
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 33.20
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 33.20
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 16.90
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 33.40
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 33.40
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 16.50
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 33.70
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 33.70
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 17.20
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 30.90
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 30.90
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 17.20
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 17.20
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 17.20
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 17.20
01:54:43.62 Provide Chip, Image or Video Clip v3 START 17.20
01:54:43.62 Provide Chip, Image or Video Clip v3 START 17.20
01:54:43.62 Provide Chip, Image or Video Clip v3 START 17.20
01:54:43.62 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 31.90
01:54:43.62 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 31.90
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 14.70
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 14.70
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 14.70
01:57:53.06 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 15.30
01:57:53.06 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 15.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 15.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 15.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 15.30
01:58:44.69 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 Respond to request for imagery 16.20
01:58:44.69 Respond to Request for Imagery v2 Respond to request for imagery 16.20
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 16.20
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 31.10
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 31.10
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 16.20
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 31.10
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 31.10
02:02:38.71 Respond to Request for Imagery v2 Respond to request for imagery 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 29.60
02:02:38.71 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 29.60
02:05:46.40 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 (Root) Model END 14.90
02:05:46.40 (Root) Model END 14.90
02:05:46.40 (Root) Model END 14.90
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 0.00
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 (Root) Model END 0.00
02:14:02.45 (Root) Model END 0.00
02:14:02.45 (Root) Model END 0.00
Table B-2. GEOINT System Primary Image Analyst Workload Detail
Clock Auditory Cognitive Visual Psychomotor Overall Workload
00:00:00.00 4.20 13.60 17.90 13.70 49.40
00:00:27.29 4.20 12.90 17.90 17.60 52.60
00:02:43.23 4.20 13.60 17.90 18.10 53.80
00:06:16.00 0.00 9.90 12.90 13.50 36.30
00:09:23.86 0.00 9.90 12.40 13.00 35.30
00:14:59.24 0.00 9.20 12.90 13.50 35.60
00:16:50.10 0.00 4.60 7.00 6.50 18.10
00:22:52.86 0.00 9.20 12.40 13.00 34.60
00:23:39.43 4.60 11.10 12.40 6.50 34.60
00:25:17.77 0.00 9.20 11.30 13.00 33.50
00:26:41.74 0.00 9.20 11.30 12.30 32.80
00:40:09.94 0.00 9.20 11.80 13.00 34.00
00:40:15.52 4.20 14.50 17.20 17.60 53.50
00:43:57.86 4.20 14.50 17.20 17.60 53.50
00:45:00.52 0.00 9.20 11.80 13.00 34.00
00:46:38.84 4.20 12.90 16.80 17.60 51.50
00:50:35.80 0.00 9.20 11.80 13.00 34.00
00:55:04.18 0.00 9.90 11.80 13.00 34.70
00:55:41.80 0.00 9.90 11.30 13.00 34.20
00:58:38.00 0.00 9.20 11.80 13.00 34.00
00:59:40.50 0.00 9.20 11.30 13.00 33.50
01:00:17.99 0.00 8.30 10.80 12.30 31.40
01:00:50.15 0.00 8.30 10.40 8.40 27.10
01:06:57.28 0.00 8.30 10.40 8.40 27.10
01:07:23.30 0.00 9.00 11.30 10.40 30.70
01:08:57.93 0.00 9.90 11.30 10.40 31.60
01:10:23.76 0.00 9.20 10.80 11.60 31.60
01:11:07.65 0.00 9.90 10.80 11.60 32.30
01:11:45.02 0.00 9.90 10.80 11.60 32.30
01:13:17.41 0.00 9.90 10.80 11.60 32.30
01:17:51.53 0.00 9.90 10.80 11.60 32.30
01:18:53.08 0.00 9.90 11.30 12.30 33.50
01:23:45.59 0.00 9.90 11.80 12.30 34.00
01:24:03.03 0.00 10.60 11.80 12.30 34.70
01:26:20.54 0.00 9.90 11.30 9.10 30.30
01:26:23.50 0.00 9.90 11.80 12.30 34.00
01:26:41.81 0.00 9.90 11.80 12.30 34.00
01:27:35.67 0.00 10.60 11.30 12.30 34.20
01:27:55.82 0.00 9.00 11.30 11.10 31.40
01:28:02.46 0.00 9.00 11.30 11.10 31.40
01:28:18.75 4.20 14.30 16.70 15.70 50.90
01:28:30.18 4.20 17.40 18.30 18.10 58.00
01:30:04.46 4.20 16.70 18.30 18.10 57.30
01:30:47.55 4.20 14.50 16.70 17.60 53.00
01:31:05.19 4.20 15.30 16.20 17.60 53.30
01:31:41.86 4.20 15.20 16.20 17.60 53.20
01:33:41.53 0.00 9.90 10.80 13.00 33.70
01:34:12.39 0.00 9.90 10.80 13.00 33.70
01:35:19.21 0.00 9.00 11.30 13.00 33.30
01:35:26.94 0.00 7.40 11.80 13.00 32.20
01:36:23.66 0.00 7.40 11.30 12.30 31.00
01:38:55.63 0.00 7.40 11.30 12.30 31.00
01:40:26.79 0.00 7.40 11.30 12.30 31.00
01:40:53.56 0.00 8.30 10.80 8.40 27.50
01:40:56.46 0.00 8.30 11.30 11.60 31.20
01:41:11.56 0.00 8.30 11.30 11.60 31.20
01:42:13.35 0.00 9.00 10.80 11.60 31.40
01:42:34.00 0.00 7.40 10.80 10.40 28.60
01:43:13.83 1.00 7.40 10.80 10.40 29.60
01:46:21.94 0.00 9.00 11.30 8.40 28.70
01:47:27.47 0.00 9.00 11.30 11.60 31.90
01:49:01.58 0.00 9.80 11.80 12.30 33.90
01:49:04.60 0.00 9.80 11.80 12.30 33.90
01:51:28.32 0.00 4.50 5.90 6.50 16.90
01:51:31.88 0.00 9.10 11.80 12.30 33.20
01:51:59.71 0.00 9.10 11.80 12.30 33.20
01:53:39.68 0.00 9.80 11.30 12.30 33.40
01:53:53.38 0.00 10.60 10.80 12.30 33.70
01:54:07.97 0.00 9.00 10.80 11.10 30.90
01:54:43.62 1.00 9.00 10.80 11.10 31.90
01:56:08.03 1.00 3.70 5.40 4.60 14.70
01:57:53.06 2.00 3.70 5.00 4.60 15.30
01:58:10.64 2.00 3.70 5.00 4.60 15.30
01:58:44.69 2.00 3.70 5.90 4.60 16.20
01:59:29.90 2.00 7.40 11.30 10.40 31.10
02:02:02.66 2.00 7.40 11.30 10.40 31.10
02:02:38.71 1.00 7.40 10.80 10.40 29.60
02:05:46.40 0.00 3.70 5.40 5.80 14.90
02:10:37.07 0.00 0.00 0.00 0.00 0.00
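The channel values in Table B-2 combine additively: at each clock point the Overall Workload column is the sum of the auditory, cognitive, visual, and psychomotor values (for example, at 00:00:00.00, 4.20 + 13.60 + 17.90 + 13.70 = 49.40). The following is a minimal Python sketch of that bookkeeping; the record layout and field names are illustrative assumptions, not the native output format of the workload simulation tool.

```python
# Illustrative sketch (assumed record layout, not the simulation tool's native format):
# combine the per-channel workload values into the overall workload reported in Table B-2.
from dataclasses import dataclass

@dataclass
class WorkloadSample:
    clock: str          # simulation clock, e.g. "00:00:00.00"
    auditory: float
    cognitive: float
    visual: float
    psychomotor: float

    @property
    def overall(self) -> float:
        # Overall workload is the sum of the four channel values.
        return round(self.auditory + self.cognitive + self.visual + self.psychomotor, 2)

# First row of Table B-2 as a worked example.
sample = WorkloadSample("00:00:00.00", 4.20, 13.60, 17.90, 13.70)
print(sample.clock, sample.overall)  # 00:00:00.00 49.4
```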
B.3 GEOINT HUMAN-SYSTEM ANALYSIS (CHANGES WITH AUTOASSIST)
Table B-3. GEOINT System Primary Image Analyst Workload Trace with Auto Assist
Clock Function Name Task Name Workload
00:00:00.00 (Root) Operator One H to H2 Start 0.00
00:00:00.00 (Root) Operator One H to H2 Start 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 0.00
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 0.00
00:00:00.00 Direct AUV Sensor Employment v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 0.00
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 0.00
00:00:00.00 Direct AUV Sensor Employment v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 START 0.00
00:00:00.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 17.50
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 START 17.50
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the exploitation requirement(s) 31.30
00:00:00.00 Conduct Aerial Route Reconnaissance v2 START 31.30
00:00:00.00 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence the route 41.60
00:00:00.00 Direct AUV Sensor Employment v2 START 41.60
00:00:00.00 Direct AUV Sensor Employment v2 Break 41.60
00:00:00.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 41.60
00:00:00.00 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the exploitation requirement(s) 41.60
00:00:00.00 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence the route 41.60
00:00:00.00 Direct AUV Sensor Employment v2 Break 41.60
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Determine the requirement by examining the exploitation requirement(s) 27.80
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 40.10
00:00:27.29 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 40.10
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Obtain any supporting data or references 27.80
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 43.60
00:02:43.23 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 43.60
00:06:16.00 Maintain Voice and Chat with AVO v2 Provide direction/guidance to UAV 26.10
00:06:16.00 Maintain Voice and Chat with AVO v2 Break 26.10
00:06:16.00 Maintain Voice and Chat with AVO v2 Break 26.10
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Review Historical Reports 10.30
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 27.50
00:09:23.86 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 27.50
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Conduct Analysis and Manipulation of data 10.30
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 27.80
00:14:59.24 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 27.80
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Review Target Folders 10.30
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 10.30
00:16:50.10 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 10.30
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Obtain imagery and geospatial data 10.30
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 26.80
00:22:52.86 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 26.80
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Terrain from which the enemy can influence the route 16.50
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 34.60
00:23:39.43 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 34.60
00:25:17.77 Conduct Aerial Route Reconnaissance v2 Monitor Control Measures 16.50
00:25:17.77 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 26.80
00:25:17.77 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 26.80
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Streaming video downlinked from aerial vehicle 10.30
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 26.10
00:26:41.74 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 26.10
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV 10.30
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 20.60
00:40:09.94 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 20.60
00:40:15.52 Direct AUV Sensor Employment v2 Break 20.60
00:40:15.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 40.10
00:40:15.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 40.10
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID Potential ambush, IED locations 29.80
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 40.10
00:43:57.86 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 40.10
00:45:00.52 Direct AUV Sensor Employment v2 Provide direction/guidance to UAV sensor operator 20.60
00:45:00.52 Direct AUV Sensor Employment v2 END 20.60
00:45:00.52 Direct AUV Sensor Employment v2 END 20.60
00:45:00.52 Direct AUV Sensor Employment v2 END 20.60
00:45:00.52 Direct AUV Sensor Employment v3 START 20.60
00:45:00.52 Direct AUV Sensor Employment v3 START 20.60
00:45:00.52 Direct AUV Sensor Employment v3 START 20.60
00:45:00.52 Direct AUV Sensor Employment v3 Break 20.60
00:45:00.52 Direct AUV Sensor Employment v3 Break 20.60
00:46:38.84 Maintain Voice and Chat with AVO v2 Break 20.60
00:46:38.84 Maintain Voice and Chat with AVO v2 END 20.60
00:46:38.84 Maintain Voice and Chat with AVO v2 END 20.60
00:46:38.84 Maintain Voice and Chat with AVO v2 END 20.60
00:46:38.84 Maintain voice and chat with AVO v3 START 20.60
00:46:38.84 Maintain voice and chat with AVO v3 START 20.60
00:46:38.84 Maintain voice and chat with AVO v3 START 20.60
00:46:38.84 Maintain voice and chat with AVO v3 Provide direction/guidance to UAV 38.10
00:46:38.84 Maintain voice and chat with AVO v3 Provide direction/guidance to UAV 38.10
00:50:35.80 Maintain voice and chat with AVO v3 Provide direction/guidance to UAV 20.60
00:50:35.80 Maintain voice and chat with AVO v3 Break 20.60
00:50:35.80 Maintain voice and chat with AVO v3 Break 20.60
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds 10.30
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 28.00
00:55:04.18 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 28.00
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame 10.30
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 20.60
00:55:41.80 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 20.60
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification 10.30
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 27.30
00:58:38.00 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 27.30
00:59:40.50 Conduct Aerial Route Reconnaissance v2 ID other restrictive passages or obstacles 17.00
00:59:40.50 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and adjacent units 33.50
00:59:40.50 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and adjacent units 33.50
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning 16.50
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 31.40
01:00:17.99 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 31.40
01:00:50.15 Conduct Aerial Route Reconnaissance v2 Maintain comms with supporting / supported and adjacent units 14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 Conduct Aerial Route Reconnaissance v2 END 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 START 14.90
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the imagery sensors 27.10
01:00:50.15 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the imagery sensors 27.10
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Determine the capabilities and limitations of the imagery sensors 14.90
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the requirement 27.10
01:06:57.28 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the requirement 27.10
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 Select imagery sensor/platform that will satisfy the requirement 14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1301 Pick IMINT Sensors to Satisfy GEOINT v2 END 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 START 14.90
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the exploitation requirement(s) 30.70
01:07:23.30 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the exploitation requirement(s) 30.70
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions 15.80
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 31.60
01:08:57.93 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 31.60
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Determine the requirement by examining the exploitation requirement(s) 15.80
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 26.10
01:10:23.76 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 26.10
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Locate the roadway on the imagery 15.80
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 26.10
01:11:07.65 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 26.10
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify the status of the roadway 15.80
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 26.10
01:11:45.02 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 26.10
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any bridges by type 15.80
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 26.10
01:13:17.41 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 26.10
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any underpasses 15.80
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 26.10
01:17:51.53 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 26.10
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Signatures developed through the analysis of FMV v2 10.30
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 27.30
01:18:53.08 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 27.30
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any tunnels 17.00
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 27.30
01:23:45.59 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 27.30
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data at various speeds v2 10.30
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 28.00
01:24:03.03 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 28.00
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 Identify any areas where the roadway is constricted 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1204 Identify Roadways on Imagery v2 END 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 START 17.70
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Determine the scale of the map sheet in use v1 30.30
01:26:20.54 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Determine the scale of the map sheet in use v1 30.30
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Determine the scale of the map sheet in use v1 17.70
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot given geographic coordinates 34.00
01:26:23.50 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot given geographic coordinates 34.00
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot given geographic coordinates 17.70
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Create lines of latitude and longitude by connecting the grid tick marks 34.00
01:26:41.81 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Create lines of latitude and longitude by connecting the grid tick marks 34.00
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Create lines of latitude and longitude by connecting the grid tick marks 17.70
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot any given MGRS coordinates 34.20
01:27:35.67 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot any given MGRS coordinates 34.20
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Plot any given MGRS coordinates 17.70
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Verify the grid zone designator on the map 31.40
01:27:55.82 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Verify the grid zone designator on the map 31.40
01:28:02.46 Maintain voice and chat with AVO v3 Break 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 Maintain voice and chat with AVO v3 END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:02.46 (Root) Model END 31.40
01:28:18.75 Direct AUV Sensor Employment v3 Break 31.40
01:28:18.75 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 50.90
01:28:18.75 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 50.90
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 Verify the grid zone designator on the map 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 END 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 END 37.20
01:28:30.18 96D-1050 Plot Coordinates on a Map, Image or Geospatial v3 END 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 START 37.20
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the exploitation requirement 58.00
01:28:30.18 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the exploitation requirement 58.00
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Conduct analysis of data frame by frame v2 40.30
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 57.30
01:30:04.46 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 57.30
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Determine the requirement by examining the exploitation requirement 36.50
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery / map sheet 46.80
01:30:47.55 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery / map sheet 46.80
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports v2 29.80
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 40.10
01:31:05.19 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 40.10
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v2 29.80
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 47.00
01:31:41.86 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 47.00
01:33:41.53 Direct AUV Sensor Employment v3 Provide direction/guidance to UAV sensor operator 27.50
01:33:41.53 Direct AUV Sensor Employment v3 END 27.50
01:33:41.53 Direct AUV Sensor Employment v3 END 27.50
01:33:41.53 Direct AUV Sensor Employment v3 END 27.50
01:33:41.53 (Root) Model END 27.50
01:33:41.53 (Root) Model END 27.50
01:33:41.53 (Root) Model END 27.50
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Locate the unconventional activity on the imagery / map sheet 17.20
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 27.50
01:34:12.39 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 27.50
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 Determine if the imagery quality is sufficient to ID the activity 17.20
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 27.50
01:35:19.21 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 27.50
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v2 10.30
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 26.40
01:35:26.94 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 26.40
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v2 10.30
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 25.20
01:36:23.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 25.20
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any mensuration functions v2 10.30
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 25.20
01:38:55.63 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 25.20
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions (zoom, rotate, overlay frames) 10.30
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 25.20
01:40:26.79 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 25.20
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 ID the type of unconventional activity 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1237 ID Unconventional Act on Imagery v2 END 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 START 14.90
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 27.50
01:40:53.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 27.50
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Determine the scene of the map sheet in use v1 14.90
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 31.20
01:40:56.46 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 31.20
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot given geographic coordinates 14.90
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 31.20
01:41:11.56 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 31.20
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Create lines of latitude and longitude by connecting the grid tick marks 14.90
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 31.40
01:42:13.35 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 31.40
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Plot any given MGRS coordinates 14.90
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 28.60
01:42:34.00 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 28.60
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 Verify the grid zone designator on the map 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 96D-1050 Plot Coordinates on a Map, Image or Geospatial v4 END 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 START 14.90
01:43:13.83 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 29.60
01:43:13.83 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 29.60
01:46:21.94 Provide chip, image or video clip v2 Provide chip, image, or video clip for additional analysis 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 Provide chip, image or video clip v2 END 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 START 14.90
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 28.70
01:46:21.94 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 28.70
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Determine the requirement by examining the exploitation requirements 14.90
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 25.20
01:47:27.47 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 25.20
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection 10.30
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 27.20
01:49:01.58 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 27.20
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 Locate vehicles types on imagery 16.90
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 27.20
01:49:04.60 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 27.20
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 ID deception attempts to the vehicles 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1215 ID Vehicle Types on Imagery v2 END 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 START 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:28.32 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Determine the scale of the map sheet in use v1 16.90
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 33.20
01:51:31.88 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 33.20
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot given geographic coordinates 16.90
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 33.20
01:51:59.71 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 33.20
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Create lines of latitude and longitude by connecting the grid tick marks 16.90
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 33.40
01:53:39.68 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 33.40
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or SALUTE reports 16.50
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 26.80
01:53:53.38 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 26.80
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Plot any given MGRS coordinates 10.30
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 24.00
01:54:07.97 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 24.00
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 Verify the grid zone designator on the map 10.30
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 10.30
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 10.30
01:54:43.62 96D-1050 Plot Coordinates on a Map, Image or Geospatial v5 END 10.30
01:54:43.62 Provide Chip, Image or Video Clip v3 START 10.30
01:54:43.62 Provide Chip, Image or Video Clip v3 START 10.30
01:54:43.62 Provide Chip, Image or Video Clip v3 START 10.30
01:54:43.62 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 25.00
01:54:43.62 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 25.00
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any object recognition and identification v3 14.70
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 14.70
01:56:08.03 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 14.70
01:57:53.06 Provide Chip, Image or Video Clip v3 Provide chip, image, or video clip for additional analysis 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 Provide Chip, Image or Video Clip v3 END 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 START 0.00
01:57:53.06 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 12.30
01:57:53.06 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 12.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any audio/video capture v3 12.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 12.30
01:58:10.64 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 12.30
01:58:44.69 ID on video an object or event v2 ID Object, area or activity of interest on an image or video 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 ID on video an object or event v2 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 START 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 96D-1050 Plot Coordinates on a Map, Image or Geospatial v6 END 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 START 0.00
01:58:44.69 Respond to Request for Imagery v2 Respond to request for imagery 16.20
01:58:44.69 Respond to Request for Imagery v2 Respond to request for imagery 16.20
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any geographic positioning v3 16.20
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 31.10
01:59:29.90 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 31.10
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any manipulation functions v3 16.20
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 31.10
02:02:02.66 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 31.10
02:02:38.71 Respond to Request for Imagery v2 Respond to request for imagery 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Respond to Request for Imagery v2 END 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 START 14.90
02:02:38.71 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 29.60
02:02:38.71 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 29.60
02:05:46.40 Provide Chip, Image or Video Clip v4 Provide chip, image, or video clip for additional analysis 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 Provide Chip, Image or Video Clip v4 END 14.90
02:05:46.40 (Root) Model END 14.90
02:05:46.40 (Root) Model END 14.90
02:05:46.40 (Root) Model END 14.90
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Perform any change detection v3 0.00
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:10:37.07 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 Prepare any SPOT or Salute reports v3 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 96D-1232 Exploit Full Motion Video Imagery v2 END 0.00
02:14:02.45 (Root) Model END 0.00
02:14:02.45 (Root) Model END 0.00
02:14:02.45 (Root) Model END 0.00
Table B-4. GEOINT System Primary Image Analyst Workload Detail with Auto Assist
Clock Auditory Cognitive Visual Psychomotor Overall Workload
00:00:00.00 4.20 12.70 14.90 9.80 41.60
00:00:27.29 4.20 11.10 13.00 11.80 40.10
00:02:43.23 4.20 12.70 14.90 11.80 43.60
00:06:16.00 0.00 9.00 9.90 7.20 26.10
00:09:23.86 0.00 9.00 9.40 9.10 27.50
00:14:59.24 0.00 8.30 9.90 9.60 27.80
00:16:50.10 0.00 3.70 4.00 2.60 10.30
00:22:52.86 0.00 8.30 9.40 9.10 26.80
00:23:39.43 4.60 11.10 12.40 6.50 34.60
00:25:17.77 0.00 8.30 9.40 9.10 26.80
00:26:41.74 0.00 8.30 9.40 8.40 26.10
00:40:09.94 0.00 7.40 8.00 5.20 20.60
00:40:15.52 4.20 12.70 13.40 9.80 40.10
00:43:57.86 4.20 12.70 13.40 9.80 40.10
00:45:00.52 0.00 7.40 8.00 5.20 20.60
00:46:38.84 4.20 11.10 13.00 9.80 38.10
00:50:35.80 0.00 7.40 8.00 5.20 20.60
00:55:04.18 0.00 9.00 9.90 9.10 28.00
00:55:41.80 0.00 7.40 8.00 5.20 20.60
00:58:38.00 0.00 8.30 9.90 9.10 27.30
00:59:40.50 0.00 9.20 11.30 13.00 33.50
01:00:17.99 0.00 8.30 10.80 12.30 31.40
01:00:50.15 0.00 8.30 10.40 8.40 27.10
01:06:57.28 0.00 8.30 10.40 8.40 27.10
01:07:23.30 0.00 9.00 11.30 10.40 30.70
01:08:57.93 0.00 9.90 11.30 10.40 31.60
01:10:23.76 0.00 8.30 9.40 8.40 26.10
01:11:07.65 0.00 8.30 9.40 8.40 26.10
01:11:45.02 0.00 8.30 9.40 8.40 26.10
01:13:17.41 0.00 8.30 9.40 8.40 26.10
01:17:51.53 0.00 8.30 9.40 8.40 26.10
01:18:53.08 0.00 8.30 9.90 9.10 27.30
01:23:45.59 0.00 8.30 9.90 9.10 27.30
01:24:03.03 0.00 9.00 9.90 9.10 28.00
01:26:20.54 0.00 9.90 11.30 9.10 30.30
01:26:23.50 0.00 9.90 11.80 12.30 34.00
01:26:41.81 0.00 9.90 11.80 12.30 34.00
01:27:35.67 0.00 10.60 11.30 12.30 34.20
01:27:55.82 0.00 9.00 11.30 11.10 31.40
01:28:02.46 0.00 9.00 11.30 11.10 31.40
01:28:18.75 4.20 14.30 16.70 15.70 50.90
01:28:30.18 4.20 17.40 18.30 18.10 58.00
01:30:04.46 4.20 16.70 18.30 18.10 57.30
01:30:47.55 4.20 13.60 15.30 13.70 46.80
01:31:05.19 4.20 12.70 13.40 9.80 40.10
01:31:41.86 4.20 14.30 14.80 13.70 47.00
01:33:41.53 0.00 9.00 9.40 9.10 27.50
01:34:12.39 0.00 9.00 9.40 9.10 27.50
01:35:19.21 0.00 9.00 9.40 9.10 27.50
01:35:26.94 0.00 7.40 9.90 9.10 26.40
01:36:23.66 0.00 7.40 9.40 8.40 25.20
01:38:55.63 0.00 7.40 9.40 8.40 25.20
01:40:26.79 0.00 7.40 9.40 8.40 25.20
01:40:53.56 0.00 8.30 10.80 8.40 27.50
01:40:56.46 0.00 8.30 11.30 11.60 31.20
01:41:11.56 0.00 8.30 11.30 11.60 31.20
01:42:13.35 0.00 9.00 10.80 11.60 31.40
01:42:34.00 0.00 7.40 10.80 10.40 28.60
01:43:13.83 1.00 7.40 10.80 10.40 29.60
01:46:21.94 0.00 9.00 11.30 8.40 28.70
01:47:27.47 0.00 7.40 9.40 8.40 25.20
01:49:01.58 0.00 8.20 9.90 9.10 27.20
01:49:04.60 0.00 8.20 9.90 9.10 27.20
01:51:28.32 0.00 4.50 5.90 6.50 16.90
01:51:31.88 0.00 9.10 11.80 12.30 33.20
01:51:59.71 0.00 9.10 11.80 12.30 33.20
01:53:39.68 0.00 9.80 11.30 12.30 33.40
01:53:53.38 0.00 9.00 9.40 8.40 26.80
01:54:07.97 0.00 7.40 9.40 7.20 24.00
01:54:43.62 1.00 7.40 9.40 7.20 25.00
01:56:08.03 1.00 3.70 5.40 4.60 14.70
01:57:53.06 2.00 3.70 4.00 2.60 12.30
01:58:10.64 2.00 3.70 4.00 2.60 12.30
01:58:44.69 2.00 3.70 5.90 4.60 16.20
01:59:29.90 2.00 7.40 11.30 10.40 31.10
02:02:02.66 2.00 7.40 11.30 10.40 31.10
02:02:38.71 1.00 7.40 10.80 10.40 29.60
02:05:46.40 0.00 3.70 5.40 5.80 14.90
02:10:37.07 0.00 0.00 0.00 0.00 0.00
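Tables B-2 and B-4 report the same clock points for the primary image analyst with and without the auto-assist changes, so the effect of auto assist can be read as a per-timestamp difference of the Overall Workload columns. Below is a minimal comparison sketch, assuming the two tables have been exported to CSV files; the file names (geoint_baseline.csv, geoint_autoassist.csv) and the header row matching the table columns are hypothetical.

```python
# Minimal comparison sketch (assumed CSV exports of Tables B-2 and B-4; file names are hypothetical).
import csv

def load_overall(path):
    """Map each clock value to its overall workload score."""
    with open(path, newline="") as f:
        return {row["Clock"]: float(row["Overall Workload"]) for row in csv.DictReader(f)}

baseline = load_overall("geoint_baseline.csv")      # Table B-2 export (assumed)
autoassist = load_overall("geoint_autoassist.csv")  # Table B-4 export (assumed)

# Report the workload change at every clock point present in both traces.
for clock in sorted(set(baseline) & set(autoassist)):
    delta = autoassist[clock] - baseline[clock]
    print(f"{clock}: {baseline[clock]:.2f} -> {autoassist[clock]:.2f} ({delta:+.2f})")
```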
Conceptually similar
Integration of digital twin and generative models in model-based systems upgrade methodology
Modeling and simulation testbed for unmanned systems
COSYSMO 3.0: an extended, unified cost estimating model for systems engineering
Context-adaptive expandable-compact POMDPs for engineering complex systems
A declarative design approach to modeling traditional and non-traditional space systems
Impacts of system of system management strategies on system of system capability engineering effort
Optimal guidance trajectories for proximity maneuvering and close approach with a tumbling resident space object under high fidelity J₂ and quadratic drag perturbation model
Advanced nuclear technologies for deep space exploration
Reduction of large set data transmission using algorithmically corrected model-based techniques for bandwidth efficiency
A medical imaging informatics based human performance analytics system
Building straggler-resilient and private machine learning systems in the cloud
Magnetic induction-based wireless body area network and its application toward human motion tracking
Techniques for analysis and design of temporary capture and resonant motion in astrodynamics
In-situ quality assessment of scan data for as-built models using building-specific geometric features
BRIM: A performance-based Bayesian model to identify use-error risk levels in medical devices
Systems pharmacology for acute myeloid leukemia and feedback control pharmacodynamic models
Feature and model based biomedical system characterization of cancer
Numerical and experimental investigations of ionic electrospray thruster plume
Studies into computational intelligence approaches for the identification of complex nonlinear systems
From matching to querying: A unified framework for ontology integration
Asset Metadata
Creator
Orellana, Douglas (author)
Core Title
Extending systems architecting for human considerations through model-based systems engineering
School
Viterbi School of Engineering
Degree
Doctor of Philosophy
Degree Program
Astronautical Engineering
Publication Date
01/19/2018
Defense Date
12/08/2017
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
analytical models, descriptive models, human performance analysis, human performance modeling, human system integration, model based engineering, model-based systems engineering, OAI-PMH Harvest, ontology, SysML, system analysis, system architecting, system architecture, system modeling, systems engineering
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Madni, Azad (committee chair), Erwin, Daniel (committee member), Moore, James (committee member)
Creator Email
dworella@usc.edu, dworellana@gmail.com
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-c40-464733
Unique identifier
UC11267378
Identifier
etd-OrellanaDo-5965.pdf (filename), usctheses-c40-464733 (legacy record id)
Legacy Identifier
etd-OrellanaDo-5965.pdf
Dmrecord
464733
Document Type
Dissertation
Rights
Orellana, Douglas
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions
The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the a...
Repository Name
University of Southern California Digital Library
Repository Location
USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA