© Professor Alan Hedge, Cornell University, August 2002




                         Systems Thinking
                               DEA 325/651
                           Professor Alan Hedge

                      Systems Ergonomics
• Work Environment
      1. Physical demands (e.g. lifting limits)
      2. Skill demands (e.g. typing at 100 wpm)
      3. Risk demands (e.g. driving on ice)
      4. Time demands (e.g. deadlines)

• Physical Environment
      1. physical agents (e.g. noise, vibration)
      2. chemical agents (e.g. air pollutants)
      3. biological agents (e.g. allergens, airborne diseases)


                      Systems Ergonomics
• Technology
      1. Product design (anthropometrics, biomechanics)
      2. Hardware Interface design
      3. Software interface design

• Psychosocial Environment
      1. Social (e.g. teams)
      2. Cultural (e.g. motivators)
      3. Lifestyle (e.g. quality of work life)


                Ergonomic Considerations
• Physical factors
   • ambient conditions; objects (tools, furniture, etc.)
• Biological factors
   • body dimensions, body capabilities, physiological processes
• Psychological factors
   • mental workload, information processing, training, motivation
• Work factors
   • job demands (time, rate, etc.), job design
• Organizational factors
   • organization type/climate, management regimes







                       Objectives of Ergonomics
       Enhance the effectiveness and efficiency of work:
         • ease of use, reliability, productivity,
         • reduce errors
       Enhance certain desirable human values:
         • enhance safety, satisfaction, comfort, quality of work life
         • reduce fatigue, stress, accidents
                                 Systems Thinking
" For every complex problem there is always a simple solution.

And it is wrong. "
H. L. Mencken


                                 Three Mile Island
                     PWRs near Harrisburg, PA (~230 miles away)
                    Pressurized Water Reactor (PWR)
• In PWRs, the reactor core is in a massive steel bottle (Pressure Vessel). Three water coolant loops move
   heat from the reactor core to the turbines which drive the generators.
• Reactor Coolant loop passes water among the fuel assemblies in the core at several hundred °F, under
   >2,000 psi pressure to prevent the water from boiling into steam.
• Hot water from the pressure vessel enters the steam generator (a heat exchanger).
• Secondary water passes around the primary feed and, at far lower pressure, flashes quickly
   into steam.
• Primary coolant constantly recirculated by RC pumps. Each pump at TMI was 2 stories high, 9,000 hp,
   and circulated 360,000 gallons per minute!



                                  PWR Pressurizer
• The pressurizer connects to the primary coolant loop. It is normally about half full of water, and half full of
   steam.
• A tube at the bottom connects to the reactor vessel, and at the top are relief valves which open to protect
   the system when too much pressure builds up.
• Relief valve (RV) designed to always open (but didn't always close).
• Block valve (BV) manually operated so that it could be closed to prevent coolant loss.
• For safety, all components of the primary cooling loop are located in a huge concrete structure
   (containment dome) with walls 12' thick, and reinforced with high-strength steel. It is designed to
   withstand the direct impact of a jet airliner, and can withstand very high inside pressures.


                                  TMI-2: a 'normal' night?
• March 27, 1979 - Three Mile Island generating 97% of rated 1,000 megawatt
   capacity.
• 11pm night shift - supervisor + 2 trained operators in control room, 14 auxiliary
   operators scattered around the plant.





• Almost 4am - 2 men were cleaning a feedwater polisher (large tank, 1 of 8,
  filled with resin beads to remove contaminants from feedwater). Old beads
  formed a 'clot' (usually dislodged by bursts of compressed air).
• TMI-2 had 2 compressed air systems, 1 of which controlled all pneumatically
  operated valves and controls (instrument air).
• BUT sometime during the night, someone mistakenly connected a rubber hose
  between this instrument air system and a water line (area was dark, fittings
  almost identical and poorly labeled).


                       TMI-2 : the incident starts!
• 3:57am - air+water mixture reached valve control piping - feedwater valves
  slammed shut. Water hammer occurred. Control room floor shuddered. Violent
  shock tore out valve controls, ruptured feedwater pump casing and shattered
  feedwater pipes. Within seconds, entire auxiliary building awash with scalding
  water and filled with water vapor.
• Plant's automatic control systems tripped the steam turbines and opened
  bypass valves to dump steam from the generator to the condenser.
• BUT the condenser had a design flaw - a sudden burst of steam blew water into
  the condenser pump, which tripped off as planned, and steam was dumped into
  the atmosphere with a deafening roar!
                   TMI-2 : now you see it, don't you?
• With no cooling feedwater, reactor core temperature and pressure increased.
  Reactor tripped and control rods were inserted.
• To help cool the core, 3 emergency feedwater pumps (2 electric + 1 steam)
  started automatically.
• BUT during routine maintenance 1 week previously the water supply block
  valves had been closed and not re-opened.
• Pumps were running at full speed but delivering no feedwater to the core. Loss
  of feedwater was indicated by a warning light in the control room.
• BUT a yellow maintenance tag attached to a nearby switch covered
  the light.
• The 1st item on the emergency checklist - "Verify emergency feed" was
  overlooked, and operators erroneously assumed water was flowing.
             TMI-2 : what you don't know could kill you?
• As heat and pressure increased a relief valve opened to vent steam to the
  quench tank, and failed to close.
• BUT the operators didn't realize that the absence of a bright indicator light in
  the control room only signaled that the valve had been commanded to close, not
  that it had closed!
• Primary loop temperature and pressure continued to soar. Eventually the
  maintenance tag was noticed and removed, revealing a red warning light. The
  operators opened the emergency feedwater valves, and the temperature rise
  slowed - temporarily.





• As cold water entered the superhot primary system, steam bubbles were
    formed. The main coolant RC pumps had to be shut down to avoid them
    vibrating to pieces and spilling primary coolant. Normally, convection currents
    would have continued cooling the core.


                            TMI-2 : to err is human?
• BUT steam bubbles in the pipes prevented normal convection. There were no
    indicators for this condition.
• A large steam bubble formed over the top of the core and core meltdown
    began.
•   Eventually the supervisor suspected that a relief valve must be stuck open. He
    asked an operator for a temperature reading at the valve outlet.
•   BUT the operator read the wrong display and reported that the temperature
    was normal.
•   Eventually, steam flowing at 1,000 pounds per minute ruptured a safety disk
    and overflowed into the containment building, flooding it with radioactive
    water. All radiation alarms went off simultaneously.
•   At 6:00 am the day shift arrived.
                            TMI-2 : in the light of day.
• The day shift engineer realized that the relief valve must be stuck open, and
  closed it, but reactor core pressure began to rise again. Operators looked to
  the computer printouts for answers.
• BUT thousands of alarms had been registered and the printer was hours
  behind reality. Also, the software designers had never envisaged temperatures
  anywhere above 700°F (in places they actually exceeded 10,000°F) and the
  software just printed question marks!
• The fuel rods were clad in zirconium, which reacts with steam to generate
  hydrogen. A hydrogen bubble began to form in the core.
• A relay spark ignited the first hydrogen bubble, which exploded with the force
  of a 1,000 lb bomb. Fortunately, the containment building held.


                    TMI-2 : you can trust the politicians.
• 15 hours after the start of the incident, engineers managed to restart one of
    the reactor core pumps, and finally the core began to properly cool.
• March 30, 1979, 3 days after the incident, the governor recommended the
  evacuation of all pregnant women and preschool-aged children. Schools were
  closed in the area and the governor ordered people to stay indoors.
• April 1 (4 days later) - President Carter visited TMI and declared the reactor
  safe and the incident 'over'. In truth, at this time a second hydrogen bubble
  had formed and also oxygen was being released (at 5% this would be highly
  explosive). Oxygen was at 5% for over 24 hours!
• Eventually, through the use of recombiner units, steam, hydrogen and oxygen
  were removed from the core and the bubble collapsed.






                             TMI-2 : the aftermath
• All the 36,000 nuclear fuel rods in the core were damaged or destroyed. Core
  temperature had reached 4,300°F (uranium melts at 5,000 °F).
• Approximately 700,000 gallons of radioactive cooling water were spilled onto
  the floor of the reactor building. To control this excessive quantity of water,
  400,000 gallons of radioactive water were released into the Susquehanna
  River.
• Thousands of Curies of radioactive noble gases were released into the
  atmosphere. The last major venting was in 1981.
• Some radioactive materials even managed to pass through the walls of the
  plant.
                       TMI-2 : the lessons, the costs?
• The TMI incident is characterized by poor human factors design of the
    information displays (from lights to software) and controls, and by frequent
    human error.
•   The decision to evacuate was controversial. Many experts insisted that there
    was no threat to public health and that an evacuation order would cause
    public panic. It didn't!
•   The cleanup of the accident lasted until 1993. On Dec. 28, 1993, the plant
    was placed in monitored storage.
•   TMI will now be in 'safe storage' by 2005!
•   TMI will be decommissioned when the other reactor on Three Mile Island (TMI-
    1) is taken out of service. TMI unit 1 will lose its operational license on May 18,
    2008.

                        Forensic Human Factors
• Accident analysis
    • Historical data
    • Observational data
    • Accident reconstruction
• Critical Incident Analysis
    • Near misses
    • Deliberate near failure
• Error Analysis
    • Overload
    • Environment demands
    • Cognitive failures
• Human Reliability Analysis
                          Systems Thinking
" The world we have made as a result of the level of thinking we
  have done thus far creates problems that we cannot solve at the
  same level (of consciousness) at which we have created them....
  We shall require a substantially new manner of thinking if
  humankind is to survive. "
  Albert Einstein

                       Systems Thinking
  Systems thinking involves 'seeing' inter-connections and
  relationships, the whole picture as well as the component parts.

  Systems thinking provides key insights for the management of
  complexity.

           Ergonomics and Systems Thinking
  The concept of a system is fundamental to ergonomics
  “A system is an entity that exists to carry out some purpose” (Bailey, 1982)
  “A system is an organized or complex whole: an assemblage or combination of
  things or parts forming a complex, unitary whole.” (von Bertalanffy, ~1935)
  Human-Machine system - “A system is an interacting combination, at any level
  of complexity, of people, materials, tools, machines, software, facilities and
  procedures designed to work together for some common purpose.” (Chapanis,
  1996, p.22)
                           System Goals
• Mission-oriented systems
     Needs of personnel are subordinated to the goals of the system
    (e.g. mission to the moon).

• Service-oriented systems
     Needs of the clients/users are part of the goals of the system
    (e.g. hotel).
                       Types of Systems
• Manual systems
  • physical aids that are coupled to the operator (e.g. hand tools)
• Mechanical systems (semiautomatic)
  • Operator uses controls to directly determine machine function
    (e.g. car)
• Automated systems
  • Operator uses supervisory control to monitor system
    performance (e.g. robotic assembly line)
                  System Characteristics
• Systems are Purposive
  • Every system has a purpose (objective, goal).
  • E.g. University goals include:
    •   Education
    •   Socialization
    •   Skills
    •   Research
    •   Administration
    •   Recruitment
    •   Financial
    •   Employment
                  System Characteristics
• Systems can be Hierarchical
  • Subsystems are parts of larger systems.
  • In studying any system, defining where the analysis starts and ends
    requires decisions on:
    • System boundaries - what is and isn’t part of the system (not necessarily
      any right or wrong answers). Decisions based on the identification of
      system functions
    • Limit of Resolution - how deep does the systems analysis need to be?
    • Components - the lowest level of analysis required for defining a system.
      A component in turn can be a subsystem.


                  System Characteristics
• Systems Operate in an Environment
  • The system environment is everything outside of the boundaries
   of the system.
     • Immediate environment - proximate environment around the system
       (e.g. ambient conditions such as lighting, noise, ventilation). Usually has
       a demonstrable impact on system performance.
    • Intermediate environment - more distant conditions and settings that are
      directly experienced (e.g. home, car, campus store)
    • Distant environment - more distant conditions that aren’t directly
      experienced (e.g. blackouts, solar storms)
                  System Characteristics
• System Components Serve Functions




  • Every component serves at least one function that is related to
    achieving the system goals. Allocating functions between people
    and machines is a key area in ergonomics
  • Components either:
     • sense, store, or process information
     • execute actions
        – physical control action - handle, move, modify or alter materials or objects
         – communication action - message transmission

                   System Characteristics
• Component Interactions
  • In complex systems components interact to achieve system
    goals
  • Components can be people or machines
                   System Characteristics
• Inputs and Outputs
  • Systems, subsystems and components have inputs and outputs
  • The impact of inputs and outputs can be used to categorize
    systems as being:
     • Open-loop
     • Closed-loop
     • Feed-forward
                         Types of Systems
• Open-loop (closed systems) {e.g. gun}
                          Types of Systems
• Closed-loop systems (control adjusted by feedback on the output)
                          Types of Systems
• Feed-forward systems {e.g. lunar lander, catching a ball}
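
  A rough, assumed illustration of the three loop types: the Python sketch
  below drives one toy 'hold a room at its setpoint' model under each scheme.
  The plant model, the numbers, and names such as simulate or open_loop are
  invented for this sketch, not taken from the lecture.

    # Illustrative only: one toy room-temperature plant, driven three ways.
    SETPOINT, START, STEPS, LOSS = 21.0, 15.0, 50, 0.4   # heat loss per time step

    def simulate(controller):
        temp = START
        for k in range(STEPS):
            heat_in = controller(k, temp)
            temp = temp + heat_in - LOSS          # the plant: a simple integrator
        return temp

    def open_loop(k, temp):
        # Open-loop: a schedule fixed in advance using a (wrong) loss estimate of
        # 0.3; the output is never measured, so the model error is never corrected.
        return 0.3 + (SETPOINT - START) / STEPS

    def closed_loop(k, temp):
        # Closed-loop: the command is driven by feedback on the measured output
        # (proportional control, so a small steady-state offset remains).
        return max(0.0, 0.5 * (SETPOINT - temp))

    def feed_forward(k, temp):
        # Feed-forward: act on a measurement/model of the disturbance (the heat
        # loss) before it appears as an output error; the output is never checked.
        measured_loss = LOSS
        return measured_loss + (SETPOINT - START) / STEPS

    for name, ctrl in [("open-loop", open_loop),
                       ("closed-loop", closed_loop),
                       ("feed-forward", feed_forward)]:
        print(f"{name:12s} final temperature: {simulate(ctrl):5.1f}  (target {SETPOINT})")

  With these made-up numbers the open-loop scheme misses the target because its
  model of the heat loss is wrong, the closed-loop scheme settles near (but not
  exactly at) the setpoint, and feed-forward lands on the target only because it
  was handed an exact disturbance model - which is why real controllers usually
  combine feed-forward with feedback.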

                   System Characteristics
• Components in Series
  • The more components in series, the less reliable the system. If a
    system has 100 components, each of which is 99% reliable, the system
    is only about 36.6% reliable (roughly a two-in-three chance of failure);
    see the reliability sketch after this list.
  • If one component fails the whole system fails (e.g. decorative lights)
• Components in Parallel
  • Systems with parallel components have built-in backup or redundancy.
  • Adding components in parallel increases system reliability.
  • Adding components in parallel is expensive.
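
  As a quick, assumed illustration of the series and parallel figures above,
  the short Python sketch below multiplies component reliabilities; the
  function names and the parallel example's numbers are invented for the
  sketch, not taken from the slides.

    from math import prod

    def series_reliability(reliabilities):
        # In series every component must work: R = r1 * r2 * ... * rn
        return prod(reliabilities)

    def parallel_reliability(reliabilities):
        # In parallel (redundant) the system fails only if every component fails:
        # R = 1 - (1 - r1) * (1 - r2) * ... * (1 - rn)
        return 1 - prod(1 - r for r in reliabilities)

    # The slide's figure: 100 components in series, each 99% reliable.
    print(f"100 x 0.99 in series:  {series_reliability([0.99] * 100):.3f}")   # ~0.366
    # An assumed redundancy example: three 90%-reliable components in parallel.
    print(f"3 x 0.90 in parallel:  {parallel_reliability([0.90] * 3):.3f}")   # 0.999

  The contrast shows why parallel redundancy is used in safety-critical systems
  despite its cost: three mediocre components in parallel can out-perform one
  excellent component on its own.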
                  System Characteristics
• Finding the Weakest Link
  • For most systems, system performance is only as good as that
    of the weakest link.
  • The Human Operator is usually the weakest link in a system
    because human performance is unreliable and unpredictable.
  • So why have people in systems?
                  Allocation of Functions
• Determine which system functions will be performed by people
  and which will be performed by machines.
• Utilize knowledge of the performance characteristics of people and
  machines:
  • People - slow, weak, error-prone, creative, flexible
  • Machines – fast, strong, accurate, dumb?, inflexible


              Person-Technology System

         Ergonomics: A Systems Design Process



