1. 5B: Leader/Follower Control of a
Robotic Transport Skid
Final Progress Report
Team: Timothy Banks, Benjamin Campbell, Simone Osuji,
David Redington, Michael Squillace
Faculty Advisor: Dr. John Lacontora
Faculty Co-Advisors: Dr. Tom Chmielewski, Dr. Harry Kwatny
3. NAVAIR needs - Priority 1
● NAVY 1 - Demonstrate/simulate skid replicating movement of MHU-191
weapon skid
● NAVY 2 - Skid follows a leader autonomously
● NAVY 3 - Skid capable of operating beyond line of sight
● NAVY 4 - Skid can maneuver to a location
● NAVY 5 - Skid can avoid static obstacles
● NAVY 6 - Skid can avoid moving obstacles
● NAVY 7 - Skid can avoid obstacles between leader and skid
● NAVY 8 - Skid replicates current mechanized skid kinematics
● NAVY 9 - Operates indoors and outdoors
● NAVY 10 - Operates in metal passageways
● NAVY 11 - Does not interfere with payload
● NAVY 12 - Locate on a known map without GPS
● NAVY 13 - Operator can give commands to follower
● NAVY 14 - Utilize ROS or similar coding language
● NAVY 15 - Obstacle and Target recognition
● NAVY 16 - Transition around corners
● NAVY 17 - Maneuver to a commanded goal
4. Progress
37/70 Needs met (52.9%)
21/33 Priority 1 needs met (63.6%)
12/22 Priority 2 needs met (54.5%)
4/15 Priority 3 needs met (26.7%)
5. Primary Goals
● Replicates Current Mechanized Skid Kinematics
● Track and Follow a Designated Leader
● Obstacle Detection and Avoidance
● Detect and Control Loss of Line of Sight
18. Ultrasonic Sensor Triggering
● Sequential triggering of the sensors is required to avoid
interference between them
● Minimum measurement time is 100 ms (ping + analog
conversion)
● With a three-sensor array, the current measurement
frequency is 3.3 Hz per sensor
● Other triggering techniques may be possible
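The timing above can be sketched as a simple round-robin schedule. This is an illustrative Python model (not the team's firmware): each sensor fires only after every other sensor has completed its 100 ms measurement, so the per-sensor rate drops with the array size.

```python
# Illustrative sketch of sequential ultrasonic triggering, assuming the
# 100 ms minimum measurement time stated above. Function names are ours.

MEASUREMENT_TIME_S = 0.100  # ping + analog conversion, per the slide

def per_sensor_rate_hz(num_sensors: int) -> float:
    """Each sensor re-triggers only after the whole bank has fired,
    so its effective rate is 1 / (num_sensors * measurement_time)."""
    return 1.0 / (num_sensors * MEASUREMENT_TIME_S)

def trigger_sequence(num_sensors: int, cycles: int):
    """Yield (time_s, sensor_index) pairs for a round-robin schedule."""
    t = 0.0
    for _ in range(cycles):
        for i in range(num_sensors):
            yield (round(t, 3), i)
            t += MEASUREMENT_TIME_S

print(round(per_sensor_rate_hz(3), 1))  # 3.3 Hz, matching the figure above
```

A denser schedule (e.g. overlapping pings on opposite-facing sensors) would raise this rate, which is what the "other triggering techniques" bullet alludes to.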
19. Why MATLAB/Simulink?
● Familiarity with the language
● Readily available tutorials from advisor
○ Programming Raspberry Pi with Simulink
○ Object detection using RGB Color
● Software/Hardware Integration Features
○ acquire video from imaging/video hardware (camera)
■ Computer Vision System Toolbox (Object
Detection)
○ Raspberry Pi Support
○ Library of blocks for Pi input/output
● Model deployment for standalone operation
20. Image Tracking Design
● Raspberry Pi used for all image tracking
● Visual marker used is a high intensity green object
● Two threshold criteria
○ Minimum Intensity of green
○ Minimum Number of green pixels
● Compares the intensity threshold and the number of green
pixels in the field of view; both criteria must be met
● If a match is in the F.O.V., the centroid of the green object is
calculated and its x position is used for servo control
● Data is converted to a 4-bit value and passed to the
steering control program on the Arduino
● A second output is provided which indicates whether a valid
target is present in the F.O.V.
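The two-threshold rule above can be sketched as follows. This is a hedged Python illustration, not the team's Simulink model: the frame format, threshold values, and function names are assumptions.

```python
# Illustrative sketch of the two-threshold green-tracking rule: a pixel
# counts as "green" when its green channel clears the intensity threshold,
# and a target is valid only when enough such pixels are present.

MIN_GREEN_INTENSITY = 180   # minimum intensity of green (assumed value)
MIN_GREEN_PIXELS = 50       # minimum number of green pixels (assumed value)
FRAME_WIDTH = 320           # assumed camera resolution

def track_green(frame):
    """frame: list of (x, y, r, g, b) pixels.
    Returns (valid_target, 4-bit x position)."""
    greens = [(x, y) for (x, y, r, g, b) in frame
              if g >= MIN_GREEN_INTENSITY]
    if len(greens) < MIN_GREEN_PIXELS:
        return (False, 0)       # second output: no valid target in F.O.V.
    centroid_x = sum(x for x, _ in greens) / len(greens)
    # Quantize the centroid's x position into a 4-bit value (0-15)
    # for the steering control program on the Arduino.
    pos4 = min(15, int(centroid_x * 16 / FRAME_WIDTH))
    return (True, pos4)
```

Requiring both criteria is what makes the detector robust: a few stray bright-green pixels fail the count test, and a large dull-green region fails the intensity test.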
23. Image Tracking:
Future Improvement Options
● Identify certain shapes using edge detection
● Expand the current color detection to determine whether the
green object is a certain shape (so the robot is not
confused when multiple green objects are detected)
25. Main Control Overview
● Combines all decision logic for the skid into one functional
Simulink block.
● Makes decisions on when to enter the two primary
modes of operation
○ Follow the leader
○ Loss of line of sight
■ 1 bit of data confirms sight or loss of sight
● Accepts video tracking inputs from the Raspberry Pi and
range data from the ultrasonic sensors.
○ Makes logic decisions based on the information
● Two outputs
○ Throttle
○ Steering
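The decision logic above reduces to a small function. This Python sketch is illustrative only (the actual logic lives in a Simulink block); the throttle values 98 and 90 are taken from the bench-testing slide, and the structure otherwise is assumed.

```python
# Illustrative sketch of the main control block: the 1-bit valid-target
# flag selects the mode, and the two outputs are throttle and steering.

FOLLOW_LEADER, LOSS_OF_LOS = "follow_leader", "loss_of_los"

def select_mode(target_in_fov: bool) -> str:
    """One bit of data from the vision system picks the operating mode."""
    return FOLLOW_LEADER if target_in_fov else LOSS_OF_LOS

def main_control(target_in_fov, position_4bit, sonar_distances_ft):
    """Combine vision and ultrasonic inputs into (throttle, steering)."""
    if select_mode(target_in_fov) == FOLLOW_LEADER:
        # Stop (neutral, 90) when anything is within 3 ft, else walk (98).
        throttle = 90 if min(sonar_distances_ft) < 3.0 else 98
        steering = position_4bit    # handed to the steering converter
    else:
        throttle = 98               # keep moving while searching (assumed)
        steering = None             # wall-following logic takes over
    return throttle, steering
```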
26. Mode: Follow Leader
● Uses web camera for navigation.
● Tracks and follows a leader wearing a green vest
○ Parameters
■ Number of green pixels
■ Intensity of green object
● Uses 4 bits of data from the Raspberry Pi representing the
position of the target.
● Position data is converted to angular displacement and
used to operate the steering servo.
● Travels at walking speed and stops within 2 feet of the
leader or any obstacle in its path.
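The 4-bit position-to-angle conversion might look like the sketch below. The servo center and sweep values here are assumptions, not the team's calibrated numbers.

```python
# Illustrative conversion of the 4-bit target position into an angular
# steering command. SERVO_CENTER_DEG and SERVO_RANGE_DEG are assumed.

SERVO_CENTER_DEG = 90.0    # assumed neutral steering angle
SERVO_RANGE_DEG = 60.0     # assumed full left-to-right sweep

def position_to_servo(pos4: int) -> float:
    """Map a 4-bit position (0 = far left, 15 = far right) to a servo
    angle; 0 and 15 command the sharpest turns, 7-8 hold near straight."""
    offset = (pos4 - 7.5) / 7.5                   # normalize to [-1, 1]
    return SERVO_CENTER_DEG + offset * (SERVO_RANGE_DEG / 2)
```

With these assumed constants, a target at the left edge (0) yields 60°, centered (7-8) yields roughly 90°, and the right edge (15) yields 120°.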
27. Bench Testing
● Video Tracking Controls
○ Ensure tracking of green tennis ball
○ Confirm loss of sight when green object removed from F.O.V.
○ Camera panning tests ensure random green pixels are not tracked,
eliminating false positives in the program.
○ Adjust the intensity and pixel-count thresholds and compare results
● Steering Controls
○ Confirmed 4 bit Data output from tracking program is properly
converted into a steering servo value.
○ Manipulated green object in F.O.V and verified the output command to
the steering servo.
● Throttle Controls
○ Distance tests with sensor bank and throttle servo commands
■ Outside of 3’, the throttle command is a constant value (98)
■ Inside of 3’, the throttle command is set to neutral (90)
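The throttle rule exercised in these distance tests is a one-line threshold. The servo values 98 (forward) and 90 (neutral) come from the slide; the function itself is an illustrative sketch.

```python
# Illustrative sketch of the bench-tested throttle rule: constant
# walking-speed command beyond 3 feet, neutral inside it.

THROTTLE_WALK = 98       # constant forward command (from the slide)
THROTTLE_NEUTRAL = 90    # neutral/stop command (from the slide)
STOP_DISTANCE_FT = 3.0

def throttle_command(min_range_ft: float) -> int:
    """Return the throttle servo value for the closest ultrasonic range."""
    if min_range_ft < STOP_DISTANCE_FT:
        return THROTTLE_NEUTRAL
    return THROTTLE_WALK
```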
28. Mode: Loss of Line of Sight
● Determines last known direction of leader (left of skid or
right of skid)
○ Logic low representing leader to the left
○ Logic high representing leader to the right
● Uses ultrasonic sensors for navigation
● Employs a linear steering algorithm to determine how
sharply to turn the wheels.
○ The input variable is the skid’s distance from the wall.
○ The slope is a scaling factor that determines how
sharply to turn the wheels.
● Maintains a 3 foot buffer from walls.
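The linear steering algorithm can be sketched as steering effort proportional to the deviation from the 3-foot wall buffer. The buffer comes from the slide; the slope value, clamp limits, and sign convention below are assumptions.

```python
# Illustrative sketch of the linear wall-following steering rule:
# steering offset = slope * (distance from wall - buffer), steered
# away from whichever side the wall is on.

WALL_BUFFER_FT = 3.0    # buffer the skid maintains from walls (slide)
SLOPE = 10.0            # assumed scaling factor (degrees of turn per foot)

def wall_following_steer(wall_distance_ft: float, wall_on_left: bool) -> float:
    """Return a signed steering offset in degrees.

    Sign convention (assumed): positive steers right, negative left.
    Closer than the buffer -> steer away from the wall;
    farther than the buffer -> steer back toward it.
    """
    error = wall_distance_ft - WALL_BUFFER_FT    # +: too far, -: too close
    offset = max(-30.0, min(30.0, SLOPE * error))  # clamp to servo limits
    return -offset if wall_on_left else offset
```

A larger slope makes the skid hug the buffer line more aggressively; the clamp keeps the command within the steering servo's range.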
29. Results Video
● Shows skid in Follow Mode
● Shows skid in Loss of Line of Sight Mode
○ Navigating a corridor and avoiding collision with
walls.
○ Shows the skid searching for, finding, and re-acquiring its target.
30. Potential Improvements
1. Loss of line of sight
a. Corner case coverage
2. Obstacle Detection
a. Testing of additional ultrasonic sensors
b. Identify a distance measurement sensor with less sensor to sensor
interference
3. Target Tracking
a. Non-color based tracking method
4. Programming environment
a. Simulink allows simple tasks to be implemented quickly but makes
more sophisticated tasks much more cumbersome
b. Timer and interrupt controls are challenging in Simulink