University of North Florida

Team Photo Spring 2015: Not all members are present in this photo.

Website: www.ospreydivers.org

Team Members:
Marshall Curry (Team Lead), Eric Rutherford, Kerry Anyanwu, Brandyn Sloan, Samantha Busch, Wenjing Liu, Brian Zamore, Marcus Blessing, Brian White, Stephanie Olson, Vedran Pehlivanovic, Joel Avalos, John Hellmann, Stefano Di Bella, Demetrius Brown, Rob Grunwald, Juan Mata, Jakob Howe, Nicholas Emord, Timothy Vu, Christopher Searcy, Ryan O’Toole, James Rutherford, Michael Otero, Stewart Fletcher, Michael Burkhardt, Will Gualtieri, Ashlee Lithalangsy, Maria-Flora Jacobs, Charlie Santana, John Gans, Andrew Turgeon, Jeremy Cantor, and Jonathan Frias.

Team Advisors:
Dr. Daniel Cox – Professor of Mechanical Engineering
Dr. Brian Kopp – Assistant Professor of Electrical Engineering

Abstract

The University of North Florida’s RoboSub team, the Osprey Divers, would like to present AUV-SPREY, its first vehicle entry in the AUVSI Foundation and ONR’s 18th Annual RoboSub Competition, “Back to TRANSDEC,” held in San Diego in July of 2015. The Osprey Divers team was founded in November of 2014 by members of UNF’s TeleRobotics Club. The goal for the Osprey Divers was to design and build an Autonomous Underwater Vehicle for the competition. AUVSI’s competition allows the club to take a unique systems engineering approach to solving a real-world problem, while applying and improving the expertise of the dedicated UNF students. The Osprey Divers is an interdisciplinary team of UNF students, bringing together students majoring in Electrical and Mechanical Engineering, Computer Science, Business, and any other major at the University of North Florida.

One of the main emphases for the competition and the Osprey Divers has been outreach: working with the school and other organizations to get students involved in STEM and to inform them of how important STEM fields are to the community.

1. Introduction

Approximately six months of effort has been dedicated to the Osprey Divers’ RoboSub this year. As a first-year team, simple goals were set. The goals for AUV-SPREY are to maneuver through the gate and participate in the buoy, time portal, and refuel bin (dropping markers) tasks. Once these tasks are mastered, the sub will be integrated for more capabilities. This year will set the momentum for the years to come. The team is broken down into subsystems, which include frame, propulsion, camera, sensors, power, controls, and business.

The Osprey Divers had two main design ideas. Plan A was to implement an underwater quadcopter propulsion system and use the Pixhawk flight controller, interfaced with an Arduino Mega, to maneuver through the pool. Once research and tests were done on the first prototype, the results showed that the time until competition did not allow for proper control of this system. This design will remain as research and development for years to come. Therefore, Plan B was executed, using the Arduino Mega as the central control board, interfacing with the Raspberry Pi 2, the camera, and the sensors, receiving inputs and distributing the necessary outputs to the thrusters.

2. Frame

Below, in Figure 1, is the CAD model of AUV-SPREY, completed using SolidWorks.

Figure 1. AUV-SPREY SolidWorks CAD model.

2.1 Main Hull

The final frame design consists of one main hull made of a schedule 40 PVC T-connector fit for 6” diameter pipe, and an extension of 6” PVC to give more space for components and electronics inside the sub. Two levels of racks allow for easy access to electronics. The three openings to the hull are each sealed with an acrylic window bolted to aluminum sheet metal and made watertight with custom-fit 6” diameter O-rings. The forward- and bottom-facing openings of the T-connector PVC allow for camera compartments. The camera is held in place with 3D-printed ABS plastic parts, the acrylic windows giving the camera an interface to the water outside of the sub. The rear opening, also sealed with an acrylic window, has various waterproof connectors attached, allowing for a convenient interface to components on the exterior of the AUV.

2.2 Space Frame

The main hull is surrounded by and mounted to a frame made of 80/20 aluminum. This allows for mounting external components to the sub and easy integration of future additions. There are six thrusters used in this design. Four thrusters mount to the four corners of the AUV to allow for elevation and roll control, and two thrusters, one on each side of the AUV, allow for horizontal movement and yaw.

The AUV is slightly positively buoyant to alleviate stress on the thrusters, reducing power consumption and acting as a safety feature. Overall, the calculated net buoyant force of the T-connector hull is 23.4 lbs. Therefore, slightly less than 23.4 lbs. of space frame, electronics, and external components allows for stability. Fine tuning of the buoyancy was done to make the AUV as stable as possible when stationary.

3. Propulsion

The propulsion system is made up of four thrusters controlling vertical movement and roll, and two thrusters controlling yaw and horizontal displacement. The thrusters were made from T-Motor 380 KV brushless motors, as shown in Figure 2. Shrouds and propellers were attached to the motors as shown in Figure 1. Electronic Speed Controllers were interfaced with the Arduino Mega to accept PWM commands to control the motors.

Figure 2. T-Motor 380 KV brushless motors.

4. Camera

4.1 Description of Camera, Software, Microcontrollers, and Protocol

The camera subsystem is made up of a Raspberry Pi 2 computer, an Arduino Mega,
embedded software, and the Creative Senz3D camera. The Raspberry Pi 2 is powered by a 900 MHz quad-core ARM Cortex-A7 CPU with 1 GB of volatile memory. The computer also includes 4 USB ports, 40 GPIO (General Purpose Input/Output) pins, a full HDMI port, an Ethernet port, a combined 3.5 mm audio jack and composite video output, a camera interface (CSI), a display interface (DSI), a Micro SD card slot, and a VideoCore IV 3D graphics core. The Raspberry Pi 2 performs the computational work required for image processing; it also establishes serial communication via the USB protocol with the camera and handles data transfer from the Pi 2 to the Arduino Mega. The computer also provides power to the camera via the universal serial bus. The Arduino Mega is the sole master controller for the entire AUV system platform. It receives the requested data from the image processing component, makes a decision based on the data received, and then sends that data to whatever subsystem requires that information. The embedded software used by the camera subsystem is as follows: Raspbian, the OpenCV framework, and the language APIs (Application Programming Interfaces). The Raspbian OS is a Linux distribution that runs on the Raspberry Pi 2. The image processing, color detection, and machine vision framework was implemented using the OpenCV application programming interface. This library provides a rich set of easy-to-use modular functions. The implementation of the framework was via the Python language on the Raspberry Pi 2 and C/C++ on the Arduino Mega. This subsystem utilized the Creative Senz3D camera. This is a relatively inexpensive camera, but it is packed with powerful features such as a 1.0 megapixel HD (1280 x 720) sensor, automatic lens focusing, a built-in microphone with noise cancellation, and a maximum frame rate of up to 30 fps (frames per second). The camera captures the frame and sends that information via the USB protocol to the image processing component for image analysis and processing.
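The Pi-to-Mega link described above can be sketched as a simple framed serial exchange. The message format, field names, and device path below are illustrative assumptions, not the team’s actual protocol:

```python
# Hypothetical sketch of the Pi 2 -> Arduino Mega serial link described above.
# The "D,<id>,<x>,<y>,<area>" frame layout is an assumption for illustration.

def encode_detection(obj_id: int, x: int, y: int, area: int) -> bytes:
    """Pack a detection into a newline-terminated ASCII frame that an
    Arduino sketch could parse with Serial.readStringUntil('\\n')."""
    return f"D,{obj_id},{x},{y},{area}\n".encode("ascii")

def decode_detection(frame: bytes) -> tuple:
    """Inverse of encode_detection, e.g. for loopback testing."""
    tag, *fields = frame.decode("ascii").strip().split(",")
    if tag != "D":
        raise ValueError("unexpected frame tag")
    return tuple(int(v) for v in fields)

# On the Pi this would be written out over USB serial, e.g. (assumed
# device path and baud rate; requires the third-party pyserial package):
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   port.write(encode_detection(1, 640, 360, 5000))
```

A plain ASCII frame like this keeps the Arduino-side parsing trivial at the cost of a few extra bytes per message.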
The camera is also used to search for objects.

4.2 Technical Decision Process

The initial prototype for the camera subsystem was to use a Pixy camera for visual sensing, interfaced with a PIC microcontroller to receive and interpret the data. The Pixy implemented a blob detection system for image processing and did not provide a rich library of functions to work with. The camera was limited to seven color signatures and did not provide a way for the developer to implement a visual logging system for troubleshooting. The PIC microcontroller was a 16-bit controller made by Microchip. The microcontroller packed an ample amount of features such as I2C, SPI, interrupts, and more. Executing these features proved difficult; with limited online help, coupled with the time constraint, the camera team decided to abandon these technologies and to use simpler technologies with an ample amount of tutorials and references. The technologies chosen were the Raspberry Pi 2 computer, the Arduino Mega, and the Creative Senz3D camera. The Raspberry Pi 2 was chosen because of its compact form factor and features such as its 900 MHz quad-core ARM Cortex-A7 CPU. The computer is also relatively inexpensive,
meaning it can be easily replaced if damaged. The Arduino Mega was chosen as the master controller because of its ease of use, rich API library, and plentiful references. The microcontroller is also relatively inexpensive. The Creative Senz3D camera was chosen because of its HD and autofocus features. All of the technology used by the camera subsystem was chosen based on budget, ease of use, and time constraints.

4.3 How the System Works

A Creative Senz3D depth and gesture recognition camera is mounted to the front of the AUV. The camera offers 720p HD video quality, depth recognition, and 3D sensing. OpenCV, the Raspberry Pi, the Arduino, and the Python programming language were used for computer vision. Software was written in Python to utilize OpenCV to visually detect objects and identify color and shape. The Raspberry Pi transfers the data to the Arduino using a serial communication protocol for further analysis and actions. Figure 3 shows the camera system diagram.

Figure 3. Camera system flow.

4.4 Issues Encountered

Technical problems were a barricade at almost every step of development. The Osprey Divers’ camera team initially chose the Microchip brand for the microcontroller board. However, upon further research and testing, the functions used in the “C” language to work with the PIC microcontroller board were too complex to understand in the limited amount of time afforded to the team. Getting the PICkit 3 device (used for compiling and programming the target device, the PIC microcontroller) was an elusive task as well. To further complicate matters, public reference material was scarcely available for troubleshooting problems with the PIC microcontroller code. Later, the camera team came to realize that the Pixy image tracking camera was inadequate. The Pixy camera’s shortcomings included the following: limited ability in capturing shapes, the capability to recognize only seven colors, the inability to properly redefine its API to suit the team’s needs, and the inability to record.
Recording is critical since it is necessary for troubleshooting purposes. Getting the distance from the updated camera to an object was not a straightforward task: neither the camera nor the code had the ability to provide the focal length necessary to determine the distance to an object, so other methods had to be created to get a sense of how far away an object was. It also seemed desirable to use the I2C communication protocol between the Arduino board and its slave devices. However, using I2C communication proved to be an overly exhaustive task considering the amount of initial setup required for the software on the devices.
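One workaround for a missing focal length, sketched here under stated assumptions (the function names and calibration numbers are hypothetical, not the team’s actual method), is to calibrate once against an object of known size at a known distance and then invert the pinhole relation distance ≈ known_width × f / pixel_width:

```python
# Hypothetical sketch of a calibration-based distance estimate, one of the
# "other methods" a team could use when the focal length is unavailable.
# All numbers below are made-up calibration values for illustration.

def calibrate(known_distance: float, known_width: float, pixel_width: float) -> float:
    """One-time calibration: image an object of known real width at a known
    distance and record its apparent width in pixels. Returns an effective
    focal length in pixels, f = pixel_width * known_distance / known_width,
    recovered empirically rather than from a datasheet."""
    return pixel_width * known_distance / known_width

def estimate_distance(f: float, known_width: float, pixel_width: float) -> float:
    """Invert the pinhole relation: distance = known_width * f / pixel_width."""
    return known_width * f / pixel_width

# Example: a 9-inch-wide buoy appears 120 px wide when placed 36 inches away.
f = calibrate(known_distance=36.0, known_width=9.0, pixel_width=120.0)
# Later, the same buoy appearing 60 px wide is about twice as far away.
d = estimate_distance(f, known_width=9.0, pixel_width=60.0)  # ~72 inches
```

The estimate degrades when the object is partially occluded or viewed off-axis, which is consistent with it being a rough workaround rather than true depth sensing.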

4.5 Synopsis of the Algorithm

OpenCV is an open source computer vision library that is available on a variety of operating systems, including Raspbian, a free operating system for the Raspberry Pi. OpenCV is implemented in C++ but can easily be interfaced with high-level programming languages such as Python, which was the language of choice for this project. From experience, Python is capable of expressing concepts in fewer lines while maintaining the readability of the code.

OpenCV provides a vast library of computer vision and machine learning algorithms, built into functions that can be conveniently called from Python. The challenge is to identify which of the available algorithms and tools will be useful for the various parts of the competition.

4.5.1 Color Detection

All the challenges in the competition involve dealing with objects (gates, buoys, etc.) that are conveniently color coded. OpenCV’s inRange() function takes an image and the color boundaries as parameters. In computing, the color of a single pixel is represented by 3 bytes, with each byte representing the intensity of red, blue, and green respectively. The function returns an image that excludes the pixels that do not fit within the specified boundary. Figure 4 shows the original image alongside an image returned from inRange().

Figure 4. Original image alongside an image returned from inRange().

4.5.2 Contour Detection

Geometrically, a contour is a continuous curve along a boundary that has the same color or intensity. The findContours() function takes an image as a parameter and, as the name implies, returns a list of contours within the image. Each contour within the list is a collection of points or edges that represent the contour. This is used in combination with the inRange() color detection function to identify the shape of a colored object. Figure 5 shows the approximating bounding rectangles around detected objects.

Figure 5. Approximating bounding rectangles around detected objects.

4.5.3 Hough Circle Detection

The HoughCircles() function takes an image along with a minimum and maximum radius given in pixels. It returns the
coordinates where the centers of circular objects are detected within the provided image. Used in combination with color detection, this will be used to detect colored buoys. Figure 6 shows an example of a circle detected within an image.

Figure 6. Circle detected within an image.

5. Sensors

AUV-SPREY uses a system of sensors to control its orientation and detect the surrounding environment. A microcontroller communicates with the rest of the sub to manage the sensors. An Arduino Mega microcontroller was ideal because its difficulty of use is low, which was appreciated given the limited time the team had. The communication between the sensors and the Mega is over I2C.

5.1 Magnetometer

A digital compass is used to sense the direction of the sub so it can be redirected to the next task. For the digital compass, the HMC5883L chip is used. The HMC5883L is a magnetometer that can measure the strength of magnetic fields. By sensing the magnetic field of the Earth, the magnetometer is able to find the direction of North and set North as a reference point.

5.2 Gyroscope and Accelerometer

A gyroscope and accelerometer are used to maintain the balance of the sub. The gyroscope and accelerometer are included in the MPU-6050 chip. The gyroscope senses the change of the angles about the axes of the sub. The accelerometer measures the forces from acceleration on the sub.

5.3 Pressure Transducer

For the vehicle to know what depth it is at, a pressure transducer is used. The Model K2 pressure transducer produces a voltage output that varies with depth. To prevent the sub from going too deep or too shallow, a threshold was made by finding the voltages at the limits and programming the Arduino to stay between those voltages. The voltage output from the pressure sensor is fed into the Arduino Mega’s 10-bit ADC. The pressure range recognized by the transducer is 0-100 psi, making the output voltage range 0-4.5 volts.
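The voltage-to-depth mapping described here can be sketched as follows. The 0-100 psi span, 0-4.5 V output, and 10-bit ADC come from the text; the linear scaling, the 5 V ADC reference, and the freshwater pressure constant are illustrative assumptions:

```python
# Illustrative sketch of the depth lookup described above. The 0-100 psi /
# 0-4.5 V / 10-bit ADC figures come from the text; the linear scaling, the
# 5 V reference, and ~0.433 psi per foot of fresh water are assumptions.

V_REF = 5.0           # Arduino Mega ADC reference voltage (assumed)
ADC_MAX = 1023        # 10-bit ADC full scale
V_SPAN = 4.5          # transducer output span in volts (from the text)
PSI_SPAN = 100.0      # transducer pressure span in psi (from the text)
PSI_PER_FOOT = 0.433  # hydrostatic pressure of fresh water (assumed)

def adc_to_depth_ft(adc_count: int) -> float:
    """Convert a raw 10-bit ADC reading to approximate depth in feet."""
    volts = adc_count * V_REF / ADC_MAX
    psi = min(volts, V_SPAN) / V_SPAN * PSI_SPAN
    return psi / PSI_PER_FOOT

def within_thresholds(adc_count: int, v_min: float, v_max: float) -> bool:
    """Threshold check analogous to the one programmed on the Arduino."""
    volts = adc_count * V_REF / ADC_MAX
    return v_min <= volts <= v_max
```

In practice the same arithmetic would run on the Mega in integer form, comparing raw ADC counts directly against precomputed threshold counts.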
This voltage range was divided among the 10 bits of the converter, and intervals for each depth were made. The controller reads the digital signal from the ADC and determines the depth of the sub.

6. Controls

The main control board used is the Arduino Mega 2560. The board communicates with the Raspberry Pi 2 via serial communication over USB. The Arduino also communicates with the external sensors using I2C. The Arduino Mega receives signals from the peripherals, namely the camera and the other sensors, processes the data, and distributes
commands via PWM to the ESCs that control the six thrusters. An emergency stop switch is mounted on the outside of the sub to provide an emergency disconnect of all batteries from the circuits.

7. Conclusion

As a first-year team, the goal for the Osprey Divers is to qualify and to complete at least three of the tasks. After testing the first design, the team opted for the backup plan because of its simplicity. With the experience gained this first year, the team has momentum to produce a more sophisticated AUV next year. The goal is to keep integrating systems to make the sub better in every aspect. Most importantly, the experience that this competition gave, and will give in July, to the individual students on the Osprey Divers’ team is priceless and one that is not always found in the classroom environment. As years progress, and more experience is gained, the Osprey Divers will be a force to be reckoned with.

8. Acknowledgements

This team would not have been possible without the support of the Osprey Divers’ sponsors and the individuals who gave their time and dedication to the progression of the AUV. Sponsors include the College of Computing, Engineering, and Construction Management (CCEC) at the University of North Florida, Taylor Engineering Research Institute, Microchip, and UPS. Special thanks also goes out to Dean Mark Tumeo, Dean of CCEC; Dr. Murat Tiryakioglu, director of the School of Engineering at UNF; Dr. Don Resio of Taylor Engineering Research Institute; Dr. Daniel Cox, Mechanical Engineering advisor; and Dr. Brian Kopp, Electrical Engineering advisor. Last but not least, thank you to the anonymous donors to the GoFundMe account, the friends and family members who have donated, and the dedicated members of the Osprey Divers who donated, or as Dean Tumeo put it, “provided skin in the game.”