Oakland University Presents:

Faculty Advisor Statement: I certify that the engineering design present in this vehicle is significant and equivalent to work that would satisfy the requirements of a senior design or graduate project course.

Dr. Ka C. Cheok

Contents

1 Introduction
2 Project Management
  2.1 Design Strategy
  2.2 Team Organization
3 Innovations
4 Mechanical System
  4.1 Chassis Design
  4.2 Drive Train
5 Electrical, Computing and Sensing System
  5.1 Power Distribution
  5.2 Sensor Array
  5.3 Hardware Architecture
  5.4 Custom PCB for dsPIC Processors
  5.5 Manual Control and Wireless E-Stop
  5.6 Battery Life
6 Software Strategy
  6.1 Search-Based Path Planning
  6.2 Kalman Filter Based Sensor Fusion
  6.3 Autonomous Challenge
    6.3.1 Lane Detection
    6.3.2 Goal Point Selection
  6.4 Navigation Challenge
  6.5 Accuracy of Arrival at Waypoints
  6.6 JAUS Challenge
7 Predicted Performance
  7.1 Speed
  7.2 Ramp Climbing Ability
8 Cost Breakdown of Components
9 Conclusion
Acknowledgements

1 Introduction

This year, Oakland University is participating in IGVC with the UGV Beast. Beast is a redesign of X-Man from the 2008 competition. Beast is a lightweight, powerful and energy-efficient mobile platform, designed for ease of maintenance and maneuverability. The team has high expectations for this year's competition, with newly developed artificial intelligence and improved electronic hardware.

2 Project Management

2.1 Design Strategy

Using experience gained in past IGVC competitions, and addressing the issues with the previous design (shown in Figure 2), it was decided to create a lighter, simpler robot for the 2010 competition. The result is Beast, shown in Figure 1.

Figure 1: Beast 2010
Figure 2: X-Man 2008

To design Beast, the entire team met on a weekly basis to report progress, plan the next steps of the design, and specify tasks and their completion deadlines. Each subsystem had its own champion in charge of it. Each member prepared progress reports to keep the rest of the team updated on the status of their design at each meeting. The team leader's role was to run the weekly meetings, organize and manage the team's efforts, and guide the work of the sub-teams where possible.

2.2 Team Organization

An organization chart of the team outlining the contribution of each team member is shown in Figure 3. All of the team members volunteered their free time to work on Beast, and contributed a total of approximately 1500 man-hours.

Team Leader: Micho Radovnikovich
Mechanical Design: Kirk McGuire
Low-Level Software: Pavan Vempaty
High-Level Software: Micho Radovnikovich
Electronics: Steve Grzebyk
Frame Design: Scott Marginet
Drive Control: Naveen Chilukoti
Vision: Lincoln Lorenz, Matt Bruer
JAUS: Kevin Hallenbeck
Path Planning: Micho Radovnikovich

Figure 3: Team organization chart

3 Innovations

Search-Based Path Planning: Instead of approaching path planning from a purely control-systems point of view as in the past, cost function minimization and fuzzy logic techniques are used to greatly improve the artificial intelligence.

Rear-View Lidar: Using a Lidar at the back of the vehicle gives extra visibility of the robot's surroundings and provides the opportunity for better path planning, especially in the Navigation Challenge.

Custom PCB Electronics: The dsPIC microcontrollers running the drive control algorithms and interfacing to the wireless remote are mounted on a custom PCB that provides them with proper power, external oscillators and easy access to their pins.

4 Mechanical System

4.1 Chassis Design

Beast's chassis is based on Oakland's 2008 IGVC entry X-Man, but is completely redesigned. While keeping the original electric wheelchair motors and core structure, as well as some of the changes from X-Man, the rest of the frame was modified to make the vehicle much lighter and simpler. The new chassis design was created around the requirements set forth in the early design phase and to accommodate the desired placement of the hardware components. Figure 4 shows the base frame from which the new chassis is derived.

Figure 4: Beast's base frame

The old frame was much too heavy, with most of the body made out of steel and a lot of unnecessary material. The camera shaft and its base alone weighed almost 40 pounds in the effort to make it rigid. On Beast, the body is replaced by Plexiglas, and the interior structure is simplified dramatically, with a compartment for batteries, a vertically mounted panel for the electronics, and space for the laptop and other components. A fiberglass camera shaft, supported by tubular aluminum rods in a pyramid-type structure, carries the camera. The complete chassis is shown in Figure 5.

Figure 5: Beast's complete chassis

4.2 Drive Train

The drive train on Beast is the only component of the original wheelchair that remains unchanged. It consists of two 24 volt brushed DC motors with built-in 32:1 gearboxes. The team retrofitted the motors with optical encoders to provide feedback measurements to the drive control program.

5 Electrical, Computing and Sensing System

5.1 Power Distribution

Beast's power is sourced from four 12 volt, 19 Ah sealed lead-acid batteries. They are arranged as two sets of two batteries in parallel, which are then put in series with each other. This 24 volt source is fused and then fed directly to the motor controllers. The 24 volt source is also regulated down to 12 and 5 volts to power the rest of the components on the robot. A conventional turn-to-release e-stop button controls power to the robot's systems using a normally closed switch.

To recharge the batteries, two 12 volt chargers are connected to ports on the side of the chassis, each one independently charging one of the parallel sets of batteries. Using two normally open switches controlled by the e-stop button, the batteries are automatically isolated from the rest of the vehicle's circuitry and connected appropriately to the charging ports when the e-stop is engaged. The power distribution and battery charging circuit are illustrated in Figure 6.

Figure 6: Power distribution and battery charging circuit

5.2 Sensor Array

To accurately determine its own pose and perceive the obstacles in its surroundings, Beast is outfitted with the following sensors:

Hokuyo Lidar: A Hokuyo URG-04LX is used to detect objects to the rear of the vehicle.

SICK Lidar: A SICK LMS200 Lidar unit is used to detect objects in front of the vehicle.

Camera: An IDS µEye LE camera is used to detect lane lines and other objects of interest.

GPS Receiver: Beast uses a u-blox AEK-4P GPS receiver. The receiver and its antenna are very small and cheap, but the readings require filtering with data from other sensors to reliably approach the GPS targets.

Digital Compass: A Honeywell HMR3200 digital compass is used to help filter the readings from the GPS receiver by providing another measurement of the vehicle's heading.

Wheel Encoders: A US Digital E3 encoder is mounted on each of Beast's motors to provide feedback for the drive control algorithm. The readings are also used to estimate the velocity of the robot, which is another measurement in the GPS filter.

5.3 Hardware Architecture

The architecture of the computing and sensing hardware is shown in Figure 7. The drive control software is implemented on a Microchip 30F4011 dsPIC processor. It reads speed commands via RS-232, measures the pulses from the wheel encoders, and applies PI control using the encoder readings for feedback. Interfacing to the wireless remote control is done on a 30F2012 dsPIC processor. All of the high-level software, such as path planning, vision and JAUS, runs on a laptop.

Figure 7: Architecture of Beast's hardware components (IDS µEye LE camera, AEK-4P GPS, URG-04LX Lidar, LMS200 Lidar, HMR3200 compass, US Digital E3 encoders, 30F4011 dsPIC, Victor 884 H-bridges, DX5E radio, 30F2012 dsPIC, Dell Latitude 2100 laptop)
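The report does not reproduce the drive controller's firmware; purely as an illustration of the control law described above (a discrete PI loop closed on the encoder-measured wheel speed), a Matlab sketch of one control step might look like the following. The gains, sample time and encoder resolution are hypothetical placeholders, not Beast's values; the real controller runs as C firmware on the 30F4011.

    % Illustrative sketch of one step of the discrete PI wheel-speed loop.
    % Gains, sample time and encoder resolution are assumed values.
    function duty = pi_speed_step(cmd_rpm, counts_this_period)
        persistent integral
        if isempty(integral), integral = 0; end

        Kp  = 0.8;  Ki = 2.5;   % PI gains (assumed)
        Ts  = 0.01;             % control period in seconds (assumed)
        cpr = 400;              % encoder counts per wheel revolution (assumed)

        meas_rpm = (counts_this_period / cpr) / Ts * 60;   % counts in one period -> RPM
        err      = cmd_rpm - meas_rpm;
        integral = integral + err * Ts;

        duty = Kp * err + Ki * integral;       % PI control effort
        duty = max(min(duty, 100), -100);      % saturate the PWM duty cycle
    end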

5.4 Custom PCB for dsPIC Processors

To reliably provide the dsPIC processors with proper power and high-quality external oscillators, while not taking up a large amount of space, a custom PCB was designed and fabricated. The board has sockets to hold both the 30F4011 and the 30F2012, and brings all the I/O pins out to headers which can be easily accessed. To allow easy replacement of components in case of damage, all the capacitors, resistors, the LED and the voltage regulator have sockets as well. The board was fabricated by ExpressPCB™, using their CAD software. A picture of the actual PCB and the CAD schematic sent to ExpressPCB are shown in Figure 8.

Figure 8: Custom PCB and its CAD schematic

5.5 Manual Control and Wireless E-Stop

Manual control of Beast is achieved using a Spektrum DX5E radio controller for RC airplanes. The receiver outputs RC signals whose pulse widths vary according to the joystick positions on the controller. Because of this, it is very easy to measure the inputs from the joysticks using the input capture ports on a dsPIC processor and generate appropriate vehicle speed commands. These speed commands are then sent to the drive control dsPIC.

The wireless e-stop is also implemented using the DX5E, which has a dedicated channel normally used to deploy the landing gear of an RC airplane. The channel outputs two discrete pulse widths, and when the pulse width corresponding to the stop state of the switch is detected, the drive control dsPIC immediately disables PWM output to the motors to stop the vehicle.

5.6 Battery Life

Table 1 shows approximations of the power consumed by each hardware component powered from the main batteries. Based on these estimates, Beast can drive at full power for about 1.2 hours in the worst case before requiring a recharge.
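As a rough consistency check of that worst-case figure, assuming the two parallel pairs together provide about 2 × 19 Ah = 38 Ah at 24 V and counting the encoder entry in Table 1 once per motor: 38 Ah ÷ (30 + 0.83 + 0.5 + 2 × 0.34 + 0.0034) A ≈ 38 Ah ÷ 32 A ≈ 1.2 hours.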

Using a more realistic estimate of 6 amps for the two motors on average, Beast is expected to last about 5 hours. Assuming the batteries are almost completely discharged, the maximum amount of time it would take to recharge Beast at a 10 amp rate is around 4.5 hours. While testing Beast, these estimates were found to be quite accurate.

The laptop computer runs off its own battery, and also powers the camera and GPS receiver. Experimentation has shown that while running the software algorithms and powering the external USB devices, the laptop battery lasts around 4 hours.

Table 1: Power Consumption Estimates

Component              Max Current (A)   Max Power (W)
SICK LMS200            0.83              20
Hokuyo URG-04LX        0.5               6
Honeywell HMR3200      0.0034            0.041
US Digital E3 Encoder  0.34              1.7
Motors                 30                720

6 Software Strategy

All of the software algorithms running on the laptop, with the exception of the JAUS system, are implemented in Matlab/Simulink. This unified development environment allows for easy integration of the several different algorithms and makes debugging the overall system simpler.

6.1 Search-Based Path Planning

Beast's path planning system uses data from the Lidar sensors and camera and applies a simple search algorithm to find its way to the goal. For each of the 360 degrees around it, the robot computes a cost for traveling in that direction. The cost function is based on how clear the surroundings are and how much progress can be made toward the goal. Progress toward the goal is quantified according to (1):

    f_D(i) = λ_1 r_i + λ_2 d_i        (1)

where f_D(i) is the distance function in a given direction i, r_i is the distance to the nearest obstacle in direction i, d_i is the distance from this obstacle to the goal, and λ_1, λ_2 are constants. The geometry of this is shown in Figure 9. By adjusting the values of λ_1 and λ_2, the robot places different emphasis on making aggressive movements toward the goal.

The total cost function is then constructed from (1) together with a measure of how clear the surroundings are, as shown in (2):

    C(i) = f_D(i)^α f_O(i)^β        (2)

where C(i) is the total cost to go in direction i, f_O(i) is proportional to how far an obstacle is from the robot in that direction, and α, β are constants. The contribution of f_O lowers the cost when the obstacle is far away, thus encouraging the robot to explore open areas. By properly tuning the constants in (1) and (2), when the robot approaches its goal but sees an obstacle in its path, it is naturally attracted to the edge of the obstacle, which allows for very efficient traversal of it.

Figure 9: Search-based path planning geometry

To avoid situations where the robot reaches a well in the cost function and gets stuck, as well as to encourage it to keep moving in the same general direction, a two-step disallowed region is defined where the robot just came from. This is shown in Figure 10. Since the algorithm naturally goes to the edges of obstacles, it tends to get too close and slow down dramatically. Therefore, another preventive measure is taken, where the robot makes a reflective movement away from an obstacle once it gets within a certain threshold of it.

Figure 10: Disallowing short-term backtracking

The search algorithm was simulated in Matlab, where a test map was drawn to simulate obstacles, and a program was written to generate simulated Lidar scans depending on the current position and heading of the robot. The location of the goal point and the Lidar data are input to the search algorithm, and the robot tries to navigate itself to the goal. An example of this simulation is shown in Figure 11.

Figure 11: Example of the search-based path planning simulation (simulated Lidar scan, cost function plot, disallowed region, and the minimum-cost next goal point)
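As an illustration of how (1) and (2) combine to pick a travel direction, the following Matlab fragment evaluates the cost over all 360 candidate directions for a synthetic scan. The constants, the scan data, the sign of β and the width of the disallowed region are assumptions made for the example, not the team's tuned values.

    % Illustrative sketch of the cost evaluation in (1) and (2).
    lambda1 = 1.0;  lambda2 = 2.0;      % weights in (1) (assumed)
    alpha   = 1.0;  beta    = -0.5;     % exponents in (2); beta < 0 here so that
                                        % more clearance lowers the cost, as described
    angles = (0:359) * pi/180;          % one candidate direction per degree
    r      = 4 + 2*rand(1, 360);        % synthetic distance to nearest obstacle (m)
    goal   = [10; 3];                   % goal position in the robot frame (m)

    % distance from the obstacle point seen in each direction to the goal
    obs_x = r .* cos(angles);
    obs_y = r .* sin(angles);
    d = sqrt((obs_x - goal(1)).^2 + (obs_y - goal(2)).^2);

    fD = lambda1 * r + lambda2 * d;     % progress term, equation (1)
    fO = r;                             % clearance term, proportional to obstacle distance
    C  = (fD .^ alpha) .* (fO .^ beta); % total cost, equation (2)

    % disallowed region: block directions near where the robot just came from
    came_from = pi;                                 % example previous travel direction
    dang = mod(angles - came_from + pi, 2*pi) - pi; % wrapped angular difference
    C(abs(dang) < pi/3) = Inf;

    [~, best] = min(C);
    heading = angles(best);             % minimum-cost direction for the next move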

6.2 Kalman Filter Based Sensor Fusion

Since the AEK-4P GPS receiver lacks the accuracy to reliably reach the waypoints, a Kalman filter is used to fuse readings from the compass, wheel encoders and GPS into a much more accurate estimate of the vehicle's location.
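The report does not detail the filter's state model. As one possible realization of the fusion described above, the following Matlab sketch performs a single step of a constant-velocity position filter, with the encoder speed and compass heading folded into a velocity measurement and GPS fixes applied as position measurements. All noise values and the measurement set are illustrative assumptions.

    % Minimal sketch of one Kalman filter step fusing GPS, encoders and compass.
    % State x = [east; north; v_east; v_north]; noise values are assumed.
    function [x, P] = fuse_step(x, P, dt, gps_xy, wheel_speed, compass_heading)
        F = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1];   % constant-velocity model
        Q = 0.05 * eye(4);                            % process noise (assumed)
        x = F * x;
        P = F * P * F' + Q;

        % velocity measurement built from encoder speed and compass heading
        H = [0 0 1 0; 0 0 0 1];
        R = 0.1^2 * eye(2);                           % odometry noise (assumed)
        z = wheel_speed * [cos(compass_heading); sin(compass_heading)];
        K = P * H' / (H * P * H' + R);
        x = x + K * (z - H * x);
        P = (eye(4) - K * H) * P;

        % GPS position fix (roughly +/- 2 m), applied when one is available
        if ~isempty(gps_xy)
            H = [1 0 0 0; 0 1 0 0];
            R = 2.0^2 * eye(2);
            K = P * H' / (H * P * H' + R);
            x = x + K * (gps_xy(:) - H * x);
            P = (eye(4) - K * H) * P;
        end
    end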

6.3 Autonomous Challenge

For the Autonomous Challenge, the search-based path planning system is used, but adapted to accommodate the different task. The core decision-making algorithm remains the same, but the system also integrates lane line information and periodically updates the goal point.

6.3.1 Lane Detection

In the Autonomous Challenge, anything that is not green grass should be detected and avoided. Based on this assumption, a median thresholding technique is used to extract objects of interest from the grass background of an incoming camera image. After the median thresholding is performed, the locations of the objects of interest are measured. This process is illustrated in Figure 12.

Figure 12: Example image processing procedure (median thresholding followed by location extraction)

The median thresholding technique operates on 320x240 pixel images in the HSV color space. Since the majority of the pixels in any given image on the Autonomous Challenge course are generally grass, the median color value will follow the color of the grass. Therefore, any pixel whose color values are outside of a certain threshold in all three color planes is marked as a pixel of interest. To clean up the output image from the thresholding operation, a morphological closing is performed. The cleaned output image is shown in the middle of Figure 12.

To extract position information from the thresholded image, the location of the bottommost pixel of interest in each of nine fixed-width columns is recorded. These correspond to the colored dots in Figure 12. The dots are then mapped to their corresponding coordinates in the robot's coordinate frame according to a calibrated kinematic transformation. The locations of the points of interest are then input to a fuzzy logic system that quantifies how blocked the left, center and right regions are. These measures are used in the path planning algorithm to avoid the lines and update the goal point appropriately.
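A minimal Matlab sketch of this pipeline is given below, using rgb2hsv and the Image Processing Toolbox's imclose/strel. The file name, the per-channel thresholds and the structuring element are illustrative assumptions, not Beast's calibrated values; the "deviates in all three planes" rule follows the wording above.

    % Illustrative sketch of the median-thresholding lane/obstacle extraction.
    img = imread('course_frame.png');          % hypothetical 320x240 camera frame
    hsv = rgb2hsv(double(img) / 255);

    % the median colour of the frame approximates the grass colour
    med = median(reshape(hsv, [], 3), 1);
    tol = [0.08 0.25 0.25];                    % per-channel thresholds (assumed)

    % mark pixels that differ from the median beyond the threshold in all
    % three HSV planes, then clean up with a morphological closing
    diff3 = abs(hsv - reshape(med, 1, 1, 3));
    mask  = all(diff3 > reshape(tol, 1, 1, 3), 3);
    mask  = imclose(mask, strel('disk', 3));   % Image Processing Toolbox

    % bottommost pixel of interest in each of nine fixed-width columns
    [h, w] = size(mask);
    edges  = round(linspace(1, w + 1, 10));
    pts    = nan(9, 2);                        % [row, col] per column band
    for c = 1:9
        band = mask(:, edges(c):edges(c+1) - 1);
        [rows, cols] = find(band);
        if ~isempty(rows)
            [rmax, k] = max(rows);
            pts(c, :) = [rmax, cols(k) + edges(c) - 1];
        end
    end
    % pts would then be mapped into the robot frame by the calibrated
    % kinematic transformation and passed to the fuzzy blockage measures.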

6.3.2 Goal Point Selection

The goal point is updated based on a history of the past GPS readings. It is periodically changed to be 30 feet in front of the vehicle, along the line formed by the current location of the robot and where it was 10 seconds previously.

The lane detection system also outputs the angle of the lines that it sees, and this angle measurement is used to adjust the goal point as illustrated in Figure 13.

Figure 13: Goal point selection (original and adjusted goal points)

6.4 Navigation Challenge

The search-based path planning system is very well suited to the Navigation Challenge because the goal points are fixed. The task is then to pick a sequence of waypoints and directly input them as the goals in the path planning system.

6.5 Accuracy of Arrival at Waypoints

From experimentation, it was observed that the raw, unprocessed position readings from the AEK-4P receiver varied by around ±2 meters, which would be unreliable for arriving at the GPS waypoints. However, with the Kalman filter fusing the GPS readings with the wheel encoders and compass, the variation was brought down to about ±1.1 meters. Waypoint navigation tests using the output from the Kalman filter yielded much better results than similar tests using the raw GPS data.

6.6 JAUS Challenge

For the JAUS Challenge, it was decided to utilize the OpenJAUS project, an open source implementation of the JAUS protocol. The OpenJAUS functions are implemented in C, and are responsible for reading incoming JAUS messages, running state machines to govern the response and reaction to these messages, and generating properly formatted JAUS messages to transmit. To relay information to and from the robot's Matlab-based systems, loopback TCP is used. A diagram of the JAUS system is shown in Figure 14.

Figure 14: Diagram of the JAUS system implementation (OpenJAUS C implementation with input message interpreter, state machine and output message generator, connected to the COP and, over loopback TCP, to the Matlab software algorithms and Beast's systems)
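The relay code itself is not reproduced in the report. Purely as an illustration of the loopback-TCP idea on the Matlab side, the fragment below uses Matlab's tcpclient interface with a hypothetical port and raw binary framing; OpenJAUS defines the actual message formats exchanged with the COP.

    % Illustrative sketch of the Matlab side of the loopback-TCP relay.
    t = tcpclient('127.0.0.1', 24000);          % hypothetical local port

    pose = single([1.2, 3.4, 0.5]);             % example [x, y, heading] to report
    write(t, typecast(pose, 'uint8'));          % push robot state to the JAUS node

    reply = read(t, 12);                        % read a fixed-size reply (blocks)
    cmd   = typecast(uint8(reply), 'single');   % e.g. a relayed command, 3 singles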

7 Predicted Performance

7.1 Speed

While spinning at full speed, the wheels were measured to rotate at 148 RPM. With 13 inch wheels, this corresponds to a forward speed of 5.72 mph, which exceeds the competition's speed limit. Therefore, the drive control program on the dsPIC processor limits the output it can apply to the motors so that the fastest the wheels can rotate is 125 RPM. At 125 RPM, the speed of the robot is 4.83 mph.

7.2 Ramp Climbing Ability

To determine whether Beast would be able to climb the ramps on the Autonomous Challenge course, some simple estimates and calculations were made. The wet coefficient of friction between Beast's tires and the plywood was determined experimentally to be 0.282. With the weight of the robot at approximately 150 pounds and assuming nominal torque from each motor, this resulted in a maximum constant-speed climb angle of 15.7 degrees.
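These figures are consistent with quick checks: at 148 RPM, a 13 inch wheel covers 148 × π × 13 in ≈ 6,045 inches per minute, or about 5.7 mph, and the 125 RPM limit gives about 4.8 mph. If the climb limit is taken to be set by tire friction alone (an assumption made only for this check), the maximum constant-speed grade is arctan(0.282) ≈ 15.7 degrees, matching the figure above.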

Also, based on these estimates, Beast should be able to climb the approximately 8.5 degree ramps while accelerating at 1.37 m/s², without slipping.

8 Cost Breakdown of Components

Table 2: Cost Breakdown of the Development of Beast

Item                          Quantity   Price    Extended Price   Cost to Team
u-blox GPS Unit               1          $198     $198             $198
Optical Wheel Encoder         2          $52      $104             $104
SICK Lidar                    1          $4,000   $4,000           $0
Hokuyo Lidar                  1          $2,375   $2,375           $2,375
Digital Compass               1          $175     $175             $175
Machine Vision Camera         1          $380     $380             $380
Camera Lens                   1          $75      $75              $75
Dell Laptop                   1          $560     $560             $560
Electric Wheelchair           1          $1,100   $1,100           $0
12 Volt, 19 Ah Battery        4          $75      $300             $300
PCB Fabrication               N/A        $60      $60              $60
Motor Controller              2          $115     $230             $230
Frame Materials               N/A        $400     $400             $0
Wire, Cabling and Connectors  N/A        $200     $200             $200
ICs and Circuit Components    N/A        $100     $100             $100
Total                                             $10,257          $4,757

9 Conclusion

Beast has proven to be very rugged, efficient and reliable, performing well on any kind of terrain. The new artificial intelligence design shows promising results, and the Oakland University team has great confidence going into this year's competition.

Acknowledgements

The Oakland University IGVC team would like to express its gratitude to the School of Engineering and Computer Science for providing the funding and lab space without which participation in IGVC would not be possible. Special thanks also go to our advisor, Professor Ka C. Cheok, from whom we continually learn valuable techniques that not only help us in IGVC, but prepare us for the future.