CALIFORNIA STATE UNIVERSITY, NORTHRIDGE

A Hovering Drone for Real Time Environment Mapping and Reporting, for Use by Law

Enforcement

A thesis submitted in partial fulfillment of the requirements

For the degree of Master of Science in Computer Science

By

Edward Eisen

December 2015

The thesis of Edward Eisen is approved:

Capt. Scott VanScoy Date

Dr. John Noga Date

Dr. Jeffrey Wiegley, Chair

Date

California State University, Northridge

Table of Contents

Signature Page .................................................................................................................... ii

List of Figures ......................................................................................................................v

Abstract .............................................................................................................................. vi

Chapter 1: Introduction ........................................................................................................1

1.1 Motivation ..................................................................................................................1

1.2 Project Goals ..............................................................................................................1

1.3 Background ................................................................................................................2

1.3.1 Quadcopter ..........................................................................................................2

1.3.2 PWM....................................................................................................................3

1.3.3 Raspberry Pi B+ ..................................................................................................3

1.3.4 Microsoft Xbox 360 Kinect ...............................................................4

1.3.5 I2C ........................................................................................................................4

1.4 Similar Research ........................................................................................................4

Chapter 2: Hardware ............................................................................................................6

2.1 Flight System ..............................................................................................................6

2.2 Computing System .....................................................................................................8

2.3 Power System ...........................................................................................................12

2.4 3D Printing ...............................................................................................................13

Chapter 3: Software ...........................................................................................................17

3.1 Python vs. C .............................................................................................................17

3.2 Python Code Overview ............................................................................................17

3.3 PID Loop Controller Details ....................................................................................19

3.4 Remote Computer Software .....................................................................................23

Chapter 4: Results ..............................................................................................................24

4.1 Testing ......................................................................................................................24

4.2 Positive Results ........................................................................................................24

4.2.1 Quadcopter Build...............................................................................................24

4.2.2 Partial Autopilot Control ...................................................................................25

4.2.3 Manual Override ................................................................................................25

4.2.4 3D Modeling with Kinect ..................................................................................25

4.2.5 Wi-Fi Video Streaming .....................................................................................25

4.3 Negative Results .......................................................................................................26

Chapter 5: Future Work .....................................................................................................28

5.1 Continued Work With Raspberry Pi – Kinect Resources ........................................28

5.2 Improved Test Conditions ........................................................................................28

5.3 Improved Manual Override Circuit ..........................................................................28

5.4 Further Sensor Integration into PID Loop ................................................................29

5.5 Raspberry Pi 2 ..........................................................................................................29

5.6 GPS ...........................................................................................................................30

References ..........................................................................................................................31

Appendix A: QuadResources.py ........................................................................................33

Appendix B: AutoPilot.py .................................................................................................39

List of Figures

1.1 Quadcopter Motor Layout..............................................................................................2

1.2 PWM Servo Signal ........................................................................................................3

2.1 KK-Mini .........................................................................................................................7

2.2 Raspberry Pi ...................................................................................................................9

2.3 PWM to High/Low Circuit ..........................................................................................10

2.4 Transistor Control Switch ............................................................................................10

2.5 Kinect, Before and After Stripping ..............................................................................11

2.6 Conbrov WF81 Webcam ............................................................................................12

2.7 Hardware Setup ............................................................................................................14

2.8 Cost and Weight of Hardware ......................................................................................15

2.9 Fully Built Quadcopter with Kinect .............................................................................16

3.1 Standard PID Loop Block Diagram .............................................................................20

3.2 Effects of changing the proportional constant .............................................................21

3.3 Effects of changing the integral constant .....................................................................22

3.4 Effects of changing the derivative constant .................................................................22

Abstract

A Hovering Drone for Real Time Environment Mapping and Reporting, for Use by Law

Enforcement

By

Edward Eisen

Master of Science in Computer Science

Surveillance and intelligence are key to keeping officers safe during dangerous situations; however, the robots currently used to assist in these duties are too expensive for many smaller departments. This thesis examines the use of remote-controlled multirotor copters as a form of inexpensive remote reconnaissance. A custom-built quadcopter is used in conjunction with the Microsoft Kinect, a device used to gather RGB and 3D images, to relay information to an offsite user. To reduce the training required of the user, the quadcopter is designed to fly autonomously.

Chapter 1 covers the topic background and the project goals. Chapter 2 explains the hardware used; details on weight, pricing, and the hardware build are included. Chapter 3 details the code and resources used, containing information on the autonomous flight control, resources for using the MS Kinect, and the programs used for remote monitoring. Chapter 4 explains the results of the project and describes which aspects were successful

and which were unsuccessful. Chapter 5 describes possible future work to be done,

including different approaches which may lead to better success.

Chapter 1

Introduction

1.1 Motivation

Law enforcement officers continually put their lives on the line for the safety of the community; however, as technology grows and improves, dangerous tasks can become safer for officers. One form of technology helping to keep officers safe is surveillance. More drones are produced for law enforcement agencies every year. Despite this, there are some setbacks to purchasing these devices.

The main setback is money. Many of these devices cost between $5,000 and $100,000 [13], and that does not necessarily include maintenance or upgrades. It is possible for a department to never use one of these high-priced devices before it is out of date and must be replaced.

Another setback is training. These devices must be piloted by an officer, and it is likely that an officer must attend regular training to be prepared to use one of these robots. This training can be expensive, both in monetary cost and in time.

1.2 Project Goals

This project attempts to provide research in low cost surveillance drones. The

main goal is to provide an easy to use drone, requiring little to no training, which is low

in cost. The goal of this project is to create a virtual, 3D environment using data collected

by an autonomous drone. The drone would communicate with a remote computer over

Wi-Fi, sending RGB images, 3D models and positioning data. The remote computer

would then display this data to the user. This allows the user to examine a room without having to enter it or know how to pilot the drone directly. Though this flying drone would not be nearly as durable as a ground-based drone, it would be much less expensive to replace.

1.3 Background

1.3.1 Quadcopter

A quadcopter is a multirotor copter consisting of four motor/propeller assemblies. The two front propellers spin in opposite directions, as do the two back propellers. Increasing all propeller speeds produces lift. Increasing a single propeller's speed lifts that side of the copter and creates lateral movement. Rotational movement is obtained by increasing or decreasing the speed of two opposing propellers (e.g. front-left and back-right); this creates torque and spins the copter. A flight control board is used to translate user input into motor speeds. Quadcopters have become increasingly popular in recent years due to their low cost and improved stability-control technology.

Fig 1.1: Quadcopter Motor Layout
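The control mixing just described can be sketched in code. The layout and sign conventions below are illustrative assumptions for an X-configuration quadcopter; in this project the KK-mini firmware performs the actual mixing.

```python
def mix_motors(throttle, roll, pitch, yaw):
    """Map stick commands to four motor speeds (illustrative sign conventions).

    throttle is 0.0..1.0; roll, pitch, and yaw are -1.0..1.0.
    Motors: front-left (FL), front-right (FR), back-left (BL), back-right (BR).
    FL/BR spin one direction and FR/BL the other, so yaw speeds one
    diagonal pair and slows the other.
    """
    fl = throttle + pitch + roll - yaw
    fr = throttle + pitch - roll + yaw
    bl = throttle - pitch + roll + yaw
    br = throttle - pitch - roll - yaw
    # Clamp each command into the valid 0..1 motor range.
    clamp = lambda x: max(0.0, min(1.0, x))
    return clamp(fl), clamp(fr), clamp(bl), clamp(br)
```

With all attitude inputs at zero, every motor receives the same throttle and the copter climbs or descends evenly; a pure roll input raises one side and lowers the other.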

1.3.2 PWM

Pulse Width Modulation (PWM) is the standard for communication between radio receivers and servos. A PWM signal is a series of square waveforms with a period of 20ms and a pulse width between 1ms and 2ms, where 1ms = -100% and 2ms = 100%. On the quadcopter, PWM signals are used to communicate flight control from the radio receiver to the flight control board and motor speed from the flight control board to the ESCs (Electronic Speed Controls).

Fig 1.2: PWM Servo Signal
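The 20ms period and 1-2ms pulse band map linearly onto the -100% to 100% control range. A small helper (an illustrative sketch, not from the thesis code) makes the conversion explicit:

```python
PERIOD_MS = 20.0     # standard servo PWM frame length
MIN_PULSE_MS = 1.0   # corresponds to -100%
MAX_PULSE_MS = 2.0   # corresponds to +100%

def percent_to_pulse_ms(percent):
    """Convert a control percentage (-100..100) to a pulse width in ms."""
    if not -100.0 <= percent <= 100.0:
        raise ValueError("percent out of range")
    # 0% sits at the 1.5 ms center; each 100% is a 0.5 ms swing.
    return 1.5 + (percent / 100.0) * 0.5

def pulse_ms_to_percent(pulse_ms):
    """Convert a pulse width in ms back to a control percentage."""
    return (pulse_ms - 1.5) / 0.5 * 100.0
```

For example, a neutral stick is a 1.5ms pulse, and a 1.75ms pulse reads back as 50%.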

1.3.3 Raspberry Pi B+

The Raspberry Pi is a cross between a microcontroller and a PC. It has a small form factor and is able to run ARM-based operating systems. It includes four USB ports, one of which can be used with a Wi-Fi adapter. At the time this project was started,

the B+ was the latest model; since then, the Raspberry Pi 2 has been released. Despite this, the B+ was kept for this project.

1.3.4 Microsoft Xbox 360 Kinect

The Kinect is a gaming device used to read a gamer's body position and movement as input to a game. It does this by reading a 3D image and interpreting the user's figure. The Kinect has both an RGB camera and a depth scanner. The depth scanner reads a 3D image by projecting an infrared dot matrix; by measuring changes in the matrix, the Kinect can determine surface distance. The Kinect is powered by a 12v power supply and connects to a computer using a USB cable [4].

1.3.5 I2C

This is the main communication bus used between the Raspberry Pi and the sensors. I2C requires four wires: a data line (SDA) and a clock line (SCL), plus two lines for power: voltage (3.3v or 5v) and ground. Much of the communication is taken care of by the Raspberry Pi and the resource programs provided for the individual sensors.

1.4 Similar Research

Some research has already been done in combining the Kinect Sensor with a

quadcopter. Two examples stand out:

DIY Drones developed a quadcopter that uses the Kinect sensor to auto level and

hold position if an object comes within a certain distance of it [3].

MIT developed a quadcopter that uses the Kinect to fly itself within a room and

capture a 3D model of the room.

Though these projects are similar to this one, there are some distinct differences. This project focuses on budget and includes other sensors for environment scanning. Additionally, the Raspberry Pi is being tested as an inexpensive onboard microcontroller/computer.

Chapter 2

Hardware

2.1 Flight System

The job of the flight system is to receive input from a radio controller or onboard

computer and make flight adjustments. Though the system attempts to self-level, all other

flight decisions are made by the computing system. The flight system is comprised of the

following:

Flight control board (KK-mini)

Transmitter (Turnigy 9X 9CH)

Radio receiver (9X8C v2)

ESCs (Electronic Speed Control) (Turnigy Plush 12amp ESC/BEC)

Motors (Multistar Elite 2204-2300KV)

Frame (FPV250LH Low Hung Clean and Dirty)

The KK-mini is used for the flight control board. The KK-mini is an optimal solution

at $25 and 36x36mm, making it one of the smallest and least expensive multi-rotor

controllers. The flight controller translates control input from a receiver or computing

system and outputs to the four ESCs. Inputs are PWM signals for throttle (up/down),

elevators (forward/back), ailerons (roll), and rudder (left/right). The outputs to the ESCs

are also PWM signals, controlling power to each motor.

Fig 2.1: KK-Mini

The radio receiver is used for manual control of the device. It receives PPM

(Pulse Position Modulation) radio signals from a remote controller and outputs the PWM

signals to the control board. For this project the Turnigy 9X 9Ch transmitter and 9X8C v2 receiver were chosen, as they offered the best price for the functionality. The Turnigy 9X 9Ch is a programmable nine-channel remote controller at $83.

An ESC is used to control the speed of each motor. It translates PWM signals to

voltage output. Four ESCs are required for a quadcopter (one for each motor). The main

consideration when choosing the ESCs is the motors. If the motors pull more amperage

than the ESCs can provide, the ESCs will burn out. The Turnigy Plush 12amp ESCs were

chosen for this project at $10 each.

While choosing the motors, the power to lift ratio is the main consideration. Many

motors were tested by HobbyKing.com, which listed their lift at different power levels,

using different propeller sizes. The Multistar Elite 2204-2300KV was listed as having a

maximum lift of 371g per motor, at 7.5A (49.5 g/A) with 5x3 propellers and a maximum

lift of 591g per motor, at 9.9A (59.7 g/A) with 5x4 propellers. Though 5x4 propellers

have a more desirable power to weight ratio (as long as the weight is kept under 2364g),

these were hard to find at a reasonable price; however, 5x3 propellers were found at $3 for four and offered a good balance.

The main consideration in deciding on the FPV250LH frame is its size. A small

frame is desired; however, too small of a frame would limit the propeller size. At 250mm,

the FPV250LH was a desirable choice at $20. It also includes a cage, which connects to

the main frame using eight rubber balls. These reduce noise from the motors and

propellers, allowing clearer video capture.

2.2 Computing System

The computing system receives input from sensors and decides on course corrections.

Ideally, it would receive input from a Kinect and send 3D/RGB image information to a

remote computer over a Wi-Fi connection to be recompiled into a virtual environment;

however, because it was not possible to integrate the Kinect into the system in this way, a

Wi-Fi camera was supplemented. The computing system is comprised of the following:

Raspberry Pi

Sonar rangefinders (MB1242 I2CXL-MaxSonar-EZ4)

Accelerometer/gyro/compass board (Adafruit LSM9DS0)

Servo control board (Adafruit PCA9685)

Manual override board (custom build)

Microsoft Xbox 360 Kinect

Wi-Fi camera (Conbrov WF81)

The Raspberry Pi acts as the intelligence for the computing system. It receives input

from the rangefinders and the LSM9DS0 and sends control signals to the KK-mini using

the servo control board. The Raspberry Pi also switches control between the servo

controller and the 9X8C v2 receiver, using the manual override board.

Fig 2.2: Raspberry Pi

Five MB1242 rangefinders are used to sense objects on the sides, above, below,

and in front of the quadcopter. The MB1242 rangefinder has a resolution of 1cm and can

sense an object up to 750cm away and as close as 1.5cm. The larger an object is, the

farther away the rangefinders are able to sense it. The MB1242s communicate with the

Raspberry Pi over the I2C bus.
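Reading one of these sensors follows the I2CXL-MaxSonar command sequence: write a range command, wait for the measurement, then read back two bytes of centimeters. The sketch below assumes the sensor's factory-default address and the Linux smbus-style interface; the `bus` object would be `smbus.SMBus(1)` on the Pi.

```python
import time

RANGE_CMD = 0x51        # I2CXL-MaxSonar "take range reading" command
DEFAULT_ADDR = 0x70     # factory-default I2C address of the MB1242
MEASURE_DELAY_S = 0.1   # roughly the 80-100 ms per-reading hardware delay

def read_range_cm(bus, addr=DEFAULT_ADDR):
    """Trigger a reading on an MB1242 and return the range in centimeters.

    `bus` is an smbus.SMBus-style object exposing write_byte() and
    read_i2c_block_data().
    """
    bus.write_byte(addr, RANGE_CMD)    # start a measurement (sonar ping)
    time.sleep(MEASURE_DELAY_S)        # the sensor needs time to listen
    high, low = bus.read_i2c_block_data(addr, 0, 2)
    return (high << 8) | low           # big-endian range in cm
```

This per-reading delay is the same hardware bottleneck noted later in the software chapter, which is why range polling gets its own thread.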

The LSM9DS0 board reads acceleration, rotation, and magnetic fields in 6

directions. A fusion algorithm combines these readings to provide three-dimensional

orientation information. This information is also communicated to the Raspberry Pi over

the I2C bus.

Though the Raspberry Pi is able to directly interface with the KK-mini control

board through PWM, the signal is noisy and unsuitable for accurate flight control.

Instead, the PCA9685 is used to communicate control signals to the KK-mini. The

Raspberry Pi communicates with the PCA9685 over the I2C bus and the PCA9685

communicates with the KK-mini using PWM signals.
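At a 50 Hz servo rate the PCA9685 divides each 20ms frame into 4096 ticks, so a pulse width must be converted to a tick count before it can be written to a channel. A sketch of that conversion follows; the driver calls in the comment reflect the legacy Adafruit_PWM_Servo_Driver API used in Chapter 3, and the channel number is an assumption.

```python
PWM_FREQ_HZ = 50         # 20 ms servo frame
TICKS_PER_FRAME = 4096   # PCA9685 12-bit resolution

def pulse_ms_to_ticks(pulse_ms):
    """Convert a servo pulse width in ms to PCA9685 on-ticks at 50 Hz."""
    frame_ms = 1000.0 / PWM_FREQ_HZ
    return int(round(pulse_ms / frame_ms * TICKS_PER_FRAME))

# Usage with the legacy Adafruit driver (illustrative):
#   pwm = PWM(0x40)                # 0x40 is the PCA9685 default address
#   pwm.setPWMFreq(PWM_FREQ_HZ)
#   pwm.setPWM(0, 0, pulse_ms_to_ticks(1.5))   # channel 0, centered stick
```

A centered 1.5ms pulse works out to roughly 307 of the 4096 ticks.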

Fig 2.3: PWM to High/Low Circuit

Fig 2.4: Transistor Control Switch

The manual override board has two functions. The first is to convert a PWM

signal from the 9X8C receiver into a DC logic signal to be read by the Raspberry Pi as a

manual override indicator. To do this, the PWM signal is first run through two low pass

filters. These flatten the PWM signal to a DC signal. Though high and low signals are

output from the low pass filters, the high is too low to be read by the Raspberry Pi as a

high signal. To boost the signal an op-amp (operational amplifier) with a potentiometer is

used. This output signal is sent to one of the Raspberry Pi's GPIO pins. The second function is to switch control between the Raspberry Pi and the 9X8C receiver. An array of

NPN transistors is used for each flight control signal. When a high PWM signal is read

from the 9X8C receiver, a high DC signal is sent to the Raspberry Pi. In response, the

Raspberry Pi switches high/low output to the 9X8C and Raspberry Pi transistor arrays.

The manual override board also acts as an I2C and power bus for the Raspberry Pi and

sensors.
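On the Raspberry Pi side, the boosted override line amounts to a logic level that selects which transistor array drives the KK-mini. The routing decision can be sketched as a small pure function, with the GPIO interrupt wiring shown in comments (pin number and handler name are illustrative assumptions, not the exact AutoPilot.py code):

```python
def select_control_source(override_line_high):
    """Decide which transistor array passes signals to the KK-mini.

    Returns (pi_enabled, receiver_enabled): the logic levels to write to
    the autopilot-side and receiver-side transistor arrays.
    """
    if override_line_high:      # pilot has flipped the override channel
        return (False, True)    # hand control to the 9X8C receiver
    return (True, False)        # autopilot (PCA9685) keeps control

# Hooking this to an edge-detect interrupt with RPi.GPIO (illustrative):
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(OVERRIDE_PIN, GPIO.IN)
#   GPIO.add_event_detect(OVERRIDE_PIN, GPIO.BOTH, callback=on_edge)
# where on_edge reads GPIO.input(OVERRIDE_PIN) and writes the returned
# levels to the two enable lines.
```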

Despite software integration problems, the Kinect was successfully integrated on

the hardware side. To do this, the Kinect was stripped of all unnecessary components. The

outer casing, much of the metal framework, the stand motor, and the microphones were

all removed. This reduced the weight of the Kinect from 440g to 130g and greatly

reduced the footprint. A USB adapter for the Kinect was modified to receive power from

the quadcopter's battery and connect the Kinect to the Raspberry Pi through USB.

Fig 2.5: Kinect, Before and After Stripping

Since the Kinect could not be integrated into this project on the software side, a Wi-Fi camera was

settled on. The WF81 has an independent battery and can communicate with a remote

computer over a router or act as a server host. The video feed can be viewed through a

web browser or mobile app.

Fig 2.6: Conbrov WF81 Webcam

2.3 Power System

The power system is used to power the devices on the quadcopter. The power system

is comprised of:

Battery (Multistar 3S 4000mAh LiPo pack)

Power distribution board (HobbyKing PDB)

BEC (Battery Eliminator Circuit) for KK-mini (Turnigy Plush 12amp ESC/BEC)

BEC for Raspberry Pi (BW RC 3A 5V UBEC)

Power transformer for Kinect (DROK LM2577)

Though the LiPo battery chosen is heavy (250g), a single large battery turned out to be more efficient than several small batteries. The battery connects to the PDB. The PDB

is a simple solder board. It provides direct battery power to many of the hardware

components.

Because the KK-mini and Raspberry Pi require 5v power, BECs are used to reduce

and even out power from the main battery. The Raspberry Pi BEC input is soldered to the

PDB and the output is connected to a stripped down, micro USB cable. The KK-mini

BEC is soldered to the PDB and provides power to the KK-mini through the motor output

connectors on the KK-mini board.

Because the Kinect requires 12v and 3S batteries can drop under 11v (at about half

charge), a simple BEC is unable to provide power to the Kinect. Instead a step-up

transformer is required to provide power to the Kinect. Though the Kinect can handle

upwards of 12.5v, anything below 12v failed to power the device. Because of this, the

transformer is set to 12.3v. The transformer input is soldered to the PDB and the output is

spliced into the USB input on the Kinect.

2.4 3D Printing

The first attempt at attaching the computing devices to the quadcopter was done

with Styrofoam support and electrical tape. This quickly became unwieldy and fragile. To

correct this, a 3D printer was used to create custom parts. A shell to house the computing

system was printed/built and attached to the bottom of the quadcopter. This shell includes

small struts for landing. Safety struts are also attached under each motor (protruding

outward) in case the copter tips over on landing. The HICTOP Prusa I3 printer kit was used for printing, SketchUp for design, and Cura for g-code conversion for the printer.

Fig 2.7: Hardware Setup

Item                               Weight    Cost
KK-mini                            9g        $24.99
9X 9CH Transmitter                 NA        $82.79
9X8C v2 Receiver                   18g       NA
12amp ESC/BEC (4)                  10g       $40.40
Elite 2204-2300KV Motors (4)       96g       $66.99
Propellers (4)                     4g        $3.84
FPV250LH Frame                     180g      $19.99
Raspberry Pi B+ w/ Wi-Fi Adapter   52g       $48.00
MB1242 Rangefinder (5)             45g       $174.75
PCA9685 Servo Board                7g        $14.95
Manual Override Board              10g       $10.00
WF81 Webcam                        21g       $49.99
MS Kinect                          130g      $19.99
3S 4Ah LiPo Battery                250g      $25.13
Power Distribution Board           4g        $2.22
5V UBEC                            10g       $7.99
DROK LM2577 Power Transformer      22g       $10.20
Computing System Housing           110g      NA
Misc. Parts                        60g       $30.00
Total (w/ Kinect)                  1017g     $582.23
Total (w/ Webcam)                  908g      $612.23

Fig 2.8: Cost and Weight of Hardware

Fig 2.9: Fully Built Quadcopter with Kinect

Chapter 3

Software

3.1 Python vs. C

The first decision to be made while working with the Raspberry Pi was which

programming language to use. Both Python and C are well supported on the Raspberry

Pi; however, some of the I2C devices did not have prebuilt C libraries for the Raspberry

Pi. Though it would have been possible to custom build these libraries using Arduino templates, this would have been very time consuming.

Another major factor in this decision was the ease of code writing. Writing certain

code in Python was much easier than in C. For example, interrupts in Python were easier

to write and easier to make stable. The only downside to Python was speed. Though

Python has come a long way in optimization, it is still consistently slower than C [2].

3.2 Python Code Overview

It was decided that Python would be used unless its speed became a limitation; however, this never became the case, since much of the bottleneck was at the hardware level. Specifically, the rangefinders required approximately 80ms to obtain a new range reading. Because of this, Python was kept as the programming language.

The main code on the Raspberry Pi controlled flight decisions. Multi-threading

was used to limit the effects of time delays. In this way, fast processes were not limited

by slower processes. The following threads were implemented in the AutoPilot.py code:

exitSig(): Waits for input from the user and signals other threads to end.

getRanges(): Continually pings rangefinders for updated readings.

getSensors(): Continually pulls readings from the LSM9DS0 board.

flightControl(): Analyzes sensor readings and changes sensor targets.

PID(): Compares current sensor readings to sensor targets and makes appropriate

control adjustments.
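The thread structure above can be sketched with Python's threading module. The worker bodies and loop periods here are placeholders standing in for the real loops in AutoPilot.py (Appendix B); only the pattern (independent loop rates plus a shared shutdown signal, as in exitSig()) is the point.

```python
import threading
import time

stop_event = threading.Event()   # set by the exitSig() equivalent below

def worker(name, period_s, results):
    """Placeholder loop: each thread runs at its own rate, so the slow
    ~80 ms sonar polling never stalls the faster IMU and PID loops."""
    while not stop_event.is_set():
        results[name] = time.time()   # stand-in for a sensor read/update
        time.sleep(period_s)

results = {}
threads = [threading.Thread(target=worker, args=(n, p, results))
           for n, p in [("ranges", 0.08),    # sonar-limited loop
                        ("sensors", 0.01),   # IMU polls much faster
                        ("pid", 0.02)]]      # control update loop
for t in threads:
    t.start()
time.sleep(0.2)     # let the loops run briefly
stop_event.set()    # exitSig(): signal every thread to end
for t in threads:
    t.join()
```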

QuadResources.py was written to help abstract the sensor readings and servo control. This program uses the RTIMULib resources [12] for accessing the LSM9DS0 sensor board and Adafruit_PWM_Servo_Driver.py [1] for accessing the PCA9685 servo control board. Two classes were written into QuadResources.py:

1. Sensors:

a. pingRanges(directions): Initiates a range reading in the direction passed in

or ‘all.’

b. getRange(direction, noWait): Pulls the range reading in the direction passed in, or a dict of all readings if 'all' is passed in. If noWait is set to True, the function will not delay for a new reading.

c. getGyros(direction): Pulls the current orientation reading (in degrees) from the LSM9DS0 board, about the axis passed in, or as a dict if 'all' is passed.

d. getAccel(direction): Pulls the current acceleration reading from the LSM9DS0 board, in the direction passed in, or as a dict if 'all' is passed.

e. getPollInterval(): Returns recommended time between LSM9DS0 sensor

reading polls.

2. Servos:

a. set(servo, percent): Sets the designated servo to the designated percentage. All servos except aux and throttle range from -100% to 100%; aux and throttle range from 0% to 100%.

b. zeroOut(): Sets all servo controls to 0%.

c. get(servo): Returns current value of specified servo direction.

d. manOverrideInit(): Initializes the manual override state and interrupt. The interrupt watches for rising and falling edges on the manual override line and switches control between the servo control board and the radio receiver.

See code (Appendixes A and B) for more details.
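The noWait behavior of getRange() hinges on a per-direction timestamp recorded when the rangefinder was last pinged. A hardware-free sketch of that timing logic follows; fake_read() is a hypothetical stand-in for the I2C read performed in pullRange(), and the 7.5 ft value is invented for illustration.

```python
import time

range_timestamp = {'front': 0.0}  # when each direction was last pinged
ranges = {'front': 0.0}           # last known reading per direction

def ping_range(direction):
    # In the real code this writes the "take range" command (0x51) over
    # I2C; here we only record when the ping was issued.
    range_timestamp[direction] = time.time()

def fake_read(direction):
    # Stand-in for the I2C word read and unit conversion in pullRange().
    return 7.5  # feet (invented value)

def get_range(direction, no_wait=False):
    elapsed = time.time() - range_timestamp[direction]
    if elapsed < 0.1:  # rangefinder needs ~100 ms to settle
        if no_wait:
            return ranges[direction]  # stale value, but returns immediately
        time.sleep(0.1 - elapsed)     # wait out the remainder
    ranges[direction] = fake_read(direction)
    return ranges[direction]

ping_range('front')
print(get_range('front', no_wait=True))  # → 0.0 (stale value, no delay)
print(get_range('front'))                # → 7.5 (after waiting out the 100 ms)
```

The PID loop uses the noWait path when a stale reading is preferable to blocking the control cycle.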

3.3 PID Loop Controller Details

The PID loop is the standard feedback loop for control systems. It continually

integrates sensor readings into the control variable. A PID loop is implemented for each

of the four servo control outputs: throttle, elevators, ailerons, and rudders. There are three

terms of the PID loop: Proportional, Integral, and Derivative.


Fig 3.1: Standard PID Loop Block Diagram [11]

The proportional term is the error (e) between the current position/orientation and the target position/orientation, multiplied by a constant: P = KP · e. This means that the larger the error, the larger the proportional term. Increasing KP increases the speed of correction; however, it can create instability and overshooting of the target if set too high.

The integral term is the accumulated error: I = KI · Σe. This means that the longer an error persists, the larger the integral term gets. The integral term corrects for steady-state error; in other words, if the proportional term is not correcting the error, the integral term will. Increasing KI will increase the rate of steady-state error correction; however, it can add instability to the system if set too high.

Finally, the derivative term is the rate of change between the current error and the last error: D = KD · (ei-1 − ei), where ei is the current error. This means that the faster the error is being corrected, the lower the derivative term becomes. Increasing KD will help reduce overshooting and dampen the overall PID response; however, it can overly slow the response time if set too high. The derivative term is left out of many control systems because it can be hard to predict the real-world stability effects of this term [6, 8].
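The three terms combine into a single correction per control cycle. The sketch below shows one PID update using the same sign convention as the equations above (the derivative uses ei-1 − ei, so it opposes fast corrections); the gains are arbitrary example values, not the tuned constants from this project.

```python
def pid_step(target, current, state, kp=1.0, ki=0.1, kd=0.5):
    """One PID update; state carries the integral sum and the last error."""
    e = target - current
    p = kp * e                      # proportional: scales with current error
    state['i_sum'] += e             # integral: accumulated error over time
    i = ki * state['i_sum']
    d = kd * (state['e_last'] - e)  # derivative: e(i-1) - e(i), damps fast changes
    state['e_last'] = e
    return p + i + d

state = {'i_sum': 0.0, 'e_last': 0.0}
out = pid_step(target=5.0, current=3.0, state=state)
# e = 2.0, so P = 2.0, I = 0.2, D = 0.5 * (0.0 - 2.0) = -1.0, giving out = 1.2
```

In the autopilot, one such update would run per servo output (throttle, elevators, ailerons, rudder), with the result clamped to that servo's allowed range.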


Though the PID function is simple to code, it can be hard to perfect. The key to a successful PID function is tuning: each constant (K) is increased until stability is compromised, then reduced to just under this value. This is, however, only a guideline; at least some trial and error is usually needed to fine-tune the PID function.

Fig 3.2: Effects of changing the proportional constant [10]


Fig 3.3: Effect of changing the integral constant [10]

Fig 3.4: Effect of changing the derivative constant [10]


3.4 Remote Computer Software

To interface with the Raspberry Pi, an SSH connection was set up using PuTTY. This was used to start and monitor programs running on the Raspberry Pi. Since it can be time-consuming to program over SSH, much of the work was done on the remote computer and transferred to the Raspberry Pi using FileZilla.

The Kinect SDK with Fusion was also used on the remote computer; however, since no method was found for remotely interfacing with the Kinect, this software was not fully integrated. These resources were tested and worked with the Kinect while it was connected through a USB cable.


Chapter 4

Results

4.1 Testing

To test the project's functionality, a medium-sized room (about 25x15 feet) was set up. Cardboard was used in an attempt to create flat walls for easier sensor readings. The autopilot program was gradually given control of the quadcopter. Each servo control was tested on its own, with the other three controls handled manually. Ideally, as each piece of the PID loop was properly tuned, additional control directions would be handed over to the autopilot program until the quadcopter was able to fly itself; however, this was never fully achieved.

4.2 Positive Results

The following progress was made on this project:

4.2.1 Quadcopter Build

The custom quadcopter was successfully built with a printed casing for the computing system. Additionally, the stripped-down Kinect was integrated into the quadcopter, even though software integration was not successful. The quadcopter was able to hover with all modules mounted; however, it required about 75% throttle to maintain a hover above three feet. Though this was higher than originally anticipated,


flight time was still acceptable, at longer than 10 minutes. Under manual flight control, the quadcopter's performance was fairly stable.

4.2.2 Partial Autopilot Control

The PID loop was able to control each individual control direction with some

level of success; however, when more than two directions were controlled by the flight

system, system instability became an issue.

4.2.3 Manual Override

The manual override circuit was able to convert the PWM signal from the 9X8C receiver into a digital high/low signal for the Raspberry Pi. Once the Raspberry Pi read a change in input, it switched control between itself and the 9X8C receiver.

4.2.4 3D Modeling with Kinect

By connecting the stripped-down Kinect directly to a PC with a USB cable, the Fusion test programs provided by Microsoft were able to build 3D models and stream video.

4.2.5 Wi-Fi Video Streaming

Using the WF81 webcam, it was possible to stream a video feed from the

quadcopter to the remote computer using the Google Chrome web browser.


4.3 Negative Results

Several issues were encountered while working on this project. Though the project is still viable, the main goals were not accomplished in the desired time. This was because of the amount of custom work needed during the build of the quadcopter to interface all the devices together. With more man-hours or a longer period of time, the project might have met the desired goals. The two main goals not met were self-sustained, autonomous flight and 3D environment building with the Kinect.

Though the quadcopter was able to handle some flight control, it was unable to maintain fully autonomous flight. There are several reasons for this:

Rangefinders: Though the rangefinders can be very accurate, they have a low refresh rate and are prone to misreads. When misreads are discarded, the effective reading rate becomes even slower. Reading accuracy is also very dependent on the material being sensed: the softer the material, the more likely it is to absorb sound and cause misreads. Since the rangefinders were the primary source of input for the autopilot, these issues caused major errors in autonomous flight.

PID Tuning: Though the PID loop was tuned well enough to control the quadcopter, it was hard to find a good balance between fast response and stability. Finer tuning would be a large step towards stable autopilot flight control.


Testing Conditions: Finding a facility in which to test the autopilot proved difficult. An ideal room (just over 25x25 feet with flat, solid walls and minimal obstacles) could not be found. The best that could be done with limited resources was to use a garage and put up walls made of cardboard. These walls protected items stored in the garage and created flatter surfaces for rangefinder readings; however, they caused as many issues as they helped avoid. Because they lacked rigidity, they did not provide reliable reading surfaces for the rangefinders.

The first attempt at building a 3D model from the Kinect was to compile the images on the Raspberry Pi and send them over a Wi-Fi connection; however, no workable resources were found to access the Kinect from the Raspberry Pi. Two possible resources, OpenKinect [9] and xxorde [15], were tried, but only RGB images/video were accessible. More experimentation with these resources may yield desirable results, and further updates may make installation easier.

The second attempt at building a 3D model was to use the Raspberry Pi as a remote USB hub via VirtualHere [14]. This, however, was unusable for the Kinect, likely because the Kinect's speed requirements were not met [7]. It was possible to use this software with a normal webcam; however, large delays in transmission rendered this approach unusable as well.


Chapter 5

Future Work

5.1 Continued Work with Raspberry Pi – Kinect Resources

Far more time is needed to integrate the Raspberry Pi with the Kinect. Several other projects were able to integrate these two devices, even if they did not send the Kinect data over a Wi-Fi connection. This will continue to be a main goal of this project.

5.2 Improved Test Conditions

Due to the rangefinders' difficulty reading absorbent materials, they may not prove reliable enough for real-world use. Despite this, it may be useful to test them in a better-equipped facility. The same garage may be used as before; however, instead of cardboard walls, a more solid material (such as plywood or drywall) will be used. Doing so will reduce the number of false reads and help tune the PID loop, and will give a better idea of how well the PID loop can run based on the rangefinders.

5.3 Improved Manual Override Circuit

Though the manual override circuit is fully functional, it may be prudent to replace the transistor arrays with a multiplexer/switching IC. This will help ensure


that power consumption stays within proper limits. Doing so will also improve the reliability of the circuit board.

5.4 Further Sensor Integration into PID Loop

Further development of the PID loop could help increase flight stability and response. Currently, only one sensor is used per PID loop segment: the rudder is based on z-axis rotational readings, and all other controls are based on the rangefinders. By integrating acceleration, rotation, and range into all PID loop segments, a more reliable error reading may be obtained.

The challenge in integrating several sensor readings into each PID loop section is deciding how much weight to give each sensor. There may also be challenges in deciding whether one of the sensor readings is a misread, and which other readings indicate the misread.
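One simple way to combine several sensors into a single error term is a weighted average that skips suspected misreads. The sketch below is illustrative only: the weights, the misread threshold, and the sensor names are invented for the example, not values from this project.

```python
def fused_error(readings, targets, weights, max_jump=3.0, last=None):
    """Weighted-average error across sensors, skipping suspected misreads.

    readings/targets/weights: dicts keyed by sensor name.
    A reading that jumped more than max_jump since the last cycle is
    treated as a misread and excluded from the average.
    """
    total, weight_sum = 0.0, 0.0
    for name, value in readings.items():
        if last is not None and abs(value - last.get(name, value)) > max_jump:
            continue  # suspected misread: exclude it from the fused error
        err = targets[name] - value
        total += weights[name] * err
        weight_sum += weights[name]
    return total / weight_sum if weight_sum else 0.0

readings = {'range': 4.0, 'accel': 4.4}
targets = {'range': 5.0, 'accel': 5.0}
weights = {'range': 0.7, 'accel': 0.3}
err = fused_error(readings, targets, weights)
# 0.7 * 1.0 + 0.3 * 0.6 = 0.88, weight_sum = 1.0, so err = 0.88
```

Renormalizing by the surviving weight sum means the error stays on the same scale even when one sensor is excluded, which is one way of letting the other readings "outvote" a misread.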

5.5 Raspberry Pi 2

While working on this project, a new version of the Raspberry Pi was released.

Though the time requirement and cost for moving to the new Raspberry Pi did not make

sense mid-project, it may be a valuable tool in the future. The Raspberry Pi 2 features:

a quad-core 900 MHz processor

1 GB RAM

ARMv7


Moving from a single-core 700 MHz processor to a quad-core means that the new Raspberry Pi will be much faster at multitasking. Having 1 GB of RAM, versus 512 MB, will further increase performance.

Because the Raspberry Pi 2 has moved from ARMv6 to ARMv7, several of the software limitations will no longer exist. Windows and Ubuntu OS distributions are now available for the Raspberry Pi [5].

5.6 GPS

Another exciting avenue for this project is outdoor flight. A GPS sensor may be able to give faster refresh rates than the rangefinders; however, GPS units do not function well indoors and have a much larger possible error than the rangefinders (not accounting for misreads). Despite this, for open outdoor use, a 9-foot error (the maximum expected GPS error) would not be substantial, and having room to move around means that the PID loop response would not have to be fast. A Wi-Fi video feed could still be used (assuming the quadcopter does not travel too far out).

This would provide an affordable outdoor surveillance drone that could be used for a variety of tasks unsuitable for human response, such as bomb threats, armed suspects, and biohazardous incidents. Because this avenue of research would not have much use for 3D imaging, it would still be viable even if the Kinect integration is not.


References

[1] adafruit/Adafruit-Raspberry-Pi-Python-Code.

https://github.com/adafruit/Adafruit-Raspberry-Pi-Python-Code. Feb. 2015.

[2] B. J. Pelc. Yet Another Language Speed Test: Counting Primes (C, C++, Java,

JavaScript, PHP, Python and Ruby).

https://bjpelc.wordpress.com/2015/01/10/yet-another-language-speed-test-

counting-primes-c-c-java-javascript-php-python-and-ruby-2/. Oct. 2015.

[3] Chris Anderson. Quadcopter with Kinect doing altitude hold and obstacle

avoidance. http://diydrones.com/profiles/blogs/quadcopter-with-kinect-doing.

Sep. 2015.

[4] How Kinect depth sensor works - stereo triangulation?.

https://mirror2image.wordpress.com/2010/11/30/how-kinect-works-stereo-

triangulation/. Oct. 2015.

[5] J. Brodkin. Raspberry Pi 2 arrives with quad-core CPU, 1GB RAM, same $35

price. http://arstechnica.com/information-technology/2015/02/raspberry-pi-2-

arrives-with-quad-core-cpu-1gb-ram-same-35-price/. Oct 2015.

[6] K. H. Ang, G. Chong, and Y. Li. PID control system analysis, design, and

technology. IEEE Transactions on Control Systems Technology, vol. 13, no. 4,

pp. 559–576, Jul. 2005.

[7] Kinect Support. https://www.virtualhere.com/node/342. Oct. 2015.

[8] K. J. Astrom, T. Hagglund, and K. J. Astrom, PID controllers, 2nd ed. Research

Triangle Park, N.C: International Society for Measurement and Control, 1995.

[9] OpenKinect. http://openkinect.org/wiki/Main_Page. Feb. 2015.

[10] PID Controller. https://en.wikipedia.org/w/index.php?title=PID_controller&oldid=693402501. Feb. 2015.

[11] PID Controller Simplified. https://radhesh.wordpress.com/2008/05/11/pid-

controller-simplified/. Oct. 2015.

[12] richards-tech/RTIMULib. https://github.com/richards-tech/RTIMULib. Jan. 2015.


[13] SDR Tactical Robots. https://www.sdrtactical.com/Overview.aspx. Jan. 2015.

[14] VirtualHere. http://www.virtualhere.com/oem_faq. Aug. 2015.


Appendix A

QuadResources.py

'''

A Hovering Drone for Real Time Environment Mapping and Reporting,

for Use by Law Enforcement

Edward Eisen

13 November 2015

QuadResources.py: provides resources for reading sensors and
controlling servos using PWM signals.

Sensors Class: provides functions for accessing rangefinders,
accelerometers and gyros.

Servos Class: provides functions for controlling the PWM signal to the
flight board. Also contains interrupts to control manual override
functions.

'''

import sys, getopt

sys.path.append('.')

import os.path

from smbus import SMBus

from Adafruit_PWM_Servo_Driver import PWM

import RTIMU

import time

import traceback

import RPi.GPIO as GPIO

import math

#globals

manLedPin = 17

manSelectPin = 4

manPin = 27

autoPin = 22

SETTINGS_FILE = "RTIMULib"

class Sensors(object):

rangeTimeStamp = {'front': 0, 'left': 0, 'right': 0, 'up': 0,

'down': 0}

rangers = {'front': 0x31, 'left': 0x34, 'right': 0x30, 'up': 0x33,

'down': 0x32}

ranges = {'front': 0, 'left': 0, 'right': 0, 'up': 0, 'down': 0}

gyros = {'x': 0, 'y': 0, 'z': 0}

accel = {'x': 0, 'y': 0, 'z': 0}

def __init__(self):

#i2c init for range finders

self.i2cbus = SMBus(1)


#IMU init for accel, gryo

print("Using settings file " + SETTINGS_FILE + ".ini")

if not os.path.exists(SETTINGS_FILE + ".ini"):

print("Settings file does not exist, will be created")

self.s = RTIMU.Settings(SETTINGS_FILE)

self.imu = RTIMU.RTIMU(self.s)

print("IMU Name: " + self.imu.IMUName())

if(not self.imu.IMUInit()):

print("IMU Init Failed")

return

else:

print("IMU Init Successful")

self.pollInt = self.imu.IMUGetPollInterval()

print("Recommended poll interval: {}".format(self.pollInt))

####Range Methods###################################################

def pingRange(self, direction = 'all'):

if direction == 'all':

for key, val in self.rangers.iteritems():

self.i2cbus.write_byte(val, 0x51)

self.rangeTimeStamp[key] = time.time()

else:

self.i2cbus.write_byte(self.rangers[direction], 0x51)

self.rangeTimeStamp[direction] = time.time()

#returns Range in feet

def getRange(self, direction = 'all', noWait = False):

if direction == 'all':

for key, val in self.rangers.iteritems():

if time.time() - self.rangeTimeStamp[key] < 0.1:

#has not waited long enough for range calc

if(not noWait):

waitTime = 0.1 - (time.time() -

self.rangeTimeStamp[key])

if(waitTime > 0):
time.sleep(waitTime)
self.ranges[key] = self.pullRange(key)

else:

self.ranges[key] = self.pullRange(key)

return self.ranges

else:

if time.time() - self.rangeTimeStamp[direction] < 0.1:

#has not waited long enough for range calc

if noWait:

return self.ranges[direction] #will return current value

else: # (address pull may create error)

time.sleep(0.1 - (time.time()

- self.rangeTimeStamp[direction]))

return self.pullRange(direction)

def pullRange(self, key):


val = self.i2cbus.read_word_data(self.rangers[key], 0xe1)#get value

val = ((val >> 8) | ((val & 0xFF) << 8)) * 0.0328

#flip bytes and convert

return val

# m->ft

####RTIMU Methods#####################################################

#read IMU values into containers

def read(self):

if(self.imu.IMURead()):

imuData = self.imu.getIMUData()

self.gyros['x'], self.gyros['y'], self.gyros['z'] = \

self.imu.getMeasuredPose()

for key in self.gyros:

self.gyros[key] = self.gyros[key] * 180.0 / math.pi #rad->deg

self.gyros[key] = int(self.gyros[key] * 10)/10.0 #round to .1

if(self.gyros[key] < 0):

self.gyros[key] = self.gyros[key] + 360

#conv to 360 (+ deg)

self.accel['x'], self.accel['y'], self.accel['z'] = \

self.imu.getAccelResiduals()

return True

else:

#print('error reading IMU')

return False

def getGyros(self, direction = "all"):

if(direction == "all"):

return self.gyros

else:

return self.gyros[direction]

def getAccel(self, direction = "all"):

if(direction == "all"):

return self.accel

else:

return self.accel[direction]

#returns wait time (in seconds) to poll sensors

def getPollInterval(self):

return self.pollInt * 1.0/1000.0

#which way is x, y and z?????

#is pressure a reliable means of measuring height????

class Servos(object):

servos = {'ail': 7, 'elv': 6, 'thr': 5, 'rud': 4, 'aux': 8}

servoVals = {'ail': 0, 'elv': 0, 'thr': 0, 'rud': 0, 'aux': 0}


manOverrideInitialized = False

ManFlag = 0

def __init__(self):

# Initialise the PWM device using the default address

# Note if you'd like more debug output you can instead run:

# pwm = PWM(0x40, debug=True)

self.pwm = PWM(0x40)

self.pwm.setPWMFreq(50)

print('PWM controller Initialized')

self.inManOverride = False

def set(self, servo, percent):

servoMin = 250 # Min pulse length out of 4096

servoMax = 438 # Max pulse length out of 4096

servoZero = 344

servoOffSet = 94

enteredPercent = percent

#percent is different for thr (and aux?) and others

if(servo == 'aux' or servo == 'thr'):

#translate from 0-100 to -100-100

if(percent > 100 or percent < 0):

print('Error: {}% must be between 0 and 100 ({} entered)'

.format(servo, percent))

return False

percent = percent - 50

percent = percent * 2

if(percent < -100 or percent > 100):

print('Error: {}% must be between -100 and 100 ({}

entered)'.format(

servo, percent))

return False

if(percent < 0):

percent = 0 - percent

val = servoZero - int(servoOffSet * percent / 100.0)

self.pwm.setPWM(self.servos[servo], 0, val)

self.servoVals[servo] = enteredPercent

else:

val = servoZero + int(servoOffSet * percent / 100.0)

self.pwm.setPWM(self.servos[servo], 0, val)

self.servoVals[servo] = enteredPercent

return True

def zeroOut(self):

self.set('ail',0)

self.set('elv',0)

self.set('thr',0)

self.set('rud',0)

self.set('aux',0)


def get(self, servo = 'all'):

if(servo == 'all'):

return self.servoVals

else:

return self.servoVals[servo]

def setControl(self, event): #callback for override interrupts

if(GPIO.input(manSelectPin)):

GPIO.output(manLedPin, GPIO.HIGH)

GPIO.output(manPin, GPIO.HIGH)

GPIO.output(autoPin, GPIO.LOW)

self.inManOverride = True

print("Manual Override Engaged") #for debug

else:

GPIO.output(manLedPin, GPIO.LOW)

GPIO.output(manPin, GPIO.LOW)

GPIO.output(autoPin, GPIO.HIGH)

self.inManOverride = False

print("Manual Override Disengaged") #for debug

def manOverrideInit(self):

if(self.manOverrideInitialized): #only init once

print("Manual Override Already Initialized")

return

#setup GPIO pins

GPIO.setmode(GPIO.BCM)

GPIO.setup(manSelectPin, GPIO.IN, pull_up_down = GPIO.PUD_DOWN)

GPIO.setup(manLedPin, GPIO.OUT)

GPIO.setup(manPin, GPIO.OUT)

GPIO.setup(autoPin, GPIO.OUT)

#check override states

if(GPIO.input(manSelectPin)):

GPIO.output(manLedPin, GPIO.HIGH)

GPIO.output(manPin, GPIO.HIGH)

GPIO.output(autoPin, GPIO.LOW)

self.inManOverride = True

print("Manual Override Engaged") #for debug

else:

GPIO.output(manLedPin, GPIO.LOW)

GPIO.output(manPin, GPIO.LOW)

GPIO.output(autoPin, GPIO.HIGH)

self.inManOverride = False

print("Manual Override Disengaged") #for debug

#set up override interrupts

GPIO.add_event_detect(manSelectPin, GPIO.BOTH,

callback=self.setControl)

self.manOverrideInitialized = True

print("Manual Override Interrupt Initialized")


def inManOverrideState(self):

return self.inManOverride


Appendix B

AutoPilot.py

'''

A Hovering Drone for Real Time Environment Mapping and Reporting, for

Use by Law Enforcement

Edward Eisen

13 November 2015

AutoPilot.py: controls flight of quadcopter.

Vars class: Used as a container for variables to be passed between

threads

PID(): Runs through PID loop to control quadcopter and correct
position/orientation error (difference between current and
target)

flightControl(): Takes sensor readings and sets target
position/orientation.

'''

from QuadResources import Sensors

from QuadResources import Servos

import time

import thread

import RPi.GPIO as GPIO

#globals#################################################################

rangeLock = thread.allocate_lock() #for signaling new range readings

rangeLock.acquire()

sensorLock = thread.allocate_lock() #for signaling new sensor readings

sensorLock.acquire()

exitLock = thread.allocate_lock() #for starting landing/shutdown

exitLock.acquire()

PIDLock = thread.allocate_lock() #for stopping PID loop

PIDLock.acquire()

elvLock = thread.allocate_lock()

#for stopping elvPID if forward wall out of range

rightLock = thread.allocate_lock() #if locked, ailPID dependent on right

range

rightLock.acquire() #else, depends on left range

ailLock = thread.allocate_lock()

#for stopping ailPID if no side walls in range

#GPIO.cleanup() ########################################################

sensors = Sensors()

servos = Servos()


servos.zeroOut()

servos.manOverrideInit()

class Vars(object):

def __init__(self):

self.gyros = {'x': 0, 'y': 0, 'z': 0}

self.gyroTargets = {'x': 0, 'y': 0, 'z': 0}

#accels

self.ranges = {'front': 0, 'left': 0, 'right': 0, 'up': 0,

'down': 0}

self.rangeTargets = {'front': 10, 'left': 5, 'right': 5, 'up': 25,

'down': 0}

self.gyrosLock = thread.allocate_lock()

self.gyroTLock = thread.allocate_lock()

self.rangesLock = thread.allocate_lock()

self.rangeTLock = thread.allocate_lock()

def setGyros(self, gyros):

self.gyrosLock.acquire()

self.gyros = gyros

self.gyrosLock.release()

def getGyros(self):

self.gyrosLock.acquire()

temp = self.gyros

self.gyrosLock.release()

return temp

def setGyroTargets(self, gyroTargets):

self.gyroTLock.acquire()

self.gyroTargets = gyroTargets

self.gyroTLock.release()

def getGyroTargets(self):

self.gyroTLock.acquire()

temp = self.gyroTargets

self.gyroTLock.release()

return temp

def setRanges(self, ranges):

self.rangesLock.acquire()

self.ranges = ranges

self.rangesLock.release()

def getRanges(self):

self.rangesLock.acquire()

temp = self.ranges

self.rangesLock.release()

return temp


def setRangeTargets(self, rangeTargets):

self.rangeTLock.acquire()

self.rangeTargets = rangeTargets

self.rangeTLock.release()

def getRangeTargets(self):

self.rangeTLock.acquire()

temp = self.rangeTargets

self.rangeTLock.release()

return temp

vars = Vars()

'''

general PID loop alg:

get err and last err

calc P = Kp * err

limit P

calc I = lastI + ( err * ki)

limit I

calc D = (err - last err) * kd

limit D

setVal = P + I + D

limit setVal

submit setVal

'''

def PID():

gotSensors = False

gotRanges = False

ailEr = 0

ailI = 0

elvEr = 0

elvI = -10

thrEr = 0

thrI = 0 #set starting setpoint at 40%, 0 is no thrust

rudEr = 0

rudI = 0

while(PIDLock.locked()):

if(servos.inManOverrideState()):

manOverride()

if(sensorLock.acquire(False)):

gotSensors = True

gyros = vars.getGyros()

gyroTargets = vars.getGyroTargets()

if(rangeLock.acquire(False)):

gotRanges = True

ranges = vars.getRanges()

rangeTargets = vars.getRangeTargets()

#ailPID

if(gotRanges and not ailLock.locked()):

if(ranges['right'] < 20 and rightLock.locked()):

#likely error if > 20


ailErLast = ailEr

ailEr = rangeTargets['right'] - ranges['right']

ailP = ailEr * 10

ailI = ailI + (ailEr * 0.1)

ailD = (ailEr - ailErLast) * 2

ailVal = ailP + ailI + ailD

if(ailVal > 20):

ailVal = 20

if(ailVal < -20):

ailVal = -20

servos.set('ail', ailVal)

elif(ranges['left'] < 20 and not rightLock.locked()):

#likely error if > 20

ailErLast = ailEr

ailEr = ranges['left'] - rangeTargets['left'] #neg

ailP = ailEr * 10

ailI = ailI + (ailEr * 0.1)

if(ailI > 15):
ailI = 15
if(ailI < -15):
ailI = -15

ailD = (ailEr - ailErLast) * 2

ailVal = ailP + ailI + ailD

if(ailVal > 20):

ailVal = 20

if(ailVal < -20):
ailVal = -20
servos.set('ail', ailVal)

#elvPID

if(gotRanges and not elvLock.locked()):

if(ranges['front'] < 20): #likely error

elvErLast = elvEr

elvEr = rangeTargets['front'] - ranges['front']

elvP = elvEr * 10

elvI = elvI + (elvEr * 0.1)

if(elvI > 15):

elvI = 15

if(elvI < -15):

elvI = -15

elvD = (elvEr - elvErLast) * 2

elvVal = elvP + elvI + elvD

if(elvVal > 20):

elvVal = 20

if(elvVal < -20):

elvVal = -20

servos.set('elv', elvVal)

#print( rangeTargets['front'],ranges['front'],elvVal)

#thrPID

if(gotRanges):

thrErLast = thrEr

thrEr = rangeTargets['down'] - ranges['down']

thrP = thrEr * 10

thrI = thrI + (thrEr * 0.005)


if(thrI > 90):

thrI = 90

if(thrI < 30):

thrI = 30

thrD = (thrEr - thrErLast) * 2

thrVal = thrP + thrI + thrD

if(thrVal > 100):

thrVal = 100

if(thrVal < 10):

thrVal = 10

servos.set('thr', thrVal)

#rudPID

if(gotSensors):

rudErLast = rudEr

rudEr = gyroTargets['z'] - gyros['z']

if(rudEr > 180):

rudEr = 0 - (360 - gyroTargets['z'] + gyros['z'])

if(rudEr < -180):

rudEr = 360 - gyros['z'] + gyroTargets['z']

rudEr = rudEr/180.0 * 100

#round towards 0. ignore < 1 deg of err

rudP = rudEr * 2

rudI = rudI + (rudEr * 0.01)

if(rudI > 40):

rudI = 40

if(rudI < -40):

rudI = -40

rudD = (rudEr - rudErLast) * 1

rudVal = rudP + rudI + rudD

rudVal = 0 - rudVal #turning right is neg, left is pos.

if(rudVal > 60):

rudVal = 60

if(rudVal < -60):

rudVal = -60

servos.set('rud', rudVal)

gotSensors = False

gotRanges = False

def flightControl():
    #set targets for testing
    gyros = vars.getGyros()
    gyroTargets = gyros.copy()
    gyroTargets['z'] = 92
    vars.setGyroTargets(gyroTargets)
    ranges = vars.getRanges()
    rangeTargets = ranges.copy()
    rangeTargets['down'] = 3 #ft
    rangeTargets['right'] = 5 #ft
    rangeTargets['front'] = 5 #ft
    vars.setRangeTargets(rangeTargets)
    while(exitLock.locked()):
        time.sleep(4)
        gyroTargets['z'] = gyroTargets['z'] #heading target held constant for testing
        if(gyroTargets['z'] < 0):
            gyroTargets['z'] = gyroTargets['z'] + 360
        vars.setGyroTargets(gyroTargets)

def getRanges():
    while(True):
        sensors.pingRange('all')
        vars.setRanges(sensors.getRange('all'))
        #before next if because it can delay
        if(rangeLock.locked()):
            rangeLock.release()

def getSensors():
    while(True):
        sensors.read()
        vars.setGyros(sensors.getGyros('all'))
        if(sensorLock.locked()):
            sensorLock.release()
        time.sleep(sensors.getPollInterval())

def manOverride():
    while(servos.inManOverrideState()):
        pass

def startupCopter(): #arm copter to start flight
    print("Arming")
    servos.set('rud', -100)
    time.sleep(2)
    servos.set('rud', 0)
    print("Armed")

def shutdownCopter(): #disarm copter to end flight
    print("Disarming")
    servos.set('thr', 0)
    servos.set('rud', 100)
    time.sleep(2)
    servos.set('rud', 0)
    print("Disarmed")

def landCopter():
    print("landing")
    PIDLock.release() #until thr is used
    return #until thr is used; code below is currently unreachable
    gyros = vars.getGyros()
    gyroTargets = gyros.copy()
    vars.setGyroTargets(gyroTargets)
    ranges = vars.getRanges()
    rangeTargets = ranges.copy()
    rangeTargets['down'] = 1 #ft
    vars.setRangeTargets(rangeTargets)
    if(elvLock.locked()):
        servos.set('elv', 0)
    while(vars.getRanges()['down'] > 1.1):
        if(vars.getRanges()['front'] < 24):
            rangeTargets['front'] = vars.getRanges()['front']
            vars.setRangeTargets(rangeTargets)
        time.sleep(0.5)
    PIDLock.release()
    throttle = servos.get('thr')
    servos.set('ail', 0)
    servos.set('elv', 0)
    servos.set('rud', 0)
    while(throttle > 0):
        throttle = throttle - 1
        servos.set('thr', throttle)
        time.sleep(0.1)
    print("landed")

def exitSig():
    while(True):
        print("waiting for quit signal (q key)")
        sig = raw_input("")
        if(sig == 'q'):
            print("exit signal received")
            break
        elif(sig == 'r'):
            print("restarting PID")
            PIDLock.release()
            time.sleep(0.01)
            PIDLock.acquire()
            thread.start_new_thread(PID, ())
        else:
            print("non-exit signal received: {}".format(sig))
    exitLock.release()

#main####################################################################
startupCopter()
thread.start_new_thread(exitSig, ())
print("getting initial sensor readings")
thread.start_new_thread(getRanges, ())
thread.start_new_thread(getSensors, ())
time.sleep(1)
print("starting flight algorithms")
thread.start_new_thread(flightControl, ())
time.sleep(1)
thread.start_new_thread(PID, ())
print("Flight controls engaged")
exitLock.acquire() #waiting for exit signal
exitLock.release()
#landCopter()
time.sleep(0.1)
shutdownCopter()
GPIO.cleanup()
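Each control channel above repeats the same pattern: a PID update whose integral term is clamped (anti-windup) and whose output is saturated, plus, for the rudder, a shortest-path angular error around the 0/360 degree wrap. The pattern can be distilled into a small standalone sketch; the helper names (`wrap_error`, `clamp`, `pid_step`) are illustrative and not part of the thesis code:

```python
def wrap_error(target, current):
    """Shortest signed angular error in degrees, in [-180, 180).

    Equivalent to the two wrap-around corrections in the rudPID block.
    """
    return (target - current + 180.0) % 360.0 - 180.0

def clamp(x, lo, hi):
    """Saturate x to the closed interval [lo, hi]."""
    return max(lo, min(hi, x))

def pid_step(err, err_last, integral, kp, ki, kd, i_limit, out_limit):
    """One PID update mirroring the ail/elv/thr/rud blocks above:
    accumulate the integral, clamp it (anti-windup), then clamp
    the summed output before it reaches the servo."""
    integral = clamp(integral + err * ki, -i_limit, i_limit)
    p = err * kp
    d = (err - err_last) * kd
    out = clamp(p + integral + d, -out_limit, out_limit)
    return out, integral
```

With the aileron gains from the listing (kp=10, ki=0.1, kd=2, integral limit 15, output limit 20), a large error saturates the output at 20 while the integral stops growing at its own bound, which is what keeps the loop from winding up while the drone is far from its target distance.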