Behaviors & Sensors


Page 1: Behaviors & Sensors

Behaviors & Sensors

Robotics Seminar CSI445/660, Spring 2006. Robert Salkin, University at Albany, SUNY

Page 2: Behaviors & Sensors

Homework Issues

It occurred to me on the way in this morning that you can’t hand your memsticks in today and still test your projects on the AIBOs this week before the homeworks are graded, so…

All of you have been (or will be) sent a Gmail invite: over 2 GB of space, webmail or POP3.

Please hand your code/documentation in via email to [email protected] using a .tar file from now on

HW1 will be graded during Thursday office hours. How do you want grades distributed? Entirely up to you… on the course website?

Page 3: Behaviors & Sensors

tar

tar = Tape ARchiver. Written to archive files to tape; now used to convert multiple files into one .tar file for storage or downloading/emailing. Preserves directory structure.

You should submit exactly one .tar file (or .tar.gz if it is very large) named hw<homework#>_<memstick#>.tar:
hw2_5.tar //homework 2, memstick 5
hw3_10.tar //homework 3, memstick 10
etc…

Page 4: Behaviors & Sensors

tar options

tar [options] files_to_tar
c – create a new .tar
x – extract from a .tar
f file – use file instead of stdin/stdout
v – verbose output (scrolls filenames)
z – gzip (cz) or gunzip (xz); use lowercase z, not uppercase Z; used only for .tar.gz or .tgz files
t – table of contents (scrolls filenames)

Page 5: Behaviors & Sensors

tar example

Copy your homework files to a directory with the same naming convention as the .tar file, tar it, then email the .tar file:
mkdir hw2_7
cp myfiles hw2_7
tar cvf hw2_7.tar hw2_7
Email hw2_7.tar to [email protected]

Page 6: Behaviors & Sensors

What is a Behavior?

One way to define a behavior: a behavior is how a robot creates action from perception, as shown in the diagram below:

[Figure 1: External World → Sensors → Perception → Cognition (Behavior) → Action → Actuators → External World]

( http://www-2.cs.cmu.edu/~robosoccer/cmrobobits/lectures/Introduction.ppt )

Page 7: Behaviors & Sensors

Sensors

Sensors, along with the Event model, can be thought of as the perception portion of the earlier diagram. We need to understand what “senses” are needed for a given task and how to manipulate them to allow behaviors to make good decisions.

This complete perception is abstracted for us by Tekkotsu. We simply “subscribe” to a certain (sensor) event and wait:

erouter->addListener(this, EventBase::sensorEGID);

Values can be accessed through the event (vision) or from the “WorldState”.

Page 8: Behaviors & Sensors

Events

So, information from the world (perception) is received through the sensors, and Tekkotsu notifies the behavior (cognition) of what was received through the use of Events.

Each event has the following schema (EventBase):

generatorID: the specific event generator that caused the event, e.g. buttonEGID, sensorEGID, visObjEGID, etc.

sourceID: a generator-specified integer giving “detail” on what caused the event, e.g. the specific button for buttonEGID, the specific object for visObjEGID, etc.

typeID: “used to denote whether it's the first in a sequence (button down), in a sequence (button still down), or last (button up)”
activateETID – start of an event sequence, e.g. button down
statusETID – indicates a value has changed, e.g. new sensor readings
deactivateETID – last of a series of events, e.g. button up
numETIDs – the number of different event types
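As a minimal sketch of how a handler might dispatch on these three fields (assuming the EventBase accessors getGeneratorID(), getSourceID(), and getTypeID(); the class name is the TutorialBehavior shown on a later slide):

void TutorialBehavior::processEvent(const EventBase& event) {
  // generatorID: which generator fired (buttonEGID, sensorEGID, ...)
  if (event.getGeneratorID() == EventBase::buttonEGID) {
    // typeID: where we are in the press/hold/release sequence
    if (event.getTypeID() == EventBase::activateETID) {
      // sourceID: which button just went down
      std::cout << "button " << event.getSourceID() << " down" << std::endl;
    }
  } else if (event.getGeneratorID() == EventBase::sensorEGID) {
    // statusETID here means a fresh batch of sensor readings arrived
  }
}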

Page 9: Behaviors & Sensors

More Events

EventBase allows for a few more optional parameters. Take a look at the API for a complete list of generators, sources, and type IDs supported in Tekkotsu.

We can omit some of those three basic parameters to catch a more general event. All three will be used when we catch and handle the event.

When we register for an event, we need to specify a listener (our behavior) and an EventBase object (or set of parameters). We’ll see a good example of this in a few slides:

erouter->addListener(el, generatorid, sourceid, typeid);
erouter->addListener(this, EventBase::buttonEGID, BackBut, EventBase::activateETID);

Specifying only what we need is a good way to keep the behaviors as simple (and clean) as possible.

Page 10: Behaviors & Sensors

World State

Tekkotsu provides an easy way to access information about the Aibo and its immediate environment.

WorldState class, implemented as a global pointer called “state”. Contains sensor readings, current joint positions, etc. This is a shared memory region between MainObj, MotoObj, and possibly others in the future.

Usage:

Static Members
WorldState::g – gravity constant on Earth
WorldState::ERS210Mask, WorldState::ERS7Mask – bit masks used for model testing (What model am I?)
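A quick sketch of model testing (this assumes WorldState's robotDesign member holds the model bits, which is my reading of the API rather than something stated on the slide):

// Which Aibo model are we running on?
if (state->robotDesign & WorldState::ERS7Mask) {
  // ERS-7-specific setup
} else if (state->robotDesign & WorldState::ERS210Mask) {
  // ERS-210-specific setup
}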

Page 11: Behaviors & Sensors

World State

Other Public Members (from the Tekkotsu API):

float outputs[NumOutputs] – last sensed positions of joints; for ears (or other future "boolean joints"), x < .5 is up, x >= .5 is down; indexes (aka offsets) are defined in the target model's namespace (e.g. ERS210Info)

float buttons[NumButtons] – magnitude is pressure for some, 0/1 for others; indexes are defined in the ButtonOffset_t of the target model's namespace (e.g. ERS210Info::ButtonOffset_t)

float sensors[NumSensors] – IR, Accel, Thermo, Power stuff; indexes are defined in the SensorOffset_t of the target model's namespace (e.g. ERS210Info::SensorOffset_t)

float vel_x, vel_y, vel_a – current body velocities

Page 12: Behaviors & Sensors

World State example

To access the correct values from WorldState’s arrays, we need to look in the respective ModelInfo namespace to find which constant has been enumerated for our needs.

To get the current IR distance reading (on the ERS-7):
state->sensors[NearIRDistOffset];

Retrieving info about a paw button:
state->buttons[LFrPawOffset]
will return 1 if the left front paw button is currently depressed, 0 otherwise. The ERS-7 will give a noisy, pressure-sensitive float value representing how hard the button is being pressed.

Current forward velocity: state->vel_x (in approx. cm/s)

Joint values – current “LF shoulder joint”:
state->outputs[LFrLegOffset+0]
Recall that there are 3 DOF in each leg; more on motion next week.
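Collecting the slide's accesses into one sketch (assuming we are on an ERS-7 and pull the offsets in from the ERS7Info namespace):

using namespace ERS7Info;  // assumption: targeting the ERS-7 model
float irDist   = state->sensors[NearIRDistOffset];  // current IR range
float paw      = state->buttons[LFrPawOffset];      // left front paw button
float forward  = state->vel_x;                      // approx. cm/s
float shoulder = state->outputs[LFrLegOffset+0];    // LF shoulder joint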

Page 13: Behaviors & Sensors

Global Pointers

A tangent here for a moment to lay out a few necessary global pointers.

state – WorldState; handles the previously seen sensor and world-type tasks.

erouter – Event Router; handles the posting, receiving, and (un)registering of events.
erouter->addTimer(this, 500, false)

motman – Motion Manager; allows correspondence between behaviors and the MotoObj.
motman->addMotion(SharedObject<WalkMC>());

sndman – Sound Manager; plays/buffers sound files (16 bit / 16 kHz / WAV).
sndman->playFile("fart.wav");
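A small sketch tying erouter and sndman together inside a behavior class (the timer id, 1000 ms period, and sound file name are arbitrary examples; this assumes the four-argument addTimer(listener, sourceID, delay_ms, repeat) form):

virtual void DoStart() {
  BehaviorBase::DoStart();
  erouter->addTimer(this, 0, 1000, true);   // timer 0 fires every 1000 ms
}
virtual void processEvent(const EventBase& event) {
  if (event.getGeneratorID() == EventBase::timerEGID)
    sndman->playFile("bark.wav");           // hypothetical file on the memstick
}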

The coming example shows these in action

Page 14: Behaviors & Sensors

What is a Behavior?

Now that we have a handle on sensing, let’s apply our knowledge to a Behavior.

A Behavior is the “main” function we would use in application programming. We write one behavior to accomplish one task, such as chasing a ball or navigating a room. Keep in mind that we are using an event-based model.

In Tekkotsu, a behavior is also a C++ class inherited from class BehaviorBase. Rusty on C++ polymorphism or OOP?

http://www-2.cs.cmu.edu/~tekkotsu/CPPReview.html

Page 15: Behaviors & Sensors

Behaviors

Each non-trivial Tekkotsu behavior needs to have the following class structure:

class TutorialBehavior : public BehaviorBase {
public:
  TutorialBehavior();                    //the constructor
  ~TutorialBehavior();                   //the destructor
  void DoStart();                        //init
  void DoStop();                         //de-init
  void processEvent(const EventBase &);  //handle caught events
  std::string getName() const;           //return a string to display in TekkotsuMon
}; //end behavior

Let’s step through a very simple sensor/behavior example [ WalkForwardAndStop (Tutorial Behavior ) ]
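Here is a hedged reconstruction of what such a behavior might look like, assembled from the pieces on these slides (not the official tutorial code; the velocity, the 200 threshold, the ERS7Info offset, and the include paths are illustrative assumptions):

// Include paths approximate for the 2006-era Tekkotsu tree.
#include "Behaviors/BehaviorBase.h"
#include "Events/EventRouter.h"
#include "Motion/MotionManager.h"
#include "Motion/WalkMC.h"
#include "Shared/WorldState.h"
#include "Shared/ERS7Info.h"

// Walk forward until the near IR sensor reports something close, then stop.
class WalkForwardAndStop : public BehaviorBase {
public:
  WalkForwardAndStop() : BehaviorBase(), walk_id(MotionManager::invalid_MC_ID) {}

  virtual void DoStart() {
    BehaviorBase::DoStart();
    SharedObject<WalkMC> walk;
    walk->setTargetVelocity(100, 0, 0);  // walk forward (value is illustrative)
    walk_id = motman->addMotion(walk);   // hand the walk over to MotoObj
    erouter->addListener(this, EventBase::sensorEGID); // wake on new readings
  }

  virtual void DoStop() {
    motman->removeMotion(walk_id);       // stop walking
    erouter->removeListener(this);       // unregister from all events
    BehaviorBase::DoStop();
  }

  virtual void processEvent(const EventBase&) {
    // stop once the IR range drops below a hypothetical threshold
    if (state->sensors[ERS7Info::NearIRDistOffset] < 200)
      DoStop();
  }

  virtual std::string getName() const { return "WalkForwardAndStop"; }

protected:
  MotionManager::MC_ID walk_id;          // id of our walk motion command
};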

Page 16: Behaviors & Sensors

Behavior Design

Two schools of thought on this [1]:
1. The subsumption architecture [2], where behaviors are built by successive layers of fairly reactive control.
2. A state machine, where behaviors are built by stringing together a series of states.
Either one could represent a reflex agent. What kinds of behaviors seem best fit for each?
Tekkotsu provides wonderful support for both through its Hierarchical State Machine architecture (future lecture).

Page 17: Behaviors & Sensors

Quick FSM vs. Subsumption

In a Finite State Machine, programs are broken down into modular states, and program control moves between states via some condition, called a transition.

A Subsumption Architecture builds a system by layering levels of control, allowing lower levels to override the higher ones. This creates control precedence.

Page 18: Behaviors & Sensors

FSM vs. Subsumption [2]

[Figure: the ExploreMachine FSM – a “walk” state and a “turn” state, with an “Obstacle Detected” transition from walk to turn and a “No Obstacle” transition back to walk]

[Figure: Brooks’ subsumption control system [2]]
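As a minimal sketch of the ExploreMachine above (the state and function names are mine, not from the lecture):

// Two-state explore FSM: walk until an obstacle appears, turn until it clears.
enum ExploreState { WALK, TURN };

ExploreState step(ExploreState s, bool obstacleDetected) {
  if (s == WALK && obstacleDetected)  return TURN;  // "Obstacle Detected"
  if (s == TURN && !obstacleDetected) return WALK;  // "No Obstacle"
  return s;                                         // no transition
}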

Page 19: Behaviors & Sensors

More sensors

Sensor Types [3]:
1. Passive sensors – capture the environment as is. e.g. vision camera, temperature sensor
2. Active sensors – emit energy and capture the result. Sonar: sends out sound, receives the echo, and measures distance from the time difference. IR: sends out infrared energy and measures the amount returned.

Sensor Noise. Def: the returned value from a sensor is cluttered with data completely unrepresentative of the real-world object it’s perceiving, OR some real-world object is interfering with the expected value of a given sensor.

Page 20: Behaviors & Sensors

Handling Noise

Either active or passive sensor types can wind up being noisy in a real application.

Tekkotsu handles frame-rate-level noise (redundant data, etc.), but we still may need some tools if the data just plain seems inconsistent.

Assume we get regular IR values returned from Tekkotsu. How could these still be termed “noisy”? How do we identify the noise?

We could handle the noise with any of the following [3]:
1. Thresholding
2. Hysteresis
3. Averaging
4. Learning – not covered here

Page 21: Behaviors & Sensors

Handling Noise

Let’s focus on the more straightforward methods for now.

1. Single-value Thresholding [3]. Using a single threshold returns two values and leads into a transition between two states. ex. if (noisy_sensor < 350) STOP. Is this a good idea with noisy sensors? Not really. Thresholding noisy data on a single value may lead to behavior-switching oscillation.

2. Hysteresis [3] – allows for a little more certainty in switching between states while still using a threshold. Allow some overlap in the values between states to gain this “certainty”; some latency is a natural result.

3. Exponential averaging [3] – combines multiple sensor readings; decisions are then made by thresholding on the computed average:
avg_n = (1 - α) · avg_(n-1) + α · new_sensor_val    (0 < α ≤ 1, the smoothing factor)
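A hedged sketch of hysteresis layered on an exponential average (the smoothing factor and both thresholds are hypothetical, and onNewReading is my name for wherever new values arrive, e.g. a sensorEGID handler):

const float alpha    = 0.2f;   // smoothing factor, 0 < alpha <= 1
const float stopNear = 300.0f; // switch to STOP below this average
const float goFar    = 400.0f; // resume WALK above this average
float avg     = 500.0f;        // running exponential average
bool  stopped = false;

void onNewReading(float new_sensor_val) {
  avg = (1 - alpha) * avg + alpha * new_sensor_val;  // exponential average
  if (!stopped && avg < stopNear)
    stopped = true;    // obstacle close enough: stop
  else if (stopped && avg > goFar)
    stopped = false;   // clearly clear again: walk
  // the overlap between stopNear and goFar prevents oscillation
}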

Page 22: Behaviors & Sensors

Choosing a Sensor

When designing the behavior, be aware of the type of perception that’s required for the task. Active vs. passive? IR vs. camera? When could a sensor choice give us trouble?

Environment consistency – will we have it?

Is the sensor noisy, or just a bad choice? In terms of a RoboCup system, what’s a good choice to detect whether a given Aibo has “control” over the ball for dribbling?

Page 23: Behaviors & Sensors

Summary [3]

Behaviors provide the cognitive component in a robotics system.
Behaviors can be state machines.
The AIBO robot used in the class has a specific set of sensors.
Sensor readings are input to the “robot program” (behavior).
Tekkotsu provides easy access to the perceptive portion of the system through its event mechanism and shared memory.
Sensors have uncertainty that needs to be addressed with a technique such as hysteresis.
Understanding how Tekkotsu handles Behaviors is central for programming success from here on out.
Always test your assumptions about returned sensor data before simply assuming noise or a programming flaw.

Page 24: Behaviors & Sensors

Homework

See HW2.pdf

Page 25: Behaviors & Sensors

References

Special thanks to CMU and all the Tekkotsu developers for valuable input into these topics.

[1] Ethan Tira-Thompson, “Tekkotsu: A Rapid Development Framework for Robotics,” Master’s Thesis, Carnegie Mellon University, 2004. http://www-2.cs.cmu.edu/~tekkotsu/media/thesis_ejt.pdf

[2] R. Brooks, “A robust layered control system for a mobile robot,” IEEE Journal of Robotics and Automation, vol. RA-2, no. 1, pp. 14–23, Mar. 1986.

[3] Robobits Lecture 2, http://www-2.cs.cmu.edu/~robosoccer/cmrobobits/lectures/Sensors.pdf

[4] Shawn Turner