
Tennessee State University
College of Engineering

ENGINEERING RESEARCH INSTITUTE (ERI)
Interdisciplinary Research in Robotics

Intelligent Tactical Mobility Research Laboratory (ITMRL)
Intelligent Control Systems Laboratory (ICS)
Center for Neural Engineering (CNE)
Computer and Information Systems Laboratory (CISE)

Mohan J. Malkani, Ph.D. (Director)
(615) 963-5400  Fax: (615) 963-5397
[email protected]

Research Projects in Robotics: Past and Present

Tele-Robotics jointly with Caltech, funded by NSF (1997-2000)

Originally funded by US Army TACOM, Warren, MI, under two research grant contracts:

1. Development of an Integrated High-level Mobility Controller for Virtual Tandem Robotic Vehicles, DAAE07-98-C-0029, (1997-2000)

2. Deliberative, Reactive, and Adaptive Task Planning of Intelligent Cooperative Mobile Robots, DAAE07-01-C-L-065, (2001-2002)

Embodiment of intelligent behaviors on mobile robots using fuzzy-genetic algorithms, funded by NASA/Ames Research Center (2000-2004)

Funded by DARPA through Penn State Applied Research, “Sensor Surveillance,” under MURI-ESP Research Project, DAAD19-01-1-0504 (2002-2003)

Funded by NASA/JPL, FAR Investigator Program, “Visual Telerobotic Task Planning of Cooperative Mobile Robots” (2003-2006)

Development of advanced control schemes that enable effective and efficient tactical team cooperation of intelligent autonomous robots.

Test and evaluate the performance of advanced control schemes experimentally under different operational conditions and sensory data modalities, using high-fidelity computer-generated simulation and physical robotic test beds.

Technical Competency Areas Included:

Behavior-based Distributed Control of Cooperative Mobile Robots.

Sensory data and image processing and fusion for fault-tolerant control of intelligent robots.

Advanced control schemes based on Soft Computing techniques (Neural Networks, Fuzzy Logic, Genetic Algorithms, …).

High-fidelity world perception modeling of robotic systems.

Man-machine interface development for Visual Teleoperation and Telerobotic control of Cooperative Robots.
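A toy illustration (not the laboratories' actual controller) of what behavior-based distributed control of cooperative mobile robots can look like in code: every robot runs the same purely local rule set, with no central coordinator. The gains, spacing, and positions below are made-up placeholders.

    import math

    def formation_step(own_pos, neighbor_positions, desired_spacing=1.5,
                       cohesion_gain=0.3, separation_gain=0.8):
        # Each robot computes its own velocity from purely local information:
        # move toward the neighbor centroid, but push away from anyone too close.
        vx = vy = 0.0
        cx = sum(p[0] for p in neighbor_positions) / len(neighbor_positions)
        cy = sum(p[1] for p in neighbor_positions) / len(neighbor_positions)
        vx += cohesion_gain * (cx - own_pos[0])
        vy += cohesion_gain * (cy - own_pos[1])
        for nx, ny in neighbor_positions:
            d = math.hypot(nx - own_pos[0], ny - own_pos[1])
            if 0.0 < d < desired_spacing:
                vx -= separation_gain * (nx - own_pos[0]) / d
                vy -= separation_gain * (ny - own_pos[1]) / d
        return vx, vy

    # Every robot applies the same rule with only its neighbors' positions.
    robots = [(0.0, 0.0), (1.0, 0.2), (0.5, 1.0)]
    for i, pos in enumerate(robots):
        neighbors = [p for j, p in enumerate(robots) if j != i]
        print("robot", i, "velocity:", formation_step(pos, neighbors))

Because each robot uses only its neighbors' positions, the same function can run on every platform in the team, which is the essence of the distributed, behavior-based approach.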

Research Focus Areas

Developed various behavior-based schemes for intelligent deliberative, reactive, and adaptive task planning of cooperative robots.

Developed various image processing techniques for visual localization and target tracking of robots.

Applied different soft computing methods for target pattern recognition and classification.

Developed the FMCell comprehensive robotic simulation software for man-machine interface development.

State-of-the-art physical robotic test bed consisting of twelve heterogeneous robots.

Embodiment of intelligent behaviors on mobile robots using fuzzy-genetic algorithms.
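A hedged sketch of the fuzzy-genetic idea in the last item: a one-input fuzzy speed controller whose membership breakpoints are tuned by a small genetic algorithm. The membership shapes, fitness targets, and GA settings are invented for illustration and are not the NASA/Ames project's actual design.

    import random

    def fuzzy_speed(dist, params):
        # Triangular memberships for "near" and "far" defined by breakpoints a < b.
        a, b = sorted(params)
        near = max(0.0, min(1.0, (b - dist) / (b - a + 1e-9)))
        far = 1.0 - near
        # Weighted-average defuzzification: near -> slow (0.1), far -> fast (1.0).
        return (near * 0.1 + far * 1.0) / (near + far)

    def fitness(params):
        # Prefer controllers that slow down near obstacles and speed up far away.
        score = 0.0
        for dist, target in [(0.2, 0.1), (1.0, 0.4), (3.0, 1.0)]:
            score -= abs(fuzzy_speed(dist, params) - target)
        return score

    def evolve(pop_size=20, generations=50):
        # Simple GA: rank selection, averaging crossover, Gaussian mutation.
        pop = [[random.uniform(0.1, 4.0), random.uniform(0.1, 4.0)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            for _ in range(pop_size - len(parents)):
                p1, p2 = random.sample(parents, 2)
                children.append([(x + y) / 2 + random.gauss(0, 0.1) for x, y in zip(p1, p2)])
            pop = parents + children
        return max(pop, key=fitness)

    print("evolved membership breakpoints:", evolve())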

Theoretical and Experimental Research Capabilities

[Figure: Visual localization pipeline. Stages: image captured by anchor robot; image enhancement; background elimination; noise reduction; robots isolation; robot identification by color feature detection; image windowing; robots pose detection using neural nets; robots localization, with per-robot outputs such as HD: 2.0, LD: 2.6, RO: -45.0, ID: 3.]
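The pipeline in the figure can be compressed into a short sketch, assuming 8-bit RGB frames, a static background image, and one known marker color per robot; the thresholds and colors are placeholders, and the pose step is a stub where the laboratory used neural networks.

    import numpy as np

    def isolate_robots(frame, background, noise_thresh=30):
        # Background elimination followed by crude noise reduction via thresholding.
        diff = np.abs(frame.astype(int) - background.astype(int)).sum(axis=2)
        return diff > noise_thresh                      # foreground (robot) pixel mask

    def identify_by_color(frame, mask, marker_rgb, tol=40):
        # Robot identification by color feature detection inside the foreground mask.
        dist = np.abs(frame.astype(int) - np.array(marker_rgb)).sum(axis=2)
        return mask & (dist < tol)

    def window(mask):
        # Image windowing: bounding box around the detected marker pixels.
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None
        return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

    def estimate_pose(box):
        # Stand-in for the neural-net pose detector: window centre plus a crude
        # aspect-ratio orientation guess (HD/LD/RO in the figure came from NNs).
        x0, y0, x1, y1 = box
        return (x0 + x1) / 2, (y0 + y1) / 2, 0.0 if (x1 - x0) >= (y1 - y0) else 90.0

    frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
    background = np.zeros_like(frame)
    mask = identify_by_color(frame, isolate_robots(frame, background), marker_rgb=(255, 0, 0))
    box = window(mask)
    print(estimate_pose(box) if box is not None else "no robot found")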


Intelligent Man-Machine Interface

Interactive Component-Based Architecture for rapid task deployment of cooperative robots.

Image and sensory data processing and analysis capability for intelligent control of autonomous robots.

Soft computing capability for deliberative, reactive, and adaptive development of behavior-based robot tactical schemes.

3D modeling and simulation tools for world perception modeling and visualization of cooperative mobile robots.

Built-in TCP/IP wireless communication protocols for distributed client/server-based control of remotely operating robots.

Experimental human-robot interaction.

Robotics Research

Human-Robot Interaction
Intelligent Control Systems
Telerobotics
Intelligent User Interfaces
Multi-Robot Cooperation
Interoperability
Software Architectures

• Human-Robot Interaction
– Over the Internet; via PDAs; via speech
– Via cellular phones (speech integrated)
– Human detection, recognition, and localization
– Social behavior modeling
• Interoperability for Robotics
– Programming-language- and operating-system-independent software architecture
• Intelligent User Interface Design
– Adaptive, mission-aware
– Multiple users, multiple robots
• Heterogeneous Multi-Robot Cooperation
– Behavior-based approach

Robotics Research

The Human Agent System

• The Human Agent is a virtual agent that serves as an internal active representation of people in the robot’s environment.

• As a representation, it is able to detect, represent and monitor people. The description active is used, much as in describing active perception vision systems [Bajcsy 1987], to indicate that the system can take action to make its representation richer.

[Figure: Human Agent composition. Components: Human Detection Agents (motion, sound); Human Identification Agents (face, voice) with a Human Database; Identification Agent; Affect Estimation and Human Affect Agents; Observer and Monitoring Agents; Human Intention Agent; Social and Interaction Agents. All are organized under the Human Agent, which connects to the Self Agent.]
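The agent names in the figure suggest a compositional design. The sketch below is only a guess at how such a Human Agent might be wired together in code, with the detection and identification agents reduced to stubs and all field names invented.

    from dataclasses import dataclass, field

    @dataclass
    class Observation:
        person_id: str
        location: tuple              # e.g. (azimuth, elevation) in the robot's frame
        affect: str = "neutral"

    class DetectionAgent:
        # Stub for the motion- or sound-based human detection agents.
        def detect(self, sensor_frame):
            return [Observation(person_id="unknown", location=(0.0, 0.0))]

    class IdentificationAgent:
        # Stub for face- or voice-based identification against the human database.
        def __init__(self, database):
            self.database = database
        def identify(self, obs):
            obs.person_id = self.database.get(obs.location, "unknown")
            return obs

    @dataclass
    class HumanAgent:
        # Internal active representation of the people around the robot.
        detectors: list
        identifier: IdentificationAgent
        monitored: dict = field(default_factory=dict)

        def update(self, sensor_frame):
            # "Active" in the sense of [Bajcsy 1987]: the representation is
            # refined every time new sensory data arrives.
            for det in self.detectors:
                for obs in det.detect(sensor_frame):
                    obs = self.identifier.identify(obs)
                    self.monitored[obs.person_id] = obs
            return self.monitored

    agent = HumanAgent([DetectionAgent()], IdentificationAgent(database={}))
    print(agent.update(sensor_frame=None))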

Sensory EgoSphere (SES) for Mobile Robots

• Peters redefined the Sensory EgoSphere as a sparse, spatiotemporally indexed short-term memory (STM).

• Structure: a variable density geodesic dome.

• Nodes: links to data structures and files.

• Indexed by azimuth, elevation and time.

• Searchable by location and content.

[Figure: SES geodesic dome with nodes linking to images, sonar, and laser data.]

Peters, R. A. II, K. E. Hambuchen, K. Kawamura, and D. M. Wilkes, “The Sensory Ego-Sphere as a Short-Term Memory for Humanoids,” Proc. IEEE-RAS Int’l Conf. on Humanoid Robots, pp. 451-459, Waseda University, Tokyo, Japan, 22-24 Nov. 2001.
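A minimal data-structure sketch of the SES as described in the bullets above: sensory records hung on coarse dome nodes, indexed by azimuth, elevation, and time, and searchable by location or content tag. The node spacing and record fields are assumptions, not the published implementation.

    import time
    from collections import defaultdict

    class SensoryEgoSphere:
        # Sparse, spatiotemporally indexed short-term memory: nodes of a coarse
        # "dome" hold links to sensory records (image files, sonar hits, laser scans).
        def __init__(self, angular_step_deg=20):
            self.step = angular_step_deg
            self.nodes = defaultdict(list)          # (azimuth_bin, elevation_bin) -> records

        def _bin(self, azimuth, elevation):
            return (round(azimuth / self.step), round(elevation / self.step))

        def post(self, azimuth, elevation, content, tag):
            self.nodes[self._bin(azimuth, elevation)].append(
                {"time": time.time(), "tag": tag, "content": content,
                 "azimuth": azimuth, "elevation": elevation})

        def query_location(self, azimuth, elevation):
            # Search by location: everything stored at the nearest node.
            return self.nodes[self._bin(azimuth, elevation)]

        def query_content(self, tag, newer_than=0.0):
            # Search by content tag and time.
            return [r for recs in self.nodes.values() for r in recs
                    if r["tag"] == tag and r["time"] >= newer_than]

    ses = SensoryEgoSphere()
    ses.post(azimuth=45.0, elevation=10.0, content="img_000.png", tag="image")
    ses.post(azimuth=-90.0, elevation=0.0, content=[2.1, 1.8, 0.9], tag="sonar")
    print(ses.query_location(45.0, 10.0))
    print(ses.query_content("sonar"))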

Experimental Design

• 2 training tasks with the original and enhanced interface

• 2 teleoperation tasks with the enhanced and original interface.

Telepresence Software Architecture (Over Internet)

[Figure: Telepresence architecture. The user's Internet control (client side) communicates over TCP/IP (Internet) with the Internet control (server side), which drives the robot control programs through servers, an API, and the robot hardware.]

[Figure: Speech-command elements: Human Commander, Audio Commands, Speech Recognition, TCP/IP (Internet), Robot Commander, Soldiers.]
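A bare-bones illustration of the client/server control channel in the architecture above, using Python's standard socket module; the address, JSON message format, and command names are invented for the example.

    import json, socket, threading, time

    HOST, PORT = "127.0.0.1", 9099              # hypothetical robot-server address

    def robot_server():
        # Server side: accept one connection, read a JSON command, acknowledge it.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                command = json.loads(conn.recv(1024).decode())
                # A real server would pass this on to the robot control programs/API.
                conn.sendall(json.dumps({"ack": command["action"]}).encode())

    def operator_client():
        # Client side (the remote user): send a drive command over TCP/IP.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
            cli.connect((HOST, PORT))
            cli.sendall(json.dumps({"action": "drive", "speed": 0.4, "turn": 0.1}).encode())
            print("server replied:", json.loads(cli.recv(1024).decode()))

    server = threading.Thread(target=robot_server, daemon=True)
    server.start()
    time.sleep(0.2)                             # give the server a moment to start listening
    operator_client()
    server.join()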

Research Motivations (Consumer Tele-presence)

[Figure: System architecture. Robot-1, Robot-2, and Robot-3 each grab an image and process it (NN-Fuzzy), yielding Fuzzy Decision-1, Fuzzy Decision-2, and Fuzzy Decision-3; the Manager combines them into a Final Decision using fuzzy logic.]
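A small sketch of the decision-fusion idea in the figure: each robot's NN-fuzzy stage is reduced to a stub that returns fuzzy memberships over target classes, and the manager averages the three fuzzy decisions and picks the strongest class. The class names and aggregation rule are assumptions.

    def robot_decision(image_features):
        # Stand-in for each robot's NN-fuzzy processing: returns fuzzy memberships
        # over target classes, e.g. {"friend": 0.2, "foe": 0.7, "unknown": 0.1}.
        total = sum(image_features) or 1.0
        return {"friend": image_features[0] / total,
                "foe": image_features[1] / total,
                "unknown": image_features[2] / total}

    def manager_fuse(decisions):
        # Final decision by fuzzy aggregation: average the memberships reported
        # by every robot, then pick the class with the highest fused membership.
        classes = decisions[0].keys()
        fused = {c: sum(d[c] for d in decisions) / len(decisions) for c in classes}
        return max(fused, key=fused.get), fused

    d1 = robot_decision([0.1, 0.8, 0.1])   # Robot-1 grabs and processes an image
    d2 = robot_decision([0.3, 0.5, 0.2])   # Robot-2
    d3 = robot_decision([0.2, 0.6, 0.2])   # Robot-3
    print(manager_fuse([d1, d2, d3]))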

Research Motivations (Development of Robot Behaviors, NASA, Phase-I)

FIRBA Implementation

Abstracts beeSoft: Complex API protocols are hidden

Object Oriented: Abstraction, reuse by inheritance.

Perception Sharing: Common perceptions can be shared

Action Suggestions: Arbitration through MAL, fuzzy inference and De-fuzzification.

Independent Behaviors

[Figure: Overall software architecture. Level 0 and Level 1 behaviors; sonar handling/sonar class and odometer handling/odometer class providing sonar perceptions; target handling/target class and path handling/path class providing target perceptions; motion primitives.]
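A skeletal reading of the figure above: behaviors at different levels consume different perceptions, and each posts an action suggestion that is arbitrated into a motion primitive. Class names and the activation-weighted blend are invented for illustration; FIRBA's actual arbitration goes through the MAL with fuzzy inference and defuzzification.

    class Behavior:
        # Common base: every behavior turns its perceptions into an action
        # suggestion (speed, turn) plus an activation level used for arbitration.
        def suggest(self, perceptions):
            raise NotImplementedError

    class AvoidObstacles(Behavior):            # level-0 style, driven by sonar perceptions
        def suggest(self, perceptions):
            nearest = min(perceptions["sonar"])
            activation = 1.0 if nearest < 0.6 else 0.0
            return activation, (0.0, 1.0)      # stop and turn away

    class FollowPath(Behavior):                # level-1 style, driven by path perceptions
        def suggest(self, perceptions):
            heading_error = perceptions["path_error"]
            return 0.5, (0.4, -0.8 * heading_error)

    def arbitrate(behaviors, perceptions):
        # Simplified stand-in for MAL/fuzzy arbitration: activation-weighted blend,
        # collapsed ("defuzzified") into a single motion primitive.
        total, speed, turn = 0.0, 0.0, 0.0
        for b in behaviors:
            a, (v, w) = b.suggest(perceptions)
            total += a
            speed += a * v
            turn += a * w
        return (0.0, 0.0) if total == 0.0 else (speed / total, turn / total)

    perceptions = {"sonar": [0.4, 1.2, 2.0], "path_error": 0.2}
    print(arbitrate([AvoidObstacles(), FollowPath()], perceptions))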

Research Motivations (Development of Robot Behaviors, NASA, Phase-I)

[Figure: The FIRBA architecture. Sensors → pre-perception processing → perception capabilities → behaviors → action capabilities → action execution → actuators.]

FIRBA – Robot Control System

Complexity:
• Robustness
• Multiple Sensors
• Multiple Methods
• Integration
• Incremental Development
• Software

This complexity is handled by system decomposition in terms of :

-- functional units
-- behavioral units