Educational Service Robot Open Platform (ESROP)
RobotForAll OpenCourseWare
Development Guide
Version 2019
RoboCup@Home Education | www.RoboCupatHomeEDU.org
RoboCup@Home
RoboCup@Home aims to foster the development of service and assistive robot technology for future personal domestic applications. The competitions comprise a set of benchmark tests to evaluate the robots' capabilities in realistic home environment settings and scenarios. Research focuses on: human-robot interaction and cooperation, navigation in dynamic environments, computer vision and object recognition under natural light conditions, object manipulation, adaptive behaviors and learning, ambient intelligence, and system integration.
RoboCup@Home EDUCATION
RoboCup@Home EDUCATION is an educational initiative that promotes efforts to boost RoboCup@Home participation and service robot development.
Under this initiative, there are currently three projects in operation:
1. RoboCup@Home Education Challenge
2. Support the Development of Educational Open Robot Platforms for
RoboCup@Home (service robotics)
3. Outreach Programs (domestic workshops, international academic exchange
programs, etc.)
http://www.robocupathomeedu.org/
https://www.facebook.com/robocupathomeedu/
Educational Service Robot Open Platform (ESROP)
A. Mobile Platform
a. SLAM, Navigation
B. Robot Vision
a. Visual Perception, Object Recognition
C. Robot Arm
a. Object Manipulation
D. Human-Robot Interaction
a. Speech Interaction
E. Robot Intelligence
a. AI, Machine Learning, Cloud Computing
Ubuntu and ROS Setup
● Ubuntu setup
○ Linux Ubuntu 16.04.5 LTS (Xenial Xerus)
○ http://www.ubuntu.com/
○ http://releases.ubuntu.com/16.04/
● ROS setup
○ Ubuntu 16.04 > ROS Kinetic
○ http://wiki.ros.org/kinetic/Installation/Ubuntu
● ROS tutorials
○ http://wiki.ros.org/ROS/Tutorials
TurtleBot Setup
● TurtleBot - http://wiki.ros.org/Robots/TurtleBot
● Installation (Kinetic) -
http://robotforall.org/wiki/index.php?title=TurtleBot2:Installation(Kinetic)
(http://wiki.ros.org/turtlebot/Tutorials/indigo/Turtlebot%20Installation)
● Kobuki setup - http://wiki.ros.org/turtlebot/Tutorials/indigo/Kobuki%20Base
● Kinect setup - http://wiki.ros.org/turtlebot/Tutorials/indigo/Alternative%203D%20Sensor%20Setup
● Astra setup - http://wiki.ros.org/astra_camera
● TurtleBot bringup -
http://wiki.ros.org/turtlebot_bringup/Tutorials/indigo/TurtleBot%20Bringup
● 3D visualization -
http://wiki.ros.org/turtlebot/Tutorials/indigo/3D%20Visualisation
● Follower demo - http://wiki.ros.org/turtlebot_follower/Tutorials/Demo
● Create a ROS Workspace
○ $ mkdir -p ~/catkin_ws/src
○ $ cd ~/catkin_ws/
○ $ catkin_make
● Environment Setup
○ $ echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
○ $ source ~/.bashrc
● Add Sample ROS Codes
○ $ cd ~/catkin_ws/src
○ $ git clone https://github.com/robocupathomeedu/rc-home-edu-learn-ros.git
○ $ cd ~/catkin_ws
○ $ catkin_make
● Add Sample MATLAB Codes
○ $ cd ~/catkin_ws/src
○ $ git clone https://github.com/robocupathomeedu/rc-home-edu-learn-matlab.git
○ $ cd ~/catkin_ws
○ $ catkin_make
ROS Workspace and Sample Codes
● Install Git for Linux
○ $ sudo apt-get install git
● Sign up GitHub account
○ https://github.com/join
● Configure GitHub settings
○ $ git config --global user.name "username"
○ $ git config --global user.email "email address"
● Create a local repository (test)
○ $ git init test
● Create a test file in the local repository
○ $ cd test
○ $ gedit README.md
● Add repository files to an index
○ $ git add README.md
○ $ git add --all (add all folders and files)
● Commit changes made to the index
○ $ git commit -m "description message"
● Create a repository on GitHub
○ https://github.com/new
● Set the remote GitHub repository
○ $ git remote add origin https://github.com/username/test.git
Using Git and GitHub on Ubuntu
● Push files in the local repository to GitHub
○ $ git push origin master
● Clone a repository to your computer
○ $ git clone https://github.com/USERNAME/REPOSITORY.git
● Fetch updates made to a remote repository
○ $ git fetch remotename
● Merge updates made online with your local work
○ $ git merge remotename/branchname
● Fetch online updates and merge them with your local work
○ $ git pull remotename branchname
● References
○ https://www.howtoforge.com/tutorial/install-git-and-github-on-ubuntu-14.04/
○ https://help.github.com/articles/fetching-a-remote/
● Further readings
○ Fork https://help.github.com/articles/fork-a-repo/
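The local-repository workflow above can be exercised end-to-end. A minimal sketch in Python, driving git through subprocess in a temporary directory (assumes git is installed; the repository name and commit message are just the examples used above, and the identity is set locally so the demo does not touch ~/.gitconfig):

```python
# Sketch: run the local-repository workflow above (init, add, commit).
import os
import subprocess
import tempfile

def git_demo():
    repo = os.path.join(tempfile.mkdtemp(), 'test')
    subprocess.check_call(['git', 'init', '-q', repo])  # $ git init test

    def git(*args):
        subprocess.check_call(['git'] + list(args), cwd=repo)

    # Local (not --global) identity, just for this throwaway repository
    git('config', 'user.name', 'username')
    git('config', 'user.email', 'email@example.com')
    # Create a test file, add it to the index, and commit it
    with open(os.path.join(repo, 'README.md'), 'w') as f:
        f.write('# test\n')
    git('add', 'README.md')
    git('commit', '-q', '-m', 'description message')
    log = subprocess.check_output(['git', 'log', '--oneline'], cwd=repo)
    return log.decode()
```

From here, `git remote add origin …` and `git push origin master` would publish the repository to GitHub exactly as in the commands above.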
● Festival - http://www.cstr.ed.ac.uk/projects/festival/
● ROS sound_play - http://wiki.ros.org/sound_play
○ Tutorials - http://wiki.ros.org/sound_play/Tutorials
● Installation:
○ $ sudo apt-get install ros-kinetic-audio-common
○ $ sudo apt-get install libasound2
● Command line operation:
○ $ roscore
○ $ rosrun sound_play soundplay_node.py
○ $ rosrun sound_play say.py "Hello!"
● Source code implementation:
○ rchomeedu_speech/scripts/sound_test.py
○ $ roslaunch rchomeedu_speech sound_test.launch
Speech Synthesis (Text-to-Speech)
● CMUSphinx - https://cmusphinx.github.io
● ROS Pocketsphinx - http://wiki.ros.org/pocketsphinx
● Installation:
○ Install Pocketsphinx
■ $ sudo apt-get install python-pip python-dev build-essential
■ $ sudo pip install --upgrade pip
■ $ sudo apt-get install libasound-dev
■ $ sudo apt-get install python-pyaudio
■ $ sudo pip install pyaudio
■ $ sudo apt-get install swig
■ $ sudo pip install pocketsphinx
○ Install ROS package for Pocketsphinx
■ $ cd ~/catkin_ws/src
■ $ git clone https://github.com/Pankaj-Baranwal/pocketsphinx
■ $ cd ~/catkin_ws
■ $ catkin_make
○ Add acoustic model
■ Download and copy the hub4wsj_sc_8k acoustic model to /usr/local/share/pocketsphinx/model/en-us/en-us/
■ https://sourceforge.net/projects/cmusphinx/files/Acoustic%20and%20Language%20Models/Archive/US%20English%20HUB4WSJ%20Acoustic%20Model/
Speech Recognition
Speech Recognition
● Demonstration of kws mode (keyword spotting mode):
○ $ roslaunch pocketsphinx kws.launch dict:=/home/<username>/catkin_ws/src/pocketsphinx/demo/voice_cmd.dic kws:=/home/<username>/catkin_ws/src/pocketsphinx/demo/voice_cmd.kwlist
○ $ rostopic echo /kws_data
● Voice commands for TurtleBot simulation in Gazebo:
○ $ roslaunch turtlebot_gazebo turtlebot_world.launch (takes time to load)
○ $ rosrun pocketsphinx voice_control_example.py
● Source code implementation:
○ $ roslaunch rchomeedu_speech talkback.launch (kws.launch)
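The voice_control_example.py node maps keywords spotted on /kws_data to base velocity commands. A simplified sketch of that mapping (the keyword set and speed values here are illustrative, not the exact ones in the demo; in the real node the result feeds a geometry_msgs/Twist on the velocity topic):

```python
# Sketch: map a phrase from /kws_data to a (linear_x, angular_z) command.
def command_from_keyword(phrase):
    mapping = {
        'forward': (0.2, 0.0),   # drive ahead
        'back':    (-0.2, 0.0),  # reverse
        'left':    (0.0, 0.5),   # turn counter-clockwise
        'right':   (0.0, -0.5),  # turn clockwise
        'stop':    (0.0, 0.0),   # halt
    }
    phrase = phrase.lower()
    for word, twist in mapping.items():
        if word in phrase:
            return twist
    return None  # unrecognized phrase: leave the current command unchanged
```

Keyword spotting only fires on the phrases listed in the .kwlist file, which is why the mapping can stay this small.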
Speech Recognition
● Demonstration of lm mode (language model mode):
○ $ roslaunch rchomeedu_speech lm.launch dict:=/home/<username>/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_speech/robocup/robocup.dic lm:=/home/<username>/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_speech/robocup/robocup.lm
○ $ rostopic echo /lm_data
● Creating a Vocabulary:
○ $ roscd rchomeedu_speech/robocup
○ $ less robocup.corpus
○ http://www.speech.cs.cmu.edu/tools/lmtool-new.html
○ Update dic and lm files in launch file (lm.launch)
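The lmtool service returns a pronunciation dictionary (.dic) and a language model (.lm) built from your corpus. The .dic format is one word per line followed by its phonemes; a small parser sketch of that format:

```python
# Sketch: parse CMU .dic lines ("WORD PH1 PH2 ...") into a vocabulary map.
def parse_dic(text):
    vocab = {}
    for line in text.splitlines():
        parts = line.split()
        if parts:  # skip blank lines
            vocab[parts[0]] = parts[1:]  # word -> list of phonemes
    return vocab

# Example with two entries in lmtool's output style:
vocab = parse_dic("HELLO HH AH L OW\nROBOT R OW B AA T\n")
```

Inspecting the parsed vocabulary is a quick way to confirm every corpus word made it into the dictionary before wiring the files into lm.launch.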
Speech Interaction - iFlytek Speech (Online)
● iFlytek Speech - https://www.xfyun.cn/
● Installation
○ $ cd ~
○ $ git clone https://github.com/ncnynl/xf-ros.git
○ $ cp -R xf-ros/xfei_asr ~/catkin_ws/src/
○ In xfei_asr/CMakeLists.txt, replace /home/ubu/catkin_ws/ with /home/<username>/catkin_ws/
○ $ cd ~/catkin_ws
○ $ catkin_make
● Speech Synthesis (TTS)
○ $ roscore
○ $ rosrun xfei_asr tts_subscribe_speak
○ $ rostopic pub xfwords std_msgs/String "你好"
● Speech Recognition
○ $ rosrun xfei_asr iat_record
ROS TurtleBot Navigation
1. Overview
2. Implementation
a. Build a map with SLAM
b. Autonomously navigate in a known map
Source: http://wiki.ros.org/turtlebot_navigation
3. Source code implementation:
○ Node:
■ /rc-home-edu-learn-ros/rchomeedu_navigation/nodes/navigation.py
■ Navigation target: Location A
○ Launch:
■ $ roslaunch turtlebot_bringup minimal.launch
■ $ export TURTLEBOT_MAP_FILE=/<path to maps folder>/map.yaml
■ $ roslaunch turtlebot_navigation amcl_demo.launch
■ $ roslaunch turtlebot_rviz_launchers view_navigation.launch
■ $ roslaunch rchomeedu_navigation navigation.launch
■ (set initial position)
ROS TurtleBot Navigation
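A navigation goal like the one navigation.py sends to move_base is a position plus an orientation quaternion. For a planar robot, the quaternion comes from the yaw angle alone; a small sketch of that conversion (equivalent, to my understanding, to tf's quaternion_from_euler(0, 0, yaw)):

```python
import math

# Sketch: build the (x, y, z, w) orientation quaternion for a planar
# navigation goal from a heading angle (yaw, in radians).
def yaw_to_quaternion(yaw):
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))
```

Setting a goal's pose.orientation from this tuple makes the robot face the chosen heading on arrival; yaw = 0 means the map's x direction.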
Robot Visual Perception
● OpenNI2 - http://structure.io/openni
● OpenKinect - https://github.com/OpenKinect
● PCL - http://pointclouds.org/
● OpenCV - http://opencv.org/
● ROS opencv_apps - http://wiki.ros.org/opencv_apps
● Installation:
○ Astra Setup - http://wiki.ros.org/astra_camera
○ USB Camera - $ sudo apt-get install ros-kinetic-usb-cam
○ ROS opencv_apps
■ $ cd ~/catkin_ws/src
■ $ git clone https://github.com/ros-perception/opencv_apps
■ $ cd ~/catkin_ws
■ $ catkin_make
● Multiple Astra sensors
○ Check Astra device id: $ rosrun astra_camera astra_list_devices
○ Update the device ids in rc-home-edu-learn-ros/rchomeedu_vision/launch/multi_astra.launch
■ device_1_id – Base sensor for navigation
■ device_2_id – Top sensor for visual perception
Robot Visual Perception
● Bring up
○ [Astra] $ roslaunch astra_launch astra.launch
○ [Kinect] $ roslaunch freenect_launch freenect-registered-xyzrgb.launch
○ [USB Camera] $ roscore | $ rosrun usb_cam usb_cam_node
○ [Multiple Astra] $ roslaunch rchomeedu_vision multi_astra.launch
● Display RGB and Depth images
○ $ rosrun image_view image_view image:=/camera/rgb/image_raw
○ $ rosrun image_view image_view image:=/camera/depth_registered/image_raw
○ [USB Camera] $ rosrun image_view image_view image:=/usb_cam/image_raw
○ [Multiple Astra] $ rosrun image_view image_view image:=/camera_top/rgb/image_raw
● Take photo
○ Bring up camera
○ $ rosrun rchomeedu_vision take_photo.py
○ $ rosrun rchomeedu_vision take_photo_sub.py
■ $ rostopic pub -1 /take_photo std_msgs/String "take photo"
○ * Image topic = "/camera_top/rgb/image_raw" [Multiple Astra] (edit the code according to the camera used)
○ * The photo taken is saved in the ~/.ros folder
CamShift Object Tracking
OpenCV CamShift filter
● Bring up:
○ [Astra] $ roslaunch astra_launch astra.launch
○ [Kinect] $ roslaunch freenect_launch freenect-registered-xyzrgb.launch
○ [USB Camera] $ roscore | $ rosrun usb_cam usb_cam_node
○ [Multiple Astra] $ roslaunch rchomeedu_vision multi_astra.launch
● Launch node:
○ $ roslaunch opencv_apps camshift.launch image:=/camera/rgb/image_raw
● Bring up:
○ [Astra] $ roslaunch astra_launch astra.launch
○ [Kinect] $ roslaunch freenect_launch freenect-registered-xyzrgb.launch
○ [USB Camera] $ roscore | $ rosrun usb_cam usb_cam_node
○ [Multiple Astra] $ roslaunch rchomeedu_vision multi_astra.launch
● Face Detection using Cascade Classifier
○ $ roslaunch opencv_apps face_detection.launch image:=/camera/rgb/image_raw
● Face Recognition
○ $ roslaunch opencv_apps face_recognition.launch image:=/camera/rgb/image_raw
● People Detection using Histogram of Oriented Gradients (HOG)
○ $ roslaunch opencv_apps people_detect.launch image:=/camera/rgb/image_raw
People/Face Detection and Recognition
● TurtleBot Arm - http://wiki.ros.org/turtlebot_arm/
● Hardware - https://makezine.com/projects/build-an-arm-for-your-turtlebot/
● Installation and setup
○ ROS dynamixel_motor - http://wiki.ros.org/dynamixel_motor
○ $ sudo apt-get install ros-kinetic-dynamixel-motor
○ Connecting to Dynamixel bus -
http://wiki.ros.org/dynamixel_controllers/Tutorials/ConnectingToDynamixelBus
○ Creating a joint controller -
http://wiki.ros.org/dynamixel_controllers/Tutorials/CreatingJointPositionController
● Controlling the PhantomX Pincher Robot Arm
○ http://wiki.ros.org/dynamixel_controllers/Tutorials/Controlling%20the%20PhantomX%20Pincher%20Robot%20Arm
TurtleBot Arm
TurtleBot Arm
● GUI Interface
○ $ cd ~/catkin_ws/src/dynamixel_hr/
○ $ python ToolDynamixelLab.py
● Bring up
○ $ roslaunch rchomeedu_arm arm.launch
● ROS topics
○ /waist_controller/command
○ /shoulder_controller/command
○ /elbow_controller/command
○ /wrist_controller/command
○ /hand_controller/command
● Moving the joints
○ $ rostopic pub -1 /waist_controller/command std_msgs/Float64 -- 0.3
○ $ rostopic pub -1 /shoulder_controller/command std_msgs/Float64 -- 0.3
○ $ rostopic pub -1 /elbow_controller/command std_msgs/Float64 -- 0.3
○ $ rostopic pub -1 /wrist_controller/command std_msgs/Float64 -- 0.3
○ $ rostopic pub -1 /hand_controller/command std_msgs/Float64 -- 0.3
● Source code implementation
○ Bring up - $ roslaunch rchomeedu_arm arm.launch
○ Arm movements - $ rosrun rchomeedu_arm arm.py | dance_arm.py (pub "dance arm")
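Each joint topic above takes a std_msgs/Float64 position in radians, so a "dance" is just a timed sequence of such publications. A hypothetical sketch of how a dance_arm.py-style motion could be laid out (the joint names follow the topics above; the amplitude and step count are illustrative, not the values in the actual script):

```python
# Sketch: generate a (topic, position) sequence that sweeps every arm
# joint between +amplitude and -amplitude radians, alternating per step.
JOINTS = ['waist', 'shoulder', 'elbow', 'wrist', 'hand']

def dance_sequence(amplitude=0.3, steps=4):
    seq = []
    for step in range(steps):
        sign = 1.0 if step % 2 == 0 else -1.0  # alternate direction
        for joint in JOINTS:
            seq.append(('/%s_controller/command' % joint, sign * amplitude))
    return seq
```

A node would walk this list, publishing each position to its topic with a short sleep between steps so the Dynamixel servos can reach each pose.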
People Tracking (Follow Me)
RGB-D Sensor
Microsoft Kinect 3D sensor
● RGB color VGA video camera
● Depth sensor
○ Infrared projector
○ Monochrome CMOS sensor
● Resolution: 640 x 480 pixels
● 30 FPS
People Tracking (Follow Me)
[Figure: the follower converts point-cloud data from the RGB-D sensor into velocity commands using the control parameters min_x, max_x, min_y, max_y, max_z, goal_z, x_scale, and z_scale]
● TurtleBot Follower Demo
○ http://wiki.ros.org/turtlebot_follower/Tutorials/Demo
● Bring up:
○ $ roslaunch turtlebot_bringup minimal.launch
● Launch node:
○ $ roslaunch turtlebot_follower follower.launch
● Changing Follower Parameters
○ $ rosrun rqt_reconfigure rqt_reconfigure
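Internally, the follower computes the centroid of the point-cloud points inside the window bounded by min_x/max_x, min_y/max_y, and max_z, then applies a proportional control law to hold the goal_z distance and keep the centroid centered. A sketch of that law (the default values here are assumptions for illustration, not necessarily the package's defaults; tune them via rqt_reconfigure as above):

```python
# Sketch: proportional follower control from the point-cloud centroid.
# x is the lateral offset of the tracked person, z the distance ahead
# (both in meters, in the camera frame).
def follower_cmd(x, z, goal_z=0.6, x_scale=7.0, z_scale=2.0):
    linear = (z - goal_z) * z_scale   # approach / back off to hold goal_z
    angular = -x * x_scale            # turn to keep the centroid centered
    return linear, angular
```

When the person steps back, z grows and linear goes positive (drive forward); when they drift to one side, angular turns the base back toward them.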
Robot Applications (rchomeedu_apps)
Follower2
● Follower with multiple Astra sensors
○ Set up the sensor ids in rchomeedu_vision/launch/multi_astra.launch
○ Bring up - $ roslaunch turtlebot_bringup minimal.launch
○ Launch - $ roslaunch rchomeedu_follower follower2.launch
● Follower start/stop service control
○ $ rosrun rchomeedu_follower follower_control.py
Robot Applications (rchomeedu_apps)
PartyBot
● Functions
○ Speech recognition with lm mode to recognize questions from guests
○ Speech synthesis to speak the answers and play audio files
○ Take a photo by speech command
○ Integrated with Follower to start/stop by speech commands
○ Move the mobile base and arm for a dance sequence
● Launch
○ $ roslaunch turtlebot_bringup minimal.launch
○ $ roslaunch rchomeedu_arm arm.launch
○ $ roslaunch rchomeedu_follower follower2.launch
○ $ roslaunch rchomeedu_partybot partybot.launch dict:=/home/<username>/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_speech/robocup/robocup.dic lm:=/home/<username>/catkin_ws/src/rc-home-edu-learn-ros/rchomeedu_speech/robocup/robocup.lm
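PartyBot's core loop ties these pieces together: a phrase recognized on /lm_data selects either a spoken reply (handed to speech synthesis) or an action such as taking a photo or toggling the follower. A hypothetical sketch of that dispatch step (the question/answer pairs and action names are illustrative, not the ones shipped in the robocup corpus):

```python
# Sketch: dispatch a recognized phrase to an action name or a spoken reply.
def partybot_dispatch(utterance):
    actions = {
        'TAKE PHOTO': 'action:take_photo',
        'FOLLOW ME': 'action:start_follower',
        'STOP FOLLOWING': 'action:stop_follower',
    }
    answers = {
        'WHAT IS YOUR NAME': 'My name is PartyBot.',
        'HOW ARE YOU': 'I am fine, thank you!',
    }
    text = utterance.strip().upper()
    for phrase, action in actions.items():
        if phrase in text:
            return action  # trigger the matching service / topic
    for question, answer in answers.items():
        if question in text:
            return 'say:' + answer  # hand the answer to speech synthesis
    return 'say:Sorry, I did not understand.'
```

Keeping dispatch as a pure function like this makes the behavior easy to extend with new phrases and to test without a running robot.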
Acknowledgment
● RoboCup@Home Education is a special education program providing support for new teams through a simplified RoboCup@Home competition oriented toward forming new teams and preparing them for the main competitions.
● This open courseware is mainly supported by the IEEE RAS CEMRA (Creation of Educational Material in Robotics and Automation) project.
● The course contents are developed partially in collaboration with MathWorks. All contents are open source.