1
Utilizing Science & Technology and Innovation for Development
Real-Time 3D First-Person View for Unmanned Systems
Osamah A. Rawashdeh, PhD, PE, Associate Professor, Oakland University
Belal Sababha, PhD, Assistant Professor, PSUT
Marriott Hotel, Amman, Wed. 12/8/2015
2
Project Team - Jordan
Nathir Rawashdeh, PhD, Assistant Professor, Exchange Coordinator, Mechatronics Engineering Dept., German Jordanian University, Amman, Jordan
Areas of Expertise: Image Processing, Controls, Mobile Robots
Belal Sababha, PhD, Assistant Professor, Chair, Computer Engineering Dept., Princess Sumaya University for Technology, Amman, Jordan
Areas of Expertise: UAVs, Automotive Controls, Embedded Systems
3
Project Team - USA
Osamah Rawashdeh, PhD, PE, Associate Professor, Academic Coordinator, Electrical and Computer Engineering, Oakland University, Rochester, Michigan, USA
Areas of Expertise: Unmanned Systems, Reliability and Fault Tolerance, Embedded Systems Design
Samir Rawashdeh, PhD, Assistant Professor, Electrical and Computer Engineering, University of Michigan - Dearborn, Dearborn, Michigan, USA
Areas of Expertise: Stereo Vision, Small Satellites, Biomedical Signal Processing
4
Embedded Systems Research Lab at Oakland University
• Multi-core engine and transmission control for Ford
• Hardware-in-the-Loop (HIL) simulations for Chrysler
• Internet weather connectivity for Chrysler
• Infotainment prototyping for GM
• PI for a 6-year Research Experiences for Undergraduates (REU) program funded by the NSF
• Autobike, Inc. founding member and developer of its control system
• Regenerative Peripheral Nerve Interface electronics for a DARPA project
5
UAV Background
Custom Multi-Rotors Started in 2007
Vision-Based Altitude and Attitude Control
NASA High-Altitude Inflatable-Wing Glider
First Person View for UAVs
6
Intelligent Ground Vehicle Competition (IGVC) at Oakland University
• 21 years hosting at OU
• 40+ teams compete in autonomously navigating a course while avoiding obstacles
• PSUT Team competed in 2014
• Supported by PSUT, Orange, and JPMC
• GJU Team competed in 2011
• Supported by KADDB/KAFD
• Placed 22nd out of 44 teams
• First Middle-Eastern team to compete
7
Current UAS Projects at Oakland University
• Hybrid Electric/Glow-Engine Power System for Multi-Rotors
• Aerial-Underwater Quadrotor (Albatross)
8
Brief Description: Real-Time 3D First-Person View for Unmanned Systems
• Traditional First-Person-View (FPV) systems use a camera on an electromechanical gimbal to allow the camera to be oriented independently of the robot's orientation.
• The proposed system will create a gimbal in software by performing image processing on ultra-wide-angle images.
• We will use two wide-angle lenses to generate 3D FPV and allow depth perception.
• Provides an enhanced way to pilot UAVs/UGVs for obstacle avoidance, agile teleoperation, and object tracking.
• Viable alternative given the size/weight restrictions on many UAVs.
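The software-gimbal idea above can be sketched as follows: instead of steering a physical gimbal, a view window is cropped out of the wide-angle frame at an offset derived from the operator's pan/tilt angles. The function name, field-of-view values, and window size below are illustrative assumptions, not part of the proposal.

```python
import numpy as np

def software_gimbal_view(frame, pan_deg, tilt_deg, out_w=640, out_h=480,
                         hfov_deg=180.0, vfov_deg=90.0):
    """Crop a view window from a wide-angle frame (assumed already
    flattened to an equirectangular-style projection).

    pan/tilt come from the operator's head orientation; the window
    center shifts accordingly instead of moving a physical gimbal.
    """
    h, w = frame.shape[:2]
    # Map pan/tilt angles to pixel offsets from the frame center.
    cx = w / 2 + (pan_deg / hfov_deg) * w
    cy = h / 2 - (tilt_deg / vfov_deg) * h
    # Clamp the window so it stays inside the frame.
    x0 = int(np.clip(cx - out_w / 2, 0, w - out_w))
    y0 = int(np.clip(cy - out_h / 2, 0, h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w]

# Example: a 1920x960 dummy frame; look 30 degrees to the right.
frame = np.zeros((960, 1920, 3), dtype=np.uint8)
view = software_gimbal_view(frame, pan_deg=30.0, tilt_deg=0.0)
```

A real implementation would re-project rather than crop, but the crop conveys why no moving parts are needed.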
9
Approach
• Two stationary wide-angle video cameras mounted onto the UAV and/or UGV.
• The cameras transmit wireless video frames to a base station in real time.
• Onboard pre-processing prepares the frame images for streaming.
• The base station's embedded image processor implements the software gimbal and sends 3D images to a VR headset.
• The software gimbal output will correspond to the tele-operator's head movements.
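The depth perception promised by the two-camera setup rests on standard stereo triangulation: depth Z = f·B/d for focal length f (in pixels), baseline B (camera separation), and disparity d (in pixels). A minimal numpy sketch, with all parameter values hypothetical:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Stereo depth: Z = f * B / d. Zero or negative disparity means
    the point is effectively at infinity."""
    d = np.asarray(disparity_px, dtype=float)
    return np.where(d > 0, focal_px * baseline_m / np.maximum(d, 1e-9), np.inf)

# Example: 700 px focal length, 6 cm baseline, 21 px disparity.
z = depth_from_disparity(21.0, focal_px=700.0, baseline_m=0.06)  # approx. 2.0 m
```

A wider baseline improves far-field depth resolution but must fit the airframe's size/weight budget.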
10
Justifications
• Fully autonomous systems are often challenged in unstructured environments.
• From an on-board perspective, the tele-operator can better understand the surroundings and avoid obstacles.
• The proposed research project builds on previous projects we completed and will result in a novel solution for teleoperation of unmanned systems.
• A software gimbal provides a viable solution given the size/weight restrictions on many smaller UAVs.
11
Objectives
• Select an applied project that provides practical value for participants.
• Develop scientific research infrastructure by establishing research labs in Jordan (PSUT & GJU) via equipment purchases and joint projects with colleagues in the USA.
• Build the capacity of Jordanian human resources and develop expertise in Jordan by exchanging expertise via mutual lab visits and planned meetings.
• Publish joint research.
• Provide students in Jordan with master's-level opportunities in the areas of mobile robots, embedded systems, and image processing.
12
Scope of work/Duration
Scope of work:
- Embedded systems hardware and software development
- Real-time image processing algorithm to implement a software gimbal
- Mobile robots
- Optics and 3D vision

Duration: 24 months
13
Methodology of Implementation
1. Component Selection:
• Begin by investigating different wide-angle lens and video camera combinations.
• The optical characteristics of fisheye lenses will be researched, and ideally a lens with evenly distributed distortion will be selected.
• A small, lightweight video camera compatible with the chosen lens must also be found.
• In addition, a wireless AV transceiver system capable of transmitting high-quality live video to the ground station with low latency must be obtained.
• Finally, virtual reality goggle options and image processing hardware will need to be studied, selected, and purchased.
14
2. Image Processing Algorithms and Embedded Software R&D
• After the different components are purchased, software will be developed to interface with the virtual reality goggles and to perform image processing on the received video.
• The goggles' software APIs will be used to obtain positioning data from the headset.
• An efficient image processing algorithm will be researched and implemented to transform the wide-angle videos into flattened images by removing the fisheye distortion.
• From the flattened images, a "window" from each video stream will be further processed with respect to the orientation of the operator's head, as determined by the virtual reality goggle APIs.
• Finally, the processed videos will be shown through the virtual reality headset.
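The flattening step above could, under the common equidistant fisheye model (image radius r = f·θ), be implemented as an inverse remap: for each output pixel, find its viewing angle and look up the matching fisheye pixel. This is an illustrative numpy-only sketch under that assumed lens model, not the algorithm the team will select; all focal lengths and sizes are made up.

```python
import numpy as np

def fisheye_to_rectilinear(img, f_fish, f_out, out_size):
    """Flatten a fisheye image via an inverse equidistant remap.

    Nearest-neighbor sampling keeps the sketch short; a real
    implementation would interpolate and use calibrated intrinsics.
    """
    h_in, w_in = img.shape[:2]
    out_h, out_w = out_size
    # Output pixel offsets from the principal point, in pixels.
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    dx = xs - out_w / 2.0
    dy = ys - out_h / 2.0
    r_out = np.hypot(dx, dy)
    theta = np.arctan2(r_out, f_out)      # angle off the optical axis
    r_fish = f_fish * theta               # equidistant model: r = f * theta
    # Scale the output-plane direction vector to the fisheye radius.
    scale = np.divide(r_fish, r_out, out=np.zeros_like(r_out), where=r_out > 0)
    src_x = np.clip(np.round(w_in / 2.0 + dx * scale), 0, w_in - 1).astype(int)
    src_y = np.clip(np.round(h_in / 2.0 + dy * scale), 0, h_in - 1).astype(int)
    return img[src_y, src_x]

# Example: flatten an 800x800 dummy fisheye frame to a 640x480 view.
img = np.random.randint(0, 255, (800, 800, 3), dtype=np.uint8)
flat = fisheye_to_rectilinear(img, f_fish=250.0, f_out=400.0, out_size=(480, 640))
```

Computing the two maps once and reusing them per frame is what makes a real-time software gimbal feasible on embedded hardware.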
Methodology of Implementation
15
3. Testing and System Evaluation
• An appropriately sized UAV and/or UGV will be purchased to test the 3D FPV system.
• The cameras and wireless transceivers will be mounted onto the UAV/UGV, which will be remotely controlled using the 3D FPV system.
• Basic range testing will be done on the wireless video links to evaluate the system's performance and characteristics.
• Finally, the functioning system will be evaluated for viability in different UAV/UGV applications.
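Before field range testing, link latency could first be characterized with a harness like this loopback sketch, which times a frame-sized payload over a local UDP echo standing in for the wireless video link. The use of UDP and the payload size are assumptions for illustration only; the actual transport will depend on the transceivers selected.

```python
import socket
import time

def round_trip_latency(payload=b"x" * 1024, trials=10):
    """Average round-trip time of a payload through a local UDP echo."""
    # "Ground station" side: a UDP socket bound to an ephemeral port.
    echo = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    echo.bind(("127.0.0.1", 0))
    addr = echo.getsockname()

    # "Vehicle" side: sends the payload and waits for the echo.
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        sender.sendto(payload, addr)
        data, client = echo.recvfrom(2048)   # ground station receives...
        echo.sendto(data, client)            # ...and echoes back
        sender.recvfrom(2048)
        samples.append(time.perf_counter() - t0)
    echo.close()
    sender.close()
    return sum(samples) / len(samples)

avg = round_trip_latency()
```

Replacing the loopback address with the real link's endpoints turns the same harness into a range-vs-latency measurement tool.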
Methodology of Implementation
16
Expected Output
• Improve and extend the features of current wide-angle image software-based gimbals for FPV applications on UAVs and UGVs.
• Implement a 3D FPV system with depth perception for unmanned systems.
• Exchange of knowledge and experience in the fields of embedded real-time image processing between peers in the USA and Jordan.
• Publication of conference and journal papers.
• Technological demonstrations to local institutions in Jordan and the USA.
17
Impact
The results may benefit the following governmental and industrial institutions:
• Public Security
• Border Security
• Civil Aviation
• KADDB
• Visual multimedia industry

The project will also build the capacity of researchers in Jordan in the fields of unmanned systems and embedded real-time image processing.
18
Sustainability
• Continued joint project work between colleagues in Jordan and the USA
• Referral and mobility of experts and master's-level students between institutes
• Complementary cooperation between partner labs
• Sharing of expertise with local partners in Jordan and the USA
19
Action Plan
Sequence of planned activities
1. Two researchers from Jordan will visit the partner laboratories in the USA to define the hardware and software designs and agree on work distribution.
2. The partner institutions in Jordan will proceed with the equipment purchasing.
3. Prototype development will start in Jordan in collaboration with US partners.
4. Two researchers from the USA will visit the labs in Jordan to review the progress, adjust objectives, and outline research publications.
5. Initial progress will be presented at conferences in both countries sponsored by the partner universities.
6. The labs in Jordan will finalize the prototype and conduct applied experiments.
7. The labs in Jordan will collect the necessary data for publications.
8. All partners will collaborate on writing journal publications.
9. Partners in Jordan will demonstrate the system to interested local entities via an interactive seminar.
20
Thank You!