Introduction for Time of Flight Camera by Win - Reference From Wiki
Introduction
A human being can generate new ideas from others' suggestions by
communicating with them. Everyone has had the experience of solving a problem
that could not be addressed alone but only through communication with others. It is a
fact that communication with others is an effective way to generate new ideas and
activate creativity. In such cases, the means and information we use to
communicate with others are important. The place where we exchange
communications is also important.
The general means of communication are facial expressions, gestures, and speech.
These are performed face to face. When people are apart from each other, only
text and voice are used for communication. If a system were developed that could send
and receive realistic facial expressions and gestures, we would be able to
communicate with each other more effectively. The important components of such
a system are a natural interface and real-time processing. The pointing gesture is one
of the natural interfaces for man-machine interaction. We believe that a system
becomes more effective for communication if it can find an indicated
object.
Input devices for gesture recognition
The ability to track a person's movements and determine what gestures they may
be performing can be achieved through various tools. Although a large amount of
research has been done on image- and video-based gesture recognition, the tools
and environments used vary between implementations.
Depth-aware cameras : Using specialized cameras such as time-of-flight
cameras, one can generate a depth map of what is being seen through the camera at
short range, and use this data to approximate a 3D representation of the scene.
These cameras are effective for detecting hand gestures because of their short-range
capabilities.
A time-of-flight camera (TOF camera) is a camera system that creates distance
data with the help of the time-of-flight (TOF) principle. The scene is illuminated by
short light pulses, and the camera measures the time taken until the reflected light
reaches the camera again. This time is directly proportional to the distance. The
camera therefore provides a range value for each pixel. The principle is similar to
that of 3D scanners, with the advantage that the whole scene is captured at the same
time.
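The per-pixel range conversion described above can be sketched in a few lines. This is a minimal illustration of the TOF principle; the function name is illustrative and not taken from any camera SDK.

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_delay(delay_s: float) -> float:
    """Convert a measured round-trip delay (seconds) to distance (meters).

    The light travels to the object and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return C * delay_s / 2.0

# An object roughly 1.5 m away returns light after about 10 ns:
print(round(distance_from_delay(10e-9), 3))  # 1.499
```

A TOF camera effectively evaluates this conversion once per pixel, which is why it yields a full range image in a single exposure.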
Time-of-flight cameras are relatively new devices, as the semiconductor processes
have only recently become fast enough for such devices. The systems cover
ranges of a few meters up to about 40 m. The distance resolution is about 1 cm,
and the lateral resolution is about 200 by 200 pixels. The biggest advantage of the
cameras may be that they provide up to 100 images per second.
Components of the time-of-flight camera
Illumination unit : It illuminates the scene. As the light has to be modulated
at high speeds of up to 100 MHz, only LEDs or laser diodes are feasible.
The illumination normally uses infrared light to make it
unobtrusive.
Optics : A lens gathers the reflected light and images the environment onto
the image sensor. An optical band-pass filter passes only light with the
same wavelength as the illumination unit. This helps suppress background
light.
Image sensor : This is the heart of the TOF camera. Each pixel measures the
time the light has taken to travel from the illumination unit to the object and
back. The image sensor is built similarly to other image sensors except for
the pixel, which is much more complicated: it contains two or more fast
shutters to sample the incoming light at given points in time. Because of this
functionality, TOF pixels, in contrast to pixels in current 2D digital
cameras, have large pixel sizes of up to 100 micrometers.
Driver electronics : Both the illumination unit and the image sensor have to
be controlled by high-speed signals. These signals have to be very accurate
to obtain a high resolution. For example, if the signals between the
illumination unit and the sensor shift by only 10 picoseconds, the measured
distance changes by 1.5 mm. For comparison: current CPUs reach frequencies of up
to 3 GHz, corresponding to clock cycles of about 300 ps; the corresponding
distance 'resolution' is only 45 mm.
Computation/Interface : The distance is calculated directly in the camera.
To obtain good performance, some calibration data is also used. The camera
then provides a distance image over a USB or Ethernet interface.
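The two-shutter sampling described above for the image sensor can be turned into a distance estimate. The sketch below uses a common two-window scheme and assumes the second shutter window is delayed by exactly one pulse width; real pixel designs may differ, and the function name is illustrative.

```python
def pulsed_tof_distance(q1: float, q2: float, pulse_width_s: float) -> float:
    """Estimate distance from two shutter samples of a pulsed TOF pixel.

    q1: charge collected while the shutter overlaps the emitted pulse
    q2: charge collected in a second window delayed by one pulse width

    The farther the object, the later the reflected pulse arrives and
    the more of it falls into the second window, so the fraction
    q2 / (q1 + q2) grows linearly with distance.
    """
    c = 3.0e8  # speed of light in m/s
    return (c * pulse_width_s / 2.0) * (q2 / (q1 + q2))

# A 30 ns pulse gives an unambiguous range of c * 30 ns / 2 = 4.5 m.
# Equal charge in both windows places the object at half that range:
print(pulsed_tof_distance(100.0, 100.0, 30e-9))  # ~2.25 m
```

This also shows why the distance extraction needs so little processing power: per pixel it is a ratio and a multiplication.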
Time of flight principle
The time of flight (TOF) describes the method used to measure the time that it
takes for a particle, object or stream to reach a detector while traveling over a
known distance. In kinematics, TOF is the duration in which a projectile is
travelling through the air. Given the initial velocity u of the particle, the downward
(i.e. gravitational) acceleration a, and the projectile's angle of projection θ
(measured relative to the horizontal), setting the vertical displacement in the SUVAT
equation

    s = u t sin θ − (1/2) a t²

to zero (the projectile lands at its launch height) and rearranging results in

    t = (2 u sin θ) / a

for the time of flight of a projectile.
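The rearrangement can be checked numerically. This is a small sketch; the function name is illustrative.

```python
import math

def projectile_time_of_flight(u: float, theta_deg: float, a: float = 9.81) -> float:
    """Time of flight t = 2 * u * sin(theta) / a for a projectile that
    is launched and lands at the same height (vertical displacement
    s = u*t*sin(theta) - (1/2)*a*t^2 set to zero)."""
    return 2.0 * u * math.sin(math.radians(theta_deg)) / a

# Launch at 20 m/s and 30 degrees: t = 2 * 20 * 0.5 / 9.81
print(round(projectile_time_of_flight(20.0, 30.0), 2))  # 2.04
```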
The simplest version of a time-of-flight camera uses light pulses. The illumination
is switched on for a very short time; the resulting light pulse illuminates the scene
and is reflected by the objects. The camera lens gathers the reflected light and
images it onto the sensor plane. Depending on the distance, the incoming light
experiences a delay. As light travels at c = 300,000,000 meters per second, this
delay is very short: an object 2.5 m away delays the light by

    t = 2 · 2.5 m / 300,000,000 m/s ≈ 16.7 ns.
Time-of-flight measurements are often used to measure distance,
e.g. with a laser range finder, used for instance in an airplane, possibly in the form of a
scanning laser radar. Here, an apparatus sends out a short optical pulse and
measures the time until a reflected portion of the pulse is detected. The distance
is then calculated using the velocity of light. Because of this high velocity, the temporal
accuracy must be very high: e.g. 1 ns for a spatial accuracy of 15 cm.
The time-of-flight method is typically used for large distances such as hundreds of
meters or many kilometers. Using advanced techniques (involving high-quality
telescopes, highly sensitive photo detection, etc.), it is possible to measure e.g. the
distance between the Earth and the Moon with an accuracy of a few centimeters, or
to obtain a precise profile of a dam. Typical accuracies of simple devices for short
distances are a few millimeters or centimeters.
Advantages
Simplicity
In contrast to stereo vision or triangulation systems, the whole system is very
compact: the illumination is placed just next to the lens, whereas the other systems
need a certain minimum baseline. In contrast to laser scanning systems, no
mechanical moving parts are needed.
Efficient distance algorithm
It is very easy to extract the distance information out of the output signals of the
TOF sensor; therefore this task uses only a small amount of processing power,
again in contrast to stereo vision. After the distance data has been extracted, object
detection is also easy to carry out because the algorithms are not disturbed by
patterns on the object.
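For continuous-wave TOF sensors (modulated at tens of MHz, as described for the illumination unit), the distance extraction is commonly done with the standard four-bucket demodulation. The sketch below assumes samples taken at 0°, 90°, 180° and 270° of the modulation period; it is a generic illustration, not any vendor's API.

```python
import math

def cw_tof_distance(a0: float, a1: float, a2: float, a3: float,
                    mod_freq_hz: float = 20e6) -> float:
    """Distance from four phase samples of a continuous-wave TOF pixel.

    a0..a3 are correlation samples taken at 0, 90, 180 and 270 degrees
    of the modulation period. The phase shift of the reflected signal,
    recovered with an arctangent, is proportional to distance.
    """
    c = 3.0e8  # speed of light in m/s
    phase = math.atan2(a1 - a3, a0 - a2)   # in [-pi, pi]
    phase %= 2.0 * math.pi                 # wrap to [0, 2*pi)
    return c * phase / (4.0 * math.pi * mod_freq_hz)

# At 20 MHz the unambiguous range is c / (2f) = 7.5 m. A phase shift
# of 90 degrees (samples 2, 3, 2, 1) puts the object at a quarter of
# that range:
print(cw_tof_distance(2.0, 3.0, 2.0, 1.0))  # ~1.875 m
```

Per pixel this is two subtractions, one arctangent and one scaling, which is why the task uses so little processing power compared with stereo matching.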
Speed
Time-of-flight cameras are able to measure the distances within a complete scene
with one shot. As the cameras reach up to 100 frames per second, they are ideally
suited to be used in real-time applications.
Disadvantages
Background light
Although most of the background light coming from artificial lighting or the sun is
suppressed, the pixel still has to provide a high dynamic range. The background
light also generates electrons, which have to be stored. For example, the
illumination units in today's TOF cameras can provide an illumination level of
about 1 watt, while the Sun delivers an illumination power of about 50 watts per
square meter after the optical band-pass filter. Therefore, if the illuminated scene
has a size of 1 square meter, the light from the sun is 50 times stronger than the
modulated signal.
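The factor of 50 follows directly from the numbers given above; the short sketch below just restates that arithmetic (the values are the ones from the text, not measurements).

```python
# Background-light budget for a TOF camera, using the figures above.
illumination_power_w = 1.0   # active illumination of the camera
sun_power_w_per_m2 = 50.0    # sunlight remaining after the band-pass filter
scene_area_m2 = 1.0          # size of the illuminated scene

background_w = sun_power_w_per_m2 * scene_area_m2
ratio = background_w / illumination_power_w
print(ratio)  # 50.0 -> sunlight is 50x stronger than the modulated signal
```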
Applications
Range image of a human face captured with a TOF-camera
Human-machine interfaces / Gaming As time-of-flight cameras provide distance
images in real time, it is easy to track the movements of humans. This allows new
interactions with consumer devices such as televisions. Another application is using
this type of camera to interact with games on video game consoles.
Measurement / Machine vision
Range image with height measurements
Other applications are measurement tasks, e.g. measuring the fill height in silos. In
industrial machine vision, the time-of-flight camera helps classify objects and
helps robots find items, for instance on a conveyor. Door controls can
easily distinguish between animals and humans reaching the door.
Robotics Another use of these cameras is in the field of robotics: mobile robots can
build up a map of their surroundings very quickly, enabling them to avoid
obstacles or follow a leading person. As the distance calculation is simple, only
little computational power is needed.
Commercial products
CanestaVision - TOF modules and software by Canesta
Photonic Mixer Device - technology by PMD Technologies
SwissRanger SR4000 - an industrial TOF-only camera by Mesa Imaging,
available with Ethernet or USB interfaces (available now)
ZCam - a consumer USB camera incorporating a full colour camera as
well as time-of-flight, based on their own sensor chip (scheduled for release late 2009)
References
Wikipedia
High Resolution Segmentation with a Time-of-Flight 3D-Camera using the Example of a Lecture Scene, Neven Santrac, Gerald Friedland, Raul Rojas
A 3D Time of Flight Camera for Object Detection, Dr.-Ing. Thorsten Ringbeck, Head of BU Systems; Dipl.-Ing. Bianca Hagebeuker, Product Marketing (Contact Author)
Performance of a Time-of-Flight Range Camera for Intelligent Vehicle Safety Applications, S. Hsu, S. Acharya, A. Rafii and R. New, Canesta, Inc.
Integrating 3D Time-of-Flight Camera Data and High Resolution Images for 3DTV Applications, Benjamin Huhle, Sven Fleck, University of Tübingen
3D Time-of-Flight Cameras for Mobile Robotics, Stefan May, Björn Werner, Hartmut Surmann and Kai Pervölz, Fraunhofer Institute for Autonomous Intelligent Systems (AIS)
Cluster Tracking with Time-of-Flight Cameras, Dan Witzner Hansen, Mads Syska Hansen, Martin Kirschmeyer, Rasmus Larsen, Davide Silvestre, Informatics and Mathematical Modelling, Technical University of Denmark