The trouble with existing 3D imaging technology is that – at the
consumer level, at least – it tends to struggle with distances beyond a
few feet. Put even a third of the width of a basketball court between
yourself and a Microsoft Kinect sensor, for instance, and it won't pick
up your movements at all. Researchers at the University of California,
Berkeley, claim to have developed a Lidar (light detection and ranging) system that can remotely sense objects at distances of up to 30 feet (about 9 m), which could have widespread benefits in fields as diverse as
entertainment, transportation, robotics, and mobile phones.
The system, which is to be presented at the CLEO 2014 conference
in San Jose next week, combines frequency-modulated continuous-wave
Lidar with something called MEMS (micro-electro-mechanical system) tunable VCSELs (vertical-cavity surface-emitting lasers). That mouthful essentially means that it emits "frequency-chirped" light (that is, laser light of steadily increasing or decreasing frequency) from a low-power laser tuned to the natural resonance frequency of the MEMS mirror, which greatly amplifies the signal without increasing power drain. Measuring the difference in frequency between the outgoing light and the light reflected back allows the system to gauge the distance of an object from the sensor.
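For readers who want to see how a frequency chirp turns into a distance, here is a minimal Python sketch of the frequency-modulated continuous-wave ranging calculation. The sweep bandwidth, chirp duration, and target range in it are illustrative assumptions, not parameters of the Berkeley device.

```python
# Minimal sketch of FMCW (frequency-modulated continuous-wave) ranging.
# The sweep bandwidth, chirp duration, and target range below are
# illustrative assumptions, not parameters of the Berkeley system.

C = 3.0e8  # speed of light (m/s)


def fmcw_range(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Distance implied by a measured beat frequency for a linear chirp.

    The outgoing light sweeps `bandwidth_hz` over `chirp_s` seconds, so its
    frequency changes at a rate B/T. Light returning from range R arrives
    delayed by 2R/c, and mixing it with the outgoing light gives a beat
    frequency f_b = (B/T) * (2R/c). Solving for R gives the line below.
    """
    return C * beat_hz * chirp_s / (2.0 * bandwidth_hz)


if __name__ == "__main__":
    B = 100e9   # optical frequency sweep of 100 GHz (assumed)
    T = 1e-3    # 1 ms chirp duration (assumed)
    R = 9.14    # 30 ft expressed in metres

    f_beat = (B / T) * (2 * R / C)  # beat frequency such a target would produce
    print(f"beat frequency at 30 ft: {f_beat / 1e6:.2f} MHz")
    print(f"range recovered from it: {fmcw_range(f_beat, B, T):.2f} m")
```

With these assumed numbers, an object 30 ft away produces a beat note of roughly 6 MHz, which is why the distance can be read out from frequency rather than from timing individual light pulses.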
The technology is seen as a smaller, less power-hungry alternative to current high-resolution Lidar imaging systems, which come in big, bulky boxes and typically have their operating range and accuracy limited by Brownian noise – a kind of signal noise produced by Brownian motion, the random jostling of small particles as they collide with the molecules of the surrounding air. Brownian motion causes excessive noise in the beat frequency – the difference between the frequencies of the outgoing and returning light, which encodes an object's distance – making readings unreliable over longer distances, but this effect is greatly reduced by tuning the laser to the MEMS mirror's mechanical resonance frequency.
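To make the resonance point concrete: in FMCW ranging, the achievable range resolution is roughly c/(2B), where B is the optical bandwidth swept during the chirp, and a fixed amount of beat-frequency noise produces a range error that shrinks in the same way as the sweep widens. Driving the MEMS mirror at resonance lets it sweep a much wider bandwidth for the same drive power. The Python sketch below illustrates that scaling; the quality factor, off-resonance sweep, chirp duration, and noise level are assumed figures for illustration, not measurements from the Berkeley system.

```python
# Sketch of how a wider frequency sweep suppresses the effect of
# beat-frequency noise. The quality factor, off-resonance sweep, and
# noise figure are illustrative assumptions, not measured values.

C = 3.0e8  # speed of light (m/s)


def range_error(beat_noise_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range error caused by beat-frequency noise in a linear FMCW chirp.

    From R = c * f_b * T / (2B), an error df in the beat frequency maps to
    a range error dR = c * df * T / (2B): the wider the sweep B, the
    smaller the error for the same amount of noise.
    """
    return C * beat_noise_hz * chirp_s / (2.0 * bandwidth_hz)


if __name__ == "__main__":
    T = 1e-3          # chirp duration (s), assumed
    noise_hz = 50e3   # beat-frequency jitter attributed to Brownian motion (Hz), assumed
    B_off = 10e9      # swept bandwidth with the mirror driven off resonance (Hz), assumed
    Q = 100           # mechanical quality factor of the MEMS mirror, assumed
    B_res = Q * B_off  # idealized Q-fold wider sweep when driven at resonance

    for label, B in (("off resonance", B_off), ("at resonance ", B_res)):
        print(f"{label}: resolution limit ~{C / (2 * B) * 100:.3f} cm, "
              f"noise-induced range error ~{range_error(noise_hz, B, T) * 100:.3f} cm")
```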
This new system will now be scaled down, and the researchers expect that it will fit on a microchip, making it possible to build Kinect-like gesture control into devices such as mobile phones: silencing your ringtone from 30 ft away, say, or playing virtual tennis in an area approaching the size of a real tennis court.
Lead researcher Behnam Behroozpour believes the system also has "a host
of new applications that have not even been invented yet." But its
utility seems most apparent in fields related to robotics: as a safety measure that lets self-driving cars spot hazards half a block away, or as a way to improve the depth sensing of drones and other autonomous or semi-autonomous robots.
Source: The Optical Society