A team from Arizona State University and the University of Arizona has developed an approach to automatically collect geo-referenced images from remote cameras, in real time, for traffic management. The approach integrates the imagery with information on the height and GPS location of the camera. Combining this camera data with a geographic (latitude-longitude) representation of the area to be monitored yields an explicit way to geo-reference the observed road and vehicle locations. Absolute vehicle positions, speeds, accelerations, decelerations, and lane changes can then be determined. Initial experiments show promise in geo-referencing the airborne imagery.
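The core geometric idea — mapping an image pixel to a ground latitude-longitude using the camera's GPS position and height — can be sketched as below. This is an illustrative simplification, not the team's actual algorithm: it assumes flat terrain and a camera pointing straight down, and the function and parameter names are hypothetical. A real system would also correct for the camera's roll, pitch, and yaw.

```python
import math

def pixel_to_latlon(px, py, cam_lat, cam_lon, cam_alt_m,
                    focal_px, img_w, img_h):
    """Map an image pixel to a ground lat/lon for a nadir-pointing camera.

    Assumes flat terrain and a camera looking straight down at altitude
    cam_alt_m, with focal length focal_px expressed in pixels.
    """
    # Ground distance covered by one pixel (ground sample distance), in metres.
    gsd = cam_alt_m / focal_px
    # Offset of the pixel from the image centre, converted to metres on the ground.
    east_m = (px - img_w / 2) * gsd
    north_m = -(py - img_h / 2) * gsd  # image y grows downward
    # Convert metre offsets to degree offsets (small-offset approximation;
    # one degree of latitude is roughly 111,320 m).
    lat = cam_lat + north_m / 111_320.0
    lon = cam_lon + east_m / (111_320.0 * math.cos(math.radians(cam_lat)))
    return lat, lon
```

With the camera's own position known from GPS, the pixel at the image centre maps back to the camera's latitude and longitude, and every other pixel maps to an absolute ground coordinate.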
Using this technology, the ASU-UA team has developed prototype software to extract individual vehicle trajectories from aerial video. The software identifies individual vehicles and tracks their movement across consecutive images. Knowing the pixel coordinates and the approximate scale of the image, vehicle trajectories (in distance and time) can easily be determined. The ASU-UA team has demonstrated this technique using aerial videos in Tucson and Phoenix. Data sets of vehicle trajectories can also be used for calibration and validation of microscopic traffic simulation models. Such simulation models can then be used for investigating possible roadway improvements or to better explain existing and likely future traffic conditions.
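The trajectory step rests on a simple calculation: once a vehicle's pixel position is known in each frame, the image scale converts pixel displacement to metres, and the frame rate converts frames to seconds. A minimal sketch, with hypothetical names and assuming a fixed metres-per-pixel scale:

```python
import math

def trajectory_speeds(track, metres_per_pixel, fps):
    """Per-interval speeds (m/s) from a vehicle's pixel track.

    track: list of (x, y) pixel positions in consecutive video frames.
    metres_per_pixel: approximate ground scale of the imagery.
    fps: video frame rate (frames per second).
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        # Pixel displacement between frames, converted to metres.
        dist_m = math.hypot(x1 - x0, y1 - y0) * metres_per_pixel
        # Distance per frame times frames per second gives metres per second.
        speeds.append(dist_m * fps)
    return speeds
```

Differencing the resulting speed series in the same way would give accelerations and decelerations, and lateral displacement relative to the lane geometry would flag lane changes.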
In addition, this approach is viable for collecting aggregate traffic measures (delay, density, flow, speed, etc.), which are extremely useful for traffic managers in assessing how freeways and surface streets are performing, especially when other ground sensors are absent or disabled.
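Such aggregate measures fall out of the trajectory data directly. One standard way to compute them — shown here as an illustrative sketch, not necessarily the team's method — is Edie's generalized definitions, which derive flow, density, and space-mean speed from the total distance travelled and total time spent by all vehicles within a space-time region:

```python
def segment_flow_density_speed(trajectories, segment_len_m, obs_time_s):
    """Edie's generalized flow, density, and space-mean speed.

    trajectories: per-vehicle (distance_travelled_m, time_spent_s)
    measured within a road segment of length segment_len_m observed
    for obs_time_s seconds.
    """
    total_dist = sum(d for d, _ in trajectories)   # vehicle-metres
    total_time = sum(t for _, t in trajectories)   # vehicle-seconds
    area = segment_len_m * obs_time_s              # space-time region size
    flow = total_dist / area                       # vehicles per second
    density = total_time / area                    # vehicles per metre
    speed = total_dist / total_time if total_time else 0.0  # metres per second
    return flow, density, speed
```

For example, two vehicles each covering a 100 m segment in 10 s during a 60 s observation yield a space-mean speed of 10 m/s, with flow and density consistent via the identity flow = density × speed.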
The CIDSE researchers are led by Professors Pitu Mirchandani and Ronald Askin. Civil Engineering Professor Mark Hickman leads the UA researchers. The U.S. Department of Transportation has been supporting the research. Also, the German Aerospace Agency (DLR) in Berlin, Germany, collaborates with the team. Led by Dr. Reinhart Kühne and Martin Ruhé, DLR researchers have developed an integrated platform that can be flown on fixed-wing planes and helicopters to acquire and transmit images at five frames per second and subsequently, also in real time, determine traffic parameters such as speeds and densities, and individual vehicle trajectories.
However, their system is expensive, especially because it requires a high-resolution professional camera and a high-precision inertial measurement unit (IMU) that very accurately localizes the camera. The ASU-UA approach instead uses a consumer camera, albeit a high-end one, to capture images, and accurately localizes the captured images on a geo-referenced map using fast image processing algorithms. Success of this research could lead to cameras on UAVs that could be used not only for monitoring daily traffic congestion and incidents, but also for assisting in managing traffic during evacuations.
Mirchandani states, “This could revolutionize how well and how fast we respond to traffic congestion and incidents. I can imagine a future where a fleet of equipped UAVs will be available for dispatching to any location at any time, to assist first responders to get to incidents fast and assist traffic managers to proactively manage traffic.”