Bob Droning On…


Bob Moore recently gave a talk to the local branch of the British Computer Society on the application of drone technology to landscape architecture site survey. To a large audience of mainly computer specialists, he first outlined the nature and relevance of site survey and analysis, taking as his example a site in the Forest of Dean currently being used for a student ‘sustainable technology’ project. Focusing on landform and vegetation, Bob then showed how remotely controlled drones (more properly described as unmanned aerial vehicles, or UAVs) capture Lidar and multi-spectral data that can be processed into accurate ground elevation models (Digital Terrain Models, DTMs), 3D surface models including tree canopies (Digital Surface Models, DSMs) and ‘classified’ vegetation cover maps.
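To give a rough sense of how the two Lidar-derived surfaces relate (this is an illustration only, not the processing chain used in the project), a canopy height layer can be obtained simply by differencing a DSM and a DTM. The small grids below use made-up elevation values in place of real rasters:

import numpy as np

# Hypothetical elevation grids (metres above datum) standing in for real
# Lidar-derived rasters; in practice these would be read from GeoTIFF files.
dtm = np.array([[101.2, 101.5, 101.9],
                [101.0, 101.4, 101.8],
                [100.8, 101.1, 101.6]])   # bare-earth Digital Terrain Model
dsm = np.array([[101.2, 113.0, 114.5],
                [101.0, 112.4, 101.8],
                [100.8, 101.1, 115.2]])   # Digital Surface Model (canopy and building tops)

# Canopy (or object) height model: the difference of the two surfaces.
chm = dsm - dtm
print(np.round(chm, 1))
# Cells with large positive values correspond to tree canopy or structures;
# near-zero cells are open ground.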

Research into this technology is being actively pursued in the department. To date it has produced only high-resolution full-colour imagery of the site (see Fig 1, left), but even this demonstrates the clear benefits: information that is up to date, at better resolution than Google Earth, and cheaper than commercially available imagery.

The quadcopter UAV has a limited flight duration (a maximum of 10 minutes), but for a small site this is adequate to produce an informative contextual video, from which Fig 2 is a still. Lidar (light detection and ranging) is a remote sensing method that uses a pulsed laser to measure distances, from which precise 3D information can be derived (Fig 3). Google Earth uses this technology to good effect in its Ground Level View (as distinct from Street View, Fig 4).
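The underlying range measurement is straightforward: the laser pulse's round-trip time is converted to a distance using the speed of light. A minimal sketch, with a hypothetical return time chosen purely for illustration:

# Lidar range from pulse time-of-flight: the pulse travels out and back,
# so the one-way distance is half the round-trip time times the speed of light.
c = 299_792_458.0          # speed of light, m/s
round_trip_time = 300e-9   # 300 nanoseconds round trip (hypothetical return)
distance = c * round_trip_time / 2
print(f"{distance:.1f} m")  # roughly 45.0 m to the reflecting surface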

Multi-spectral imagery provides data at different frequencies across the electro-magnetic spectrum, including the infra-red, which is strongly reflected by healthy green vegetation and therefore of much interest to landscape architects. Using specialist software, individual ‘targets’ (soils, roads, rocks, vegetation…) with their own characteristic spectral responses can be inferred from the digital radiance values, and a surface cover map can be produced (Fig 5). The next phase of the research will explore the potential for finer species identification using UAV data, which should be considerably more discriminating than the 30 m resolution Landsat data depicted here.
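The specialist software referred to above typically uses supervised classification across several bands, but the core idea can be sketched with the well-known NDVI (Normalised Difference Vegetation Index) and a crude threshold. The reflectance values below are made up for illustration:

import numpy as np

# Hypothetical red and near-infra-red reflectance values for a few pixels
# (real data would come from the multi-spectral sensor bands).
red = np.array([0.08, 0.30, 0.10, 0.25])
nir = np.array([0.50, 0.32, 0.45, 0.27])

# NDVI: healthy vegetation reflects strongly in the near infra-red and
# absorbs red light, so vegetated pixels score high.
ndvi = (nir - red) / (nir + red)

# A crude cover map: threshold NDVI to separate vegetation from other targets.
cover = np.where(ndvi > 0.3, "vegetation", "non-vegetation")
print(np.round(ndvi, 2), cover)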

two_images

Fig 1 UAV imagery (left) vs Google Earth (right) – pixel resolution 2 cm and 10 cm respectively

 

drone1

Fig 2 Perspective visualisation: UAV video still showing fish-eye effect

dsm_from_uav

Fig 3 Digital Surface Model derived from Lidar data

google_lidar

Fig 4 Google Earth 3D buildings visualisation of university campus

classified_veg

Fig 5 Surface cover map ‘classified’ from multi-spectral Landsat image: red shades represent mainly forest species; blue = urban/built environment

phantom

Fig 6 Phantom 6 quadcopter with installed GoPro Hero 3 camera: controlled from FPV (First Person View) screen and iPad ‘groundstation’ app

 

 
