Do you also sometimes get maps which contain zero (0) rather than NULL (no data) in some parts of the map? This can be easily solved with “floodfilling”, even in a GIS.

My original map looks like this (here, Trentino elevation model):

The light blue parts should be no data (NULL) rather than zero (0)...

Now what? In paint software we would simply use the bucket fill tool, but what about GIS data? Well, we can do something similar using “clumping”. It requires a bit of computational time but works perfectly, even for large DEMs, e.g., all of Italy at 20m resolution. Using the open source software GRASS GIS 7, we can compute all “clumps” (and there are many for a floating point DEM!):

# first we set the computational region to the raster map:
g.region rast=pat_DTM_2008_derived_2m -p
r.clump pat_DTM_2008_derived_2m out=pat_DTM_2008_derived_2m_clump

The resulting clump map produced by r.clump is nicely colorized:

Clumped map derived from DEM (generated with r.clump)

As we can see, the area of interest (province) is now surrounded by three clumps. With a simple map algebra statement (r.mapcalc or GUI calculator) we can create a MASK by assigning these outer boundary clumps to NULL and the other “good” clumps to 1:

r.mapcalc "no_data_mask = if(pat_DTM_2008_derived_2m_clump == 264485050 || \
  pat_DTM_2008_derived_2m_clump == 197926480 || \
  pat_DTM_2008_derived_2m_clump == 3, null(), 1)"

This mask map looks like this:

Mask map from all clumps except for the large outer clumps

We now activate this MASK and write a copy of the original map to a new map name using map algebra again (this just keeps the data matched by the MASK). Finally, we remove the MASK and verify the result:

# apply the mask
r.mask no_data_mask
# generate a copy of the DEM, filter on the fly
r.mapcalc "pat_DTM_2008_derived_2m_fixed = pat_DTM_2008_derived_2m"
# assign a nice color table
r.colors pat_DTM_2008_derived_2m_fixed color=srtmplus
# remove the MASK
r.mask -r

And the final DEM is now properly cleaned up in terms of NULL values (no data):

DEM cleaned up for no data

Enjoy.

banner_lidar


LiDAR point cloud data are commonly delivered in the ASPRS LAS format. The format is supported by libLAS, a BSD-licensed C++ library for reading/writing these data. GRASS GIS 7 supports the LAS format directly when built against libLAS (as is the case for most binary packages available for download).

In this exercise we will import a LAS data set covering a tiny area close to Raleigh, NC (USA), which belongs to the North Carolina sample data set. Sample LAS data download: https://grass.osgeo.org/sampledata/north_carolina/ (25MB).

For a full exercise, however, we will assume that no GRASS GIS location exists yet (so: newbies are welcome!) and create a new one first.

1. Having the LAS file: now what?

First, check the metadata of the LAS file using the lasinfo command (it comes with libLAS; only parts of the output are shown here):

lasinfo points.las
---------------------------------------------------------
  Header Summary
---------------------------------------------------------
  Version:  1.2
  Source ID:  0
  Reserved:  0
  Project ID/GUID:  '00000000-0000-0000-0000-000000000000'
  System ID:  'libLAS'
  Generating Software:  'libLAS 1.2'
[...]
  Spatial Reference:
None
[...]
---------------------------------------------------------
  Point Inspection Summary
---------------------------------------------------------
  Header Point Count: 1287775
  Actual Point Count: 1287775

  Minimum and Maximum Attributes (min,max)
---------------------------------------------------------
  Min X, Y, Z:   6066629.86, 2190053.45, -3.60
  Max X, Y, Z:   6070237.92, 2193507.74, 906.00
  Bounding Box:  6066629.86, 2190053.45, 6070237.92, 2193507.74
  Time:  0.000000, 0.000000
  Return Number:  1, 3
  Return Count:  1, 7
  Flightline Edge:  0, 0
  Intensity:  0, 256
  Scan Direction Flag:  0, 0
  Scan Angle Rank:  0, 0
  Classification:  2, 7
  Point Source Id:  0, 0
  User Data:  0, 0
  Minimum Color (RGB):  0 0 0 
  Maximum Color (RGB):  0 0 0 

  Number of Points by Return
---------------------------------------------------------
  (1) 1225886  (2) 61430  (3) 459

  Number of Returns by Pulse
---------------------------------------------------------
  (1) 30877  (2) 153  (5) 1225886  (6) 30706  (7) 153

  Point Classifications
---------------------------------------------------------
  647337 Ground (2) 
  639673 Low Vegetation (3) 
  740 Building (6) 
  25 Low Point (noise) (7) 
  -------------------------------------------------------
  0 withheld
  0 keypoint
  0 synthetic
  -------------------------------------------------------

We see: no spatial reference system indicated!
Luckily we know from here that the projection is NAD83(HARN) / North Carolina (LCC 2SP, metric), EPSG code 3358. Furthermore we see:

  • Number of Points by Return: 3 (i.e., first, mid, last)
  • Point Classifications: the points are already classified as “Ground” (class 2),  “Low Vegetation” (3),  “Building” (6), and Low Point (noise) (class 7). Something to play with later.

Time to create a GRASS GIS location and import the LAS file.

2. Creating a GRASS GIS location for the LAS file

Since we know the EPSG code of the projection, that’s an easy task. Please note that GRASS GIS can generate locations directly from Shapefiles (with .prj file), GeoTIFF files and more.

We fire up GRASS GIS 7 and open the Location Wizard:

grass7_startup

In the Location Wizard, we first define a name for the new location:

grass7_loc_wizard1

We select the “EPSG code” method for creating a new location:

grass7_loc_wizard2

You can search for “North Carolina” and select the EPSG code 3358 from the list:

grass7_loc_wizard3

The summary should show up as follows (be sure that the metric projection is shown!):

grass7_loc_wizard4

With “Finish” you reach this notification (indeed, nothing to change! It is already fine):

grass7_loc_wizard5

Since we want to import the LAS file, no need to manually define any region extent here – just say “No”:

grass7_loc_wizard6

While we could also import the data into the PERMANENT mapset, we prefer to create a separate mapset “lasdata” for our LAS data (once you reach hundreds of maps to manage, you will appreciate the concept of mapsets):

grass7_loc_wizard7

Voilà, we get back to the initial startup screen and can now start our GRASS GIS session with our “nc_nad83_lcc” location and “lasdata” mapset within the location: “Start GRASS”!

grass7_loc_wizard8
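
By the way, the same location and mapset can also be created from the command line. A minimal sketch (paths and names are just examples; the exact startup syntax may vary slightly between GRASS 7 versions):

# create the new location from the EPSG code and start GRASS in it
grass70 -c EPSG:3358 ~/grassdata/nc_nad83_lcc
# inside the session, create and switch to the extra mapset
g.mapset -c mapset=lasdata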

3. Import of the LAS file

When creating a new location from a GeoTIFF or Shapefile (or another GDAL supported format), the data set is imported right away. This is not the case for LAS files, also because we can apply binning statistics directly during import (e.g. percentiles, min or max) and create a raster surface from the points right away rather than importing them as vector points.

3. a) Creating a raster surface from LAS during import

The LAS import into a raster surface is available through r.in.lidar:

grass7_las_import1

First, the LAS file needs to be selected and an output file name specified (in this example, we want to extract the 95th percentile as the binning method, hence the map name):

grass7_las_import2

In the “Statistic” tab, we select the “percentile” method along with 95 as value:

grass7_las_import3

In the “Optional” tab we enable extending the computational region from the LAS file and, since the spatial reference system metadata are missing from the LAS file, also “override dataset projection” to use the one predefined in the location. Finally, we set 5m as the desired resolution for the resulting raster map:

grass7_las_import4
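
For reference, the equivalent command line call should look roughly like this (a sketch; the output map name is just an example):

# bin the LAS points into a 5m raster, taking the 95th percentile per cell;
# -e extends the region to the new data, -o overrides the missing projection check
r.in.lidar -e -o input=points.las output=elev_lidar_95percentile \
    method=percentile pth=95 resolution=5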

Upon completion of the import/binning, the new raster elevation map is shown after zooming to it (r.in.lidar -e … restores the previous region settings upon completion, hence the need to zoom):

grass7_las_import5

Now we can start to analyze or visually explore the imported LAS file.

4. Visual LiDAR data exploration

Using the wxNVIZ 3D viewer, we can easily fly through the new DEM. Switch in the Map Display to “3D view” (1). Note that the default rendering is initially done at low visual resolution for speed reasons. You can switch to “Rotate mode” as well to easily navigate with the mouse. In the “Data” tab (2) you can increase the visual resolution (3) to obtain a crisp view:

grass7_las_viz1

 

Now all kinds of analysis steps may follow.

Outlook

For true LiDAR processing as points, see the following GRASS GIS 7 modules: v.in.lidar (for point import), v.lidar.correction, v.lidar.edgedetection, v.lidar.growing, v.outlier, and v.surf.bspline.
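
As a starting point for such point-based processing, the LAS file can be imported as vector points with v.in.lidar; a minimal sketch (the output name is just an example, and -o is again needed because the projection metadata are missing):

# import the LAS returns as 3D vector points
v.in.lidar -o input=points.las output=points_raleigh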


Recently, to plan ahead for a birthday lunch at the agriturismo Malga Brigolina (a farm-restaurant near Sopramonte di Trento, Italy, at 1,000m a.s.l. in the Southern Alps), friends of mine asked me the day before:

Will the place be sunny at lunch time, so that we can take a nice walk?

[well, the weather was close to clear sky conditions but mountains are high here and cast long shadows in winter time].

paganella_rotaliana

A rather easy task, I thought, so I got my tools ready, since it was also an occasion to verify the predictions with some photos! Thanks to the new EU-DEM at 25m I was able to perform the computations right away in a metric system rather than dealing with degrees in LatLong.

Direct sunlight can be assessed from the beam radiation map of GRASS GIS’ r.sun when running it for a specific day and time. Even easier, there is a new addon for GRASS GIS 7 which calculates time series of insolation maps right away, given start/stop timestamps and a time step: r.sun.hourly. This shrinks the overall effort to almost nothing.
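
For comparison, a single time step with plain r.sun might look like the following sketch (slope and aspect are derived first; map names are examples):

# derive slope and aspect from the DEM
r.slope.aspect elevation=eu_dem_25 slope=eu_dem_25_slope aspect=eu_dem_25_aspect
# beam radiation for a single moment: noon on 15 Dec 2013 (DOY 349)
r.sun elevation=eu_dem_25 slope=eu_dem_25_slope aspect=eu_dem_25_aspect \
      day=349 time=12 beam_rad=trento_beam_doy349_1200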

Creating the direct sunlight maps

The first step is to calculate where direct sunlight reaches the ground. Here the input elevation map is the European “eu_dem_25”, while the output is the beam radiation for a certain day (15 Dec 2013 is DOY 349).

Important hint: the computational region must extend far enough to the east, south and west to capture the cast-shadow effects of all relevant surrounding mountains.

I let the calculations start at 8am and finish at 5pm with an hourly time step. The authors have kindly parallelized r.sun.hourly, so I let it run on four processors simultaneously to speed things up:

# calculate DOY (day-of-year) from given date:
date -d 2013-12-15 +%j
349

# calculate beam radiation maps for a given time period
# (note: minutes are to be given in decimal time: 30min = 0.5)
r.sun.hourly elev_in=eu_dem_25 beam_rad=trento_beam_doy349 \
      start_time=8 end_time=17 time_step=1 day=349 year=2013 \
      nprocs=4

The ten resulting maps contain the beam radiation for each pixel, considering the cast-shadow effects of the pixel-surrounding mountains. However, the question was not to calculate irradiance raster maps in Wh/m^2 but simply “sun-yes” or “sun-no”. So a subsequent filtering had to be applied. Each beam radiation map was filtered: if the pixel value equals 0 then “sun-no”, otherwise “sun-yes” (which is what my friends wanted; effectively a conversion into a binary map).
Best done in a simple shell script loop:

[Edit 30 Dec 2013: thanks to Anna, you can simplify the loop below to just the r.colors call, thanks to the new -b flag for binary output in r.sun.hourly!]

for map in `g.mlist rast pattern="trento_beam_doy349*"` ; do
    # rename current map to tmp
    g.rename rast=$map,$map.tmp
    # filter and save with original name
    r.mapcalc "$map = if($map.tmp == 0, null(), 1)"
    # colorize the binary map
    echo "1 yellow" | r.colors $map rules=-
    # remove cruft
    g.remove rast=$map.tmp
done
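
With the -b flag mentioned in the edit note above, the filtering step should become unnecessary and only the colorization remains; a sketch, reusing the parameter names from above:

# let r.sun.hourly produce the binary sun/shadow maps directly
r.sun.hourly -b elev_in=eu_dem_25 beam_rad=trento_beam_doy349 \
      start_time=8 end_time=17 time_step=1 day=349 year=2013 nprocs=4
# colorize all resulting binary maps in yellow
for map in `g.mlist rast pattern="trento_beam_doy349*"` ; do
    echo "1 yellow" | r.colors $map rules=-
done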

As a result we got ten binary maps, ideal for use as overlays on shaded DEMs or OpenStreetMap layers. The areas exposed to direct sunlight are shown in yellow.

Trento direct sunlight 15 Dec 2013 Animation

Trento, direct sunlight, 15 Dec 2013 between 10am and 5pm (See here for creating an animated GIF). Quality reduced for this blog.
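
The animation recipe itself is linked above; one simple alternative is to export each hourly map to PNG and assemble the frames with ImageMagick. A sketch (output file names are examples):

# export each binary sunlight map with its color table to PNG
for map in `g.mlist rast pattern="trento_beam_doy349*"` ; do
    r.out.png input=$map output=$map.png
done
# assemble the PNG frames into an animated GIF (requires ImageMagick)
convert -delay 100 -loop 0 trento_beam_doy349*.png trento_sunlight_doy349.gif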

Time to look at some details: So, is Malga Brigolina in sunlight at lunch time?

Situation at 12pm (noon): predicted that the restaurant is still in shadow – confirmed in the photo:

malga_brigolina_direct_sunlight_15dec2013_12pm_foto

(click to enlarge)

Situation at 1:30/2:00pm: the sun is getting closer to the Malga, as confirmed in the photos (note that the left photo is 20min ahead of the map). The small street in the right photo is still in the shadow, as predicted in the map:

malga_brigolina_direct_sunlight_15dec2013_14pm_foto

(click to enlarge)

Situation at 3:00pm: sun here and there at Malga Brigolina:

malga_brigolina_direct_sunlight_15dec2013_15pm_3D

Sunlight map blended with OpenStreetMap layer (r.blend + r.composite) and draped over the DEM in wxNVIZ of GRASS GIS 7 (click to enlarge). The Sunday walk path around Malga Brigolina is the blue/red vector line shown in the view center.
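
The blending mentioned in the caption can also be scripted; a rough sketch with hypothetical map names (the actual name of the 3pm map depends on the r.sun.hourly output naming):

# blend the 3pm sunlight map with the OSM raster, 50:50
r.blend first=sunlight_15pm second=osm_layer output=sun_osm percent=50
# r.blend writes three channel maps (.r, .g, .b); combine them into one RGB map
r.composite red=sun_osm.r green=sun_osm.g blue=sun_osm.b output=sun_osm_rgb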

Situation at 4:00pm: we are close to sunset in Trentino… view towards the Rotaliana (Mezzocorona, S. Michele all’Adige), last sunlit summits also seen in photo:

rotaliana_direct_sunlight_15dec2013_16pm_foto

(click to enlarge)

Outcome

The resulting sunlight/shadow maps appear to match reality nicely. Perhaps r.sun.hourly should get an additional flag to generate the binary “sun-yes”/“sun-no” maps directly.

trento_direct_sunlight_15dec2013_15pm_3D

Direct sunlight zones (yellow, 15 Dec 2013, 3pm): Trento with Monte Bondone, Paganella, Marzolan, Lago di Caldonazzo, Lago di Levico and surroundings (click to enlarge)

GRASS GIS usage note

The wxGUI settings were as simple as this (note the transparency values for the various layers):

trento_direct_sunlight_wxGUI

Data sources:

Inspired by Vaclav Petras’ posting “Did you know that you can see streets of downtown Raleigh in elevation data from the NC sample dataset?”, I wanted to try the new GRASS GIS 7 addon r.shaded.pca, which creates shades from various directions and combines them into RGB composites, just to see what happens when using the new EU-DEM at 25m.

To warm up, I registered the “normally” shaded DEM (previously generated with gdaldem) with r.external in a GRASS GIS 7 location (EPSG 3035, LAEA) and overlaid the OpenStreetMap layer via WMS with GRASS 7’s r.in.wms. An easy task thanks to the University of Heidelberg’s www.osm-wms.de. Indeed, they offer a similar shading via WMS, but in the screenshot below you see the new EU data being used so that we control the lighting ourselves:

OpenStreetMap shaded with EU DEM 25m

OpenStreetMap shaded with EU DEM 25m (click to enlarge)
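
In command form, this warm-up looks roughly as follows (a sketch; the shaded GeoTIFF name is an example, and the actual WMS endpoint and layer name have to be taken from www.osm-wms.de):

# register the externally shaded GeoTIFF without importing it
r.external input=eu_dem_25_shaded.tif output=eu_dem_25_shaded
# then fetch the OpenStreetMap layer via WMS, along the lines of:
# r.in.wms url=<OSM WMS endpoint> layers=<layer name> output=osm_overlay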

Next item: trying r.shaded.pca… It supports multi-core computation and the option to strengthen the effect through z-rescaling. In my example, I used:

r.shaded.pca input=eu_dem_25 output=eu_dem_25_shaded_pca nproc=3 zmult=50

This leads to a colorized hillshade map, again with the OSM data on top (50% transparency):

eu_dem_25m_PCA_shaded_OSM_trento_rovereto_garda_lake

OpenStreetMap shaded with r.shaded.pca using EU DEM 25m (click to enlarge)

Yes, fun – I like it :-)

Data sources:

eu_dem_upper_garda_lake_riva_arco_italy

EU DEM 25m upper Garda Lake area with Riva del Garda and Arco (Italy). 3D view in wxNVIZ – GRASS GIS 7

The 25m European Digital Elevation Model (EU-DEM, Version 1) is a Digital Surface Model (DSM) representing the first surface as illuminated by the sensors:

eu_dem_s_michele_rotaliana_italy

EU DEM Rotaliana with Mezzocorona and S. Michele (Italy). Produced using Copernicus data and information funded by the European Union – EU-DEM layers.

Its elevations were captured at 1 arc-second postings (2.78E-4 degrees). The tiles are provided at 25m resolution in the EU-LAEA (EPSG 3035) projection, temporal coverage: 2000, published in Oct 2013. It is a realisation of the Copernicus programme, managed by the European Commission, DG Enterprise and Industry. Metadata are provided here. According to the “Methodology” page it is a hybrid product based on SRTM and ASTER GDEM data fused by a weighted averaging approach, and it has been generated as a contiguous dataset divided into 1 degree by 1 degree tiles, corresponding to the SRTM naming convention. In addition to the DEM data, a colour shaded relief image over Europe is provided.

From the metadata page: “The EU-DEM data are provided as is, i.e. without a formal validation yet. An independent statistical validation is scheduled as part of the GIO land monitoring service activities, and will be made available in the course of 2014.”

Data download

Note that the GeoTIFF files are large, up to 5 GB:

Data import

The data come as ZIP compressed files, so they need to be unzipped first (or simply use GDAL’s handy “vsizip” driver).

Hint for GRASS GIS users: instead of importing the data, you can use the r.external command to register the GeoTIFF DEM file within an EU LAEA projected location.
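
A minimal sketch of both options (the tile name is hypothetical):

# inspect a tile directly inside the ZIP archive via GDAL's vsizip driver
gdalinfo /vsizip/eu_dem_tile.zip/eu_dem_tile.tif
# register the (unzipped) GeoTIFF in an EU LAEA (EPSG 3035) location without import
r.external input=eu_dem_tile.tif output=eu_dem_25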

Enjoy!

eu_dem_trento_adige_s_michele_italy

USGS has published a nice new data set called the “Global Multi-resolution Terrain Elevation Data 2010” (GMTED2010). It is offered at three different resolutions (approximately 1,000, 500, and 250 meters).

Example (MEA = mean dataset) for the Trento – Garda Lake – Verona area (Northern Italy):

The 250m product looks quite smooth – a nice new DEM product…

Data download: https://eros.usgs.gov/#/Find_Data/Products_and_Data_Available/GMTED2010

GRASS GIS processing steps: see here
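
The linked processing steps are not repeated here; a minimal import sketch in GRASS GIS, assuming a LatLong WGS84 location and a hypothetical tile name:

# import the 250m mean (MEA) GeoTIFF tile
r.in.gdal input=gmted2010_mea_250m_tile.tif output=gmted2010_mea250
# assign an elevation color table
r.colors map=gmted2010_mea250 color=srtm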

The recent publication of the ASTER GDEM (30m) is a great step towards a worldwide high resolution elevation model. I have done some tests in the Southern Alps around Trento, which is among the most complex terrains in Europe. The goal was to calculate a difference map to the local high resolution DEM.

The steps

1. Mosaicking and reprojection to UTM32/WGS84 of ASTER GDEM tiles of the area (using gdalwarp).
2. Import of the ASTER GDEM map into GRASS (r.in.gdal)
3. Creation of a difference map to the provincial DEM (r.mapcalc, r.colors)
4. Histogram and univariate statistics (d.histogram, r.univar)
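
A rough command sketch of these four steps (tile and map names are hypothetical):

# 1. mosaic and reproject the ASTER GDEM tiles to UTM32/WGS84 (EPSG:32632)
gdalwarp -t_srs EPSG:32632 -r bilinear \
    ASTGTM_N46E011_dem.tif ASTGTM_N45E011_dem.tif aster_gdem_utm32.tif
# 2. import into GRASS
r.in.gdal input=aster_gdem_utm32.tif output=aster_gdem
# 3. difference map to the provincial DEM, with a diverging color table
g.region rast=aster_gdem -p
r.mapcalc "aster_diff = aster_gdem - provincial_dem"
r.colors map=aster_diff color=differences
# 4. univariate statistics and histogram
r.univar map=aster_diff
d.histogram map=aster_diff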

The results
(click for higher resolution or download slides as PDF):

Comments

The hydrological pattern appears to coincide with the provincial rivers map. It can be seen that lakes weren’t masked during the ASTER GDEM preparation; however, these could be identified for many areas of the world with OpenStreetMap layers.
Despite spikes (including unexpected craters), the overall quality appears to be acceptable for this first version of the ASTER GDEM – the standard deviation is 18m for the test area, with outliers predominantly found in areas of complex terrain rather than in the valley floors. A shift of -2.3m is also observed (a reprojection artifact?).
It will be worth checking whether SRTM data (90m) could help to identify and remove the spikes from the DEM. Hopefully more in the near future…

Acknowledgments: ASTER GDEM is a product of METI and NASA.