No Metadata? No Problem.

Where did the imagery come from? 


Even with limited information, a trained eye can glean clues from aerial imagery to learn more about its origin. If this is your first time working with remote sensing data, here are some tips for researching and interpreting aerial imagery to make sure you've got the latest and greatest.

Images can look very different but come from the same original source.

York 2018: Original image at left; colour enhanced at right.

Or they can look very similar when they aren't.

Peel 2018: Look closely. These are two different neighbourhoods.

Where To Start Looking

If you have an unknown image you need to research, start by looking for publicly available imagery of the same location with documented metadata, then compare the images' characteristics and contents to determine whether they're the same. Google Earth historical imagery is a good source for finding the capture date and provider of an image:

Google Earth historical imagery shows the date and data provider.
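Before turning to external references, it can also be worth checking whether the file itself carries any embedded tags; capture dates sometimes survive in GeoTIFF metadata. Here's a minimal sketch using the rasterio library (the file path is hypothetical):

```python
import rasterio

# Hypothetical path to the unknown image
with rasterio.open("unknown_image.tif") as src:
    print(src.crs)        # coordinate reference system
    print(src.res)        # ground resolution in CRS units
    print(src.transform)  # georeferencing (pixel size and origin)
    print(src.tags())     # embedded metadata, e.g. TIFFTAG_DATETIME if present
```

Many delivered products have had these tags stripped, so an empty result is common, and visual comparison against documented sources remains the more reliable route.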

Municipalities often have web GIS portals with dated imagery as well. First Base Solutions captures much of this imagery in Ontario, although that may not be explicitly stated on the municipal portals.

Left: VuMAP zoomed to City of Niagara Falls with 2010 imagery. Right: The same imagery through the Niagara Falls Viewer.
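Where a municipal portal exposes a standard WMS endpoint, the layer names and titles in its capabilities document often encode the capture year. A sketch using the OWSLib library (the endpoint URL is hypothetical; substitute the portal's actual service address):

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint for a municipal imagery service
wms = WebMapService("https://maps.example.ca/ows", version="1.3.0")

# Layer identifiers and titles frequently include the imagery year
for name, layer in wms.contents.items():
    print(name, "-", layer.title)
```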


Different Dates or Times?

If you like 'Spot The Difference' brain teasers, this exercise should be easy. Images taken even a few seconds apart from the same sensor will show differences where moving traffic is captured.

Where images taken seconds apart are stitched together, the cut lines can be detected where moving objects appear in both photos. Double cars, half cars, and ghost cars are common examples.
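If you have two candidate images that are already co-registered, a crude pixel difference will light up exactly these moving objects, while static features cancel out. A sketch with NumPy and Pillow, assuming both files cover the same extent at the same resolution (file names are hypothetical):

```python
import numpy as np
from PIL import Image

# Hypothetical co-registered images of the same extent and resolution
a = np.asarray(Image.open("scene_a.png").convert("L"), dtype=np.int16)
b = np.asarray(Image.open("scene_b.png").convert("L"), dtype=np.int16)

diff = np.abs(a - b).astype(np.uint8)

# Bright pixels mark change: moving cars, shadow shifts, seam lines
Image.fromarray(diff).save("difference.png")
```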

In places with tall buildings, the viewing angle from the plane's camera changes from photo to photo, creating the illusion that a building is leaning away from the camera. This phenomenon is called radial displacement. The higher the elevation of the camera (satellite-mounted instead of fixed-wing aircraft), the less displacement you see.


CN Tower 2016, 2017, 2018. The more of the side of a building that can be seen, the farther it is from the centre of the photo. Different viewing angles are a clear indication that the two images are from different times.
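The size of the lean is also predictable. For a vertical photo, the displacement d of a feature of height h, sitting at radial distance r from the photo centre, taken from flying height H above the ground, is roughly d = r × h / H. A back-of-the-envelope sketch in Python, with all figures invented for illustration:

```python
# Illustrative numbers only: a ~550 m tower imaged 60 mm from the
# photo centre, from an aircraft flying 3,000 m above the ground
h = 550.0    # object height (m)
H = 3000.0   # flying height above ground (m)
r = 0.060    # radial distance of the object's top from the photo centre (m)

d = r * h / H
print(f"displacement: {d * 1000:.1f} mm on the photo")  # ~11.0 mm
```

Doubling the flying height halves the lean, which is why satellite scenes captured from hundreds of kilometres up show far less displacement than fixed-wing captures.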


The best places to look for time and date differences in large scenes are the shadows, seasonal changes to vegetation or water levels, and movable objects like cars. Construction progress can also help date an image.

The busy urban environment provides lots of opportunities for manual change detection. 1. Shadow length and direction changes. 2. Car locations change. 3. Construction vehicle locations change. 4. Foliage and canopy cover changes. 5. Pool and landscaping changes.


Rural areas require a trained eye to spot differences. 1. Plough markings change direction. 2. Shadow length changes. 3. Stream channel becomes dry. 4. Soil moisture changes. 5. Car locations change. How many differences can you see?
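Shadows can even be turned into a rough clock. Given the location and a candidate date, the sun's azimuth and elevation fix the expected shadow direction and length; comparing those against the image narrows the capture time. A sketch using the pysolar library (coordinates and time are illustrative):

```python
import math
from datetime import datetime, timezone

from pysolar.solar import get_altitude, get_azimuth

# Illustrative location (downtown Toronto) and candidate capture time
lat, lon = 43.651, -79.383
when = datetime(2018, 5, 15, 14, 30, tzinfo=timezone.utc)

altitude = get_altitude(lat, lon, when)  # sun elevation above the horizon, degrees
azimuth = get_azimuth(lat, lon, when)    # sun direction, degrees clockwise from north

# An object of height h casts a shadow of length h / tan(altitude)
shadow_per_metre = 1.0 / math.tan(math.radians(altitude))
print(f"sun altitude {altitude:.1f}, azimuth {azimuth:.1f}; "
      f"shadow length is {shadow_per_metre:.2f} m per metre of height")
```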


Different Processing?

You're sure they're the same image, but they look so different. Why? Image processing can include mosaicking (which can change the image resolution), orthorectification, colour and contrast adjustments, reprojection, format conversion, and other enhancements.
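Some of these processing differences can be read straight from the files rather than eyeballed. A sketch using rasterio to compare two deliveries of what may be the same capture (file paths are hypothetical):

```python
import rasterio

# Hypothetical paths to two deliveries of possibly the same capture
for path in ("delivery_a.tif", "delivery_b.tif"):
    with rasterio.open(path) as src:
        print(path, src.crs, src.res, src.count, src.dtypes)
```

A reprojection shows up as a different CRS, a mosaic or resample as a different resolution, and a format conversion as a different band count or data type.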

One astute customer even pointed out small changes to the pixel registration when identical imagery is delivered through different WMS connections.

FBS 2018 Toronto imagery at 1:50 scale. Note the misalignment of the crosswalk by approximately 14 cm (8 cm imagery, error is +/- 16 cm). Left: City of Toronto WMTS; right: MapCast.


Many over-processed images contain 'artifacts' like speckled pixels, jagged edges on objects with straight lines, or blurry transitions between colours where there should be a hard edge. If you can look past those differences and focus only on the image content, the similarities will tell you more than the differences. In the images below, we can tell they're the same because the golfers are in the same positions in both.

FBS 2017 Toronto imagery at 1:150 scale. Left: heavily processed; right: less processed. Notice the differences in pixel artifacts in the bright white of the sand traps. The sharp lines between sand and grass, and the shadows along the walkway, become blurry, likely the result of a resampling algorithm that averages a pixel's colour with its neighbours.
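You can reproduce this softening effect directly: downsampling an image and scaling it back up with a bilinear filter averages each pixel with its neighbours, producing exactly the kind of blur visible along the sand-trap edges. A sketch with Pillow (the file name is hypothetical):

```python
from PIL import Image

img = Image.open("golf_course.png")  # hypothetical crop of the scene
w, h = img.size

# Downsample to half size, then scale back up with bilinear filtering;
# hard edges between sand and grass come back blurred
soft = img.resize((w // 2, h // 2), Image.BILINEAR).resize((w, h), Image.BILINEAR)
soft.save("golf_course_soft.png")
```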

Have you still got questions about the imagery you're working with?  Do you know you'll need something newer and better? We welcome your questions and comments.

info@firstbasesolutions.com


http://www.firstbasesolutions.com/


