Raising the bar on geospatial AI


Oct 2024
Dr Michael Bewley, VP AI & Computer Vision

See how you can identify 130+ different mapped features at speed and make accurate decisions with Nearmap Gen 6 AI.

“I thought we could do this with maybe ten people in two years. It’s taken six years and 20–30 people.”

Dr Michael Bewley, Vice President, AI & Computer Vision, Nearmap.
With the launch of Nearmap Gen 6 AI in October 2024, Dr Michael Bewley, VP of AI and Computer Vision at Nearmap – Mike, as he’s known among Nearmappers – took to the Discovery Stage at the SXSW Sydney Tech & Innovation Expo.
Mike reflected on the path from inception in 2017 to the present, with more than 130 different features identified by AI at a click, available on every Nearmap multi-angle survey flown.
When Mike Bewley joined the Nearmap AI team in 2017, Machine Learning (ML) and Artificial Intelligence (AI) were in the ascendant. Nearmap saw an early opportunity to combine AI with aerial imagery to automate some of the answers our customers were seeking.
Some of those initial requests included: “Can you help us find which houses have swimming pools?” And: “How can I see roof outlines automatically instead of having to do it by hand?”
Mike realised the potential that lay within the ever-expanding Nearmap data stack, home to tens of petabytes of imagery (one petabyte = one million gigabytes), along with the deep intellectual property driving the technology behind it. The camera system used in all Nearmap captures is Australian-designed and built, unique to Nearmap.

Nearmap Gen 6 AI showing a building footprint

As a result, the entire Nearmap AI–ML process exists in-house from end to end, rather than outsourcing to external providers who would lack a deep understanding of the technology, the data, and its capabilities.
That is how Nearmap became home to a single, unified machine learning system with a giant deep learning model at its heart, beginning with just a few feature layers, such as swimming pools and solar panels.
Today, Nearmap Gen 6 AI identifies more than 130 different objects with a click, at scale, supported by a suite of built-in capabilities: scores, APIs, visualisation tools and integrations.
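To illustrate how confidence scores can support downstream decisions, here is a minimal, purely hypothetical sketch of post-processing AI feature detections. The data shape and field names (`class`, `confidence`) are illustrative assumptions, not the actual Nearmap API response format.

```python
# Hypothetical sketch: filtering AI feature detections by confidence score.
# Field names and data shape are assumptions for illustration only.

def count_features(detections, min_confidence=0.8):
    """Count detected features per class, keeping only confident detections."""
    counts = {}
    for d in detections:
        if d["confidence"] >= min_confidence:
            counts[d["class"]] = counts.get(d["class"], 0) + 1
    return counts

detections = [
    {"class": "swimming_pool", "confidence": 0.95},
    {"class": "solar_panel", "confidence": 0.91},
    {"class": "swimming_pool", "confidence": 0.62},  # dropped: below threshold
]
print(count_features(detections))  # {'swimming_pool': 1, 'solar_panel': 1}
```

In practice, a confidence threshold like this lets an analyst trade recall for precision depending on the decision at hand.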
“I remember early on someone suggested to me that there were perhaps a couple of dozen things that were worth recognising in the world… how wrong that was – the work is never done,” said Mike Bewley, reflecting on creating those first few Nearmap AI layers back in 2017.
The ability to see features quickly and at scale enables broader analysis across wider areas of interest – far more efficiently than sending crews out on the ground, or manually scanning aerial imagery to detect damage or change by eye and mark it up by hand.
Armed with AI imagery insights, you are better equipped to understand and analyse locations, and to see, in high resolution, how they’re changing.
Geospatial AI – identifying property and natural characteristics in aerial imagery – is also playing a core role in disaster response. The Nearmap ImpactResponse program (currently available in Australia and North America) takes to the skies as soon as safely possible after catastrophic disasters to capture high-resolution imagery. 
The imagery alone tells a shocking story about the damage caused by disasters. But with a level of accuracy and speed unobtainable by the human eye, Nearmap AI identifies the damage status of buildings and infrastructure in the area, along with information on elements such as the amount of debris and wreckage strewn across streets.
These insights help clarify the extent of damage at scale, within seconds, without the need for manual human analysis. This data provides insurers with more reliable insights that enable fast, accurate decisions for policyholders, when they’re needed most.
In the days leading up to Mike’s presentation at SXSW Sydney in October 2024, the Nearmap team had been working around the clock to monitor and collect data on hurricanes that had struck the United States: Hurricane Francine hit Louisiana 11th–14th September; Hurricane Helene hit Florida and nearby states on 26th–29th September, and Hurricane Milton struck Florida weeks later on 10th October. 
Across those three events – Francine, Helene and Milton – Nearmap flew more than 300 flights, capturing and recapturing the damaged areas, amounting to more than 70,000 square kilometres.
Hurricane Milton, FL US

The image above shows the impact of Hurricane Milton on neighbouring communities, with smaller mobile-style homes on the left, and shingle-roofed homes built from more permanent materials on the right.
Although both areas were subjected to the same weather conditions, the difference in impact becomes clear with computer vision and AI data insights.

“I’ve spoken with disaster relief organisations in Australia and the US, and veterans of the industry have been staggered by the speed we’re able to achieve. Until recently, you were fortunate to get imagery processed within days, and then spend weeks with volunteers combing through the pictures looking manually for damaged houses.”

Dr Michael Bewley, VP AI & Computer Vision, Nearmap

Bring sharper insights into your workflow.

Disasters don’t take breaks – which is why the Nearmap AI team is ready to act quickly, giving insurers and responders the accurate data that helps communities recover and rebuild with certainty.
Get a Demo