Producing aerial imagery with your iPhone's LiDAR sensor

2023-03-13 16:03:41

This technical guide details how to create your own aerial imagery (aka satellite view/bird's-eye view/orthorectified imagery) and 3D models of streets with the built-in iPhone LiDAR sensor (iPhone 12 Pro or later, 2020+ iPad Pro) and open source tools from the OpenDroneMap project. All you need to do to capture the model is walk around with your iPhone at ground level.

The pedestrianised Margaret Street, Sydney, with a temporary treatment. Imagery captured with a handheld iPhone 14 Pro. Imagery at https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/
Photo by City of Sydney. The George Street Pride flag project is part of the NSW Government's Streets as Shared Spaces program. https://www.cityofsydney.nsw.gov.au/improving-streets-public-spaces/closure-george-street-north

Normally for such a job you'd use a drone and process with WebODM (or Pix4D), but there are areas that are unsafe or illegal to fly in. I've previously detailed how to generate imagery using a bicycle-helmet-mounted GoPro camera, however this can include artifacts where there are many people. The helmet camera method also requires a good GPS lock (unsuitable indoors, in dense urban areas or beneath a bridge) and has relatively low detail.

Again, why might you want to do this? With your own high-detail, up-to-date models and street imagery you can:

  • Map new street interventions, like bollards, modal filters or raised crossings
  • Record pothole locations (and their depth!)
  • Take measurements such as road and cycleway widths around crowds of people in urban centres
  • Measure footpath obstructions in 3D and rate pedestrian amenity
  • Survey features beneath large highways
  • Survey street parking using the new OSM spec: wiki.openstreetmap.org/wiki/Street_parking
  • Map indoor pedestrian areas in OpenStreetMap for better pedestrian routing
  • Attach your iPhone to your bike and generate LiDAR point clouds of the kerb and cycleway infrastructure (it works, just go slow!)

This method results in very high detail (5 mm resolution if desired) 3D models and accurate orthoimagery. Manual georeferencing is required (which I also explain how to do), which limits confidence in the alignment. This is a proof of concept – if you have corrections/suggestions/ideas to improve the method, please comment below or on Mastodon!

Note: This method also provides a solution for creating 2.5D oblique orthophotos from drone imagery.

This guide covers how to:

  • Capture a 3D model using 3d Scanner App (recommended) or Polycam
    • The iPhone LiDAR sensor has a 5 metre max range, so you'll need to walk around
  • Export the model to an .obj file with textures
  • Rotate the model in Blender to the required orientation
  • Use the odm_orthophoto program inside the OpenDroneMap Docker container to generate a raster .tiff
  • Georeference the tiff using QGIS
  • Upload the GeoTIFF to OpenAerialMap to generate a tileset, viewable in the OpenStreetMap iD editor or in a Felt map with a custom layer

Capturing a 3D model on a supported iPhone is easy. I recommend using the app titled 3d Scanner App as it allows considerable customisation of the scan settings. It also allows finishing a scan and extending it later, though this can be buggy. I haven't had a crash during capture – I have had Polycam crash halfway through a large scan, losing all data.

Download 3d Scanner App and use the LiDAR Advanced mode. I recommend the following options for scanning streets:

  • Confidence to Low. This extends the range of the LiDAR sensor readings used, at the expense of more noise. You can clean up this noise in the processing settings or in Blender.
  • Range to 5.0 metres
  • Masking to None
  • Resolution to 50mm (the lowest – for large models like streets)

In the app settings, make sure to set:

  • GPS tag scans to ON
  • Units to metric

When scanning a street, walk (or cycle) slowly with a sweeping motion to increase the captured width. If the area is wide enough to require a grid pattern, follow the same shape as a drone survey (an S-shape with considerable overlap). Not enough overlap or higher speeds mean the linear passes don't connect correctly, due to (I assume) inertial measurement unit drift. I'm unsure if the GPS information is used in the sensor fusion (via ARKit) – please comment if you know!

View of the completed model

In the 3d Scanner App use the Share button, then select the .obj file type. Send this to your computer (AirDrop works great if using macOS). If using Polycam, set "Z axis up" in the mesh export settings.

Rotating the model into the correct orientation (required for 3d Scanner App)

Unfortunately the 3d Scanner App exports objects with the Z axis as "up", whereas the odm_orthophoto program expects the Y axis to be "up". Confusingly, you can skip this step if using Polycam and exporting with "Z axis up" in the mesh export settings, even though Blender shows the Y axis as up in that export. If you know why this is, please leave a comment!

To rotate the model, import it into Blender and rotate it 90 degrees.

  • First, install Blender via your preferred method (https://www.blender.org/download/).
  • Open Blender and delete the initial default cube (right click -> Delete, or the x hotkey)
  • Import the .obj file: File -> Import -> Wavefront (.obj)
  • (optional: you can view the lovely texture by selecting "Viewport Shading" in the top right (the horizontal list of sphere icons))
Model appearing in the correct orientation in Blender, before rotating for export
  • To rotate:
    • Click the object and make sure it's selected (orange border)
    • Press hotkey r (from any view)
    • Press x to only allow rotation on the X axis
    • Type 90 (or the desired degrees of rotation)
  • Optional: You can check the rotation is correct by pressing numpad key 1. If you don't have a numpad you'll need to enable numpad emulation (see instructions at https://www.hack-computer.com/post/how-to-emulate-a-third-mouse-button-and-keypad-for-blender).
    • The rotation is correct if you have a "bird's eye view" in the numpad key 1 view, where the blue Z axis is towards the top of the screen and the red X axis is towards the right of the screen
Correct orientation for export to odm_orthophoto. Note the axis display at the top right.
  • File -> Export as an .obj to the same folder with a new name (eg. blender_export.obj)
    • Note: Blender doesn't create a new texture .jpg. If you export to a different folder, the path to the .jpg in the .mtl file will need updating.
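If you'd rather script this step than use the Blender GUI, the same 90° rotation about the X axis can be applied directly to the .obj's vertex lines. This is my own sketch, not part of the original workflow: it rewrites `v` (vertex) and `vn` (normal) records and passes everything else through unchanged. Depending on which way your scan faces you may need the opposite rotation.

```python
# rotate_obj.py -- rotate a Wavefront .obj 90 degrees about the X axis,
# converting a Z-up model to the Y-up orientation odm_orthophoto expects.
# Sketch only: it rewrites "v" and "vn" records and passes everything else
# (faces, texture coordinates, material references) through untouched.
# If your model ends up flipped, rotate the other way: (x, z, -y).

def rotate_line(line: str) -> str:
    parts = line.split()
    if parts and parts[0] in ("v", "vn"):
        x, y, z = map(float, parts[1:4])
        # 90 degrees about X: (x, y, z) -> (x, -z, y)
        parts[1:4] = [f"{x}", f"{-z}", f"{y}"]
        return " ".join(parts)
    return line.rstrip("\n")

def rotate_obj(src: str, dst: str) -> None:
    with open(src) as f_in, open(dst, "w") as f_out:
        for line in f_in:
            f_out.write(rotate_line(line) + "\n")

# usage: rotate_obj("scan_export.obj", "blender_export.obj")
```

Note that, like the Blender route, this leaves the texture .jpg and .mtl untouched, so keep the output in the same folder as the originals.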

Use the odm_orthophoto command line tool to generate a raster orthophoto from a .obj file. This tool is available at https://github.com/OpenDroneMap/odm_orthophoto but has a considerable number of dependencies.

I believe the easiest method currently is to install WebODM locally, copy the .obj and texture files (.mtl and .jpg) into the Docker container, and then run the program from inside the container.

Installing WebODM locally

Running the software using Docker is a breeze. Install Docker from https://www.docker.com/ (or via your preferred method) and then:

git clone https://github.com/OpenDroneMap/WebODM --config core.autocrlf=input --depth 1
cd WebODM
./webodm.sh start

See https://github.com/OpenDroneMap/WebODM#getting-started for more details. WebODM itself is excellent and great fun if you have a drone!

Copying the object into the ODM Docker container

You can start a shell in the container with the following command:

docker exec -it webodm_node-odm_1 /bin/bash

Make a new directory to keep your files in:

mkdir /iphone_model
cd /iphone_model

In another shell, copy the object and texture files from your local machine into the new Docker container folder. Note that docker cp can only copy one path at a time.

cd path/to/your/model/
docker cp blender_export.obj webodm_node-odm_1:/iphone_model/
docker cp blender_export.mtl webodm_node-odm_1:/iphone_model/
# Note: the Blender .obj export does not create a new texture .jpg
#   If your Blender export wasn't in the same directory, check and
#   update the path in blender_export.mtl
docker cp textured_output.jpg webodm_node-odm_1:/iphone_model/

Running odm_orthophoto

In the shell you started in the Docker container above, run the following command:

cd /iphone_model/
/code/SuperBuild/set up/bin/odm_orthophoto -inputFiles blender_export.obj -logFile log.txt -outputFile orthophoto.tif -resolution 100.0 -outputCornerFile corners.txt

The resolution argument is the number of pixels per metre – this may require changing.
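To pick a resolution value, it helps to relate it to the output image size and the detail per pixel. The arithmetic below is my own sanity check, not from the ODM documentation:

```python
# Relating the -resolution argument (pixels per metre) to orthophoto size
# and ground sample distance. My own arithmetic, not from the ODM docs.

def output_width_px(extent_m: float, px_per_m: float) -> float:
    """Approximate orthophoto width in pixels for a model of a given extent."""
    return extent_m * px_per_m

def gsd_mm(px_per_m: float) -> float:
    """Ground sample distance: millimetres of street per pixel."""
    return 1000.0 / px_per_m

# A 40 m street scan at -resolution 100 is roughly 4000 px wide at
# 10 mm per pixel; for 5 mm detail you'd want -resolution 200.
```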


Exporting the orthophoto out of the Docker container

To copy the generated orthophoto out, run the following from a shell on your local machine:

docker cp webodm_node-odm_1:/iphone_model/orthophoto.tif .

Use a similar command to extract the log file if required.

Georeferencing is the process of specifying the location and orientation of the image so it perfectly aligns with maps or GIS software. While a rough location (with a fairly incorrect rotation) is saved in the model, it appears to be removed by the Blender rotation step. If you know how to fix this, please comment below!

To do this:

  • Install QGIS by your preferred method: https://www.qgis.org/en/site/forusers/download.html
  • Install the plugins (via the Plugins -> Manage & Install Plugins… menu)
    • QuickMapServices (to easily pull in Bing satellite imagery)
    • Freehand raster georeferencer (a beginner-friendly georeferencing tool)
  • Add a Bing satellite base layer: Web -> QuickMapServices -> Bing -> Bing Satellite
  • Zoom & pan to the rough location of the 3D scan (the initial .tif location will be wherever you're viewing)
  • Drag the .tif output by the previous step into the sidebar (it won't be visible yet as it's not aligned)
  • Go to Raster -> Freehand raster georeferencer -> Add raster for freehand georeferencing and select the same .tif
  • Use the Move, Rotate and Scale buttons in the toolbar to align your orthophoto with the imagery background (tip: hold Cmd or Ctrl before scaling to keep the aspect ratio)
Buttons to move/scale/rotate
Aligned to the nearby buildings
  • Click the "Export raster with world file" button (green, on the right, with exclamation marks).
  • Check the "Only export world file for selected raster" checkbox. Make sure to do this before choosing the image path.
  • Select the current .tif image and press OK
  • Remove the orthophoto from the QGIS sidebar (right click -> Remove Layer)
  • Drag the current .tif image back into the sidebar. QGIS will now find the worldfiles next to it (orthophoto.tif.aux.xml and orthophoto.tfw) so it will be placed in the right position
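For the curious, the .tfw worldfile QGIS writes is just six plain-text numbers defining the pixel-to-map affine transform. A minimal sketch of writing one for a north-up (unrotated) image follows; the helper, its argument names, and the example coordinates are my own:

```python
# What a .tfw worldfile holds: six plain-text numbers describing the
# affine transform from pixel to map coordinates. This mirrors what QGIS
# produces for an unrotated, north-up raster; the helper names are my own.

def worldfile_lines(pixel_size_m: float, top_left_x: float, top_left_y: float) -> list:
    return [
        f"{pixel_size_m}",   # A: pixel width in map units
        "0.0",               # D: row rotation (zero for north-up)
        "0.0",               # B: column rotation (zero for north-up)
        f"{-pixel_size_m}",  # E: pixel height, negative (rows run south)
        f"{top_left_x}",     # C: x of the centre of the top-left pixel
        f"{top_left_y}",     # F: y of the centre of the top-left pixel
    ]

def write_worldfile(path: str, pixel_size_m: float, x: float, y: float) -> None:
    with open(path, "w") as f:
        f.write("\n".join(worldfile_lines(pixel_size_m, x, y)) + "\n")

# usage (hypothetical coordinates): write_worldfile("orthophoto.tfw", 0.01, 334200.0, 6251500.0)
```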

Exporting a georeferenced GeoTIFF (without worldfile)

If you want to upload the GeoTIFF to OpenAerialMap or anywhere else, you'll need to "bake" the location into the GeoTIFF itself rather than keep it in the worldfile – OpenAerialMap can't read the worldfile.

To do this:

  • Right click your orthophoto layer (after the above steps) and click Export -> Save As…
  • Set CRS to your required coordinate system (if unsure, I assume you can use EPSG 3857 if you want it aligned with OpenStreetMap tiles, but this is the limit of my current understanding – I haven't studied surveying yet!).
  • To avoid confusion, create a new subfolder and save it there with the default settings (eg. make a folder qgis_export and save as orthophoto.tif).
  • You now have a nicely georeferenced GeoTIFF!

If you'd like the imagery to be publicly viewable and accessible from the OpenStreetMap iD editor, OpenAerialMap is a free place to host your imagery.

This is the imagery from the above example: https://map.openaerialmap.org/#/-18.6328125,18.562947442888312,3/latest/

I've heard of plans for a relaunch of the website, but currently the upload form can be finicky.

  • Open the explore page: https://map.openaerialmap.org/
  • Sign in (only Google & FB OAuth supported)
  • Press upload
    • Currently, uploading from a local file doesn't appear to work; see https://github.com/hotosm/OpenAerialMap/issues/158 for updates
    • Uploading via Google Drive with my account (2FA enabled, GSuite) fails with "This app is blocked: This app tried to access sensitive info in your Google Account. To keep your account safe, Google blocked this access."
    • Using a URL is likely the only way. Creating an S3 bucket is one method. If you have a fast connection it may be quicker to run a local webserver with Python and use ngrok to make it publicly accessible. I recommend not keeping this server running for longer than necessary. Eg:
cd qgis_export
python3 -m http.server 8080
ngrok http 8080
# Your file is now accessible at https://SOME_PATH.ngrok.io/orthophoto.tif

Specify this URL in the form, add the other details, then press upload.

Limitations and future ideas

  • Manual alignment limits the real-world accuracy of the imagery
  • Drift occurs during long model captures. My understanding is that drift occurs more when there are sudden or fast movements. The 3d Scanner App unfortunately doesn't warn you when you're moving too fast, but Polycam does. As far as I know, iOS ARKit doesn't attempt to reconcile drift when completing a loop/circuit.
  • Automation! This process is slow but it works.
    • Adding a Makefile or other build tooling to https://github.com/OpenDroneMap/odm_orthophoto would remove the need to install WebODM and transfer files to/from the Docker container
    • Rotating the model could be added (behind a flag, to stay backwards compatible) to the odm_orthophoto program
  • Generating point clouds (supported by 3d Scanner App) and then exporting a raster from CloudCompare. This would make larger captures possible.
    • If there's a way of addressing drift across multiple point cloud captures – let me know how!
  • Georeferencing using ground control points rather than a freehand referencer
  • Creating street facade montages and evaluating doors & soft edges (Jan Gehl (1986) "Soft edges" in residential streets, Scandinavian Housing and Planning Research, 3:2, 89-102, DOI: 10.1080/02815738608730092)

Let me know if you have any corrections/suggestions/feedback!


