EarthAnalytics Python installation

See https://www.earthdatascience.org/workshops/setup-earth-analytics-python/

You need to install git, conda (miniconda preferred) and bash.

Then you can create the earthanalytics environment using the YAML file and the method described here.

To list the available environments, type

conda env list

To activate an environment, type

conda activate <nom_environnement>

More details in the Conda cheat sheet here.

Then you need to choose a text editor. In my case, since I installed miniconda and the virtual environment on Linux under WSL, I use Spyder from Windows by defining the kernel, see install Spyder on WSL2 Linux, and in particular this command from the shell

python -m spyder_kernels.console --matplotlib="inline" --ip=127.0.0.1 -f=~/remotemachine.json &

Ellipsoidal to orthometric altitudes for geoid model

Geoid models sometimes come in hard-to-decode formats, such as the IGN .mnt files: https://geodesie.ign.fr/index.php?page=grilles

For now, I’ve only found third-party software that can do the format conversion, even though this should be doable with a bash or Python script.

Such files can be converted to ASCII XYZ format with the demo version of Hydromagic, a hydrographic surveying package. Here is how to do it:

https://www.eye4software.com/hydromagic/documentation/manual/utilities/geoid-file-conversion/
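As noted above, a script-based conversion is feasible once the grid layout is known. Below is a minimal Python sketch under an ASSUMED layout (a regular grid with known bounds and steps, values ordered north to south, then west to east) — the real .mnt header must be checked against the IGN format documentation:

```python
def grid_to_xyz(header, values):
    """Convert a regular geoid grid to (lon, lat, undulation) triplets.

    `header` = (lat_min, lat_max, lon_min, lon_max, dlat, dlon).
    This layout is an assumption for illustration, NOT the actual
    IGN .mnt specification.
    """
    lat_min, lat_max, lon_min, lon_max, dlat, dlon = header
    nlat = round((lat_max - lat_min) / dlat) + 1
    nlon = round((lon_max - lon_min) / dlon) + 1
    rows = []
    for i in range(nlat):
        lat = lat_max - i * dlat      # rows assumed ordered north to south
        for j in range(nlon):
            lon = lon_min + j * dlon  # columns assumed ordered west to east
            rows.append((lon, lat, values[i * nlon + j]))
    return rows

# toy 2x3 grid: lat 44..45, lon 3..5, 1 degree steps
xyz = grid_to_xyz((44.0, 45.0, 3.0, 5.0, 1.0, 1.0),
                  [50.1, 50.2, 50.3, 49.9, 50.0, 50.1])
print(xyz[0])  # (3.0, 45.0, 50.1)
```

The XYZ triplets can then be written out line by line in the format expected downstream.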

CRX to RNX

CRX is the Hatanaka compression for RINEX files, used by many permanent networks. Y. Hatanaka of GSI wrote software to compress/decompress them, see https://terras.gsi.go.jp/ja/crx2rnx.html.

To use the software (example: decompressing a CRX file to the uncompressed RNX format):

  • open a shell (e.g. PowerShell)
  • go to the folder containing the executable
  • type “CRX2RNX.exe input.crx”
  • you get an RNX file as output, which you can rename to .yyo (where yy is the two-digit year)
  • for help, type “CRX2RNX.exe -h”

Use Reach M2 with Phantom 4 Pro

Dataset:

  • photos with timestamps
  • UBX logs from the M2 and from the base station

Processing:

1- Post-process the base station data and get corrected coordinates

2- With Emlid Studio (or RTKLib) compute the event (camera trigger) positions from the UBX logs

As output one gets a …._events.pos file that gives corrected coordinates for each event, together with a timestamp.

3- Extract the timestamps from the EXIF data of the pictures

use exiftool with this command

exiftool -n -csv -filemodifydate -api TimeZone=UTC *.JPG > metadata.csv

The output file “metadata.csv” gives the name of each picture and its timestamp in UTC.

4- Merge …_events.pos and the picture timestamps

For now I’ve got this ugly Python script:

#! /usr/bin/env python

"""
Update Emlid Reach Survey points with PPK position output from RTKLIB
David Shean
dshean@gmail.com
Edited to fix Pandas datetime/Timestamp tz issues, and a few key changes likely based on Emlid updates
"""

import os
import argparse
import pandas as pd

def getparser():
    parser = argparse.ArgumentParser(description='Update Emlid Reach Survey points with \
            PPK positions from RTKLIB')
    parser.add_argument('survey_pts_csv_fn', type=str, help='Survey point csv filename')
    parser.add_argument('ppk_pos_fn', type=str, help='PPK pos filename')
    return parser


def main():
    parser = getparser()
    args = parser.parse_args()

    survey_pts_csv_fn = args.survey_pts_csv_fn
    ppk_pos_fn = args.ppk_pos_fn

    print('Loading: %s' % survey_pts_csv_fn)
    survey_pts = pd.read_csv(survey_pts_csv_fn, header=0)
    # exiftool writes FileModifyDate as "YYYY:MM:DD HH:MM:SS+00:00"
    survey_pts['date'] = pd.to_datetime(survey_pts['FileModifyDate'], format="%Y:%m:%d %H:%M:%S+00:00")
    survey_pts.sort_values('date', inplace=True)
    survey_pts.index = survey_pts['date']
    print(survey_pts)

    # column names of the RTKLIB / Emlid Studio .pos output
    header = 'Date UTC latitude(deg) longitude(deg) height(m) Q ns sdn(m) sde(m) sdu(m) sdne(m) sdeu(m) sdun(m) age(s) ratio'
    print('Loading: %s' % ppk_pos_fn)
    ppk_pos = pd.read_csv(ppk_pos_fn, comment='%', sep=r'\s+', names=header.split())
    # combine the separate date and time columns into a single datetime index
    ppk_pos['date'] = pd.to_datetime(ppk_pos['Date'] + ' ' + ppk_pos['UTC'])
    ppk_pos.sort_values('date', inplace=True)
    ppk_pos.index = ppk_pos['date']
    print(ppk_pos)

    # match each event position with the nearest photo timestamp,
    # within a 1 s tolerance
    merged_dataframe = pd.merge_asof(ppk_pos, survey_pts,
                                     right_index=True, left_index=True,
                                     direction='nearest',
                                     tolerance=pd.Timedelta("1s"))

    print(merged_dataframe)

    #Write out new file
    out_fn = os.path.splitext(survey_pts_csv_fn)[0]+'_merged.csv'
    print("Writing out: %s" % out_fn)
    merged_dataframe.to_csv(out_fn)    

if __name__ == "__main__":
    main()
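The nearest-timestamp join at the heart of the script can be illustrated on toy data (the timestamps and values below are made up):

```python
import pandas as pd

# event positions (as in the _events.pos file) indexed by timestamp
pos = pd.DataFrame(
    {"lat": [48.1, 48.2, 48.3]},
    index=pd.to_datetime(["2023-05-01 10:00:00.2",
                          "2023-05-01 10:00:05.1",
                          "2023-05-01 10:00:10.4"]),
)
# photo names (as in metadata.csv) indexed by EXIF timestamp
photos = pd.DataFrame(
    {"file": ["IMG_001.JPG", "IMG_002.JPG"]},
    index=pd.to_datetime(["2023-05-01 10:00:00.0",
                          "2023-05-01 10:00:05.0"]),
)
# each event keeps the photo whose timestamp is nearest, within 1 s;
# events with no photo inside the tolerance get NaN
merged = pd.merge_asof(pos, photos, left_index=True, right_index=True,
                       direction="nearest", tolerance=pd.Timedelta("1s"))
print(merged)
```

The third event is 5.4 s away from the closest photo, so it is left unmatched.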

5- In Metashape, import the coordinates of the cameras from the new file

Convert raster from ellipsoidal height to orthometric altitude

Differences between ellipsoidal height and orthometric altitude (height above the geoid) are explained here: https://spatialthoughts.com/2019/10/26/convert-between-orthometric-and-ellipsoidal-elevations-using-gdal/

The projection definition is a PROJ4 string, which can be found on https://spatialreference.org/

examples

  • using gdalwarp to convert from UTM33N-WGS84 ellipsoidal height to UTM33N-EGM96 geoid height

gdalwarp -s_srs "+proj=utm +zone=33 +datum=WGS84 +units=m +no_defs" -t_srs "+proj=utm +zone=33 +datum=WGS84 +units=m +no_defs +geoidgrids=egm96_15.gtx" input.tif output.tif

  • from Lambert93 planimetric and NGF-IGN69 (with RAF20) to WGS84 ellipsoidal heights

gdalwarp -s_srs "+proj=lcc +lat_0=46.5 +lon_0=3 +lat_1=49 +lat_2=44 +x_0=700000 +y_0=6600000 +ellps=GRS80 +units=m +no_defs +type=crs +geoidgrids=RAF20.gtx" -t_srs "+proj=longlat +datum=WGS84 +no_defs +type=crs" input.tif output.tif

  • from orthometric altitude to ellipsoidal height, use a command like this one (change the SRS and geoid grid if needed):

gdalwarp -s_srs "+proj=utm +zone=33 +datum=WGS84 +units=m +no_defs +geoidgrids=egm96_15.gtx" -t_srs "+proj=utm +zone=33 +datum=WGS84 +units=m +no_defs" input.tif output.tif

You can choose your input and output spatial references using the PROJ4 syntax (see e.g. https://epsg.io/32633); note the addition of the geoid file with +geoidgrids=egm96_15.gtx in the command.
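The relation gdalwarp applies cell by cell is simply h = H + N, where h is the ellipsoidal height, H the orthometric altitude, and N the geoid undulation read from the .gtx grid. In plain Python (the 50.5 m undulation below is just an illustrative value):

```python
def orthometric_to_ellipsoidal(H, N):
    """h = H + N: add the geoid undulation to the orthometric altitude."""
    return H + N

def ellipsoidal_to_orthometric(h, N):
    """H = h - N: subtract the geoid undulation from the ellipsoidal height."""
    return h - N

print(orthometric_to_ellipsoidal(120.0, 50.5))  # 170.5
```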

If you want to use the EGM2008 geoid, get the file from https://github.com/OSGeo/proj-datumgrid/blob/master/world/egm08_25.gtx and copy it to the PROJ directory. On Linux this is

/usr/share/proj

then you can simply change the geoid file in the command: +geoidgrids=egm08_25.gtx

Metashape script

Python API: https://www.agisoft.com/pdf/metashape_python_api_2_0_0.pdf

Scripting FAQ: https://agisoft.freshdesk.com/support/solutions/folders/31000114192

List of scripts: https://github.com/agisoft-llc/metashape-scripts

Example of using the Python console in Metashape, here to uncheck the camera orientations:

import Metashape

doc = Metashape.app.document
chunk = doc.chunk
for camera in chunk.cameras:
    camera.reference.rotation_enabled = False

Image matching and alignment for the active chunk:


import Metashape

chunk = Metashape.app.document.chunk
for frame in chunk.frames:
    frame.matchPhotos(downscale=1)
    chunk.alignCameras()

GPS time

The GPS time system records the absolute number of seconds elapsed since the start of the GPS epoch, at midnight on January 6th, 1980.

Sometimes GPS time is given with an offset (e.g. 1000000000); in that case the value 0 corresponds to 2011/09/14 at 01:46:25 UTC.

Note also that POSIX (Unix) time counts seconds since January 1st, 1970, 00:00:00 UTC.

Convert to date

Example of a GPS dataset:

835734.87 6250812.50 -3.04 20 0 57560982
835734.88 6250296.85 -8.16 30 0 46351553
835734.88 6250476.48 -6.97 30 0 57559145

The GPS epoch is in the sixth field, and the values are offset by 1 billion. To convert to a date one can use Python:

>>> import datetime
>>> datetime.date.fromtimestamp(1315964785)  # POSIX time of the offset GPS epoch: 315964800 + 1000000000 - 15 leap seconds
datetime.date(2011, 9, 14)
>>> datetime.date.fromtimestamp(1315964785+57560982)  # GPS time from the dataset added to that POSIX time
datetime.date(2013, 7, 11)
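Putting this together, a small helper can map the offset GPS seconds from the dataset above directly to a UTC datetime (a sketch; the function name is mine, and the 15 s GPS−UTC leap-second offset is only valid from 2009 to mid-2012 — adjust it for other periods):

```python
import datetime

GPS_EPOCH_POSIX = 315964800  # 1980-01-06 00:00:00 UTC as a POSIX timestamp

def offset_gps_to_utc(gps_seconds, offset=1_000_000_000, leap_seconds=15):
    """Convert an offset GPS second count to a UTC datetime.

    GPS time does not include leap seconds, so `leap_seconds` must be
    the GPS-UTC difference at the observation date (15 s from 2009
    to mid-2012).
    """
    posix = GPS_EPOCH_POSIX + offset + gps_seconds - leap_seconds
    return datetime.datetime.fromtimestamp(posix, tz=datetime.timezone.utc)

print(offset_gps_to_utc(0))         # the offset origin: 2011-09-14 01:46:25 UTC
print(offset_gps_to_utc(57560982))  # first record of the dataset
```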