Vanuatu GIS Feed
@vanuatugis.bsky.social
29 followers 3 following 25K posts
The Vanuatu GIS Feed will be used to aggregate site posts. It is monitored, but will probably not be very 'personal'. - @raetlomi.bsky.social Technology Feed @vanuatutech.bsky.social. 231124
Optimize calculation of time series statistics for a large number of points with different date ranges
I have a feature collection with point locations of historic wildfires. I would like to map over this feature collection and, for each feature, sample wind speed (the 'vs' band) from GRIDMET for the corresponding location and day (mean of that day and the following day). I have written code which does this, but the output task takes about 3 hours to complete in GEE. I suspect this is because there is a large number of points (>80,000) and, in my existing code, for every point the mean across its two days is calculated for the entire US before the value at that point is sampled! This is obviously not efficient, and since I plan to use this code to extract other weather variables that would require averaging over longer time periods (e.g., the prior 6 months), I fear the processing time will become unmanageable.

Some approaches I've tried:

* Using clip within the map function, but per GEE Coding Best Practices this actually slows things down.
* Clipping the GRIDMET image collection to just my study area (California) prior to mapping, but this increased processing time.
* Using .filterBounds(point) prior to the mean calculation, but when testing this outside the map function on a single feature, I see that it doesn't actually filter the GRIDMET collection at all. Either I'm implementing it incorrectly or it can't filter GRIDMET because each image covers the entire US.
* Increasing the scale: I tried increasing scale from 30 to 4000 (GRIDMET resolution is 2.5 arc min, roughly 4 km). I am testing this now, but for a single feature it did not improve performance.

Ideally, I would like to update my map function to filter the image collection to just the area around a particular point before it calculates the mean across multiple days. If this is not possible, is there any way I could optimize my code to avoid calculating means across such a large area? Here is my full code. I have made the feature collection publicly viewable for testing.
import ee  # Earth Engine Python API; assumes ee.Initialize() has been run

# wildfire points/dates
WF_pts = ee.FeatureCollection('users/allanbkapoor/wildfires2')

# GRIDMET
GRIDMET = ee.ImageCollection('IDAHO_EPSCOR/GRIDMET')

# windspeed band
windspeed = GRIDMET.select('vs')

# define function that gets the pixel value for a single feature based on lon/lat and date
def get_single_date_value(feat):
    # get the date for that feature from the date column
    date = ee.Date(feat.get('DISCOVERY_DT'))
    # exclusive end of the window: covers the fire day and the following day
    date2 = date.advance(2, 'day')
    projection_obj = ee.Projection('EPSG:4326')
    point = feat.get('LONGITUDE_LATITUDE')
    scale = 30

    # filter the image collection to just the images in the feature's two-day window
    windImage = windspeed.filterBounds(point).filterDate(date, date2).mean()
    point_value_fc = windImage.sample(point, scale, projection=projection_obj)
    point_value = point_value_fc.first().get("vs")
    return feat.set({'wind_speed': point_value})

# map the function over the feature collection
WF_pts_wind = WF_pts.map(get_single_date_value)
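One way to cut the work down is to batch points by discovery date, so the two-day mean image is built once per unique date and all points sharing that date are sampled in a single reduceRegions call, instead of one whole-US mean per point. The grouping step can be sketched in plain Python (the feature tuples below are hypothetical stand-ins, since the actual Earth Engine objects only evaluate server-side):

```python
from collections import defaultdict

# Hypothetical stand-ins for the wildfire features: (id, discovery date, lon, lat).
features = [
    ("f1", "2020-08-01", -120.5, 38.2),
    ("f2", "2020-08-01", -121.0, 39.1),
    ("f3", "2020-08-03", -118.7, 36.5),
]

# Group points by discovery date so the expensive per-date mean image
# is built once per unique date instead of once per point.
by_date = defaultdict(list)
for fid, date, lon, lat in features:
    by_date[date].append((fid, lon, lat))

calls_naive = len(features)   # one mean + sample per point (current approach)
calls_batched = len(by_date)  # one mean + reduceRegions per unique date
```

In Earth Engine terms, each `by_date` group would become one `windspeed.filterDate(d, d.advance(2, 'day')).mean()` followed by `reduceRegions` over that date's points at the native GRIDMET scale (~4000 m), so the number of whole-image reductions drops from the point count to the unique-date count.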
gis.stackexchange.com
Loading and reprojecting .nc file without projection
I am trying to extract some bands from a .nc file. I don't know the file's projection (if it has one), so I created a newLocation in GRASS GIS with no projection (just an XY Cartesian plane) and loaded the specific subset of the .nc file I'm interested in. The file is states.nc, available here: https://luh.umd.edu/data.shtml (specifically, LUH2 v2h Release 10/14/16). This is, for instance, the information for one subset of states.nc:

gdalinfo NETCDF:"/myPath/states.nc":primf
Driver: netCDF/Network Common Data Format
Files: /myPath/states.nc
Size is 1440, 720
Origin = (-180.000000000000000,90.000000000000000)
Pixel Size = (0.250000000000000,-0.250000000000000)
Metadata:
  lat#axis=Y
  lat#long_name=latitude
  lat#standard_name=latitude
  lat#units=degrees_north
  lon#axis=X
  lon#long_name=longitude
  lon#standard_name=longitude
  lon#units=degrees_east

I am working at the European level, so at this point I switch to my location with the EPSG:3035 projection and set my computational region (to a raster map of Europe, EU_3035) and its resolution (1000 meters). I want to create a mask from the European raster I have and load (while reprojecting to EPSG:3035) each raster band I obtained from states.nc, using Europe as a mask.

for band in 1121 1122 1123 1124 1125 1126 1127 1128 1129 1130 1131 1132 1133 1134 1135 1136 1137 1138 1139 1140 1141 1142 1143 1144 1145 1146 1147 1148 1149 1150 1151 1152 1153 1154 1155 1156 1157 1158 1159 1160 1161 1162 1163 1164 1165 1166; do
    echo "CREATE A MASK OF EU, CROP AND REPROJECT BAND $band TO LAEA3035"
    r.mask --o raster=EU_3035
    r.proj --o location=newLocation mapset=PERMANENT input='pastr.'$band output='pastr.'$band'_3035'
    r.mask -r
done

At this point, I get an error message:

WARNING: file not found for location
ERROR: Unable to get projection info of input map

I can see in my file browser that some files I usually have in my other locations (e.g., PROJ_INFO, PROJ_UNITS) are missing.
I cannot frame the exact problem and I am not sure whether my workflow so far is correct. I tried another approach with gdalwarp (which I have used in the past to solve another projection problem). I first reproject the rasters I loaded from newLocation to WGS84:

gdalwarp -t_srs '+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0' -tr 0.01 0.01 /myPath/myRaster.tif /myPath/myRaster_WGS84.tif -overwrite

I switch to my WGS84 location and load the files. Then, if I check them with r.info and r.stats -c, I realize they are empty.

> r.info map=myRaster_WGS84
+----------------------------------------------------------------------------+
| Map:      myRaster_WGS84            Date: Fri Aug 12 17:14:28 2022         |
| Mapset:   PERMANENT                 Login of Creator: lisa                 |
| Location: WGS84                                                            |
| DataBase: /myPath                                                          |
| Title:                                                                     |
| Timestamp: none                                                            |
|----------------------------------------------------------------------------|
|                                                                            |
|   Type of Map:  raster              Number of Categories: 0                |
|   Data Type:    FCELL                                                      |
|   Rows:         100                                                        |
|   Columns:      100                                                        |
|   Total Cells:  10000                                                      |
|        Projection: Latitude-Longitude                                      |
|            N:         1N    S:          0   Res: 0:00:36                   |
|            E:         1E    W:          0   Res: 0:00:36                   |
|   Range of data:    min = NULL  max = NULL                                 |
|                                                                            |
|   Data Description:                                                        |
|    generated by r.in.gdal                                                  |
|                                                                            |
|   Comments:                                                                |
|    r.in.gdal --overwrite input="/myPath/myRaster_WGS84.tif" \              |
|    output="myRaster_WGS84" memory=300 offset=0 num_digits=0                |
|                                                                            |
+----------------------------------------------------------------------------+

I guess this has to do with the XY location I created, but I don't know which projection is appropriate and I cannot find this information. Searching the internet, I couldn't find a way to properly load and reproject the bands of a .nc file. I am quite new to GRASS GIS and this is the first .nc file I have handled. I am using GRASS 7.8.6 on a server running Ubuntu 18.04.6, accessed from a Windows 10 machine.
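The gdalinfo output above (degree units, origin at (-180, 90), 0.25° pixels, 1440 × 720 cells) suggests the grid is a plain global latitude/longitude raster, so importing it into a lat/lon (EPSG:4326) location, rather than an unprojected XY one, would give r.proj the projection info it needs. A quick sanity check of that geotransform in plain Python, using only the numbers reported by gdalinfo:

```python
# Geotransform reported by gdalinfo for states.nc:
# origin (-180, 90), pixel size (0.25, -0.25), raster 1440 x 720
origin_x, origin_y = -180.0, 90.0
px, py = 0.25, -0.25
cols, rows = 1440, 720

def pixel_center(col, row):
    """Return the (lon, lat) of a pixel center under this geotransform."""
    return (origin_x + (col + 0.5) * px, origin_y + (row + 0.5) * py)

# The grid spans the whole globe, consistent with a geographic (lat/lon) CRS:
east_edge = origin_x + cols * px   # -180 + 1440 * 0.25 = 180
south_edge = origin_y + rows * py  #   90 -  720 * 0.25 = -90
```

Since the extent works out to exactly ±180° / ±90°, the data can be read as EPSG:4326 and reprojected to EPSG:3035 in one hop, with no intermediate XY location.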
gis.stackexchange.com
Convert .osm/.pbf to .map for mapsforge
I have downloaded the latest version (0.46) of osmosis from https://wiki.openstreetmap.org/wiki/Osmosis#Downloading, mapsforge-map-writer-0.9.1.jar from http://search.maven.org/#search%7Cga%7C1%7Cmapsforge-map, and a map ("AM.osm") from http://be.gis-lab.info/data/osm_dump/dump/latest/. I'm on Windows 7 64-bit Professional with JDK 1.8u45.

I unpacked osmosis to c:\Progs\osmosis and created a folder called plugins inside it (so: c:\Progs\osmosis\plugins). In the plugins folder I put the file mapsforge-map-writer-0.9.1.jar, and I put the files AM.osm.pbf (and AM.osm) into c:\Progs\osmosis. I then opened a command-prompt window and executed the following commands:

PS C:\Progs\osmosis> osmosis --rb file=AM.osm.pbf --mw file=AM.map
PS C:\Progs\osmosis> osmosis --rx file=AM.osm --mw file=AM.map type=hd

I got this error:

ьр 23, 2018 9:46:43 AM org.openstreetmap.osmosis.core.Osmosis run
INFO: Osmosis Version 0.46
ьр 23, 2018 9:46:43 AM org.openstreetmap.osmosis.core.Osmosis run
INFO: Preparing pipeline.
ьр 23, 2018 9:46:43 AM org.openstreetmap.osmosis.core.Osmosis main
SEVERE: Execution aborted.
java.lang.NoClassDefFoundError: gnu/trove/procedure/TShortIntProcedure
    at org.mapsforge.map.writer.model.MapWriterConfiguration.loadTagMappingFile(MapWriterConfiguration.java:351)
    at org.mapsforge.map.writer.osmosis.MapFileWriterFactory.createTaskManagerImpl(MapFileWriterFactory.java:58)
    at org.openstreetmap.osmosis.core.pipeline.common.TaskManagerFactory.createTaskManager(TaskManagerFactory.java:60)
    at org.openstreetmap.osmosis.core.pipeline.common.Pipeline.buildTasks(Pipeline.java:51)
    at org.openstreetmap.osmosis.core.pipeline.common.Pipeline.prepare(Pipeline.java:112)
    at org.openstreetmap.osmosis.core.Osmosis.run(Osmosis.java:86)
    at org.openstreetmap.osmosis.core.Osmosis.main(Osmosis.java:37)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:330)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:238)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
    at org.codehaus.classworlds.Launcher.main(Launcher.java:47)
Caused by: java.lang.ClassNotFoundException: gnu.trove.procedure.TShortIntProcedure
    at org.java.plugin.standard.StandardPluginClassLoader.loadClass(StandardPluginClassLoader.java:330)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 16 more
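The NoClassDefFoundError names a class from the GNU Trove collections library, which the plain mapsforge-map-writer jar does not bundle, so this looks like a missing dependency on the plugin classpath rather than an osmosis problem. A commonly suggested remedy (an assumption based on the error, not something confirmed in the post) is to use the writer build that includes its dependencies, so the plugins folder would look like:

```
c:\Progs\osmosis\
└── plugins\
    └── mapsforge-map-writer-0.9.1-jar-with-dependencies.jar   (in place of the plain jar)
```

The jar-with-dependencies artifact is published alongside the plain jar on Maven Central; with it in place, the same osmosis --rb ... --mw ... command can be rerun unchanged.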
gis.stackexchange.com
CRS for NOAA snow cover data
I am helping an undergrad student with dissertation work. The student wants to analyse this data: https://www.ncei.noaa.gov/products/climate-data-records/snow-cover-extent

With CDO I created annual means, then separated each into an individual file, then converted to GeoTIFF with GDAL. The values are organised using the "NOAA NMC limited area fine mesh grid". NASA Panoply can cope with the grid (i.e., plotting it on a map) but QGIS and ArcGIS Pro cannot (see attached screengrab). Ultimately the student wants to clip out a region, and I can't see managing that if the ROI is not clearly identifiable and the CRS is unknown. How can I create a bespoke CRS using the metadata (see below)?

coordinates=latitude longitude
flag_meanings=no_snow snow_covered
flag_values={0,1}
grid_mapping=coord_system
long_name=NOAA/NCDC Climate Data Record of snow cover extent
missing_value=-9
NETCDF_DIM_time=19631.5
NETCDF_VARNAME=snow_cover_extent
standard_name=surface_snow_binary_mask
STATISTICS_MAXIMUM=1
STATISTICS_MEAN=0.092329545454545
STATISTICS_MINIMUM=0
STATISTICS_STDDEV=0.28949058791384
STATISTICS_VALID_PERCENT=100
_FillValue=-9

More information:

coord_system#grid_mapping_name=latitude_longitude
coord_system#longitude_of_central_meridian=0
coord_system#semimajor_axis=6378137
coord_system#semiminor_axis=6356752.3
NC_GLOBAL#CDI=Climate Data Interface version 1.8.2 (http://mpimet.mpg.de/cdi)
NC_GLOBAL#cdm_data_type=Grid
NC_GLOBAL#CDO=Climate Data Operators version 1.8.2 (http://mpimet.mpg.de/cdo)
NC_GLOBAL#cdr_program=NOAA Climate Data Record Program for satellites
NC_GLOBAL#cdr_variable=snow_cover_extent
NC_GLOBAL#Conventions=CF-1.6
NC_GLOBAL#date_created=2022-01-04T22:24:20Z
NC_GLOBAL#frequency=year
NC_GLOBAL#geospatial_lat_max=90
NC_GLOBAL#geospatial_lat_min=0
NC_GLOBAL#geospatial_lat_units=degrees_north
NC_GLOBAL#geospatial_lon_max=180
NC_GLOBAL#geospatial_lon_min=-180
NC_GLOBAL#geospatial_lon_units=degrees_east
NC_GLOBAL#institution=Global Snow Lab, Center for Environmental Prediction, Rutgers University
NC_GLOBAL#keywords=EARTH SCIENCE > CRYOSPHERE > SNOW/ICE > SNOW COVER, EARTH SCIENCE > TERRESTRIAL HYDROSPHERE > SNOW/ICE > SNOW COVER, EARTH SCIENCE > CLIMATE INDICATORS > CRYOSPHERIC INDICATORS > SNOW COVER
NC_GLOBAL#keywords_vocabulary=NASA Global Change Master Directory (GCMD) Earth Science Keywords, Version 8.0
NC_GLOBAL#license=No restrictions on access or use
NC_GLOBAL#Metadata_Conventions=CF-1.6, Unidata Dataset Discovery v1.0, NOAA CDR v1.0, GDS v2.0
NC_GLOBAL#metadata_link=https://doi.org/10.7289/V5N014G9
NC_GLOBAL#naming_authority=gov.noaa.ncdc
NC_GLOBAL#platform=ESSA, NOAA POES, SMS, DMSP, GOES, TIROS, METEOSAT, GMS, TERRA, AQUA, METOP
NC_GLOBAL#product_version=v01r01
NC_GLOBAL#sensor=VIDEO CAMERA, VISSR, VAS, VHRR, AVHRR, VISSR-GMS, VISSR-METEOSAT, SEVIRI, MODIS, AMSU-B, AMSR-E, SSMIS, VIIRS
NC_GLOBAL#source=NOAA NH Weekly SCE, NIC NH IMS SCE
NC_GLOBAL#spatial_resolution=Minimum cell area 10676.8 km^2, maximum cell area 41804.6 km^2
NC_GLOBAL#standard_name_vocabulary=CF Standard Name Table (v22, 12 February 2013)
NC_GLOBAL#summary=The data record for the NH SCE CDR spans from October 4, 1966 to present. Prior to June 1999 the NH SCE CDR is based on satellite-derived maps of NH SCE produced weekly by trained NOAA meteorologists. Early NH SCE maps were based on a visual interpretation of photographic copies of shortwave imagery. Analysts incorporated various sources of imagery into the SCE mapping process as they became available (e.g. AVHRR, VAS). In June 1999 NOAA NH SCE maps were replaced by SCE output from the Interactive Multisensor Snow and Ice Mapping System (IMS) at the National Ice Center (NIC). SCE output from the NIC IMS is processed at Rutgers University and appended to the NH SCE CDR to form a cohesive, long-term climate record of SCE.
NC_GLOBAL#time_coverage_end=2022-01-03
NC_GLOBAL#time_coverage_start=1966-10-04
NC_GLOBAL#title=Climate Data Record (CDR) of Northern Hemisphere (NH) Snow Cover Extent (SCE) (CDR Name: Snow_Cover_Extent_NH_IMS_Robinson)
NETCDF_DIM_EXTRA={time}
NETCDF_DIM_time_DEF={1,6}
NETCDF_DIM_time_VALUES=19631.5
snow_cover_extent#coordinates=latitude longitude
snow_cover_extent#flag_meanings=no_snow snow_covered
snow_cover_extent#flag_values={0,1}
snow_cover_extent#grid_mapping=coord_system
snow_cover_extent#long_name=NOAA/NCDC Climate Data Record of snow cover extent
snow_cover_extent#missing_value=-9
snow_cover_extent#standard_name=surface_snow_binary_mask
snow_cover_extent#_FillValue=-9
time#axis=T
time#bounds=time_bnds
time#calendar=standard
time#long_name=time
time#standard_name=time
time#units=days since 1966-10-03

Dimensions: X: 88, Y: 88, Bands: 1
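For building a bespoke CRS: the NMC "limited area fine mesh" grids are Northern Hemisphere polar stereographic grids, not latitude/longitude, despite the latitude_longitude grid_mapping above (which describes the 2-D latitude/longitude coordinate arrays stored alongside the data, not the grid itself). A custom definition would therefore start from a polar stereographic template such as the one below; note that the lat_ts and lon_0 values are illustrative placeholders to be checked against the grid documentation, and only the two ellipsoid axes are taken from the metadata:

```
+proj=stere +lat_0=90 +lat_ts=60 +lon_0=-80 +x_0=0 +y_0=0 +a=6378137 +b=6356752.3 +units=m +no_defs
```

Alternatively, because the file carries per-cell latitude and longitude arrays (coordinates=latitude longitude), gdalwarp can reproject it via its geolocation-array support (the -geoloc option) without defining a bespoke CRS at all, which may be the simpler route to clipping out a region.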
gis.stackexchange.com