Vanuatu GIS Feed
@vanuatugis.bsky.social
The Vanuatu GIS Feed will be used to aggregate site posts. It is monitored, but will probably not be very 'personal'. - @raetlomi.bsky.social
See also the Technology Feed: @vanuatutech.bsky.social
Batch Task Execution of a Feature Collection in Google Earth Engine
I have generated a Feature Collection for a geometry with 6 polygons, formatted the table, and exported the first three feature collections, each with over 1,000 properties. However, the last batch times out with a computation error.

Link to the code: https://code.earthengine.google.com/6c29596851cc826a98b73fdfc0dde165
Asset link: https://code.earthengine.google.com/?asset=projects/ypm-rs-ml/assets/farms_6

// Part of the code used to generate the last feature collection, to be exported as a table later

// Apply smoothing
var oeel = require('users/OEEL/lib:loadAll');
var order = 3;
var sgFilteredCol = oeel.ImageCollection.SavatskyGolayFilter(
    regularCol, maxDiffFilter, distanceFunction, order);

// This fails as the computation times out
print(sgFilteredCol.size(), sgFilteredCol.first());

// Table 4:
var viBandsSmoothed = sgFilteredCol.select(bandsListForSmoothing);
//print('viBandsSmoothed', viBandsSmoothed)

var vegIndicesSmoothed = viBandsSmoothed.map(function(image) {
  var withStats = image.reduceRegions({
    collection: geometry,
    reducer: ee.Reducer.mean(),
    scale: 10
  }).map(function(feature) {
    return feature.set('imageId',
        ee.Date(image.get('system:time_start')).format('YYYY-MM-dd'));
  });
  return withStats;
}).flatten();

var format = function(table, rowId, colId) {
  var rows = table.distinct(rowId);
  var joined = ee.Join.saveAll('matches').apply({
    primary: rows,
    secondary: table,
    condition: ee.Filter.equals({leftField: rowId, rightField: rowId})
  });
  return joined.map(function(row) {
    var values = ee.List(row.get('matches')).map(function(feature) {
      feature = ee.Feature(feature);
      // ['ndvi','gndvi','evi2','savi','ndre','ndwi','rvi','dvi','mcari']
      var ndvi = ee.List([feature.get('d_0_ndvi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var gndvi = ee.List([feature.get('d_0_gndvi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var evi2 = ee.List([feature.get('d_0_evi2'), -9999]).reduce(ee.Reducer.firstNonNull());
      var savi = ee.List([feature.get('d_0_savi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var ndre = ee.List([feature.get('d_0_ndre'), -9999]).reduce(ee.Reducer.firstNonNull());
      var ndwi = ee.List([feature.get('d_0_ndwi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var rvi = ee.List([feature.get('d_0_rvi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var dvi = ee.List([feature.get('d_0_dvi'), -9999]).reduce(ee.Reducer.firstNonNull());
      var mcari = ee.List([feature.get('d_0_mcari'), -9999]).reduce(ee.Reducer.firstNonNull());
      return [
        [ee.String(feature.get(colId)).cat('_ndvi'), ee.Number(ndvi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_gndvi'), ee.Number(gndvi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_evi2'), ee.Number(evi2).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_savi'), ee.Number(savi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_ndre'), ee.Number(ndre).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_ndwi'), ee.Number(ndwi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_rvi'), ee.Number(rvi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_dvi'), ee.Number(dvi).format('%.3f')],
        [ee.String(feature.get(colId)).cat('_mcari'), ee.Number(mcari).format('%.3f')]
      ];
    });
    return row.select([rowId]).set(ee.Dictionary(values.flatten()));
  });
};

var timeSeriesSmoothed = format(vegIndicesSmoothed, 'Farm_Id', 'imageId');

Export.table.toDrive({
  collection: timeSeriesSmoothed,
  description: 'smoothed',
  folder: 'earthengine',
  fileNamePrefix: 'smoothed',
  fileFormat: 'CSV'
});

I tried a couple of batch export options, and also exporting the table as an Asset in case that helped. It did not. Is there any way to perform the operation tile-wise and then export in batches, or to export the final feature collection itself in batches?
gis.stackexchange.com
December 7, 2025 at 8:10 AM
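One hedged workaround for the export timeout above is to split the table into several smaller export tasks, one per subset of Farm_Id values, so no single task has to materialise the whole smoothed collection. The chunking itself is plain Python; the Earth Engine calls in the trailing comment (ee.Filter.inList, ee.batch.Export.table.toDrive) are standard API, but the batch size and the stand-in ID values are assumptions.

```python
def chunk(ids, size):
    """Split a list of Farm_Id values into batches of at most `size`."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

farm_ids = [1, 2, 3, 4, 5, 6]   # hypothetical Farm_Id values for the 6 polygons
batches = chunk(farm_ids, 2)
print(batches)  # [[1, 2], [3, 4], [5, 6]]

# In the Earth Engine Python API, each batch would then become its own task,
# e.g. (sketch, not run here):
#   subset = timeSeriesSmoothed.filter(ee.Filter.inList('Farm_Id', batch))
#   ee.batch.Export.table.toDrive(collection=subset, description=...).start()
```

Each task then only joins and formats a fraction of the features, which is often enough to stay under the computation limit.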
Using PyKrige for ordinary kriging on an XYZ file
I've previously used ArcGIS for Ordinary Kriging, but I now need a Python alternative. I found the PyKrige module, which I decided to give a try, but I am a bit confused by it. So far, I have read my XYZ file into a NumPy array like so:

import matplotlib.pyplot as plt
import numpy as np
import pykrige.kriging_tools as kt
from pykrige.ok import OrdinaryKriging
from numpy import genfromtxt

np.set_printoptions(suppress=True)
my_data = genfromtxt('C:/myfile.xyz')
data = np.array(my_data)
print(data)

which prints my data like so:

[[626769.27 233220.76 14.66]
 [626768.91 233221.02 14.65]
 [626768.58 233221.26 14.64]
 ...

However, the next part of the PyKrige example found at https://geostat-framework.readthedocs.io/projects/pykrige/en/stable/examples/00_ordinary.html#sphx-glr-examples-00-ordinary-py uses the following code after reading the data into an array:

gridx = np.arange(0.0, 5.5, 0.5)
gridy = np.arange(0.0, 5.5, 0.5)
OK = OrdinaryKriging(
    data[:, 0], data[:, 1], data[:, 2],
    variogram_model="linear",
    verbose=False,
    enable_plotting=False,
)
z, ss = OK.execute("grid", gridx, gridy)
kt.write_asc_grid(gridx, gridy, z, filename="pykrigoutput.asc")
plt.imshow(z)
plt.show()

This is what's confusing me: I don't know how gridx and gridy are determined. Similarly, are z and ss just variable names, or do they mean something? My output will be a GeoTIFF (hopefully that's supported), and I hope to set an output cell size of 0.5. Can anyone advise me on how to use this module successfully, as the documentation gives no explanation of the example?
gis.stackexchange.com
December 7, 2025 at 7:07 AM
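On the question above: gridx and gridy are simply the x and y coordinates of the output grid nodes (the docs' example uses a toy 0-to-5 extent), and z, ss is ordinary tuple unpacking: OK.execute returns the interpolated surface and the kriging variance. A minimal sketch of building a 0.5-unit grid from the data's own extent, which is what the asker needs instead of the hard-coded example values (NumPy only; the kriging call stays as in the linked example):

```python
import numpy as np

# Sample rows in the same x, y, z layout as the question's XYZ file
data = np.array([
    [626769.27, 233220.76, 14.66],
    [626768.91, 233221.02, 14.65],
    [626768.58, 233221.26, 14.64],
])

cell = 0.5  # desired output cell size
gridx = np.arange(data[:, 0].min(), data[:, 0].max() + cell, cell)
gridy = np.arange(data[:, 1].min(), data[:, 1].max() + cell, cell)

# z, ss = OK.execute("grid", gridx, gridy) would then return two 2-D arrays:
#   z  : the kriged estimate at every grid node
#   ss : the kriging variance at the same nodes
print(gridx[0], gridy[0])
```

For GeoTIFF output, one common route is writing z with rasterio using a transform built from gridx/gridy and the cell size, though that step is outside PyKrige itself.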
Using Raster calculator gives ERROR 000539
I'm new to ArcGIS. I would like to calculate NDWI, so I'm using ArcToolbox => Spatial Analyst Tools => Map Algebra => Raster Calculator. But when I try to run it, I get this error:

Executing: RasterCalculator "Float("LT05_L1TP_196030_19990113_20180215_01_T1_sr_band2.tif" - "LT05_L1TP_196030_19990113_20180215_01_T1_sr_band5.tif") / Float("LT05_L1TP_196030_19990113_20180215_01_T1_sr_band2.tif" + "LT05_L1TP_196030_19990113_20180215_01_T1_sr_band5.tif")" D:\Users\paulc\Documents\SIG_ZHM\NDWI\1999_01_13_NDWI.tif
Start Time: Thu Oct 24 15:42:34 2019
Float(Raster(r"LT05_L1TP_196030_19990113_20180215_01_T1_sr_band2.tif") - Raster(r"LT05_L1TP_196030_19990113_20180215_01_T1_sr_band5.tif")) / Float(Raster(r"LT05_L1TP_196030_19990113_20180215_01_T1_sr_band2.tif") + Raster(r"LT05_L1TP_196030_19990113_20180215_01_T1_sr_band5.tif"))
ERROR 000539: Error running expression: rcexec()
Traceback (most recent call last):
  File "", line 1, in
  File "", line 2, in rcexec
  File "c:\program files (x86)\arcgis\desktop10.3\arcpy\arcpy\__init__.py", line 21, in
    from arcpy.geoprocessing import gp
  File "c:\program files (x86)\arcgis\desktop10.3\arcpy\arcpy\geoprocessing\__init__.py", line 14, in
    from _base import *
  File "c:\program files (x86)\arcgis\desktop10.3\arcpy\arcpy\geoprocessing\_base.py", line 598, in
    env = GPEnvironments(gp)
  File "c:\program files (x86)\arcgis\desktop10.3\arcpy\arcpy\geoprocessing\_base.py", line 595, in GPEnvironments
    return GPEnvironment(geoprocessor)
  File "c:\program files (x86)\arcgis\desktop10.3\arcpy\arcpy\geoprocessing\_base.py", line 548, in __init__
    import weakref
  File "D:\Programmes (x86)\Python\Lib\weakref.py", line 14, in
    from _weakref import (
ImportError: cannot import name _remove_dead_weakref
Failed to execute (RasterCalculator).

How can I fix this? I have already tried changing the folder path to a shorter one, without spaces or special characters, but it still seems like there is a problem with the path.
gis.stackexchange.com
December 7, 2025 at 6:08 AM
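A hedged reading of the traceback above: the failure is likely not about the raster paths at all. arcpy ends up importing weakref from D:\Programmes (x86)\Python instead of from the ArcGIS 10.3 install, which is the classic symptom of a second Python (or a stale PYTHONPATH/PYTHONHOME) shadowing the one ArcGIS ships. A small diagnostic, run in the ArcGIS Python window, makes any mismatch visible (the checks are generic; nothing here is ArcGIS-specific):

```python
import os
import sys
import weakref

# Which interpreter is running, and where is the stdlib being loaded from?
# If weakref.__file__ points outside the ArcGIS install, the environment
# variables printed below are the likely cause and should be cleared/fixed.
print(sys.executable)
print(weakref.__file__)
print(os.environ.get("PYTHONPATH"))
print(os.environ.get("PYTHONHOME"))
```

If the paths disagree, removing or correcting PYTHONPATH/PYTHONHOME (or uninstalling the conflicting Python from the system PATH) is the usual remedy.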
Least-cost paths observing maximum slope level in QGIS
Another inquiry concerning my comeback geofictional project. Over the past several days, I've returned to QGIS's tools for least-cost paths (on version 3.30 this time, after experimenting with them on 2.18 several years ago). Last weekend's challenge: generating highways that do not exceed a maximum slope-percentage limit, in my case 5% (1:20). (As recommended by a Quora user several years ago; U.S. federal limits are slightly higher, with 6% allotted in mountainous/hilly areas and 7% in cases where the speed limit is lower than 60 mph.) And for good reason: in my "study area" (a term others prefer to use), there are no contiguous (i.e. continuous without a break) stretches of land below 5%, not helped by the fact that it's highly volcanic in origin. Said stretches, some of them confined to the coast, last only several kilometres at best before higher slopes come in.

To my credit, I've tried out several dozen Raster Calculator combinations so far (involving slope values/levels, aspect, geomorphons, curvature, the Topographic Wetness Index [TWI], and more), and even installed the Least Cost Path plugin as an experiment, but the best attempts either take the roads over rivers (which no one wants) or violate the maximum limit. Preferably, I'd like to see segments that run parallel to the slope, i.e. on the same contour line or close to it, à la best practices in real-world road engineering. This is related to the cut-and-fill and full-benched processes, an example of which is provided on ResearchGate. (Now I know what that technique is, but then again, the same applied to "corniche" a few days prior.)

I've done a lot of research on the topic during this testing period, and similar questions have cropped up on GIS.SE and elsewhere over the past decade. (Note that a good deal of this coverage deals with non-QGIS procedures.)

* Qgis Least Cost Path with certain threshold (from May 2017; no responses yet. Let's hope mine doesn't go the same path [to pardon a pun].)
* Finding least cost path in QGIS? (September 2016; closed)
* Road design using Least cost path by longitudinal slope in ArcGIS Desktop? (2017; the algorithm provided in the answer is private)
* A path with slopes on the terrain (April 2023; for Rhinoceros Grasshopper users; links to...)
* Path Finder Using a Recursive Process - Example 8.5 (February 17, 2015; this is the process I'd like to emulate in QGIS)
* Least Cost Path Analysis (October/November 2020 slideshow; mentions something called the "ox-cart slope" on page 13, with a formula of f(s) = 1 + (s/c)^2, where s represents the slope map and c the critical slope)

And last but not least, and most importantly:

* Connect two contour lines with calculated line length (January 2015; mentions PEGGER, a vintage ArcGIS plugin. PEGGER calculates roads by this algorithm: d = i / (G / 100), where d is the distance segment; i, the vertical interval between contours; and G, the percentage of the desired slope. The Forest Road Designer plugin is the closest equivalent in QGIS, but it requires the SciPy library, has a Spanish-only manual and UI, and received its most recent update last August. A while afterward, GitHub user "Xadamo" reported that it might not load on recent versions of QGIS and may cause the program itself to crash. If you're reading this as one of its users, please let me know otherwise.)

Wish I could place a bounty on this thing, because it sure deserves one. Can anyone point to, or devise, the best solution for this scenario (preferably within five to seven days)? To help you along, here's the same DEM from last time. More shapefiles/rasters to come as needed.
gis.stackexchange.com
December 7, 2025 at 4:07 AM
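The PEGGER rule quoted in the question above is easy to sanity-check numerically: d = i / (G / 100) gives the along-ground distance between successive contour crossings that holds a target grade. A minimal sketch (the interval values are illustrative, not taken from the question's DEM):

```python
def segment_distance(interval_m, grade_percent):
    """PEGGER spacing rule: distance (m) between adjacent contour crossings
    needed to keep the road at grade_percent, given the contour interval."""
    return interval_m / (grade_percent / 100.0)

# A 10 m contour interval at the 5% limit needs 200 m between crossings;
# halving the grade limit doubles the required distance.
print(segment_distance(10, 5))    # 200.0
print(segment_distance(10, 2.5))  # 400.0
```

This is why slope-limited roads hug contours: the gentler the allowed grade, the longer the traverse between each pair of contour lines.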
Maven Dependency Resolution Issue for GeoServer GS:WPS
I am encountering an issue while building a GeoServer WPS module using Maven, and I could use some guidance.

Error description: I am attempting to build a GeoServer WPS module with mvn clean install. However, I'm facing a dependency resolution problem related to the gs-wps library. The error suggests that Maven cannot find the artifact in the specified repository (https://repo.osgeo.org/repository/geoserver-releases/). I have already tried several versions, with the same error each time, and when I checked the repository I did not find gs-wps.

Error:

Could not resolve dependencies for project org.geoserver:Test:jar:2.8-SNAPSHOT: The following artifacts could not be resolved: org.geoserver:gs-wps:jar:2.23.0 (absent): org.geoserver:gs-wps:jar:2.23.0 was not found in https://repo.osgeo.org/repository/geoserver-releases/ during a previous attempt. This failure was cached in the local repository and resolution is not reattempted until the update interval of osgeo-snapshot has elapsed or updates are forced -> [Help 1]

Is there an alternative repository, a recommended version, or any other solution to successfully resolve this dependency?

pom.xml (relevant parts):

<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <gt.version>29.0</gt.version>
  <gs.version>2.23.0</gs.version>
  <jts.version>1.18.2</jts.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.geotools</groupId>
    <artifactId>gt-process</artifactId>
    <version>${gt.version}</version>
  </dependency>
  <dependency>
    <groupId>org.geoserver.extension</groupId>
    <artifactId>gs-wps-core</artifactId>
    <version>${gs.version}</version>
  </dependency>
  <dependency>
    <groupId>org.locationtech.jts</groupId>
    <artifactId>jts-core</artifactId>
    <version>${jts.version}</version>
  </dependency>
  <dependency>
    <groupId>org.geoserver</groupId>
    <artifactId>gs-wps</artifactId>
    <version>2.23.0</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
  </plugins>
</build>

<repositories>
  <repository>
    <id>osgeo-snapshot</id>
    <name>OSGeo Snapshot Repository</name>
    <url>https://repo.osgeo.org/repository/geoserver-releases/</url>
  </repository>
  <repository>
    <id>osgeo</id>
    <name>Open Source Geospatial Foundation Repository</name>
    <url>https://repo.osgeo.org/repository/release/</url>
  </repository>
  <repository>
    <id>central</id>
    <url>https://repo.maven.apache.org/maven2</url>
  </repository>
</repositories>
gis.stackexchange.com
December 7, 2025 at 3:13 AM
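A hedged observation on the question above: in GeoServer releases the WPS code ships as gs-wps-core under the org.geoserver.extension group, which the pom already declares; a plain org.geoserver:gs-wps artifact does not appear to be published, which would explain the "not found" error regardless of the version tried. A sketch of the dependency to keep, reusing the pom's own property:

```xml
<!-- Assumption: gs-wps-core (org.geoserver.extension) is the intended WPS
     dependency; the separate org.geoserver:gs-wps entry can be dropped. -->
<dependency>
  <groupId>org.geoserver.extension</groupId>
  <artifactId>gs-wps-core</artifactId>
  <version>${gs.version}</version>
</dependency>
```

Because the earlier resolution failure was cached locally (as the error message says), re-run with mvn -U clean install, or delete the artifact's folder under ~/.m2/repository, to force Maven to try again.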
Using the Intersect filter in OWSlib to filter data between two WFS
I'm trying to use OWSLib to work with multiple WFS services in Python. For now, I'm trying to return all features in the sfb WFS that intersect with a specific feature retrieved from the sicar WFS.

from owslib.wfs import WebFeatureService, Authentication
from owslib.fes2 import PropertyIsEqualTo, Intersects
from owslib.etree import etree

auth = Authentication()
auth.verify = False

sicar = WebFeatureService(
    url="https://geoserver.car.gov.br/geoserver/sicar/ows",
    auth=auth,
    version='2.0.0')

sfb = WebFeatureService(
    'http://sistemas.florestal.gov.br/geoserver/ows',
    version='2.0.0')

car = 'MT-5103700-92A39DC62FDF4D88874F218D41BE1C7F'
filter_car = PropertyIsEqualTo('cod_imovel', car).toXML()
carInfo = sicar.getfeature(typename='sicar:sicar_imoveis_mt',
                           filter=filter_car,
                           method='post',
                           outputFormat='json')

The code above works fine; the problem starts when I try to build the Intersects filter:

geom_property = 'geo_area_imovel'
carInfoStr = carInfo.read().decode('utf-8')
geom = json.loads(carInfoStr)['features'][0]['geometry']['coordinates']
# I'm trying to pass the coordinates list to Intersects; I also tried other
# keys of the object and the XML response, but those don't work either.
filter_contains = Intersects(geom_property, geom).toXML()
intersected = sfb.getfeature(typename='CNFP_orig:APP_Total_simp',
                             filter=filter_contains,
                             outputFormat='json',
                             method='post')

The returned errors involve the toXML function, which depends on the geometry I pass to Intersects. In this case it's a plain Python object, but it needs to be one of OWSLib's own geometry constructors, and the library only seems to have a Point constructor. I could pass a point if I could retrieve the centroid of the geometry, but I don't know how to do that with this library. Even the example given in an official PR is not working.

The PR with the example: https://github.com/geopython/OWSLib/pull/780

The error in this code:

Traceback (most recent call last):
  File "/home/gabrielubuntu/repos/pygeogis/app.py", line 38, in
    filter_contains = Intersects(geom_property, geom).toXML()
  File "/home/gabrielubuntu/repos/pygeogis/.venv/lib/python3.10/site-packages/owslib/fes2.py", line 420, in toXML
    node.append(self.geometry.toXML())
AttributeError: 'dict' object has no attribute 'toXML'
gis.stackexchange.com
December 7, 2025 at 2:10 AM
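On the centroid fallback mentioned in the question above: the centroid of a polygon's exterior ring can be computed directly from the GeoJSON coordinates with the shoelace formula, no extra library needed (shapely's shape(geom).centroid is a one-line alternative if shapely is available). The ring below is illustrative; the real one would be the first ring of the question's coordinates list:

```python
def polygon_centroid(ring):
    """Area-weighted centroid of an exterior ring [(x, y), ...] via the
    shoelace formula; accepts open or closed rings."""
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]          # close the ring if needed
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

# A 2x2 square centred on (1, 1):
print(polygon_centroid([(0, 0), (2, 0), (2, 2), (0, 2)]))  # (1.0, 1.0)
```

The resulting (x, y) pair could then be wrapped in whatever Point constructor the installed OWSLib version exposes and passed to Intersects, though whether a centroid point is an acceptable substitute for the full polygon depends on the analysis.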
Overlay Union Geopandas improve performance
How can I speed up the "union" of two shapefiles using geopandas? I have two large shapefiles:

geodataframe_left.shape = (3610, 12) (500 MB) (link)
geodataframe_right.shape = (16396, 3) (200 MB) (link)

One file contains global province boundaries (GADM); the other contains Hydrobasin level 6. I've tried multiple approaches (PostGIS, Google BigQuery, ArcGIS), but nothing beats geopandas' one-line command:

gdf_union = gpd.overlay(gdf_left, gdf_right, how='union')

There is, however, one problem: the command is extremely slow on larger datasets, including mine. I've waited for almost an hour now without success. I'm running my code on a large Amazon EC2 instance (32 GB RAM). How can I improve the performance of this operation? Is it possible to use multiprocessing, for example, or can I split the problem into smaller chunks so I can monitor progress?

UPDATE:

* I tried updating to geopandas 0.4.0; still no result after running for two hours.
* I used the use_sindex = True flag. Deprecated since 0.4.0 (it always uses sindex now).
* I implemented the tile approach and found weird behavior. Details below.

I created a fishnet grid of 10 x 10 degree polygons that I use to clip both layers before performing the union. The processing times per cell differ wildly, and there are 4 cells that take more than 10 minutes to process; one cell even takes 40 minutes. When the process runs in parallel I get the following error (the script continues, though):

AttributeError: 'IndexStreamHandle' object has no attribute '_ptr'
Exception ignored in:
Traceback (most recent call last):
  File "/opt/anaconda3/envs/python35/lib/python3.5/site-packages/rtree/index.py", line 875, in __del__
    self.destroy()
  File "/opt/anaconda3/envs/python35/lib/python3.5/site-packages/rtree/index.py", line 863, in destroy
    if self._ptr is not None:

What's interesting is that these cells are at the 180th meridian. I've uploaded the GPKG here. Maybe shapely doesn't like this hemisphere-crossing stuff.

UPDATE 2: Running ArcMap locally, the process takes approx. 10 minutes. Instead of using geopackages, I used shapefiles. file1 file2

Executing: Union "hybas_lev06_v1c_merged_fiona_V04 #;gadm36_1 #" C:\Users\Rutger.Hofste\Desktop\werkmap\union_benchmark\output\union_arcgis_global.shp ALL # GAPS
Start Time: Fri Nov 30 10:32:04 2018
Reading Features...
Processing Tiles...
Assembling Tile Features...
Succeeded at Fri Nov 30 10:43:02 2018 (Elapsed Time: 10 minutes 57 seconds)

I also tried "Union" (both the default and the SAGA version) in QGIS, but that process is also ridiculously slow: after 30 minutes the progress bar was at 4%.

UPDATE 3: I'm implementing the tiled approach; however, for a small number of polygons the result of an intersection in shapely is not a MultiPolygon or Polygon but a GeometryCollection with multiple geometries, including LineStrings, or just LineStrings. I would expect the result of an intersection to be a Polygon or MultiPolygon.
gis.stackexchange.com
December 6, 2025 at 11:06 PM
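A sketch of the tiled-plus-parallel pattern discussed in the question's updates. The worker here is a stand-in so the orchestration runs anywhere; in the real run it would clip both GeoDataFrames to the cell and return gpd.overlay(left_clip, right_clip, how='union'). For CPU-bound geometry work a ProcessPoolExecutor (one process per tile) would sidestep the GIL; ThreadPoolExecutor is used here only to keep the sketch self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def union_tile(cell):
    """Stand-in worker; replace the body with the per-cell clip + overlay."""
    return ('done', cell)

# 10 x 10 degree fishnet, as in the question's update
cells = [(x, y) for x in range(-180, 180, 10) for y in range(-90, 90, 10)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(union_tile, cells))

print(len(results))  # 648 cells, one result each
```

Mapping over an explicit cell list also gives the progress monitoring the asker wants (each completed future is one tile). For the GeometryCollection issue in UPDATE 3, a common fix is to keep only the polygonal parts of each intersection result before assembling the output.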
Timeout error when using features_from_place
I'm trying to execute the following in Python on Google Colab to get all buildings within Manhattan:

tags = {'building': True}

# Geocode the location to get its geometry
ft = ox.geocode_to_gdf('Manhattan Island')

# Extract the polygon geometry from the GeoDataFrame
polygon_geometry = ft.geometry.iloc[0]

# Retrieve features within the polygon with the specified tags
buildings_net = ox.features_from_polygon(polygon_geometry, tags=tags)

For some reason, I get this error:

TimeoutError                              Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, body, headers, retries, timeout, chunked, response_conn, preload_content, decode_content, enforce_content_length)
    536 try:
--> 537     response = conn.getresponse()
    538 except (BaseSSLError, OSError) as e:

23 frames

TimeoutError: The read operation timed out

The above exception was the direct cause of the following exception:

ReadTimeoutError                          Traceback (most recent call last)
ReadTimeoutError: HTTPSConnectionPool(host='overpass-api.de', port=443): Read timed out. (read timeout=180)

During handling of the above exception, another exception occurred:

ReadTimeout                               Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    530     raise SSLError(e, request=request)
    531 elif isinstance(e, ReadTimeoutError):
--> 532     raise ReadTimeout(e, request=request)
    533 elif isinstance(e, _InvalidHeader):
    534     raise InvalidHeader(e, request=request)

ReadTimeout: HTTPSConnectionPool(host='overpass-api.de', port=443): Read timed out. (read timeout=180)

I was initially doing the following (which does work); you need to specify the distance, since it seems to only get 100 by default. I just want to get all the buildings in the region by itself, without needing to specify a distance:

tags = {'building': True}
# need to change this distance to capture the entire city
building_network = ox.features_from_address('Manhattan Island', tags, dist=3000)

Is there a way to get around this timeout error?
gis.stackexchange.com
December 6, 2025 at 10:02 PM
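Two hedged things to try for the timeout above: raise the Overpass timeout (recent osmnx versions expose an ox.settings timeout value, and the 180 s in the traceback matches its usual default; older versions configured it via ox.config), or query in smaller pieces. Subdividing the area's bounding box is plain arithmetic, sketched below; the Manhattan extent is approximate and the per-quadrant results would be concatenated afterwards.

```python
def split_bbox(west, south, east, north):
    """Split a (west, south, east, north) box into four equal quadrants."""
    mx, my = (west + east) / 2, (south + north) / 2
    return [
        (west, south, mx, my), (mx, south, east, my),
        (west, my, mx, north), (mx, my, east, north),
    ]

# Approximate Manhattan extent (an assumption, not from the question)
quads = split_bbox(-74.03, 40.68, -73.90, 40.88)
print(len(quads))  # 4
# Each quadrant could then be fetched separately with an osmnx features
# call and the resulting GeoDataFrames concatenated, trimming duplicates.
```

Splitting recursively (quadrants of quadrants) keeps each Overpass request small enough to finish within whatever timeout the server allows.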
"QgsProcessingException: There were errors executing the algorithm" error when selecting and deselecting features with PyQGIS
I'm trying to apply the algorithm "Random Points along Lines" at the feature level. I wrote some PyQGIS code to iterate through the features, select each individual feature, and apply the algorithm to the layer with that selection, but it's not working properly. In fact, the script works fine until I include layer.removeSelection() to clear the selection; that triggers the error below (in the run method, though). If I don't include that line, the script works fine (but the selection becomes cumulative). I've tried a few alternatives, but none seems to work. Does anybody have a suggestion?

Error:

Traceback (most recent call last):
  File "C:\OSGEO4~1\apps\Python37\lib\code.py", line 90, in runcode
    exec(code, self.locals)
  File "", line 1, in
  File "", line 27, in
  File "C:/OSGEO4~1/apps/qgis/./python/plugins\processing\tools\general.py", line 96, in run
    return Processing.runAlgorithm(algOrName, parameters, onFinish, feedback, context)
  File "C:/OSGEO4~1/apps/qgis/./python/plugins\processing\core\Processing.py", line 183, in runAlgorithm
    raise QgsProcessingException(msg)
_core.QgsProcessingException: There were errors executing the algorithm.

Code:

layer = qgis.utils.iface.activeLayer()
temp_layers = []

for f in layer.getFeatures():
    if f['remaining']:
        layer.removeSelection()
        layer.select(f.id())
        #layer.selectByIds([f.id()])  # alternative method, same error
        print("selected: " + str(layer.selectedFeatureCount()))
        print("selected: " + str(layer.selectedFeatureIds()))
        print("new points: " + str(f['remaining']))
        params = {
            'INPUT': QgsProcessingFeatureSourceDefinition(layer.id(), True),
            'MIN_DISTANCE': 5,
            'OUTPUT': 'memory:',
            'POINTS_NUMBER': f['remaining']
        }
        results = processing.run('qgis:randompointsalongline', params)
        #results = processing.runAndLoadResults('qgis:randompointsalongline', params)

# merging temp layers
params = {
    'LAYERS': temp_layers,
    'CRS': 'EPSG:3857',
    'OUTPUT': 'memory:'
}
processing.runAndLoadResults('qgis:mergevectorlayers', params)
gis.stackexchange.com
December 6, 2025 at 9:08 PM
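A hedged guess about the error above: the loop changes the layer's selection while layer.getFeatures() is still being iterated, so the algorithm runs against a selection that is mutating under it. Snapshotting the needed values first avoids that. The sketch below shows only the pattern, with stand-in data; inside QGIS the pairs would come from [(f.id(), f['remaining']) for f in layer.getFeatures()], and run_algorithm would wrap layer.selectByIds plus processing.run.

```python
def run_per_feature(pairs, run_algorithm):
    """Iterate over a pre-built list of (fid, remaining) pairs, not a live
    feature iterator, and collect each algorithm output for later merging."""
    outputs = []
    for fid, remaining in pairs:
        if remaining:
            # In QGIS: layer.selectByIds([fid]) here, then processing.run(...)
            outputs.append(run_algorithm(fid, remaining))
    return outputs

fake_pairs = [(1, 3), (2, 0), (3, 5)]   # stand-in for (f.id(), f['remaining'])
print(run_per_feature(fake_pairs, lambda fid, n: (fid, n)))  # [(1, 3), (3, 5)]
```

Note also that the original script never appends results['OUTPUT'] to temp_layers, so the final mergevectorlayers step receives an empty list either way.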
No response from addPart operation when trying to upload parts of .tpk to AGOL (Python, requests, ArcGIS REST API)
Trying to automate the updating of a large .tpk on AGOL. When the .tpk is updated on my servers, I do the following:

* Break it into parts
* Call the updateItem operation with multipart = true
* Make a request to upload each part using addPart
* Make a request to the commit operation to finalize the update to the .tpk on AGOL

My issue is that I get no response when using addPart, and then, when the commit operation is hit in my code, it says there are no parts to commit. A very odd result, as something is definitely happening when the addPart operation is performed: memory usage kicks up to ~3 GB (1 GB files, so this seems right), and each addPart takes maybe 5 minutes or so to complete before moving to the next. It's not a firewall/proxy issue with the request, as I've made multiple requests to AGOL in this environment and had no issues getting responses.

Code:

update_url = 'https://lipa.maps.arcgis.com/sharing/rest/content/users/{}/items/{}/update'.format(username, tpk_id)
up_data = {'f': 'json',
           'token': token,
           'title': layer_name,
           'itemId': tpk_id,
           'type': 'Tile Package',
           'overwrite': 'true',  # existing item, overwrite is set to true
           'async': 'true',
           'multipart': "true"}
update_response = requests.post(update_url, data=up_data, verify=cert).json()
print update_response

part_num = 1
part_url = 'https://lipa.maps.arcgis.com/sharing/rest/content/users/{}/items/{}/addPart'.format(username, tpk_id)
for part in os.listdir(parts):
    path = r"\\psegliny.com\oms\oms_gis_prd\fileshare_mapping\cache_extent\es_secondary_tpks\full_service_area"
    part = path + "\{}".format(part)
    print part
    print(str(part_num))
    ##files_up = {"file": open(part, 'rb')}
    with open(part, "rb") as f:
        files_up = {"file": f}
        data = {'f': 'json',
                'token': token,
                'partNum': part_num}
        part_response = requests.post(part_url, data=data, files=files_up, verify=cert)
        part_num += 1
        try:
            jres = json.loads(part_response.text)
            print jres
        except:
            print "no response"
        ##print add_part_response

commit_url = 'https://lipa.maps.arcgis.com/sharing/rest/content/users/{}/items/{}/commit'.format(username, tpk_id)
data = {'f': 'json', 'token': token}
commit_response = requests.post(commit_url, data=data, verify=cert).json()
print commit_response

Anyone know what I am doing wrong? It's very odd that I get no response at all; at the very least I should get a failed/success response back (as per the documentation).
gis.stackexchange.com
December 6, 2025 at 6:09 PM
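A small sanity check for the "break it into parts" step in the question above: verify locally that the parts reassemble to the original bytes before uploading, and inspect part_response.status_code and the raw part_response.text rather than only attempting json.loads (a silent json failure hides whatever AGOL actually returned). Sizes and data below are illustrative; the ArcGIS REST API's addPart documentation specifies the actual allowed part sizes.

```python
def split_parts(blob, part_size):
    """Split a byte string into consecutive parts of at most part_size bytes."""
    return [blob[i:i + part_size] for i in range(0, len(blob), part_size)]

blob = bytes(range(256)) * 4      # stand-in for the .tpk contents (1024 bytes)
parts = split_parts(blob, 300)

assert b''.join(parts) == blob    # parts reassemble losslessly
print(len(parts))  # 4
```

If the parts check out locally, the next things to inspect are the HTTP status of each addPart response and whether partNum reached the server as a form field, since commit only sees parts that addPart actually registered.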