...
Run the task and confirm that the data has been loaded to the specified destination, using QGIS or pgAdmin.
Check that the data has been transferred correctly
Create the Python 3 environment
Install .NET Framework 4.8.
Install Microsoft C++ Build Tools from https://visualstudio.microsoft.com/visual-cpp-build-tools/ --> run the installer --> select the C++ build tools workload --> check MSVC v142 and the Windows 10 SDK --> wait for the installation to finish.
mkdir e:\isharedata\utilities\DataSourceChecker
cd e:\isharedata\utilities\DataSourceChecker
c:\python3\python.exe -m venv ./venv
Run set to display the current environment variables.
Edit venv\Scripts\activate.bat: redefine PATH so it no longer references c:\python27 or any OGR paths, and set GDAL_DATA to blank if it exists as an environment variable.
venv\Scripts\activate.bat
Download https://github.com/cgohlke/geospatial-wheels/releases/download/v2024.9.22/GDAL-3.9.2-cp39-cp39-win_amd64.whl and copy it to e:\isharedata\utilities\DataSourceChecker.
pip install GDAL-3.9.2-cp39-cp39-win_amd64.whl
pip install numpy
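The activate.bat edits above might look like the following. This is a hypothetical sketch; the actual directories to keep in PATH depend on the machine, and the point is only that c:\python27 and any OGR directories are removed:

```bat
rem Hypothetical replacement PATH: keep the venv and system directories,
rem drop any c:\python27 or OGR entries that were inherited
set "PATH=%VIRTUAL_ENV%\Scripts;C:\Windows\system32;C:\Windows"
rem Blank out GDAL_DATA so the wheel's bundled data files are used instead
set "GDAL_DATA="
```

(%VIRTUAL_ENV% is defined earlier in activate.bat by the venv module itself.)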
Create the feature_count.py script
Store the following as E:\iShareData\Utilities\DataSourceChecker\feature_count.py.
import sys
import configparser
from osgeo import ogr


def count_features_in_ogr(ogr_driver, ogr_connection, ogr_layer):
    # Register all OGR drivers (including VRT) and report errors
    # via return values rather than exceptions
    ogr.RegisterAll()
    ogr.DontUseExceptions()

    # Open the datasource; VRT files and PostgreSQL connection strings can be
    # opened directly, other formats are opened through their named driver
    if ogr_driver in ('OGR_VRT', 'PostgreSQL'):
        datasource = ogr.Open(ogr_connection)
    else:
        driver = ogr.GetDriverByName(ogr_driver)
        datasource = driver.Open(ogr_connection, 0)
    if datasource is None:
        print(f"Failed to open {ogr_connection}")
        return 0

    # Find the requested layer and return its feature count
    for i in range(datasource.GetLayerCount()):
        layer = datasource.GetLayer(i)
        if layer.GetName() == ogr_layer:
            return layer.GetFeatureCount()
    print(f"Layer {ogr_layer} not found in {ogr_connection}")
    return 0


def list_ogr_drivers():
    # Print the sorted, de-duplicated names of the available OGR drivers
    driver_names = []
    for i in range(ogr.GetDriverCount()):
        name = ogr.GetDriver(i).GetName()
        if name not in driver_names:
            driver_names.append(name)
    driver_names.sort()
    for name in driver_names:
        print(name)
    return 0


if __name__ == "__main__":
    config = configparser.ConfigParser()
    config.read(sys.argv[1])

    source_driver = config['SOURCE']['source_driver']
    source_connection = config['SOURCE']['source_connection']
    source_layer = config['SOURCE']['source_layer']
    source_feature_count = count_features_in_ogr(source_driver, source_connection, source_layer)

    destination_driver = config['DESTINATION']['destination_driver']
    destination_connection = config['DESTINATION']['destination_connection']
    destination_layer = config['DESTINATION']['destination_layer']
    destination_feature_count = count_features_in_ogr(destination_driver, destination_connection, destination_layer)

    if source_feature_count != destination_feature_count:
        print(f"feature counts do not match: source={source_feature_count} destination={destination_feature_count}")
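If the Studio task should fail visibly when the counts differ, the final comparison could return a non-zero exit code rather than only printing a message. A minimal sketch, using a hypothetical helper name not present in the script above:

```python
def compare_counts(source_count, destination_count):
    """Return a process exit code: 0 when the feature counts match, 1 otherwise."""
    if source_count != destination_count:
        print(f"feature counts do not match: "
              f"source={source_count} destination={destination_count}")
        return 1
    print(f"feature counts match: {source_count}")
    return 0

# The __main__ block could then end with:
# sys.exit(compare_counts(source_feature_count, destination_feature_count))
```

Whether a non-zero exit status surfaces as a task failure depends on how Studio evaluates the program's return code.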
Each dataset that is being imported will need a configuration file. Edit it accordingly to suit the source and destination datasets. Here we wish to check that the VRT file contains the same number of features as the version held by the Spatial Data Warehouse. Store the file as E:\iShareData\Data\<project>\feature_count_<project>.config.
[SOURCE]
source_driver = OGR_VRT
source_connection = E:\iShareData\Data\path\file.vrt
source_layer = the_layer
[DESTINATION]
destination_driver = PostgreSQL
destination_connection = PG: host=<SDW host> user=isharedata_gis dbname=sdw password=<password> port=5432
destination_layer = the_schema.the_layer_temp
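The script reads this file with Python's standard configparser module. A minimal sketch of how the sections above are parsed, using an inline sample with placeholder values in place of a real file on disk:

```python
import configparser

# Illustrative sample mirroring the config layout above (placeholder values)
sample = """
[SOURCE]
source_driver = OGR_VRT
source_connection = E:\\iShareData\\Data\\path\\file.vrt
source_layer = the_layer

[DESTINATION]
destination_driver = PostgreSQL
destination_connection = PG: host=localhost user=isharedata_gis dbname=sdw port=5432
destination_layer = the_schema.the_layer_temp
"""

config = configparser.ConfigParser()
config.read_string(sample)  # the script itself uses config.read(path)

print(config['SOURCE']['source_driver'])           # OGR_VRT
print(config['DESTINATION']['destination_layer'])  # the_schema.the_layer_temp
```

Note that configparser splits each line on the first "=" it finds, so the embedded "=" signs inside the PG connection string are preserved as part of the value.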
Create a Program Task in Studio to run the script
Create a program task with the following details. Adjust the location of the configuration file.
Program: E:\iShareData\Utilities\DataSourceChecker\venv\Scripts\python.exe
Additional Parameters: E:\iShareData\Utilities\DataSourceChecker\feature_count.py E:\iShareData\Data\<project>\feature_count_<project>.config
Run the task and confirm the two datasets have the same number of features
Transfer the data to the 'live' location
Now that we are confident the data has been downloaded and transferred to the Spatial Data Warehouse correctly, we can copy the earlier Spatial Data Transformation task and adjust the destination table so that it is in the correct location for any mapsources wanting to use the data.
Create a mapsource to use the data
Now that the data is loaded into the Spatial Data Warehouse, we need to create a mapsource and layer that uses the data. If any classification styling is required, it is recommended to load the data into QGIS, choose categorised styling, and export the style as an SLD file. This can then be imported into iShare. Ensure the field used for categorised styling is exposed as an attribute field.
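A categorised SLD exported from QGIS follows the general shape of the OGC Styled Layer Descriptor standard. The fragment below is a minimal hypothetical example for a single category value (layer name, field name, value, and colour are all placeholders); QGIS generates one Rule per category in the classification:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<StyledLayerDescriptor version="1.1.0"
    xmlns="http://www.opengis.net/sld"
    xmlns:se="http://www.opengis.net/se"
    xmlns:ogc="http://www.opengis.net/ogc">
  <NamedLayer>
    <se:Name>the_layer</se:Name>
    <UserStyle>
      <se:FeatureTypeStyle>
        <se:Rule>
          <se:Name>CategoryA</se:Name>
          <!-- Filter selecting features whose styling field equals this category -->
          <ogc:Filter>
            <ogc:PropertyIsEqualTo>
              <ogc:PropertyName>category_field</ogc:PropertyName>
              <ogc:Literal>A</ogc:Literal>
            </ogc:PropertyIsEqualTo>
          </ogc:Filter>
          <se:PolygonSymbolizer>
            <se:Fill>
              <se:SvgParameter name="fill">#2e8b57</se:SvgParameter>
            </se:Fill>
          </se:PolygonSymbolizer>
        </se:Rule>
      </se:FeatureTypeStyle>
    </UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>
```

The PropertyName element is why the styling field must be exposed as an attribute field: the filter cannot match against a column the layer does not publish.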