Brightly Confirm
Overview
Brightly Confirm provides asset management software. Local authorities use Confirm to manage assets such as street furniture, along with any fault reporting against those assets.
Integration with Brightly Confirm TLDR
- Confirm exports are placed on an SFTP server.
- A Python script downloads the data.
- A standard spatial data transformation task imports the data into a temporary location in the Spatial Data Warehouse (SDW).
- A check is run on the data to ensure all features have been imported successfully.
- If the checks pass, the import is re-run to the 'live' location in the SDW.
Confirm exports are placed on an SFTP server
Each export from Confirm should be a comma-delimited CSV file containing all the fields needed for display and querying in iShare. Some means of geographic referencing is required; in the following examples latitude/longitude or easting/northing coordinates are used.
It is worth noting any datetime fields at this point. PostgreSQL prefers to import datetime strings as YYYY-MM-DD HH24:MI:SS, so some manipulation will be required to adjust these fields into the correct format. The following are two examples from a customer's Brightly Confirm installation.
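As a minimal sketch of the manipulation required, a DD/MM/YYYY timestamp can be rewritten into PostgreSQL's preferred form in Python (the function name is illustrative):

```python
from datetime import datetime

def to_postgres_datetime(value):
    """Convert a Confirm-style timestamp (DD/MM/YYYY HH24:MI:SS) into the
    YYYY-MM-DD HH24:MI:SS form preferred by PostgreSQL."""
    return datetime.strptime(value, "%d/%m/%Y %H:%M:%S").strftime("%Y-%m-%d %H:%M:%S")

print(to_postgres_datetime("19/06/2022 11:37:28"))  # 2022-06-19 11:37:28
```

In practice this conversion is done inside the VRT's SrcSQL (see below) rather than in Python, but the transformation is the same.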
Street Light Faults Field Definitions
Field Name | Data Type | Example / Notes |
--- | --- | --- |
Asset Id | character varying | AB/13/E |
Central Asset Id | character varying | Unique ID in Confirm |
Job Number | integer | 173243 |
Job Entry Date | datetime (DD/MM/YYYY HH24:MI:SS) GMT | 19/06/2022 11:37:28 |
Site Name | character varying | Albert Road |
Locality Name | character varying | N4 |
Feature Location | character varying | S/O No 86A Florence Road |
Job Notes | character varying | General Street Lighting Issue - Lamp on 24/7 |
Priority Name | character varying | Street Lighting - 7 days |
Status Code | character varying | JS33 |
Status Name | character varying | Job Received by Contractor |
Asset Number | decimal | 200009.01 |
Feature Type Code | character varying | LSL |
Cms Activation Date | datetime (DD/MM/YYYY HH24:MI:SS) GMT | 19/06/2022 11:37:28 |
Feat Cent East | decimal | 530981.73 |
Feat Cent North | decimal | 187613.56 |
Longitude | character varying | -0.12345678 |
Latitude | character varying | 51.57218637 |
Url | character varying | |
Defect Type | character varying | Column - Day Burner |
Defect Code | character varying | LCDB |
wkb_geometry | well-known text geometry | POINT (-0.09102906 51.58654038) |
Feature Group Code | character varying | SLCO |
Feature Group Name | character varying | SL-Street Lighting Unit |
Feature Type Name | character varying | Street Light Column |
Waste Assets Field Definitions
Field Name | Data Type | Example |
--- | --- | --- |
Feature Type Name | character varying | Dog Bin |
Site Code | character varying | P421054 |
Site Name | character varying | Albany Close |
Locality Name | character varying | N15 |
Ward Name | character varying | St Anns |
Asset Number | character varying | 7,001.00 |
Feature Location | character varying | by L/C AC805H |
Central Asset Id | character varying | H5400002729 |
Material | character varying | Plastic |
Manufacturer | character varying | Unknown |
Mounting Configuration | character varying | Floor mounted |
Asset Position On Site | character varying | Back of Path |
Surface Base Type | character varying | Block Paving |
Owner | character varying | Waste Management |
Asset Id | character varying | S1 |
Survey Date | datetime (DD/MM/YYYY HH24:MI:SS) GMT | 11/24/23 0:00 |
Feat Cent East | decimal | 532047.79 |
Feat Cent North | decimal | 189119.58 |
Latitude | character varying | 51.58547145 |
Longitude | character varying | -0.09542232 |
Feature Start Date | datetime (DD/MM/YYYY HH24:MI:SS) GMT | 6/13/24 17:34 |
Feature Type Code | character varying | GRIT (for icon and/or colour coding) |
url | character varying | |
Feature Group Name | character varying | HI-Grit/Salt Bins |
Feature Group Code | character varying | GRIT |
Downloading the data
The following Python 3 script downloads the data from the SFTP server. A configuration file defines the server details and the local location of the downloaded file.
Create the Python 3 virtual environment
Ensure Python 3 is installed.
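As a sketch (the venv location and the paramiko dependency are assumptions; adjust to your installation):

```
python -m venv E:\iShareData\Utilities\SFTPFileDownloader\venv
E:\iShareData\Utilities\SFTPFileDownloader\venv\Scripts\pip install paramiko
```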
Create the SFTP file downloader script
Copy the script below and save it as E:\iShareData\Utilities\SFTPFileDownloader\sftp_file_downloader.py
Create the SFTP file downloader config file
A config file will be required for each file on the SFTP server. Adjust the following example and save under E:\iShareData\Data\<project>\sftp_<project>.config
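An illustrative example (section and key names are assumptions to match whichever downloader script is in use; adjust to your environment):

```ini
; sftp_<project>.config -- hedged example
[sftp]
host = sftp.example.com
port = 22
username = confirm_export
password = <password>
remote_path = /exports/waste_assets.csv
local_path = E:\iShareData\Data\<project>\waste_assets.csv
```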
Create a Program Task in Studio
Create a program task with the following details. Adjust the location of the configuration file.
Run the task and confirm the file has downloaded
Transfer the data to a temporary location in the Spatial Data Warehouse
Ensure that Studio supports VRT files for Spatial Data Transformations.
Inspect E:\iShareData\LIVE\config\studio\config\ConTypes.xml and check the following FileConType is defined. If not, add it.
Create a VRT for the CSV file.
VRTs are a virtual format supported by OGR/GDAL. More information can be found at https://gdal.org/en/latest/drivers/vector/vrt.html.
Ensure that the OGRVRTLayer name matches the CSV file name. It is best to use lower case and underscores, with no spaces, for both.
The following is an example for the waste assets. There is a SrcSQL definition containing a SQL statement that adjusts the datetime fields into a format suitable for PostgreSQL. The source 'table' must also match the CSV file name. Note that each field name has a type, e.g. string or numeric.
The geometries are defined by the GeometryType (wkbPoint), LayerSRS (EPSG:27700) and the GeometryField elements.
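A minimal sketch of such a VRT, assuming the file is named waste_assets.csv (field list abbreviated; the SQL rewrites one datetime field as an example, and all names are illustrative):

```xml
<OGRVRTDataSource>
  <OGRVRTLayer name="waste_assets">
    <!-- Directory containing waste_assets.csv; OGR exposes the file as a
         table named after it -->
    <SrcDataSource relativeToVRT="1">.</SrcDataSource>
    <!-- Rewrite DD/MM/YYYY HH24:MI:SS into YYYY-MM-DD for PostgreSQL -->
    <SrcSQL>
      SELECT "Site Name", "Feature Type Name",
             "Feat Cent East", "Feat Cent North", "Asset Number",
             CONCAT(SUBSTR("Survey Date", 7, 4), '-',
                    SUBSTR("Survey Date", 4, 2), '-',
                    SUBSTR("Survey Date", 1, 2)) AS survey_date
      FROM waste_assets
    </SrcSQL>
    <GeometryType>wkbPoint</GeometryType>
    <LayerSRS>EPSG:27700</LayerSRS>
    <GeometryField encoding="PointFromColumns"
                   x="Feat Cent East" y="Feat Cent North"/>
    <Field name="site_name" src="Site Name" type="String"/>
    <Field name="feature_type_name" src="Feature Type Name" type="String"/>
    <Field name="asset_number" src="Asset Number" type="Real"/>
    <Field name="survey_date" src="survey_date" type="String"/>
  </OGRVRTLayer>
</OGRVRTDataSource>
```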
It is worth testing that the VRT file has been defined correctly. You can use OGR to do this. Identify the location of OGR by inspecting the OGR Path parameter in Studio's settings. Open a command window and navigate to that folder. Enter the following to test the VRT file. If the file is correct, OGR will report back the features found in the VRT file.
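A typical ogrinfo invocation for this check (the layer name is assumed to be waste_assets):

```
ogrinfo -ro -al -so waste_assets.vrt
```

Here -ro opens the file read-only, -al lists all layers, and -so prints a summary (including the Feature Count) instead of dumping every feature.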
Create the Spatial Data Transformation task
For the Source Data select 'VRT File' and enter the path to the VRT file. For output select a schema and a temporary table location for the data. Uncheck 'Projection as British National Grid' if the VRT file is not using EPSG:27700 as the SRS. If you do uncheck this option, specify the destination projection by selecting 'Show expert mode' and entering the following in the Additional Parameters box.
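Assuming Studio passes Additional Parameters through to ogr2ogr, the destination projection would typically be set with the standard -t_srs flag (the EPSG code shown is only an example):

```
-t_srs "EPSG:4326"
```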
Run the task and confirm the data has been loaded to the destination specified by using QGIS or pgAdmin.
Check that the data has been transferred correctly
Create the Python 3 environment
Create the feature_count.py script
Store the following as E:\iShareData\Utilities\DataSourceChecker\feature_count.py.
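A minimal sketch of such a checker, assuming GDAL's ogrinfo is on the PATH and an INI-style config (section and key names are illustrative):

```python
# feature_count.py -- hedged sketch: compares feature counts between a
# source dataset (e.g. a VRT) and a destination (e.g. a PostgreSQL table)
# by shelling out to GDAL's ogrinfo in summary mode.
import configparser
import re
import subprocess
import sys

def parse_feature_count(ogrinfo_output):
    """Extract the first 'Feature Count: N' value from ogrinfo -so output."""
    m = re.search(r"Feature Count:\s*(\d+)", ogrinfo_output)
    if m is None:
        raise ValueError("No 'Feature Count' found in ogrinfo output")
    return int(m.group(1))

def count_features(datasource, layer):
    """Run ogrinfo read-only in summary mode and return the feature count."""
    out = subprocess.run(
        ["ogrinfo", "-ro", "-so", datasource, layer],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_feature_count(out)

if __name__ == "__main__" and len(sys.argv) > 1:
    cfg = configparser.ConfigParser()
    cfg.read(sys.argv[1])
    src = count_features(cfg["source"]["datasource"], cfg["source"]["layer"])
    dst = count_features(cfg["destination"]["datasource"],
                         cfg["destination"]["layer"])
    print(f"source={src} destination={dst}")
    sys.exit(0 if src == dst else 1)
```

The script exits with status 0 when the counts match, so a Studio program task can treat a non-zero exit code as a failed check.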
Each dataset that is being imported will need a configuration file. Edit accordingly to suit the source and destination dataset. Here we wish to check that the VRT file contains the same number of features as the version held by the Spatial Data Warehouse. Store the file as E:\iShareData\Data\<project>\feature_count_<project>.config.
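An illustrative config for a feature-count check (the PG: connection string, layer names, and key names are assumptions; adjust to your installation):

```ini
; feature_count_<project>.config -- hedged example
[source]
datasource = E:\iShareData\Data\<project>\waste_assets.vrt
layer = waste_assets

[destination]
datasource = PG:host=localhost dbname=iShareData user=postgres
layer = temp.waste_assets
```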
Create a Program Task in Studio to run the script
Create a program task with the following details. Adjust the location of the configuration file.
Run the task and confirm the two datasets have the same number of features
Transfer the data to the 'live' location
Now that we are confident the data has been downloaded and transferred to the Spatial Data Warehouse correctly, we can copy the earlier Spatial Data Transformation task and adjust the destination table so that it is in the correct location for any mapsources wanting to use the data.
Create a mapsource to use the data
Now that the data is loaded into the Spatial Data Warehouse we need to create a mapsource and layer that uses the data. If any classification styling is required, it is recommended to load the data into QGIS, choose categorised styling, and export the style as an SLD file. This can then be imported into iShare. Ensure the field used for categorised styling is exposed as an attribute field.