What is this?

This is basically where I write down stuff I work with in my job as a GIS Technical Analyst (previously system administrator). I do it because it's practical for documentation purposes (although I remove anything that might be a security risk), and I hope it can be of use to someone out there. I frequently search the net for help myself, and this is my way of contributing.

Sunday, December 13, 2020

FME Desktop 2020 on a Mac with an Apple Silicon M1 CPU

I recently purchased the new MacBook Pro with an Apple Silicon M1 chip. The reviews were overwhelmingly positive, and it is indeed a very snappy computer, so naturally I had to download FME Workbench to see how it would perform. Unfortunately, no native ARM version is available yet, so I installed the regular Intel version and ran FME under Rosetta 2. At first it seemed to run fine - the GUI is snappier than on my Intel-based MacBook Pro, and all my existing workflows ran successfully. They did run a little slower than before, though, so I decided to run some benchmarks.

I made a simple but quite heavy workflow which reads 2.7 million points and 1.9 million polygons from FFS files into a PointOnArea transformer. The output is written to an FFS file.

I ran the same test on three different computers:

  • A MacBook Pro 2020 M1 with 16 GB RAM
  • A MacBook Pro mid-2018 Core i5-8259U with 16 GB RAM
  • A Dell Precision 5530 Core i7-8550H with 32 GB RAM, running Windows 10

The first run revealed that the Dell computer performed considerably slower than the Intel MacBook Pro. The M1 MacBook was the slowest of the three and logged several dozen memory optimization messages, plus one warning:

"Failed to free sufficient memory to reach the process usage limit. To improve stability and performance please increase the memory available to FME. Available memory can be increased by adding physical RAM, increasing swap space, or closing other applications."

For the second run I reduced the number of features to 1 million points and 1 million polygons. The result was much the same: the Intel MacBook Pro finished first, the Dell computer second, and the M1 MacBook last, again logging the same memory optimization messages.

For the last run I reduced the number of features to 500,000 points and 500,000 polygons. This time the M1 MacBook logged no memory optimization messages or warnings and finished the job first.

[Table: run times for the three dataset sizes (2.7m points/1.9m polygons, 1m points/1m polygons, 500k points/500k polygons) on each machine: MacBook Pro M1 2020/16 GB RAM, MacBook Pro Intel mid-2018/16 GB RAM, and Dell Precision 5530 Core i7-8550H/32 GB RAM/Windows 10. The timing values were presented as a table in the original post.]
It looks like memory handling is different when FME runs under Rosetta 2 than on an Intel-based Mac. The M1 MacBook is fast, but only as long as it doesn't need to allocate a lot of RAM. I was also surprised to see that my Intel MacBook performed better than the Dell computer, especially since the Dell has twice as much memory and a nominally faster CPU (Core i7-8550H vs. Core i5-8259U).

At the end of the FME log there is a memory report. It will typically say something like:

ProcessID: 6148, peak process memory usage: 17665760 kB, current process memory usage: 12742080 kB
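If you run many tests, that memory report line is easy to parse programmatically. A minimal sketch (the regex and the sample line are based on the log output above; adjust if your FME version words it differently):

```python
import re

# The memory report line FME prints at the end of its log (sample from above)
line = ("ProcessID: 6148, peak process memory usage: 17665760 kB, "
        "current process memory usage: 12742080 kB")

match = re.search(
    r"peak process memory usage: (\d+) kB, "
    r"current process memory usage: (\d+) kB",
    line,
)
peak_kb, current_kb = (int(g) for g in match.groups())

print(f"peak: {peak_kb // 1024} MB, current: {current_kb // 1024} MB")
# prints: peak: 17251 MB, current: 12443 MB
```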

I was surprised to find huge variations in the reported memory usage on the three computers. The numbers were somewhat inconsistent, but the M1 MacBook consistently reported the highest memory usage.

[Table: peak and current process memory usage for the three dataset sizes (2.7m points/1.9m polygons, 1m/1m, 500k/500k) on each machine: MacBook Pro M1, MacBook Pro Intel, and Dell Precision 5530. The values were presented as a table in the original post.]
It would be interesting to hear whether Safe Software has any comments on this, and whether they have done similar tests. I'm certainly looking forward to a universal (Apple Silicon-native) build of FME Desktop.

If anyone else would like to try the same workflow - it is available here.

Saturday, June 2, 2018

Geoserver 2.12.1/Jetty - Installing an SSL certificate from a PFX keystore

I'm using a standard Windows 2016 server with Java JRE 8 update 151 and Geoserver 2.12.1, installed with pretty much default settings. This is a step-by-step description of what I had to do to extract a (wildcard) certificate/key from an IIS-type PFX (PKCS12) keystore into a JKS keystore and install it on the Jetty web server that comes bundled with Geoserver.

NOTE! When I import from the PFX keystore to my new JKS keystore I must always use the same password on both the old and new keystore. Trying to use a different password for the new keystore will just give me an error in {geoserver}/logs/wrapper.log when I start Geoserver:
FAILED org.eclipse.jetty.server.Server@104e906: java.security.UnrecoverableKeyException: Cannot recover key

Step 1:
Export data from my PFX keystore to a new JKS keystore file:
C:\Program Files (x86)\Java\jre1.8.0_151\bin>keytool.exe -importkeystore -srckeystore c:\temp\my_keystore.pfx -srcstoretype pkcs12 -destkeystore c:\temp\keystore -deststoretype JKS
Importing keystore c:\temp\my_keystore.pfx to c:\temp\keystore...
Enter destination keystore password: MyPassword
Re-enter new password: MyPassword
Enter source keystore password: MyPassword
Entry for alias le-f8a123c3-abcd-4bbb-b341-40251cf90a0b successfully imported.
Import command completed:  1 entries successfully imported, 0 entries failed or cancelled

The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore c:\temp\keystore -destkeystore c:\temp\keystore -deststoretype pkcs12".

Step 2:
Verify that your new keystore is working:
C:\Program Files (x86)\Java\jre1.8.0_151\bin>keytool -list -keystore c:\temp\keystore -storepass MyPassword
Keystore type: JKS
Keystore provider: SUN

Your keystore contains 1 entry

le-f8a123c3-abcd-4bbb-b341-40251cf90a0b, 02.jun.2018, PrivateKeyEntry,
Certificate fingerprint (SHA1): A1:BF:53:7F:30:00:11:22:33:44:8D:F4:8A:20:25:FF:6B:D5:89:7C

The JKS keystore uses a proprietary format. It is recommended to migrate to PKCS12 which is an industry standard format using "keytool -importkeystore -srckeystore c:\temp\keystore -destkeystore c:\temp\keystore -deststoretype pkcs12".

Step 3:
Copy c:\temp\keystore to %GEOSERVER_HOME%\etc\keystore (rename the original "keystore" file for backup purposes)

Step 4:
Download the correct (see the note below) Jetty distribution archive (tar.gz or zip) from https://repo1.maven.org/maven2/org/eclipse/jetty/jetty-distribution/.
Unpack ssl.mod from the archive (it's located in the modules folder) and copy it to %GEOSERVER_HOME%\modules. 7-Zip is my preferred tar/gz unpacking tool when I use Windows.
NOTE: Make sure it's the correct version of ssl.mod - if your Geoserver version is different from mine (2.12.1), your Jetty version is probably different too. You can see which Jetty version you have by checking the jetty-servlet files under %GEOSERVER_HOME%\lib.

Step 5 (Optional):
for increased strength cryptography follow this link: http://docs.geoserver.org/latest/en/user/production/java.html#installing-unlimited-strength-jurisdiction-policy-files
(you can download local_policy.jar and US_export_policy.jar that you copy to %JAVA_HOME%\lib\security\policy\unlimited)

Step 6:
Configure Jetty by adding the following lines to %GEOSERVER_HOME%\start.ini:

I added them immediately below the following section:
# --------------------------------------- 
# Module: http
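For reference, the added lines look roughly like this. Treat it as a sketch: the property names here match Jetty 9.2-era ssl/https modules and are an assumption on my part, so verify them against the ssl.mod you copied in Step 4 before relying on them.

```ini
--module=ssl
--module=https

## Keystore path (relative to the Jetty base, i.e. %GEOSERVER_HOME%\etc\keystore)
jetty.keystore=etc/keystore
jetty.keystore.password=MyPassword
jetty.keymanager.password=MyPassword
jetty.truststore=etc/keystore
jetty.truststore.password=MyPassword

## HTTPS port
https.port=8443
```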

NOTE! I've seen some references to editing %GEOSERVER_HOME%\etc\jetty-ssl.xml instead of start.ini. I tried that, but I don't think Jetty read the file at all. I would always get the following errors in wrapper.log:
INFO   | jvm 1    | 2018/05/24 23:50:35 | 2018-05-24 23:50:35.874:WARN:oejuc.AbstractLifeCycle:WrapperSimpleAppMain: FAILED SslContextFactory@4b387439(C:\Program Files (x86)\GeoServer 2.13.0\./etc/keystore,C:\Program Files (x86)\GeoServer 2.13.0\./etc/keystore): java.io.IOException: Keystore was tampered with, or password was incorrect
INFO   | jvm 1    | 2018/05/24 23:50:35 | java.security.UnrecoverableKeyException: Password verification failed

Step 7:
Restart the Geoserver service.

Step 8:
Testing! Now you should be able to access your Geoserver on both these addresses:
http://localhost:8080/geoserver/web/ (the default http port)
https://localhost:8443/geoserver/web/ (probably with a certificate warning due to the mismatching address, as expected - but I'm sure you know how to proceed from here)

If Geoserver just starts and stops after a short while there is an issue with your configuration somewhere. Check %GEOSERVER_HOME%\logs\wrapper.log for details.

Thursday, March 15, 2018

Using SSL Wildcard certificates from a pfx file on FME Server (Tomcat)


Update: I just installed FME Server 2018 today, and the procedure is still more or less the same. To explicitly choose PKCS12 as your keystore format, use:

keytool -importkeystore -srckeystore c:\temp\my_keystore.pfx -srcstoretype pkcs12 -destkeystore c:\temp\tomcat.keystore -deststoretype pkcs12


I recently installed an instance of FME Server (version 2015.1.3.1) on a Windows 2012R2 server.

The default installation of FME Server uses http, so I decided to install the wildcard SSL certificate we use in my organization to improve security somewhat. Most of our servers run IIS, so I only had a .pfx file available.

In Safe's documentation library I found this description of how to configure FME Server for https:

It describes how to use a self-signed certificate or a regular CA-issued certificate from a CSR - neither of which applied to my exact need. So instead I created a new JKS keystore by importing the .pfx keystore with keytool.exe (the documentation says you need the JDK, but I used the JRE binaries that come with the FME Server installation).

C:\apps\FMEServer\Utilities\jre\bin>keytool -importkeystore -srckeystore c:\temp\my_keystore.pfx -srcstoretype pkcs12

Enter destination keystore password: 
Re-enter new password:
Enter source keystore password:
Entry for alias le-1234abcd-abcd-4444-a395-1234567890ab successfully imported.
Import command completed:  1 entries successfully imported, 0 entries failed or cancelled

Next I went through Safe's documentation and altered these files accordingly:

<programdata>\Safe Software\FME Server\localization\publishers\websocket\publisherProperties.xml
<programdata>\Safe Software\FME Server\localization\subscribers\websocket\subscriberProperties.xml

Then I restarted the "FME Server Application Service" but there was no response from https. I checked <tomcatdir>/logs/catalina.<date>.log and found the following useful entries:

mar 19, 2016 6:25:44 AM org.apache.coyote.AbstractProtocol init
SEVERE: Failed to initialize end point associated with ProtocolHandler ["http-bio-443"]
java.io.IOException: Cannot recover key

mar 19, 2016 6:25:44 AM org.apache.catalina.core.StandardService initInternal
SEVERE: Failed to initialize connector [Connector[HTTP/1.1-443]]
org.apache.catalina.LifecycleException: Failed to initialize component [Connector[HTTP/1.1-443]]

I checked around, and it turns out Tomcat could not access my newly imported key because I had set a different password on my JKS keystore than the one the imported keystore had. I had mistakenly assumed that by importing the key from my existing .pfx keystore, it would inherit the password of my new local JKS keystore. Apparently not. Anyway - by deleting my c:\users\<user>\.keystore file, re-running the keytool -importkeystore command, and making sure both the destination keystore password AND the source keystore password matched, I was able to get it running.

PS: After installing the certificate I had to run the post-configuration scripts again manually, because I was no longer able to edit service properties when publishing workspaces to FME Server:
http://docs.safe.com/fme/html/FME_Server_Documentation/Default.htm#AdminGuide/Post_Config_Steps.htm (you will have to manually set the services back to https again after doing this, though).

Thanks to the guys at Safe for their help.

Friday, June 16, 2017

FME Workbench 2016 - Invalid character value for cast specification

I was working on a quite straightforward ESRI SDE (MSSQL) to MSSQL Spatial ETL workspace today when I ran into an issue I couldn't find a solution for anywhere else.

Writing to an MSSQL table with a number of columns using various data types would fail and give me the following error:

Failed to write a feature of type `dbo.MyTable' to the database. Provider error `(-2147217887) Invalid character value for cast specification'. SQL Command `DECLARE @WKB0 varbinary(max); SET @WKB0 = ?; DECLARE @SRID0 INTEGER; SET @SRID0 = ?; INSERT INTO dbo.[MyTable] ([data_id], [localId], [namespace], [version], [originalRef], [program], [project], [projectdesc], [purpose], [assignee_id], [owner_id], [startDate], [endDate], [scale], [res], [method], [prec], [visibility], [methodAlt], [precAlt], [maxDev], [restrictions], [totQuality], [geometry]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, CASE @WKB0 WHEN '' THEN NULL ELSE geometry::STGeomFromWKB(@WKB0,@SRID0) END)'

Not much else in terms of useful information. I checked all the table columns and made sure all "not null" columns had data, and that the data passed to the writer matched the correct data type. The culprit ended up being the localId column, which is a uniqueidentifier. It turns out that even though I can read the uniqueidentifier value from an ESRI SDE, it can't be written back with an MSSQL (spatial) writer without adding curly brackets.

So if your SDE reader gives you: "f69dceae-3bc6-43e2-8279-78b25a23564c" you'll need to alter it to "{f69dceae-3bc6-43e2-8279-78b25a23564c}" before writing.
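A tiny helper does the trick in a preprocessing script (the function name wrap_guid and its behavior for already-wrapped values are my own additions, not part of any FME or Esri API):

```python
def wrap_guid(value):
    """Wrap a bare GUID in curly brackets, as the MSSQL spatial writer expects.

    Values that already start with "{" are returned unchanged.
    """
    value = value.strip()
    if not value.startswith("{"):
        value = "{" + value + "}"
    return value


print(wrap_guid("f69dceae-3bc6-43e2-8279-78b25a23564c"))
# prints: {f69dceae-3bc6-43e2-8279-78b25a23564c}
```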

I suppose it makes perfect sense, because when writing to a uuid field using Python (arcpy) you will also run into problems if you forget the {}. This works in arcpy:

with arcpy.da.InsertCursor(myFeatureClass, ("myField",)) as cursor:
    cursor.insertRow(("{f69dceae-3bc6-43e2-8279-78b25a23564c}",))

This has happened to me before, but next time it happens I'll just check my own blog post ;-)

Monday, December 29, 2014

ArcGIS Server - missing Output Directory

This morning I found that services on one of our ArcGIS 10.1 servers were extremely slow, causing timeouts for users. javaw.exe was consuming 100% CPU across all cores. This does not happen very often - in fact, it must be a year since anything like it last happened. Anyway, I tried restarting the ArcGIS Server service, but CPU consumption immediately went back to 100%. Next I tried restarting individual map services to see if one particular service was causing the problem. Unfortunately I couldn't start the services again once they were stopped. Time to read logs.

Two errors would constantly be logged to the ArcGIS Server-log:

<Msg time="2014-12-12T15:59:18,46" type="SEVERE" code="9503" source="Soap" process="10604" thread="24" methodName="" machine="ARCGIS2" user="" elapsed="">Service 'mymapservice.MapServer' not found. Check if the service exists and is started.</Msg>

<Msg time="2014-12-12T15:59:18,60" type="SEVERE" code="9003" source="Rest" process="10604" thread="18" methodName="" machine="ARCGIS2" user="" elapsed="">Unable to process request. Error handling service request : Could not find a service with the name 'MapServer/mymapservice2' in the configured clusters. Service may be stopped or ArcGIS Server may not be running.</Msg>

There was nothing useful in the ArcGIS/services-logs. Not good.

Next I tried publishing the services again (overwriting the existing ones) - it got as far as "upload service definitions" before failing. I tried deleting the map service and publishing again - it still wouldn't work. Then I had a look at the PublishingTools service itself. It looked all right, but when I restarted it I got an error message saying it could not find the c:\arcgisserver\directories\arcgisoutput2 folder. This didn't make sense, as ArcGIS Server Manager showed the following valid configuration:

Time to go behind the GUI and look at config files. The publishing tools service config is located at: C:\arcgisserver\config-store\services\System\PublishingTools.GPServer\PublishingTools.GPServer.json
.. and there was my problem:
"outputDir": "C:\\arcgisserver\\directories\\arcgisoutput2".
I'm not too keen on altering ArcGIS Server config files by hand, so I went back to the service definition in Manager and simply selected the same directory again:

.. and then "save & restart". That did the trick - and the config file was updated. It turns out all my services were configured to use c:\arcgisserver\directories\arcgisoutput2, despite what ArcGIS Server Manager displayed, so I had to repeat this for every service. I suppose the reason was that I moved the ArcGIS directories around a while ago as part of a cleanup job, and left the arcgisoutput2 folder behind for a while before deleting it, thinking it was no longer in use.

Thursday, March 20, 2014

Tuning GeoServer MSSQL data stores

At work we recently installed some new layers on our GeoServer 2.4.4 which query spatial data from an MSSQL 2008 R2 database. Unfortunately we soon realized we had some serious performance issues. I tried the usual tricks - updating statistics, checking indexes, etc. Microsoft's Database Engine Tuning Advisor eventually showed that things were decent - yet we still experienced poor performance.

I then tried modifying settings for the data store. For MSSQL the default settings are:

max connections: 10
min connections: 1
fetch size: 1000
connection timeout: 20
validate connections: enabled
use native geometry serialization: disabled

I ended up changing the settings to

max connections: 20
min connections: 5
fetch size: 5000
connection timeout: 20
validate connections: disabled
use native geometry serialization: enabled

Performance got a really nice boost. I estimate that things are at least twice as quick now, and this is especially noticeable when the MSSQL server is busy with other tasks (particularly heavy disk/network activity during backups).

The "validate connections" parameter in particular seems to have a noticeable impact. According to the documentation, disabling it increases the risk of client errors, so I suppose I will have to monitor the services for a while to see if they remain stable.

Tuesday, December 31, 2013

Uploading a file to FME Server using Dojo and Internet Explorer

Not being the greatest of Javascript programmers I was struggling with some FME/Javascript issues here for a while. I needed to create a small Javascript application that should do the following:
  1. Upload a file (Excel) to a FME Server.
  2. Run a workspace (job) on the FME server using this file as input.
  3. Wait for the job to finish.
  4. Display status, and if successful: provide a link to download the transformed file from the FME server.
Safe Software has lots of Javascript examples on their "FME Server Playground" pages. The one called "Full Data Upload Example" was a good start, but it had several issues for someone who doesn't do programming often:
  1. The script does not work in Internet Explorer
  2. The script needs to reside ON the FME server to avoid CORS (cross origin resource sharing) issues.
Here's what I did to solve the issues:
  1. IE does not handle the JSON response the same way other browsers do. Rather than pick up the JSON response text and continue the script, IE will display a file download prompt. I had some assistance from a helpful guy at Safe support and we sorted that out. The Javascript ended up quite messy though, so I decided to go for a better solution. The application is meant to display a map using ESRI's ArcGIS Javascript API; in other words, I am already using the Dojo toolkit in the application, so I decided to rewrite the script to use Dojo instead.
  2. Normally I would not mind having the script running on the FME server, but this application/script will be included as part of a .net-based application, and I don't like scattering bits and pieces around - in other words, it must reside on an IIS server. Since many of our applications are map-centric, we simply use ESRI's proxy.ashx page (https://developers.arcgis.com/en/javascript/jshelp/ags_proxy.html) to deal with CORS issues. It's really simple and does what it's supposed to do (plus it can handle tokens/security when you use ArcGIS Server, which is initially why we started using it).
So if you are a GIS/tech person and find yourself in a situation where your .net/ArcGIS Javascript-based application needs to upload a file to an FME server and grab the transformed file - perhaps you can save some time by looking at this script:

The repository/service referenced in the script is running on the FME server at my job. If for some reason it should be unavailable, you can create your own FME workspace that does the same thing. It really is a basic workspace that does the following:
  • Read an Excel spreadsheet (using an XLSX reader)
  • Write a CSV file using dynamic properties from the Excel spreadsheet
When publishing to FME Server, only the "Data Download" service needs to be registered.

<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title>FME Data upload</title>
    <link rel="stylesheet" href="http://js.arcgis.com/3.7/js/esri/css/esri.css">

    <script>dojoConfig = { parseOnLoad: true };</script>
    <script src="//js.arcgis.com/3.7/"></script>

    <script>
        var filePath = "";

        // NOTE: the require list was truncated in the original post - the modules
        // listed here are reconstructed from what the code actually uses.
        require([
            "dojo/dom",
            "dojo/request/xhr",
            "dijit/form/Button",
            "dojox/form/Uploader",
            "dojo/domReady!"
        ], function (dom, xhr, Button, Uploader) {

            // Generate upload form. Dojo will automatically use an iframe for IE, HTML5 for the rest
            var uploadForm = new Uploader({
                name: "file",
                id: "file",
                label: "Upload File",
                multiple: false,
                uploadOnSelect: true,
                onComplete: uploadCallback,
                url: "proxy.ashx?http://fme.miljodirektoratet.no/fmedataupload/kp_repo/xls_to_csv.fmw?opt_extractarchive=true&opt_pathlevel=3&opt_fullpath=true&opt_responseformat=json"
            }, "uploadDiv");

            // Callback: the file has been uploaded - check the result
            function uploadCallback(dataUpload, ioargs, widgetRef) {
                try {
                    if (dataUpload.serviceResponse.statusInfo.status == "success") {
                        // Go through the response for all uploaded files (pointless in this example as
                        // we only allow one file - but useful if you must upload multiple files)
                        for (var i = 0; i < dataUpload.serviceResponse.files.file.length; ++i) {
                            dom.byId("uploadStatusDiv").innerHTML = "File: " + dataUpload.serviceResponse.files.file[i].name + " was uploaded<br/>";
                            dom.byId("jobResultDiv").innerHTML = ""; // clear jobResultDiv in case this is not the first file uploaded in this session
                            filePath = dataUpload.serviceResponse.files.file[i].path;
                            // Check that the file name ends with .xls or .xlsx
                            try {
                                var extensionCheckResult = dataUpload.serviceResponse.files.file[i].name.match(/\.xlsx$|\.xls$/i);
                                if (extensionCheckResult[0].toLowerCase() == ".xls" || extensionCheckResult[0].toLowerCase() == ".xlsx") {
                                    dom.byId("jobResultDiv").innerHTML += "File type ok (" + extensionCheckResult[0] + ")<br/>";
                                    if (/[A-Z]+/.test(extensionCheckResult[0])) {
                                        dom.byId("jobResultDiv").innerHTML += "Warning: FME Server does not like upper case letters in file name extensions.<br/>";
                                    }
                                    jobStartButton.setDisabled(false); // enable the "Run workspace" button (reconstructed step)
                                } else {
                                    dom.byId("jobResultDiv").innerHTML += "File type wrong (" + extensionCheckResult[0] + ")<br/>";
                                }
                            } catch (error) {
                                dom.byId("jobResultDiv").innerHTML += "Problem determining file type (" + error + ")<br/>";
                            }
                        }
                    } else {
                        dom.byId("uploadStatusDiv").innerHTML = "Unable to upload file(s)<br/>";
                    }
                } catch (err) {
                    dom.byId("uploadStatusDiv").innerHTML = "Upload failed (" + err + ")<br/>";
                }
            }

            // Create the button that lets you run the workspace job
            var jobStartButton = new Button({
                label: "Run workspace",
                disabled: true,
                onClick: function () {
                    dom.byId("jobResultDiv").innerHTML += "Workspace is starting, wait...<br/>";
                    xhr("proxy.ashx?http://fme.miljodirektoratet.no/fmedatadownload/kp_repo/xls_to_csv.fmw?SourceDataset_XLSXR=" + filePath + "&opt_responseformat=json", {
                        handleAs: "json",
                        method: "POST"
                    }).then(function (dataRunWS) {
                        dom.byId("jobResultDiv").innerHTML += "Status: " + dataRunWS.serviceResponse.statusInfo.status + "<br/>";
                        dom.byId("jobResultDiv").innerHTML += "Download file at: " + dataRunWS.serviceResponse.url + "<br/>";
                        jobStartButton.setDisabled(true);  // only allow the workspace to be run once
                        uploadForm.setDisabled(false);     // reactivate uploadForm for another file
                    });
                }
            }, "jobStartButtonDiv");
        });
    </script>
</head>
<body class="claro">
    <h1>FME File Upload</h1>
    <div id="uploadDiv"></div>
    <div id="uploadStatusDiv"></div>
    <div id="jobStartButtonDiv"></div>
    <div id="jobResultDiv"></div>
</body>
</html>