S3 storage error

Trying to configure external s3 storage, and when testing via

docker compose exec service node lib/bin/s3.js upload-pending 1

get the following error:

error: function pgrowlocks(unknown) does not exist
    at Parser.parseErrorMessage (/usr/odk/node_modules/pg-protocol/dist/parser.js:287:98)
    at Parser.handlePacket (/usr/odk/node_modules/pg-protocol/dist/parser.js:126:29)
    at Parser.parse (/usr/odk/node_modules/pg-protocol/dist/parser.js:39:38)
    at Socket.<anonymous> (/usr/odk/node_modules/pg-protocol/dist/index.js:11:42)
    at Socket.emit (node:events:519:28)
    at Socket.emit (node:domain:488:12)
    at addChunk (node:internal/streams/readable:559:12)
    at readableAddChunkPushByteMode (node:internal/streams/readable:510:3)
    at Readable.push (node:internal/streams/readable:390:5)
    at TCP.onStreamRead (node:internal/stream_base_commons:191:23) {
  length: 213,
  severity: 'ERROR',
  code: '42883',
  detail: undefined,
  hint: 'No function matches the given name and argument types. You might need to add explicit type casts.',
  position: '162',
  internalPosition: undefined,
  internalQuery: undefined,
  where: undefined,
  schema: undefined,
  table: undefined,
  column: undefined,
  dataType: undefined,
  constraint: undefined,
  file: 'parse_func.c',
  line: '629',
  routine: 'ParseFuncOrColumn',
  notices: []
}node:internal/process/promises:391
    triggerUncaughtException(err, true /* fromPromise */);
    ^

error: function pgrowlocks(unknown) does not exist
    at Parser.parseErrorMessage (/usr/odk/node_modules/pg-protocol/dist/parser.js:287:98)
    at Parser.handlePacket (/usr/odk/node_modules/pg-protocol/dist/parser.js:126:29)
    at Parser.parse (/usr/odk/node_modules/pg-protocol/dist/parser.js:39:38)
    at Socket.<anonymous> (/usr/odk/node_modules/pg-protocol/dist/index.js:11:42)
    at Socket.emit (node:events:519:28)
    at Socket.emit (node:domain:488:12)
    at addChunk (node:internal/streams/readable:559:12)
    at readableAddChunkPushByteMode (node:internal/streams/readable:510:3)
    at Readable.push (node:internal/streams/readable:390:5)
    at TCP.onStreamRead (node:internal/stream_base_commons:191:23) {
  length: 213,
  severity: 'ERROR',
  code: '42883',
  detail: undefined,
  hint: 'No function matches the given name and argument types. You might need to add explicit type casts.',
  position: '162',
  internalPosition: undefined,
  internalQuery: undefined,
  where: undefined,
  schema: undefined,
  table: undefined,
  column: undefined,
  dataType: undefined,
  constraint: undefined,
  file: 'parse_func.c',
  line: '629',
  routine: 'ParseFuncOrColumn',
  notices: []
}

Is it possible you are using a custom/external database server and don't have the pgrowlocks extension in your database?

You can check by running this bit of SQL in your database:

CREATE EXTENSION IF NOT EXISTS pgrowlocks;
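For anyone on an external server wondering how to actually run that: pgrowlocks ships with PostgreSQL's contrib modules (e.g. the postgresql-contrib package on Debian/Ubuntu), and the statement is usually run as a superuser. A minimal sketch, where the connection details are placeholders and not values from this thread:

```shell
# The single statement to run in the database Central points at.
SQL='CREATE EXTENSION IF NOT EXISTS pgrowlocks;'
echo "$SQL"   # printed here for copy-paste
# With the psql client, substituting your own connection details:
# psql -h your-db-host -U your-db-user -d your-db-name -c "$SQL"
```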

We are using another PostgreSQL instance (v16) to pull data in from the Central databases that run in the Docker container, but we're not using it as a custom database server for Central. The pgrowlocks extension was installed in this separate PostgreSQL 16 instance, but the error remains the same.

I have just come across the same issue, except that the transfer is successful: when I check with docker compose exec service node lib/bin/s3.js count-blobs uploaded I get an answer of 1.

My server is not used for anything else, and I have not (knowingly) modified anything from a standard Central installation, following all the installation and upgrade steps (I just upgraded to v2025.1.2-1-g5ea792b). The only exception, of course, was editing .env to add the credentials of my S3_BUCKET: https://docs.getodk.org/central-install-digital-ocean/#using-s3-compatible-storage

I have not modified the database and don't use postgresql for anything else so I've just stepped beyond my competency... (again!).

How do I add the pgrowlocks extension in my case? In words of one syllable, please :slight_smile: I've done some initial searching for a generic answer, but I'm not sure I'm starting in the right place!

Which folder / user account do I need to be in for the server to 'run this bit of SQL in my database'?

I don't seem to have the psql tool available on my server. Should I install that first, or is there another way to administer the database? Sorry, I know these are very basic questions; I didn't expect to have to furkle around in this area! My experience of Central to date has been that it 'just works'... as long as you carefully follow all the destructions.

I am trying to avoid heuristically adding packages etc to my server...

If you are not using a custom/external database, you should not need pgrowlocks. At least that's what the docs say. Give us a few days to investigate...

Thanks, appreciated. That's kind of reassuring (that I haven't done something dumb) and also that you can tell me what the docs say :wink:

I think there is partial success despite the error message...

I checked the Central server audit log this morning and the numbers roughly match my S3 bucket, but I'm not sure I understand which attachments have been transferred.
Central Server Audit Log: {"failed":1,"duration":153903,"uploaded":224}
S3-Bucket: files 225, 121.76 MiB (I manually tested 1 file transfer yesterday, so I'm guessing that might be the failure from Central).
s3.js count-blobs uploaded: 225
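In case it helps anyone read that audit-log line: it is plain JSON, so you can pretty-print it from the shell. A sketch using python3 (assuming it is installed, as it usually is on a stock server; the JSON is the exact line from the audit log above, and "duration" looks like milliseconds, i.e. roughly 2.5 minutes, though that is a guess from the magnitude):

```shell
# Pretty-print the S3 upload audit entry quoted above.
echo '{"failed":1,"duration":153903,"uploaded":224}' | python3 -m json.tool
```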

However, there were 144 submissions yesterday (all with attachments - and I can access these both in Central and QuODK - phew, the data is still available!) and I have a few thousand existing attachments on the server that should be pending.

So I am slightly perplexed by 225 attachments being uploaded - I'd call that unexpected... at least by my reading of how Central is meant to work with S3. I don't know how to decode the .json within the s3-bucket to be able to identify the images and therefore check that the process has actually worked as intended (I recognise that this is probably intentional!).

Based on the Central Docs, I tried docker compose exec service node lib/bin/s3.js count-blobs pending to check the situation, and got the same error response as above, plus the following message:

Error: cannot count blobs by status due to missing PostgreSQL extension: PGROWLOCKS.

To install this extension, execute the following query in your PostgreSQL instance:

CREATE EXTENSION IF NOT EXISTS pgrowlocks;

Whereas count-blobs uploaded and count-blobs failed work as expected (225 and 1, respectively).

Anything helpful I can provide, just ask (including access to the server to investigate).

We have a docs update that will help, but here's what you'll need to do.

  1. Connect to your local/default database. The default password is odk.
    docker exec -it central-postgres14-1 psql -U odk -W odk
    
  2. Run the following SQL, then quit psql.
    CREATE EXTENSION IF NOT EXISTS pgrowlocks;
    \q
    
  3. Test the config to make sure it works.
    docker compose exec service node lib/bin/s3.js count-blobs pending
    

As to the counts, note that every blob gets uploaded. So form definitions, form attachments, submissions, and submission attachments.

Hi @yanokwa

Excellent - thank you for your 'monosyllabic' words of wisdom! Even I could follow them.

Here is the outcome, for the record:

odk=# CREATE EXTENSION IF NOT EXISTS CITEXT;
NOTICE:  extension "citext" already exists, skipping
CREATE EXTENSION
odk=# CREATE EXTENSION IF NOT EXISTS pg_trgm;
NOTICE:  extension "pg_trgm" already exists, skipping
CREATE EXTENSION
odk=# CREATE EXTENSION IF NOT EXISTS pgrowlocks;
CREATE EXTENSION
odk=#

Now, count-blobs pending works without error and shows zero pending...

I checked the server before doing this, and today it showed 17,000+ items uploaded (i.e. it seemed to have worked anyway), and the S3 bucket is showing c. 11 GB. So maybe yesterday's 225 items was just a warm-up lap...

All set, and my free disk-space worries can recede ready for the next data collection exercises, rather than trying to fix something 'on-the-fly'...

As always, greatly appreciate your help and speed of response!
