by radimm on 6/17/24, 7:05 AM with 17 comments
by mslot on 6/17/24, 11:44 AM
I wrote pg_cron with the intention of keeping it as simple, reliable, and low maintenance as possible, so it's likely to remain that way.
It is possible to implement a more advanced job scheduler on top of pg_cron if needed. For instance, you can set up a few parallel jobs that run every N seconds and each take an item from a job queue table.
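A minimal sketch of that pattern, assuming a hypothetical `job_queue` table and a `process_one_job()` function (names are illustrative, not from pg_cron itself). `FOR UPDATE SKIP LOCKED` lets parallel workers claim different rows without blocking each other; second-granularity schedules like `'10 seconds'` require a recent pg_cron version.

```sql
-- Hypothetical queue table; columns are illustrative.
CREATE TABLE job_queue (
    id      bigserial PRIMARY KEY,
    payload jsonb NOT NULL
);

-- Claim and process one item; SKIP LOCKED avoids contention
-- between parallel workers polling the same table.
CREATE OR REPLACE FUNCTION process_one_job() RETURNS void AS $$
DECLARE
    job job_queue%ROWTYPE;
BEGIN
    SELECT * INTO job
    FROM job_queue
    ORDER BY id
    LIMIT 1
    FOR UPDATE SKIP LOCKED;

    IF FOUND THEN
        -- ... do the actual work with job.payload here ...
        DELETE FROM job_queue WHERE id = job.id;
    END IF;
END;
$$ LANGUAGE plpgsql;

-- A few parallel pg_cron jobs polling every 10 seconds.
SELECT cron.schedule('worker-1', '10 seconds', 'SELECT process_one_job()');
SELECT cron.schedule('worker-2', '10 seconds', 'SELECT process_one_job()');
```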
by cuu508 on 6/17/24, 10:08 AM
copy (select 'hello world') to program 'curl -m 10 --data-binary @- https://some-url-here';
There's also a Postgres extension for making HTTP requests, but this seems to work out of the box (as long as curl is installed). The "-m 10" parameter is a 10-second timeout, to reduce the risk of the command hanging. I did not test what happens if curl returns a non-zero exit code; that would also need to be tested and handled.
One could use this to monitor pg_cron tasks with external cron monitoring services. I'm not sure if this would be an overall good idea, but one could :-)
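Putting the two ideas together, a sketch of scheduling that ping itself via pg_cron (the job name and URL are placeholders). Note that `COPY ... TO PROGRAM` requires superuser privileges or membership in the `pg_execute_server_program` role, which is a real operational caveat for this approach.

```sql
-- Hypothetical monitoring ping, run every minute by pg_cron.
-- -m 10 caps curl at a 10-second timeout so the job can't hang indefinitely.
SELECT cron.schedule(
    'ping-monitor',
    '* * * * *',
    $$COPY (SELECT 'hello world') TO PROGRAM
        'curl -m 10 --data-binary @- https://some-url-here'$$
);
```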
by lucianbr on 6/17/24, 10:07 AM
by 9dev on 6/17/24, 1:03 PM
I have entries with an expiration date and need to regularly purge all expired rows (think access tokens, WebAuthn challenges, etc.). The service creating those rows is deployed serverless, so it's only invoked on incoming requests. The only viable options I know of are: a) hold a lottery and run the delete query as part of normal request handling with p=0.01; b) run a secondary scheduler system that performs housekeeping tasks; or c) use pg_cron to do it in the database.
Are there any other solutions to this? Scheduling jobs on the database system works, but I’m always wondering how others solve this.
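For reference, option c) is a one-liner with pg_cron; the table and column names below are illustrative, not from the original post. Recent pg_cron versions also record each run in `cron.job_run_details`, which helps verify the housekeeping actually happened.

```sql
-- Hypothetical housekeeping job: delete expired tokens every 15 minutes.
SELECT cron.schedule(
    'purge-expired-tokens',
    '*/15 * * * *',
    $$DELETE FROM access_tokens WHERE expires_at < now()$$
);
```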