I need to store some big binary data in a PostgreSQL database, and I also need an audit log for that database. But I do not need, and do not want, the binary data in the audit logs.
I am accessing the database from Python via SQLAlchemy.
Is there a canonical way to write a SQLAlchemy insert so that the binary data does not end up in the audit logs?
session.add(<dataobject>)
obviously does not achieve that: the binary data appears, in a textual representation, inside the logged query text.
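For context, a minimal sketch of the kind of mapping and insert I am doing (the table, column and connection names are made up):

from sqlalchemy import Column, Integer, LargeBinary, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class Blob(Base):
    __tablename__ = "blobs"              # illustrative table name
    id = Column(Integer, primary_key=True)
    payload = Column(LargeBinary)        # the big binary data

engine = create_engine("postgresql+psycopg2://user:pass@localhost/mydb")
Session = sessionmaker(bind=engine)
session = Session()

# psycopg2 binds parameters on the client side, so the INSERT that reaches
# the server (and pgaudit) carries the payload as a literal in the query text.
session.add(Blob(payload=b"\x00" * 1024))
session.commit()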
After a bit of experimentation I found that prepared statements keep the data out of the log. The audit log lines for
PREPARE func1(text) AS
SELECT name, setting FROM pg_settings WHERE name LIKE $1;
EXECUTE func1('pgaudit%');
are:
2022-06-10 11:20:49.222 UTC [519] LOG: AUDIT: SESSION,9,1,READ,PREPARE,,,"PREPARE func1(text) AS
SELECT name, setting FROM pg_settings WHERE name LIKE $1;",<not logged>
2022-06-10 11:20:55.942 UTC [519] LOG: AUDIT: SESSION,10,1,READ,SELECT,,,"PREPARE func1(text) AS
SELECT name, setting FROM pg_settings WHERE name LIKE $1;",<not logged>
This would fulfil my requirements. So: how can I get SQLAlchemy to perform its inserts as prepared statements, passing the data as statement parameters?
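For reference, the closest I have come by hand is to grab the raw DBAPI connection and issue the PREPARE/EXECUTE myself, mirroring the experiment above (a sketch; the statement name ins_blob and the blobs table are made up). I would much rather have SQLAlchemy emit this for its ORM inserts:

import psycopg2

raw_conn = engine.raw_connection()
try:
    cur = raw_conn.cursor()
    cur.execute("PREPARE ins_blob(bytea) AS INSERT INTO blobs (payload) VALUES ($1)")
    # As in the experiment above, pgaudit logs the PREPARE text for the
    # EXECUTE, so the binary value itself stays out of the audit log.
    cur.execute("EXECUTE ins_blob(%s)", (psycopg2.Binary(b"\x00" * 1024),))
    cur.execute("DEALLOCATE ins_blob")   # pooled connections get reused
    raw_conn.commit()
    cur.close()
finally:
    raw_conn.close()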
I am using SQLAlchemy 1.3.17, PostgreSQL 12.11 and pgaudit 1.4.3.