[opendmarc-users] History File vs Database

Andreas Schulze sca at andreasschulze.de
Fri Aug 17 13:40:51 PDT 2012


> We intend to use sqlite to store daily data, then timeout will not be a
> problem.
Hi,

We had the same situation with the reputation extension of opendkim.
opendkim uses opendbx as an abstraction layer for sqlite, mysql and postgres (and maybe others).
*Writing* data into any of these databases works without problems.

But there is more: later you may want to *query* the database from a web application to calculate
and present correlated information. In opendkim that part is done by Perl code,
and that code does not use opendbx; it speaks native mysql syntax.
We (Murray and I) failed to port that code to sqlite syntax, so I decided to switch to mysql.
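To illustrate the porting problem (a minimal sketch, with a hypothetical table, not the actual opendkim schema): an upsert statement written in MySQL's dialect is simply a syntax error for sqlite, so such queries cannot be moved over unchanged.

```python
import sqlite3

# Hypothetical table; the point is only the dialect difference.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE hits (domain TEXT PRIMARY KEY, n INTEGER)")

# MySQL-only upsert syntax (ON DUPLICATE KEY UPDATE).
mysql_upsert = ("INSERT INTO hits (domain, n) VALUES ('example.com', 1) "
                "ON DUPLICATE KEY UPDATE n = n + 1")
try:
    con.execute(mysql_upsert)
    portable = True
except sqlite3.OperationalError:
    # sqlite's parser rejects the MySQL-specific clause.
    portable = False

print("statement portable to sqlite:", portable)
```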

The same will occur with opendmarc. The current Perl code queries the database (with mysql syntax)
and creates reports. And one may think of other (statistics?) queries later...

Therefore I would prefer the current design: opend[kim|marc] writes to flat files, and cron-driven imports
fill an independent database. That way I implemented multiple 'reporters' feeding one database.
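A cron-driven importer can be sketched roughly like this (assuming, for illustration only, a simple "domain TAB result" flat-file format; the real opendmarc history file format is richer):

```python
import sqlite3

def import_history(lines, con):
    """Load flat-file history lines into a database table.

    Hypothetical format: one 'domain<TAB>result' record per line.
    """
    con.execute("CREATE TABLE IF NOT EXISTS results (domain TEXT, result TEXT)")
    for line in lines:
        domain, result = line.rstrip("\n").split("\t")
        con.execute("INSERT INTO results (domain, result) VALUES (?, ?)",
                    (domain, result))
    con.commit()

# A cron job would open the day's history file and a real database here;
# an in-memory sqlite database stands in for both.
con = sqlite3.connect(":memory:")
import_history(["example.com\tpass\n", "example.org\tfail\n"], con)
count = con.execute("SELECT COUNT(*) FROM results").fetchone()[0]
print("imported records:", count)
```

Because the importer is a separate process, several reporters can write their own flat files and the same script can merge them into one central database.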

Sure, it would also be possible for multiple opend[kim|marc] processes to write to a remote database directly,
but then you must keep an eye on encrypting the database connection.
With the current design one can simply use https or ssh tunnels and some scripts to secure the remote database writes.

Andreas



