#3 Tracker Ticket: Enhance data-providers to store parsed data in database instead of memory
Closed: Invalid 3 years ago by jskladan. Opened 5 years ago by jskladan.

Instead of running the data providers' get_actions on cache misses, make the providers store the data in a database, so the main process can always just load the data (the DB-loaded data can, of course, still be cached).
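
To make that concrete, here is a very rough sketch of the read path — the CachedProviderData model, its columns, and the dict-based cache are purely illustrative names, not existing oraculum code:

```python
import json
import time

from sqlalchemy import Column, DateTime, Integer, String, Text
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class CachedProviderData(Base):
    """Hypothetical table holding the parsed output of one data provider."""
    __tablename__ = "cached_provider_data"

    id = Column(Integer, primary_key=True)
    provider = Column(String(128), unique=True, nullable=False)  # e.g. "bugzilla"
    data = Column(Text, nullable=False)                          # JSON-serialized actions
    updated_at = Column(DateTime, nullable=False)


# Simple in-process cache on top of the DB; the DB stays the source of truth.
_CACHE = {}
_CACHE_TTL = 300  # seconds


def load_actions(session, provider_name):
    """Load the parsed actions for one provider, preferring the in-process cache."""
    hit = _CACHE.get(provider_name)
    if hit and time.time() - hit[0] < _CACHE_TTL:
        return hit[1]
    row = (
        session.query(CachedProviderData)
        .filter_by(provider=provider_name)
        .one_or_none()
    )
    data = json.loads(row.data) if row else None
    _CACHE[provider_name] = (time.time(), data)
    return data
```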

These are the general "steps" I can easily see you taking. Please make sure to create a PR (or a ticket to discuss the details) for each one instead of having one huge change PR :)

  1. come up with a suitable DB structure based on how the data/actions are used in the code
    • make use of SQLAlchemy and Alembic to define it in the app (the oraculum/models/ directory and alembic revision --autogenerate are your friends here); also make sure to enable (and properly change) the code in cli.py (a migration sketch follows below the list)
  2. change the data providers so that calling the get_actions method refreshes the data in the database (see the write-path sketch below the list)
  3. add a command-line option to run the data providers' 'update' separately (it might make more sense to be able to specify which specific ones should run) - this will allow running the update regularly from e.g. cron (see the CLI sketch below the list)
  4. investigate messaging systems (0mq is IMO a good choice, as it is 'serverless'), so the main oraculum process can be notified of changes and reload the relevant part of the cache, instead of relying on refreshing the data from the DB from time to time (see the 0mq sketch below the list)
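
For step 1, assuming a model along the lines of the CachedProviderData sketch above sits in oraculum/models/, alembic revision --autogenerate would spit out a migration roughly like the following (revision ids, table and column names are all placeholders):

```python
"""Hypothetical autogenerated Alembic revision for the provider-data table."""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic (placeholder values)
revision = "0001"
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    op.create_table(
        "cached_provider_data",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("provider", sa.String(length=128), nullable=False),
        sa.Column("data", sa.Text(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("provider"),
    )


def downgrade():
    op.drop_table("cached_provider_data")
```

The db-related code in cli.py would then need to be enabled (and adjusted) so deployments actually run the migration.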
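
For step 2, the write path inside the providers could look roughly like this — refresh_provider, the model's module path, and the JSON serialization are assumptions for the sketch, not existing code:

```python
import json
from datetime import datetime

# Hypothetical module path for the model sketched above (step 1).
from oraculum.models.provider_data import CachedProviderData


def refresh_provider(session, provider_name, get_actions):
    """Run a provider's get_actions and store the parsed result in the DB."""
    actions = get_actions()  # the provider still does the actual fetching/parsing
    row = (
        session.query(CachedProviderData)
        .filter_by(provider=provider_name)
        .one_or_none()
    )
    if row is None:
        row = CachedProviderData(provider=provider_name)
        session.add(row)
    row.data = json.dumps(actions)
    row.updated_at = datetime.utcnow()
    session.commit()
```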
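
For step 3, a small click command would do; the command name, the --provider option, and the stand-in provider registry below are made up:

```python
import sys

import click


@click.command("refresh-providers")
@click.option("--provider", "providers", multiple=True,
              help="Refresh only the named provider(s); default is all of them.")
def refresh_providers(providers):
    """Hypothetical command for running the providers' update from cron."""
    # Stand-in registry so the sketch is self-contained; in oraculum this would
    # map provider names to their real refresh callables.
    PROVIDERS = {"bugzilla": lambda: None, "bodhi": lambda: None}
    selected = providers or list(PROVIDERS)
    for name in selected:
        if name not in PROVIDERS:
            click.echo(f"Unknown provider: {name}", err=True)
            sys.exit(1)
        click.echo(f"Refreshing {name} ...")
        PROVIDERS[name]()


if __name__ == "__main__":
    refresh_providers()
```

Cron would then just invoke that command (optionally with --provider for specific providers) at whatever interval makes sense.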
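
For step 4, the simplest 0mq shape is probably a PUSH/PULL pair (PUB/SUB works too if more than one process needs the notification): the updater sends the name of the provider it just refreshed, and the main oraculum process drops only that part of its cache. A rough pyzmq sketch with a made-up endpoint and message format:

```python
import zmq

ADDR = "tcp://127.0.0.1:5599"  # arbitrary local endpoint, just for the sketch


def notify_refreshed(provider_name):
    """Called by the updater right after it commits fresh data for a provider."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PUSH)
    sock.connect(ADDR)               # the main process binds, updaters connect
    sock.send_string(provider_name)  # queued by zmq and delivered once connected
    sock.close()


def listen_for_refreshes(invalidate):
    """Runs in the main oraculum process (e.g. a background thread).

    `invalidate` is whatever callable drops one provider's entry from the cache.
    """
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PULL)
    sock.bind(ADDR)
    while True:
        invalidate(sock.recv_string())
```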

Metadata Update from @jskladan (3 years ago):
- Issue close_status updated to: Invalid
- Issue status updated to: Closed (was: Open)
