#604 [frontend] fix exception when multiple sources are generating graph data
Merged 5 years ago by msuchy. Opened 5 years ago by dturecek.
copr/ dturecek/copr graphs  into  master

@@ -3,4 +3,4 @@ 
  runuser -c '/usr/share/copr/coprs_frontend/run/check_for_anitya_version_updates.py --backend pypi --delta=172800 &> /dev/null' - copr-fe
  runuser -c '/usr/share/copr/coprs_frontend/run/check_for_anitya_version_updates.py --backend rubygems --delta=172800 &> /dev/null' - copr-fe
  runuser -c '/usr/share/copr/coprs_frontend/manage.py update_indexes_quick 120 &> /dev/null' - copr-fe
- runuser -c '/usr/share/copr/coprs_frontend/manage.py update_graphs' - copr-fe
+ runuser -c '/usr/share/copr/coprs_frontend/manage.py update_graphs &> /dev/null' - copr-fe

@@ -13,6 +13,7 @@ 
  from sqlalchemy.sql import false,true
  from werkzeug.utils import secure_filename
  from sqlalchemy import bindparam, Integer, String
+ from sqlalchemy.exc import IntegrityError
 
  from copr_common.enums import FailTypeEnum, StatusEnum
  from coprs import app
@@ -217,14 +218,17 @@ 
          if result:
              return
 
-         cached_data = models.BuildsStatistics(
-             time = time,
-             stat_type = type,
-             running = running,
-             pending = pending
-         )
-         db.session.merge(cached_data)
-         db.session.commit()
+         try:
+             cached_data = models.BuildsStatistics(
+                 time = time,
+                 stat_type = type,
+                 running = running,
+                 pending = pending
+             )
+             db.session.add(cached_data)
+             db.session.commit()
+         except IntegrityError: # other process already calculated the graph data and cached it
+             db.session.rollback()
 
      @classmethod
      def get_graph_parameters(cls, type):

Fix for Issue #566, where an exception can be raised if more than one source generates the graph data and tries to save it in the db at the same time.

I'd probably need an explanation to give +1 here (missing background).

Since the graph data generation takes quite a long time, the results are stored in the db so that it's not necessary to generate it multiple times.

Each time someone views the graph page (or when the hourly cron for updating the data runs), any data that has not yet been created is calculated. However, if multiple sources trigger this at the same time, the new data might already be in the database by the time one of them commits, and that causes the exception.
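The race this PR guards against can be sketched with stdlib sqlite3. The table name and columns below just mirror the BuildsStatistics model; this is an illustrative stand-in, not Copr code:

```python
import sqlite3

# (time, stat_type) is the unique key, mirroring the BuildsStatistics cache table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE builds_statistics ("
             "time INTEGER, stat_type TEXT, running INTEGER, pending INTEGER, "
             "PRIMARY KEY (time, stat_type))")

def cache_stats(conn, time, stat_type, running, pending):
    """Insert a cached data point; tolerate a concurrent writer having won the race."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.execute("INSERT INTO builds_statistics VALUES (?, ?, ?, ?)",
                         (time, stat_type, running, pending))
    except sqlite3.IntegrityError:
        pass  # another process already calculated and cached this data point

cache_stats(conn, 1600000000, "chroot", 5, 2)  # first writer wins
cache_stats(conn, 1600000000, "chroot", 7, 1)  # second writer: no exception leaks out
rows = conn.execute("SELECT running, pending FROM builds_statistics").fetchall()
```

Whichever writer commits first owns the cached row; the loser simply rolls back instead of crashing the request.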

rebased onto 5eebecc63ebb5ef6f022003397fff55ccdc49de8

5 years ago

I've changed session.merge to session.add. I've also changed the cronjob so that it matches what is installed from ansible.
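For readers unfamiliar with the distinction: `session.merge()` reconciles an object with any existing row (emitting a SELECT first), while `session.add()` issues a plain INSERT, so a duplicate key surfaces as an `IntegrityError` the caller can catch. A rough sqlite3 analogy (illustration only, not Copr code):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (k INTEGER PRIMARY KEY, v TEXT)")
conn.execute("INSERT INTO t VALUES (1, 'old')")

# session.merge() reconciles with the existing row -- roughly an upsert in
# effect (sqlite spells that INSERT OR REPLACE), so no error on a duplicate key
conn.execute("INSERT OR REPLACE INTO t VALUES (1, 'new')")

# session.add() is a plain INSERT: a duplicate key raises IntegrityError
duplicate_rejected = False
try:
    conn.execute("INSERT INTO t VALUES (1, 'other')")
except sqlite3.IntegrityError:
    duplicate_rejected = True
```

With `add()`, the conflict is explicit and can be handled with a targeted `rollback()`, which is exactly what the diff above does.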

I would still go with

db.session.add(...)
db.session.commit()

instead of db_session_scope() to avoid so many nested blocks, but +1

rebased onto 4db947a391b28f21bedf8e61b56f0b6422523daa

5 years ago

Alright, I removed the nested block.

@frostyx wrote

to avoid so many nested blocks

Just an idea -- db_session_scope() could accept err_ignore=[IntegrityError], e.g. (not a material for this PR, though).
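The `err_ignore` idea could look roughly like the sketch below. `FakeSession` and the local `IntegrityError` are hypothetical stand-ins; this is not the actual `db_session_scope()` from the Copr codebase:

```python
from contextlib import contextmanager

class IntegrityError(Exception):
    """Stand-in for sqlalchemy.exc.IntegrityError (hypothetical)."""

class FakeSession:
    """Minimal stand-in for a SQLAlchemy session (hypothetical)."""
    def __init__(self):
        self.committed = False
        self.rolled_back = False
    def commit(self):
        self.committed = True
    def rollback(self):
        self.rolled_back = True

@contextmanager
def db_session_scope(session, err_ignore=()):
    """Commit on success; on error, roll back and re-raise unless the
    exception type is listed in err_ignore."""
    try:
        yield session
        session.commit()
    except Exception as err:
        session.rollback()
        if not isinstance(err, tuple(err_ignore)):
            raise

session = FakeSession()
with db_session_scope(session, err_ignore=[IntegrityError]):
    raise IntegrityError("row already cached by another process")
# The error was swallowed: the session rolled back and nothing was committed.
```

That would let callers keep a single `with` block while still tolerating the benign duplicate-insert race.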

@dturecek wrote

Alright, I removed the nested block.

Thank you. +1

rebased onto 8683c42

5 years ago

Pull-Request has been merged by msuchy

5 years ago