#52 packager_alias: Allow for sporadic failures in retrieving info from dist-git
Merged 3 years ago by kevin. Opened 3 years ago by pingou.
fedora-infra/ pingou/ansible master  into  master

@@ -6,6 +6,8 @@ 
  Its goal is to generate all the <pkg>-owner email aliases we provide
  """
  
+ import time
+ 
  import requests
  
  from requests.adapters import HTTPAdapter
@@ -36,8 +38,19 @@ 
      pagure_projects_url = pagure_url + '/api/0/projects?page=1&per_page=100&fork=false'
      session = retry_session()
      while pagure_projects_url:
-         response = session.get(pagure_projects_url)
-         data = response.json()
+         cnt = 0
+         while True:
+             try:
+                 response = session.get(pagure_projects_url)
+                 data = response.json()
+                 break
+             except Exception:
+                 if cnt == 4:
+                     raise
+ 
+                 cnt += 1
+                 time.sleep(30)
+ 
          for project in data['projects']:
              yield project
          # This is set to None on the last page.

Basically, if we fail to retrieve data from Pagure or fail to convert the
response to JSON, wait 30 seconds and retry.
If after two minutes (four retries, five attempts in total) it still hasn't worked, bail.
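The retry-and-bail pattern from the patch can be sketched in isolation. The names below (`fetch_with_retries`, `attempts`, `delay`) are illustrative, not taken from the script:

```python
import time


def fetch_with_retries(fetch, attempts=5, delay=30):
    """Call fetch() until it succeeds.

    Sleep `delay` seconds between failures, and re-raise the last
    exception once `attempts` calls have all failed.
    """
    cnt = 0
    while True:
        try:
            return fetch()
        except Exception:
            # Last allowed attempt just failed: give up and propagate.
            if cnt == attempts - 1:
                raise
            cnt += 1
            time.sleep(delay)
```

In the script's terms the call site would look roughly like `data = fetch_with_retries(lambda: session.get(pagure_projects_url).json())` — the key point being that both the HTTP call and the JSON decoding happen inside the retried callable.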

Fixes https://pagure.io/fedora-infrastructure/issue/7603

Signed-off-by: Pierre-Yves Chibon <pingou@pingoured.fr>

rebased onto 0db6035 (3 years ago)

Pull-Request has been merged by kevin (3 years ago)

Did you consider https://dev.to/ssbozy/python-requests-with-retries-4p03 ? I think it is cleaner to rely on requests.

Ha actually this is already in the script, curious to know why it does not work then :)

It's not the HTTP request that is failing but the conversion of the data to JSON; it could be that it got an incomplete dataset or something odd like that, but it is weird for sure.
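That distinction explains why the script's existing transport-level retries (an `HTTPAdapter` mounted with urllib3's `Retry`, as in the linked article) did not help: they only re-issue requests that fail at the connection or HTTP-status level. A response that arrives with status 200 but a truncated body still reaches `.json()` and raises there. A minimal, stdlib-only illustration of that failure mode (the payload is a made-up example):

```python
import json

# A transport-level retry adapter never sees this case: the request
# "succeeded" (200 OK), but the body is not complete JSON, so decoding
# raises json.JSONDecodeError (a subclass of ValueError).
truncated_body = '{"projects": [{"name": "foo"'

try:
    json.loads(truncated_body)
except ValueError as exc:
    print("decoding failed:", exc)
```

This is exactly the class of error the outer `try`/`except` loop in the patch catches, which the `requests`-level retry configuration cannot.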
