#1363 Use createrepo_update even for first repo run
Merged 3 years ago by mikem. Opened 3 years ago by tkopecek.
tkopecek/koji issue1354  into  master

file modified
+19 -4
@@ -2135,7 +2135,7 @@
                del todo[package]

                results = self.wait(to_list(running.keys()))
-            except (six.moves.xmlrpc_client.Fault, koji.GenericError) as e:
+            except (six.moves.xmlrpc_client.Fault, koji.GenericError):

Does this work for Python 3? I thought we needed the as e for that?

                # One task has failed, wait for the rest to complete before the
                # chainmaven task fails.  self.wait(all=True) should thrown an exception.

@@ -5026,9 +5026,21 @@
        #only shadowbuild tags should start with SHADOWBUILD, their repos are auto
        #expired.  so lets get the most recent expired tag for newRepo shadowbuild tasks.
        if tinfo['name'].startswith('SHADOWBUILD'):
-            oldrepo = self.session.getRepo(tinfo['id'], state=koji.REPO_EXPIRED)
+            oldrepo_state = koji.REPO_EXPIRED
        else:
-            oldrepo = self.session.getRepo(tinfo['id'], state=koji.REPO_READY)
+            oldrepo_state = koji.REPO_READY
+        oldrepo = self.session.getRepo(tinfo['id'], state=oldrepo_state)
+        # If there is no old repo, try to find the first usable repo in the
+        # inheritance chain and use it as a source. oldrepo is not used if
+        # createrepo_update is not set, so don't waste a call in that case.
+        if not oldrepo and self.options.createrepo_update:
+            tags = self.session.getFullInheritance(tinfo['id'])
+            # we care about the best candidate, which should (though not
+            # necessarily) be at a higher level. Sort tags by depth.
+            for tag in sorted(tags, key=lambda x: x['currdepth']):
+                oldrepo = self.session.getRepo(tag['parent_id'], state=oldrepo_state)
+                if oldrepo:
+                    break
        subtasks = {}
        for arch in arches:
            arglist = [repo_id, arch, oldrepo]
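The fallback above can be read in isolation as the following sketch. `session` stands in for the koji hub client and is assumed to expose `getRepo()` and `getFullInheritance()` with the shapes used in the patch; the rest is illustrative:

```python
def find_old_repo(session, tag_id, state, createrepo_update=True):
    """Sketch of the PR's fallback: if the tag itself has no usable
    repo, walk the inheritance chain (shallowest ancestor first) and
    reuse the first repo found there.  Skips the extra hub calls when
    createrepo_update is off, since the result would be unused."""
    oldrepo = session.getRepo(tag_id, state=state)
    if not oldrepo and createrepo_update:
        # getFullInheritance() returns one dict per ancestor tag;
        # 'currdepth' is that ancestor's distance from the starting tag,
        # so sorting by it prefers the closest ancestor's repo
        for tag in sorted(session.getFullInheritance(tag_id),
                          key=lambda t: t['currdepth']):
            oldrepo = session.getRepo(tag['parent_id'], state=state)
            if oldrepo:
                break
    return oldrepo
```

Sorting by `currdepth` means a repo on a direct parent wins over one further up the chain, which is the closest match to what the tag's own repo would have contained.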
@@ -5106,7 +5118,10 @@
            cmd.extend(['-g', groupdata])
        #attempt to recycle repodata from last repo
        if pkglist and oldrepo and self.options.createrepo_update:
-            oldpath = self.pathinfo.repo(oldrepo['id'], rinfo['tag_name'])
+            # the old repo could be from an inherited tag, so the path needs
+            # to be composed from that tag, not rinfo['tag_name']
+            oldrepo = self.session.repoInfo(oldrepo['id'], strict=True)
+            oldpath = self.pathinfo.repo(oldrepo['id'], oldrepo['tag_name'])
            olddatadir = '%s/%s/repodata' % (oldpath, arch)
            if not os.path.isdir(olddatadir):
                self.logger.warn("old repodata is missing: %s" % olddatadir)
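Why the extra `repoInfo()` call matters: repo paths are keyed by the tag the repo belongs to, and after the fallback that tag may be an ancestor, not the tag being regenerated. A minimal sketch, assuming a `pathinfo.repo(repo_id, tag_name)` layout like koji's `PathInfo` (the fake layout in the example is illustrative only):

```python
def old_repodata_dir(pathinfo, repo_info, arch):
    """repo_info is the dict returned by repoInfo() for the old repo;
    its 'tag_name' is the tag the repo actually belongs to, which may
    be an ancestor of the tag being regenerated.  Composing the path
    from the current tag's name would point at a non-existent dir."""
    oldpath = pathinfo.repo(repo_info['id'], repo_info['tag_name'])
    return '%s/%s/repodata' % (oldpath, arch)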

createrepo_update currently reuses old repodata only from the same tag. For the first newRepo run there is no old data, but there is a high chance that the tag inherits something. This inherited repo can also be used, for a significant speedup.

Fixes: https://pagure.io/koji/issue/1354

> Does this work for Python 3? I thought we needed the as e for that?

This one doesn't make a difference. I removed it because pyflakes complains "local variable 'e' is assigned to but never used". I can drop it from this PR.
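To the reviewer's question: `as e` is optional in Python 3 as well; the clause only binds the exception object for use inside the handler, and Python 3 even deletes that name when the handler exits. A minimal sketch, using stand-in exception types rather than the XML-RPC faults from the patch:

```python
# "except SomeError:" without "as e" is valid in both Python 2 and 3.
# The "as e" form is only needed when the handler reads the exception
# object -- otherwise pyflakes flags the unused name, as noted above.
def swallow(callable_):
    try:
        callable_()
    except (ValueError, KeyError):
        return 'handled'
    return 'no error'
```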

Seems to be working here

Commit bfbdf31 fixes this pull-request

Pull-Request has been merged by mikem 3 years ago