#7017 Set up koji policy/channels for known archful noarch packages.
Closed: It's all good (5 years ago). Opened 6 years ago by ralph.

Reported by @sgallagh in a discussion with @ralph and @mikeb.

There are these funny things called 'archful noarch' packages: packages which produce noarch binary RPMs, but which can only be built on specific architectures (typically only x86_64).

In the traditional world, when you submit a build of one of these packages to koji, your build is sent to a builder with a random architecture. Usually this is wrong, and your build fails. You then resubmit again and again until it works. This is colloquially called "winning the builder lottery." It is annoying, but people put up with it.

In the modular world, this poses a real problem. The MBS won't know why the build failed and we can't expect it to try over and over again until it wins the builder lottery. We need a better solution.

The solution we came up with on a whiteboard (a few months ago) was that we can set up a channel in koji called x86_64-builders (or something like that). Then, start maintaining a list of all known "archful noarch" packages. This could start with one or two packages and then grow over time.

We would then create a new koji hub policy that says something like:

"Whenever a build is submitted of a package that matches any of the packages in the curated list, submit the build to the x86_64-builders channel."
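For illustration, such a hub policy might look like the sketch below. This is only a sketch: the channel name `x86_64-builders` and the package names are placeholders, and the exact test and action names should be verified against the koji hub policy documentation.

```ini
# /etc/koji-hub/hub.conf (sketch, assuming koji's channel policy support)
[policy]
channel =
    # Route known "archful noarch" packages to x86_64-only builders.
    package ReviewBoard some-other-pkg :: use x86_64-builders
    # Everything else follows the normal routing.
    all :: use default
```

Builders would then be assigned to the channel on the hub, e.g. with `koji add-host-to-channel <builder> x86_64-builders`.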

What do you think? Will it work?


That is a packaging bug. What you need to do in such cases is make the package archful and have it create a noarch subpackage, so that you will always be sure to get things right. There are also "archful noarch" packages that only build on other arches; system firmware for virtual machines, for instance. It is not that hard to get the packaging right so that things will always hit the right builders.

I thought this was solved with:

https://pagure.io/koji/issue/19 / https://pagure.io/koji/c/55d5749

So, I think either make the main package archfull and the subpackage noarch as @ausil mentioned, or
BuildArch: x86_64
ExclusiveArch: noarch


I just tried this with one of my packages:

 BuildError: No valid arches were found. tag ['x86_64', 'ppc64', 'ppc64le', 'aarch64'], exclusive ['noarch'], exclude []

Reversing them:

Building target platforms: noarch
Building for target noarch
Wrote: /builddir/build/SRPMS/ReviewBoard-2.5.16-1.el7.src.rpm
Child return code was: 0
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/ReviewBoard.spec'], chrootPath='/var/lib/mock/epel7-build-9870529-784595/root', ...)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/ReviewBoard.spec']
error: Architecture is not included: noarch
Building target platforms: noarch
Building for target noarch
Child return code was: 1
EXCEPTION: [Error()]
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/mockbuild/trace_decorator.py", line 89, in trace
    result = func(*args, **kw)
  File "/usr/lib/python3.6/site-packages/mockbuild/util.py", line 582, in do
    raise exception.Error("Command failed. See logs for output.\n # %s" % (command,), child.returncode)
mockbuild.exception.Error: Command failed. See logs for output.
 # bash --login -c /usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/ReviewBoard.spec

Is there a syntax error above, or is this feature broken?

Also, part of the problem for Modularity is making sure that a package built on a different architecture ends up in the resulting repo for this one.

Let me give a hypothetical example: I want to create a module that will contain MyApp, which is a python noarch application. It has one dependency, SomeLib. SomeLib is also a noarch python package, but building it requires a utility that exists only on x86_64 systems.

In traditional Koji, the way noarch works is that as long as the SRPM that produces SomeLib will build on at least one arch, we can be guaranteed that SomeLib will be available in the buildroot when I want to build MyApp. However, in a modular world, I would need both of these packages in my module metadata file (with a buildorder value that ensures that SomeLib is built first). I want MyApp to be available on x86_64, aarch64 and ppc64le systems, so I set those for the arches of this module.

In this case, my understanding is that the Module Build Service will attempt to build SomeLib individually on each arch and fail on the non-x86_64 ones, thus failing the module build. What I'm looking for here is a guarantee that MBS will behave like Koji does with regard to noarch packages when building a module.
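The buildorder arrangement described above could be sketched in the module metadata roughly like this (a hypothetical modulemd fragment; the package names, refs, and summary are placeholders from the example, not a real module):

```yaml
# Sketch of a modulemd (v2) components section for the MyApp example.
document: modulemd
version: 2
data:
  summary: Hypothetical MyApp module
  components:
    rpms:
      SomeLib:
        rationale: Noarch build dependency of MyApp; only builds on x86_64.
        ref: master
        buildorder: 0
      MyApp:
        rationale: The noarch application itself.
        ref: master
        buildorder: 10
```

Components with a lower `buildorder` are built first, so SomeLib would be in the buildroot before MyApp is attempted.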

Sorry, I of course had that backwards.

BuildArch: noarch
ExclusiveArch: x86_64

This will be a noarch package that koji will build on an x86_64 builder.

So, in your example you want ExclusiveArch: x86_64 aarch64 ppc64le
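Put together, a spec header for the hypothetical SomeLib from the example might look like this sketch (all names and values are placeholders):

```spec
# Hypothetical SomeLib.spec header (sketch): produces a noarch
# binary RPM, but restricts the build to the architectures where
# the required build-time utility exists.
Name:           SomeLib
Version:        1.0
Release:        1%{?dist}
Summary:        Example noarch package with an arch-restricted build
License:        MIT
BuildArch:      noarch
ExclusiveArch:  x86_64 aarch64 ppc64le
```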

I'm not fully sure how MBS does its build-order determination.

Sorry, I of course had that backwards.
BuildArch: noarch
ExclusiveArch: x86_64

See the rest of my post. This did not work either. That's what the log was after "reversing them".

Ah, you were testing epel7? I think bug https://bugzilla.redhat.com/show_bug.cgi?id=1298668 wasn't fixed yet in RHEL 7, so there you would have to have:

BuildArch: noarch
ExclusiveArch: x86_64 noarch

@sgallagh, can you please take another look at this since the above bugzilla was fixed in rawhide? Thanks.

@syeghiay I haven't really looked at this in a long time, since the redesign of Modularity made this considerably less likely to encounter. I'll reopen this if I see it again.

Metadata Update from @syeghiay:
- Issue close_status updated to: It's all good
- Issue status updated to: Closed (was: Open)

5 years ago
