#8959 The s390x Koji builders are extraordinarily slow
Closed: Fixed 2 months ago by churchyard. Opened 3 months ago by churchyard.

Describe what you would like us to do:


I'm not sure if this is a known thing or not, so I'm reporting it here.

During the Python 3.9 rebuilds I've noticed that the s390x Koji builders are extraordinarily slow. Normally it takes a while to acquire a builder, but once you get it, the build is fast. Now the builds themselves are slower than on armv7hl. Watching the build.log is very painful. Even lines like this:

Wrote: /builddir/build/RPMS/vtk-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-java-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-8.2.0-14.fc33.s390x.rpm

These normally appear within seconds of one another; now there are minutes between each.

Is there something wrong with the IO?
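
For what it's worth, a quick way to check raw write throughput from inside a builder would be something like this (the path and size here are just illustrative; oflag=direct bypasses the page cache so the number reflects the device rather than RAM):

# Hypothetical local-disk write probe, to be run on the builder.
dd if=/dev/zero of=/var/tmp/ioprobe bs=1M count=512 oflag=direct
rm -f /var/tmp/ioprobe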

When do you need this to be done by? (YYYY/MM/DD)


I completely understand if this is not fixed before the data center move.


The s390x builders run on a shared mainframe, so they are affected by builds from the other 'customers' using it. There is a lot of other activity happening that we have no 'window' into, and that can affect general IO. The server is also in a locked facility due to COVID-19, so onsite maintenance is limited. We will contact the team who runs this system to see if there are any known IO issues.

Also, various traffic has to go from Phoenix, AZ to Westford, MA over the Internet, a distance of around 2800 miles (4500 km); if the network is busy anywhere along that path, writes get slow too (though in this case the suspect would be local virtual IO rather than NFS over ssh-fuse).

I think the problem is with local IO; I noticed a similar issue when working on buildvm-s390x-08 (z/VM based) for https://pagure.io/koji/issue/1974

To give some idea of the scale:

Wrote: /builddir/build/RPMS/vtk-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-java-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-8.2.0-14.fc33.s390x.rpm

This is where the build log was when I opened this ticket.

Wrote: /builddir/build/RPMS/vtk-mpich-devel-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-mpich-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-java-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-mpich-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-devel-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-openmpi-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-java-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-openmpi-qt-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-data-8.2.0-14.fc33.noarch.rpm
Wrote: /builddir/build/RPMS/vtk-testing-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-examples-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-devel-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-debugsource-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-devel-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-java-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-devel-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-mpich-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-java-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-mpich-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-mpich-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-devel-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-openmpi-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-java-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/python3-vtk-openmpi-qt-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-testing-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-examples-debuginfo-8.2.0-14.fc33.s390x.rpm
Wrote: /builddir/build/RPMS/vtk-openmpi-debuginfo-8.2.0-14.fc33.s390x.rpm
Executing(%clean): /bin/sh -e /var/tmp/rpm-tmp.MJF6U7
+ umask 022
+ cd /builddir/build/BUILD
+ cd VTK-8.2.0
+ /usr/bin/rm -rf /builddir/build/BUILDROOT/vtk-8.2.0-14.fc33.s390x

This is where it is 45 minutes later.
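
To put a number on the stall without shell access to the builder, one could poll the size of the live log on kojipkgs once a minute; the task ID below is made up, and the URL just follows the usual work/tasks layout:

# Hypothetical: watch how fast build.log grows for a given task.
while sleep 60; do
    printf '%s ' "$(date +%H:%M:%S)"
    curl -sI "https://kojipkgs.fedoraproject.org/work/tasks/1234/45671234/build.log" | grep -i '^content-length'
done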

I have opened an internal ticket on this. I have no ETA on when it will be picked up or looked at.

Metadata Update from @smooge:
- Issue tagged with: groomed, high-trouble, medium-gain

3 months ago

Metadata Update from @smooge:
- Issue priority set to: Waiting on Assignee (was: Needs Review)

3 months ago

Let me know if you need more data.

So, I have noticed this with the z/VM guests after they have been up a while...

Is it only the z/VM guests, i.e., 01-14, or is it all of them?

01-14 are z/VM guests
15-24 are KVM guests.

It looks to me like only the z/VM ones are affected. For example, buildvm-s390x-08 is quite slow even after a reboot, which might be a sign of a deeper issue.
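
A crude way to compare the two groups, assuming ssh access and the buildvm-s390x-NN.s390.fedoraproject.org naming used above, would be to time a small direct write on each guest:

# Hypothetical sweep: 01-14 are z/VM, 15-24 are KVM.
for i in $(seq -w 1 24); do
    host="buildvm-s390x-${i}.s390.fedoraproject.org"
    printf '%s: ' "$host"
    ssh "$host" 'dd if=/dev/zero of=/var/tmp/ioprobe bs=1M count=64 oflag=direct 2>&1 | tail -n1; rm -f /var/tmp/ioprobe'
done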

In the past I have rebooted all of them at the same time... I wonder if there's some effect where some of them drag all the rest down?

I guess rebooting all of them "refreshes" the hypervisor: it frees all the resources the VMs were using previously, so it might be a good thing to do.

Not sure, but it might be 01-14 only.

Indeed, buildvm-s390x-24.s390.fedoraproject.org behaved normally.

I disabled 01-14 (except 8, which I left in).
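
For the record, one way to take them out of rotation with the koji admin CLI (assuming admin credentials) is something like:

# Sketch: disable the z/VM builders, keeping 08 around for debugging.
for i in $(seq -w 1 14); do
    [ "$i" = "08" ] && continue  # leave 08 in for debugging
    koji disable-host "buildvm-s390x-${i}.s390.fedoraproject.org"
done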

I can reboot them later, or leave them if folks want to debug anything more...

Metadata Update from @smooge:
- Issue assigned to smooge

3 months ago

There may be a hardware problem, which will be worked on when COVID restrictions are lessened. The storage has had some flow changes made to try to compensate.

From the mainframe maintainer: "I've tweaked some of the I/O priorities and CPU priorities on the frame. Please let me know if things have improved."

And indeed, the VMs seem a lot more responsive. I am going to re-enable them and see how they do...

So, do things seem back to normal now? They all seem more responsive (still) to me...

ack, they look OK to me

I have not experienced the original issue since. Thanks for taking care of this and please forward my thanks to the mainframe maintainer.

Metadata Update from @churchyard:
- Issue close_status updated to: Fixed
- Issue status updated to: Closed (was: Open)

2 months ago
