Fix the latest round of inconsistent CI tests
<img alt="0001-Ticket-49295-Fix-latest-CI-test-failures.patch" src="/389-ds-base/issue/raw/77b491585fa1ae164ec6b3adb40ba7d75980b390a2df4c93d456d633dcee97e8-0001-Ticket-49295-Fix-latest-CI-test-failures.patch" />
Metadata Update from @mreynolds:
- Custom field component adjusted to CI test suites
- Custom field origin adjusted to Dev
- Custom field reviewstatus adjusted to review
- Custom field type adjusted to defect
Metadata Update from @spichugi: - Custom field reviewstatus adjusted to ack (was: review)
Metadata Update from @mreynolds: - Issue assigned to mreynolds
486584d..dd7a9fa master -> master
Metadata Update from @mreynolds: - Custom field reviewstatus reset (from ack)
<img alt="0001-Ticket-49295-Fix-CI-tests.patch" src="/389-ds-base/issue/raw/fc7990d119bfefc86fca0e01e7820f3f128edcc72325f56cdff413936e857b64-0001-Ticket-49295-Fix-CI-tests.patch" />
Metadata Update from @mreynolds: - Custom field reviewstatus adjusted to review
Ack from me. Where we have those sleeps for repl, perhaps we can do something smarter? I think @ankity10 encountered this recently ...
Metadata Update from @firstyear: - Custom field reviewstatus adjusted to ack (was: review)
35af6c6..706a1b9 master -> master
Responding to @firstyear: we have a testReplication() call, perhaps we need to call it before we test for replication updates? Or some wrapper function? I actually think we need a set of "test" functions that do the sleeps, wait for replication to complete, etc. Our CI tests always run into these timing issues, so let's make "CI" functions to handle a lot of this for us. Sorry, this is off topic for this ticket :)
I thought something like this existed already? Perhaps we should open a ticket for it on lib389.
There is nothing right now, at least not what I have in mind. I opened:
https://pagure.io/lib389/issue/76
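For illustration, a helper along the lines discussed above might look like the sketch below. This is purely hypothetical, not an existing lib389 API; wait_for_replication(), the description marker attribute, and the timeout are all illustrative:

```python
import time
import ldap

# Hypothetical sketch of a "CI" replication helper, not an existing lib389
# API: write a unique marker value on the supplier, then poll the consumer
# until the change arrives, instead of a blind time.sleep().
def wait_for_replication(supplier, consumer, entry_dn, timeout=30):
    marker = str(time.time()).encode('utf-8')
    supplier.modify_s(entry_dn, [(ldap.MOD_REPLACE, 'description', [marker])])
    deadline = time.time() + timeout
    while time.time() < deadline:
        results = consumer.search_s(entry_dn, ldap.SCOPE_BASE,
                                    attrlist=['description'])
        if results and marker in results[0][1].get('description', []):
            return
        time.sleep(1)
    raise AssertionError("Replication did not converge within %ds" % timeout)
```

Polling like this keeps the common case fast while still bounding the worst case, which is exactly what the hard-coded sleeps in the suites fail to do.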
Fixed another CI test, and fixed compiler warnings from https://pagure.io/389-ds-base/issue/49305
706a1b9..b23201b master -> master
<img alt="0001-Ticket-49295-Fix-CI-tests.patch" src="/389-ds-base/issue/raw/files/feca1c383cf621a2101301aff434eb7fea620222513834530f8147f671fda751-0001-Ticket-49295-Fix-CI-tests.patch" />
Metadata Update from @mreynolds:
- Custom field reviewstatus adjusted to review (was: ack)
- Custom field version adjusted to None
Ack (I can't set the flag due to a DNSSEC bug at the BNE office).
8965a8f..59d1556 master -> master
<img alt="0001-Ticket-49295-Fix-CI-Tests.patch" src="/389-ds-base/issue/raw/files/7b69103cff33e97cc76bf6ef7743076ab23c7197bc48a608f4c5fee3e1e62f23-0001-Ticket-49295-Fix-CI-Tests.patch" />
This should address all the current failures from the Jenkins job.
Metadata Update from @mreynolds: - Custom field reviewstatus adjusted to review (was: ack)
Thanks so much @mreynolds !! This is really great to have fixed all of this finally.
That account test has been a pain point for a while, I think @spichugi or @vashirov were going to work on it soon.
Additionally, @vashirov is closer to having CI integration working, so getting us to 100% green is great! Thank you again!
fc10c8b..1565eb1 master -> master
> Thanks so much @mreynolds !! This is really great to have fixed all of this finally. That account test has been a pain point for a while, I think @spichugi or @vashirov were going to work on it soon.
Yeah, it wasn't fun; there was a lot of "timing" in the account policy tests that made it tricky.
Well, there are more failures on the Jenkins server, but those tests are currently skipped. They are SSL/TLS tests that always fail on the VM, but they pass on my laptop. These are the final ones that need to be addressed before we have 100% coverage:
py.test -v --ignore=tickets/ticket49303_test.py --ignore=tickets/ticket47838_test.py --ignore=tickets/ticket47536_test.py --ignore=tickets/ticket48784_test.py --ignore=tickets/ticket49039_test.py
Okay, @spichugi and I want to work on TLS-by-default for the lib389 tests soon, so perhaps this could help in these cases?
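Until that lands, a cheap way to separate "TLS is broken on this VM" from a genuine regression might be a sanity-check fixture run before the SSL/TLS suites. This is only a sketch using plain python-ldap; the URI and CA path are illustrative:

```python
import ldap

# Sketch of a TLS smoke check (illustrative URI and CA path): if the
# handshake itself fails on the VM, the SSL/TLS suites can be skipped
# with a clear reason instead of being reported as test failures.
def tls_sanity_check(uri="ldap://localhost:389",
                     cacert="/etc/dirsrv/slapd-standalone/ca.crt"):
    conn = ldap.initialize(uri)
    conn.set_option(ldap.OPT_X_TLS_CACERTFILE, cacert)
    conn.set_option(ldap.OPT_X_TLS_NEWCTX, 0)  # re-create the TLS context
    try:
        conn.start_tls_s()
        return True
    except ldap.LDAPError as e:
        print("TLS handshake failed: %s" % e)
        return False
```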
> Well, there are more failures on the Jenkins server, but those tests are currently skipped. They are SSL/TLS tests that always fail on the VM, but they pass on my laptop. These are the final ones that need to be addressed before we have 100% coverage:
>
> py.test -v --ignore=tickets/ticket49303_test.py --ignore=tickets/ticket47838_test.py --ignore=tickets/ticket47536_test.py --ignore=tickets/ticket48784_test.py --ignore=tickets/ticket49039_test.py
I have a different set of failing tests:
Failed suites/import/regression_test.py::test_del_suffix_backend 608.37
Failed suites/paged_results/paged_results_test.py::test_search_abandon 5.03
Failed suites/password/pwdPolicy_warning_test.py::test_expiry_time 0.00
Failed suites/password/pwdPolicy_warning_test.py::test_with_different_password_states 10.03
Failed suites/password/pwdPolicy_warning_test.py::test_default_behavior 0.00
Failed suites/psearch/psearch_test.py::test_psearch 1.00
Failed tickets/ticket47462_test.py::test_ticket47462 7.69
Failed tickets/ticket47838_test.py::test_47838_run_4 7.42
Failed tickets/ticket47838_test.py::test_47838_run_5 7.40
Failed tickets/ticket47838_test.py::test_47838_run_8 7.47
Failed tickets/ticket47838_test.py::test_47838_run_9 7.38
Failed tickets/ticket47838_test.py::test_47838_run_10 7.41
Failed tickets/ticket48226_test.py::test_ticket48226_1 6.34
Failed tickets/ticket49121_test.py::test_ticket49121 9.71
Failed tickets/ticket49287_test.py::test_ticket49287 144.28
This is on RHEL 7.4.
What version (commit) of lib389 do you use? And what versions of python-ldap, nss/nspr are on your VM and laptop? It seems that these environmental differences are affecting test results.
The jenkins server builds 389 from master branch (1.3.7), and it uses upstream lib389. python-lib389 is not used.
Jenkins VM:
nss-softokn-3.30.2-1.0.fc25.x86_64
nss-3.30.2-1.1.fc25.x86_64
nspr-4.14.0-2.fc25.x86_64
But I just updated them to:
nspr.x86_64 4.16.0-1.fc25
nss.x86_64 3.32.0-1.1.fc25
nss-softokn.x86_64 3.32.0-1.2.fc25
We'll see how tonight's run goes.
Fix pwdPolicy_warning_test.py so it does not change the system date for testing purposes.
<img alt="0001-Issue-49295-Fix-CI-tests.patch" src="/389-ds-base/issue/raw/files/eca54ddf3248bea2f57e181167546d647bdfb7475a5b910f90ff4e766e066eda-0001-Issue-49295-Fix-CI-tests.patch" />
Metadata Update from @vashirov: - Custom field reviewstatus adjusted to review (was: ack)
Failed suites/paged_results/paged_results_test.py::test_search_abandon 5.03
Failed suites/password/pwdPolicy_warning_test.py::test_expiry_time 0.00
Failed suites/password/pwdPolicy_warning_test.py::test_with_different_password_states 10.03
Failed suites/password/pwdPolicy_warning_test.py::test_default_behavior 0.00
Failed suites/psearch/psearch_test.py::test_psearch 1.00
These were failing due to a newer version of pyasn1. For some reason the controls object was an empty list instead of an ldap.controls.ppolicy.PasswordPolicyControl object.
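To make the symptom concrete, the affected tests rely on a pattern like the following (a hedged sketch; the DN and password are placeholders). With the broken pyasn1, the decoded control list came back empty:

```python
import ldap
from ldap.controls.ppolicy import PasswordPolicyControl

# Bind with the password policy request control and inspect the decoded
# response controls. DN and password are placeholders.
conn = ldap.initialize("ldap://localhost:389")
msgid = conn.simple_bind("uid=testuser,dc=example,dc=com", "Secret123",
                         serverctrls=[PasswordPolicyControl()])
_rtype, _rdata, _rmsgid, ctrls = conn.result3(msgid)

ppolicy = [c for c in ctrls if isinstance(c, PasswordPolicyControl)]
assert ppolicy, "empty control list -- the pyasn1 decoding regression"
print("timeBeforeExpiration:", ppolicy[0].timeBeforeExpiration)
```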
Metadata Update from @mreynolds: - Custom field reviewstatus adjusted to ack (was: review)
Ack from me too, @vashirov. Thanks!
To ssh://pagure.io/389-ds-base.git 0ee7f61..4c03f30 master -> master
4c03f30..f342881 master -> master
<img alt="0001-Issue-49295-Fix-CI-tests.patch" src="/389-ds-base/issue/raw/files/2f0dc7dde205e6ef303729a72910567689ba0133aab7efcd8e676b8a6d2e7089-0001-Issue-49295-Fix-CI-tests.patch" />
To ssh://pagure.io/389-ds-base.git f342881..5c2fa96 master -> master
Closing this ticket since it is getting really "long", and it looks like we have finally fixed the current CI tests. We will create a new ticket for future test failures.
Metadata Update from @mreynolds:
- Issue close_status updated to: fixed
- Issue set to the milestone: 1.3.7.0
- Issue status updated to: Closed (was: Open)
389-ds-base is moving from Pagure to GitHub. This means that new issues and pull requests will be accepted only in 389-ds-base's GitHub repository.
This issue has been cloned to GitHub and is available here: https://github.com/389ds/389-ds-base/issues/2354
If you want to receive further updates on the issue, please navigate to the GitHub issue and click the subscribe button.
Thank you for understanding. We apologize for any inconvenience.
Metadata Update from @spichugi: - Issue close_status updated to: wontfix (was: fixed)