
zaza-openstack-tests's People

Contributors

addyess, ajkavanagh, andrewdmcleod, camille-rodriguez, chrismacnaughton, dasm, dshcherb, exsdev0, fnordahl, freyes, gboutry, hemanthnakkina, ionutbalutoiu, javacruft, jguedez, jneo8, lmlg, lolwww, lourot, mkalcok, mylesjp, n-pochet, rodrigogansobarbieri, ryan-beisner, sabaini, sponge-bas, thedac, utkarshbhatthere, wolsen, zhhuabj


zaza-openstack-tests's Issues

gss: sync-images setup-job appears to be racy?

I made an attempt at switching the octavia gate jobs over to the new gss charm and action-based approach.

During the first run I had one success and one failure, log excerpt below.

Could it be that the setup function returns before the action run has fully completed?

2020-06-18 14:17:47 [INFO] Synchronising images using glance-simplestreams-sync
2020-06-18 14:17:55 [INFO] Running `retrofit-image` action on octavia-diskimage-retrofit/0
Traceback (most recent call last):
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/bin/functest-run-suite", line 8, in <module>
    sys.exit(main())
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 204, in main
    force=args.force)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 141, in func_test_runner
    force=force)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 83, in run_env_deployment
    config_steps.get(deployment.model_alias, []))
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/configure.py", line 48, in configure
    run_configure_list(functions)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/configure.py", line 37, in run_configure_list
    utils.get_class(func)()
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/octavia/diskimage_retrofit/setup.py", line 49, in retrofit_amphora_image
    raise_on_failure=True)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 48, in _wrapper
    return run(_run_it())
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 36, in run
    return task.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
    raise self._exception
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
    result = coro.send(None)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 47, in _run_it
    return await f(*args, **kwargs)
  File "/tmp/tmp.rR8wvy7HBe/func-smoke/lib/python3.5/site-packages/zaza/model.py", line 614, in async_run_action
    raise ActionFailed(action_obj)
zaza.model.ActionFailed: Run of action "retrofit-image" with parameters "{'force': False, 'source-image': ''}" on "octavia-diskimage-retrofit/0" failed with "unable to find suitable source image" (id=15 status=failed enqueued=2020-06-18T14:17:56Z started=2020-06-18T14:17:56Z completed
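
If the root cause is that the setup helper returns as soon as the action is enqueued rather than when it finishes, one mitigation is to block explicitly on action completion. A minimal sketch using python-libjuju's Action.wait() (the model connection and unit name are assumed from the log above; this is not the charm's actual fix):

from juju.model import Model


async def retrofit_and_wait(model: Model):
    """Run retrofit-image and block until it reaches a terminal state."""
    unit = model.units['octavia-diskimage-retrofit/0']
    # run_action enqueues the action and returns an Action object
    # immediately; it does not wait for the action to finish.
    action = await unit.run_action('retrofit-image')
    action = await action.wait()
    if action.status != 'completed':
        raise RuntimeError('retrofit-image failed: {}'.format(action.results))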

series upgrade test fails to set openstack-origin for designate-bind

21:57:03 2019-10-11 21:57:00 [INFO] Set origin on designate-bind to openstack-origin
21:57:03 Traceback (most recent call last):
21:57:03   File "/tmp/tmp.1g0MdZzwON/mojo-openstack-specs/xenial/osci-mojo/spec/specs/full_stack/next_series_upgrade/queens/series_upgrade.py", line 42, in <module>
21:57:03     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 99, in test_200_run_series_upgrade
21:57:03     files=self.files)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 313, in series_upgrade_application
21:57:03     files=files)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 392, in series_upgrade
21:57:03     set_origin(application, origin)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 421, in set_origin
21:57:03     model.set_application_config(application, {origin: pocket})
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 48, in _wrapper
21:57:03     return run(_run_it())
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 36, in run
21:57:03     return task.result()
21:57:03   File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
21:57:03     raise self._exception
21:57:03   File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
21:57:03     result = coro.send(None)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 47, in _run_it
21:57:03     return await f(*args, **kwargs)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/model.py", line 498, in async_set_application_config
21:57:03     .set_config(configuration))
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/application.py", line 383, in set_config
21:57:03     return await app_facade.Set(application=self.name, options=config)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/facade.py", line 472, in wrapper
21:57:03     reply = await f(*args, **kwargs)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/_client8.py", line 1124, in Set
21:57:03     reply = await self.rpc(msg)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/facade.py", line 608, in rpc
21:57:03     result = await self.connection.rpc(msg, encoder=TypeEncoder)
21:57:03   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/connection.py", line 455, in rpc
21:57:03     raise errors.JujuAPIError(result)
21:57:03 juju.errors.JujuAPIError: unknown option "openstack-origin"
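
The error indicates designate-bind does not expose an openstack-origin option, yet set_origin tries to set it unconditionally. A defensive sketch (a hypothetical guard, not zaza's merged fix) that checks the application's config keys before setting the option:

import logging

import zaza.model as model


def set_origin_if_supported(application, origin='openstack-origin',
                            pocket='distro'):
    """Set the origin option only if the application exposes it."""
    config = model.get_application_config(application)
    if origin not in config:
        logging.warning('%s has no %s option; skipping', application, origin)
        return
    model.set_application_config(application, {origin: pocket})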

policyd tests fail intermittently

2019-11-13 13:03:41 [INFO] ## Running Test zaza.openstack.charm_tests.policyd.tests.NeutronApiTests ##
2019-11-13 13:03:46 [INFO] Using keystone API V3 (or later) for overcloud auth
2019-11-13 13:03:54 [INFO] Using keystone API V3 (or later) for overcloud auth
2019-11-13 13:03:57 [INFO] test_001_policyd_good_yaml (zaza.openstack.charm_tests.policyd.tests.NeutronApiTests)
2019-11-13 13:03:57 [INFO] Test that the policyd with a good zipped yaml file.
2019-11-13 13:03:57 [INFO]  ...
2019-11-13 13:03:57 [INFO] Attaching good zip file as a resource.
2019-11-13 13:03:58 [INFO] Setting config to {'use-policyd-override': 'True'}
2019-11-13 13:03:58 [INFO] Now checking for file contents: /etc/neutron/policy.d/file1.yaml
2019-11-13 13:04:37 [INFO] Checking for workload status line starts with PO:
2019-11-13 13:04:38 [INFO] Disabling policy override by setting config to false
2019-11-13 13:04:38 [INFO] Setting config to {'use-policyd-override': 'False'}
2019-11-13 13:06:10 [INFO] Checking that /etc/neutron/policy.d/file1.yaml has been removed
2019-11-13 13:06:11 [INFO] OK
2019-11-13 13:06:11 [INFO] ok
2019-11-13 13:06:11 [INFO] test_002_policyd_bad_yaml (zaza.openstack.charm_tests.policyd.tests.NeutronApiTests)
2019-11-13 13:06:11 [INFO] Test bad yaml file in the zip file is handled.
2019-11-13 13:06:11 [INFO]  ...
2019-11-13 13:06:11 [INFO] Attaching bad zip file as a resource
2019-11-13 13:06:11 [INFO] Setting config to {'use-policyd-override': 'True'}
2019-11-13 13:06:12 [INFO] Checking for workload status line starts with PO (broken):
2019-11-13 13:07:03 [INFO] Now checking that file /etc/neutron/policy.d/file2.yaml is not present.
2019-11-13 13:07:04 [INFO] Setting config to {'use-policyd-override': 'False'}
2019-11-13 13:07:55 [INFO] OK
2019-11-13 13:07:55 [INFO] ok
2019-11-13 13:07:55 [INFO] test_003_test_overide_is_observed (zaza.openstack.charm_tests.policyd.tests.NeutronApiTests)
2019-11-13 13:07:55 [INFO] Test that the override is observed by the underlying service.
2019-11-13 13:07:55 [INFO]  ...
2019-11-13 13:07:55 [INFO] Doing policyd override for neutron
2019-11-13 13:07:56 [INFO] First verify that operation works prior to override
2019-11-13 13:07:56 [INFO] Authentication for demo on keystone IP 10.5.0.54
2019-11-13 13:07:56 [INFO] keystone IP 10.5.0.54
2019-11-13 13:07:57 [INFO] Doing policyd override with: {'get_network': '!'}
2019-11-13 13:07:57 [INFO] Setting config to {'use-policyd-override': 'True'}
2019-11-13 13:08:52 [INFO] Now verify that operation doesn't work with override
2019-11-13 13:08:52 [INFO] Authentication for demo on keystone IP 10.5.0.54
2019-11-13 13:08:52 [INFO] keystone IP 10.5.0.54
2019-11-13 13:08:53 [INFO] Setting config to {'use-policyd-override': 'False'}
2019-11-13 13:09:36 [INFO] Finally verify that operation works after removing the override.
2019-11-13 13:09:36 [INFO] Authentication for demo on keystone IP 10.5.0.54
2019-11-13 13:09:36 [INFO] keystone IP 10.5.0.54
2019-11-13 13:09:37 [INFO] ERROR
2019-11-13 13:09:37 [INFO] ======================================================================
2019-11-13 13:09:37 [INFO] ERROR: test_003_test_overide_is_observed (zaza.openstack.charm_tests.policyd.tests.NeutronApiTests)
2019-11-13 13:09:37 [INFO] Test that the override is observed by the underlying service.
2019-11-13 13:09:37 [INFO] ----------------------------------------------------------------------
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
2019-11-13 13:09:37 [INFO]     chunked=chunked,
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2019-11-13 13:09:37 [INFO]     six.raise_from(e, None)
2019-11-13 13:09:37 [INFO]   File "<string>", line 3, in raise_from
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 416, in _make_request
2019-11-13 13:09:37 [INFO]     httplib_response = conn.getresponse()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 1346, in getresponse
2019-11-13 13:09:37 [INFO]     response.begin()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 307, in begin
2019-11-13 13:09:37 [INFO]     version, status, reason = self._read_status()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 276, in _read_status
2019-11-13 13:09:37 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2019-11-13 13:09:37 [INFO] http.client.RemoteDisconnected: Remote end closed connection without response
2019-11-13 13:09:37 [INFO] During handling of the above exception, another exception occurred:
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
2019-11-13 13:09:37 [INFO]     timeout=timeout
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 720, in urlopen
2019-11-13 13:09:37 [INFO]     method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/util/retry.py", line 400, in increment
2019-11-13 13:09:37 [INFO]     raise six.reraise(type(error), error, _stacktrace)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/packages/six.py", line 734, in reraise
2019-11-13 13:09:37 [INFO]     raise value.with_traceback(tb)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
2019-11-13 13:09:37 [INFO]     chunked=chunked,
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2019-11-13 13:09:37 [INFO]     six.raise_from(e, None)
2019-11-13 13:09:37 [INFO]   File "<string>", line 3, in raise_from
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/urllib3/connectionpool.py", line 416, in _make_request
2019-11-13 13:09:37 [INFO]     httplib_response = conn.getresponse()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 1346, in getresponse
2019-11-13 13:09:37 [INFO]     response.begin()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 307, in begin
2019-11-13 13:09:37 [INFO]     version, status, reason = self._read_status()
2019-11-13 13:09:37 [INFO]   File "/usr/lib/python3.6/http/client.py", line 276, in _read_status
2019-11-13 13:09:37 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2019-11-13 13:09:37 [INFO] urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2019-11-13 13:09:37 [INFO] During handling of the above exception, another exception occurred:
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/keystoneauth1/session.py", line 1004, in _send_request
2019-11-13 13:09:37 [INFO]     resp = self.session.request(method, url, **kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/requests/sessions.py", line 533, in request
2019-11-13 13:09:37 [INFO]     resp = self.send(prep, **send_kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/requests/sessions.py", line 646, in send
2019-11-13 13:09:37 [INFO]     r = adapter.send(request, **kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
2019-11-13 13:09:37 [INFO]     raise ConnectionError(err, request=request)
2019-11-13 13:09:37 [INFO] requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2019-11-13 13:09:37 [INFO] During handling of the above exception, another exception occurred:
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 494, in get_client_and_attempt_operation
2019-11-13 13:09:37 [INFO]     networks = neutron_client.list_networks()
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 818, in list_networks
2019-11-13 13:09:37 [INFO]     **_params)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 369, in list
2019-11-13 13:09:37 [INFO]     for r in self._pagination(collection, path, **params):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 384, in _pagination
2019-11-13 13:09:37 [INFO]     res = self.get(path, params=params)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 354, in get
2019-11-13 13:09:37 [INFO]     headers=headers, params=params)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 331, in retry_request
2019-11-13 13:09:37 [INFO]     headers=headers, params=params)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 282, in do_request
2019-11-13 13:09:37 [INFO]     headers=headers)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/client.py", line 340, in do_request
2019-11-13 13:09:37 [INFO]     return self.request(url, method, **kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/neutronclient/client.py", line 328, in request
2019-11-13 13:09:37 [INFO]     resp = super(SessionClient, self).request(*args, **kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/keystoneauth1/adapter.py", line 248, in request
2019-11-13 13:09:37 [INFO]     return self.session.request(url, method, **kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/keystoneauth1/session.py", line 913, in request
2019-11-13 13:09:37 [INFO]     resp = send(**kwargs)
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/keystoneauth1/session.py", line 1020, in _send_request
2019-11-13 13:09:37 [INFO]     raise exceptions.ConnectFailure(msg)
2019-11-13 13:09:37 [INFO] keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://10.5.0.61:9696/v2.0/networks: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2019-11-13 13:09:37 [INFO] During handling of the above exception, another exception occurred:
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 428, in test_003_test_overide_is_observed
2019-11-13 13:09:37 [INFO]     self.get_client_and_attempt_operation(self.keystone_ips[0])
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 499, in get_client_and_attempt_operation
2019-11-13 13:09:37 [INFO]     raise PolicydOperationFailedException()
2019-11-13 13:09:37 [INFO] zaza.openstack.charm_tests.policyd.tests.PolicydOperationFailedException
2019-11-13 13:09:37 [INFO] During handling of the above exception, another exception occurred:
2019-11-13 13:09:37 [INFO] Traceback (most recent call last):
2019-11-13 13:09:37 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 433, in test_003_test_overide_is_observed
2019-11-13 13:09:37 [INFO]     .format(str(e)))
2019-11-13 13:09:37 [INFO] zaza.openstack.utilities.exceptions.PolicydError: Service action failed and should have passed after removing policy override: ""
2019-11-13 13:09:37 [INFO] ----------------------------------------------------------------------
2019-11-13 13:09:37 [INFO] Ran 3 tests in 355.493s
2019-11-13 13:09:37 [INFO] FAILED
2019-11-13 13:09:37 [INFO]  (errors=1)
Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-api/.tox/func/bin/functest-run-suite", line 8, in <module>
    sys.exit(main())
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 162, in main
    bundle=args.bundle)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 107, in func_test_runner
    run_env_deployment(env_deployment, keep_model=preserve_model)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 63, in run_env_deployment
    test_steps.get(deployment.model_alias, []))
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/test.py", line 68, in test
    run_test_list(tests)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/test.py", line 62, in run_test_list
    assert test_result.wasSuccessful(), "Test run failed"
AssertionError: Test run failed
ERROR: InvocationError: '/home/ubuntu/src/neutron-api/.tox/func/bin/functest-run-suite --keep-model'
____________________________________________________ summary ____________________________________________________
ERROR:   func: commands failed

Upgrading MongoDB fails because of an unexpected argument

18:30:00 2020-03-31 18:29:57 [WARNING] About to upgrade mongodb
18:30:00 Traceback (most recent call last):
18:30:00   File "/tmp/tmp.Rq0Sl76MK8/mojo-openstack-specs/trusty/osci-mojo/spec/specs/full_stack/next_series_upgrade_parallel/mitaka/parallel_series_upgrade.py", line 43, in <module>
18:30:00     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
18:30:00   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 233, in test_200_run_series_upgrade
18:30:00     files=self.files,
18:30:00 TypeError: async_series_upgrade_non_leaders_first() got an unexpected keyword argument 'origin'
18:30:00 sys:1: RuntimeWarning: coroutine 'async_series_upgrade_application' was never awaited
18:30:00 Exception ignored in: <bound method BaseEventLoop.__del__ of <_UnixSelectorEventLoop running=False closed=True debug=False>>
18:30:00 Traceback (most recent call last):
18:30:00   File "/usr/lib/python3.5/asyncio/base_events.py", line 431, in __del__
18:30:00   File "/usr/lib/python3.5/asyncio/unix_events.py", line 58, in close
18:30:00   File "/usr/lib/python3.5/asyncio/unix_events.py", line 139, in remove_signal_handler
18:30:00   File "/usr/lib/python3.5/signal.py", line 47, in signal
18:30:00 TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL, or a callable object

zaza.openstack.charm_tests.neutron.setup.basic_overcloud_network: ValueError: invalid literal for int() with base 10: ''

2019-10-02 09:21:06 [INFO] Attaching additional port to instance, connected to net id: e254b5c3-fe0a-4cda-b84c-5695de7d4033
2019-10-02 09:21:14 [INFO] Trying to get mac address from port:38c7d263-07df-48ad-9d78-4aeb7772f2fc
Traceback (most recent call last):
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/bin/functest-run-suite", line 10, in <module>
    sys.exit(main())
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 162, in main
    bundle=args.bundle)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 107, in func_test_runner
    run_env_deployment(env_deployment, keep_model=preserve_model)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 58, in run_env_deployment
    config_steps.get(deployment.model_alias, []))
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/charm_lifecycle/configure.py", line 48, in configure
    run_configure_list(functions)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/charm_lifecycle/configure.py", line 37, in run_configure_list
    utils.get_class(func)()
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/charm_tests/neutron/setup.py", line 89, in basic_overcloud_network
    keystone_session=undercloud_ks_sess)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/configure/network.py", line 231, in setup_gateway_ext_port
    add_dataport_to_netplan=add_dataport_to_netplan)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/utilities/openstack.py", line 615, in configure_gateway_ext_port
    dvr_mode=dvr_mode)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/utilities/openstack.py", line 520, in add_interface_to_netplan
    server_name, application_name)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/utilities/juju.py", line 113, in get_unit_name_from_host_name
    for u in model.get_units(application_name=application)
  File "/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/utilities/juju.py", line 114, in <listcomp>
    if int(u.data['machine-id']) == int(machine_number)]
ValueError: invalid literal for int() with base 10: ''
ERROR: InvocationError: '/home/ubuntu/src/charm-octavia/build/builds/octavia/.tox/func-smoke/bin/functest-run-suite --keep-model --smoke'
______________________________________________________________________________ summary _______________________________________________________________________________
ERROR:   func-smoke: commands failed
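
The crash comes from int(u.data['machine-id']) when a unit reports an empty machine-id, as subordinate units (which have no machine of their own) may do. One plausible guard, sketched as a hypothetical helper rather than the merged fix, is to skip such units:

import zaza.model as model


def units_on_machine(application, machine_number):
    """Return units of `application` on machine `machine_number`.

    Units whose 'machine-id' is empty are skipped instead of
    crashing on int('').
    """
    return [
        u for u in model.get_units(application_name=application)
        if u.data.get('machine-id')
        and int(u.data['machine-id']) == int(machine_number)
    ]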

PerconaClusterColdStartTest fails because of a race condition.

Used in charm-percona-cluster, the PerconaClusterColdStartTest test, specifically test_100_cold_start_bootstrap, tries to start instances before they are completely shut off. This causes the test to fail.

2019-08-30 17:42:43 [INFO] ## Running Test zaza.openstack.charm_tests.mysql.tests.PerconaClusterColdStartTest ##
2019-08-30 17:42:44 [INFO] Using keystone API V3 (or later) for overcloud auth
2019-08-30 17:42:45 [INFO] AUTH_URL: http://10.245.161.156:5000/v3, api_ver: 3
2019-08-30 17:42:45 [INFO] Using keystone API V3 (or later) for undercloud auth
2019-08-30 17:42:45 [INFO] test_100_cold_start_bootstrap (zaza.openstack.charm_tests.mysql.tests.PerconaClusterColdStartTest)
2019-08-30 17:42:45 [INFO] Bootstrap a non-leader node.
2019-08-30 17:42:45 [INFO] ...
2019-08-30 17:42:46 [INFO] Stopping instances: ['15795301-d477-48e8-897c-011a6c43a28e', '327f5ad2-a8d0-4c94-a1d6-db704140e7d1', '921dd945-fd52-417c-abc7-3868a33701a4']
sleep 60 sec
sleep done
2019-08-30 17:43:47 [INFO] Starting instances: ['921dd945-fd52-417c-abc7-3868a33701a4', '327f5ad2-a8d0-4c94-a1d6-db704140e7d1', '15795301-d477-48e8-897c-011a6c43a28e']
2019-08-30 17:43:47 [INFO] ERROR
2019-08-30 17:43:47 [INFO] ======================================================================
2019-08-30 17:43:47 [INFO] ERROR: test_100_cold_start_bootstrap (zaza.openstack.charm_tests.mysql.tests.PerconaClusterColdStartTest)
2019-08-30 17:43:47 [INFO] Bootstrap a non-leader node.
2019-08-30 17:43:47 [INFO] ----------------------------------------------------------------------
2019-08-30 17:43:47 [INFO] Traceback (most recent call last):
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/mysql/tests.py", line 294, in test_100_cold_start_bootstrap
2019-08-30 17:43:47 [INFO] self.nova_client.servers.start(uuid)
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/novaclient/v2/servers.py", line 1107, in start
2019-08-30 17:43:47 [INFO] return self._action('os-start', server, None)
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/novaclient/v2/servers.py", line 2073, in _action
2019-08-30 17:43:47 [INFO] info=info, **kwargs)
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/novaclient/v2/servers.py", line 2084, in _action_return_resp_and_body
2019-08-30 17:43:47 [INFO] return self.api.client.post(url, body=body)
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/keystoneauth1/adapter.py", line 392, in post
2019-08-30 17:43:47 [INFO] return self.request(url, 'POST', **kwargs)
2019-08-30 17:43:47 [INFO] File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/novaclient/client.py", line 78, in request
2019-08-30 17:43:47 [INFO] raise exceptions.from_response(resp, body, url, method)
2019-08-30 17:43:47 [INFO] novaclient.exceptions.Conflict: Cannot 'start' instance 921dd945-fd52-417c-abc7-3868a33701a4 while it is in vm_state active (HTTP 409) (Request-ID: req-aa857f23-163e-4c1e-b4a5-c32f32145b5c)
2019-08-30 17:43:47 [INFO] ----------------------------------------------------------------------
2019-08-30 17:43:47 [INFO] Ran 1 test in 64.434s
2019-08-30 17:43:47 [INFO] FAILED
2019-08-30 17:43:47 [INFO] (errors=1)
Traceback (most recent call last):
  File "/home/ubuntu/charms/percona-cluster/.tox/func/bin/functest-run-suite", line 10, in <module>
    sys.exit(main())
  File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 140, in main
    bundle=args.bundle)
  File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 76, in func_test_runner
    test_steps.get(model_alias, []))
  File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/test.py", line 68, in test
    run_test_list(tests)
  File "/home/ubuntu/charms/percona-cluster/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/test.py", line 62, in run_test_list
    assert test_result.wasSuccessful(), "Test run failed"
AssertionError: Test run failed
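
A straightforward mitigation is to poll Nova until every instance reports SHUTOFF before issuing the start calls, rather than relying on a fixed 60-second sleep. A minimal sketch against the novaclient API seen in the traceback (nova_client and the UUID list are assumed to come from the test's setup):

import time


def wait_for_shutoff(nova_client, uuids, timeout=300, interval=10):
    """Poll Nova until all instances are SHUTOFF, or raise on timeout."""
    deadline = time.time() + timeout
    pending = set(uuids)
    while pending and time.time() < deadline:
        # Keep only the instances that have not reached SHUTOFF yet.
        pending = {u for u in pending
                   if nova_client.servers.get(u).status != 'SHUTOFF'}
        if pending:
            time.sleep(interval)
    if pending:
        raise TimeoutError('Instances never reached SHUTOFF: {}'.format(pending))

# Only once this returns is it safe to call nova_client.servers.start(uuid).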

The ``config_change`` helper does not detect an attempt to set config that is already applied

While there is code intended to detect this situation, it does not appear to work correctly, and a test whose alternate_config is set to a value already applied to the charm will time out.

# we need to compare config values to what is already applied before
# attempting to set them. otherwise the model will behave differently
# than we would expect while waiting for completion of the change
app_config = self.config_current(
    application_name, keys=alternate_config.keys()
)
if all(item in app_config.items()
       for item in alternate_config.items()):
    logging.debug('alternate_config equals what is already applied '
                  'config')
    yield

Debug output:

$ functest-test -m zaza-855d7b4a2f6b -t zaza.openstack.charm_tests.neutron.tests.NeutronOpenvSwitchTest
2020-04-20 06:01:29 [INFO] ## Running Test zaza.openstack.charm_tests.neutron.tests.NeutronOpenvSwitchTest ##
2020-04-20 06:01:35 [INFO] Using keystone API V3 (or later) for overcloud auth
2020-04-20 06:01:37 [INFO] test_201_l2pop_propagation (zaza.openstack.charm_tests.neutron.tests.NeutronOpenvSwitchTest)
2020-04-20 06:01:37 [INFO] Verify that l2pop setting propagates to neutron-ovs.
2020-04-20 06:01:37 [INFO]  ... 
2020-04-20 06:01:37 [INFO] HELLO app_config="{'l2-population': True}"
2020-04-20 06:01:37 [INFO] HELLO alternate_config="{'l2-population': 'True'}"
2020-04-20 06:01:37 [INFO] HELLO item="('l2-population', 'True')"
2020-04-20 06:01:37 [INFO] HELLO item in app_config.items()="False"
2020-04-20 06:01:37 [INFO] Changing charm setting to {'l2-population': 'True'}
2020-04-20 06:01:38 [INFO] Waiting for units to execute config-changed hook
2020-04-20 06:01:38 [INFO] Waiting for at least one unit with agent status "executing"
2020-04-20 06:02:38 [INFO] ERROR
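
The HELLO lines above show the underlying bug: the applied config value is a boolean (True) while alternate_config carries the string 'True', so the membership check never matches and the helper proceeds to set, and wait on, a no-op change. One possible fix, sketched here as a hypothetical helper, is to compare string representations of both sides:

def config_already_applied(app_config, alternate_config):
    """Return True if every alternate_config item is already applied.

    Values are compared as strings so that a boolean True in the applied
    config matches the string 'True' supplied by a test.
    """
    return all(
        str(app_config.get(key)) == str(value)
        for key, value in alternate_config.items()
    )

# e.g. config_already_applied({'l2-population': True},
#                             {'l2-population': 'True'}) -> True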

Keystone SAML Mellon tests are failing

The test originally did a simple check that we were redirected. It is time to write a test fixture so that we can control the IdP side of the equation. Regardless, the tests as they exist need updating.

Relying on https://samltest.id is probably not a good idea. It has given 503 errors.

Also, the certificate generation code in Zaza is not removing the `-----END CERTIFICATE-----` string before attaching it as a resource to keystone-saml-mellon.

Related is focal OpenStack dashboard returning 500 errors: https://bugs.launchpad.net/charm-keystone-saml-mellon/+bug/1879791
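
If the resource is expected to carry only the base64 certificate body, a small helper could strip the PEM marker lines before the attach step. A sketch under that assumption (the format keystone-saml-mellon actually expects should be confirmed first):

def strip_pem_markers(pem_text):
    """Return the base64 body of a PEM certificate without the
    BEGIN/END CERTIFICATE marker lines."""
    return ''.join(
        line for line in pem_text.splitlines()
        if line and not line.startswith('-----')
    )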

neutron: intermittent test failures due to charm readiness != API readiness

Test output:

2019-11-29 08:03:12 [INFO] Setting config to {'use-policyd-override': 'False'}
2019-11-29 08:03:13 [INFO] Waiting for at least one unit with agent status "executing"
2019-11-29 08:04:07 [INFO] Finally verify that operation works after removing the override.
2019-11-29 08:04:07 [INFO] Authentication for demo on keystone IP 10.5.0.67
2019-11-29 08:04:07 [INFO] keystone IP 10.5.0.67
2019-11-29 08:04:08 [INFO] ERROR
2019-11-29 08:04:08 [INFO] ======================================================================
2019-11-29 08:04:08 [INFO] ERROR: test_003_test_overide_is_observed (zaza.openstack.charm_tests.policyd.tests.NeutronApiTests)
2019-11-29 08:04:08 [INFO] Test that the override is observed by the underlying service.
2019-11-29 08:04:08 [INFO] ----------------------------------------------------------------------
2019-11-29 08:04:08 [INFO] Traceback (most recent call last):
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 541, in get_client_and_attempt_operation
2019-11-29 08:04:08 [INFO]     networks = neutron_client.list_networks()
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 818, in list_networks
2019-11-29 08:04:08 [INFO]     **_params)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 369, in list
2019-11-29 08:04:08 [INFO]     for r in self._pagination(collection, path, **params):
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 384, in _pagination
2019-11-29 08:04:08 [INFO]     res = self.get(path, params=params)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 354, in get
2019-11-29 08:04:08 [INFO]     headers=headers, params=params)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 331, in retry_request
2019-11-29 08:04:08 [INFO]     headers=headers, params=params)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 294, in do_request
2019-11-29 08:04:08 [INFO]     self._handle_fault_response(status_code, replybody, resp)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 269, in _handle_fault_response
2019-11-29 08:04:08 [INFO]     exception_handler_v20(status_code, error_body)
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/neutronclient/v2_0/client.py", line 93, in exception_handler_v20
2019-11-29 08:04:08 [INFO]     request_ids=request_ids)
2019-11-29 08:04:08 [INFO] neutronclient.common.exceptions.ServiceUnavailable: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
2019-11-29 08:04:08 [INFO] <html><head>
2019-11-29 08:04:08 [INFO] <title>503 Service Unavailable</title>
2019-11-29 08:04:08 [INFO] </head><body>
2019-11-29 08:04:08 [INFO] <h1>Service Unavailable</h1>
2019-11-29 08:04:08 [INFO] <p>The server is temporarily unable to service your
2019-11-29 08:04:08 [INFO] request due to maintenance downtime or capacity
2019-11-29 08:04:08 [INFO] problems. Please try again later.</p>
2019-11-29 08:04:08 [INFO] <hr>
2019-11-29 08:04:08 [INFO] <address>Apache/2.4.29 (Ubuntu) Server at 10.5.0.52 Port 9696</address>
2019-11-29 08:04:08 [INFO] </body></html>
2019-11-29 08:04:08 [INFO] During handling of the above exception, another exception occurred:
2019-11-29 08:04:08 [INFO] Traceback (most recent call last):
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 473, in test_003_test_overide_is_observed
2019-11-29 08:04:08 [INFO]     self.get_client_and_attempt_operation(self.keystone_ips[0])
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 546, in get_client_and_attempt_operation
2019-11-29 08:04:08 [INFO]     raise PolicydOperationFailedException()
2019-11-29 08:04:08 [INFO] zaza.openstack.charm_tests.policyd.tests.PolicydOperationFailedException
2019-11-29 08:04:08 [INFO] During handling of the above exception, another exception occurred:
2019-11-29 08:04:08 [INFO] Traceback (most recent call last):
2019-11-29 08:04:08 [INFO]   File "/home/ubuntu/src/neutron-api/.tox/func-smoke/lib/python3.6/site-packages/zaza/openstack/charm_tests/policyd/tests.py", line 478, in test_003_test_overide_is_observed
2019-11-29 08:04:08 [INFO]     .format(str(e)))
2019-11-29 08:04:08 [INFO] zaza.openstack.utilities.exceptions.PolicydError: Service action failed and should have passed after removing policy override: ""

Neutron API charm log:

2019-11-29 08:04:05 INFO juju-log Unit is ready
2019-11-29 08:07:49 INFO juju-log ...

neutron-api syslog:

Nov 29 08:04:04 juju-c66fd6-zaza-7aa846afa0cd-6 systemd[1]: Stopped OpenStack Neutron Server.
Nov 29 08:04:04 juju-c66fd6-zaza-7aa846afa0cd-6 systemd[1]: Started OpenStack Neutron Server.
Nov 29 08:17:01 juju-c66fd6-zaza-7aa846afa0cd-6 CRON ...

Neutron API server daemon log:

2019-11-29 08:04:03.901 14066 INFO oslo_service.service [-] Child 14279 killed by signal 15
2019-11-29 08:04:07.039 20193 INFO neutron.common.config [-] Logging enabled!
...
2019-11-29 08:04:09.014 20193 INFO neutron.service [req-b336a6d0-daa1-426b-ae65-d6d395310269 - - - - -] Neutron service started, listening on 0.0.0.0:9676

Apache error log:

[Fri Nov 29 08:04:05.494436 2019] [proxy:error] [pid 18852:tid 140220213229312] (111)Connection refused: AH00957: HTTP: attempt to connect to 127.0.0.1:9676 (localhost) failed
[Fri Nov 29 08:04:05.494493 2019] [proxy_http:error] [pid 18852:tid 140220213229312] [client 10.5.0.52:40388] AH01114: HTTP: failed to make connection to backend: localhost
[Fri Nov 29 08:04:08.358579 2019] [proxy:error] [pid 18852:tid 140220204836608] (111)Connection refused: AH00957: HTTP: attempt to connect to 127.0.0.1:9676 (localhost) failed
[Fri Nov 29 08:04:08.358625 2019] [proxy_http:error] [pid 18852:tid 140220204836608] [client 10.5.0.52:40398] AH01114: HTTP: failed to make connection to backend: localhost

Apache access log:

10.5.0.52:443 10.5.0.52 - - [29/Nov/2019:08:03:20 +0000] "GET /v2.0/networks/b7c57241-8cf9-4b51-adc1-22f369f45803?fields=provider%3Aphysical_network&fields=provider%3Anetwork_type HTTP/1.1" 200 369 "-" "python-neutronclient"

10.5.0.52:443 10.5.0.52 - - [29/Nov/2019:08:04:05 +0000] "GET /v2.0/ports?device_id=52ad1fb7-c64b-4ec5-adb5-4f9a01825a37&fields=binding%3Ahost_id&fields=binding%3Avif_type HTTP/1.1" 503 3106 "-" "python-neutronclient"
10.5.0.52:443 10.5.0.52 - - [29/Nov/2019:08:04:08 +0000] "GET /v2.0/networks HTTP/1.1" 503 3106 "-" "python-neutronclient"

10.5.0.52:443 10.5.0.52 - - [29/Nov/2019:08:04:21 +0000] "GET /v2.0/ports?device_id=b1f990d7-b1de-457c-ad40-29f076ce3bbc&fields=binding%3Ahost_id&fields=binding%3Avif_type HTTP/1.1" 200 2914 "-" "python-neutronclient"

So, what's going on here is that the second the neutron-api charm signals it is done executing (which it is), the test attempts to use the service, and the service does not appear to be ready to answer requests until at least a few seconds later.
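
Since "charm done executing" does not imply API readiness, one mitigation is to retry the first post-change API call until the backend answers. A sketch using tenacity (already in the dependency chain, per the tracebacks above), retrying on the ServiceUnavailable seen in this failure; the function name is hypothetical:

import tenacity
from neutronclient.common import exceptions as neutron_exc


@tenacity.retry(
    retry=tenacity.retry_if_exception_type(neutron_exc.ServiceUnavailable),
    wait=tenacity.wait_fixed(5),
    stop=tenacity.stop_after_attempt(12),
    reraise=True,
)
def list_networks_when_ready(neutron_client):
    # Retries while Apache still returns 503 because the neutron-server
    # backend has not finished restarting behind the proxy.
    return neutron_client.list_networks()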

series upgrade does not unseal vault

Series upgrade test gets blocked because, after rebooting a vault unit, it comes up sealed and the test times out, as 'Unit is sealed' is not a whitelisted status.
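
One option would be for the test to unseal each vault unit after it reboots, before waiting on workload status. A rough sketch using the hvac client (the vault URL and the availability of stored unseal keys are assumptions; zaza's vault helpers may already provide equivalents):

import hvac


def unseal_unit(vault_url, unseal_keys):
    """Submit unseal keys to a rebooted vault unit until it reports unsealed."""
    client = hvac.Client(url=vault_url)
    for key in unseal_keys:
        if not client.sys.read_seal_status()['sealed']:
            break
        client.sys.submit_unseal_key(key)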

Ceph Rados Gateway and Swift Proxy should share tests

Currently swift is tested via its integration with glance, whereas Ceph RadosGW is tested directly. This doesn't make sense; both should share some common object store tests (see the sketch below). In my opinion both should exercise the glance integration, but I'd settle for them just having some common tests.
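
A shared test could target the Swift-compatible API that both services expose. A rough sketch of what such a common object-store test might look like using python-swiftclient (the function name and session plumbing are assumptions):

import swiftclient.client as swift


def object_store_round_trip(keystone_session):
    """Write and read back an object through the Swift-compatible API.

    Works against both swift-proxy and ceph-radosgw, which is what
    makes it a candidate for a shared test.
    """
    conn = swift.Connection(session=keystone_session)
    conn.put_container('zaza-test')
    conn.put_object('zaza-test', 'hello.txt', contents=b'hello world')
    _headers, body = conn.get_object('zaza-test', 'hello.txt')
    assert body == b'hello world'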

post-series-upgrade hook fails for percona

When mysql has been upgraded, the post-series-upgrade hook fails with

2019-10-10 10:13:31 INFO juju-log Installing ['percona-xtradb-cluster-server'] with options: ['--option=Dpkg::Options::=--force-confold']
2019-10-10 10:13:31 DEBUG post-series-upgrade Reading package lists...
2019-10-10 10:13:31 DEBUG post-series-upgrade Building dependency tree...
2019-10-10 10:13:31 DEBUG post-series-upgrade Reading state information...
2019-10-10 10:13:31 DEBUG post-series-upgrade percona-xtradb-cluster-server is already the newest version (5.6.37-26.21-0ubuntu0.16.04.2).
2019-10-10 10:13:31 DEBUG post-series-upgrade 0 upgraded, 0 newly installed, 0 to remove and 26 not upgraded.
2019-10-10 10:13:31 INFO juju-log Starting mysqld --wsrep-provider='none' and waiting ...
2019-10-10 10:13:32 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:13:42 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:13:52 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:02 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:13 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:23 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:33 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:43 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:14:53 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:15:03 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:15:13 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:15:23 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-10 10:15:33 DEBUG post-series-upgrade Traceback (most recent call last):
2019-10-10 10:15:33 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-0/charm/hooks/post-series-upgrade", line 1132, in <module>
2019-10-10 10:15:33 DEBUG post-series-upgrade     main()
2019-10-10 10:15:33 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-0/charm/hooks/post-series-upgrade", line 1122, in main
2019-10-10 10:15:33 DEBUG post-series-upgrade     hooks.execute(sys.argv)
2019-10-10 10:15:33 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-0/charm/charmhelpers/core/hookenv.py", line 914, in execute
2019-10-10 10:15:33 DEBUG post-series-upgrade     self._hooks[hook_name]()
2019-10-10 10:15:33 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-0/charm/hooks/post-series-upgrade", line 434, in series_upgrade
2019-10-10 10:15:33 DEBUG post-series-upgrade     check_for_socket(MYSQL_SOCKET, exists=True)
2019-10-10 10:15:33 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-0/charm/hooks/percona_utils.py", line 1204, in check_for_socket
2019-10-10 10:15:33 DEBUG post-series-upgrade     .format(file_name, attempts))
2019-10-10 10:15:33 DEBUG post-series-upgrade Exception: Socket /var/run/mysqld/mysqld.sock not found after 12 attempts.
2019-10-10 10:15:33 ERROR juju.worker.uniter.operation runhook.go:132 hook "post-series-upgrade" failed: exit status 1

Intermittent failure getting floating ip in NeutronNetworkingTest

A floating IP is evidently assigned to both instances; however, during the test, the test runner is unable to correctly interpret the information received from Nova.

2020-03-07 10:28:58 [INFO] test_instances_have_networking (zaza.openstack.charm_tests.neutron.tests.NeutronNetworkingTest)
2020-03-07 10:28:58 [INFO] Validate North/South and East/West networking.
2020-03-07 10:28:58 [INFO]  ... 
2020-03-07 10:29:02 [INFO] Using keystone API V3 (or later) for overcloud auth
2020-03-07 10:29:06 [INFO] Launching instance zaza-neutrontests-ins-1
2020-03-07 10:29:09 [INFO] Checking instance is active
2020-03-07 10:29:09 [INFO] BUILD
2020-03-07 10:29:14 [INFO] BUILD
2020-03-07 10:29:17 [INFO] BUILD
2020-03-07 10:29:22 [INFO] BUILD
2020-03-07 10:29:30 [INFO] BUILD
2020-03-07 10:29:46 [INFO] ACTIVE
2020-03-07 10:29:46 [INFO] Checking cloud init is complete
2020-03-07 10:30:57 [INFO] Assigning floating ip.
2020-03-07 10:30:57 [INFO] Creating floatingip
2020-03-07 10:31:18 [INFO] Assigned floating IP 172.17.105.227 to zaza-neutrontests-ins-1
2020-03-07 10:31:18 [INFO] Testing ssh access.
2020-03-07 10:31:18 [INFO] Attempting to ssh to zaza-neutrontests-ins-1(172.17.105.227)
2020-03-07 10:31:18 [INFO] Connected (version 2.0, client OpenSSH_7.6p1)
2020-03-07 10:31:18 [INFO] Authentication (publickey) successful!
2020-03-07 10:31:18 [INFO] Running uname -n on zaza-neutrontests-ins-1
2020-03-07 10:31:21 [INFO] SSH to zaza-neutrontests-ins-1(172.17.105.227) succesfull
2020-03-07 10:31:25 [INFO] Using keystone API V3 (or later) for overcloud auth
2020-03-07 10:31:28 [INFO] Launching instance zaza-neutrontests-ins-2
2020-03-07 10:31:29 [INFO] Checking instance is active
2020-03-07 10:31:29 [INFO] BUILD
2020-03-07 10:31:31 [INFO] BUILD
2020-03-07 10:31:33 [INFO] BUILD
2020-03-07 10:31:38 [INFO] BUILD
2020-03-07 10:31:48 [INFO] BUILD
2020-03-07 10:32:05 [INFO] ACTIVE
2020-03-07 10:32:05 [INFO] Checking cloud init is complete
2020-03-07 10:32:40 [INFO] Assigning floating ip.
2020-03-07 10:32:40 [WARNING] A floating IP already exists but ports do not match Potentially creating more than one.
2020-03-07 10:32:40 [INFO] Creating floatingip
2020-03-07 10:32:52 [INFO] Assigned floating IP 172.17.105.203 to zaza-neutrontests-ins-2
2020-03-07 10:32:52 [INFO] Testing ssh access.
2020-03-07 10:32:52 [INFO] Attempting to ssh to zaza-neutrontests-ins-2(172.17.105.203)
2020-03-07 10:32:52 [INFO] Connected (version 2.0, client OpenSSH_7.6p1)
2020-03-07 10:32:52 [INFO] Authentication (publickey) successful!
2020-03-07 10:32:52 [INFO] Running uname -n on zaza-neutrontests-ins-2
2020-03-07 10:32:54 [INFO] SSH to zaza-neutrontests-ins-2(172.17.105.203) succesfull
2020-03-07 10:35:03 [INFO] ERROR
2020-03-07 10:35:03 [INFO] ======================================================================
2020-03-07 10:35:03 [INFO] ERROR: test_instances_have_networking (zaza.openstack.charm_tests.neutron.tests.NeutronNetworkingTest)
2020-03-07 10:35:03 [INFO] Validate North/South and East/West networking.
2020-03-07 10:35:03 [INFO] ----------------------------------------------------------------------
2020-03-07 10:35:03 [INFO] Traceback (most recent call last):
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/neutron/tests.py", line 638, in test_instances_have_networking
2020-03-07 10:35:03 [INFO]     self.validate_instance_can_reach_other(instance_1, instance_2, verify)
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/tenacity/__init__.py", line 311, in wrapped_f
2020-03-07 10:35:03 [INFO]     return self.call(f, *args, **kw)
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/tenacity/__init__.py", line 391, in call
2020-03-07 10:35:03 [INFO]     do = self.iter(retry_state=retry_state)
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/tenacity/__init__.py", line 350, in iter
2020-03-07 10:35:03 [INFO]     raise retry_exc.reraise()
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/tenacity/__init__.py", line 168, in reraise
2020-03-07 10:35:03 [INFO]     raise self.last_attempt.result()
2020-03-07 10:35:03 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 398, in result
2020-03-07 10:35:03 [INFO]     return self.__get_result()
2020-03-07 10:35:03 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 357, in __get_result
2020-03-07 10:35:03 [INFO]     raise self._exception
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/tenacity/__init__.py", line 394, in call
2020-03-07 10:35:03 [INFO]     result = fn(*args, **kwargs)
2020-03-07 10:35:03 [INFO]   File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/neutron/tests.py", line 666, in validate_instance_can_reach_other
2020-03-07 10:35:03 [INFO]     floating_2 = floating_ips_from_instance(instance_2)[0]
2020-03-07 10:35:03 [INFO] IndexError: list index out of range
2020-03-07 10:35:03 [INFO] ----------------------------------------------------------------------
2020-03-07 10:35:03 [INFO] Ran 1 test in 370.065s
2020-03-07 10:35:03 [INFO] FAILED
2020-03-07 10:35:03 [INFO]  (errors=1)
Traceback (most recent call last):
  File "/tmp/tmp.gvmbocayuF/func-smoke/bin/functest-run-suite", line 8, in <module>
    sys.exit(main())
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 191, in main
    force=args.force)
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 128, in func_test_runner
    force=force)
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 79, in run_env_deployment
    test_steps.get(deployment.model_alias, []))
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 117, in test
    run_test_list(tests)
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 111, in run_test_list
    get_test_runners()[runner](testcase, _testcase)
  File "/tmp/tmp.gvmbocayuF/func-smoke/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 73, in run_unittest
    assert test_result.wasSuccessful(), "Test run failed"
AssertionError: Test run failed
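
The IndexError at floating_ips_from_instance(instance_2)[0] means no floating IP was found on the instance record even though one had just been assigned, suggesting a stale server object. A defensive sketch (hypothetical helper shape, not zaza's actual code) that refreshes the record and fails with a clearer error:

def floating_ips_from_instance(instance):
    """Return all floating IPs attached to a Nova server object."""
    return [
        address['addr']
        for addresses in instance.addresses.values()
        for address in addresses
        if address.get('OS-EXT-IPS:type') == 'floating'
    ]


def first_floating_ip(nova_client, instance):
    # Refresh the server record first; the cached copy may predate
    # the floating IP assignment.
    instance = nova_client.servers.get(instance.id)
    ips = floating_ips_from_instance(instance)
    if not ips:
        raise RuntimeError('No floating IP found on {}'.format(instance.name))
    return ips[0]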

Upgrade fails if series upgrade is initiated while juju run is still running.

17:21:00 2019-10-10 17:20:58 [INFO] Prepare series upgrade on 10
17:21:00 ERROR unit designate/1 is not ready to start a series upgrade; its agent status is: "executing" running action juju-run
17:21:00 Traceback (most recent call last):
17:21:00   File "/tmp/tmp.PKzDYw8VZG/mojo-openstack-specs/xenial/osci-mojo/spec/specs/full_stack/next_series_upgrade/queens/series_upgrade.py", line 42, in <module>
17:21:00     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
17:21:00   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 99, in test_200_run_series_upgrade
17:21:00     files=self.files)
17:21:00   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 313, in series_upgrade_application
17:21:00     files=files)
17:21:00   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 372, in series_upgrade
17:21:00     model.prepare_series_upgrade(machine_num, to_series=to_series)
17:21:00   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/model.py", line 1517, in prepare_series_upgrade
17:21:00     subprocess.check_call(cmd)
17:21:00   File "/usr/lib/python3.5/subprocess.py", line 581, in check_call
17:21:00     raise CalledProcessError(retcode, cmd)
17:21:00 subprocess.CalledProcessError: Command '['juju', 'upgrade-series', '-m', 'auto-osci-sv13', '10', 'prepare', 'bionic', '--yes']' returned non-zero exit status 1
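
Since `juju upgrade-series ... prepare` refuses to run while a unit agent is still executing, the test could block until all units are idle first. A minimal sketch using existing zaza.model helpers, where the guard is the suggested change:

import zaza.model as model


def prepare_when_idle(machine_num, to_series):
    # Wait for any in-flight juju-run / hook executions to finish so no
    # agent is still in "executing" before preparing the upgrade.
    model.block_until_all_units_idle()
    model.prepare_series_upgrade(machine_num, to_series=to_series)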

add_interface_to_netplan helper attempts to scp to wrong IP

2019-11-12 17:05:18 [INFO] Attaching additional port to instance, connected to net id: e254b5c3-fe0a-4cda-b84c-5695de7d4033
2019-11-12 17:05:26 [INFO] Trying to get mac address from port:0befc420-e868-470b-bd10-55a94b4d19c6
lost connection
Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-api/.tox/func/bin/functest-run-suite", line 8, in <module>
    sys.exit(main())
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 162, in main
    bundle=args.bundle)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 107, in func_test_runner
    run_env_deployment(env_deployment, keep_model=preserve_model)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 58, in run_env_deployment
    config_steps.get(deployment.model_alias, []))
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/configure.py", line 48, in configure
    run_configure_list(functions)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/charm_lifecycle/configure.py", line 37, in run_configure_list
    utils.get_class(func)()
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/charm_tests/neutron/setup.py", line 89, in basic_overcloud_network
    limit_gws=None)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/configure/network.py", line 233, in setup_gateway_ext_port
    limit_gws=limit_gws)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/utilities/openstack.py", line 696, in configure_gateway_ext_port
    mac_address=mac_address)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/openstack/utilities/openstack.py", line 610, in add_interface_to_netplan
    '/home/ubuntu/60-dataport.yaml', user="ubuntu")
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/__init__.py", line 48, in _wrapper
    return run(_run_it())
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/__init__.py", line 36, in run
    return task.result()
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/__init__.py", line 47, in _run_it
    return await f(*args, **kwargs)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/zaza/model.py", line 184, in async_scp_to_unit
    scp_opts=scp_opts)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/juju/unit.py", line 203, in scp_to
    scp_opts=scp_opts)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/juju/machine.py", line 156, in scp_to
    await self._scp(source, destination, scp_opts)
  File "/home/ubuntu/src/neutron-api/.tox/func/lib/python3.6/site-packages/juju/machine.py", line 193, in _scp
    raise JujuError("command failed: %s" % cmd)
juju.errors.JujuError: command failed: ['scp', '-i', '/home/ubuntu/.local/share/juju/ssh/juju_id_rsa', '-o', 'StrictHostKeyChecking=no', '-q', '-B', '/tmp/tmpprv31x4m', '[email protected]:/home/ubuntu/60-dataport.yaml']
ERROR: InvocationError: '/home/ubuntu/src/neutron-api/.tox/func/bin/functest-run-suite --keep-model'
_________________________________________________________________________________________ summary _________________________________________________________________________________________
ERROR:   func: commands failed
$ juju status -m zaza-ef65464e7f82 
Model              Controller            Cloud/Region             Version  SLA          Timestamp
zaza-ef65464e7f82  fnordahl-serverstack  serverstack/serverstack  2.7-rc3  unsupported  17:07:20Z

App                    Version  Status  Scale  Charm                  Store       Rev  OS      Notes
glance                 18.0.0   active      1  glance                 jujucharms  388  ubuntu  
keystone               15.0.0   active      1  keystone               jujucharms  466  ubuntu  
neutron-api            14.0.2   active      1  neutron-api            local         0  ubuntu  
neutron-gateway        14.0.2   active      1  neutron-gateway        jujucharms  390  ubuntu  
neutron-openvswitch    14.0.2   active      2  neutron-openvswitch    jujucharms  386  ubuntu  
nova-cloud-controller  19.0.1   active      1  nova-cloud-controller  jujucharms  456  ubuntu  
nova-compute           19.0.1   active      2  nova-compute           jujucharms  470  ubuntu  
percona-cluster        5.7.20   active      1  percona-cluster        jujucharms  356  ubuntu  
rabbitmq-server        3.6.10   active      1  rabbitmq-server        jujucharms  353  ubuntu  

Unit                      Workload  Agent  Machine  Public address  Ports                       Message
glance/0*                 active    idle   4        10.5.0.52       9292/tcp                    Unit is ready
keystone/0*               active    idle   3        10.5.0.36       5000/tcp                    Unit is ready
neutron-api/0*            active    idle   2        10.5.0.9        9696/tcp                    Unit is ready
neutron-gateway/0*        active    idle   5        10.5.0.35                                   Unit is ready
nova-cloud-controller/0*  active    idle   6        10.5.0.63       8774/tcp,8775/tcp,8778/tcp  Unit is ready
nova-compute/0            active    idle   7        10.5.0.40                                   Unit is ready
  neutron-openvswitch/1   active    idle            10.5.0.40                                   Unit is ready
nova-compute/1*           active    idle   8        10.5.0.41                                   Unit is ready
  neutron-openvswitch/0*  active    idle            10.5.0.41                                   Unit is ready
percona-cluster/0*        active    idle   0        10.5.0.33       3306/tcp                    Unit is ready
rabbitmq-server/0*        active    idle   1        10.5.0.28       5672/tcp                    Unit is ready

Machine  State    DNS        Inst id                               Series  AZ    Message
0        started  10.5.0.33  4e616c63-d5ca-4e6b-a3f7-aaa6a45e66ce  bionic  nova  ACTIVE
1        started  10.5.0.28  68b45b8c-7e11-47e7-a0c5-4d854e251418  bionic  nova  ACTIVE
2        started  10.5.0.9   064d77d7-d5df-4220-b998-7e642a50de33  bionic  nova  ACTIVE
3        started  10.5.0.36  53deff2a-10ab-44b7-aad3-c3c80c5022f3  bionic  nova  ACTIVE
4        started  10.5.0.52  c142e5b3-44d9-464d-85bb-297a82c3a5f6  bionic  nova  ACTIVE
5        started  10.5.0.35  55565113-e575-419c-b627-1698fcd60b03  bionic  nova  ACTIVE
6        started  10.5.0.63  aca6341f-e116-47b5-9b62-f7788a7b5a87  bionic  nova  ACTIVE
7        started  10.5.0.40  0dba9a91-ed27-4607-897a-1a92b7bd96b8  bionic  nova  ACTIVE
8        started  10.5.0.41  774185a6-5c54-4652-a9f0-94f30f462d06  bionic  nova  ACTIVE
$ openstack port show 0befc420-e868-470b-bd10-55a94b4d19c6
+-------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field                   | Value                                                                                                                                                                                        |
+-------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| admin_state_up          | UP                                                                                                                                                                                           |
| allowed_address_pairs   |                                                                                                                                                                                              |
| binding_host_id         | None                                                                                                                                                                                         |
| binding_profile         | None                                                                                                                                                                                         |
| binding_vif_details     | None                                                                                                                                                                                         |
| binding_vif_type        | None                                                                                                                                                                                         |
| binding_vnic_type       | normal                                                                                                                                                                                       |
| created_at              | 2019-11-12T17:05:19Z                                                                                                                                                                         |
| data_plane_status       | None                                                                                                                                                                                         |
| description             |                                                                                                                                                                                              |
| device_id               | 55565113-e575-419c-b627-1698fcd60b03                                                                                                                                                         |
| device_owner            | compute:nova                                                                                                                                                                                 |
| dns_assignment          | fqdn='juju-b953b1-zaza-ef65464e7f82-5.project.serverstack.', hostname='juju-b953b1-zaza-ef65464e7f82-5', ip_address='10.5.0.31'                                                              |
| dns_domain              | None                                                                                                                                                                                         |
| dns_name                | juju-b953b1-zaza-ef65464e7f82-5                                                                                                                                                              |
| extra_dhcp_opts         |                                                                                                                                                                                              |
| fixed_ips               | ip_address='10.5.0.31', subnet_id='dd7add5d-f0ac-4a8f-a379-9744ea170243'                                                                                                                     |
| id                      | 0befc420-e868-470b-bd10-55a94b4d19c6                                                                                                                                                         |
| location                | Munch({'cloud': '', 'region_name': 'serverstack', 'zone': None, 'project': Munch({'id': '2394c8c726444085b1f043336205d8b7', 'name': 'fnordahl', 'domain_id': None, 'domain_name': 'user'})}) |
| mac_address             | fa:16:3e:07:84:75                                                                                                                                                                            |
| name                    | juju-b953b1-zaza-ef65464e7f82-5_ext-port                                                                                                                                                     |
| network_id              | e254b5c3-fe0a-4cda-b84c-5695de7d4033                                                                                                                                                         |
| port_security_enabled   | False                                                                                                                                                                                        |
| project_id              | 2394c8c726444085b1f043336205d8b7                                                                                                                                                             |
| propagate_uplink_status | None                                                                                                                                                                                         |
| qos_policy_id           | None                                                                                                                                                                                         |
| resource_request        | None                                                                                                                                                                                         |
| revision_number         | 10                                                                                                                                                                                           |
| security_group_ids      |                                                                                                                                                                                              |
| status                  | ACTIVE                                                                                                                                                                                       |
| tags                    |                                                                                                                                                                                              |
| trunk_details           | None                                                                                                                                                                                         |
| updated_at              | 2019-11-12T17:05:27Z                                                                                                                                                                         |
+-------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
$ openstack server show 55565113-e575-419c-b627-1698fcd60b03
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Field                       | Value                                                                                                                                                                                                       |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| OS-DCF:diskConfig           | MANUAL                                                                                                                                                                                                      |
| OS-EXT-AZ:availability_zone | nova                                                                                                                                                                                                        |
| OS-EXT-STS:power_state      | Running                                                                                                                                                                                                     |
| OS-EXT-STS:task_state       | None                                                                                                                                                                                                        |
| OS-EXT-STS:vm_state         | active                                                                                                                                                                                                      |
| OS-SRV-USG:launched_at      | 2019-11-12T16:52:16.000000                                                                                                                                                                                  |
| OS-SRV-USG:terminated_at    | None                                                                                                                                                                                                        |
| accessIPv4                  |                                                                                                                                                                                                             |
| accessIPv6                  |                                                                                                                                                                                                             |
| addresses                   | fnordahl_admin_net=10.5.0.35, 10.5.0.31                                                                                                                                                                     |
| config_drive                |                                                                                                                                                                                                             |
| created                     | 2019-11-12T16:52:00Z                                                                                                                                                                                        |
| flavor                      | m1.small (2)                                                                                                                                                                                                |
| hostId                      | 2908efd968104b5c286faac048d29958e691a9e2641b682f61a24f94                                                                                                                                                    |
| id                          | 55565113-e575-419c-b627-1698fcd60b03                                                                                                                                                                        |
| image                       | auto-sync/ubuntu-bionic-daily-amd64-server-20191107-disk1.img (dd4ef883-9da9-42f0-a1ad-dd195cd475b0)                                                                                                        |
| key_name                    | None                                                                                                                                                                                                        |
| name                        | juju-b953b1-zaza-ef65464e7f82-5                                                                                                                                                                             |
| progress                    | 0                                                                                                                                                                                                           |
| project_id                  | 2394c8c726444085b1f043336205d8b7                                                                                                                                                                            |
| properties                  | juju-controller-uuid='5d94dd5e-060f-4eec-832a-e2aeb1d6d4f6', juju-machine-id='zaza-ef65464e7f82-machine-5', juju-model-uuid='407053d5-7e8c-44d1-85b8-7e536fb953b1', juju-units-deployed='neutron-gateway/0' |
| status                      | ACTIVE                                                                                                                                                                                                      |
| updated                     | 2019-11-12T16:52:16Z                                                                                                                                                                                        |
| user_id                     | 4370e565e35a47209c3bed681e01d6c0                                                                                                                                                                            |
| volumes_attached            |                                                                                                                                                                                                             |
+-----------------------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

I guess it should have connected to the main instance address as listed by Juju.

Not sure if this is an intermittent issue or if something regressed through recent network-helper-related changes in this library.

time.sleep roundup on retries

The test_all_clients_authenticated test in vault/tests.py uses a hard-coded sleep time, wrapped in a for loop to retry.

It would be better to use tenacity retries, consistent with the established approach to similar needs in other tests.

for i in range(1, 10):
    try:
        self.assertTrue(client.hvac_client.is_authenticated())
    except hvac.exceptions.InternalServerError:
        time.sleep(2)
    else:
        break
else:
    self.assertTrue(client.hvac_client.is_authenticated())

I feel the same about the is_initialized method in vault/utils.py:

for i in range(1, 10):
    try:
        initialized = client.hvac_client.is_initialized()
    except (ConnectionRefusedError,
            urllib3.exceptions.NewConnectionError,
            urllib3.exceptions.MaxRetryError,
            requests.exceptions.ConnectionError):
        time.sleep(2)
    else:
        break
else:
    raise Exception("Cannot connect")
return initialized

And, same for mysql/tests.py:

while i < 10:
    i += 1
    time.sleep(5)  # give some time to pacemaker to react
    new_crm_master = self.get_crm_master()
    if (new_crm_master and new_crm_master != old_crm_master):
        logging.info(
            "New crm_master unit detected"
            " on {}".format(new_crm_master)
        )
        break
else:
    assert False, "The crm_master didn't change"

IMHO:

When we call time.sleep(), we should acknowledge that we are introducing a possible race condition for our future selves. In the majority of cases where time.sleep() looks attractive, we should seek a different approach.

And if retries are what we're looking for, we should use a consistent and reusable approach rather than constructing a one-off retry loop. That may require further distillation of the test procedure into one or more helper functions, but the tests will be better for it in the end; see the sketch below.
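
For illustration, a minimal tenacity-based sketch of the first loop above, assuming the same client object (the wait and stop values are illustrative, not tuned):

import hvac.exceptions
import tenacity


@tenacity.retry(
    retry=tenacity.retry_if_exception_type(
        hvac.exceptions.InternalServerError),
    wait=tenacity.wait_fixed(2),
    stop=tenacity.stop_after_attempt(10),
    reraise=True)
def assert_authenticated(client):
    # tenacity retries the InternalServerError that vault raises while
    # it is still coming up; any other failure propagates immediately.
    assert client.hvac_client.is_authenticated()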

octavia: High probability of running over quota during test

Test artifacts showing the issue: https://openstack-ci-reports.ubuntu.com/artifacts/test_charm_pipeline_func_full/openstack/charm-octavia/709079/7/5267/index.html

The test has loadbalancer-topology set to 'ACTIVE_STANDBY', which means each load balancer consumes two instances. We also have spare-pool-size set to '2', which means two spare load balancers (consuming four instances) are kept around at any given time. As part of test setup we start two instances to serve payload data. The LBaasV2 test does not clean up after itself, so by the time we exercise the policyd tests the chances are high we are over the quota of '10' running instances.
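
A minimal sketch of the kind of cleanup the test could register. The _create_lb helper and the octavia_client handle are illustrative stand-ins for the existing test machinery, not the actual code:

import unittest


class LBAASv2Test(unittest.TestCase):

    def test_create_loadbalancer(self):
        lb_id = self._create_lb(provider='amphora')  # assumed helper
        # In ACTIVE_STANDBY topology each load balancer consumes two
        # instances; deleting it on cleanup keeps the later policyd
        # tests under the 10-instance quota.
        self.addCleanup(
            self.octavia_client.load_balancer_delete, lb_id)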

B-F series upgrade fails, needs additional reboot before do-release-upgrade

This issue was encountered while working through related issue #284.

It is clear that someone will need to dig in and step through the unique needs of a bionic-to-focal (B-F) series upgrade.

17:19:18  (juju run --machine=2 yes | sudo DEBIAN_FRONTEND=noninteractive apt-get --assume-yes -o "Dpkg::Options::=--force-confdef" -o "Dpkg::Options::=--force-confold" dist-upgrade)
17:19:18 2020-05-18 22:19:18 [INFO] Upgrading 2
17:19:18 2020-05-18 22:19:18 [INFO] About to call '['juju', 'run', '--machine=2', '--timeout=120m', 'yes | sudo DEBIAN_FRONTEND=noninteractive do-release-upgrade -d -f DistUpgradeViewNonInteractive']'
17:19:19 2020-05-18 22:19:19 [WARNING] STDOUT: Checking for a new Ubuntu release
17:19:19 You have not rebooted after updating a package which requires a reboot. Please reboot before upgrading.
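
A hedged sketch of the extra step, reusing the reboot-over-ssh pattern that appears elsewhere in these logs (the helper name is illustrative):

import subprocess

import zaza.model


def reboot_before_release_upgrade(machine_num):
    # do-release-upgrade refuses to run while a reboot is pending, so
    # reboot the machine and wait for the agents to settle first.
    try:
        subprocess.check_call(
            ['juju', 'ssh', str(machine_num),
             'sudo', 'reboot', '&&', 'exit'])
    except subprocess.CalledProcessError:
        # The ssh session dies with the reboot; that is expected.
        pass
    zaza.model.block_until_all_units_idle()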

Test Coverage: nova live migrations

Currently, live migration of instances is not exercised. We need to review this and decide whether to add coverage to zaza-openstack-tests, or to rely on the tempest tests and ensure that failures get reported.
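
If we do add it to zaza-openstack-tests, a rough sketch of what the test body could look like. The instance name, the keystone_session attribute, and the use of resource_reaches_status are assumptions:

import zaza.openstack.utilities.openstack as openstack_utils


def test_live_migration(self):
    nova = openstack_utils.get_nova_session_client(self.keystone_session)
    server = nova.servers.find(name='zaza-test-instance')  # assumed name
    # Let the scheduler pick the target host; block_migration avoids
    # requiring shared storage between the compute nodes.
    server.live_migrate(host=None, block_migration=True)
    openstack_utils.resource_reaches_status(
        nova.servers, server.id, expected_status='ACTIVE',
        msg='live migrated instance')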

Intermittent Traceback in `Configuring overcloud network` stage

Maybe we need retry logic here?
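
If so, a sketch of what that could look like, with the exception type taken from the traceback below (tenacity assumed available, as elsewhere in the tree):

import tenacity
from keystoneauth1.exceptions.connection import ConnectFailure


@tenacity.retry(
    retry=tenacity.retry_if_exception_type(ConnectFailure),
    wait=tenacity.wait_exponential(multiplier=1, max=60),
    stop=tenacity.stop_after_attempt(8),
    reraise=True)
def list_networks_with_retry(neutron_client, net_name):
    # A dropped connection surfaces as keystoneauth1 ConnectFailure;
    # retry with backoff before giving up on the whole configure step.
    return neutron_client.list_networks(name=net_name)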

2018-08-05 20:59:33 [INFO] Configuring overcloud network
Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 372, in _make_request
    httplib_response = conn.getresponse(buffering=True)
TypeError: getresponse() got an unexpected keyword argument 'buffering'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
    body=body, headers=headers)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 374, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.5/http/client.py", line 1197, in getresponse
    response.begin()
  File "/usr/lib/python3.5/http/client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.5/http/client.py", line 266, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response   

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/adapters.py", line 370, in send
    timeout=timeout
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 597, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py", line 245, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/packages/six.py", line 309, in reraise
    raise value.with_traceback(tb)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
    body=body, headers=headers)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 374, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.5/http/client.py", line 1197, in getresponse
    response.begin()
  File "/usr/lib/python3.5/http/client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.5/http/client.py", line 266, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
requests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 903, in _send_request
    resp = self.session.request(method, url, **kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/sessions.py", line 464, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/sessions.py", line 576, in send
    r = adapter.send(request, **kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/requests/adapters.py", line 415, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File ".tox/func/bin/functest-run-suite", line 11, in <module>
    sys.exit(main())
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 125, in main
    bundle=args.bundle)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 57, in func_test_runner
    configure.configure(model_name, test_config['configure'])
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/charm_lifecycle/configure.py", line 31, in configure
    run_configure_list(functions)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/charm_lifecycle/configure.py", line 21, in run_configure_list
    utils.get_class(func)()
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/charm_tests/dragent/configure.py", line 73, in setup
    network.setup_sdn(network_config, keystone_session=keystone_session)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/configure/network.py", line 115, in setup_sdn
    network_config["external_net_name"])
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/zaza/utilities/openstack.py", line 510, in create_external_network
    networks = neutron_client.list_networks(name=net_name)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 809, in list_networks
    **_params)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 369, in list
    for r in self._pagination(collection, path, **params):
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 384, in _pagination
    res = self.get(path, params=params)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 354, in get
    headers=headers, params=params)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 331, in retry_request
    headers=headers, params=params)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/v2_0/client.py", line 282, in do_request
    headers=headers)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/client.py", line 343, in do_request
    return self.request(url, method, **kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/neutronclient/client.py", line 331, in request
    resp = super(SessionClient, self).request(*args, **kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/keystoneauth1/adapter.py", line 213, in request
    return self.session.request(url, method, **kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 814, in request
    resp = send(**kwargs)
  File "/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 919, in _send_request
    raise exceptions.ConnectFailure(msg)
keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://10.5.0.16:9696/v2.0/networks?name=ext_net: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
ERROR: InvocationError: '/home/ubuntu/src/neutron-dynamic-routing/build/builds/neutron-dynamic-routing/.tox/func/bin/functest-run-suite --keep-model'
_____________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR:   func: commands failed

The `zaza.openstack.charm_tests.octavia.setup.configure_octavia` job silently fails when no applications are found

If uuids is empty, the job skips creating/attaching ports and goes straight on to charm configuration.

This appears successful, but in reality there will be no connectivity. This bug exists due to another issue, but it would be good to stop right there, making it more obvious to whoever diagnoses the later failures what has actually happened.

KeyError: getting subordinate charm after pause/resume - change in libjuju / zaza

[charm-trilio-data-mover] KeyError: 'trilio-data-mover/0' during "ERROR: test_pause_resume (zaza.openstack.charm_tests.trilio.tests.TrilioDataMoverTest)"

It looks like something may have changed in libjuju / zaza around being able to get subordinates by key ... it obviously used to work, so something strange is going on:

https://openstack-ci-reports.ubuntu.com/artifacts/test_charm_pipeline_func_smoke/openstack/charm-trilio-data-mover/732679/1/16935/test_charm_func_smoke_17759/func.txt

https://review.opendev.org/#/c/732679/

2020-06-02 20:05:42 [INFO] ERROR: test_pause_resume (zaza.openstack.charm_tests.trilio.tests.TrilioDataMoverTest)
2020-06-02 20:05:42 [INFO] Run pause and resume tests.
2020-06-02 20:05:42 [INFO] ----------------------------------------------------------------------
2020-06-02 20:05:42 [INFO] Traceback (most recent call last):
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/trilio/tests.py", line 295, in test_pause_resume
2020-06-02 20:05:42 [INFO]     with self.pause_resume(self.services, pgrep_full=False):
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/contextlib.py", line 59, in __enter__
2020-06-02 20:05:42 [INFO]     return next(self.gen)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/test_utils.py", line 352, in pause_resume
2020-06-02 20:05:42 [INFO]     model_name=self.model_name)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 48, in _wrapper
2020-06-02 20:05:42 [INFO]     return run(_run_it())
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 36, in run
2020-06-02 20:05:42 [INFO]     return task.result()
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
2020-06-02 20:05:42 [INFO]     raise self._exception
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
2020-06-02 20:05:42 [INFO]     result = coro.send(None)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/__init__.py", line 47, in _run_it
2020-06-02 20:05:42 [INFO]     return await f(*args, **kwargs)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/model.py", line 1480, in async_block_until_unit_wl_status
2020-06-02 20:05:42 [INFO]     await async_block_until(_unit_status, timeout=timeout)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/model.py", line 1162, in async_block_until
2020-06-02 20:05:42 [INFO]     await asyncio.wait_for(_block(), timeout, loop=loop)
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/asyncio/tasks.py", line 392, in wait_for
2020-06-02 20:05:42 [INFO]     return fut.result()
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
2020-06-02 20:05:42 [INFO]     raise self._exception
2020-06-02 20:05:42 [INFO]   File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
2020-06-02 20:05:42 [INFO]     result = coro.send(None)
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/model.py", line 1156, in _block
2020-06-02 20:05:42 [INFO]     result = await c()
2020-06-02 20:05:42 [INFO]   File "/tmp/tmp.2ABKxbTwEc/func-smoke/lib/python3.5/site-packages/zaza/model.py", line 1453, in _unit_status
2020-06-02 20:05:42 [INFO]     v = model_status.applications[app]['units'][unit_name][
2020-06-02 20:05:42 [INFO] KeyError: 'trilio-data-mover/0'

TypeError: series_upgrade_non_leaders_first() got an unexpected keyword argument 'pause_non_leader_primary'

07:20:08 machine-32 series upgrade complete
07:20:08 
07:20:08 Upgrade series for machine "32" has successfully completed
07:20:08 2020-04-11 07:20:03 [INFO] Running run_post_upgrade_functions []
07:20:08 2020-04-11 07:20:03 [INFO] Waiting for workload status 'active' on nova-compute/1
07:20:08 2020-04-11 07:20:05 [INFO] Set series on nova-compute to xenial
07:20:08 2020-04-11 07:20:06 [WARNING] About to upgrade mongodb
07:20:08 Traceback (most recent call last):
07:20:08   File "/tmp/tmp.vXwvaGdcuA/mojo-openstack-specs/trusty/osci-mojo/spec/specs/full_stack/next_series_upgrade/mitaka/series_upgrade.py", line 42, in <module>
07:20:08     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
07:20:08   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 94, in test_200_run_series_upgrade
07:20:08     files=self.files,
07:20:08 TypeError: series_upgrade_non_leaders_first() got an unexpected keyword argument 'pause_non_leader_primary'

http://osci:8080/view/MojoMatrix/job/mojo_runner/22110/console

Probably 71b9d5a#diff-572ab3d6fed94d02e79792f13f97053f

Tech-debt: PR #115 uses a copy of wait_for_lb_resource -- needs refactor

Ideally, the refactor will involve using:

def resource_reaches_status(resource,
                            resource_id,
                            expected_status='available',
                            msg='resource',
                            wait_exponential_multiplier=1,
                            wait_iteration_max_time=60,
                            stop_after_attempt=8,
                            ):
    ...

But that'll need a refactor to be able to absorb what wait_for_lb() is doing:

        def wait_for_lb_resource(client, resource_id):
            resp = client.load_balancer_show(resource_id)
            logging.info(resp['provisioning_status'])
            assert resp['provisioning_status'] == 'ACTIVE', (
                'load balancer resource has not reached '
                'expected provisioning status: {}'
                .format(resp))
            return resp
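
A hedged sketch of one direction the refactor could take: parameterize the show call and status key so the Octavia case maps onto the same helper. The op_name and status_key parameters are hypothetical additions to the existing signature:

import logging

import tenacity


def resource_reaches_status(resource,
                            resource_id,
                            expected_status='available',
                            msg='resource',
                            wait_exponential_multiplier=1,
                            wait_iteration_max_time=60,
                            stop_after_attempt=8,
                            op_name='get',
                            status_key='status'):
    # For Octavia: op_name='load_balancer_show',
    # status_key='provisioning_status', expected_status='ACTIVE'.
    @tenacity.retry(
        retry=tenacity.retry_if_exception_type(AssertionError),
        wait=tenacity.wait_exponential(
            multiplier=wait_exponential_multiplier,
            max=wait_iteration_max_time),
        stop=tenacity.stop_after_attempt(stop_after_attempt),
        reraise=True)
    def _check():
        resp = getattr(resource, op_name)(resource_id)
        status = (resp[status_key] if isinstance(resp, dict)
                  else getattr(resp, status_key))
        logging.info(status)
        assert status == expected_status, (
            '{} has not reached expected status: {}'.format(msg, resp))
        return resp

    return _check()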

octavia: Add more resilience/retry handling in the lb-creation code

In one example we can see that, just after the successful creation of a load balancer using one provider driver, the next attempt using a different provider driver failed because the connection was dropped while talking to the API.

Nothing in the crash dump indicates that any services were restarted, VIPs moved, etc., so this appears to be a fluke, and simply a reality of interconnected clustered systems.

The test code should be more resilient and retry under such circumstances.
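
One hedged option is to fold connection failures into the predicate of the tenacity retry that already wraps the polling helper, treating a dropped connection the same as a not-yet-ACTIVE status:

import tenacity
from keystoneauth1.exceptions.connection import ConnectFailure


@tenacity.retry(
    retry=tenacity.retry_if_exception_type(
        (AssertionError, ConnectFailure)),
    wait=tenacity.wait_exponential(multiplier=1, max=60),
    stop=tenacity.stop_after_attempt(10),
    reraise=True)
def wait_for_lb_resource(octavia_show_func, resource_id):
    resp = octavia_show_func(resource_id)
    # Retries on both "not ACTIVE yet" and transient API connection
    # drops like the one in the log below.
    assert resp['provisioning_status'] == 'ACTIVE', (
        'load balancer resource has not reached '
        'expected provisioning status: {}'.format(resp))
    return resp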

2020-07-04 17:15:52 [INFO] Found "This is the default welcome page" in page retrieved through load balancer  (provider="ovn") at "http://172.17.110.224/"
2020-07-04 17:15:52 [INFO] Creating loadbalancer with provider amphora
2020-07-04 17:16:02 [INFO] Awaiting loadbalancer to reach provisioning_status "ACTIVE"
2020-07-04 17:17:33 [INFO] ERROR
2020-07-04 17:17:33 [INFO] ======================================================================
2020-07-04 17:17:33 [INFO] ERROR: test_create_loadbalancer (zaza.openstack.charm_tests.octavia.tests.LBAASv2Test)
2020-07-04 17:17:33 [INFO] Create load balancer.
2020-07-04 17:17:33 [INFO] ----------------------------------------------------------------------
2020-07-04 17:17:33 [INFO] Traceback (most recent call last):
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 677, in urlopen
2020-07-04 17:17:33 [INFO]     chunked=chunked,
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 426, in _make_request
2020-07-04 17:17:33 [INFO]     six.raise_from(e, None)
2020-07-04 17:17:33 [INFO]   File "<string>", line 3, in raise_from
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2020-07-04 17:17:33 [INFO]     httplib_response = conn.getresponse()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 1225, in getresponse
2020-07-04 17:17:33 [INFO]     response.begin()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 307, in begin
2020-07-04 17:17:33 [INFO]     version, status, reason = self._read_status()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 276, in _read_status
2020-07-04 17:17:33 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2020-07-04 17:17:33 [INFO] http.client.RemoteDisconnected: Remote end closed connection without response
2020-07-04 17:17:33 [INFO] During handling of the above exception, another exception occurred:
2020-07-04 17:17:33 [INFO] Traceback (most recent call last):
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/requests/adapters.py", line 449, in send
2020-07-04 17:17:33 [INFO]     timeout=timeout
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 725, in urlopen
2020-07-04 17:17:33 [INFO]     method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/util/retry.py", line 403, in increment
2020-07-04 17:17:33 [INFO]     raise six.reraise(type(error), error, _stacktrace)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/packages/six.py", line 734, in reraise
2020-07-04 17:17:33 [INFO]     raise value.with_traceback(tb)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 677, in urlopen
2020-07-04 17:17:33 [INFO]     chunked=chunked,
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 426, in _make_request
2020-07-04 17:17:33 [INFO]     six.raise_from(e, None)
2020-07-04 17:17:33 [INFO]   File "<string>", line 3, in raise_from
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2020-07-04 17:17:33 [INFO]     httplib_response = conn.getresponse()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 1225, in getresponse
2020-07-04 17:17:33 [INFO]     response.begin()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 307, in begin
2020-07-04 17:17:33 [INFO]     version, status, reason = self._read_status()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/http/client.py", line 276, in _read_status
2020-07-04 17:17:33 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2020-07-04 17:17:33 [INFO] urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2020-07-04 17:17:33 [INFO] During handling of the above exception, another exception occurred:
2020-07-04 17:17:33 [INFO] Traceback (most recent call last):
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 1012, in _send_request
2020-07-04 17:17:33 [INFO]     resp = self.session.request(method, url, **kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/requests/sessions.py", line 530, in request
2020-07-04 17:17:33 [INFO]     resp = self.send(prep, **send_kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/requests/sessions.py", line 643, in send
2020-07-04 17:17:33 [INFO]     r = adapter.send(request, **kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/requests/adapters.py", line 498, in send
2020-07-04 17:17:33 [INFO]     raise ConnectionError(err, request=request)
2020-07-04 17:17:33 [INFO] requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2020-07-04 17:17:33 [INFO] During handling of the above exception, another exception occurred:
2020-07-04 17:17:33 [INFO] Traceback (most recent call last):
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/octavia/tests.py", line 288, in test_create_loadbalancer
2020-07-04 17:17:33 [INFO]     payload_ips)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/octavia/tests.py", line 165, in _create_lb_resources
2020-07-04 17:17:33 [INFO]     octavia_client.load_balancer_show, lb_id)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/tenacity/__init__.py", line 329, in wrapped_f
2020-07-04 17:17:33 [INFO]     return self.call(f, *args, **kw)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/tenacity/__init__.py", line 409, in call
2020-07-04 17:17:33 [INFO]     do = self.iter(retry_state=retry_state)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/tenacity/__init__.py", line 356, in iter
2020-07-04 17:17:33 [INFO]     return fut.result()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 398, in result
2020-07-04 17:17:33 [INFO]     return self.__get_result()
2020-07-04 17:17:33 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 357, in __get_result
2020-07-04 17:17:33 [INFO]     raise self._exception
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/tenacity/__init__.py", line 412, in call
2020-07-04 17:17:33 [INFO]     result = fn(*args, **kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/octavia/tests.py", line 107, in wait_for_lb_resource
2020-07-04 17:17:33 [INFO]     resp = octavia_show_func(resource_id)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/octaviaclient/api/v2/octavia.py", line 31, in wrapper
2020-07-04 17:17:33 [INFO]     response = func(*args, **kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/octaviaclient/api/v2/octavia.py", line 99, in load_balancer_show
2020-07-04 17:17:33 [INFO]     response = self._find(path=const.BASE_LOADBALANCER_URL, value=lb_id)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/osc_lib/api/api.py", line 396, in find
2020-07-04 17:17:33 [INFO]     headers=headers,
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/osc_lib/api/api.py", line 141, in _request
2020-07-04 17:17:33 [INFO]     return session.request(url, method, **kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 921, in request
2020-07-04 17:17:33 [INFO]     resp = send(**kwargs)
2020-07-04 17:17:33 [INFO]   File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 1028, in _send_request
2020-07-04 17:17:33 [INFO]     raise exceptions.ConnectFailure(msg)
2020-07-04 17:17:33 [INFO] keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to https://172.17.110.230:9876/v2.0/lbaas/loadbalancers/f4d8cd85-08c3-4eb6-ab00-a56dfeac7679: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2020-07-04 17:17:33 [INFO] ----------------------------------------------------------------------
2020-07-04 17:17:33 [INFO] Ran 1 test in 370.329s
2020-07-04 17:17:33 [INFO] FAILED
2020-07-04 17:17:33 [INFO]  (errors=1)
Traceback (most recent call last):
  File "/tmp/tmp.K630hNXERG/func/bin/functest-run-suite", line 8, in <module>
    sys.exit(main())
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 204, in main
    force=args.force)
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 141, in func_test_runner
    force=force)
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 88, in run_env_deployment
    test_steps.get(deployment.model_alias, []))
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 117, in test
    run_test_list(tests)
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 111, in run_test_list
    get_test_runners()[runner](testcase, _testcase)
  File "/tmp/tmp.K630hNXERG/func/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 73, in run_unittest
    assert test_result.wasSuccessful(), "Test run failed"
AssertionError: Test run failed
Exception ignored in: <bound method BaseEventLoop.__del__ of <_UnixSelectorEventLoop running=False closed=True debug=False>>
Traceback (most recent call last):
  File "/usr/lib/python3.5/asyncio/base_events.py", line 431, in __del__
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 58, in close
  File "/usr/lib/python3.5/asyncio/unix_events.py", line 139, in remove_signal_handler
  File "/usr/lib/python3.5/signal.py", line 47, in signal
TypeError: signal handler must be signal.SIG_IGN, signal.SIG_DFL, or a callable object
ERROR: InvocationError: '/tmp/tmp.K630hNXERG/func/bin/functest-run-suite --keep-model'
___________________________________ summary ____________________________________
ERROR:   func: commands failed
 ! Functional test failed.

Octavia payload test appears not to work

The Octavia payload test [0] does not appear to work.

At a minimum, we will need to add port 80 access to the default (admin_domain) security group.

The member node will also need to listen on port 80. This may be a greater challenge: either we need to ssh onto the node and install apache, or we need an image with apache already running.

[0] 7140e4e
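
For the security group piece, a hedged sketch using the neutron client (how the right 'default' group is identified here is an assumption):

import zaza.openstack.utilities.openstack as openstack_utils


def allow_http_to_members(keystone_session, project_id):
    neutron = openstack_utils.get_neutron_session_client(keystone_session)
    group = neutron.list_security_groups(
        project_id=project_id, name='default')['security_groups'][0]
    # Open TCP/80 so the load balancer can reach the member payload.
    neutron.create_security_group_rule({
        'security_group_rule': {
            'security_group_id': group['id'],
            'direction': 'ingress',
            'protocol': 'tcp',
            'port_range_min': 80,
            'port_range_max': 80,
        }})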

Hacluster test has unbound variable

See Test artifacts https://openstack-ci-reports.ubuntu.com/artifacts/test_charm_pipeline_func_smoke/openstack/charm-hacluster/734150/1/17094/consoleText.test_charm_func_smoke_17890.txt

2020-06-08 17:19:50 [INFO] ======================================================================
2020-06-08 17:19:50 [INFO] ERROR: test_900_action_cleanup (zaza.openstack.charm_tests.hacluster.tests.HaclusterTest)
2020-06-08 17:19:50 [INFO] The services can be cleaned up.
2020-06-08 17:19:50 [INFO] ----------------------------------------------------------------------
2020-06-08 17:19:50 [INFO] Traceback (most recent call last):
2020-06-08 17:19:50 [INFO] File "/tmp/tmp.msnztYxDUy/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/hacluster/tests.py", line 50, in test_900_action_cleanup
2020-06-08 17:19:50 [INFO] if primary_status["units"][leader].get("subordinates"):
2020-06-08 17:19:50 [INFO] UnboundLocalError: local variable 'primary_status' referenced before assignment
2020-06-08 17:19:50 [INFO] ======================================================================
2020-06-08 17:19:50 [INFO] ERROR: test_910_pause_and_resume (zaza.openstack.charm_tests.hacluster.tests.HaclusterTest)
2020-06-08 17:19:50 [INFO] The services can be paused and resumed.
2020-06-08 17:19:50 [INFO] ----------------------------------------------------------------------
2020-06-08 17:19:50 [INFO] Traceback (most recent call last):
2020-06-08 17:19:50 [INFO] File "/tmp/tmp.msnztYxDUy/func-smoke/lib/python3.5/site-packages/zaza/openstack/charm_tests/hacluster/tests.py", line 89, in test_910_pause_and_resume
2020-06-08 17:19:50 [INFO] if primary_status["units"][leader].get("subordinates"):
2020-06-08 17:19:50 [INFO] UnboundLocalError: local variable 'primary_status' referenced before assignment
2020-06-08 17:19:50 [INFO] ----------------------------------------------------------------------
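
The fix is presumably to make the lookup that populates primary_status total: fail fast with a useful message when nothing matches instead of falling through to the reference. A sketch of the pattern, with the surrounding loop assumed rather than taken from the actual test code (model_status follows libjuju's FullStatus shape):

def get_application_status(model_status, application):
    # Return the status entry for the application, or fail fast
    # instead of referencing primary_status before assignment.
    for app, app_status in model_status.applications.items():
        if app == application:
            return app_status
    raise KeyError(
        'Could not find status for application {}'.format(application))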

[Series upgrade][trusty-mitaka] heat/0 failed due to sharedb relation changed WHILST mysql was still in paused state (test had continued)

Essentially, it looks like the test had continued (machine 22, which hosts mysql, was upgraded), but then the heat/0 unit failed when trying to connect to the mysql database:

05:42:19 2020-05-02 05:39:01 [INFO] Reboot mysql/1
05:42:19 Failed to start reboot.target: Connection timed out
05:42:19 See system logs and 'systemctl status reboot.target' for details.
05:42:19 2020-05-02 05:39:26 [INFO] Waiting for workload status 'blocked' on mysql/1
05:42:19 2020-05-02 05:39:52 [INFO] Waiting for model idleness
05:42:19 2020-05-02 05:39:53 [INFO] Set origin on mysql
05:42:19 2020-05-02 05:39:53 [INFO] Set origin on mysql to source
05:42:19 2020-05-02 05:39:55 [INFO] Complete series upgrade on 22
05:42:19 machine-22 complete phase started
05:42:19 machine-22 started unit agents after series upgrade
05:42:19 mysql/1 post-series-upgrade hook running
05:42:19 mysql/1 post-series-upgrade completed
05:42:19 mysql-hacluster/0 post-series-upgrade hook running
05:42:19 mysql-hacluster/0 post-series-upgrade completed
05:42:19 
05:42:19 Upgrade series for machine "22" has successfully completed
05:42:19 Traceback (most recent call last):
05:42:19   File "/tmp/tmp.eZaH0SXLjw/mojo-openstack-specs/trusty/osci-mojo/spec/specs/full_stack/next_series_upgrade/mitaka/series_upgrade.py", line 42, in <module>
05:42:19     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 94, in test_200_run_series_upgrade
05:42:19     files=self.files,
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/series_upgrade.py", line 322, in series_upgrade_application
05:42:19     post_upgrade_functions=post_upgrade_functions)
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/series_upgrade.py", line 578, in series_upgrade
05:42:19     model.block_until_all_units_idle()
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 48, in _wrapper
05:42:19     return run(_run_it())
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 36, in run
05:42:19     return task.result()
05:42:19   File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
05:42:19     raise self._exception
05:42:19   File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
05:42:19     result = coro.send(None)
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 47, in _run_it
05:42:19     return await f(*args, **kwargs)
05:42:19   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/model.py", line 1040, in async_block_until_all_units_idle
05:42:19     raise UnitError(errored_units)
05:42:19 zaza.model.UnitError: Units heat/0 in error state

The smoking gun may be:

05:42:19 2020-05-02 05:39:01 [INFO] Reboot mysql/1
05:42:19 Failed to start reboot.target: Connection timed out

heat/0 failed with

2020-05-02 05:42:14 DEBUG shared-db-relation-changed ERROR: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on '172.17.107.231' ([Errno 113] No route to host)")

mysql/1 relevant info:

mysql/0                    maintenance  idle       21       172.17.107.20   3306/tcp                                 Paused. Use 'resume' action to resume normal service.
  mysql-hacluster/1        maintenance  idle                172.17.107.20                                            Paused. Use 'resume' action to resume normal service.
mysql/1*                   active       executing  22       172.17.107.45   3306/tcp                                 (config-changed) Unit is ready
  mysql-hacluster/0*       blocked      executing           172.17.107.45                                            (post-series-upgrade) Resource: res_mysql_monitor not running
mysql/2                    maintenance  idle       23       172.17.107.64   3306/tcp                                 Paused. Use 'resume' action to resume normal service.
  mysql-hacluster/2        maintenance  idle                172.17.107.64                                            Paused. Use 'resume' action to resume normal service.

Strategy:

Trace whether there is a race or error in how we verify that the mysql post-series upgrade (or the series upgrade in general) has completed, i.e. whether the verification wrongly assumes the upgrade is complete when it is not.
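
As part of that, one concrete guard, sketched with existing zaza model helpers; whether it belongs in series_upgrade_application or the test itself is an open question:

import zaza.model


def block_until_application_ready(application='mysql'):
    # Guard against racing ahead of a still-paused cluster: require
    # every unit of the application to report 'active' and the model
    # to go idle before the next series upgrade step proceeds.
    for unit in zaza.model.get_units(application):
        zaza.model.block_until_unit_wl_status(unit.entity_id, 'active')
    zaza.model.block_until_all_units_idle()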

add_interface_to_netplan fails with KeyError

add_interface_to_netplan sometimes fails with:

20:03:41 2019-09-26 20:03:32 [INFO] Attaching additional port to instance, connected to net id: e6cefd93-4682-464c-a030-79af684df076
20:03:41 Traceback (most recent call last):
20:03:41   File "/tmp/tmp.Ym6gVKaGPT/mojo-openstack-specs/bionic/osci-mojo/spec/specs/full_stack/next_ha_vrrp/stein/network_setup.py", line 20, in <module>
20:03:41     cacert=cacert))
20:03:41   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/configure/network.py", line 290, in run_from_cli
20:03:41     keystone_session=undercloud_ks_sess)
20:03:41   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/configure/network.py", line 231, in setup_gateway_ext_port
20:03:41     add_dataport_to_netplan=add_dataport_to_netplan)
20:03:41   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/openstack.py", line 613, in configure_gateway_ext_port
20:03:41     mac_address=port['mac_address'],
20:03:41 KeyError: 'mac_address'
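The mac_address key can be missing when the port is read back before Neutron has finished populating it. A minimal sketch of a defensive re-fetch, assuming a python-neutronclient Client; the retry parameters are illustrative:

import tenacity


@tenacity.retry(retry=tenacity.retry_if_exception_type(KeyError),
                wait=tenacity.wait_fixed(5),
                stop=tenacity.stop_after_attempt(12))
def get_port_mac(neutron_client, port_id):
    # Re-read the port until Neutron reports a mac_address for it.
    return neutron_client.show_port(port_id)['port']['mac_address']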

series upgrade hangs for tempest charm

The series upgrade script appears to hang when upgrading the tempest charm. The upgrade to bionic appears to work, but the upgrade is never marked as complete (via "juju upgrade-series X complete").

03:17:14 2019-10-13 03:16:50 [INFO] Reboot tempest/0
03:17:14 Connection to 172.17.107.35 closed by remote host.
03:17:14 2019-10-13 03:16:52 [INFO] Command '['juju', 'ssh', 'tempest/0', 'sudo', 'reboot', '&&', 'exit']' returned non-zero exit status 255
03:17:14 2019-10-13 03:16:52 [INFO] Waiting for workload status 'blocked' on tempest/0
03:17:14 2019-10-13 03:17:10 [INFO] Waiting for model idleness
03:17:14 2019-10-13 03:17:11 [INFO] Set origin on tempest
03:17:14 2019-10-13 03:17:11 [INFO] Set origin on tempest to openstack-origin
03:17:14 Traceback (most recent call last):
03:17:14   File "/tmp/tmp.SonXZG2Dxz/mojo-openstack-specs/xenial/osci-mojo/spec/specs/full_stack/next_series_upgrade/queens/series_upgrade.py", line 42, in <module>
03:17:14     sys.exit(series_upgrade_test.test_200_run_series_upgrade())
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/charm_tests/series_upgrade/tests.py", line 101, in test_200_run_series_upgrade
03:17:14     files=self.files)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 313, in series_upgrade_application
03:17:14     files=files)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 392, in series_upgrade
03:17:14     set_origin(application, origin)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/openstack/utilities/generic.py", line 421, in set_origin
03:17:14     model.set_application_config(application, {origin: pocket})
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 48, in _wrapper
03:17:14     return run(_run_it())
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 36, in run
03:17:14     return task.result()
03:17:14   File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
03:17:14     raise self._exception
03:17:14   File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
03:17:14     result = coro.send(None)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/__init__.py", line 47, in _run_it
03:17:14     return await f(*args, **kwargs)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/zaza/model.py", line 498, in async_set_application_config
03:17:14     .set_config(configuration))
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/application.py", line 383, in set_config
03:17:14     return await app_facade.Set(application=self.name, options=config)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/facade.py", line 472, in wrapper
03:17:14     reply = await f(*args, **kwargs)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/_client8.py", line 1124, in Set
03:17:14     reply = await self.rpc(msg)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/facade.py", line 608, in rpc
03:17:14     result = await self.connection.rpc(msg, encoder=TypeEncoder)
03:17:14   File "/var/lib/jenkins/tools/0/charm-test-infra/.tox/clients/lib/python3.5/site-packages/juju/client/connection.py", line 455, in rpc
03:17:14     raise errors.JujuAPIError(result)
03:17:14 juju.errors.JujuAPIError: unknown option "openstack-origin"
03:17:14 
03:17:14 
03:17:14  change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:07:33 [DEBUG] deployer.env:  Delta unit: nova-compute/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:01:47.632721293Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: nova-compute/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:01:47.632721293Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta application: nova-cloud-controller change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:50.371163822Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: nova-cloud-controller/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:50.371163822Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: nova-cloud-controller/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:50.371163822Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: nova-compute/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:02:38.338923805Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: neutron-gateway/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:00:04.230844154Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:07:52 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:07:53 [DEBUG] deployer.env:  Delta unit: nova-compute/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:03:18.074289245Z', 'version': ''}
03:17:14 2019-10-12 19:08:02 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:08:02 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:08:02 [DEBUG] deployer.env:  Delta unit: nova-compute/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:01:47.632721293Z', 'version': ''}
03:17:14 2019-10-12 19:08:10 [DEBUG] deployer.env:  Delta unit: neutron-gateway/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:00:04.230844154Z', 'version': ''}
03:17:14 2019-10-12 19:08:10 [DEBUG] deployer.env:  Delta unit: nova-compute/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:02:38.338923805Z', 'version': ''}
03:17:14 2019-10-12 19:08:19 [DEBUG] deployer.env:  Delta unit: nova-compute/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:03:18.074289245Z', 'version': ''}
03:17:14 2019-10-12 19:08:19 [DEBUG] deployer.env:  Delta unit: nova-compute/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:01:47.632721293Z', 'version': ''}
03:17:14 2019-10-12 19:09:08 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:09:08 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:09:41 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:09:41 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:10:24 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:10:24 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:10:47 [DEBUG] deployer.env:  Delta application: designate change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:10:47 [DEBUG] deployer.env:  Delta unit: designate/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:10:43.430562369Z', 'version': ''}
03:17:14 2019-10-12 19:10:47 [DEBUG] deployer.env:  Delta unit: designate/1 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:10:43.430562369Z', 'version': ''}
03:17:14 2019-10-12 19:11:01 [DEBUG] deployer.env:  Delta unit: designate/2 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:06:53.562023255Z', 'version': ''}
03:17:14 2019-10-12 19:11:01 [DEBUG] deployer.env:  Delta unit: designate/0 change:{'current': 'active', 'message': 'Unit is ready', 'since': '2019-10-12T19:07:06.510765898Z', 'version': ''}
03:17:14 2019-10-12 19:11:01 [INFO] deployer.cli: Deployment complete in 1718.59 seconds
03:17:14 + [[ 1 != \0 ]]
03:17:14 + echo ' ! Mojo run manifest failed.'
03:17:14  ! Mojo run manifest failed.
03:17:14 + touch /var/lib/jenkins/workspace/mojo_runner/fyi-.-mojo-run.failed
03:17:14 + set +x
03:17:14 ====>  end of mojo's run  <=======================
03:17:17 Model           Controller      Cloud/Region             Version  SLA          Timestamp
03:17:17 auto-osci-sv07  auto-osci-sv07  serverstack/serverstack  2.6.10   unsupported  03:17:15Z
03:17:17 
03:17:17 App                    Version      Status   Scale  Charm                  Store  Rev  OS      Notes
03:17:17 aodh                   6.0.1        active       1  aodh                   local    0  ubuntu  
03:17:17 ceilometer             10.0.1       active       1  ceilometer             local    0  ubuntu  
03:17:17 ceilometer-agent       10.0.1       active       3  ceilometer-agent       local    0  ubuntu  
03:17:17 ceph-mon               12.2.12      active       3  ceph-mon               local    0  ubuntu  
03:17:17 ceph-osd               12.2.12      active       3  ceph-osd               local   15  ubuntu  
03:17:17 cinder                 12.0.7       active       1  cinder                 local  136  ubuntu  
03:17:17 cinder-ceph            12.0.7       active       1  cinder-ceph            local    2  ubuntu  
03:17:17 designate              6.0.1        active       3  designate              local    0  ubuntu  
03:17:17 designate-bind         9.11.3+dfsg  active       3  designate-bind         local    0  ubuntu  
03:17:17 designate-hacluster                 active       3  hacluster              local    0  ubuntu  
03:17:17 glance                 16.0.1       active       1  glance                 local  150  ubuntu  
03:17:17 gnocchi                4.2.5        active       1  gnocchi                local    0  ubuntu  
03:17:17 heat                   10.0.2       active       1  heat                   local   12  ubuntu  
03:17:17 keystone               13.0.2       active       1  keystone               local    0  ubuntu  
03:17:17 memcached                           active       1  memcached              local   69  ubuntu  
03:17:17 mysql                  5.7.20       active       1  percona-cluster        local   45  ubuntu  
03:17:17 neutron-api            12.0.6       active       1  neutron-api            local    0  ubuntu  
03:17:17 neutron-gateway        12.0.6       active       1  neutron-gateway        local   64  ubuntu  
03:17:17 neutron-openvswitch    12.0.6       active       3  neutron-openvswitch    local    0  ubuntu  
03:17:17 nova-cloud-controller  17.0.10      active       1  nova-cloud-controller  local  501  ubuntu  
03:17:17 nova-compute           17.0.10      active       3  nova-compute           local  133  ubuntu  
03:17:17 openstack-dashboard    13.0.2       active       1  openstack-dashboard    local   32  ubuntu  
03:17:17 rabbitmq-server        3.6.10       active       1  rabbitmq-server        local  150  ubuntu  
03:17:17 swift-proxy            2.17.1       active       1  swift-proxy            local  147  ubuntu  
03:17:17 swift-storage-z1       2.17.1       active       1  swift-storage          local   90  ubuntu  
03:17:17 swift-storage-z2       2.17.1       active       1  swift-storage          local   91  ubuntu  
03:17:17 swift-storage-z3       2.17.1       active       1  swift-storage          local   92  ubuntu  
03:17:17 tempest                2.7.4        blocked      1  tempest                local    0  ubuntu  
03:17:17 
03:17:17 Unit                      Workload  Agent  Machine  Public address  Ports              Message
03:17:17 aodh/0*                   active    idle   0        172.17.107.5    8042/tcp           Unit is ready
03:17:17 ceilometer/0*             active    idle   1        172.17.107.27                      Unit is ready
03:17:17 ceph-mon/0*               active    idle   2        172.17.107.6                       Unit is ready and clustered
03:17:17 ceph-mon/1                active    idle   3        172.17.107.20                      Unit is ready and clustered
03:17:17 ceph-mon/2                active    idle   4        172.17.107.14                      Unit is ready and clustered
03:17:17 ceph-osd/0                active    idle   5        172.17.107.31                      Unit is ready (1 OSD)
03:17:17 ceph-osd/1                active    idle   6        172.17.107.30                      Unit is ready (1 OSD)
03:17:17 ceph-osd/2*               active    idle   7        172.17.107.32                      Unit is ready (1 OSD)
03:17:17 cinder/0*                 active    idle   8        172.17.107.10   8776/tcp           Unit is ready
03:17:17   cinder-ceph/0*          active    idle            172.17.107.10                      Unit is ready
03:17:17 designate-bind/0*         active    idle   12       172.17.107.37                      Unit is ready
03:17:17 designate-bind/1          active    idle   13       172.17.107.9                       Unit is ready
03:17:17 designate-bind/2          active    idle   14       172.17.107.12                      Unit is ready
03:17:17 designate/0               active    idle   9        172.17.107.18   9001/tcp           Unit is ready
03:17:17   designate-hacluster/2   active    idle            172.17.107.18                      Unit is ready and clustered
03:17:17 designate/1*              active    idle   10       172.17.107.3    9001/tcp           Unit is ready
03:17:17   designate-hacluster/1*  active    idle            172.17.107.3                       Unit is ready and clustered
03:17:17 designate/2               active    idle   11       172.17.107.11   9001/tcp           Unit is ready
03:17:17   designate-hacluster/0   active    idle            172.17.107.11                      Unit is ready and clustered
03:17:17 glance/0*                 active    idle   15       172.17.107.7    9292/tcp           Unit is ready
03:17:17 gnocchi/0*                active    idle   16       172.17.107.4    8041/tcp           Unit is ready
03:17:17 heat/0*                   active    idle   17       172.17.107.8    8000/tcp,8004/tcp  Unit is ready
03:17:17 keystone/0*               active    idle   18       172.17.107.22   5000/tcp           Unit is ready
03:17:17 memcached/0*              active    idle   19       172.17.107.34   11211/tcp          Unit is ready
03:17:17 mysql/0*                  active    idle   20       172.17.107.47   3306/tcp           Unit is ready
03:17:17 neutron-api/0*            active    idle   21       172.17.107.24   9696/tcp           Unit is ready
03:17:17 neutron-gateway/0*        active    idle   22       172.17.107.25                      Unit is ready
03:17:17 nova-cloud-controller/0*  active    idle   23       172.17.107.43   8774/tcp,8778/tcp  Unit is ready
03:17:17 nova-compute/0            active    idle   24       172.17.107.42                      Unit is ready
03:17:17   ceilometer-agent/1*     active    idle            172.17.107.42                      Unit is ready
03:17:17   neutron-openvswitch/1   active    idle            172.17.107.42                      Unit is ready
03:17:17 nova-compute/1            active    idle   25       172.17.107.28                      Unit is ready
03:17:17   ceilometer-agent/2      active    idle            172.17.107.28                      Unit is ready
03:17:17   neutron-openvswitch/2   active    idle            172.17.107.28                      Unit is ready
03:17:17 nova-compute/2*           active    idle   26       172.17.107.16                      Unit is ready
03:17:17   ceilometer-agent/0      active    idle            172.17.107.16                      Unit is ready
03:17:17   neutron-openvswitch/0*  active    idle            172.17.107.16                      Unit is ready
03:17:17 openstack-dashboard/0*    active    idle   27       172.17.107.50   80/tcp,443/tcp     Unit is ready
03:17:17 rabbitmq-server/0*        active    idle   28       172.17.107.15   5672/tcp           Unit is ready, Run complete-cluster-series-upgrade when the cluster has completed its upgrade.
03:17:17 swift-proxy/0*            active    idle   29       172.17.107.13   8080/tcp           Unit is ready
03:17:17 swift-storage-z1/0*       active    idle   30       172.17.107.41                      Unit is ready
03:17:17 swift-storage-z2/0*       active    idle   31       172.17.107.46                      Unit is ready
03:17:17 swift-storage-z3/0*       active    idle   32       172.17.107.29                      Unit is ready
03:17:17 tempest/0*                blocked   idle   33       172.17.107.35                      Ready for do-release-upgrade and reboot. Set complete when finished.
03:17:17 
03:17:17 Machine  State    DNS            Inst id                               Series  AZ    Message
03:17:17 0        started  172.17.107.5   fa80e484-90a6-4e91-a32d-68bcc1eac787  bionic  nova  ACTIVE
03:17:17 1        started  172.17.107.27  3c028c74-77c0-4f20-b97b-db3e03959f43  bionic  nova  ACTIVE
03:17:17 2        started  172.17.107.6   48d937ba-3721-453e-ae3f-e48ef9a8942b  bionic  nova  ACTIVE
03:17:17 3        started  172.17.107.20  8e32b176-d29e-406d-8580-b2e8772cf2a9  bionic  nova  ACTIVE
03:17:17 4        started  172.17.107.14  b8d4da7a-6177-4148-831b-173c96b07f25  bionic  nova  ACTIVE
03:17:17 5        started  172.17.107.31  e267f0b4-2e46-4e1f-964e-7b4dd3e27771  bionic  nova  ACTIVE
03:17:17 6        started  172.17.107.30  da52eaf0-3a30-46bc-a199-979d0bfb8e44  bionic  nova  ACTIVE
03:17:17 7        started  172.17.107.32  983de59c-9552-423b-b312-4bb2fcfd5f07  bionic  nova  ACTIVE
03:17:17 8        started  172.17.107.10  197d18f1-e9e7-44e1-b788-935cd255e26c  bionic  nova  ACTIVE
03:17:17 9        started  172.17.107.18  1b4d983c-5a77-4e89-b0d2-0b1fd32fc1fd  bionic  nova  ACTIVE
03:17:17 10       started  172.17.107.3   8fafb63f-5bbd-42b9-8c13-20dcd3a4d5df  bionic  nova  ACTIVE
03:17:17 11       started  172.17.107.11  cd414968-0c85-4a19-9329-13612a831c6d  bionic  nova  ACTIVE
03:17:17 12       started  172.17.107.37  38b1f9ad-a797-4bc6-878d-bb4755ac1398  bionic  nova  ACTIVE
03:17:17 13       started  172.17.107.9   35eed2bd-a95e-4b84-89d0-0bba924fbbb3  bionic  nova  ACTIVE
03:17:17 14       started  172.17.107.12  9c3be1dc-ab17-4647-b017-6b70e05f070f  bionic  nova  ACTIVE
03:17:17 15       started  172.17.107.7   ff248c14-ebee-4eec-a9cd-0150d2fff244  bionic  nova  ACTIVE
03:17:17 16       started  172.17.107.4   a760033b-c568-497a-947f-c3dc6b61eeac  bionic  nova  ACTIVE
03:17:17 17       started  172.17.107.8   1d8c1ad4-dcbb-48b3-94e3-b784115483e1  bionic  nova  ACTIVE
03:17:17 18       started  172.17.107.22  dd36f3dc-4817-4fed-8253-4c43e75472d4  bionic  nova  ACTIVE
03:17:17 19       started  172.17.107.34  cace71a7-f9c7-4f72-8d2e-fcadfd7c522a  bionic  nova  ACTIVE
03:17:17 20       started  172.17.107.47  2e9c7a5c-4440-4ab9-ae30-96d68c05fb35  bionic  nova  ACTIVE
03:17:17 21       started  172.17.107.24  f51fcee3-d2cc-4103-97f0-b9a8ed3d3d33  bionic  nova  ACTIVE
03:17:17 22       started  172.17.107.25  ec446168-8e95-40c3-9bb9-1cb29901ec14  bionic  nova  ACTIVE
03:17:17 23       started  172.17.107.43  93acf896-444c-4cdd-a6ac-6b6075b7b631  bionic  nova  ACTIVE
03:17:17 24       started  172.17.107.42  cf349c1f-f633-48b0-8bcc-edf4af291a68  bionic  nova  ACTIVE
03:17:17 25       started  172.17.107.28  6038a099-9f31-4d0d-afe4-78a426434117  bionic  nova  ACTIVE
03:17:17 26       started  172.17.107.16  cdf9fafe-0960-4afb-9dd0-f8596c2912bf  bionic  nova  ACTIVE
03:17:17 27       started  172.17.107.50  c65b9d71-016c-4d37-89c9-2022da193882  bionic  nova  ACTIVE
03:17:17 28       started  172.17.107.15  64f6cb89-1998-46d2-a990-5c27a8437eab  bionic  nova  ACTIVE
03:17:17 29       started  172.17.107.13  22bad8c5-5c1b-49d4-acc0-44343d3d4de2  bionic  nova  ACTIVE
03:17:17 30       started  172.17.107.41  7ac2157b-8cbe-410e-97fd-240956da3a0b  bionic  nova  ACTIVE
03:17:17 31       started  172.17.107.46  714b89de-3d04-47bb-9649-a12beb200006  bionic  nova  ACTIVE
03:17:17 32       started  172.17.107.29  729c50d0-a54e-4867-93f8-ee850bd62898  bionic  nova  ACTIVE
03:17:17 33       started  172.17.107.35  92951744-97fd-4050-be1b-c724eb7b3065  xenial  nova  ACTIVE
03:17:17 
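The immediate failure is the JujuAPIError at the end of the traceback: the tempest charm has no openstack-origin option, so the series upgrade helper should not try to set it unconditionally. A minimal sketch of a guard, assuming zaza's get_application_config/set_application_config helpers; treat the exact call pattern as an assumption:

import logging

import zaza.model as model


def set_origin_if_supported(application, origin='openstack-origin',
                            pocket='distro'):
    # Some charms (e.g. tempest) expose no origin option at all; skip
    # them instead of letting Juju reject the config change.
    config = model.get_application_config(application)
    if origin not in config:
        logging.warning('%s has no %s option, skipping', application, origin)
        return
    model.set_application_config(application, {origin: pocket})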

Post series upgrade instance launch fails

The simple_os_checks.py script is run before and after series upgrade. It fails when run after the series upgrade.

The script is configured to launch two guests. But these test guests are not cleaned up, so when the script runs a second time it tries to bring up two additional guests; there is not enough compute resource to support that, and the new guest goes into an ERROR state.

 openstack server list                                                                           
+--------------------------------------+--------------------+---------+--------------------------------------+--------+----------+                                                                            
| ID                                   | Name               | Status  | Networks                             | Image  | Flavor   |                                                                            
+--------------------------------------+--------------------+---------+--------------------------------------+--------+----------+                                                                            
| 9a657067-939d-444a-aae6-4e94dc2dc153 | mojo20191013071507 | ERROR   |                                      | trusty | m1.small |                                                                            
| 7cf731c5-3eb9-4f96-90ba-5aa9010a51a4 | mojo20191013071340 | ACTIVE  | private=192.168.21.5, 172.17.107.216 | trusty | m1.small |                                                                            
| 39527ca4-1302-4b40-b9bf-1b3dd9f2755f | 20191012191842     | SHUTOFF | private=192.168.21.4, 172.17.107.212 | trusty | m1.small |                                                                            
| fbf1e840-db86-440c-9a65-bdd7845b5b4f | 20191012191727     | SHUTOFF | private=192.168.21.9, 172.17.107.200 | trusty | m1.small |                                                                            
+--------------------------------------+--------------------+---------+--------------------------------------+--------+----------+       

Deleting old guests and rerunning the script succeeds.
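A minimal sketch of the cleanup pass the script could run before launching, assuming a novaclient client object; the helper name and prefix are illustrative:

def cleanup_test_guests(nova_client, prefix='mojo'):
    # Remove guests left over from a previous run so the relaunch does
    # not exhaust the compute capacity of the cloud.
    for server in nova_client.servers.list():
        if server.name.startswith(prefix):
            nova_client.servers.delete(server.id)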

test_manila_share does not wait until a share becomes available

It appears that the test_manila_share test case does not wait until a share becomes available, which results in failures like this:

https://review.opendev.org/#/c/730835/

2020-06-01 11:09:59 [INFO] test_manila_share (zaza.openstack.charm_tests.manila_ganesha.tests.ManilaGaneshaTests)
2020-06-01 11:09:59 [INFO] Test that Manila + Ganesha shares can be accessed on two instances.
...

2020-06-01 11:16:52 [INFO] ======================================================================
2020-06-01 11:16:52 [INFO] ERROR: test_manila_share (zaza.openstack.charm_tests.manila_ganesha.tests.ManilaGaneshaTests)
2020-06-01 11:16:52 [INFO] Test that Manila + Ganesha shares can be accessed on two instances.
2020-06-01 11:16:52 [INFO] ----------------------------------------------------------------------
2020-06-01 11:16:52 [INFO] Traceback (most recent call last):
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/manila_ganesha/tests.py", line 77, in test_manila_share
2020-06-01 11:16:52 [INFO]     share.allow(access_type='ip', access=fip_1, access_level='rw')
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/v2/shares.py", line 82, in allow
2020-06-01 11:16:52 [INFO]     self, access_type, access, access_level, metadata)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/api_versions.py", line 399, in substitution
2020-06-01 11:16:52 [INFO]     return method.func(obj, *args, **kwargs)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/v2/shares.py", line 555, in allow
2020-06-01 11:16:52 [INFO]     share, access_type, access, access_level, "os-allow_access")
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/v2/shares.py", line 548, in _do_allow
2020-06-01 11:16:52 [INFO]     access_params)[1]["access"]
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/v2/shares.py", line 672, in _action
2020-06-01 11:16:52 [INFO]     return self.api.client.post(url, body=body)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/common/httpclient.py", line 177, in post
2020-06-01 11:16:52 [INFO]     return self._cs_request(url, 'POST', **kwargs)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/common/httpclient.py", line 136, in _cs_request
2020-06-01 11:16:52 [INFO]     **kwargs)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/common/httpclient.py", line 150, in _cs_request_with_retries
2020-06-01 11:16:52 [INFO]     resp, body = self.request(url, method, **kwargs)
2020-06-01 11:16:52 [INFO]   File "/tmp/tmp.AmudZRxoSB/func/lib/python3.5/site-packages/manilaclient/common/httpclient.py", line 128, in request
2020-06-01 11:16:52 [INFO]     raise exceptions.from_response(resp, method, url)
2020-06-01 11:16:52 [INFO] manilaclient.common.apiclient.exceptions.BadRequest: New access rules cannot be applied while the share or any of its replicas or migration copies lacks a valid host or is in an invalid state. (HTTP 400) (Request-ID: req-8d609e13-9a80-428b-953b-17ab8d0e0cae)
2020-06-01 11:16:52 [INFO] ----------------------------------------------------------------------
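A minimal sketch of the missing wait, polling the share status before any allow() call, assuming a manilaclient client; the retry parameters are illustrative:

import tenacity


@tenacity.retry(retry=tenacity.retry_if_result(lambda s: s != 'available'),
                wait=tenacity.wait_fixed(5),
                stop=tenacity.stop_after_attempt(60))
def wait_for_share(manila_client, share_id):
    # Only once Manila reports the share 'available' is it safe to
    # apply access rules to it.
    return manila_client.shares.get(share_id).status

Calling wait_for_share between share creation and share.allow(...) should avoid the "lacks a valid host or is in an invalid state" error above.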

Launch instance helper ping test too optimistic about immediate success

The launch instance setup/test helper launches an instance, assigns a floating IP to it, and then immediately expects a successful response to a single ICMP ping.

It would be awesome if this actually worked, but we have numerous examples of this not being the reality.

ip = openstack_utils.create_floating_ip(
    neutron_client,
    external_network_name,
    port=port)['floating_ip_address']
logging.info('Assigned floating IP {} to {}'.format(ip, vm_name))
try:
    openstack_utils.ping_response(ip)
except subprocess.CalledProcessError as e:
    logging.error('Pinging {} failed with {}'.format(ip, e.returncode))
    logging.error('stdout: {}'.format(e.stdout))
    logging.error('stderr: {}'.format(e.stderr))
    raise

We should add tenacity to the equation here, somewhere along the lines of what commit e51b3c4 did for the Neutron networking test.
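A minimal sketch of what that could look like, wrapping the existing ping_response helper; the retry parameters are illustrative:

import subprocess

import tenacity

from zaza.openstack.utilities import openstack as openstack_utils


@tenacity.retry(
    retry=tenacity.retry_if_exception_type(subprocess.CalledProcessError),
    wait=tenacity.wait_exponential(multiplier=1, max=60),
    stop=tenacity.stop_after_attempt(8))
def ping_response_with_retry(ip):
    # A lost ICMP packet right after FIP assignment is normal; keep
    # trying until the guest's networking has actually converged.
    openstack_utils.ping_response(ip)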

series upgrade of percona skipped Ubuntu upgrade

In a run of series upgrade, the Ubuntu upgrade part appeared to be skipped and the post-series-upgrade hook failed.

2019-10-05 09:41:37 INFO juju-log Unit is ready
2019-10-05 09:44:42 DEBUG pre-series-upgrade inactive
2019-10-05 09:44:42 DEBUG pre-series-upgrade inactive
2019-10-05 09:44:42 DEBUG pre-series-upgrade active
2019-10-05 09:44:46 DEBUG pre-series-upgrade mysql.service is not a native service, redirecting to systemd-sysv-install
2019-10-05 09:44:46 DEBUG pre-series-upgrade Executing /lib/systemd/systemd-sysv-install disable mysql
2019-10-05 09:44:46 DEBUG pre-series-upgrade insserv: warning: current start runlevel(s) (empty) of script `mysql' overrides LSB defaults (2 3 4 5).
2019-10-05 09:44:46 DEBUG pre-series-upgrade insserv: warning: current stop runlevel(s) (0 1 2 3 4 5 6) of script `mysql' overrides LSB defaults (0 1 6).
2019-10-05 09:44:46 DEBUG pre-series-upgrade Created symlink from /etc/systemd/system/mysql.service to /dev/null.
2019-10-05 09:44:52 DEBUG pre-series-upgrade inactive
2019-10-05 09:44:53 DEBUG juju-log Writing file /etc/mysql/percona-xtradb-cluster.conf.d/mysqld.cnf root:root 444
2019-10-05 09:44:53 DEBUG pre-series-upgrade inactive
2019-10-05 09:44:53 DEBUG pre-series-upgrade inactive
2019-10-05 09:45:22 INFO juju.cmd supercommand.go:57 running jujud [2.6.10 gc go1.10.4]
2019-10-05 09:45:22 DEBUG juju.cmd supercommand.go:58   args: []string{"/var/lib/juju/tools/unit-mysql-1/jujud", "unit", "--data-dir", "/var/lib/juju", "--unit-name", "mysql/1", "--debug"}
2019-10-05 09:45:22 DEBUG juju.agent agent.go:545 read agent config, format "2.0"
2019-10-05 09:45:22 INFO juju.cmd.jujud agent.go:133 setting logging config to "<root>=WARNING;unit=DEBUG"
2019-10-05 09:46:17 INFO juju-log Making dir /var/run/mysqld mysql:mysql 755
2019-10-05 09:46:17 INFO juju-log Installing ['percona-xtradb-cluster-server'] with options: ['--option=Dpkg::Options::=--force-confold']
2019-10-05 09:46:18 DEBUG post-series-upgrade Reading package lists...
2019-10-05 09:46:18 DEBUG post-series-upgrade Building dependency tree...
2019-10-05 09:46:18 DEBUG post-series-upgrade Reading state information...
2019-10-05 09:46:18 DEBUG post-series-upgrade percona-xtradb-cluster-server is already the newest version (5.6.37-26.21-0ubuntu0.16.04.2).
2019-10-05 09:46:18 DEBUG post-series-upgrade 0 upgraded, 0 newly installed, 0 to remove and 19 not upgraded.
2019-10-05 09:46:18 INFO juju-log Starting mysqld --wsrep-provider='none' and waiting ...
2019-10-05 09:46:18 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:46:28 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:46:38 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:46:48 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:46:58 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:09 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:19 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:29 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:39 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:49 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:47:59 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:48:09 DEBUG juju-log /var/run/mysqld/mysqld.sock file is not yet ihe correct state retrying. Check for exists=True
2019-10-05 09:48:19 DEBUG post-series-upgrade Traceback (most recent call last):
2019-10-05 09:48:19 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-1/charm/hooks/post-series-upgrade", line 1132, in <module>
2019-10-05 09:48:19 DEBUG post-series-upgrade     main()
2019-10-05 09:48:19 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-1/charm/hooks/post-series-upgrade", line 1122, in main
2019-10-05 09:48:19 DEBUG post-series-upgrade     hooks.execute(sys.argv)
2019-10-05 09:48:19 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-1/charm/charmhelpers/core/hookenv.py", line 914, in execute
2019-10-05 09:48:19 DEBUG post-series-upgrade     self._hooks[hook_name]()
2019-10-05 09:48:19 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-1/charm/hooks/post-series-upgrade", line 434, in series_upgrade
2019-10-05 09:48:19 DEBUG post-series-upgrade     check_for_socket(MYSQL_SOCKET, exists=True)
2019-10-05 09:48:19 DEBUG post-series-upgrade   File "/var/lib/juju/agents/unit-mysql-1/charm/hooks/percona_utils.py", line 1204, in check_for_socket
2019-10-05 09:48:19 DEBUG post-series-upgrade     .format(file_name, attempts))
2019-10-05 09:48:19 DEBUG post-series-upgrade Exception: Socket /var/run/mysqld/mysqld.sock not found after 12 attempts.
2019-10-05 09:48:19 ERROR juju.worker.uniter.operation runhook.go:132 hook "post-series-upgrade" failed: exit status 1
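A guard in the test driver could catch this class of problem before the machine is marked done. A minimal sketch, assuming zaza's run_on_unit helper; where exactly this verification should live is an open question:

import zaza.model


def assert_series_upgraded(unit, target_series):
    # Verify do-release-upgrade actually ran before issuing
    # 'juju upgrade-series <machine> complete'.
    series = zaza.model.run_on_unit(
        unit, 'lsb_release -cs')['Stdout'].strip()
    assert series == target_series, (
        '{} is still on {}, expected {}'.format(unit, series, target_series))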

NeutronNetworkingTest ssh key failure

The test zaza.openstack.charm_tests.neutron.tests.NeutronNetworkingTest fails because of an issue with the public key format detected by paramiko in the test. This causes charms to fail the OSCI pipeline tests.
(ref: https://openstack-ci-reports.ubuntu.com/artifacts/test_charm_pipeline_func_full/openstack/charm-neutron-api/678911/1/3746/consoleText.test_charm_func_full_7008.txt)

2019-08-29 21:09:55 [INFO] ======================================================================
2019-08-29 21:09:55 [INFO] ERROR: test_instances_have_networking (zaza.openstack.charm_tests.neutron.tests.NeutronNetworkingTest)
2019-08-29 21:09:55 [INFO] Validate North/South and East/West networking.
2019-08-29 21:09:55 [INFO] ----------------------------------------------------------------------
2019-08-29 21:09:55 [INFO] Traceback (most recent call last):
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/neutron/tests.py", line 150, in test_instances_have_networking
2019-08-29 21:09:55 [INFO]     vm_name='{}-ins-1'.format(self.RESOURCE_PREFIX))
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/openstack/configure/guest.py", line 142, in launch_instance
2019-08-29 21:09:55 [INFO]     privkey=openstack_utils.get_private_key(nova_utils.KEYPAIR_NAME))
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/openstack/utilities/openstack.py", line 2048, in ssh_test
2019-08-29 21:09:55 [INFO]     password=password, privkey=privkey, verify=verify)
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/openstack/utilities/openstack.py", line 2085, in ssh_command
2019-08-29 21:09:55 [INFO]     ssh.connect(ip, username=username, password='', pkey=key)
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/paramiko/client.py", line 446, in connect
2019-08-29 21:09:55 [INFO]     passphrase,
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/paramiko/client.py", line 764, in _auth
2019-08-29 21:09:55 [INFO]     raise saved_exception
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/paramiko/client.py", line 751, in _auth
2019-08-29 21:09:55 [INFO]     self._transport.auth_password(username, password)
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/paramiko/transport.py", line 1509, in auth_password
2019-08-29 21:09:55 [INFO]     return self.auth_handler.wait_for_response(my_event)
2019-08-29 21:09:55 [INFO]   File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/paramiko/auth_handler.py", line 250, in wait_for_response
2019-08-29 21:09:55 [INFO]     raise e
2019-08-29 21:09:55 [INFO] paramiko.ssh_exception.BadAuthenticationType: Bad authentication type; allowed types: ['publickey']
2019-08-29 21:09:55 [INFO] ----------------------------------------------------------------------
2019-08-29 21:09:55 [INFO] Ran 1 test in 196.225s
2019-08-29 21:09:55 [INFO] FAILED
2019-08-29 21:09:55 [INFO] (errors=1)
Traceback (most recent call last):
  File "/tmp/tmp.nTxMwgcNDI/func/bin/functest-run-suite", line 10, in <module>
    sys.exit(main())
  File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 140, in main
    bundle=args.bundle)
  File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/charm_lifecycle/func_test_runner.py", line 76, in func_test_runner
    test_steps.get(model_alias, []))
  File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 68, in test
    run_test_list(tests)
  File "/tmp/tmp.nTxMwgcNDI/func/lib/python3.5/site-packages/zaza/charm_lifecycle/test.py", line 62, in run_test_list
    assert test_result.wasSuccessful(), "Test run failed"
AssertionError: Test run failed
ERROR: InvocationError: '/tmp/tmp.nTxMwgcNDI/func/bin/functest-run-suite --keep-model'
___________________________________ summary ____________________________________
ERROR: func: commands failed
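The BadAuthenticationType at the bottom is paramiko falling back to password auth after the supplied key failed to load or authenticate. A minimal sketch of a stricter connect that would surface the key problem directly, assuming an RSA key in PEM format; the helper name is illustrative:

import io

import paramiko


def connect_with_key(ip, username, privkey):
    # Load the key explicitly and disable the password/agent fallback,
    # so a key-format problem raises a clear error instead of the
    # opaque 'Bad authentication type' above.
    key = paramiko.RSAKey.from_private_key(io.StringIO(privkey))
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ip, username=username, pkey=key,
                   allow_agent=False, look_for_keys=False)
    return client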

neutron: `NeutronNetworkingTest` `tearDown` removes instances on test failure

While this piece of code makes it pleasant to iteratively run and develop the test code itself, it makes it extremely hard to diagnose a real failure in a deployment.

@classmethod
def tearDown(cls):
    """Remove test resources."""
    logging.info('Running teardown')
    for server in cls.nova_client.servers.list():
        if server.name.startswith(cls.RESOURCE_PREFIX):
            openstack_utils.delete_resource(
                cls.nova_client.servers,
                server.id,
                msg="server")

We need to make the tearDown code NOT remove instances on test failure.
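A minimal sketch of one way to do that with unittest, relying on the private _outcome attribute of CPython 3.4-3.10, so treat it as a sketch rather than a portable API:

import logging
import unittest


class NeutronNetworkingTest(unittest.TestCase):

    def _test_failed(self):
        # unittest records per-test errors on self._outcome; any entry
        # means the current test did not pass.
        outcome = getattr(self, '_outcome', None)
        return bool(outcome and any(exc for _, exc in outcome.errors))

    def tearDown(self):
        if self._test_failed():
            logging.warning('Test failed; keeping instances for diagnosis')
            return
        # ... delete the RESOURCE_PREFIX-named servers as before ...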

Ceph FS test instance launch retry code too aggressive

On a slow CI day the test code can get into a situation like this:

2020-06-09 14:26:34 [INFO] Launching instance zaza-cephfstests-ins-2
2020-06-09 14:26:35 [INFO] Checking instance is active
2020-06-09 14:26:36 [INFO] BUILD
2020-06-09 14:26:37 [INFO] BUILD
2020-06-09 14:26:39 [INFO] BUILD
2020-06-09 14:26:43 [INFO] BUILD
2020-06-09 14:26:52 [INFO] BUILD
2020-06-09 14:27:08 [INFO] ACTIVE
2020-06-09 14:27:08 [INFO] Checking cloud init is complete
2020-06-09 14:37:41 [INFO] Using keystone API V3 (or later) for overcloud auth
2020-06-09 14:37:43 [INFO] Launching instance zaza-cephfstests-ins-2
2020-06-09 14:37:44 [INFO] Checking instance is active
2020-06-09 14:37:44 [INFO] BUILD
2020-06-09 14:37:45 [INFO] BUILD
2020-06-09 14:37:48 [INFO] BUILD
2020-06-09 14:37:52 [INFO] BUILD
2020-06-09 14:38:00 [INFO] BUILD
2020-06-09 14:38:17 [INFO] BUILD
2020-06-09 14:38:49 [INFO] ACTIVE
2020-06-09 14:38:49 [INFO] Checking cloud init is complete

After 10 minutes of waiting for the first incarnation of xxx-ins-2 it went ahead and started a new one, probably making everything even slower.

The first incarnation of the xxx-ins-2 instance was still running cloud-init and would eventually have succeeded:

[  355.946885] cloud-init[1100]: Get:27 http://nova.clouds.archive.ubuntu.com/ubuntu bionic-backports/universe amd64 Packages [7484 B]
[  356.024347] cloud-init[1100]: Get:28 http://nova.clouds.archive.ubuntu.com/ubuntu bionic-backports/universe Translation-en [4436 B]
More than one server exists with the name 'zaza-cephfstests-ins-2'.
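If a relaunch is attempted at all, the stale first attempt needs to be removed first, otherwise the duplicate name breaks every later lookup. A minimal sketch, assuming a novaclient client; raising the cloud-init wait instead of relaunching may well be the better fix:

def delete_stale_instances(nova_client, vm_name):
    # Remove earlier incarnations of the instance before retrying the
    # launch, so 'More than one server exists' can never happen.
    for server in nova_client.servers.list():
        if server.name == vm_name:
            nova_client.servers.delete(server.id)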

test_921_remove_and_add_unit is not stable

While investigating failures for https://review.opendev.org/#/c/725909/, it became clear (see https://bugs.launchpad.net/charm-rabbitmq-server/+bug/1730709/comments/8) that LP: #1730709 is unrelated and that the failure is instead related to test ordering.

https://docs.python.org/3/library/unittest.html#organizing-test-code
Note The order in which the various tests will be run is determined by sorting the test method names with respect to the built-in ordering for strings.

Related cpython code for comparison of individual test function names:
https://github.com/python/cpython/blob/3.7/Lib/unittest/loader.py#L239
https://github.com/python/cpython/blob/3.7/Lib/unittest/util.py#L115-L117
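The effect is easy to demonstrate: the loader compares method names as plain strings, so the numeric prefixes only order tests lexicographically, and test_921_remove_and_add_unit always runs after anything with a lower prefix:

import unittest


class Demo(unittest.TestCase):

    def test_921_remove_and_add_unit(self):
        pass

    def test_910_restart(self):
        pass


# getTestCaseNames() applies the default string sort, so any cluster
# damage done by test_921_* leaks into whatever sorts after it.
print(unittest.TestLoader().getTestCaseNames(Demo))
# ['test_910_restart', 'test_921_remove_and_add_unit']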

An attempt to address the issue by restoring the cluster to its previous state was not successful either because running hooks/upgrade-charm does not always result in a unit being re-added to the cluster.

Likewise, introducing actual topology changes via remove-unit/add-unit results in longer test executions and timeouts in some cases which may happen due to overcommit on our test systems or due to a hidden bug in charm-rabbitmq. Related PR that was used for testing: #287

By the looks of it, addressing this properly will require both test changes and charm changes.

Keystone connection aborted during cinder-backup test.

Essentially, either it was a timeout issue, or the server went away during the connection. Either way, it's an unstable test:

2020-06-05 16:57:27 [INFO] test_410_cinder_vol_create_backup_delete_restore_pool_inspect (zaza.openstack.charm_tests.cinder_backup.tests.CinderBackupTest)
2020-06-05 16:57:27 [INFO] Create, backup, delete, restore a ceph-backed cinder volume.
2020-06-05 16:57:27 [INFO]  ... 
2020-06-05 16:57:29 [INFO] Checking ceph cinder pool original samples...
2020-06-05 16:57:30 [INFO] creating
2020-06-05 16:57:32 [INFO] available
2020-06-05 16:57:32 [INFO] creating
2020-06-05 16:57:33 [INFO] creating
2020-06-05 16:57:35 [INFO] creating
2020-06-05 16:57:39 [INFO] available
2020-06-05 16:57:56 [INFO] restoring
2020-06-05 16:57:57 [INFO] restoring
2020-06-05 16:57:59 [INFO] restoring
2020-06-05 16:58:03 [INFO] available
2020-06-05 16:58:19 [INFO] Checking ceph cinder pool samples after volume create...
2020-06-05 16:58:24 [INFO] ERROR
2020-06-05 16:58:24 [INFO] ======================================================================
2020-06-05 16:58:24 [INFO] ERROR: test_410_cinder_vol_create_backup_delete_restore_pool_inspect (zaza.openstack.charm_tests.cinder_backup.tests.CinderBackupTest)
2020-06-05 16:58:24 [INFO] Create, backup, delete, restore a ceph-backed cinder volume.
2020-06-05 16:58:24 [INFO] ----------------------------------------------------------------------
2020-06-05 16:58:24 [INFO] Traceback (most recent call last):
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 677, in urlopen
2020-06-05 16:58:24 [INFO]     chunked=chunked,
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 426, in _make_request
2020-06-05 16:58:24 [INFO]     six.raise_from(e, None)
2020-06-05 16:58:24 [INFO]   File "<string>", line 3, in raise_from
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2020-06-05 16:58:24 [INFO]     httplib_response = conn.getresponse()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 1225, in getresponse
2020-06-05 16:58:24 [INFO]     response.begin()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 307, in begin
2020-06-05 16:58:24 [INFO]     version, status, reason = self._read_status()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 276, in _read_status
2020-06-05 16:58:24 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2020-06-05 16:58:24 [INFO] http.client.RemoteDisconnected: Remote end closed connection without response
2020-06-05 16:58:24 [INFO] During handling of the above exception, another exception occurred:
2020-06-05 16:58:24 [INFO] Traceback (most recent call last):
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/requests/adapters.py", line 449, in send
2020-06-05 16:58:24 [INFO]     timeout=timeout
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 725, in urlopen
2020-06-05 16:58:24 [INFO]     method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/util/retry.py", line 403, in increment
2020-06-05 16:58:24 [INFO]     raise six.reraise(type(error), error, _stacktrace)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/packages/six.py", line 734, in reraise
2020-06-05 16:58:24 [INFO]     raise value.with_traceback(tb)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 677, in urlopen
2020-06-05 16:58:24 [INFO]     chunked=chunked,
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 426, in _make_request
2020-06-05 16:58:24 [INFO]     six.raise_from(e, None)
2020-06-05 16:58:24 [INFO]   File "<string>", line 3, in raise_from
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/urllib3/connectionpool.py", line 421, in _make_request
2020-06-05 16:58:24 [INFO]     httplib_response = conn.getresponse()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 1225, in getresponse
2020-06-05 16:58:24 [INFO]     response.begin()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 307, in begin
2020-06-05 16:58:24 [INFO]     version, status, reason = self._read_status()
2020-06-05 16:58:24 [INFO]   File "/usr/lib/python3.5/http/client.py", line 276, in _read_status
2020-06-05 16:58:24 [INFO]     raise RemoteDisconnected("Remote end closed connection without"
2020-06-05 16:58:24 [INFO] urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2020-06-05 16:58:24 [INFO] During handling of the above exception, another exception occurred:
2020-06-05 16:58:24 [INFO] Traceback (most recent call last):
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 1004, in _send_request
2020-06-05 16:58:24 [INFO]     resp = self.session.request(method, url, **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/requests/sessions.py", line 530, in request
2020-06-05 16:58:24 [INFO]     resp = self.send(prep, **send_kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/requests/sessions.py", line 643, in send
2020-06-05 16:58:24 [INFO]     r = adapter.send(request, **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/requests/adapters.py", line 498, in send
2020-06-05 16:58:24 [INFO]     raise ConnectionError(err, request=request)
2020-06-05 16:58:24 [INFO] requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
2020-06-05 16:58:24 [INFO] During handling of the above exception, another exception occurred:
2020-06-05 16:58:24 [INFO] Traceback (most recent call last):
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/cinder_backup/tests.py", line 147, in test_410_cinder_vol_create_backup_delete_restore_pool_inspect
2020-06-05 16:58:24 [INFO]     vols = self.cinder_client.volumes.list()
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/cinderclient/v2/volumes.py", line 300, in list
2020-06-05 16:58:24 [INFO]     return self._list(url, resource_type, limit=limit)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/cinderclient/base.py", line 80, in _list
2020-06-05 16:58:24 [INFO]     resp, body = self.api.client.get(url)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/cinderclient/client.py", line 214, in get
2020-06-05 16:58:24 [INFO]     return self._cs_request(url, 'GET', **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/cinderclient/client.py", line 205, in _cs_request
2020-06-05 16:58:24 [INFO]     return self.request(url, method, **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/cinderclient/client.py", line 188, in request
2020-06-05 16:58:24 [INFO]     **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/keystoneauth1/adapter.py", line 545, in request
2020-06-05 16:58:24 [INFO]     resp = super(LegacyJsonAdapter, self).request(*args, **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/keystoneauth1/adapter.py", line 248, in request
2020-06-05 16:58:24 [INFO]     return self.session.request(url, method, **kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 913, in request
2020-06-05 16:58:24 [INFO]     resp = send(**kwargs)
2020-06-05 16:58:24 [INFO]   File "/tmp/tmp.v6WpcSKxEY/func/lib/python3.5/site-packages/keystoneauth1/session.py", line 1020, in _send_request
2020-06-05 16:58:24 [INFO]     raise exceptions.ConnectFailure(msg)
2020-06-05 16:58:24 [INFO] keystoneauth1.exceptions.connection.ConnectFailure: Unable to establish connection to http://172.17.112.4:8776/v2/ffd9b06fb14b493d8c4b4d50ed496ab6/volumes/detail: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response',))
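Until the underlying instability is understood, the client calls in the test could tolerate a single dropped connection. A minimal sketch, assuming a cinderclient client; the retry parameters are illustrative:

import tenacity
from keystoneauth1.exceptions.connection import ConnectFailure


@tenacity.retry(retry=tenacity.retry_if_exception_type(ConnectFailure),
                wait=tenacity.wait_exponential(multiplier=2, max=60),
                stop=tenacity.stop_after_attempt(5))
def list_volumes(cinder_client):
    # One aborted keystone/cinder API connection should not fail the
    # whole backup/restore test.
    return cinder_client.volumes.list()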

[focal-ussuri] Masakari test: Host <nnn> can't be updated as it is in-use to process notifications

Deploying bundle './tests/bundles/focal-ussuri-pacemaker-remote-ssl.yaml':

...

2020-05-12 00:08:27 [INFO] test_instance_failover (zaza.openstack.charm_tests.masakari.tests.MasakariTest)
2020-05-12 00:08:27 [INFO] Test masakari managed guest migration.
2020-05-12 00:08:27 [INFO]  ... 

...

 Removing maintenance mode from masakari host 88dbc6fd-2038-4329-8e98-979c43cc98ca
2020-05-12 00:16:50 [INFO] ERROR

...

ERROR: test_instance_failover (zaza.openstack.charm_tests.masakari.tests.MasakariTest)
2020-05-12 00:17:19 [INFO] Test masakari managed guest migration.
2020-05-12 00:17:19 [INFO] ----------------------------------------------------------------------
2020-05-12 00:17:19 [INFO] Traceback (most recent call last):
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/zaza/openstack/charm_tests/masakari/tests.py", line 57, in tearDown
2020-05-12 00:17:19 [INFO]     zaza.openstack.configure.masakari.enable_hosts()
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/zaza/openstack/configure/masakari.py", line 125, in enable_hosts
2020-05-12 00:17:19 [INFO]     enable_host(masakari_client, host.uuid, segment.uuid)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/tenacity/__init__.py", line 329, in wrapped_f
2020-05-12 00:17:19 [INFO]     return self.call(f, *args, **kw)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/tenacity/__init__.py", line 409, in call
2020-05-12 00:17:19 [INFO]     do = self.iter(retry_state=retry_state)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/tenacity/__init__.py", line 368, in iter
2020-05-12 00:17:19 [INFO]     raise retry_exc.reraise()
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/tenacity/__init__.py", line 186, in reraise
2020-05-12 00:17:19 [INFO]     raise self.last_attempt.result()
2020-05-12 00:17:19 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 398, in result
2020-05-12 00:17:19 [INFO]     return self.__get_result()
2020-05-12 00:17:19 [INFO]   File "/usr/lib/python3.5/concurrent/futures/_base.py", line 357, in __get_result
2020-05-12 00:17:19 [INFO]     raise self._exception
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/tenacity/__init__.py", line 412, in call
2020-05-12 00:17:19 [INFO]     result = fn(*args, **kwargs)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/zaza/openstack/configure/masakari.py", line 105, in enable_host
2020-05-12 00:17:19 [INFO]     **{'on_maintenance': False})
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/instance_ha/v1/_proxy.py", line 189, in update_host
2020-05-12 00:17:19 [INFO]     **attrs)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/proxy.py", line 46, in check
2020-05-12 00:17:19 [INFO]     return method(self, expected, actual, *args, **kwargs)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/proxy.py", line 393, in _update
2020-05-12 00:17:19 [INFO]     return res.commit(self, base_path=base_path)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/resource.py", line 1491, in commit
2020-05-12 00:17:19 [INFO]     retry_on_conflict=retry_on_conflict)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/resource.py", line 1517, in _commit
2020-05-12 00:17:19 [INFO]     self._translate_response(response, has_body=has_body)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/resource.py", line 1113, in _translate_response
2020-05-12 00:17:19 [INFO]     exceptions.raise_from_response(response, error_message=error_message)
2020-05-12 00:17:19 [INFO]   File "/tmp/tmp.xrIp82VnBm/func/lib/python3.5/site-packages/openstack/exceptions.py", line 236, in raise_from_response
2020-05-12 00:17:19 [INFO]     http_status=http_status, request_id=request_id
2020-05-12 00:17:19 [INFO] openstack.exceptions.ConflictException: ConflictException: 409: Client Error for url: https://172.17.102.230:15868/v1/0dc221ff59fb40199b0969721f2762a9/segments/840593fc-f510-43bd-85cb-054fe19934db/hosts/88dbc6fd-2038-4329-8e98-979c43cc98ca, Host 88dbc6fd-2038-4329-8e98-979c43cc98ca can't be updated as it is in-use to process notifications.

https://review.opendev.org/#/c/726808/
https://openstack-ci-reports.ubuntu.com/artifacts/test_charm_pipeline_func_full/openstack/charm-masakari-monitors/726808/1/5655/index.html

This looks like either a bug in the test (the teardown is not doing the right thing) or a change in masakari behaviour on focal.
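
If the 409 is transient while masakari drains its notification queue, one possible fix is to make the teardown retry on ConflictException as well. This is a sketch of a variant of zaza.openstack.configure.masakari.enable_host, not the current implementation; the retry parameters are assumptions, and the update_host call follows the signature visible in the traceback above:

    import tenacity
    from openstack import exceptions

    # Hypothetical variant of enable_host() that also waits out
    # HTTP 409 "in-use to process notifications" responses.
    @tenacity.retry(
        retry=tenacity.retry_if_exception_type(exceptions.ConflictException),
        wait=tenacity.wait_fixed(10),
        stop=tenacity.stop_after_attempt(30),
        reraise=True)
    def enable_host(masakari_client, host_uuid, segment_uuid):
        """Take the host out of maintenance, retrying while it is busy."""
        masakari_client.update_host(
            host_uuid,
            segment_id=segment_uuid,
            **{'on_maintenance': False})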

support multiple architectures

Primarily, this applies to:

zaza.openstack.charm_tests.glance.setup.add_lts_image

A different image should be downloaded for each architecture, and some image properties need to be set depending on the architecture and on whether the compute node is in an LXD container, e.g.:

    openstack image create --public --container-format bare --disk-format raw --property hw_firmware_type=uefi --property hypervisor_type=lxc --file ~/images/bionic-server-cloudimg-arm64-root.tar.xz bionic-arm64

or

    openstack image create --public --container-format bare --disk-format qcow2 --property hw_firmware_type=uefi --file ~/images/bionic-server-cloudimg-arm64.img bionic-arm64

We currently deploy to amd64, arm64, ppc64el and s390x.
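
A sketch of what an arch-aware setup helper could look like; the image URLs, property choices and helper name below mirror the CLI examples above and are assumptions, not tested values:

    import platform

    # Hypothetical mapping from machine architecture to cloud image and
    # the glance properties it needs; ppc64el and s390x entries would
    # follow the same pattern.
    ARCH_IMAGES = {
        'x86_64': {
            'url': ('http://cloud-images.ubuntu.com/bionic/current/'
                    'bionic-server-cloudimg-amd64.img'),
            'properties': {},
        },
        'aarch64': {
            'url': ('http://cloud-images.ubuntu.com/bionic/current/'
                    'bionic-server-cloudimg-arm64.img'),
            'properties': {'hw_firmware_type': 'uefi'},
        },
    }

    def lts_image_spec(lxd_compute=False):
        """Return (image_url, glance_properties) for this architecture."""
        spec = ARCH_IMAGES[platform.machine()]
        properties = dict(spec['properties'])
        if lxd_compute:
            # Guests on an LXD compute node run as containers, not VMs.
            properties['hypervisor_type'] = 'lxc'
        return spec['url'], properties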
