geerlingguy / awx-container
Ansible Container project that manages the lifecycle of AWX on Docker.
Home Page: https://hub.docker.com/r/geerlingguy/awx_web/
License: MIT License
...mostly because Ansible will likely be able to maintain those images more frequently than I currently have time to maintain my Docker Hub images.
I would still like to keep this project up and running of course, partially as a demonstration of Ansible Container!
Hi,
I'm wondering if you could build LXD into the container, as 99% of my hosts are LXD containers and therefore use lxd as the ansible_connection.
Thanks
If someone (like myself) just wants to get a quick-and-dirty AWX instance running for testing, demonstration purposes, etc., it should be pretty easy to use a single Docker Compose file with the public Docker Hub images built from this repository.
This may be an error with AWX itself, but when I run from docker-compose I receive this error when trying to login:
awx_web_1 | 2017-09-08 20:09:15,047 ERROR django.request Internal Server Error: /api/v2/auth/
awx_web_1 | Traceback (most recent call last):
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/core/handlers/base.py", line 132, in get_response
awx_web_1 | response = wrapped_callback(request, *callback_args, **callback_kwargs)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/utils/decorators.py", line 145, in inner
awx_web_1 | return func(*args, **kwargs)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/views/decorators/csrf.py", line 58, in wrapped_view
awx_web_1 | return view_func(*args, **kwargs)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/views/generic/base.py", line 71, in view
awx_web_1 | return self.dispatch(request, *args, **kwargs)
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/api/generics.py", line 246, in dispatch
awx_web_1 | return super(APIView, self).dispatch(request, *args, **kwargs)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/rest_framework/views.py", line 449, in dispatch
awx_web_1 | request = self.initialize_request(request, *args, **kwargs)
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/api/generics.py", line 107, in initialize_request
awx_web_1 | settings.PROXY_IP_WHITELIST,
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/conf/__init__.py", line 49, in __getattr__
awx_web_1 | return getattr(self._wrapped, name)
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/conf/settings.py", line 386, in __getattr__
awx_web_1 | value = self._get_local(name)
awx_web_1 | File "/usr/lib64/python2.7/contextlib.py", line 35, in __exit__
awx_web_1 | self.gen.throw(type, value, traceback)
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/conf/settings.py", line 64, in _log_database_error
awx_web_1 | if get_tower_migration_version() < '310':
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/main/utils/db.py", line 13, in get_tower_migration_version
awx_web_1 | loader = MigrationLoader(connection, ignore_no_migrations=True)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/loader.py", line 47, in __init__
awx_web_1 | self.build_graph()
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/loader.py", line 191, in build_graph
awx_web_1 | self.applied_migrations = recorder.applied_migrations()
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/recorder.py", line 59, in applied_migrations
awx_web_1 | self.ensure_schema()
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/recorder.py", line 49, in ensure_schema
awx_web_1 | if self.Migration._meta.db_table in self.connection.introspection.table_names(self.connection.cursor()):
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/base/introspection.py", line 58, in table_names
awx_web_1 | return get_names(cursor)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/base/introspection.py", line 53, in get_names
awx_web_1 | return sorted(ti.name for ti in self.get_table_list(cursor)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/postgresql_psycopg2/introspection.py", line 53, in get_table_list
awx_web_1 | AND pg_catalog.pg_table_is_visible(c.oid)""")
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
awx_web_1 | return self.cursor.execute(sql, params)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/utils.py", line 98, in __exit__
awx_web_1 | six.reraise(dj_exc_type, dj_exc_value, traceback)
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 62, in execute
awx_web_1 | return self.cursor.execute(sql)
awx_web_1 | InternalError: current transaction is aborted, commands ignored until end of transaction block
Not sure where this is coming from. I noticed the AWX config sets the port to 80, but it is actually serving on port 8052 and then being mapped to 80 by Compose. Is this correct, or could it be related?
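For reference, the mapping described above would look like this in the Compose file (a sketch based on the ports mentioned in this issue, not necessarily this repo's exact file):

```yaml
awx_web:
  ports:
    - "80:8052"   # host port 80 -> container port 8052, where AWX actually listens
```

That mapping by itself is normal for Compose, so it is probably unrelated to the transaction error.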
I'm also seeing this quite a bit in the database migrations:
ERROR: relation "conf_setting" does not exist at character 158
Thanks
TASK [Stage sdist.] ************************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AnsibleFileNotFound: Could not find or access '/opt/awx/dist/awx-1.0.3.52.tar.gz'
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Could not find or access '/opt/awx/dist/awx-1.0.3.52.tar.gz'"}
Example failed build: https://travis-ci.org/geerlingguy/awx-container/builds/338157010
There is no IPv6 support within the awx_task container:
[root@awx awx]# ip a
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN qlen 1
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
2: tunl0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN qlen 1
link/ipip 0.0.0.0 brd 0.0.0.0
3: gre0@NONE: <NOARP> mtu 1476 qdisc noop state DOWN qlen 1
link/gre 0.0.0.0 brd 0.0.0.0
4: gretap0@NONE: <BROADCAST,MULTICAST> mtu 1462 qdisc noop state DOWN qlen 1000
link/ether 00:00:00:00:00:00 brd ff:ff:ff:ff:ff:ff
5: ip_vti0@NONE: <NOARP> mtu 1332 qdisc noop state DOWN qlen 1
link/ipip 0.0.0.0 brd 0.0.0.0
6: ip6_vti0@NONE: <NOARP> mtu 1500 qdisc noop state DOWN qlen 1
link/tunnel6 :: brd ::
7: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN qlen 1
link/sit 0.0.0.0 brd 0.0.0.0
8: ip6tnl0@NONE: <NOARP> mtu 1452 qdisc noop state DOWN qlen 1
link/tunnel6 :: brd ::
9: ip6gre0@NONE: <NOARP> mtu 1448 qdisc noop state DOWN qlen 1
link/[823] 00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00 brd 00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00
75: eth0@if76: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP
link/ether 02:42:ac:12:00:06 brd ff:ff:ff:ff:ff:ff link-netnsid 0
inet 172.18.0.6/16 scope global eth0
valid_lft forever preferred_lft forever
[root@awx awx]# ping heise.de
PING heise.de (193.99.144.80) 56(84) bytes of data.
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=1 ttl=37 time=25.0 ms
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=2 ttl=37 time=24.0 ms
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=3 ttl=37 time=24.5 ms
^C
--- heise.de ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2006ms
rtt min/avg/max/mdev = 24.076/24.569/25.095/0.416 ms
[root@awx awx]# ping6 heise.de
connect: Cannot assign requested address
It is impossible for a playbook to connect to a host which is only accessible via IPv6.
There is a Docker config option "enable_ipv6", but it seems incompatible with docker-compose:
https://docs.docker.com/compose/compose-file/#ipv4_address-ipv6_address
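For anyone else hitting this: enable_ipv6 is accepted at the network level in Compose file format 2.1+ (it is not supported in the v3 format), so a possible workaround might look like the sketch below. The network name and subnet (an IPv6 documentation prefix) are placeholders, and this is untested with this project's compose file:

```yaml
version: '2.1'
networks:
  awxnet:
    enable_ipv6: true
    ipam:
      config:
        - subnet: 2001:db8:1::/64
```

Each service would then need to join `awxnet` via a `networks:` key instead of relying on `links:`.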
When trying to build the image for AWX, I get an error on the task below. It looks like geerlingguy.repo-epel refers to https://dl.fedoraproject.org/pub/epel/epel-release-latest-NA.noarch.rpm, which is not found.
TASK [geerlingguy.repo-epel : Install EPEL repo.] ******************************
FAILED - RETRYING: Install EPEL repo. (5 retries left).
FAILED - RETRYING: Install EPEL repo. (4 retries left).
FAILED - RETRYING: Install EPEL repo. (3 retries left).
FAILED - RETRYING: Install EPEL repo. (2 retries left).
FAILED - RETRYING: Install EPEL repo. (1 retries left).
fatal: [localhost]: FAILED! => {"attempts": 5, "changed": false, "failed": true, "msg": "Failure downloading https://dl.fedoraproject.org/pub/epel/epel-release- latest-NA.noarch.rpm, HTTP Error 404: Not Found"}
Hello,
I'm trying to deploy AWX in a Swarm cluster using docker stack and it doesn't work well.
I already changed the links to a network using the overlay driver...
Have you tried this already?
I set up credentials to use a private SSH key for the host machine, created a job template using those credentials and a simple playbook project, then ran the job in AWX. The job fails with an error message like:
...control_persist_detach: background process is 745\r\nControl socket connect(/tmp/awx_18_nTI3M6/cp/10.16.141.8522root): Connection refused\r\nFailed to connect to new control master\r\n"
I found an article that says overlayfs as a filesystem doesn't play well with Unix sockets (and Ansible/AWX uses SSH multiplexing by default). It suggests changing the ControlPath to /dev/shm/.
As I started looking into that as an option, it looks like AWX overwrites the control_path_dir in ansible.cfg. The Tower docs say:
For example, Tower stores the SSH ControlMaster sockets, the SSH agent socket, and any other per-job run items in a per-job temporary directory, secured by multi-tenancy access control restrictions via PRoot.
The current workaround I have found is to include this line in the ansible.cfg file:
ssh_args = -o ControlMaster=no
I also found an AWX setting called "AWX_PROOT_BASE_PATH": "/tmp" (the only place I see /tmp being set anywhere). I am wondering if this could be changed from /tmp to /dev/shm to get SSH multiplexing to work (I have not tested this yet). Otherwise, maybe a ticket should be opened in the awx project to create a setting for this issue.
Maybe there should be a configuration setting that could be set in the container build, so (1) these settings are documented and (2) there is an easy way to set these settings.
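As a sketch, the workaround line above belongs in the [ssh_connection] section of ansible.cfg (that section name is standard Ansible config layout; I have not confirmed where the awx_task container reads its ansible.cfg from):

```ini
[ssh_connection]
ssh_args = -o ControlMaster=no
```

This disables SSH multiplexing entirely, so jobs open a fresh SSH connection per task, trading some speed for reliability on overlayfs.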
Keep up the good work and Thanks!
ERROR! Attempted to execute "/usr/lib/python2.7/site-packages/awx/plugins/inventory/awxrest.py" as inventory script: Inventory script (/usr/lib/python2.7/site-packages/awx/plugins/inventory/awxrest.py) had an execution error: HTTPConnectionPool(host='awxweb', port=8052): Max retries exceeded with url: /api/v1/inventories/2/script/?hostvars=1 (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x2220e10>: Failed to establish a new connection: [Errno -2] Name or service not known',))
This error comes when I try to run a template/playbook.
If I create an alias for the awx_web link on the awx_task service, I get this error instead:
fatal: [XXX.dk]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: No user exists for uid 1000\r\n", "unreachable": true}
So I added user: root to the awx_task service, and it ran successfully!
My awx_task service definition now looks like this (user: root is what I added):
awx_task:
  image: "geerlingguy/awx_task:latest"
  links:
    - rabbitmq
    - memcached
    - awx_web:awxweb
    - postgres
  hostname: awx
  user: root
  container_name: awx_task
  environment:
    SECRET_KEY: aabbcc
    DATABASE_USER: awx
    DATABASE_PASSWORD: awxpass
    DATABASE_NAME: awx
    DATABASE_PORT: 5432
    DATABASE_HOST: postgres
    RABBITMQ_USER: guest
    RABBITMQ_PASSWORD: guest
    RABBITMQ_HOST: rabbitmq
    RABBITMQ_PORT: 5672
    RABBITMQ_VHOST: awx
    MEMCACHED_HOST: memcached
    MEMCACHED_PORT: 11211
I could create a pull request, but I want to know if anyone else is seeing this error, and whether this fix helped.
Hi,
Are there any plans to create a Helm chart for this image?
Thanks
I don't have a patch yet, but access to the /api path (which is required for logging in) is not available unless you use the hostname that is configured in the /etc/tower/settings.py file.
The fix here is probably to expose a default variable that allows you to override the hostname. The workaround for now is simply to add awxweb to your /etc/hosts file, pointing at your server's IP address.
I'm using the latest image tag.
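The /etc/hosts workaround is just an extra name for your server's address; a sketch (the IP here is a documentation placeholder, substitute your server's actual address):

```text
203.0.113.10  awxweb
```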
I tried:
/var/lib/awx/venv/ansible/bin/pip install --upgrade ansible
and I get ...
Error: [('/var/lib/awx/venv/ansible/lib/python2.7/site-packages/pycparser-2.17-py2.7.egg-info/PKG-INFO',
Thanks
Hi,
Thanks for creating the Ansible AWX role and containers.
I am running the Ansible AWX role on Oracle VirtualBox. The role ran perfectly and I was able to access AWX. I added the playbooks manually and created a template to perform the job.
But while running the job template I am getting the error below:
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/awx/main/tasks.py", line 799, in run
cwd = self.build_cwd(instance, **kwargs)
File "/usr/lib/python2.7/site-packages/awx/main/tasks.py", line 1188, in build_cwd
(job.project.local_path, root))
RuntimeError: project local_path user cannot be found in /var/lib/awx/projects
I have created the directory under the awx_web container. Do I have to define the project path somewhere in the settings? Like Ansible Tower, is there a settings file for this?
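One approach (a sketch only, not confirmed against this repo: the host path /opt/awx-projects is hypothetical, while the container path comes from the error message above) is to mount the same projects directory into both containers, so a project created in awx_web is also visible to awx_task:

```yaml
awx_web:
  volumes:
    - /opt/awx-projects:/var/lib/awx/projects
awx_task:
  volumes:
    - /opt/awx-projects:/var/lib/awx/projects
```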
Maybe this project could be extended to provide an update path for the AWX containers somehow?
As for AWX, it seems that production use is impossible, because AWX does not provide migration paths to higher versions, and thus no updates in production.
ansible/awx#138
https://groups.google.com/forum/#!msg/awx-project/PQLxKl5Rj9s/UGy-3VaCCQAJ
If the aim of this project is a production deployment of AWX, there should be a migration path between the main AWX releases (e.g. 1.0.2 -> 1.0.3). If not, close this 😄.
What's your opinion on that?
When starting docker-compose up -d, I get some errors:
% docker logs docker-awx_awx_task_1
/usr/bin/launch_awx_task.sh: line 8: /etc/tower/conf.d/environment.sh: No such file or directory
Using /etc/ansible/ansible.cfg as config file
127.0.0.1 | SUCCESS => {
"changed": false,
"elapsed": 2,
"path": null,
"port": 5432,
"search_regex": null,
"state": "started"
}
Using /etc/ansible/ansible.cfg as config file
127.0.0.1 | SUCCESS => {
"changed": false,
"elapsed": 0,
"path": null,
"port": 11211,
"search_regex": null,
"state": "started"
}
Using /etc/ansible/ansible.cfg as config file
127.0.0.1 | SUCCESS => {
"changed": false,
"elapsed": 1,
"path": null,
"port": 5672,
"search_regex": null,
"state": "started"
}
Using /etc/ansible/ansible.cfg as config file
127.0.0.1 | SUCCESS => {
"changed": false,
"db": "awx"
}
Traceback (most recent call last):
File "/usr/bin/awx-manage", line 11, in <module>
load_entry_point('awx==4.0.0.0', 'console_scripts', 'awx-manage')()
File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/__init__.py", line 124, in manage
prepare_env()
File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/__init__.py", line 89, in prepare_env
if not settings.DEBUG: # pragma: no cover
File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 56, in __getattr__
self._setup(name)
File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 41, in _setup
self._wrapped = Settings(settings_module)
File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 129, in __init__
raise ImproperlyConfigured("The SECRET_KEY setting must not be empty.")
django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.
Also, when visiting http://localhost I get an Internal Server Error.
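The ImproperlyConfigured traceback suggests the SECRET_KEY environment variable is not reaching the container. A sketch of the relevant Compose fragment (the value is a placeholder from earlier in this thread, and it must be identical across the awx containers):

```yaml
awx_task:
  environment:
    SECRET_KEY: aabbcc   # placeholder; must be non-empty and match awx_web's value
```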
TASK [Stage sdist.] *******************************************************************************************
task path: /Users/shaun_smiley/Seafile/scripts/mtn/ansible/ansible-awx/image-build.yml:48
Thursday 14 September 2017 16:39:24 -0700 (0:00:00.033) 0:00:21.767 ****
fatal: [awx-vagrant-build]: FAILED! => {"changed": false, "failed": true, "msg": "Unable to find '/opt/awx/dist/awx-1.0.0.367.tar.gz' in expected paths."}
It looks like this is because remote_src: True is not specified.
Confirmed... setting that param lets the build continue successfully.
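For illustration, the fixed task would look something like the sketch below (the src path comes from the error above; the dest path and task structure are hypothetical, since the actual task lives in image-build.yml):

```yaml
- name: Stage sdist.
  copy:
    src: /opt/awx/dist/awx-1.0.0.367.tar.gz
    dest: /tmp/awx-1.0.0.367.tar.gz   # hypothetical destination
    remote_src: true                   # look for src on the build host, not the controller
```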
Greetings,
First, thanks for the awesome docker image, makes getting set up a breeze!
I've added a project successfully, but every time I run the SCM update command I see the following error in the project stdout:
Using /etc/ansible/ansible.cfg as config file
PLAY [all] *********************************************************************
TASK [delete project directory before update] **********************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'getpwuid(): uid not found: 1000'
fatal: [localhost]: FAILED! => {"failed": true, "msg": "Unexpected failure during module execution.", "stdout": ""}
PLAY RECAP *********************************************************************
localhost : ok=0 changed=0 unreachable=0 failed=1
I've tried with both my own Project and the default one included with the installation, both fail on that same step.
Any more info I can get you to be of use?
Once again, thanks!
Hello,
I'm running into this problem: after I run several jobs, AWX starts to consume swap memory while there is plenty of RAM available. How can I fix this?
Thanks a lot
Basically, run sudo pip install docker-compose, then:
# docker-compose up -d
WARNING: Dependency conflict: an older version of the 'docker-py' package may be polluting the namespace. If you're experiencing crashes, run the following command to remedy the issue:
pip uninstall docker-py; pip uninstall docker; pip install docker
Pulling memcached (memcached:alpine)...
Traceback (most recent call last):
File "/bin/docker-compose", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 68, in main
command()
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 121, in perform_command
handler(command, command_options)
File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 938, in up
scale_override=parse_scale_args(options['--scale']),
File "/usr/lib/python2.7/site-packages/compose/project.py", line 430, in up
svc.ensure_image_exists(do_build=do_build)
File "/usr/lib/python2.7/site-packages/compose/service.py", line 311, in ensure_image_exists
self.pull()
File "/usr/lib/python2.7/site-packages/compose/service.py", line 1024, in pull
output = self.client.pull(repo, tag=tag, stream=True)
File "/usr/lib/python2.7/site-packages/docker/api/image.py", line 381, in pull
header = auth.get_config_header(self, registry)
AttributeError: 'module' object has no attribute 'get_config_header'
It looks like it's because docker-py is currently being installed, instead of just docker (via pip). See docker/docker-py#1353 (comment)
See: ansible/awx#104
Will save a little bit of the setup work by passing it off to AWX's sdist-builder-container.
Hi, when running the docker-compose setup from 2018-09-05 I cannot view the job logs, although the API works. It seems to be something very similar to this: ansible/awx#1861
Any help or tips?
The frontend is not working due to an empty /var/lib/awx/public/static/ in the awx_web container.
A temporary (and ugly) workaround is entering the container and setting a symlink:
docker exec -ti awxcontainer_awx_web_1 bash
cd /var/lib/awx/public
rm -rf static
ln -s /usr/lib/python2.7/site-packages/awx/ui/static
Anyway, thanks heaps for your work, Jeff! 🏆
django.db.utils.OperationalError: could not translate host name "postgres" to address: Name or service not known
I have my whole system configured on localhost. Everything else is working fine. I cannot seem to load any inventory, even through git; the inventory only loads when I add it manually.
TASK [geerlingguy.repo-epel : Install EPEL repo.] ******************************
FAILED - RETRYING: Install EPEL repo. (5 retries left).
FAILED - RETRYING: Install EPEL repo. (4 retries left).
FAILED - RETRYING: Install EPEL repo. (3 retries left).
FAILED - RETRYING: Install EPEL repo. (2 retries left).
FAILED - RETRYING: Install EPEL repo. (1 retries left).
fatal: [awx_container]: FAILED! => {"attempts": 5, "changed": false, "msg": "", "rc": 0, "results": ["epel-release-7-11.noarch providing /tmp/epel-release-latest-7.noarchD25pNv.rpm is already installed"]}
to retry, use: --limit @/vagrant/prebuild/prebuild.retry
PLAY RECAP *********************************************************************
awx_container : ok=3 changed=0 unreachable=0 failed=1
Ansible failed to complete successfully. Any error output should be
visible above. Please fix these errors and try again.
It seems to be due to Ansible 2.5:
geerlingguy/ansible-role-repo-epel#26
A PR has been raised and is waiting to be merged.
After following the instructions:
curl -O https://raw.githubusercontent.com/geerlingguy/awx-container/master/docker-compose.yml
docker-compose up -d
I get this message and it never goes away:
AWX is currently upgrading or installing, this page will refresh when done.
Hello,
I'm trying to build a Debian-based Docker image and need to know where to fetch the awx-manage binary from.
Please use the official install guide to run AWX using the maintained AWX container images: https://github.com/ansible/awx/blob/devel/INSTALL.md#docker-compose
You might also consider using the Tower Operator I am building if you would like to deploy in Kubernetes: https://github.com/geerlingguy/tower-operator
Scheduling is completely broken in this containerized version. Would you be willing to adopt the patch from upstream? ansible/awx#244
Hi, I'm new to AWX, but I spent the whole day on this. The example below works fine with the ansible command line but does not work in AWX:
- hosts: all
  become: yes
  become_method: su
  tasks:
    - name: re-read facts after adding custom_was fact
      setup: filter=ansible_local
    - name: print ansible_local
      debug: var=ansible_local
The result is:
root@s00vl9974554:/tools/ansible/ANSIBLE_GAO_DEV/ANSIBLE_GAO$ ansible-playbook pp.yml -Kk
SSH password:
SUDO password[defaults to SSH password]:
PLAY [firefDev] ****************************************************************
TASK [setup] **********
ok: [firefDev]
TASK [re-read facts after adding custom_was fact] ******************************
ok: [firefDev]
TASK [print ansible_local] *****************************************************
ok: [firefDev] => {
"ansible_local": {
"custom": {
"Application": {
"codeenv": "10824_r3f00"
},
"DirectoryDev": {
"archives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives",
"archivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives",
"archivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives",
"conf": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/conf",
"confarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/conf",
"confarchivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/conf",
"confarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/conf",
"confnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/conf",
"confssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/conf",
"htdocsarchivenossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/htdocs",
"htdocsarchivessl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/htdocs",
"htdocsnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/htdocs",
"htdocsssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/htdocs",
"ihs": "/applis/10824-r3f00/ihs",
"installedapps": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/installedApps",
"installedappsarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/installedApps",
"jdbc": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/jdbc",
"jdbcarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/jbdc",
"lib": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/lib",
"libarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/lib",
"libraray": "/home/was/wasLibrary",
"lognossl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
"logssl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1",
"pluginnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/plugin",
"pluginssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/plugin",
"serverdir": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1",
"serverlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1",
"shared": "/applis/10824-r3f00/was/shared",
"sslarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/ssl",
"sslssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/ss",
"temp": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/temp",
"tranlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1/tranlog",
"was": "/applis/10824-r3f00/was",
"webservernossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
"webserverssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1"
},
"FileSystemDev": {
"fs_size": "2G",
"log_was_fs": "/applis/logs/10824-r3f00",
"log_was_lv_prez": "lv_przLog01",
"var_was_fs": "/applis/10824-r3f00",
"var_was_lv": "lv_prz01",
"vg_name": "vg_apps"
},
"UserGroup": {
"group": "web",
"user": "was"
},
"certificat": {
"keyfile": "someFile.kdb",
"stashfile": "someFile.sth"
},
"ihsEnvDev": {
"webnosecport": "10080",
"websecport": "10443"
},
"serverEnvDev": {
"admwasprofile": "/apps/WebSphere85S/profiles/svr",
"authalias": "authAlias",
"authaliaspassword": "DC%rop7j52s",
"authaliasuser": "R3FDEV_LINUX",
"bindingname1": "audit",
"bindingname2": "log4j",
"bindingname3": "local",
"bindingname4": "export",
"bindingname5": "refogws",
"bindingnamespace1": "config/audit",
"bindingnamespace2": "config/log4j",
"bindingnamespace3": "config/local",
"bindingnamespace4": "config/export",
"bindingnamespace5": "config/refogws",
"bindingvalue1": "file:///applis/10824-r3f/r3fdev01/conf/audit.properties",
"bindingvalue2": "file:///applis/10824-r3f/r3fdev01/conf/log4j.properties",
"bindingvalue3": "file:///applis/10824-r3f/r3fdev01/conf/firef.properties",
"bindingvalue4": "file:///applis/10824-r3f/r3fdev01/conf/export.properties",
"bindingvalue5": "file:///applis/10824-r3f/r3fdev01/conf/refogws.properties",
"code_application": "10824-r3f00",
"datasourcejndi": "jdbc/r3ffrp01",
"datasourcename": "p10824ap10",
"maxfiles": "10",
"maxfilesize": "10",
"memorymax": "8192",
"memorymin": "1024",
"propertiename": "com.ibm.tools.attach.enable",
"propertiename1": "com.ibm.websphere.servlet.temp.dir",
"propertiename2": "ebx.home",
"propertiename3": "ebx.properties",
"propertiename4": "java.awt.headless",
"propertiename5": "java.io.tmpdir",
"propertiename6": "loginpassword.authentication.disabled",
"propertievalue": "no",
"propertievalue1": "/applis/10824-r3f/was/sa-10824_r3f-prez-1/temp",
"propertievalue2": "/applis/10824-r3f/r3fprd01/ebx_home",
"propertievalue3": "/applis/10824-r3f/r3fprd01/ebx_home/ebx.properties",
"propertievalue4": "true",
"propertievalue5": "/applis/10824-r3f/r3fprd01/tmp",
"propertievalue6": "true",
"servername": "sa-10824-r3f00-prez-1",
"startingport": "12060",
"url": "http://firef-echonet.fr",
"urlds": "jdbc:oracle:thin:@D-10824-P.fr.net.intra:1521/D10824AP10",
"urlssl": "https://firef-echonet.fr",
"virtualhostname": "sa-10824-r3f00-prez-nossl-1_vh",
"virtualhostnamessl": "sa-10824-r3f00-prez-ssl-1_vh"
}
},
"custom_was": {
"Application": {
"codeenv": "10824_r3f00"
},
"DirectoryDev": {
"archives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives",
"archivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives",
"archivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives",
"conf": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/conf",
"confarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/conf",
"confarchivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/conf",
"confarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/conf",
"confnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/conf",
"confssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/conf",
"htdocsarchivenossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/htdocs",
"htdocsarchivessl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/htdocs",
"htdocsnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/htdocs",
"htdocsssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/htdocs",
"ihs": "/applis/10824-r3f00/ihs",
"installedapps": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/installedApps",
"installedappsarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/installedApps",
"jdbc": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/jdbc",
"jdbcarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/jbdc",
"lib": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/lib",
"libarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/lib",
"libraray": "/home/was/wasLibrary",
"lognossl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
"logssl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1",
"pluginnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/plugin",
"pluginssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/plugin",
"serverdir": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1",
"serverlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1",
"shared": "/applis/10824-r3f00/was/shared",
"sslarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/ssl",
"sslssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/ss",
"temp": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/temp",
"tranlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1/tranlog",
"was": "/applis/10824-r3f00/was",
"webservernossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
"webserverssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1"
},
"FileSystemDev": {
"fs_size": "2G",
"log_was_fs": "/applis/logs/10824-r3f00",
"log_was_lv_prez": "lv_przLog01",
"var_was_fs": "/applis/10824-r3f00",
"var_was_lv": "lv_prz01",
"vg_name": "vg_apps"
},
"UserGroup": {
"group": "web",
"user": "was"
},
"certificat": {
"keyfile": "someFile.kdb",
"stashfile": "someFile.sth"
},
"ihsEnvDev": {
"webnosecport": "10080",
"websecport": "10443"
},
"serverEnvDev": {
"admwasprofile": "/apps/WebSphere85S/profiles/svr",
"authalias": "authAlias",
"authaliaspassword": "DC%rop7j52s",
"authaliasuser": "R3FDEV_LINUX",
"bindingname1": "audit",
"bindingname2": "log4j",
"bindingname3": "local",
"bindingname4": "export",
"bindingname5": "refogws",
"bindingnamespace1": "config/audit",
"bindingnamespace2": "config/log4j",
"bindingnamespace3": "config/local",
"bindingnamespace4": "config/export",
"bindingnamespace5": "config/refogws",
"bindingvalue1": "file:///applis/10824-r3f/r3fdev01/conf/audit.properties",
"bindingvalue2": "file:///applis/10824-r3f/r3fdev01/conf/log4j.properties",
"bindingvalue3": "file:///applis/10824-r3f/r3fdev01/conf/firef.properties",
"bindingvalue4": "file:///applis/10824-r3f/r3fdev01/conf/export.properties",
"bindingvalue5": "file:///applis/10824-r3f/r3fdev01/conf/refogws.properties",
"code_application": "10824-r3f00",
"datasourcejndi": "jdbc/r3ffrp01",
"datasourcename": "p10824ap10",
"maxfiles": "10",
"maxfilesize": "10",
"memorymax": "8192",
"memorymin": "1024",
"propertiename": "com.ibm.tools.attach.enable",
"propertiename1": "com.ibm.websphere.servlet.temp.dir",
"propertiename2": "ebx.home",
"propertiename3": "ebx.properties",
"propertiename4": "java.awt.headless",
"propertiename5": "java.io.tmpdir",
"propertiename6": "loginpassword.authentication.disabled",
"propertievalue": "no",
"propertievalue1": "/applis/10824-r3f/was/sa-10824_r3f-prez-1/temp",
"propertievalue2": "/applis/10824-r3f/r3fprd01/ebx_home",
"propertievalue3": "/applis/10824-r3f/r3fprd01/ebx_home/ebx.properties",
"propertievalue4": "true",
"propertievalue5": "/applis/10824-r3f/r3fprd01/tmp",
"propertievalue6": "true",
"servername": "sa-10824-r3f00-prez-1",
"startingport": "12060",
"url": "http://firef-echonet.fr",
"urlds": "jdbc:oracle:thin:@D-10824-P.fr.net.intra:1521/D10824AP10",
"urlssl": "https://firef-echonet.fr",
"virtualhostname": "sa-10824-r3f00-prez-nossl-1_vh",
"virtualhostnamessl": "sa-10824-r3f00-prez-ssl-1_vh"
}
}
}
}
PLAY RECAP *********************************************************************
firefDev : ok=3 changed=0 unreachable=0 failed=0
When I run it on AWX, the result is this: there is no information in ansible_local.
![result](https://user-images.githubusercontent.com/29118527/32235030-a6b1c888-be5e-11e7-8ad3-a82e0276a663.png)
Is there some specific parameter I need to set?
TASK [Create web Dockerfile from template.] ************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Could not find or access '/opt/awx/installer/image_build/templates/Dockerfile.j2'"}
Example: https://travis-ci.org/geerlingguy/awx-container/builds/369155170
If I change the host port in awx_web, for example like this:
awx_web:
....
ports:
- "33380:8052"
the web interface stops working, with these errors:
(index):12 GET http://127.0.0.1:33380/static/css/vendor.c7c34fcde5e8a885ab9c.css net::ERR_ABORTED
(index):19 GET http://127.0.0.1:33380/static/js/app.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED
(index):17 GET http://127.0.0.1:33380/static/js/vendor.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED
(index):14 GET http://127.0.0.1:33380/static/css/app.c7c34fcde5e8a885ab9c.css net::ERR_ABORTED
(index):19 GET http://127.0.0.1:33380/static/js/app.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED
Should be able to test on Python 3.5 or 3.6 in Travis; also update the docs to remove warnings about installing from source.
Is it wise to run the awx_web and awx_task containers as root (as defined in the docker-compose file)? Would it be possible to run them otherwise? Thanks!
Hi
First, thanks for doing this; I was making this myself, but you beat me to it :)
Problem:
I could not access the web interface, even though the migration was complete.
When looking at the logs for awxweb, I could see:
nginx: [alert] could not open error log file: open() "/var/log/nginx/error.log" failed (13: Permission denied)
2017/09/09 14:06:35 [emerg] 81#0: mkdir() "/var/lib/nginx/tmp/client_body" failed (13: Permission denied)
And supervisor log:
2017-09-09 14:06:35,762 INFO exited: nginx (exit status 1; not expected)
2017-09-09 14:06:36,770 INFO gave up: nginx entered FATAL state, too many start retries too quickly
And I could see that it's running under UID 1000, even though root is set to UID 0 in /etc/passwd.
If I add "user: root" to the docker-compose file, under awxweb, it runs fine.
I didn't have this issue when I created the images from scratch from the official awx repo.
I also tested it on another Docker instance; same issue.
Both are running Debian 8 (jessie).
Have you run into this issue?
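For reference, the change that made it work for me looks like this in docker-compose.yml (a sketch of my setup; the service name and image tag are from my file and may differ in yours):

```yaml
# Only change from the stock compose file: run the container as root so
# nginx can open /var/log/nginx and /var/lib/nginx/tmp.
awxweb:
  image: geerlingguy/awx_web:latest
  user: root
```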
During and after AnsibleFest SF 2017, I started this project and wanted to quickly get a web and task container up on Docker Hub since there weren't officially-built/supported ones from Ansible.
However, there now exist official images which are also automated builds that sync up with AWX official releases:
At a minimum, this project should point out the existence of these official images in the README. Going further: while this project's goal is to be able to create fully-custom web and task containers following the more elaborate AWX install process, the included docker-compose.yml (highlighted in the README and in some highly-trafficked blog posts) is intended to give a simple and quick AWX environment for demo purposes.
It might be best to pin the docker-compose.yml to the latest official images. Then I can either deprecate the geerlingguy/ images, or figure out some way of making them maintainable via this project; see #1.
Hitting a loop of the following error in the awx_web container:
awx_web_1 | 2018-02-06 23:21:58,766 INFO spawned: 'daphne' with pid 189
awx_web_1 | 2018-02-06 23:21:59,753 ERROR Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.
awx_web_1 | Traceback (most recent call last):
awx_web_1 | File "/var/lib/awx/venv/awx/bin/daphne", line 11, in <module>
awx_web_1 | sys.exit(CommandLineInterface.entrypoint())
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/daphne/cli.py", line 144, in entrypoint
awx_web_1 | cls().run(sys.argv[1:])
awx_web_1 | File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/daphne/cli.py", line 174, in run
awx_web_1 | channel_layer = importlib.import_module(module_path)
awx_web_1 | File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
awx_web_1 | __import__(name)
awx_web_1 | File "/usr/lib/python2.7/site-packages/awx/asgi.py", line 31, in <module>
awx_web_1 | raise Exception("Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.")
awx_web_1 | Exception: Missing or incorrect metadata for Tower version. Ensure Tower was installed using the setup playbook.
awx_web_1 | 2018-02-06 23:21:59,810 INFO success: daphne entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
awx_web_1 | 2018-02-06 23:21:59,813 INFO exited: daphne (exit status 1; not expected)
Likely related to:
Currently, I've kind of hacked together the stuff inside the prebuild/ directory to be able to build the task and web containers. I would rather let ansible-container build do all the building.
It was a little tricky the first time I tried to get it to work (not sure if that was due to accidentally using Ansible Container 0.9.1 which is kinda buggy, or something else), so I gave up and went with the prebuild solution for now.
The main issue is that the AWX installer is pretty environment-specific (and in some places seemingly redundant), and it assumes you're building on a bare VM/metal, not in a container build process, so there are some things that don't work too well (IMO) and needed refactoring. Plus there are a bunch of files that need to be copied out from the Ansible AWX project.
Hi,
I tried to set up webhook notifications for both Microsoft Teams and Slack (to test whether the problem was the webhook or the notifications in general), but both failed when I test the notifications.
There are no logs on the host or in the container at /var/log/tower/; all the directories are empty.
Any ideas?
ordereddict([('from', 'postgres:9.6'), ('roles', []), ('environment', ['POSTGRES_USER={{ pg_username }}', 'POSTGRES_PASSWORD={{ pg_password }}', 'POSTGRES_DB={{ pg_database }}'])]) is not JSON serializable
From build: https://travis-ci.org/geerlingguy/awx-container/jobs/420152820#L1210
Issue: Indefinite error loop when awx apps are ready before postgres.
Workaround: Start postgres container, check log that postgres is ready for connections, then start up awx containers.
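The workaround above could be scripted along these lines (a sketch; the service name "postgres" and the readiness message are assumptions based on the official postgres:9.6 image, which logs that line once it is ready):

```shell
# Start only the database first (service name assumed to be "postgres").
docker-compose up -d postgres

# Poll the postgres log until it reports readiness; this is the message
# the official postgres image prints when it can accept connections.
until docker-compose logs postgres | grep -q "ready to accept connections"; do
  sleep 2
done

# Now bring up the remaining AWX containers.
docker-compose up -d
```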
I downloaded the provided docker-compose script and ran this command:
docker-compose up -d
Tailing the awx_task container, I get the below error message.
django.db.utils.ProgrammingError:
relation "main_schedule" does not exist
LINE 1: ...le"."next_run", "main_schedule"."extra_data" FROM "main_sche...
^
2017-09-15 19:57:27,432 INFO exited: celery (exit status 1; not expected)
[2017-09-15 19:57:28,234: INFO/MainProcess] Scheduler: Sending due task task_manager (awx.main.scheduler.tasks.run_task_manager)
2017-09-15 19:57:28,237 INFO spawned: 'celery' with pid 3421
[2017-09-15 19:57:28,238: DEBUG/MainProcess] awx.main.scheduler.tasks.run_task_manager sent. id->a3ddb65d-21c8-49f9-ba6c-6298e2ce67b6
[2017-09-15 19:57:28,239: DEBUG/MainProcess] beat: Waking up in 19.90 seconds.
2017-09-15 19:57:29,241 INFO success: celery entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2017-09-15 19:57:30,706 WARNING py.warnings /var/lib/awx/venv/awx/lib/python2.7/site-packages/django/core/management/base.py:260: RemovedInDjango19Warning: "requires_model_validation" is deprecated in favor of "requires_system_checks".
RemovedInDjango19Warning)
2017-09-15 19:57:30,749 INFO awx.main.tasks Syncing Schedules
2017-09-15 19:57:30,749 INFO awx.main.tasks Syncing Schedules
Traceback (most recent call last):
File "/usr/bin/awx-manage", line 9, in <module>
load_entry_point('awx==1.0.0.312', 'console_scripts', 'awx-manage')()
File "/usr/lib/python2.7/site-packages/awx/__init__.py", line 107, in manage
execute_from_command_line(sys.argv)
...
File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
return self.cursor.execute(sql, params)
django.db.utils.ProgrammingError: relation "main_schedule" does not exist
LINE 1: ...le"."next_run", "main_schedule"."extra_data" FROM "main_sche...
^
I'm not sure if this should be considered a bug, but I'll report in case others run into the same issue.
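If others hit this, one possible manual workaround (a sketch, not verified against this exact image) is to run the database migrations by hand inside the task container; awx-manage wraps Django's manage.py, so the standard migrate subcommand should apply. The container name awx_task is an assumption from my compose setup:

```shell
# Apply Django migrations inside the task container (container name assumed).
docker exec -i awx_task awx-manage migrate
```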