
AWX (Built with Ansible Container)

DEPRECATED: This project has been deprecated. Please use the official AWX containers and docker-compose.yml file to run AWX instead.


This project is in its early stages. There will be bugs!

This project is composed of three main parts:

  • Ansible Container project: This project is maintained on GitHub: geerlingguy/awx-container. Please file issues, support requests, etc. against this GitHub repository.
  • Docker Hub Images: If you just want to use the geerlingguy/awx_web and geerlingguy/awx_task images in your project, you can pull them from Docker Hub.
  • Ansible Role: If you need an Ansible role to build AWX, check out geerlingguy.awx on Ansible Galaxy. (This is the Ansible role that does the bulk of the work in managing the AWX container.)

Versions

Currently maintained versions include:

  • geerlingguy/awx_web:
    • 1.x, latest: AWX 1.x
    • 1.0.5
  • geerlingguy/awx_task:
    • 1.x, latest: AWX 1.x
    • 1.0.5

Quickstart - Standalone Usage with Docker Compose

If you just want to get an AWX environment running quickly, you can use the docker-compose.yml file included with this project to build a local environment accessible on http://localhost:80/:

mkdir awx-test && cd awx-test
curl -O https://raw.githubusercontent.com/geerlingguy/awx-container/master/docker-compose.yml
docker-compose up -d

The Docker Compose file uses community images for postgres, rabbitmq, and memcached, and the ansible/awx_web and ansible/awx_task images for AWX by default.

After the initial database migration completes (this can take a few minutes; follow the progress with docker logs -f [id-of-awx_task-container]), you will be able to access the AWX interface at http://localhost/. The default login is admin/password.

Note: Switch the image for the awx_web and awx_task containers in docker-compose.yml if you want to use the geerlingguy/ maintained images rather than the ones from ansible/. If you're just kicking AWX's tires though, stick with the defaults.
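If you do want to swap in the geerlingguy/ images, the change is just the image: lines in docker-compose.yml. A sketch (the service names awx_web and awx_task are assumed to match the compose file):

```yaml
# docker-compose.yml excerpt (sketch): point the AWX services at the
# geerlingguy/ images instead of the ansible/ defaults.
services:
  awx_web:
    image: geerlingguy/awx_web:latest
  awx_task:
    image: geerlingguy/awx_task:latest
```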

Management with Ansible Container

Prerequisites

Before using this project to build and maintain AWX images for Docker, you need Ansible, Ansible Container, and Docker installed.

Build the AWX images

Typically, you would build the images as specified in the container.yml file using ansible-container --var-file config.yml build, but in this case, since there are many dependencies bundled in the AWX repository, we will build the Docker images using a helper playbook:

$ cd prebuild/
$ ansible-galaxy install -r requirements.yml --force
$ ansible-playbook -i 'localhost,' -c local prebuild.yml

After this playbook runs, you should see two new Docker images (which we'll use in the Ansible Container definition):

$ docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             SIZE
awx_task            devel               26311794058d        29 seconds ago      938MB
awx_web             devel               3d38dccc9190        58 seconds ago      913MB

A Vagrantfile is included with this project to assist in building a clean environment with all the dependencies required to build the AWX images (in case you don't want to install everything on your local workstation!). To use it:

  1. Run vagrant up.
  2. Wait for Vagrant's provisioning to complete (it will run prebuild.yml automatically).
  3. Log in with vagrant ssh and use docker or ansible as needed.

Build the conductor

Build the conductor using ansible-container build:

ansible-container --var-file config.yml build

Note: If you get any permission errors trying to generate a Docker container, make sure you're either running the commands as root or with sudo, or your user is in the docker group (e.g. sudo usermod -G docker -a [user], then log out and log back in).

Run the containers

ansible-container --var-file config.yml run

You should be able to reach AWX by accessing http://localhost/ in your browser.

(Use ansible-container stop to stop the containers, and ansible-container destroy to remove the containers and all images.)

Push the containers to Docker Hub

Currently, the process for updating these images on Docker Hub is manual. Eventually this will be automated via Travis CI using ansible-container push (currently, this is waiting on an upstream issue to be resolved).

  1. Log into Docker Hub on the command line:

    docker login --username=geerlingguy
    
  2. Tag the latest version (only if this is the latest/default version):

    docker tag awx_web:devel geerlingguy/awx_web:latest
    docker tag awx_web:devel geerlingguy/awx_web:1.x
    docker tag awx_task:devel geerlingguy/awx_task:latest
    docker tag awx_task:devel geerlingguy/awx_task:1.x
    
  3. Push tags to Docker Hub:

    docker push geerlingguy/awx_web:latest # (if this was just tagged)
    docker push geerlingguy/awx_web:1.x
    [etc...]
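
The tag-and-push steps above can be sketched as a small dry-run script (it only prints the docker commands; pipe the output to sh, or run the lines by hand, to actually tag and push):

```shell
#!/bin/sh
# Print the docker tag/push commands for both images and both tags.
# This is a dry run: the commands are echoed, not executed.
print_push_commands() {
  for image in awx_web awx_task; do
    for tag in latest 1.x; do
      echo "docker tag ${image}:devel geerlingguy/${image}:${tag}"
      echo "docker push geerlingguy/${image}:${tag}"
    done
  done
}

print_push_commands
```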
    

License

MIT / BSD

Author Information

This container build was created in 2017 by Jeff Geerling, author of Ansible for DevOps.

awx-container's People

Contributors

aaronk1, daswars, geerlingguy, nebel54, vonpelz


awx-container's Issues

docker-compose up fails

When running docker-compose up -d I get some errors:

% docker logs docker-awx_awx_task_1                                     
/usr/bin/launch_awx_task.sh: line 8: /etc/tower/conf.d/environment.sh: No such file or directory                            
Using /etc/ansible/ansible.cfg as config file                                                           
127.0.0.1 | SUCCESS => {                                                                                               
    "changed": false,                                                                                      
    "elapsed": 2,                                                                                                         
    "path": null,                                                                         
    "port": 5432,                                                                                    
    "search_regex": null,                                                                                                                                                   
    "state": "started"                                                                           
}                                                                                                    
Using /etc/ansible/ansible.cfg as config file                                                        
127.0.0.1 | SUCCESS => {                                                                                        
    "changed": false,                                                                                        
    "elapsed": 0,                                                                                                   
    "path": null,                                                                                       
    "port": 11211,                                                                                                          
    "search_regex": null,                                                                                  
    "state": "started"                                                                                                 
}                                                                                     
Using /etc/ansible/ansible.cfg as config file                                                                             
127.0.0.1 | SUCCESS => {                                                                  
    "changed": false,                                                                                
    "elapsed": 1,                                                      
    "path": null,                                                                                    
    "port": 5672,                                                                     
    "search_regex": null,                                                                                       
    "state": "started"                       
}                                                                                                                   
Using /etc/ansible/ansible.cfg as config file               
127.0.0.1 | SUCCESS => {                                                                                                    
    "changed": false,                                                      
    "db": "awx"                                                                                                        
}                                                                          
Traceback (most recent call last):                                                                                        
  File "/usr/bin/awx-manage", line 11, in <module>                                        
    load_entry_point('awx==4.0.0.0', 'console_scripts', 'awx-manage')()                              
  File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/__init__.py", line 124, in manage
    prepare_env()                                                                                    
  File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/awx/__init__.py", line 89, in prepare_env
    if not settings.DEBUG: # pragma: no cover                                                                   
  File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 56, in __getattr__
    self._setup(name)                                                                                               
  File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 41, in _setup
    self._wrapped = Settings(settings_module)                                                                               
  File "/var/lib/awx/venv/awx/lib64/python3.6/site-packages/django/conf/__init__.py", line 129, in __init__
    raise ImproperlyConfigured("The SECRET_KEY setting must not be empty.")                                            
django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.

Also, when visiting http://localhost I get an Internal Server Error.

No proper login page after first docker-compose up

Hi,

When I do docker-compose up -d (on OSX), I see this in my Chrome browser:

(screenshot)

First I thought it was because awx_task was still busy, but even if I wait until it stops, the page stays the same.

If I do a docker-compose restart it all works:

(screenshot)

Any idea why this happens?

Problem with notifications

Hi,

I tried setting up a webhook notification for Microsoft Teams and one for Slack (to test whether the problem was the webhook or notifications in general), but both fail when I test the notifications.
There are no logs on the host or in the container; all the directories under /var/log/tower/ are empty.

Any ideas?

Unable to find '/opt/awx/dist/awx-1.0.0.367.tar.gz' in expected paths.

TASK [Stage sdist.] *******************************************************************************************
task path: /Users/shaun_smiley/Seafile/scripts/mtn/ansible/ansible-awx/image-build.yml:48
Thursday 14 September 2017  16:39:24 -0700 (0:00:00.033)       0:00:21.767 ****
fatal: [awx-vagrant-build]: FAILED! => {"changed": false, "failed": true, "msg": "Unable to find '/opt/awx/dist/awx-1.0.0.367.tar.gz' in expected paths."}

It looks like this is because remote_src: True is not specified.

Confirmed: after setting that parameter, the build continues successfully.
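For reference, the fix would go on the copy task in the image-build playbook; a sketch (only src and remote_src come from the report above, the task body and dest path here are hypothetical):

```yaml
# Sketch of the "Stage sdist." task with the missing parameter added.
- name: Stage sdist.
  copy:
    src: /opt/awx/dist/awx-1.0.0.367.tar.gz
    dest: /tmp/awx.tar.gz    # hypothetical destination
    remote_src: true         # look for src on the build host, not the controller
```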

No Job view

Hi, when running the docker-compose setup as of 2018-09-05, I cannot view the job logs, although the API works. This seems very similar to ansible/awx#1861.

Any tips?

Travis CI builds failing on 'Stage sdist' task with AWX 1.0.3.52

TASK [Stage sdist.] ************************************************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: AnsibleFileNotFound: Could not find or access '/opt/awx/dist/awx-1.0.3.52.tar.gz'
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Could not find or access '/opt/awx/dist/awx-1.0.3.52.tar.gz'"}

Example failed build: https://travis-ci.org/geerlingguy/awx-container/builds/338157010

ssh keys won't work unless you set ControlMaster=no

I set up credentials to use a private SSH key for the host machine, created a job template using those credentials and a simple playbook project, then ran the job in AWX. The job fails with an error message like:

...control_persist_detach: background process is 745\r\nControl socket connect(/tmp/awx_18_nTI3M6/cp/10.16.141.8522root): Connection refused\r\nFailed to connect to new control master\r\n"

I found this article saying that overlayfs doesn't play well with Unix sockets (and ansible/awx uses SSH multiplexing, which relies on Unix sockets, by default). It suggests changing the ControlPath to /dev/shm/.

As I started looking into that as an option, it looks like AWX overwrites the control_path_dir in ansible.cfg. The Tower docs say:

For example, Tower stores the SSH ControlMaster sockets, the SSH agent socket, and any other per-job run items in a per-job temporary directory, secured by multi-tenancy access control restrictions via PRoot.

The current workaround I have found is to include this line in the ansible.cfg file:

ssh_args = -o ControlMaster=no
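
For context, that option belongs in the [ssh_connection] section of ansible.cfg; a minimal sketch:

```ini
# ansible.cfg (sketch): disable SSH multiplexing so jobs don't
# depend on ControlMaster sockets on overlayfs.
[ssh_connection]
ssh_args = -o ControlMaster=no
```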

I also found an AWX setting called "AWX_PROOT_BASE_PATH": "/tmp" (the only place I see /tmp being set anywhere). I am wondering if this could be changed from /tmp to /dev/shm to get SSH multiplexing to work (I have not tested this yet). Otherwise, maybe a ticket should be opened in the AWX project to add a setting for this.

Maybe there should be a configuration setting that can be set in the container build, so that (1) these settings are documented and (2) there is an easy way to set them.

Keep up the good work and Thanks!

Dockerfile for awx_task?

Hello,
I'm trying to build a Debian-based Docker image and need to know where to fetch the awx-manage binary from.

geerlingguy.repo-epel error

When trying to build the image for AWX, I get an error on the task below. It looks like geerlingguy.repo-epel refers to https://dl.fedoraproject.org/pub/epel/epel-release-latest-NA.noarch.rpm, which is not found.

TASK [geerlingguy.repo-epel : Install EPEL repo.] ******************************
FAILED - RETRYING: Install EPEL repo. (5 retries left).
FAILED - RETRYING: Install EPEL repo. (4 retries left).
FAILED - RETRYING: Install EPEL repo. (3 retries left).
FAILED - RETRYING: Install EPEL repo. (2 retries left).
FAILED - RETRYING: Install EPEL repo. (1 retries left).
fatal: [localhost]: FAILED! => {"attempts": 5, "changed": false, "failed": true, "msg": "Failure downloading https://dl.fedoraproject.org/pub/epel/epel-release-latest-NA.noarch.rpm, HTTP Error 404: Not Found"}

Cannot run SCM Project update

Greetings,

First, thanks for the awesome docker image, makes getting set up a breeze!

I've added a project successfully, but every time I run the SCM update command I see the following error in the project stdout:

Using /etc/ansible/ansible.cfg as config file

PLAY [all] *********************************************************************

TASK [delete project directory before update] **********************************
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'getpwuid(): uid not found: 1000'
fatal: [localhost]: FAILED! => {"failed": true, "msg": "Unexpected failure during module execution.", "stdout": ""}

PLAY RECAP *********************************************************************
localhost                  : ok=0    changed=0    unreachable=0    failed=1   

I've tried with both my own Project and the default one included with the installation, both fail on that same step.

Any more info I can get you to be of use?

Once again, thanks!

How can I change the port from 80?

If I change the host port for awx_web, for example like this:

  awx_web:
....
    ports:
      - "33380:8052"

The web interface stops working, with these errors:

(index):12 GET http://127.0.0.1:33380/static/css/vendor.c7c34fcde5e8a885ab9c.css net::ERR_ABORTED
(index):19 GET http://127.0.0.1:33380/static/js/app.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED
(index):17 GET http://127.0.0.1:33380/static/js/vendor.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED
(index):14 GET http://127.0.0.1:33380/static/css/app.c7c34fcde5e8a885ab9c.css net::ERR_ABORTED
(index):19 GET http://127.0.0.1:33380/static/js/app.c7c34fcde5e8a885ab9c.js net::ERR_ABORTED

Docker swarm support

Hello,
I'm trying to deploy AWX in a swarm cluster using stack and it doesn't work well.

I already changed the links to a network using the overlay driver...
Did you already tried this?

Add Docker Compose file for standalone/quickstart usage

If someone (like myself) just wants to get a quick-and-dirty AWX instance running for testing, demonstration purposes, etc., it should be pretty easy to use a single Docker Compose file with the public Docker Hub images built from this repository.

Getting "Exception: Missing or incorrect metadata for Tower version."

Hitting a loop of the following error in the awx_web container:

awx_web_1    | 2018-02-06 23:21:58,766 INFO spawned: 'daphne' with pid 189
awx_web_1    | 2018-02-06 23:21:59,753 ERROR    Missing or incorrect metadata for Tower version.  Ensure Tower was installed using the setup playbook.
awx_web_1    | Traceback (most recent call last):
awx_web_1    |   File "/var/lib/awx/venv/awx/bin/daphne", line 11, in <module>
awx_web_1    |     sys.exit(CommandLineInterface.entrypoint())
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/daphne/cli.py", line 144, in entrypoint
awx_web_1    |     cls().run(sys.argv[1:])
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/daphne/cli.py", line 174, in run
awx_web_1    |     channel_layer = importlib.import_module(module_path)
awx_web_1    |   File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
awx_web_1    |     __import__(name)
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/asgi.py", line 31, in <module>
awx_web_1    |     raise Exception("Missing or incorrect metadata for Tower version.  Ensure Tower was installed using the setup playbook.")
awx_web_1    | Exception: Missing or incorrect metadata for Tower version.  Ensure Tower was installed using the setup playbook.
awx_web_1    | 2018-02-06 23:21:59,810 INFO success: daphne entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
awx_web_1    | 2018-02-06 23:21:59,813 INFO exited: daphne (exit status 1; not expected)

Likely related to:

Discussion: Migration Paths for AWX

Maybe this project could be extended to somehow provide an update path for the AWX containers?

As for AWX itself, productive use seems impossible, because AWX does not provide migration paths to newer versions, and thus no updates in production.

ansible/awx#138
https://groups.google.com/forum/#!msg/awx-project/PQLxKl5Rj9s/UGy-3VaCCQAJ

If the aim of this project is a production deployment of AWX, there should be a migration path between the main AWX releases (e.g. 1.0.2 -> 1.0.3). If not, feel free to close this 😄.

What's your opinion on that?

RuntimeError: project local_path user cannot be found in /var/lib/awx/projects

Hi,

Thanks for creating the Ansible AWX role and containers.

I am running the Ansible AWX role on Oracle VirtualBox. The role ran perfectly and I was able to access AWX. I added the playbooks manually and created a template to perform the job.

But while running the job template, I get the error below:

Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/awx/main/tasks.py", line 799, in run
cwd = self.build_cwd(instance, **kwargs)
File "/usr/lib/python2.7/site-packages/awx/main/tasks.py", line 1188, in build_cwd
(job.project.local_path, root))
RuntimeError: project local_path user cannot be found in /var/lib/awx/projects

I have created the directory inside the awx_web container. Do I have to define the project path somewhere in the settings? Like Ansible Tower, is there a settings file for this?

Run awx_web and awx_task not as root?

Is it wise to run the awx_web and awx_task containers as root (as defined in the docker-compose file)? Would it be possible to run them otherwise? Thanks!

no IPv6-support in awx_task

There is no IPv6 support within the awx_task container:

[root@awx awx]# ip a
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN qlen 1
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
2: tunl0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN qlen 1
    link/ipip 0.0.0.0 brd 0.0.0.0
3: gre0@NONE: <NOARP> mtu 1476 qdisc noop state DOWN qlen 1
    link/gre 0.0.0.0 brd 0.0.0.0
4: gretap0@NONE: <BROADCAST,MULTICAST> mtu 1462 qdisc noop state DOWN qlen 1000
    link/ether 00:00:00:00:00:00 brd ff:ff:ff:ff:ff:ff
5: ip_vti0@NONE: <NOARP> mtu 1332 qdisc noop state DOWN qlen 1
    link/ipip 0.0.0.0 brd 0.0.0.0
6: ip6_vti0@NONE: <NOARP> mtu 1500 qdisc noop state DOWN qlen 1
    link/tunnel6 :: brd ::
7: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN qlen 1
    link/sit 0.0.0.0 brd 0.0.0.0
8: ip6tnl0@NONE: <NOARP> mtu 1452 qdisc noop state DOWN qlen 1
    link/tunnel6 :: brd ::
9: ip6gre0@NONE: <NOARP> mtu 1448 qdisc noop state DOWN qlen 1
    link/[823] 00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00 brd 00:00:00:00:00:00:00:00:00:00:00:00:00:00:00:00
75: eth0@if76: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP
    link/ether 02:42:ac:12:00:06 brd ff:ff:ff:ff:ff:ff link-netnsid 0
    inet 172.18.0.6/16 scope global eth0
       valid_lft forever preferred_lft forever
[root@awx awx]# ping heise.de
PING heise.de (193.99.144.80) 56(84) bytes of data.
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=1 ttl=37 time=25.0 ms
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=2 ttl=37 time=24.0 ms
64 bytes from redirector.heise.de (193.99.144.80): icmp_seq=3 ttl=37 time=24.5 ms
^C
--- heise.de ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2006ms
rtt min/avg/max/mdev = 24.076/24.569/25.095/0.416 ms
[root@awx awx]# ping6 heise.de
connect: Cannot assign requested address

It is impossible for a playbook to connect to a host which is only accessible via IPv6.

There is a Docker config item enable_ipv6, but it seems incompatible with docker-compose:
https://docs.docker.com/compose/compose-file/#ipv4_address-ipv6_address

What's the trick to upgrade Ansible?

I'm using the latest image tag.

I tried:

/var/lib/awx/venv/ansible/bin/pip install --upgrade ansible

and I get ...

Error: [('/var/lib/awx/venv/ansible/lib/python2.7/site-packages/pycparser-2.17-py2.7.egg-info/PKG-INFO',

Thanks

Empty dir /var/lib/awx/public/static/

The frontend is not working due to an empty /var/lib/awx/public/static/ directory in the awx_web container.

A temporary (and ugly) workaround is to enter the container and set a symlink:

docker exec -ti awxcontainer_awx_web_1 bash
cd /var/lib/awx/public
rm -rf static
ln -s /usr/lib/python2.7/site-packages/awx/ui/static

Anyway, thanks heaps for your work, Jeff! 🏆

Make ansible-container build actually build the task/web containers

Currently, I've kind of hacked together the stuff inside the prebuild/ directory to be able to build the task and web containers. I would rather let ansible-container build do all the building.

It was a little tricky the first time I tried to get it to work (not sure if that was due to accidentally using Ansible Container 0.9.1 which is kinda buggy, or something else), so I gave up and went with the prebuild solution for now.

The main issue is that the AWX installer is pretty environment-specific (and in some places seemingly redundant), and it assumes you're building on a bare VM/metal, not in a container build process, so there are some things that don't work too well (IMO) and needed refactoring. Plus there are a bunch of files that need to be copied out from the Ansible AWX project.

can't access ansible_local

Hi, I'm new to AWX, but I spent the day trying to run this example, which works fine with the ansible command line but does not work in AWX:

$ cat pp.yml
# Main function for adding a group
- hosts: all
  become: yes
  become_method: su

  tasks:
    - name: re-read facts after adding custom_was fact
      setup: filter=ansible_local

    - name: print ansible_local
      debug: var=ansible_local

The result is:

root@s00vl9974554:/tools/ansible/ANSIBLE_GAO_DEV/ANSIBLE_GAO$ ansible-playbook pp.yml -Kk
SSH password:
SUDO password[defaults to SSH password]:

PLAY [firefDev] ****************************************************************

TASK [setup] **********

ok: [firefDev]

TASK [re-read facts after adding custom_was fact] ******************************
ok: [firefDev]

TASK [print ansible_local] *****************************************************
ok: [firefDev] => {
    "ansible_local": {
        "custom": {
            "Application": {
                "codeenv": "10824_r3f00"
            },
            "DirectoryDev": {
                "archives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives",
                "archivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives",
                "archivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives",
                "conf": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/conf",
                "confarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/conf",
                "confarchivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/conf",
                "confarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/conf",
                "confnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/conf",
                "confssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/conf",
                "htdocsarchivenossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/htdocs",
                "htdocsarchivessl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/htdocs",
                "htdocsnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/htdocs",
                "htdocsssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/htdocs",
                "ihs": "/applis/10824-r3f00/ihs",
                "installedapps": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/installedApps",
                "installedappsarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/installedApps",
                "jdbc": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/jdbc",
                "jdbcarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/jbdc",
                "lib": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/lib",
                "libarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/lib",
                "libraray": "/home/was/wasLibrary",
                "lognossl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
                "logssl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1",
                "pluginnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/plugin",
                "pluginssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/plugin",
                "serverdir": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1",
                "serverlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1",
                "shared": "/applis/10824-r3f00/was/shared",
                "sslarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/ssl",
                "sslssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/ss",
                "temp": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/temp",
                "tranlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1/tranlog",
                "was": "/applis/10824-r3f00/was",
                "webservernossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
                "webserverssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1"
            },
            "FileSystemDev": {
                "fs_size": "2G",
                "log_was_fs": "/applis/logs/10824-r3f00",
                "log_was_lv_prez": "lv_przLog01",
                "var_was_fs": "/applis/10824-r3f00",
                "var_was_lv": "lv_prz01",
                "vg_name": "vg_apps"
            },
            "UserGroup": {
                "group": "web",
                "user": "was"
            },
            "certificat": {
                "keyfile": "someFile.kdb",
                "stashfile": "someFile.sth"
            },
            "ihsEnvDev": {
                "webnosecport": "10080",
                "websecport": "10443"
            },
            "serverEnvDev": {
                "admwasprofile": "/apps/WebSphere85S/profiles/svr",
                "authalias": "authAlias",
                "authaliaspassword": "DC%rop7j52s",
                "authaliasuser": "R3FDEV_LINUX",
                "bindingname1": "audit",
                "bindingname2": "log4j",
                "bindingname3": "local",
                "bindingname4": "export",
                "bindingname5": "refogws",
                "bindingnamespace1": "config/audit",
                "bindingnamespace2": "config/log4j",
                "bindingnamespace3": "config/local",
                "bindingnamespace4": "config/export",
                "bindingnamespace5": "config/refogws",
                "bindingvalue1": "file:///applis/10824-r3f/r3fdev01/conf/audit.properties",
                "bindingvalue2": "file:///applis/10824-r3f/r3fdev01/conf/log4j.properties",
                "bindingvalue3": "file:///applis/10824-r3f/r3fdev01/conf/firef.properties",
                "bindingvalue4": "file:///applis/10824-r3f/r3fdev01/conf/export.properties",
                "bindingvalue5": "file:///applis/10824-r3f/r3fdev01/conf/refogws.properties",
                "code_application": "10824-r3f00",
                "datasourcejndi": "jdbc/r3ffrp01",
                "datasourcename": "p10824ap10",
                "maxfiles": "10",
                "maxfilesize": "10",
                "memorymax": "8192",
                "memorymin": "1024",
                "propertiename": "com.ibm.tools.attach.enable",
                "propertiename1": "com.ibm.websphere.servlet.temp.dir",
                "propertiename2": "ebx.home",
                "propertiename3": "ebx.properties",
                "propertiename4": "java.awt.headless",
                "propertiename5": "java.io.tmpdir",
                "propertiename6": "loginpassword.authentication.disabled",
                "propertievalue": "no",
                "propertievalue1": "/applis/10824-r3f/was/sa-10824_r3f-prez-1/temp",
                "propertievalue2": "/applis/10824-r3f/r3fprd01/ebx_home",
                "propertievalue3": "/applis/10824-r3f/r3fprd01/ebx_home/ebx.properties",
                "propertievalue4": "true",
                "propertievalue5": "/applis/10824-r3f/r3fprd01/tmp",
                "propertievalue6": "true",
                "servername": "sa-10824-r3f00-prez-1",
                "startingport": "12060",
                "url": "http://firef-echonet.fr",
                "urlds": "jdbc:oracle:thin:@D-10824-P.fr.net.intra:1521/D10824AP10",
                "urlssl": "https://firef-echonet.fr",
                "virtualhostname": "sa-10824-r3f00-prez-nossl-1_vh",
                "virtualhostnamessl": "sa-10824-r3f00-prez-ssl-1_vh"
            }
        },
        "custom_was": {
            "Application": {
                "codeenv": "10824_r3f00"
            },
            "DirectoryDev": {
                "archives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives",
                "archivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives",
                "archivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives",
                "conf": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/conf",
                "confarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/conf",
                "confarchivesnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/conf",
                "confarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/conf",
                "confnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/conf",
                "confssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/conf",
                "htdocsarchivenossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/archives/htdocs",
                "htdocsarchivessl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/htdocs",
                "htdocsnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/htdocs",
                "htdocsssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/htdocs",
                "ihs": "/applis/10824-r3f00/ihs",
                "installedapps": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/installedApps",
                "installedappsarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/installedApps",
                "jdbc": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/jdbc",
                "jdbcarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/jbdc",
                "lib": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/lib",
                "libarchives": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/archives/lib",
                "libraray": "/home/was/wasLibrary",
                "lognossl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
                "logssl": "/applis/logs/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1",
                "pluginnossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1/plugin",
                "pluginssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/plugin",
                "serverdir": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1",
                "serverlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1",
                "shared": "/applis/10824-r3f00/was/shared",
                "sslarchivesssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/archives/ssl",
                "sslssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1/ss",
                "temp": "/applis/10824-r3f00/was/sa-10824_r3f00-prez-1/temp",
                "tranlog": "/applis/logs/10824-r3f00/was/sa-10824_r3f00-prez-1/tranlog",
                "was": "/applis/10824-r3f00/was",
                "webservernossl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-nossl-1",
                "webserverssl": "/applis/10824-r3f00/ihs/sw-10824_r3f00-prez-ssl-1"
            },
            "FileSystemDev": {
                "fs_size": "2G",
                "log_was_fs": "/applis/logs/10824-r3f00",
                "log_was_lv_prez": "lv_przLog01",
                "var_was_fs": "/applis/10824-r3f00",
                "var_was_lv": "lv_prz01",
                "vg_name": "vg_apps"
            },
            "UserGroup": {
                "group": "web",
                "user": "was"
            },
            "certificat": {
                "keyfile": "someFile.kdb",
                "stashfile": "someFile.sth"
            },
            "ihsEnvDev": {
                "webnosecport": "10080",
                "websecport": "10443"
            },
            "serverEnvDev": {
                "admwasprofile": "/apps/WebSphere85S/profiles/svr",
                "authalias": "authAlias",
                "authaliaspassword": "DC%rop7j52s",
                "authaliasuser": "R3FDEV_LINUX",
                "bindingname1": "audit",
                "bindingname2": "log4j",
                "bindingname3": "local",
                "bindingname4": "export",
                "bindingname5": "refogws",
                "bindingnamespace1": "config/audit",
                "bindingnamespace2": "config/log4j",
                "bindingnamespace3": "config/local",
                "bindingnamespace4": "config/export",
                "bindingnamespace5": "config/refogws",
                "bindingvalue1": "file:///applis/10824-r3f/r3fdev01/conf/audit.properties",
                "bindingvalue2": "file:///applis/10824-r3f/r3fdev01/conf/log4j.properties",
                "bindingvalue3": "file:///applis/10824-r3f/r3fdev01/conf/firef.properties",
                "bindingvalue4": "file:///applis/10824-r3f/r3fdev01/conf/export.properties",
                "bindingvalue5": "file:///applis/10824-r3f/r3fdev01/conf/refogws.properties",
                "code_application": "10824-r3f00",
                "datasourcejndi": "jdbc/r3ffrp01",
                "datasourcename": "p10824ap10",
                "maxfiles": "10",
                "maxfilesize": "10",
                "memorymax": "8192",
                "memorymin": "1024",
                "propertiename": "com.ibm.tools.attach.enable",
                "propertiename1": "com.ibm.websphere.servlet.temp.dir",
                "propertiename2": "ebx.home",
                "propertiename3": "ebx.properties",
                "propertiename4": "java.awt.headless",
                "propertiename5": "java.io.tmpdir",
                "propertiename6": "loginpassword.authentication.disabled",
                "propertievalue": "no",
                "propertievalue1": "/applis/10824-r3f/was/sa-10824_r3f-prez-1/temp",
                "propertievalue2": "/applis/10824-r3f/r3fprd01/ebx_home",
                "propertievalue3": "/applis/10824-r3f/r3fprd01/ebx_home/ebx.properties",
                "propertievalue4": "true",
                "propertievalue5": "/applis/10824-r3f/r3fprd01/tmp",
                "propertievalue6": "true",
                "servername": "sa-10824-r3f00-prez-1",
                "startingport": "12060",
                "url": "http://firef-echonet.fr",
                "urlds": "jdbc:oracle:thin:@D-10824-P.fr.net.intra:1521/D10824AP10",
                "urlssl": "https://firef-echonet.fr",
                "virtualhostname": "sa-10824-r3f00-prez-nossl-1_vh",
                "virtualhostnamessl": "sa-10824-r3f00-prez-ssl-1_vh"
            }
        }
    }
}

PLAY RECAP *********************************************************************
firefDev                   : ok=3    changed=0    unreachable=0    failed=0

When I run it on AWX, the result is shown here:

There is no information in ansible_local.

![result](https://user-images.githubusercontent.com/29118527/32235030-a6b1c888-be5e-11e7-8ad3-a82e0276a663.png)

Is there some specific parameter I need to set?

Vagrant build fails at task EPEL repo installation

TASK [geerlingguy.repo-epel : Install EPEL repo.] ******************************
FAILED - RETRYING: Install EPEL repo. (5 retries left).
FAILED - RETRYING: Install EPEL repo. (4 retries left).
FAILED - RETRYING: Install EPEL repo. (3 retries left).
FAILED - RETRYING: Install EPEL repo. (2 retries left).
FAILED - RETRYING: Install EPEL repo. (1 retries left).
fatal: [awx_container]: FAILED! => {"attempts": 5, "changed": false, "msg": "", "rc": 0, "results": ["epel-release-7-11.noarch providing /tmp/epel-release-latest-7.noarchD25pNv.rpm is already installed"]}
        to retry, use: --limit @/vagrant/prebuild/prebuild.retry

PLAY RECAP *********************************************************************
awx_container              : ok=3    changed=0    unreachable=0    failed=1

Ansible failed to complete successfully. Any error output should be
visible above. Please fix these errors and try again.

This seems to be due to Ansible 2.5:

geerlingguy/ansible-role-repo-epel#26

A PR has been raised, waiting for merge.

Docker Compose inside Vagrant VM is broken

Basically, run sudo pip install docker-compose, then:

# docker-compose up -d
WARNING: Dependency conflict: an older version of the 'docker-py' package may be polluting the namespace. If you're experiencing crashes, run the following command to remedy the issue:
pip uninstall docker-py; pip uninstall docker; pip install docker
Pulling memcached (memcached:alpine)...
Traceback (most recent call last):
  File "/bin/docker-compose", line 11, in <module>
    sys.exit(main())
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 68, in main
    command()
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 121, in perform_command
    handler(command, command_options)
  File "/usr/lib/python2.7/site-packages/compose/cli/main.py", line 938, in up
    scale_override=parse_scale_args(options['--scale']),
  File "/usr/lib/python2.7/site-packages/compose/project.py", line 430, in up
    svc.ensure_image_exists(do_build=do_build)
  File "/usr/lib/python2.7/site-packages/compose/service.py", line 311, in ensure_image_exists
    self.pull()
  File "/usr/lib/python2.7/site-packages/compose/service.py", line 1024, in pull
    output = self.client.pull(repo, tag=tag, stream=True)
  File "/usr/lib/python2.7/site-packages/docker/api/image.py", line 381, in pull
    header = auth.get_config_header(self, registry)
AttributeError: 'module' object has no attribute 'get_config_header'

It looks like it's because docker-py is currently being installed, instead of just docker (via pip). See docker/docker-py#1353 (comment)

Problem when running playbook - Name or service not known

ERROR! Attempted to execute "/usr/lib/python2.7/site-packages/awx/plugins/inventory/awxrest.py" as inventory script: Inventory script (/usr/lib/python2.7/site-packages/awx/plugins/inventory/awxrest.py) had an execution error: HTTPConnectionPool(host='awxweb', port=8052): Max retries exceeded with url: /api/v1/inventories/2/script/?hostvars=1 (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x2220e10>: Failed to establish a new connection: [Errno -2] Name or service not known',))

This error occurs when I try to run a template/playbook.

If I create an alias for the awx_web link on the awx_task Compose service, I get this error instead:

fatal: [XXX.dk]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: No user exists for uid 1000\r\n", "unreachable": true}

So I added user: root to the awx_task service, and it ran successfully!

My awx_task service definition now looks like this (the user: root line is what I added):

awx_task:
  image: "geerlingguy/awx_task:latest"
  links:
    - rabbitmq
    - memcached
    - awx_web:awxweb
    - postgres
  hostname: awx
  user: root
  container_name: awx_task
  environment:
    SECRET_KEY: aabbcc
    DATABASE_USER: awx
    DATABASE_PASSWORD: awxpass
    DATABASE_NAME: awx
    DATABASE_PORT: 5432
    DATABASE_HOST: postgres
    RABBITMQ_USER: guest
    RABBITMQ_PASSWORD: guest
    RABBITMQ_HOST: rabbitmq
    RABBITMQ_PORT: 5672
    RABBITMQ_VHOST: awx
    MEMCACHED_HOST: memcached
    MEMCACHED_PORT: 11211

I could create a pull request, but I want to know if anyone else is seeing this error, and whether this fix helped.

I get this when I try to load my inventory through AWS

django.db.utils.OperationalError: could not translate host name "postgres" to address: Name or service not known

I have my whole system configured on localhost. Everything else is working fine, but I cannot seem to load any inventory, even through git. The inventory only loads when I put it in manually.

Can not access the /api path (unable to login)

I don't have a patch yet, but access to the /api path (which is required for logging in) is not available unless you use the hostname that is configured in the /etc/tower/settings.py file.

The fix here is probably to expose a default variable that allows you to override the hostname. The workaround for now is simply to add awxweb to your /etc/hosts file, pointing at your server's IP address.
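Concretely, the workaround is a single line in /etc/hosts on the machine you browse from (the IP below is a placeholder; substitute your AWX server's address):

```
# /etc/hosts — point the awxweb hostname at your AWX server
203.0.113.10   awxweb
```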

Docker Compose Error

This may be an error with AWX itself, but when I run from docker-compose I receive this error when trying to log in:

awx_web_1    | 2017-09-08 20:09:15,047 ERROR    django.request Internal Server Error: /api/v2/auth/
awx_web_1    | Traceback (most recent call last):
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/core/handlers/base.py", line 132, in get_response
awx_web_1    |     response = wrapped_callback(request, *callback_args, **callback_kwargs)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/utils/decorators.py", line 145, in inner
awx_web_1    |     return func(*args, **kwargs)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/views/decorators/csrf.py", line 58, in wrapped_view
awx_web_1    |     return view_func(*args, **kwargs)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/views/generic/base.py", line 71, in view
awx_web_1    |     return self.dispatch(request, *args, **kwargs)
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/api/generics.py", line 246, in dispatch
awx_web_1    |     return super(APIView, self).dispatch(request, *args, **kwargs)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/rest_framework/views.py", line 449, in dispatch
awx_web_1    |     request = self.initialize_request(request, *args, **kwargs)
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/api/generics.py", line 107, in initialize_request
awx_web_1    |     settings.PROXY_IP_WHITELIST,
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/conf/__init__.py", line 49, in __getattr__
awx_web_1    |     return getattr(self._wrapped, name)
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/conf/settings.py", line 386, in __getattr__
awx_web_1    |     value = self._get_local(name)
awx_web_1    |   File "/usr/lib64/python2.7/contextlib.py", line 35, in __exit__
awx_web_1    |     self.gen.throw(type, value, traceback)
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/conf/settings.py", line 64, in _log_database_error
awx_web_1    |     if get_tower_migration_version() < '310':
awx_web_1    |   File "/usr/lib/python2.7/site-packages/awx/main/utils/db.py", line 13, in get_tower_migration_version
awx_web_1    |     loader = MigrationLoader(connection, ignore_no_migrations=True)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/loader.py", line 47, in __init__
awx_web_1    |     self.build_graph()
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/loader.py", line 191, in build_graph
awx_web_1    |     self.applied_migrations = recorder.applied_migrations()
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/recorder.py", line 59, in applied_migrations
awx_web_1    |     self.ensure_schema()
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/migrations/recorder.py", line 49, in ensure_schema
awx_web_1    |     if self.Migration._meta.db_table in self.connection.introspection.table_names(self.connection.cursor()):
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/base/introspection.py", line 58, in table_names
awx_web_1    |     return get_names(cursor)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/base/introspection.py", line 53, in get_names
awx_web_1    |     return sorted(ti.name for ti in self.get_table_list(cursor)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/postgresql_psycopg2/introspection.py", line 53, in get_table_list
awx_web_1    |     AND pg_catalog.pg_table_is_visible(c.oid)""")
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
awx_web_1    |     return self.cursor.execute(sql, params)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/utils.py", line 98, in __exit__
awx_web_1    |     six.reraise(dj_exc_type, dj_exc_value, traceback)
awx_web_1    |   File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 62, in execute
awx_web_1    |     return self.cursor.execute(sql)
awx_web_1    | InternalError: current transaction is aborted, commands ignored until end of transaction block

I'm not sure where this is coming from. I noticed that in the AWX config the port is set to 80, but it is actually serving on port 8052 and then being mapped to 80 with Compose. Is this correct, or related?

I'm also seeing this quite a bit in the database migrations:

ERROR:  relation "conf_setting" does not exist at character 158

Thanks
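For reference, the mapping described above (the container serving on 8052, published as port 80 on the host) looks like this in Compose syntax; the service name here is an assumption:

```yaml
awx_web:
  ports:
    - "80:8052"   # host port 80 -> container port 8052
```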

Point out availability of official AWX web and task containers

During and after AnsibleFest SF 2017, I started this project and wanted to quickly get a web and task container up on Docker Hub since there weren't officially-built/supported ones from Ansible.

However, there now exist official images which are also automated builds that sync up with AWX official releases:

At a minimum, this project should point out the existence of these official images in the README. Going further: while this project's goal is to be able to create the web and task containers fully custom, following the more elaborate AWX install process, the included docker-compose.yml (highlighted in the README and in some highly-trafficked blog posts) is intended to give a simple and quick AWX environment for demo purposes.

It might be best to pin to the latest official images for the docker-compose.yml. Then I can either deprecate the geerlingguy/ images, or figure out some way of making them maintainable via this project; see #1.

Provided docker-compose errors indefinitely if postgres startup is slow

Issue: Indefinite error loop when the awx apps are ready before postgres.
Workaround: Start the postgres container, check the log to confirm postgres is ready for connections, then start the awx containers.

I downloaded the provided docker-compose script and ran the command:
docker-compose up -d

Tailing the awx_task container, I get the error message below.

django.db.utils.ProgrammingError:

 relation "main_schedule" does not exist
LINE 1: ...le"."next_run", "main_schedule"."extra_data" FROM "main_sche...
                                                             ^

2017-09-15 19:57:27,432 INFO exited: celery (exit status 1; not expected)
[2017-09-15 19:57:28,234: INFO/MainProcess] Scheduler: Sending due task task_manager (awx.main.scheduler.tasks.run_task_manager)
2017-09-15 19:57:28,237 INFO spawned: 'celery' with pid 3421
[2017-09-15 19:57:28,238: DEBUG/MainProcess] awx.main.scheduler.tasks.run_task_manager sent. id->a3ddb65d-21c8-49f9-ba6c-6298e2ce67b6
[2017-09-15 19:57:28,239: DEBUG/MainProcess] beat: Waking up in 19.90 seconds.
2017-09-15 19:57:29,241 INFO success: celery entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
2017-09-15 19:57:30,706 WARNING  py.warnings /var/lib/awx/venv/awx/lib/python2.7/site-packages/django/core/management/base.py:260: RemovedInDjango19Warning: "requires_model_validation" is deprecated in favor of "requires_system_checks".
  RemovedInDjango19Warning)

2017-09-15 19:57:30,749 INFO     awx.main.tasks Syncing Schedules
2017-09-15 19:57:30,749 INFO     awx.main.tasks Syncing Schedules
Traceback (most recent call last):
  File "/usr/bin/awx-manage", line 9, in <module>
    load_entry_point('awx==1.0.0.312', 'console_scripts', 'awx-manage')()
  File "/usr/lib/python2.7/site-packages/awx/__init__.py", line 107, in manage
    execute_from_command_line(sys.argv)
...
  File "/var/lib/awx/venv/awx/lib/python2.7/site-packages/django/db/backends/utils.py", line 64, in execute
    return self.cursor.execute(sql, params)
django.db.utils.ProgrammingError: relation "main_schedule" does not exist
LINE 1: ...le"."next_run", "main_schedule"."extra_data" FROM "main_sche...
                                                             ^

I'm not sure if this should be considered a bug, but I'll report it in case others run into the same issue.
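The workaround above (start postgres, confirm it is accepting connections, then start the AWX containers) can be scripted. Here is a minimal sketch of a generic wait-for-port helper — not part of this project — that you could run between `docker-compose up -d postgres` and bringing up the AWX services, assuming the postgres port is reachable from wherever the script runs:

```python
import socket
import time

def wait_for_port(host, port, timeout=60.0):
    """Poll until a TCP port accepts connections; raise TimeoutError otherwise."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            # create_connection performs the full TCP handshake, so a success
            # means something is actually listening on host:port.
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            time.sleep(0.5)
    raise TimeoutError("%s:%s not reachable after %ss" % (host, port, timeout))
```

For example, `wait_for_port("localhost", 5432)` before starting the awx_web and awx_task containers. (Note this only proves the TCP socket is open; postgres can accept connections slightly before it finishes crash recovery, so checking the container log is still the more reliable signal.)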

docker-compose

After following the instructions:

curl -O https://raw.githubusercontent.com/geerlingguy/awx-container/master/docker-compose.yml
docker-compose up -d

I get this message and it never goes away:
AWX is currently upgrading or installing, this page will refresh when done.

awx_web running as uid 1000 but should be 0?

Hi

First, thanks for doing this. I was making this myself, but you beat me to it :)

Problem:
I could not access the web interface, even though the migration was complete.

When looking at the logs for awxweb, I could see:

nginx: [alert] could not open error log file: open() "/var/log/nginx/error.log" failed (13: Permission denied)
2017/09/09 14:06:35 [emerg] 81#0: mkdir() "/var/lib/nginx/tmp/client_body" failed (13: Permission denied)

And in the supervisor log:

2017-09-09 14:06:35,762 INFO exited: nginx (exit status 1; not expected)
2017-09-09 14:06:36,770 INFO gave up: nginx entered FATAL state, too many start retries too quickly

And I could see that it is running under UID 1000, even though root is set to UID 0 in /etc/passwd.

If I add user: root to the docker-compose file, under awxweb, it runs fine. And I didn't have this issue when I created the images from scratch from the official awx repo.

I also tested it on another Docker instance: same issue. Both are running Debian 8 (jessie).

Have you run into this issue?
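The workaround the reporter describes (running the web container as root so nginx can open its log and temp directories) would look like this in the project's docker-compose.yml; this is a sketch, with the service name and image assumed from the README:

```yaml
awx_web:
  image: "geerlingguy/awx_web:latest"
  user: root   # workaround: lets nginx write /var/log/nginx and /var/lib/nginx/tmp
```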
