This issue was originally opened by @SteveTalbot as hashicorp/packer#6347. It was migrated here as a result of the Packer plugin split. The original body of the issue is below.
Affected version: Packer 1.2.4
Host platform: Ubuntu 16.04 LTS, Jenkins slave
Builder: amazon-chroot
Provisioner: ansible-local
with Ansible v2.5.4.0
This was originally raised as hashicorp/packer#5335. A software bug relating to the escaping of special characters was fixed as a result of that issue; this ticket captures the remaining incompatibility between amazon-chroot and ansible-local. The problem is entirely with the documentation, not with the Packer software itself.
The following Packer configuration causes Ansible to fail with the error "provided hosts list is empty, only localhost is available" when used with the Amazon chroot builder.
```json
{
  "builders": [
    {
      "type": "amazon-chroot",
      "name": "amazon-chroot-ansible-local",
      "ami_name": "packer-bug-demo/amazon-chroot/ansible-local",
      "ami_description": "Demo to reproduce Packer issue; based on Ubuntu 16.04 LTS (HVM)",
      "source_ami": "ami-58d7e821",
      "chroot_mounts": [
        ["proc", "proc", "/proc"],
        ["sysfs", "sysfs", "/sys"],
        ["bind", "/dev", "/dev"],
        ["bind", "/dev/pts", "/dev/pts"],
        ["bind", "/dev/shm", "/dev/shm"],
        ["binfmt_misc", "binfmt_misc", "/proc/sys/fs/binfmt_misc"]
      ],
      "command_wrapper": "sudo {{.Command}}",
      "copy_files": [
        "/etc/resolv.conf"
      ],
      "ena_support": true,
      "force_deregister": true,
      "sriov_support": true
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "execute_command": "chmod +x {{.Path}}; {{.Vars}} sudo -E sh \"{{.Path}}\"",
      "script": "{{template_dir}}/chroot/pre.sh"
    },
    {
      "type": "ansible-local",
      "command": "ANSIBLE_FORCE_COLOR=1 PYTHONUNBUFFERED=1 ANSIBLE_LOCAL_TEMP=/tmp/ansible ANSIBLE_REMOTE_TEMP=/tmp/ansible-managed ANSIBLE_ROLES_PATH=/tmp/packer-provisioner-ansible-local/galaxy_roles:/etc/ansible/roles ansible-playbook",
      "playbook_dir": "{{template_dir}}/ansible",
      "playbook_file": "{{template_dir}}/ansible/playbook.yml",
      "staging_directory": "/tmp/packer-provisioner-ansible-local",
      "extra_arguments": [
        "--tags=install,package",
        "-vvv"
      ]
    },
    {
      "type": "shell",
      "execute_command": "chmod +x {{.Path}}; {{.Vars}} sudo -E sh \"{{.Path}}\"",
      "script": "{{template_dir}}/chroot/post.sh"
    },
    {
      "type": "shell-local",
      "command": "sleep 5"
    }
  ]
}
```
The `pre.sh` script prevents services from being started within the chroot and installs Ansible and its dependencies. The `post.sh` script cleans up afterwards. Otherwise, you don't need to worry about what they're doing for the purposes of this example.
Packer runs a command that looks like the following:
```shell
cd /tmp/packer-provisioner-ansible-local && ANSIBLE_FORCE_COLOR=1 PYTHONUNBUFFERED=1 ANSIBLE_LOCAL_TEMP=/tmp/ansible ANSIBLE_REMOTE_TEMP=/tmp/ansible-managed ANSIBLE_ROLES_PATH=/tmp/packer-provisioner-ansible-local/galaxy_roles:/etc/ansible/roles ansible-playbook /tmp/packer-provisioner-ansible-local/playbook.yml --extra-vars "packer_build_name=amazon-chroot-ansible-local packer_builder_type=amazon-chroot packer_http_addr=" --tags=install,package -vvv -c local -i /tmp/packer-provisioner-ansible-local/packer-provisioner-ansible-local117135750
```
The reason for the failure is that Packer's ansible-local provisioner builds an inventory file containing `localhost` (an "implicit localhost") rather than specifying `-i localhost,` on the ansible-playbook command line (an "explicit localhost"). Recent versions of Ansible do not include an implicit localhost in the "all" hosts group, so none of the playbook tasks are run.
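To make the distinction concrete, an explicit localhost is one that is named deliberately as a target, either in an inventory file entry like the following (illustrative only, not the exact file Packer generates):

```ini
# Explicit localhost entry in an inventory file.
# ansible_connection=local tells Ansible to run tasks directly,
# without SSH.
localhost ansible_connection=local
```

or via `-i localhost,` on the command line, where the trailing comma makes Ansible treat the argument as a literal host list rather than an inventory file path.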
Firstly, the Packer documentation does say "Building within a chroot (e.g. amazon-chroot) requires changing the Ansible connection to chroot", but it says this on the documentation page for the Ansible remote provisioner. It would be helpful if this were also included in the "gotchas" documentation for the amazon-chroot builder and in the documentation for the Ansible local provisioner.
Secondly, for the benefit of anyone who might stumble across this ticket in future, it is possible to use Ansible in local mode with a chroot, but you need to:

- Use a file provisioner to copy the playbook into the chroot
- Manually construct an inventory file
- Use a file provisioner to copy the inventory file to `/etc/ansible/hosts` in the chroot
- Use a shell provisioner to execute Ansible in local mode within the chroot
So an example working configuration might look like:
```json
"provisioners": [
  {
    "type": "shell",
    "execute_command": "chmod +x {{.Path}}; {{.Vars}} sudo -E sh \"{{.Path}}\"",
    "script": "{{template_dir}}/chroot/pre.sh"
  },
  {
    "type": "file",
    "source": "{{template_dir}}/ansible",
    "destination": "/etc/ansible/local-playbook"
  },
  {
    "type": "file",
    "source": "{{template_dir}}/ansible/inventories/local.ini",
    "destination": "/etc/ansible/hosts"
  },
  {
    "type": "shell",
    "inline": [
      "sudo -E ANSIBLE_FORCE_COLOR=1 PYTHONUNBUFFERED=1 ANSIBLE_LOCAL_TEMP=/tmp/ansible ANSIBLE_REMOTE_TEMP=/tmp/ansible-managed ANSIBLE_ROLES_PATH=/etc/ansible/local-playbook/galaxy_roles:/etc/ansible/roles ansible-playbook --extra-vars \"packer_build_name={{build_name}} packer_builder_type={{build_type}}\" --tags=install,package -vvv /etc/ansible/local-playbook/playbook.yml"
    ]
  }
]
```
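The `local.ini` inventory file copied to `/etc/ansible/hosts` above is not shown in this report; a minimal version (an assumption on my part) that keeps Ansible targeting the machine it runs on could be as simple as:

```ini
# Hypothetical content of inventories/local.ini: a single explicit
# localhost entry with a local (non-SSH) connection.
localhost ansible_connection=local
```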
We have often found it necessary to specify the ANSIBLE_LOCAL_TEMP and ANSIBLE_REMOTE_TEMP environment variables as part of the ansible-playbook command, but your mileage may vary.
Theoretically, it would be possible to add an option to the ansible-local provisioner to specify the location of the generated inventory file within the chroot (it would have to be `/etc/ansible/hosts`), but, as per the documentation, it is probably better to use the ansible provisioner in chroot connection mode instead.
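For completeness, that recommended alternative looks roughly like the following sketch. It is untested here, and the playbook path is a placeholder; the exact options should be checked against the current Ansible (remote) provisioner documentation:

```json
{
  "type": "ansible",
  "playbook_file": "./ansible/playbook.yml",
  "extra_arguments": ["--connection=chroot"]
}
```

With `--connection=chroot`, Ansible executes tasks inside the chroot directly instead of over SSH, which sidesteps the inventory problem described above.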
In summary, a couple of additions to the amazon-chroot and ansible-local documentation would help stop people getting stuck on the same problem I did.