
packer-plugin-cnspec's People

Contributors

benr, charlesjohnson, chris-rock, czunker, dependabot[bot], ericreeves, github-actions[bot], imilchev, mariuskimmina, misterpantz, nywilken, preslavgerchev, scottford-io, tas50, username-is-already-taken2, vjeffrey


packer-plugin-cnspec's Issues

Assets being overwritten. What determines unique Assets?

Describe the bug
Assets in Mondoo Inventory are being overwritten each time I run a build. I am setting asset_name uniquely, and my assumption is that assets are identified based on the asset name?

To Reproduce
Steps to reproduce the behavior:

  1. Create a Packer file (similar setup to these packer examples: https://github.com/vmware-samples/packer-examples-for-vsphere/blob/develop/builds/windows/server/2022/windows-server.pkr.hcl)
  2. Add cnspec provisioner, setting asset_name to the vm name + build_date
  3. Run build for Windows Server 2022 Datacenter. Asset shows up in Mondoo Console as packer-updated-windows-server-2022-datacenter-2024-04-30 14:27 UTC
  4. Run build for Windows Server 2019 Datacenter. Asset shows up in Mondoo Console as packer-updated-windows-server-2019-datacenter-2024-04-30 14:38 UTC. This asset has the same inventory ID in the URL of the console. The 2022 one is overwritten
locals {
  build_date = formatdate("YYYY-MM-DD hh:mm ZZZ", timestamp())

  ...

  vm_name_datacenter_updated = "packer-updated-${var.vm_guest_os_family}-${var.vm_guest_os_name}-${var.vm_guest_os_version}-${var.vm_guest_os_edition_datacenter}"
}

...

build {
  provisioner "cnspec" {
    only               = ["vsphere-clone.windows-server-datacenter-update"]
    winrm_user         = var.build_username
    winrm_password     = var.build_password
    asset_name         = "${local.vm_name_datacenter_updated}-${local.build_date}"
    host_alias         = "${local.vm_name_datacenter_updated}-${local.build_date}"
    on_failure         = "continue"
    score_threshold    = 85
    mondoo_config_path = "${path.cwd}/scripts/mondoo.json"
  }
}

Expected behavior
Because the Asset name is being set uniquely, including timestamp, I would expect a new Asset to show up in Mondoo Inventory on each build.

Screenshots or CLI Output
(screenshots attached)

Additional context
Provisioner version 11.1.1

Cannot detect custom `policybundle`

I'm trying to write some basic custom checks that will run as part of my Packer build. Following the policy author guide, I basically just copy/pasted the file into a directory to see if it would run...

$ tree
.
|-- README.md
|-- centos
|   |-- centos.pkr.hcl
|   `-- variables.pkr.hcl
|-- cnspec-tests
|   `-- buildtime_check.mql.yaml

# packer build command I'm using
$ packer build ./centos

provisioner definition:

  provisioner "cnspec" {
    policybundle = "${path.root}/../cnspec-tests/"
    # also tried without a trailing slash
    # also tried with the file name, like:
    # "${path.root}/../cnspec-tests/buildtime_check.mql.yaml"

    score_threshold = 1
  }

Output:

==> azure-arm.centos: Running cnspec packer provisioner by Mondoo (Version: 8.0.1, Build: 0f7366c)
    azure-arm.centos: detected packer build via ssh
    azure-arm.centos: load policy bundle from: ./centos/../cnspec-tests/
    azure-arm.centos: no configuration provided
    azure-arm.centos: scan packer build in incognito mode
    azure-arm.centos: scan completed successfully
    azure-arm.centos: Asset: pkrvmycnnkad0fq
    azure-arm.centos: ----------------------
    azure-arm.centos: 
    azure-arm.centos: error: bundle doesn't contain any policies
    azure-arm.centos: 
    azure-arm.centos: The following policies are available:
    azure-arm.centos: - 
    azure-arm.centos: 
    azure-arm.centos: User selected policies that are allowed to run:
    azure-arm.centos: -

Even though I posted the tree above, here's confirmation that the file is where cnspec is looking:

$ ll ./centos/../cnspec-tests/
-rw-r--r--  1 root root 725 Apr 10 18:11 buildtime_check.mql.yaml
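For what it's worth, the error "bundle doesn't contain any policies" reads as though the YAML parsed but contained no top-level policies list. Here's a minimal sketch of the bundle shape I'd expect cnspec to accept — the uids, titles, and MQL below are purely illustrative, not taken from the policy author guide:

```yaml
# illustrative policy bundle sketch; uids, names, and the MQL query are assumptions
policies:
  - uid: buildtime-policy
    name: Build-time checks
    version: 1.0.0
    groups:
      - title: Basic checks
        checks:
          - uid: tmp-dir-exists
queries:
  - uid: tmp-dir-exists
    title: Ensure /tmp exists
    mql: file("/tmp").exists
```

If the bundle file that triggered the error lacks a `policies:` key (for example, it only defines `queries:`), that would explain the empty "The following policies are available:" list in the output above.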

plugin cannot gather the full report

Describe the bug
Different error messages while running the plugin:

failed to handshake
could not gather the full report


Expected behavior
This shouldn't error.

Desktop (please complete the following information):

  • packer-plugin-cnspec v9.2.4

Additional context
It works with packer-plugin-cnspec v9.2.2.

asset-name configuration is not sent back to Mondoo Platform

Describe the bug
When the cnspec provisioner is connected to Mondoo Platform, the asset-name configuration is not sent back correctly to Mondoo Platform.

(screenshot attached)

To Reproduce
Steps to reproduce the behavior:

In the packer build template add the following configuration ...

  provisioner "cnspec" {
    on_failure = "continue"
    asset_name = "test-name"
    sudo {
      active = true
    }
    annotations = {
      Source_AMI    = "Foo"
      Creation_Date = "Bar"
    }
  }


Errors with sshd.config.params checks

Describe the bug
When using the cnspec Packer plugin, which loads the os provider, versions >11.0.2 error when checking any sshd.config.params.
This does not cause errors with the plugin or the Packer run, but the Mondoo Console displays the following for all checks using sshd.config.params, where it previously displayed true/false values:

[Query Error]
1 error occurred:
	* not implemented

To Reproduce
Steps to reproduce the behavior:

  1. Using cnspec plugin in a packer build (tested on cnspec 11.1.0 and 11.1.1)
  2. Build for Rocky Linux using the proxmox or vsphere builder, packer hcl based on this example (issues noticed on both)
  3. Allow the cnspec plugin to pull the OS provider >11.0.2 (issues noticed on OS provider 11.0.4 and 11.1.1)
  4. Have a policy assigned that uses sshd.config.params to check for compliance (mondoo-linux-security)
  5. Review the plugin output or mondoo console to see the errors on SSH related policy checks

Expected behavior
The cnspec plugin should have no issue collecting info on sshd.config.params and reporting the results back to the mondoo console. Instead of seeing errors in the plugin output or mondoo console, you should see pass/fail statuses.

Screenshots or CLI Output
Attached is output from two runs of the same Packer build: one with the OS provider at 11.0.2, where the sshd.config.params are collected, and one with the OS provider at 11.1.1, where they are not.
These builds were done on the same commit of the repo; no changes were made to the Packer build HCL.
I removed irrelevant failed/passed checks that work and match between the two logs, so you will only see the erroring checks.
working run.txt
issue_run.txt

Here's what I see in the mondoo console for these checks that rely on sshd.config.params:
(screenshot: 2024-05-06-214939_hyprshot)

Desktop (please complete the following information):

  • OS: Proxmox / vCenter /(packer alpine docker container)
  • OS Version: 8.2.2 / 7.0.3 /(packer alpine 1.10.2)

Additional context
I believe this to be an issue with the OS provider on mondoo's side given the issue is happening across cnspec versions, packer builders, hypervisors, with the only variable being the OS provider version.

Cannot get cnspec provisioner to fail under any circumstances

Describe the bug
As I test this tool, I'm trying to get it to fail on any check and validate that the image is not published. Looking through the source code, it seems like you only check whether on_failure is "continue" or not. Setting on_failure to anything else doesn't seem to fail the build under any combination I have tried...

This tool is new to me, so it's possible I'm missing something easy.

Example of a few checks that are failing in my test:

    azure-arm.centos: ✕ Fail:  F   0  Ensure source routed packets are not accepted
    azure-arm.centos: ! Error: Ensure SSH X11 forwarding is disabled
    azure-arm.centos: ✓ Pass:  A 100  Ensure NIS server is stopped and not enabled
    azure-arm.centos: ✕ Fail:  D  20  Ensure session initiation information is collected

Combinations I've tried:

  provisioner "cnspec" {
    on_failure      = "fail"
    score_threshold = 1
  }
  provisioner "cnspec" {
    on_failure      = "fail"
    score_threshold = 100
  }

Expected behavior
The cnspec Packer provisioner should fail, the build should fail, and Packer should then clean up resources without publishing any image.

However, I'm still getting images published:
(screenshot attached)

Mondoo for vSphere builder ignores/not using communicator SSH password?

Description

Hello, Mondoo community!

The Mondoo Packer provisioner seems to ignore the SSH communicator password - all other provisioners, like Shell, work as expected.

I've tried to look at the other available options for the plugin, but I do not see anything relative to the SSH password (I do see options for SSH keys).

If I manually run a scan using mondoo scan ssh with the -p option, it connects just fine.

I am wondering if any of you lot have come across this behaviour. Pardon me if I missed somewhere that only SSH keys are possible with the plugin.

Many thanks,

Daniel

Additional context

  • Workstation OS: macOS Monterey 12.6
  • Image OS: Ubuntu Server Core 22.04 LTS
  • vSphere: 6.7u3
  • Packer:
    • Version: 1.8.3
    • Builder: vSphere-ISO
    • Plugins:
      • vsphere @ 1.0.8
      • mondoo @ 0.5.1

Screenshots

(screenshot: Packer build output)

(screenshot: Packer build output with Mondoo set to debug)

Shared information variables do not work with annotations

Describe the bug
Packer provides special shared information variables for packer templates to populate the values of specific configuration such as:

  • BuildRegion - The region (for example eu-central-1) where Packer is building the AMI.
  • SourceAMI - The source AMI ID (for example ami-a2412fcd) used to build the AMI.
  • SourceAMICreationDate - The source AMI creation date (for example "2020-05-14T19:26:34.000Z").
  • SourceAMIName - The source AMI name (for example ubuntu/images/ebs-ssd/ubuntu-xenial-16.04-amd64-server-20180306) used to build the AMI.
  • SourceAMIOwner - The source AMI owner ID.
  • SourceAMIOwnerName - The source AMI owner alias/name (for example amazon).

These values are not rendered when passed to the cnspec provisioner annotations:

(screenshots attached)
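For concreteness, a configuration along these lines reproduces the problem. This is a sketch: the source name is invented, and the exact set of build.* contextual variables the amazon-ebs builder exposes in HCL2 is an assumption on my part:

```hcl
build {
  sources = ["source.amazon-ebs.example"] # hypothetical source name

  provisioner "cnspec" {
    on_failure = "continue"
    annotations = {
      # Packer HCL2 surfaces shared build information through the `build`
      # contextual variable; whether each of these names is populated
      # by the amazon-ebs builder is an assumption
      source_ami               = build.SourceAMI
      source_ami_name          = build.SourceAMIName
      source_ami_creation_date = build.SourceAMICreationDate
    }
  }
}
```

The expectation is that the annotation values arrive at Mondoo Platform already interpolated, rather than empty or as raw placeholders.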


Register provisioner documentation on Packer.io

Hey there, your friendly Packer maintainer requesting that you add the provisioner documentation to the Packer.io site under https://packer.io/plugins/

We have a set of automated processes set up to help external plugins create and deploy their documentation. To start, you'll want to take a look at the Registering Documentation on Packer.io documentation, which requires a docs/ directory at the top level of the plugin.

Once you have the docs directory along with the provisioner documentation in place you can update the release configuration to generate the zip file needed for ingesting the documentation in Packer.io.

Quickly looking at the configuration for the provisioner I see that there are a number of configuration options. We have a tool as part of the packer-sdc command that you can use to automatically generate partial markdown files from the comments of each of the configurations.

Here's a link to the Ansible provisioner source where we include the go generate comment for generating markdown files from the configuration comments. Once that's all in place and the partial files are generated you can reference the partials as includes in the markdown files under docs/ which upon release can be merged together to create the final documentation files.

I recommend you take a look at the Ansible documentation directory and files as an example to get a better gist of how all the pieces work together.

As a last step, once all the docs are ready to go and released alongside the plugin binaries you can open a PR similar to this one to complete the registration process.

Please feel free to reach out if you have questions. Cheers!

Feature: Ability to write cnspec results to a file for pipeline artifacts

Is your feature request related to a problem? Please describe.
I'm trying to showcase how automated scanning can be done in pipelines. Mondoo SaaS is one option, and it looks pretty great! However, it's also possible to use the CLI to produce output files. It would be nice to have that as an option in Packer.

The implementation may actually be a Packer post-processor, but I'm not a Packer dev, so I'm not completely sure...

  provisioner "cnspec" {
    on_failure      = "continue"
    score_threshold = 100
    output          = "json"
    output_file     = "cnspec-test-results.json"
  }

OR

post_processor "cnspec-output" {
  output_file = "cnspec-test-results.json"
}
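In the meantime, a rough workaround might be to invoke the cnspec CLI from a shell-local provisioner after the cnspec provisioner runs. This is only a sketch: it assumes cnspec is installed on the machine running Packer, that the build's SSH details are reachable via Packer's build.User and build.Host contextual variables, and that -o json is a supported cnspec output flag:

```hcl
# hypothetical workaround, not a supported plugin feature
provisioner "shell-local" {
  inline = [
    # write the scan report to a JSON file for pipeline artifacts
    "cnspec scan ssh ${build.User}@${build.Host} -o json > cnspec-test-results.json"
  ]
}
```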

Runtime error scanning windows azure vm with v9

Describe the bug
A panic runtime error is produced when trying to scan a Windows Azure image with plugin version > 9.

To Reproduce
Steps to reproduce the behavior:

Deploy a Windows Azure image and scan it with cnspec version > 9.

Expected behavior
Scan ends successfully and doesn't output panic runtime errors.

Screenshots or CLI Output

==> azure-arm.windows: Waiting for WinRM to become available...
2024/01/26 07:03:15 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:15 [INFO] Attempting WinRM connection...
2024/01/26 07:03:15 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:15 [DEBUG] connecting to remote shell using WinRM
2024/01/26 07:03:28 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:28 Checking that WinRM is connected with: 'powershell.exe -EncodedCommand JABQAHIAbwBnAHIAZQBzAHMAUAByAGUAZgBlAHIAZQBuAGMAZQAgAD0AIAAnAFMAaQBsAGUAbgB0AGwAeQBDAG8AbgB0AGkAbgB1AGUAJwA7AGkAZgAgACgAVABlAHMAdAAtAFAAYQB0AGgAIAB2AGEAcgBpAGEAYgBsAGUAOgBnAGwAbwBiAGEAbAA6AFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlACkAewAkAFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlAD0AJwBTAGkAbABlAG4AdABsAHkAQwBvAG4AdABpAG4AdQBlACcAfQA7ACAAZQBjAGgAbwAgACIAVwBpAG4AUgBNACAAYwBvAG4AbgBlAGMAdABlAGQALgAiAA=='
2024/01/26 07:03:28 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:28 [INFO] starting remote command: powershell.exe -EncodedCommand JABQAHIAbwBnAHIAZQBzAHMAUAByAGUAZgBlAHIAZQBuAGMAZQAgAD0AIAAnAFMAaQBsAGUAbgB0AGwAeQBDAG8AbgB0AGkAbgB1AGUAJwA7AGkAZgAgACgAVABlAHMAdAAtAFAAYQB0AGgAIAB2AGEAcgBpAGEAYgBsAGUAOgBnAGwAbwBiAGEAbAA6AFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlACkAewAkAFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlAD0AJwBTAGkAbABlAG4AdABsAHkAQwBvAG4AdABpAG4AdQBlACcAfQA7ACAAZQBjAGgAbwAgACIAVwBpAG4AUgBNACAAYwBvAG4AbgBlAGMAdABlAGQALgAiAA==
    azure-arm.windows: WinRM connected.
2024/01/26 07:03:38 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:38 [INFO] command 'powershell.exe -EncodedCommand JABQAHIAbwBnAHIAZQBzAHMAUAByAGUAZgBlAHIAZQBuAGMAZQAgAD0AIAAnAFMAaQBsAGUAbgB0AGwAeQBDAG8AbgB0AGkAbgB1AGUAJwA7AGkAZgAgACgAVABlAHMAdAAtAFAAYQB0AGgAIAB2AGEAcgBpAGEAYgBsAGUAOgBnAGwAbwBiAGEAbAA6AFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlACkAewAkAFAAcgBvAGcAcgBlAHMAcwBQAHIAZQBmAGUAcgBlAG4AYwBlAD0AJwBTAGkAbABlAG4AdABsAHkAQwBvAG4AdABpAG4AdQBlACcAfQA7ACAAZQBjAGgAbwAgACIAVwBpAG4AUgBNACAAYwBvAG4AbgBlAGMAdABlAGQALgAiAA==' exited with code: 0
2024/01/26 07:03:38 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:38 Connected to machine
==> azure-arm.windows: Connected to WinRM!
2024/01/26 07:03:38 packer-plugin-azure_v2.0.2_x5.0_linux_amd64 plugin: 2024/01/26 07:03:38 Running the provision hook
2024/01/26 07:03:38 [INFO] (telemetry) Starting provisioner cnspec
==> azure-arm.windows: Running cnspec packer provisioner by Mondoo (Version: 9.14.0, Build: 3105bc6)
    azure-arm.windows: detected packer build via winrm
    azure-arm.windows: load config from detected MONDOO_CONFIG_BASE64
    azure-arm.windows: using service account credentials
    azure-arm.windows: scan packer build
==> azure-arm.windows: Provisioning step had errors: Running the cleanup provisioner, if present...
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: goroutine 26 [running]:
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/cnquery/v9/providers.(*Runtime).DetectProvider(0x3087c20?, 0xc000126000?)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/go.mondoo.com/cnquery/[email protected]/providers/runtime.go:172 +0x29
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/cnquery/v9/providers.(*coordinator).RuntimeFor(0x3087c20, 0xc000460280, 0x0?)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/go.mondoo.com/cnquery/[email protected]/providers/coordinator.go:383 +0x165
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/cnspec/v9/policy/scan.createAssetCandidateList({0x2265e78, 0x30b9a60}, 0x0?, 0xc00041f200, {0x2267420, 0x30b9a60})
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/go.mondoo.com/cnspec/[email protected]/policy/scan/local_scanner.go:238 +0x30a
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/cnspec/v9/policy/scan.(*LocalScanner).distributeJob(0xc000772380, 0xc000772230, {0x2265e78?, 0x30b9a60}, 0xc00041f200)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/go.mondoo.com/cnspec/[email protected]/policy/scan/local_scanner.go:304 +0x45c
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/cnspec/v9/policy/scan.(*LocalScanner).Run(0xc000772380, {0x2265e78, 0x30b9a60}, 0xc000772230)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/go.mondoo.com/cnspec/[email protected]/policy/scan/local_scanner.go:164 +0x127
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/packer-plugin-cnspec/provisioner.(*Provisioner).executeCnspec(0xc000698900, {0x2268938, 0xc0004b98f0}, {0xc000698b00?, 0xc00079e908?})
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/packer-plugin-cnspec/packer-plugin-cnspec/provisioner/provisioner.go:559 +0x1e8e
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: go.mondoo.com/packer-plugin-cnspec/provisioner.(*Provisioner).Provision(0xc000698900, {0x30b9a60?, 0x0?}, {0x2268938, 0xc0004b98f0}, {0x2267460?, 0xc00079b7e0}, 0xc0004b9740)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/packer-plugin-cnspec/packer-plugin-cnspec/provisioner/provisioner.go:300 +0x7b9
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: github.com/hashicorp/packer-plugin-sdk/rpc.(*ProvisionerServer).Provision(0xc00071af80, 0xc000467ce0, 0x1?)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/go/pkg/mod/github.com/hashicorp/[email protected]/rpc/provisioner.go:91 +0x1c9
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: reflect.Value.call({0xc0007c2240?, 0xc00006b308?, 0x13?}, {0x1cf774b, 0x4}, {0xc0007c4ef8, 0x3, 0x3?})
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/_tool/go/1.21.3/x64/src/reflect/value.go:596 +0xce7
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: reflect.Value.Call({0xc0007c2240?, 0xc00006b308?, 0x1a5bfe0?}, {0xc000086ef8?, 0xc000725cc0?, 0xc000086f50?})
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/_tool/go/1.21.3/x64/src/reflect/value.go:380 +0xb9
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: net/rpc.(*service).call(0xc00071afc0, 0xc00079e240?, 0xb57f58?, 0xc000762f80, 0xc00077b800, 0x0?, {0x1911b60?, 0xc000467ce0?, 0xb55ea5?}, {0x19381c0, ...}, ...)
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/_tool/go/1.21.3/x64/src/net/rpc/server.go:382 +0x214
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: created by net/rpc.(*Server).ServeCodec in goroutine 1
2024/01/26 07:03:38 packer-plugin-cnspec_v9.14.0_x5.0_linux_amd64 plugin: 	/home/runner/_work/_tool/go/1.21.3/x64/src/net/rpc/server.go:479 +0x410
2024/01/26 07:03:38 [INFO] (telemetry) ending cnspec
==> azure-arm.windows: Deleting Virtual Machine deployment and its attached resources...

Desktop (please complete the following information):

  • OS: Ubuntu
  • OS Version: 20.04
  • Packer version: 1.10.0
  • Azure Plugin version: 2.0.2
  • cnspec plugin version: 9.14.0

Additional context
The code used to build the image:

packer {
  required_plugins {
    azure = {
      source  = "github.com/hashicorp/azure"
      version = ">= 2"
    }
    cnspec = {
      version = ">= 8.23.1"
      source  = "github.com/mondoohq/cnspec"
    }
  }
}


locals {
  random = uuidv4()
  date   = timestamp()
}



source "azure-arm" "windows" {
  client_id       = var.azureRmClientId
  client_secret   = var.azureRmClientSecret
  subscription_id = var.subscriptionId
  tenant_id       = var.tenantId


  os_type         = "Windows"
  image_publisher = "MicrosoftWindowsServer"
  image_offer     = "WindowsServer"
  image_sku       = "2019-Datacenter"


  azure_tags = {
    packer   = "true",
    build-id = "${local.random}"
  }

  shared_image_gallery_destination {
    subscription         = var.subscriptionId
    resource_group       = var.resourceGroup
    gallery_name         = var.galleryName
    image_name           = var.imageName
    image_version        = var.imageVersion
    storage_account_type = "Standard_LRS"

  }

  location = var.location
  vm_size  = "Standard_B4ms"

  communicator   = "winrm"
  winrm_use_ssl  = "true"
  winrm_insecure = "true"
  winrm_timeout  = "50m"
  winrm_username = "packer"
}



build {

  hcp_packer_registry {
    bucket_name = var.imageName
  }

  sources = ["sources.azure-arm.windows"]

  provisioner "cnspec" {
    asset_name      = "${var.imageName}-${var.imageVersion}"
    score_threshold = 80
    on_failure      = "continue"
    annotations = {
      os-type       = "WindowsServer"
      os-version    = "2019-Datacenter"
      image-version = "${var.imageVersion}"
      build-time    = "${local.date}"
      build-id      = "${local.random}"
    }
  }

  provisioner "powershell" {
    inline = [
      "# If Guest Agent services are installed, make sure that they have started.",
      "foreach ($service in Get-Service -Name RdAgent, WindowsAzureTelemetryService, WindowsAzureGuestAgent -ErrorAction SilentlyContinue) { while ((Get-Service $service.Name).Status -ne 'Running') { Start-Sleep -s 5 } }",

      "& $env:SystemRoot\\System32\\Sysprep\\Sysprep.exe /oobe /generalize /quiet /quit /mode:vm",
      "while($true) { $imageState = Get-ItemProperty HKLM:\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Setup\\State | Select ImageState; if($imageState.ImageState -ne 'IMAGE_STATE_GENERALIZE_RESEAL_TO_OOBE') { Write-Output $imageState.ImageState; Start-Sleep -s 10  } else { break } }"
    ]
  }


}

os provider is not updated with packer plugin

Describe the bug

When running the packer plugin, the cnquery plugins are not auto-updated

To Reproduce

  1. Install old os provider
  2. Run cnspec
  3. os provider is not updated

Expected behavior

The plugin checks for an up-to-date os provider before running the scan.

Unable to install plugin in ubuntu aarch64 container

Describe the bug
I cannot install the plugin in an Ubuntu aarch64 container. I'm not sure where the breakdown is... perhaps the installer does not recognize that aarch64 = arm64? Maybe I'm missing something easy.

$ uname -a
Linux 136607898fef 5.15.49-linuxkit #1 SMP PREEMPT Tue Sep 13 07:51:32 UTC 2022 aarch64 aarch64 aarch64 GNU/Linux
$ packer init ./
Failed getting the "github.com/mondoohq/cnspec" plugin:
13 errors occurred:
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_windows_arm64.zip: wrong system, expected linux_arm 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_windows_amd64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_darwin_arm64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_darwin_amd64.zip: wrong system, expected linux_arm 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_darwin_amd64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_windows_arm64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_linux_arm64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-mondoo_v6.2.0_x5.0_linux_amd64.zip: wrong version: 'packer-plugin-mondoo' does not match expected v6.2.0 
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_linux_arm64.zip: wrong system, expected linux_arm 
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_linux_amd64.zip: wrong system, expected linux_arm 
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_darwin_arm64.zip: wrong system, expected linux_arm 
        * ignoring invalid remote binary packer-plugin-cnspec_v6.2.0_x5.0_windows_amd64.zip: wrong system, expected linux_arm 
        * could not install any compatible version of plugin "github.com/mondoohq/cnspec"

To Reproduce
Steps to reproduce the behavior:

$ docker run -it drewmullen/pkr-ans:latest

Create a basic Packer config and include:

packer {
  required_plugins {
    cnspec = {
      version = ">= 6.2.0"
      source  = "github.com/mondoohq/cnspec"
    }
  }
}
# ... within build {}

  provisioner "cnspec" {
    on_failure      = "continue"
    score_threshold = 85
  }

Desktop (please complete the following information):

Docker running on macOS (Apple M2 Pro Max)

Plugin versions & Packer 1.11.0

Hi there!

I'm one of the Packer core developers, and I'm opening this issue to let you know of the upcoming changes to Packer, specifically in v1.11.0 (tentatively scheduled for an April release): namely, we're changing how plugins are discovered and loaded.
We're also making pre-release support for our tooling official. We've settled on supporting -dev versions only, to keep the logic simple; any other pre-release (e.g. -alpha1, -beta1, etc.) will be rejected by Packer.

With this change, we're also being a bit more strict about the plugins we report and load. Namely, we're mandating SHA256SUM files alongside the binary, and versions need to be coherent between the file name and the version reported by describe.

This last point is one we noticed will be a problem for your plugin.

I attempted to install the latest version of the plugin earlier today and noticed that it is published as 10.7.3-dev, although it is a final release.
Our changes make it impossible for Packer to load the plugin, as its reported version will differ from the one in its file name.

To be transparent with you, this is something we'll probably change slightly between now and the final release: today the behaviour is that installation works but loading doesn't; by the time we release, we'll add a check that the installed version is indeed what the origin indicates.

While looking at the plugin's Makefile to try to adapt it to our updated workflow, we also noticed that the version produced by the Makefile's build/dev targets has metadata in the version radical, to which a pre-release is appended if it exists.

This is technically valid semver; however, the specification mandates that the metadata appear last, after the version and pre-release status. The version produced by the Makefile looks like 10.7.1+161-dev, which ends up being recognised as a release version, since the pre-release is amalgamated with the added metadata.
To contrast, a pre-release version with metadata should take the form 10.7.1-dev+161.

In order for the plugin to smoothly be supported when v1.11.0 of Packer is released, I suggest doing the following:

  1. Remove the PreRelease value of dev in the branch when releasing: this will make sure the final binary is not marked as a pre-release, and that Packer will be able to install, discover and load the plugin when necessary.
  2. Change how the VERSION is computed in the Makefile: I suggest keeping the version simple, by only exposing the radical of the version, i.e. 10.7.3
  3. If metadata are important to you, I suggest adding them to the VersionPrerelease during the dev build: adding a value with that metadata for dev builds will make it semver compliant, and Packer will be able to load it as a pre-release.

Regarding my first suggestion, I believe this might also be done during the release build. I believe our plugins change the value of VersionPrerelease to be empty when building it through some ldflags.

Please let me know if those suggestions make sense to you.
I'm also happy to open a PR applying those suggestions if you think they make sense.

PS: For information and testing, we have released a blog post recently to explain the changes coming up to Packer, and we've also released Packer v1.11.0-alpha with which you can test your plugin.

mondoo not found in PATH with GitHub Action

Testing the Packer Plugin Mondoo with v0.5.1 with GitHub Actions and the HashiCorp Packer Action, it seems mondoo cannot be found in PATH despite being installed on the runner.

ERROR: mondoo executable not found in $PATH

2022/08/22 23:27:09 packer-provisioner-shell plugin: [INFO] RPC client: Communicator ended with: 0
2022/08/22 23:27:09 [INFO] (telemetry) ending shell
2022/08/22 23:27:09 [INFO] (telemetry) Starting provisioner mondoo
==> mondoo-ubuntu-2004-secure-base.amazon-ebs.ubuntu2004: Running Mondoo packer provisioner (Version: 0.5.1, Build: f71f26f)
==> mondoo-ubuntu-2004-secure-base.amazon-ebs.ubuntu2004: Executing Mondoo: [mondoo scan ssh [email protected]:22 --score-threshold 0]
==> mondoo-ubuntu-2004-secure-base.amazon-ebs.ubuntu2004: read |0: file already closed
==> mondoo-ubuntu-2004-secure-base.amazon-ebs.ubuntu2004: read |0: file already closed
==> mondoo-ubuntu-2004-secure-base.amazon-ebs.ubuntu2004: exec: "mondoo": executable file not found in $PATH
2022/08/22 23:27:09 [INFO] (telemetry) ending mondoo

Mondoo Client installed and configured successfully in Github Runner

The GitHub workflow is configured to install and configure Mondoo Client as follows:

      - name: Install Mondoo Client
        shell: bash
        run: |
            echo Installing Mondoo Client...
            echo ${{ secrets.MONDOO_SERVICE_ACCOUNT }} | base64 -d > mondoo.json
            curl -sSL https://mondoo.com/install.sh | bash
            mondoo status --config ${{ env.MONDOO_CONFIG_PATH }}

Running mondoo status --config mondoo.json in the workflow returns a successful status:

https://mondoo.com/docs/operating_systems/installation/registration/
→ loaded configuration from mondoo.json using source --config
→ Hostname:	REDACTED
→ IP:		10.1.1.27
→ Platform:	ubuntu
→ Release:	20.04
→ Time:		2022-08-22T23:25:39Z
→ Version:	6.11.1 (API Version: 6)
→ API ConnectionConfig:	https://us.api.mondoo.com/
→ API Status:	SERVING
→ API Time:	2022-08-22T23:25:40Z
→ API Version:	6
→ Space:	//captain.api.mondoo.app/spaces/REDACTED
→ Client:	no managed client
→ Service Account:	//agents.api.mondoo.app/spaces/REDACTED/serviceaccounts/REDACTED
→ client is registered
→ client authenticated successfully

Failed builds with "unexpected EOF"

Describe the bug
When running a build that uses the cnspec provisioner it fails for me with "unexpected EOF".

To Reproduce
Steps to reproduce the behavior:

  1. set MONDOO_CONFIG_BASE64
  2. configure the cnspec provisioner in the packer build (so far happens on ubuntu focal and amazon2)
  3. Run a build

Expected behavior

  1. If the scan fails, I would expect to see errors that indicate that.
  2. If I re-run the build with PACKER_LOG=1 I expect to see some extra log information that can help me understand what's going wrong
  3. Ultimately I expect the scan to complete and report the results to mondoo.

Screenshots or CLI Output
What I actually get is this:

==> amazon-ebs.ami: Running cnspec packer provisioner by Mondoo (Version: 10.0.3, Build: b2dd4a6)
    amazon-ebs.ami: activated sudo
    amazon-ebs.ami: detected packer build via ssh
    amazon-ebs.ami: load config from detected MONDOO_CONFIG_BASE64
    amazon-ebs.ami: using service account credentials
    amazon-ebs.ami: scan packer build
    amazon-ebs.ami: scan completed successfully
==> amazon-ebs.ami: unexpected EOF
==> amazon-ebs.ami: Step "StepProvision" failed

And if I "retry" the step, Packer crashes. The crash output does, however, show some useful additional information. Note that this is from the initial attempt, not the retry:

2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: DBG could not parse ssh port error="strconv.Atoi: parsing \"\": invalid syntax" file=/home/mbainter/.ssh/config port=
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: DBG ssh> read ssh identity key from ssh config host=10.15.12.92 key=
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: DBG could not establish ssh session error="no authentication method defined" host=xx.xx.xx.xx insecure=true port=22 provider=ssh
2024/01/26 15:03:15 ui: ESC[0;32m    amazon-ebs.ami: scan completed successfullyESC[0m
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: panic: runtime error: invalid memory address or nil pointer dereference
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: [signal SIGSEGV: segmentation violation code=0x1 addr=0x38 pc=0x1913965]
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: goroutine 26 [running]:
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: go.mondoo.com/cnspec/v10/policy.(*ReportCollection).GetWorstScore(...)
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/go/pkg/mod/go.mondoo.com/cnspec/[email protected]/policy/report.go:159
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: go.mondoo.com/packer-plugin-cnspec/provisioner.(*Provisioner).executeCnspec(0xc000005500, {0x23977c0, 0xc000bb40c0}, {0xc000005700?, 0xc0007f0a88?})
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/packer-plugin-cnspec/packer-plugin-cnspec/provisioner/provisioner.go:604 +0x2165
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: go.mondoo.com/packer-plugin-cnspec/provisioner.(*Provisioner).Provision(0xc000005500, {0x3289a00?, 0xc00079d230?}, {0x23977c0, 0xc000bb40c0}, {0x2396208?, 0xc0007b23a0}, 0xc000a81440)
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/packer-plugin-cnspec/packer-plugin-cnspec/provisioner/provisioner.go:301 +0x799
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: github.com/hashicorp/packer-plugin-sdk/rpc.(*ProvisionerServer).Provision(0xc00092e040, 0xc000657b90, 0x1?)
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/go/pkg/mod/github.com/hashicorp/[email protected]/rpc/provisioner.go:91 +0x1c9
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: reflect.Value.call({0xc00080c360?, 0xc0000b6120?, 0x13?}, {0x1dfe297, 0x4}, {0xc000865ef8, 0x3, 0x3?})
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/_tool/go/1.21.6/x64/src/reflect/value.go:596 +0xce7
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: reflect.Value.Call({0xc00080c360?, 0xc0000b6120?, 0x7fed5f9d4a88?}, {0xc0007c86f8?, 0xc0007c86b8?, 0x4b59b0?})
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/_tool/go/1.21.6/x64/src/reflect/value.go:380 +0xb9
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: net/rpc.(*service).call(0xc00092e080, 0xc0007c87b8?, 0xb7d838?, 0xc0006d8060, 0xc000926180, 0x696c222c7d7d226b?, {0x19dbaa0?, 0xc000657b90?, 0xb7aea5?}, {0x1a07340, ...}, ...)
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/_tool/go/1.21.6/x64/src/net/rpc/server.go:382 +0x211
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin: created by net/rpc.(*Server).ServeCodec in goroutine 1
2024/01/26 15:03:15 packer-plugin-cnspec_v10.0.3_x5.0_linux_amd64 plugin:       /home/runner/_work/_tool/go/1.21.6/x64/src/net/rpc/server.go:479 +0x410
2024/01/26 15:03:15 [INFO] (telemetry) ending cnspec

Desktop (please complete the following information):

  • OS: Linux
  • OS Version: PopOS
  • Packer: 1.10.0
  • CNSpec Plugin: 9.14 and 10.0.3

Additional context

I thought maybe something in my ssh config was tripping it up based on the log data, so I removed it so it wouldn't be a factor. It still failed the same way.

I also tried running just a straight cnspec ssh to make sure my ssh setup wasn't somehow incompatible with cnspec itself, but that runs fine with cnspec scan ssh ubuntu@<host>
