terraform-azurerm-storage-sftp's Introduction

Azure Storage Account for SFTP

Changelog Notice Apache V2 License TF Registry

This Terraform module creates an Azure Blob Storage account with the SFTP feature enabled.

It also manages the creation of local SFTP users within the Storage Account. An SSH key pair is automatically generated by Terraform and you have the option of downloading it (enabled by default). SFTP connection command lines and users' passwords are available in the storage_sftp_users output of this module.
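The generated credentials can be surfaced from your root module through that output. A minimal sketch (the module name `storage_sftp` matches the usage example below; the output must be marked sensitive because it contains passwords and private keys):

```hcl
# Expose the module's SFTP users information (connection command lines,
# passwords and generated key pairs) from the root module.
output "sftp_users" {
  description = "Local SFTP users created in the Storage Account."
  value       = module.storage_sftp.storage_sftp_users
  sensitive   = true # contains passwords and private keys
}
```

After an apply, the values can be read with `terraform output -json sftp_users`.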

Storage is created with the Premium SKU by default for production-ready performance.

Global versioning rule for Claranet Azure modules

Module version Terraform version AzureRM version
>= 7.x.x 1.3.x >= 3.0
>= 6.x.x 1.x >= 3.0
>= 5.x.x 0.15.x >= 2.0
>= 4.x.x 0.13.x / 0.14.x >= 2.0
>= 3.x.x 0.12.x >= 2.0
>= 2.x.x 0.12.x < 2.0
< 2.x.x 0.11.x < 2.0

Contributing

If you want to contribute to this repository, feel free to use our pre-commit git hook configuration, which automatically updates and formats some files for you while enforcing our Terraform module best practices.

More details are available in the CONTRIBUTING.md file.

Usage

This module is optimized to work with the Claranet terraform-wrapper tool, which sets some Terraform variables in the environment that this module needs. More details about the variables set by the terraform-wrapper are available in the documentation.

module "azure_region" {
  source  = "claranet/regions/azurerm"
  version = "x.x.x"

  azure_region = var.azure_region
}

module "rg" {
  source  = "claranet/rg/azurerm"
  version = "x.x.x"

  location    = module.azure_region.location
  client_name = var.client_name
  environment = var.environment
  stack       = var.stack
}

module "logs" {
  source  = "claranet/run/azurerm//modules/logs"
  version = "x.x.x"

  location       = module.azure_region.location
  location_short = module.azure_region.location_short
  client_name    = var.client_name
  environment    = var.environment
  stack          = var.stack

  resource_group_name = module.rg.resource_group_name
}

data "http" "my_ip" {
  url = "https://ip.clara.net"
}

# When using RSA algorithm, do not forget to add `-o PubkeyAcceptedKeyTypes=+ssh-rsa` in your SFTP connection command line
# e.g. `sftp -o PubkeyAcceptedKeyTypes=+ssh-rsa -i <privateKeyPath> <storageAccountName>.<sftpLocalUserName>@<storageAccountName>.blob.core.windows.net`
resource "tls_private_key" "bar_example" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

module "storage_sftp" {
  source  = "claranet/storage-sftp/azurerm"
  version = "x.x.x"

  location       = module.azure_region.location
  location_short = module.azure_region.location_short
  client_name    = var.client_name
  environment    = var.environment
  stack          = var.stack

  resource_group_name = module.rg.resource_group_name

  name_suffix = "sftp"

  account_replication_type = "LRS"

  allowed_cidrs = [chomp(data.http.my_ip.response_body)]

  containers = [
    {
      name = "foo"
    },
    {
      name = "bar"
    },
  ]

  nfsv3_enabled = true # SFTP can be used alongside the NFSv3 feature for Blob Storage

  sftp_users = [
    {
      name                 = "foo"
      home_directory       = "foo/example" # `example` is a subdirectory under `foo` container
      ssh_password_enabled = true
      permissions_scopes = [
        {
          target_container = "foo"
        },
        {
          target_container = "bar"
          permissions      = ["Read", "Write", "List"]
        },
      ]
    },
    {
      name = "bar"
      permissions_scopes = [
        {
          target_container = "bar"
        },
        {
          target_container = "foo"
          permissions      = ["List", "Create"]
        },
      ]
      ssh_authorized_keys = [{
        key         = tls_private_key.bar_example.public_key_openssh
        description = "Example"
      }]
    }
  ]

  logs_destinations_ids = [
    module.logs.logs_storage_account_id,
    module.logs.log_analytics_workspace_id,
  ]

  extra_tags = {
    foo = "bar"
  }
}

Providers

Name Version
azurerm ~> 3.102
local ~> 2.3
tls ~> 4.0

Modules

Name Source Version
storage_account claranet/storage-account/azurerm ~> 7.13.0

Resources

Name Type
azurerm_storage_account_local_user.sftp_users resource
local_sensitive_file.sftp_users_private_keys resource
local_sensitive_file.sftp_users_public_keys resource
tls_private_key.sftp_users_keys resource

Inputs

Name Description Type Default Required
access_tier Defines the access tier for StorageV2 accounts. Valid options are Hot and Cool, defaults to Hot. string "Hot" no
account_replication_type Defines the type of replication to use for this Storage Account. Valid options are LRS, GRS, RAGRS, ZRS, GZRS and RAGZRS. string "ZRS" no
advanced_threat_protection_enabled Boolean flag which controls if advanced threat protection is enabled, see documentation for more information. bool false no
allowed_cidrs List of CIDR to allow access to that Storage Account. list(string) [] no
client_name Client name/account used in naming. string n/a yes
containers List of objects to create some Blob containers in this Storage Account.
list(object({
name = string
container_access_type = optional(string)
metadata = optional(map(string))
}))
n/a yes
create_sftp_users_keys Whether or not key pairs should be created on the filesystem. bool true no
custom_diagnostic_settings_name Custom name of the diagnostics settings, name will be default if not set. string "default" no
custom_storage_account_name Custom Azure Storage Account name, generated if not set. string "" no
default_firewall_action Which default firewalling policy to apply. Valid values are Allow or Deny. string "Deny" no
default_tags_enabled Option to enable or disable default tags. bool true no
environment Project environment. string n/a yes
extra_tags Additional tags to associate with the Storage Account. map(string) {} no
https_traffic_only_enabled Boolean flag which forces HTTPS if enabled. bool true no
identity_ids Specifies a list of User Assigned Managed Identity IDs to be assigned to this Storage Account. list(string) null no
identity_type Specifies the type of Managed Service Identity that should be configured on this Storage Account. Possible values are `SystemAssigned`, `UserAssigned` and `SystemAssigned, UserAssigned` (to enable both). string "SystemAssigned" no
is_premium true to enable Premium tier for this Storage Account. bool true no
location Azure location. string n/a yes
location_short Short string for Azure location. string n/a yes
logs_categories Log categories to send to destinations. list(string) null no
logs_destinations_ids List of destination resources IDs for logs diagnostic destination.
Can be a Storage Account, a Log Analytics Workspace and an Event Hub. No more than one of each can be set.
If you want to specify an Azure Event Hub to send logs and metrics to, you need to provide a formatted string with both the Event Hub Namespace authorization send ID and the Event Hub name (name of the queue to use in the Namespace) separated by the `|` character. list(string) n/a yes
logs_metrics_categories Metrics categories to send to destinations. list(string) null no
min_tls_version The minimum supported TLS version for the Storage Account. Possible values are TLS1_0, TLS1_1, and TLS1_2. string "TLS1_2" no
name_prefix Optional prefix for the generated name. string "" no
name_suffix Optional suffix for the generated name. string "" no
network_bypass Specifies whether traffic is bypassed for 'Logging', 'Metrics', 'AzureServices' or 'None'. list(string)
[
"Logging",
"Metrics",
"AzureServices"
]
no
network_rules_enabled Boolean to enable network rules on the Storage Account, requires network_bypass, allowed_cidrs, subnet_ids or default_firewall_action correctly set if enabled. bool true no
nfsv3_enabled Is NFSv3 protocol enabled? Changing this forces a new resource to be created. bool false no
private_link_access List of Privatelink objects to allow access from.
list(object({
endpoint_resource_id = string
endpoint_tenant_id = optional(string, null)
}))
[] no
public_nested_items_allowed Allow or disallow nested items within this Storage Account to opt into being public. bool false no
resource_group_name Resource Group name. string n/a yes
sftp_users List of local SFTP user objects.
list(object({
name = string
home_directory = optional(string)
ssh_key_enabled = optional(bool, true)
ssh_password_enabled = optional(bool)
permissions_scopes = list(object({
target_container = string
permissions = optional(list(string), ["All"])
}))
ssh_authorized_keys = optional(list(object({
key = string
description = optional(string)
})), [])
}))
n/a yes
sftp_users_keys_path The filesystem location in which the key pairs will be created. Defaults to ~/.ssh/keys. string "~/.ssh/keys" no
shared_access_key_enabled Indicates whether the Storage Account permits requests to be authorized with the account access key via Shared Key. If false, then all requests, including shared access signatures, must be authorized with Azure Active Directory (Azure AD). bool true no
stack Project stack name. string n/a yes
static_website_config Static website configuration.
object({
index_document = optional(string)
error_404_document = optional(string)
})
null no
storage_blob_cors_rules Storage Account blob CORS rules. Please refer to the documentation for more information.
list(object({
allowed_headers = list(string)
allowed_methods = list(string)
allowed_origins = list(string)
exposed_headers = list(string)
max_age_in_seconds = number
}))
[] no
storage_blob_data_protection Blob Storage data protection parameters.
object({
delete_retention_policy_in_days = optional(number, 0)
container_delete_retention_policy_in_days = optional(number, 0)
})
{
"container_delete_retention_policy_in_days": 30,
"delete_retention_policy_in_days": 30
}
no
subnet_ids Subnets to allow access to that Storage Account. list(string) [] no
use_caf_naming Use the Azure CAF naming provider to generate the default resource name. custom_storage_account_name overrides this if set. The legacy default name is used if this is set to false. bool true no
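As an illustration of the key-handling inputs above, the generated key pair files can be disabled or redirected via the `create_sftp_users_keys` and `sftp_users_keys_path` variables. A hedged sketch (mandatory inputs omitted for brevity):

```hcl
module "storage_sftp" {
  source  = "claranet/storage-sftp/azurerm"
  version = "x.x.x"

  # ... mandatory inputs omitted for brevity ...

  # Do not write the generated key pairs to the local filesystem,
  # e.g. when running in CI where the workspace is ephemeral.
  create_sftp_users_keys = false

  # Or keep the files but store them outside the default ~/.ssh/keys:
  # sftp_users_keys_path = "${path.root}/keys"
}
```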

Outputs

Name Description
storage_account_id Created Storage Account ID.
storage_account_identity Created Storage Account identity block.
storage_account_name Created Storage Account name.
storage_account_network_rules Network rules of the associated Storage Account.
storage_account_properties Created Storage Account properties.
storage_blob_containers Created Blob containers in the Storage Account.
storage_sftp_users Information about created local SFTP users.


terraform-azurerm-storage-sftp's Issues

[FEAT] Store secrets in KeyVault

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

I am running this module from Azure DevOps. To keep the private values secret, I am adding a module to store the generated values in Key Vault, as it's not clear how to download the values in a secure way. My configuration looks like this:

resource "azurerm_key_vault_secret" "sftp_password" {
  for_each = local.sftp_users

  name         = "${each.key}-sftp-password"
  value        = local.sftp_users_output[each.key].password
  key_vault_id = data.terraform_remote_state.law.outputs.mgmt_key_vault_app_id
}

resource "azurerm_key_vault_secret" "sftp_private" {
  for_each = local.sftp_users

  name         = "${each.key}-sftp-private-key"
  value        = local.sftp_users_output[each.key].auto_generated_private_key
  key_vault_id = data.terraform_remote_state.law.outputs.mgmt_key_vault_app_id
}

resource "azurerm_key_vault_secret" "sftp_public_key" {
  for_each = local.sftp_users

  name         = "${each.key}-sftp-public-key"
  value        = local.sftp_users_output[each.key].auto_generated_public_key
  key_vault_id = data.terraform_remote_state.law.outputs.mgmt_key_vault_app_id
}

Of course, it may well be my problem, but I can't find a way to make this work. Terraform is not creating a separate entry for each secret: when I add a second SFTP user to the list, it replaces the first one, so it's a mess. Perhaps you could add a feature to support this.
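A likely cause of the replacement behaviour is that the `for_each` map is keyed by list index, so inserting a user re-keys (and thus replaces) the others. Keying the map by user name keeps each secret stable. A hedged sketch, assuming `var.sftp_users` is the list of user objects passed to the module and `var.key_vault_id` is the target Key Vault (both names are illustrative):

```hcl
locals {
  # Key each user by its stable name instead of a list index, so adding
  # or removing a user does not shift the keys of the other secrets.
  sftp_users = { for u in var.sftp_users : u.name => u }
}

resource "azurerm_key_vault_secret" "sftp_password" {
  for_each = local.sftp_users

  name         = "${each.key}-sftp-password"
  value        = module.storage_sftp.storage_sftp_users[each.key].password
  key_vault_id = var.key_vault_id
}
```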

Kind regards,
Victor

New or Affected Resource(s)/Data Source(s)

azurerm_storage_account

Potential Terraform Configuration

No response

References

No response

Support for SFTP Local Users on Storage account

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

Support for SFTP local users on a Storage Account is now available in Terraform: https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/storage_account_local_user .
Could this be used in the module instead of the custom AzAPI resource calling the localusers API at https://github.com/claranet/terraform-azurerm-storage-sftp/blob/master/r-sftp-users.tf?
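For reference, a minimal sketch of the native provider resource the issue refers to, based on the azurerm provider documentation (the user name, container and key references are illustrative):

```hcl
resource "azurerm_storage_account_local_user" "example" {
  name               = "user1"
  storage_account_id = azurerm_storage_account.example.id
  ssh_key_enabled    = true
  home_directory     = "example"

  ssh_authorized_key {
    description = "key1"
    key         = tls_private_key.example.public_key_openssh
  }

  permission_scope {
    service       = "blob"
    resource_name = azurerm_storage_container.example.name

    permissions {
      read = true
      list = true
    }
  }
}
```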

New or Affected Resource(s)/Data Source(s)

azurerm_storage_account_local_user

Potential Terraform Configuration

No response

References

No response
