cisagov / scubagear

Automation to assess the state of your M365 tenant against CISA's baselines

Home Page: https://www.cisa.gov/resources-tools/services/secure-cloud-business-applications-scuba-project

License: Creative Commons Zero v1.0 Universal



ScubaGear

Developed by CISA, ScubaGear is an assessment tool that verifies that a Microsoft 365 (M365) tenant’s configuration conforms to the policies described in the Secure Cloud Business Applications (SCuBA) Security Configuration Baseline documents.

NOTE: This documentation can be read using GitHub Pages.

Table of Contents

M365 Product License Assumptions

This tool was tested against tenants that have an M365 E3/G3 or E5/G5 license bundle. It may still function for tenants that do not have one of these bundles.

Some of the policy checks in the baseline rely on the following licenses, which are included by default in M365 E5 and G5:

  • Microsoft Entra ID P2
  • Microsoft Defender for Office 365 Plan 1

If a tenant does not have the licenses listed above, the report will display a non-compliant output for those policies.

NOTE: DOD endpoints are included, but have not been tested. Please open an issue if you encounter bugs.

Getting Started

Download the Latest Release

To download ScubaGear:

  1. Click here to see the latest release.
  2. Click ScubaGear-v1-2-0.zip (or latest version) to download the release.
  3. Extract the folder in the zip file.

PowerShell Execution Policies

Starting with release 0.3.0, ScubaGear is signed by a commonly trusted CA. On Windows Servers, the default execution policy is RemoteSigned, which allows ScubaGear to run after the user agrees to trust the publisher (CISA) once.

On Windows clients, the default execution policy is Restricted. In this case, Set-ExecutionPolicy RemoteSigned should be invoked to permit ScubaGear to run.

Windows clients with an execution policy of Unrestricted generate a warning about running only trusted scripts when executing ScubaGear, even when the scripts and modules are signed. This is because the files contain an identifier showing that they were downloaded from the Internet. These zone identifiers, informally referred to as the Mark of the Web, can be removed by running Unblock-File on the scripts and modules in the ScubaGear folder. Users should carefully consider use of Unblock-File and only run it on files they have vetted and deem trustworthy to execute on their system. See here for more information from Microsoft on the Unblock-File cmdlet.
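If you have vetted the files and choose to remove the Mark of the Web, a minimal sketch (assuming ScubaGear was extracted to .\ScubaGear; adjust the path to your extraction location):

```powershell
# Recursively unblock ScubaGear scripts and modules (only after vetting them).
Get-ChildItem -Path .\ScubaGear -Recurse -Include *.ps1, *.psm1, *.psd1 |
    Unblock-File
```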

Usage

ScubaGear can be invoked interactively or non-interactively. See Required Permissions for the permissions needed to execute the tool in either mode. The interactive authentication mode will prompt the user for credentials via Microsoft's popup windows. Non-interactive mode invokes ScubaGear using an Azure AD application service principal and supports running the tool in automated scenarios such as pipelines or scheduled jobs. Examples 1-3 below show runs in interactive mode, and Example 4 shows a run in non-interactive mode.

Importing the module

NOTE: Only PowerShell 5.1 is currently supported. PowerShell 7 may work, but has not been tested. Full PowerShell 7 support will be added in a future release.

ScubaGear currently must be imported into each new PowerShell terminal session to execute. To import the module, navigate to the repository folder in a PowerShell 5.1 terminal.

Then run:

Import-Module -Name .\PowerShell\ScubaGear # Imports the module into your session

If you receive a warning that The required supporting PowerShell modules are not installed, run the following cmdlet:

Initialize-SCuBA # Installs the minimum required dependencies

IMPORTANT: The Install-OPA cmdlet is called by default when running Initialize-SCuBA.

The Install-OPA cmdlet can also be run by itself to download the executable. In the event of an unsuccessful download, users can manually download the OPA executable with the following steps:

  1. Go to the OPA download site.
  2. Find the latest OPA version supported by ScubaGear (currently v0.63.0) and select that version at the top left of the website.
  3. Navigate to the menu on the left side of the screen: Introduction - Running OPA - Download OPA.
  4. Locate the downloaded file, move it to your desired location (default is ~\.scubagear\Tools), open PowerShell, and use the following command to check the downloaded OPA version:

.\opa_windows_amd64.exe version

Examples

Example 1

Run an assessment against all products (except PowerPlatform):

Invoke-SCuBA

Example 2

Run an assessment against Azure Active Directory with custom report output location:

Invoke-SCuBA -ProductNames aad -OutPath C:\Users\johndoe\reports

Example 3

Run assessments against multiple products:

Invoke-SCuBA -ProductNames aad, sharepoint, teams

Example 4

Run assessments non-interactively using an application service principal and authenticating via CertificateThumbprint:

Invoke-SCuBA -ProductNames * -CertificateThumbprint "<insert-thumbprint>" -AppID "<insert-appid>" -Organization tenant.onmicrosoft.com

To view more examples and see detailed help run:

Get-Help -Name Invoke-SCuBA -Full

Parameter Definitions

  • $ConfigFilePath is an optional parameter that refers to the path to a configuration file that the tool parses for input parameters when executing ScubaGear. ScubaGear supports either a YAML or JSON formatted configuration file. A sample configuration file is included in sample-config-files/aad-config.yaml. The syntax defines:

    • Use of Pascal case convention for variable names consistent with parameters on the command line
    • A global namespace for values to be used across baselines and products (i.e., GlobalVars)
    • Per product namespace for values related to that specific product (i.e., Aad, SharePoint)
    • Namespace for each policy item within a product for variables related only to one policy item (i.e., MS.AAD.2.1v1)
    • Use of YAML anchors and aliases following the Don't Repeat Yourself (DRY) principle for repeated values and sections

    If a -ConfigFilePath is specified, default values will be used for parameters that are not added to the config file. These default values are shown in the full config file template to guide the user, but they can be omitted if desired. Other command line parameters can also be used with -ConfigFilePath, which should reduce the number of config files needed. Examples include using -M365Environment to override a commercial config value with gcc, switching the tenant being targeted, or supplying credential references so they do not need to be in the config file. Smaller config files can facilitate sharing among admins.

    The config file path defaults to the same directory where the script is executed. ConfigFilePath accepts both absolute and relative file paths. The file can be used to specify command line parameters and policy-specific parameters used by the Azure Active Directory (AAD) and Defender product assessments. See ScubaGear Configuration File Syntax and Examples and AAD Conditional Access Policy Exemptions for more details.
  • $LogIn is a $true or $false variable that, if set to $true, will prompt the user to provide credentials to establish a connection to the M365 products specified in the $ProductNames variable. For most use cases, leave this variable set to $true. A connection is established in the current PowerShell terminal session with the first authentication. To run another verification in the same PowerShell session, set this variable to $false to bypass the need to authenticate again in the same session. Defender will ask for authentication even if this variable is set to $false.

  • $ProductNames is a list of one or more M365 shortened product names that the tool will assess when it is executed. Acceptable product name values are listed below. To assess Azure Active Directory you would enter the value aad. To assess Exchange Online you would enter exo and so forth.

    • Azure Active Directory: aad
    • Defender for Office 365: defender
    • Exchange Online: exo
    • Power Platform: powerplatform
    • SharePoint Online and OneDrive for Business: sharepoint
    • Microsoft Teams: teams
  • $M365Environment is used to authenticate to the various M365 commercial/government environments. Valid values include commercial, gcc, gcchigh, or dod. The default value is commercial.

    • For M365 tenants that are non-government environments enter the value commercial.
    • For M365 Government Commercial Cloud tenants with G3/G5 licenses enter the value gcc.
    • For M365 Government Commercial Cloud High tenants enter the value gcchigh.
    • For M365 Department of Defense tenants enter the value dod.
  • $OPAPath refers to the folder location of the Open Policy Agent (OPA) policy engine executable file. By default the OPA policy engine executable embedded with this project is located in the project's root folder "./" and for most cases this value will not need to be modified. To execute the tool using a version of the OPA policy engine located in another folder, customize the variable value with the full path to the folder containing the OPA policy engine executable file.

  • $OutPath refers to the folder path where the output JSON and the HTML report will be created. Defaults to the same directory where the script is executed. This parameter is only necessary if an alternate report folder path is desired. The folder will be created if it does not exist.

ScubaGear Configuration File Syntax and Examples

Most of the Invoke-SCuBA cmdlet parameters can be placed into a configuration file with the path specified by the -ConfigFilePath parameter. Please note the following parameters are supported only on the command line.

  • ConfigFilePath
  • Version
  • DarkMode
  • Quiet
  • MergeJson

Each authentication parameter must be supplied either on the command line or in the config file when using a non-interactive login. An authentication parameter may be present in both; the command line always takes precedence. The parameters can be split between the config file and the command line.
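The precedence rule above (command line over config file over defaults) can be sketched as follows. This is a hypothetical illustration of the merge logic, not ScubaGear's actual implementation; the function name and parameter keys are placeholders:

```python
def resolve_params(cli: dict, config: dict, defaults: dict) -> dict:
    """Merge parameter sources: command line > config file > defaults."""
    resolved = dict(defaults)
    resolved.update({k: v for k, v in config.items() if v is not None})
    resolved.update({k: v for k, v in cli.items() if v is not None})
    return resolved

# Example: -M365Environment given on the command line overrides the config file,
# while config values not given on the command line are kept.
params = resolve_params(
    cli={"M365Environment": "gcc"},
    config={"M365Environment": "commercial", "ProductNames": ["teams"]},
    defaults={"M365Environment": "commercial", "LogIn": True},
)
```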

All of the configuration file examples referenced below are in the sample-config-files directory, and the examples assume Invoke-SCuBA is run in that directory. Each example shows the sample config file name and a command line example using it.

The authentication parameter values shown below are examples only. The user must supply parameter values appropriate for their tenant and principal.

Basic Use : config file basic_config.yaml Basic use specifies a product name and an M365 environment variable. In this example, the product is entered as a single value.

Description: YAML Basic Config file ( one product )
ProductNames: teams
M365Environment: commercial

Command line

Invoke-SCuBA -ConfigFilePath minimal_config.yaml

Command line with override of M365Environment

Invoke-SCuBA -M365Environment gcc -ConfigFilePath minimal_config.yaml

Typical Use : config file typical_config.yaml Typical use includes multiple products, specified as a list, and an M365 environment variable. Note that additional product values are commented out and will not be included, but are retained in the config file to easily add them back later. ScubaGear's Support module also has functionality to generate an empty sample config file. Running the New-Config cmdlet will generate a full sample config called SampleConfig.yaml that can be filled out based on the guidance below. Parameters can also be passed to the New-Config cmdlet to change values inside the sample config.

Description: YAML Typical Config ( multiple products )
ProductNames:
- teams
# - exo
# - defender
- aad
# - sharepoint
M365Environment: commercial

Command line with Auth Parameters

Invoke-SCuBA -Organization abcdef.example.com `
             -AppID 0123456789abcdef01234566789abcde `
             -CertificateThumbprint fedcba9876543210fedcba9876543210fedcba98 `
             -ConfigFilePath typical_config.yaml

Credential Use : config file creds_config.yaml Credentials, in the form of a service principal AppID and certificate thumbprint, can be supplied in the config file. While these credentials alone do not provide access without the associated private key, appropriate protection should be considered if including them in a configuration file.

Description: YAML Configuration file with credentials ( invalid ones )
ProductNames:
- teams
# - exo
# - defender
- aad
# - sharepoint
M365Environment: commercial
Organization: abcdef.example.com
AppID:  0123456789abcdef01234566789abcde
CertificateThumbprint: fedcba9876543210fedcba9876543210fedcba98

Command line with override of product names

Invoke-SCuBA -ProductNames defender -ConfigFilePath typical_config.yaml

Full Use: config file full_config.yaml Full use shows all of the global parameters supported by ScubaConfig specified in the config file. Any one of these parameters may be commented out. If not specified or commented out, ScubaConfig will supply the default value instead unless overridden on the command line. This default value does not apply to authentication parameters.

Description: YAML Configuration file with all parameters
ProductNames:
- teams
- exo
- defender
- aad
- sharepoint
M365Environment: commercial
OPAPath: .
LogIn: true
DisconnectOnExit: false
OutPath: .
OutFolderName: M365BaselineConformance
OutProviderFileName: ProviderSettingsExport
OutRegoFileName: TestResults
OutReportName: BaselineReports
Organization: abcdef.example.com
AppID:  0123456789abcdef01234566789abcde
CertificateThumbprint: fedcba9876543210fedcba9876543210fedcba98

Command line invocation (no overrides)

Invoke-SCuBA -ConfigFilePath full_config.yaml

AAD Conditional Access Policy Exemptions

The ScubaGear -ConfigFilePath command line option allows users to define custom variables for use in policy assessments against the AAD baseline. These custom variables are used to exempt specific user and group exclusions from conditional access policy checks that normally would not pass if exclusions are present. These parameters support operational use cases for having backup or "break glass" account exclusions to global user policies without failing best practices. Any exemptions and their risks should be carefully considered and documented as part of an organization's cybersecurity risk management program process and practices.

YAML AAD Configuration File Syntax and Examples

Aad - Defines the AAD specific variables to specify user, group, and role exclusions that are documented exemptions to select conditional access policies (CAP) in the AAD configuration policy baselines. Users, groups, and roles are specified by their respective Universally Unique Identifier (UUID) in the tenant. This variable set is only needed if the agency has documented CAP exemptions.

CapExclusions - Supports both a Users and Groups list with each entry representing the UUID of a user or group that is approved by the agency to be included in a conditional access policy assignment exclusion. Adding an entry to this variable will prevent ScubaGear from failing the policy assessment due to the presence of the users and groups in an exclusion.

CapExclusions can be defined in the following policy namespaces:

  • MS.AAD.1.1v1
  • MS.AAD.2.1v1
  • MS.AAD.2.3v1
  • MS.AAD.3.1v1
  • MS.AAD.3.2v1
  • MS.AAD.3.3v1
  • MS.AAD.3.6v1
  • MS.AAD.3.7v1
  • MS.AAD.3.8v1

RoleExclusions - Supports both a Users and Groups list with each entry representing the UUID of a user or group that is approved by the agency to be included in a role assignment. Adding an entry to this variable will prevent ScubaGear from failing the policy assessment due to the presence of a role assignment for those users and groups.

RoleExclusions can be defined in the following policy namespaces:

  • MS.AAD.7.4v1

The example below illustrates the syntax for defining user, group, and role exemptions to select policies. The syntax allows the use of a YAML anchor and alias to simplify formatting policies having the same documented exemptions. Items surrounded by chevrons are to be supplied by the user.

    Aad:
      MS.AAD.1.1v1: &CommonExclusions
        CapExclusions:
          Users:
            - <Exempted User 1 UUID>
            - <Exempted User 2 UUID>
          Groups:
            - <Exempted Group 1 UUID>
      MS.AAD.2.1v1:  *CommonExclusions
      MS.AAD.2.3v1:  *CommonExclusions
      MS.AAD.3.2v1:  *CommonExclusions
      MS.AAD.7.4v1:
        RoleExclusions:
          Users:
            - <Exempted User 3 UUID>
          Groups:
            - <Exempted Group 2 UUID>

Viewing the Report

The HTML report should open in your browser once the script completes. If it does not, navigate to the output folder and open the BaselineReports.html file using your browser. The result files generated from the tool are also saved to the output folder.

Required Permissions

When executing the tool interactively, there are two types of permissions that are required:

  • User Permissions (which are associated with Azure AD roles assigned to a user)
  • Application Permissions (which are assigned to the MS Graph PowerShell application in Azure AD).

When executing the tool via app-only authentication, a slightly different set of User and Application Permissions must be assigned directly to the Service Principal application.

User Permissions

The minimum user roles needed for each product are described in the table below.

This article also explains how to assign admin roles in M365.

Product                     Role
Azure Active Directory      Global Reader
Defender for Office 365     Global Reader (or Exchange Administrator)
Exchange Online             Global Reader (or Exchange Administrator)
Power Platform              Power Platform Administrator with a "Power Apps for Office 365" license
SharePoint Online           SharePoint Administrator
Microsoft Teams             Global Reader (or Teams Administrator)

NOTE: Users with the Global Administrator role always have the necessary user permissions to run the tool.

Microsoft Graph PowerShell SDK permissions

The Azure AD baseline requires the use of Microsoft Graph. The script will attempt to configure the required API permissions needed by the Microsoft Graph PowerShell module, if they have not already been configured in the target tenant.

The process to configure the application permissions is sometimes referred to as the "application consent process" because an Administrator must "consent" for the Microsoft Graph PowerShell application to access the tenant and the necessary Graph APIs to extract the configuration data. Depending on the Azure AD roles assigned to the user running the tool and how the application consent settings are configured in the target tenant, the process may vary slightly. To understand the application consent process, read this article from Microsoft.

Microsoft Graph is used because Azure AD PowerShell is being deprecated.

NOTE: Microsoft Graph PowerShell SDK appears as "unverified" on the AAD application consent screen. This is a known issue.

The following API permissions are required for Microsoft Graph PowerShell:

  • Directory.Read.All
  • GroupMember.Read.All
  • Organization.Read.All
  • Policy.Read.All
  • RoleManagement.Read.Directory
  • User.Read.All
  • PrivilegedEligibilitySchedule.Read.AzureADGroup
  • PrivilegedAccess.Read.AzureADGroup
  • RoleManagementPolicy.Read.AzureADGroup

Service Principal Application Permissions & Setup

The minimum API permissions & user roles for each product that need to be assigned to a service principal application for ScubaGear app-only authentication are listed in the table below.

  • Azure Active Directory
    • API permissions: Directory.Read.All, GroupMember.Read.All, Organization.Read.All, Policy.Read.All, RoleManagement.Read.Directory, User.Read.All, PrivilegedEligibilitySchedule.Read.AzureADGroup, PrivilegedAccess.Read.AzureADGroup, RoleManagementPolicy.Read.AzureADGroup
  • Defender for Office 365
    • API permissions: Exchange.ManageAsApp
    • Role: Global Reader
  • Exchange Online
    • API permissions: Exchange.ManageAsApp
    • Role: Global Reader
  • Power Platform
    • See Power Platform App Registration
  • SharePoint Online
    • API permissions: Sites.FullControl.All, Directory.Read.All
  • Microsoft Teams
    • Role: Global Reader

This video provides a good tutorial for creating an application manually in the Azure Portal. Augment the API permissions and replace the role assignment instructions in the video with the permissions listed above.

Power Platform App Registration

For Power Platform, the application must be manually registered to Power Platform via interactive authentication with an administrative account. See Limitations of Service Principals for how applications are treated within Power Platform.

Add-PowerAppsAccount -Endpoint prod -TenantID $tenantId # use -Endpoint usgov for gcc tenants
New-PowerAppManagementApp -ApplicationId $appId # Must be run from a Power Platform Administrator or Global Administrator account

Certificate store notes

  • Power Platform has a hardcoded expectation that the certificate is located in Cert:\CurrentUser\My.
  • MS Graph has an expectation that the certificate at least be located in one of the local client's certificate store(s).

Additional Notes: Only authentication via CertificateThumbprint is currently supported. We will be supporting automated app registration in a later release.

Architecture

(Figure: SCuBA architecture diagram.)

The tool employs a three-step process:

  1. Extract & Export. In this step, we utilize the various PowerShell modules authored by Microsoft to export and serialize all the relevant settings into JSON.
  2. Test & Record. Compare the exported settings from the previous step with the configuration prescribed in the baselines. This is done using OPA Rego, a declarative query language for defining policy. OPA provides a ready-to-use policy engine executable; v0.42.0 is the minimum version ScubaGear has been tested against. To use a later version of the OPA policy engine, follow the instructions listed here and customize the $OPAPath variable described in the Usage section above.
  3. Format & Report. Package the data output by the OPA policy engine into a human-friendly HTML report.
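The three steps above can be sketched as a simple pipeline. This is a hypothetical illustration of the data flow, not ScubaGear's actual code; the function names and stub values are placeholders:

```python
def run_pipeline(export, evaluate, report):
    settings = export()           # 1. Extract & Export: provider settings -> JSON-like data
    results = evaluate(settings)  # 2. Test & Record: OPA Rego policy evaluation
    return report(results)        # 3. Format & Report: human-friendly HTML

# Stub usage with placeholder callables standing in for each stage:
html = run_pipeline(
    export=lambda: {"aad": {"some_setting": "value"}},
    evaluate=lambda s: [{"policy": "MS.AAD.1.1v1", "pass": True}],
    report=lambda r: f"<html>{len(r)} results</html>",
)
```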

Repository Organization

  • PowerShell contains the code used to export the configuration settings from the M365 tenant and orchestrate the entire process from export through evaluation to report. The main PowerShell module manifest ScubaGear.psd1 is located in the PowerShell/ScubaGear folder.
  • Rego, located within the PowerShell/ScubaGear folder, holds the .rego files. The Open Policy Agent executable uses each Rego file to audit the desired state for each product, per the SCuBA M365 secure configuration baseline documents.
  • baselines contains the SCuBA M365 secure configuration baseline documents in Markdown format.
  • Testing contains code that is used during the development process to test ScubaGear's PowerShell and Rego code.
  • sample-report contains sample JSON and HTML report output of the ScubaGear tool. Right click on the BaselineReports.html to open the file in a browser of your choice to view the sample report.
  • utils contains an assorted array of helper scripts for ScubaGear usage and development. See Utility Scripts for detailed descriptions of scripts that can help with troubleshooting.

Troubleshooting

Executing against multiple tenants

ScubaGear creates connections to several M365 services. If running against multiple tenants, it is necessary to disconnect those sessions.

Invoke-SCuBA includes the -DisconnectOnExit parameter to disconnect each of these connections upon exit. To disconnect sessions after a run, use Disconnect-SCuBATenant. The cmdlet disconnects from Azure Active Directory (via MS Graph API), Defender, Exchange Online, Power Platform, SharePoint Online, and Microsoft Teams.

Disconnect-SCuBATenant

The cmdlet will attempt to disconnect from all services regardless of current session state. Only connections established within the current PowerShell session will be disconnected and removed. Services that are already disconnected will not generate an error.

Errors connecting to Defender

If, when running the tool against Defender (via the ExchangeOnlineManagement PowerShell module), you see the connection error "Create Powershell Session is failed using OAuth" in the PowerShell window, follow the instructions in this section. An example of the full error message is provided below.

WARNING: Please note that you can only use above 9 new EXO cmdlets (the one with *-EXO* naming pattern). You can't use other cmdlets
as we couldn't establish a Remote PowerShell session as basic auth is disabled in your client machine. To enable Basic Auth, please
check instruction here
https://docs.microsoft.com/en-us/powershell/exchange/exchange-online-powershell-v2?view=exchange-ps#prerequisites-for-the-exo-v2-module
Create Powershell Session is failed using OAuth

If you see this error message, it means that you are running a version of the ExchangeOnlineManagement PowerShell module older than 3.2. The automation relies on the Microsoft Security & Compliance PowerShell environment for Defender information. Security & Compliance PowerShell connections, unlike other services used by the ExchangeOnlineManagement module, once required basic authentication to be enabled. As of June 2023, Microsoft has deprecated Remote PowerShell for Exchange Online and Security & Compliance PowerShell. To resolve this error, run the Initialize-SCuBA cmdlet to install the latest ExchangeOnlineManagement module version.

Exchange Online maximum connections error

If, when running the tool against Exchange Online, you see the error below in the PowerShell window, follow the instructions in this section.

New-ExoPSSession : Processing data from remote server outlook.office365.com failed with the
following error message: [AuthZRequestId=8feccdea-493c-4c12-85dd-d185232cc0be][FailureCategory=A
uthZ-AuthorizationException] Fail to create a runspace because you have exceeded the maximum
number of connections allowed : 3

If you see the error above, run the command below in PowerShell:

Disconnect-ExchangeOnline

or alternatively run Disconnect-SCuBATenant exported by the ScubaGear module.

Disconnect-SCuBATenant

Power Platform errors

In order for the tool to properly assess the Power Platform product, one of the following conditions must be met:

  • The tenant includes the Power Apps for Office 365 license AND the user running the tool has the Power Platform Administrator role assigned.
  • The user running the tool has the Global Administrator role.

In addition to those conditions, the correct $M365Environment parameter value must be passed to Invoke-SCuBA; otherwise, an error like the one shown below will be thrown.

Invoke-ProviderList : Error with the PowerPlatform Provider. See the exception message for more details:
        "Power Platform Provider ERROR: The M365Environment parameter value is not set correctly which WILL
        cause the Power Platform report to display incorrect values.
        ---------------------------------------
        M365Environment Parameter value: commercial
        Your tenant's OpenId-Configuration: tenant_region_scope: NA, tenant_region_sub_scope: GCC
        ---------------------------------------
        Rerun ScubaGear with the correct M365Environment parameter value
        by looking at your tenant's OpenId-Configuration displayed above and
        contrast it with the mapped values in the table below
        M365Environment => OpenId-Configuration
        ---------------------------------------
        commercial: tenant_region_scope:NA, tenant_region_sub_scope:
        gcc: tenant_region_scope:NA, tenant_region_sub_scope: GCC
        gcchigh : tenant_region_scope:USGov, tenant_region_sub_scope: DODCON
        dod: tenant_region_scope:USGov, tenant_region_sub_scope: DOD
        ---------------------------------------
        Example Rerun for gcc tenants: Invoke-Scuba -M365Environment gcc

Microsoft Graph Errors

Infinite AAD Sign in Loop

While running the tool, AAD sign in prompts sometimes get stuck in a loop. This is likely an issue with the connection to Microsoft Graph.

To fix the loop, run:

Disconnect-MgGraph

Then run the tool again.

Error Connect-MgGraph : Key not valid for use in specified state.

This is due to a bug in the Microsoft Authentication Library. The workaround is to delete broken configuration information by running this command (replace {username} with your username):

rm -r C:\Users\{username}\.graph

After deleting the .graph folder in your home directory, re-run the tool and the error should disappear.

Error Could not load file or assembly 'Microsoft.Graph.Authentication'

This indicates that the authentication module is at a version level that conflicts with the MS Graph modules used by the tool. Follow the instructions in the Getting Started section and execute the Initialize-SCuBA cmdlet again to synchronize the module versions with their dependencies, then execute the tool again.

Running the Tool Behind Some Proxies

If you receive connection or network proxy errors, try running:

$Wcl=New-Object System.Net.WebClient
$Wcl.Proxy.Credentials=[System.Net.CredentialCache]::DefaultNetworkCredentials

Utility Scripts

The ScubaGear repository includes several utility scripts to help with troubleshooting and recovery from error conditions in the utils folder. These helper scripts are designed to assist developers and users when running into errors with the ScubaGear tool or local system environment. See the sections below for details on each script.

ScubaGear Support

If a user receives errors and needs additional support diagnosing issues, the ScubaGearSupport.ps1 script can be run to gather information about their system environment and previous tool output. The script gathers this information into a single ZIP archive for easy sharing with developers or other support staff. Since the script gathers report output, keep in mind that the resulting archive may contain details about the associated M365 environment and its settings.

The script can be run with no arguments, in which case it only collects environment information for troubleshooting. If the IncludeReports parameter is provided, the archive will also include the most recent report from the default Reports folder.

.\ScubaGearSupport.ps1

An alternate report path can be specified via the ReportPath parameter.

.\ScubaGearSupport.ps1 -ReportPath C:\ScubaGear\Reports

Finally, the script can optionally include all previous reports rather than the most recent one by adding the AllReports option.

.\ScubaGearSupport.ps1 -AllReports

Data gathered by the script includes:

  • Listings of locally installed PowerShell modules and their installation paths
  • PowerShell versions and environment details
  • WinRM client service Basic Authentication registry setting
  • (optional) ScubaGear output from one or more previous invocations which contains
    • HTML product and summary reports
    • JSON-formatted M365 product configuration extracts
    • JSON and CSV-formatted M365 baseline test results

Removing installed modules

ScubaGear requires a number of PowerShell modules to function. A user or developer, however, may wish to remove these PowerShell modules for testing or for cleanup after ScubaGear has been run. The UninstallModules.ps1 script will remove the latest version of the modules required by ScubaGear and installed by the associated Initialize-SCuBA cmdlet. The script does not take any options and can be run as follows:

.\UninstallModules.ps1

PowerShellGet 2.x has a known issue uninstalling modules installed on a OneDrive path that may result in an "Access to the cloud file is denied" error. Installing PSGet 3.0, currently in beta, will allow the script to successfully uninstall such modules, or you can remove the module files from OneDrive manually.
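If the UninstallModules.ps1 script cannot be used (for example, due to the OneDrive issue above), modules can be removed manually. The sketch below is illustrative only; the module names are assumptions, not the authoritative list installed by Initialize-SCuBA.

```powershell
# Hypothetical manual cleanup; module names are assumptions, not the authoritative list
$modules = @(
    'MicrosoftTeams',
    'ExchangeOnlineManagement',
    'Microsoft.Online.SharePoint.PowerShell',
    'Microsoft.Graph.Authentication'
)
foreach ($m in $modules) {
    # -ErrorAction SilentlyContinue skips modules that are not installed
    Uninstall-Module -Name $m -AllVersions -Force -ErrorAction SilentlyContinue
}
```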

Project License

Unless otherwise noted, this project is distributed under the Creative Commons Zero license. With developer approval, contributions may be submitted with an alternate compatible license. If accepted, those contributions will be listed herein with the appropriate license.

scubagear's People

Contributors

adhilto, ahuynhmitre, amart241, anderseknert, buidav, crutchfield, dagarwal-mitre, dylan-mitre, ethanb-cisa, gdasher, github-actions[bot], isab-m, james-garriss, julianjburgos, kennethpalmer, mitchelbaker-cisa, nanda-katikaneni, rgbrow1949, schrolla, sloane4, ssatyapal123, tkol2022, twneale


scubagear's Issues

Add CONTRIBUTING and RELEASE documentation

As pointed out in the public repo, we don't have a CONTRIBUTING.md file yet, which should contain instructions for users who want to contribute to the project. Review similar coding efforts at CISA to reuse their language. Have the contributing file reference other documentation, such as the RELEASE file.

Various sections of report incorrectly list failure

Initially ran a baseline and corrected the findings I saw fit. Tenant is commercial. Followed CISA documents, such as this. Re-ran and got these results:

Defender 2.7

All but the last entry list failure.

Defender 2.8

Malware in attachments is set to block.

Defender 2.9

Not seeing either "disabled" report: Malware campaign detected after delivery, Unusual increase in email reported as phish. User is E5 licensed.

SharePoint 2.5

Setting was already configured to block.

Issues Checking Multiple Defender DLP Policies

🐛 Summary

A tenant with two DLP policies was tested: one policy is scoped to Teams and one to devices. The Teams policy checks for PII, credit cards, and UK passports; the device policy checks only for credit cards. The script's output states that the PII requirement is met, even though not all policies check for PII. The script needs to be modified to iterate through the policies and report correctly on those that do not meet the requirement.

Device Security Baseline for Cross-Tenant Access

💡 Summary

Section 2.17 of the Microsoft Azure Active Directory Baseline states that the device claim (compliance status of the device) from the home tenant should be trusted for guests in the resource tenant. When conditional access policies are evaluated during authentication, they render only a binary answer of whether a device is compliant. While this type of policy provides a rudimentary filter for blocking access to non-compliant devices, it does not establish that a guest user's device complies with the resource tenant's security requirements, and it fundamentally goes against the ZT concept of not allowing implicit trust across boundaries.

Motivation and context

Let's consider a simple scenario in which a resource tenant and home tenant have differing security requirements for HAADJ/AADJ devices. If the resource tenant requires BitLocker to be enabled for a device to be marked as compliant, but the home tenant does not, then a user from the home tenant (with whom a resource has been shared) can access that resource as a guest in the resource tenant without meeting the resource tenant's BitLocker requirements. This carries a significant security risk if no other DLP policies are applied to the resources. This raises the question: what is the real value of section 2.17 in the context of guest users as it stands today? While general guidance for securing devices has been laid out in many publications from NIST and CISA, there has to be consensus and a shared understanding of what is considered secure before implementation steps 2 and 3 under section 2.17.4 are executed. This is compounded by the fact that CISOs and CIOs may hold opposing opinions on this topic.

We also have to consider that sharing real-time device configuration across two tenants is probably not a capability that Microsoft can (or will) implement for the conditional access policy feature set any time soon. If and when they do, it can be incorporated into the baseline. In the meantime, to be pragmatic, CISA should also provide a framework for how device trust should be established.

This would be helpful so that CISOs and CIOs can follow guidance from a single authoritative source and build consensus on the absolute minimum required device configuration to meet the security thresholds for cross-tenant collaboration.

Implementation notes

The framework could include the following:

  • Publishing a security baseline for Windows Devices (or use an existing one that defines the bare minimum required).
  • Developing scripts that can generate reports about whether security baselines for a tenant have been implemented or not.
  • Providing guidance on how home tenant ISOs should supply their security baseline reports as evidence to resource tenant ISOs through Interagency agreements (or some other medium).

Acceptance criteria

Resource tenant implementers have a shared understanding with home tenant implementers on the minimum security/configuration requirements for devices, before cross-tenant access is enabled.

Azure AD Baseline Feedback

Azure AD Baseline Feedback

Overview

From an analysis of the Azure AD baseline that is part of SCuBA, I would like to provide feedback from the perspective of someone who has worked in a consultative role in the space of Azure AD with a variety of organizations.

It's understandable that the government provides guidance for agencies implementing M365 and, subsequently, the Azure AD tenants that provide the identity layer to those services.

There are areas of some concern regarding public consumption of the guidance and the potential for its adoption without proper understanding, particularly in the commercial space. While the primary directive of this guidance is not to strengthen the security of commercial organizations, it's likely that those organizations may attempt to consume the guidance as written, to the letter. Considering how easy it is to run the ScubaGear tooling, one can see organizations adopting it to baseline their environments without necessarily understanding the ramifications of certain guidelines, especially those that are more security-heavy or that do not align with Microsoft recommendations. Along similar lines, there are other security recommendations for properly securing an Azure AD tenant that are not covered here, and it may be helpful to note that this is not a comprehensive security baseline for all aspects of Azure AD.

The submission of this relates to my recent blog post in which I analyze the results, but I want to speak a bit further to some of them within here. You can find that post here, https://ericonidentity.com/2022/10/26/cisa-scuba-diving-into-the-azure-ad-baseline.

Perhaps it's simply a matter of CISA acknowledging within the guidance that, if the public decides to consume it, it may not cover all aspects of identity security in Azure AD, and the recommendations may not align with any particular business model.

General Areas of Concern

SHALL v SHOULD

It could be presumed that SHALL v SHOULD align to NIST definitions, but this is not apparent within the guidance itself. While these terms are perhaps clear in the federal space, outside of it, from an open-source perspective, it may help to include the definitions within the guidance.

2.2 and 2.3 - Implementation Gap

For AAD tenants that have Azure AD Identity Protection available to them (AAD P2 or M365 E/G5), it has always been somewhat misunderstood whether to implement such policies with Conditional Access or with AAD Identity Protection. It's understandable to design for implementation with CA, considering that you can only have one policy each for user risk and sign-in risk within AAD Identity Protection.

There is one gap, though, when these policies are not defined within Identity Protection. As CA policies are only ever evaluated in the resource tenant, if the user's account is at risk or the sign-in is risky, the CA policy defined in the home tenant will not be evaluated when the user authenticates to the resource tenant. This is a primary driver for why baseline Identity Protection policies covering the "majority" user base are still enabled, with CA policies then built on top.

2.9 and 2.10 - Usability Issues

It can be presumed that the recommendations here are to align to NIST 800-63B 4.2.3 or 4.3.3 for reauthentication and 7.3 for secret retention/session persistence.

It's understandable that CISA guidance would align to NIST, but from a usability perspective it's highly likely to become problematic, especially on mobile devices. Again, perhaps without insight into the user personas within federal agencies, this may or may not be an issue. But this area in particular is one where consumers outside of the federal space who want to apply CISA guidance should be warned that usability is at risk.

A 12-hour time window will generally cause little interference with a "normal desk job" worker, who likely locks their Windows device when away, or the device locks itself on idle, as unlock will cause reauthentication to take place. Therefore, a policy as such could be applied and not interfere with devices under control of the organization.

Mobile devices though can suffer from these settings, as they can interfere with client applications. A common example is session expiration in the Outlook application, and then users not realizing that they have not been receiving email, as the client will not alert the user that reauthentication is necessary.

2.14 - Lack of Break-glass accounts

The general recommendation on break-glass accounts is missing. Perhaps CISA and the federal space account for the inherent risk, but organizations that accidentally lock themselves out of their tenant have to go through a lengthy process to regain access. Hence the need for break-glass accounts.

While there may also be the possibility that the federal space has different mechanisms for tenant recovery, commercial organizations that do not account for break-glass, regardless of their size, will spend days locked out of their tenant from a management perspective, which depending on the severity of what was implemented, could be highly impactful to the entire organization.

Likewise, in the case that there are issues with the Azure MFA services or Azure AD PIM role assignment, Microsoft's recommendation is that at least one break-glass account be exempt from MFA and permanently assigned Global Administrator. Again, CISA guidance may align to federal requirements and recommendations, but this advice may not necessarily translate well beyond those walls.

2.14 and 2.15 - Implementation Gap

Out of the box, all role assignments in Azure AD PIM provide an 8-hour role elevation lifetime. This is perhaps covered in other federal recommendations outside of this guidance, but for roles such as Global Administrator, Microsoft recommends that activation be limited to an hour. General identity security guidance would agree that a highly privileged role should be limited to the minimal viable window; it's unlikely a Global Administrator requires activated privileges for an entire 8 hours. Alternatively, the recommendation should indicate that roles be deactivated through Azure AD PIM once the work requiring the role is complete.

Getting Started

🐛 Summary

The instructions tell us to run the setup.ps1 file but say nothing about what we need to do before this step. Are we supposed to clone the repo, download some files, or something else? A little extra help for newbies would be helpful.


Exchange External Sender Policy: Multiple Rules not iterated through

For Exchange 2.7, the script marks this check as not implemented if multiple external sender policies are present and the first policy is disabled, even when the follow-on rules are appropriately configured. Suggest iterating through the policies and checking the config of each to determine the outcome.
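A rough sketch of the suggested fix, assuming the check is driven by Exchange Online transport rules (the actual cmdlet and the properties ScubaGear inspects may differ):

```powershell
# Hypothetical: pass if ANY enabled external-sender rule is properly configured,
# rather than judging only the first rule returned.
# PrependSubject / ApplyHtmlDisclaimerText stand in for the real compliance test.
$rules = Get-TransportRule | Where-Object { $_.State -eq 'Enabled' }
$compliant = $false
foreach ($rule in $rules) {
    if ($rule.PrependSubject -or $rule.ApplyHtmlDisclaimerText) {
        $compliant = $true
        break
    }
}
```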

Agency 2 Pilot: Defender 2.2 potential false "PASS" results

Agency 2 noted that their DLP is managed outside Defender through another system; however, they have policies enabled for credit cards, TIN, SSN, and PII for monitoring purposes. When they ran the assessment script, the results for sensitive information came up as "PASS", but the policies are not configured to block, even though the baseline states "the DLP policy SHOULD be set to block sharing sensitive information".

Support Assignment Exclusions for Conditional Access Policies

Updated AAD policy and unit tests to define policy for users, groups, and roles, and to test for any exclusions to these.
Policy updates were made to 2.1, 2.2, 2.3, 2.4, 2.9, 2.10, and 2.13.

Policy 2.17 involves hybrid joined and compliant devices and is unchanged. This is the only "should" vs "shall" requirement, and we are not currently applying these policies in the tenant configurations.

Add retry logic to DNS that attempts to retry against a public resolver. Ensure that report indicates level of confidence in the correctness of the result.

Agency 2's assessment showed a "FAIL", but when looking into the domains flagged by the assessment script, many were sub-domains of ones configured in the DNS records hosted by the agency's domain. Agency 2 also mentioned that they noticed two domains that should have been approved.

Ethan did check some domains: one provided a SFT and one provided a "null" output. Ethan also investigated, and it could be some odd DNS lookup failures on our end. We should try to do more testing against tenants with many (Agency 2 has about 60) domains.
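A minimal sketch of the proposed retry logic, assuming the Windows Resolve-DnsName cmdlet is available; the public resolver address and the confidence flag are illustrative choices, not part of ScubaGear today:

```powershell
function Resolve-WithFallback {
    param([string]$Name, [string]$Type = 'TXT')
    try {
        # First try the system resolver
        $answer = Resolve-DnsName -Name $Name -Type $Type -ErrorAction Stop
        [pscustomobject]@{ Answer = $answer; HighConfidence = $true }
    } catch {
        # Retry against a public resolver; flag the result as lower confidence
        # so the report can indicate uncertainty about correctness
        $answer = Resolve-DnsName -Name $Name -Type $Type -Server '8.8.8.8' -ErrorAction SilentlyContinue
        [pscustomobject]@{ Answer = $answer; HighConfidence = $false }
    }
}
```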

OneDrive 2.7 report output shows "Currently cannot be checked automatically" but there is a setting in the provider JSON that can be checked

Problem

The OneDrive 2.7 "Legacy Authentication SHALL Be Blocked" report output says that the setting cannot be checked automatically, but I found a setting named LegacyAuthProtocolsEnabled in the existing provider export JSON.

Fix

Modify the Rego policy to check for the LegacyAuthProtocolsEnabled setting. Follow the instructions in the baseline document to configure the setting, and test to ensure that this is the right setting.

Evaluate permitting more narrowly scoped CA policies

Today, we require that CA policies apply to all applications. Since this is an M365 baseline, it should be permissible to allow CA policies that apply to just O365 (there is a label for this), all apps, or even potentially explicit lists of apps.

This may be required in practice as some apps may not be able to enforce certain requirements due to their use cases (e.g. being public facing).
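For illustration, a policy's application scope can be inspected via Microsoft Graph PowerShell; the classification logic below (all apps, the Office 365 label, or an explicit list) is a sketch of the idea, not the baseline's actual rule:

```powershell
# Hypothetical: classify each conditional access policy's app scope.
# Requires the Microsoft.Graph.Identity.SignIns module and Policy.Read.All consent.
Get-MgIdentityConditionalAccessPolicy | ForEach-Object {
    $apps = $_.Conditions.Applications.IncludeApplications
    $scope = if ($apps -contains 'All') { 'All apps' }
             elseif ($apps -contains 'Office365') { 'O365 label' }
             else { 'Explicit app list' }
    "{0}: {1}" -f $_.DisplayName, $scope
}
```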

Agency 2 Pilot: OneDrive 2.4, 2.5, and 2.6; Agency Defined Domains

Agency 2 has enabled restrictions for OneDrive for Business using a list of domains but showed "FAIL" for 2.4 and 2.6 and "PASS" for 2.5.

There are already ongoing discussions on possible consolidation of these policies and Microsoft has provided test scripts to run to verify findings. Testing is expected to occur post Coral.

From Ram:

  1. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A"
     What is the experience of the Mac sync client when trying to sync OneDrive for Business, when:
     A. The Mac is joined to the 786548DD-877B-4760-A749-6B1EFBC1190A domain
     B. The Mac is joined to a domain different from "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A"
     C. The Mac is not joined to any domain

  2. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$true
     Same question, for the same three cases (A, B, and C) as above.

  3. Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$false
     Same question, for the same three cases (A, B, and C) as above.

  4. Set-SPOTenantSyncClientRestriction -BlockMacSync:$false (with no domain specified in either the enable or disable list)
     Same question, when:
     A. The Mac is joined to any domain
     B. The Mac is not joined to any domain

  5. If you run two separate commands:
     Set-SPOTenantSyncClientRestriction -BlockMacSync:$true
     Set-SPOTenantSyncClientRestriction -Enable -DomainGuids "786548DD-877B-4760-A749-6B1EFBC1190A; 877564FF-877B-4760-A749-6B1EFBC1190A" -BlockMacSync:$false
     Same question, for the same three cases (A, B, and C) as above.

Agency 2 Pilot: OneDrive; 2.2 and 2.3 error if anyone links are disabled (Policy 2.1)

OneDrive for Business policy 2.1 is "Anyone Links SHOULD Be Turned Off". If this is turned off, the current Rego will mark 2.2 and 2.3 as "FAIL" ("Expiration Date SHOULD Be Set for Anyone Links" and "Link Permissions SHOULD Be Set to Enabled Anyone Links to View"), since the admin cannot set an expiration date or "view" permissions.

Suggest adding a "PASS" scenario to the Rego: if anyone links are disabled, 2.2 and 2.3 will also show up as "PASS".

Improve handling of complex mail flow and DKIM/DMARC/SPF

I was investigating the DKIM related issues with Agency 1, which seem to be due mostly to their more complex mail flow configuration.

Agency 1's mail flow is more complex than just Office 365 to the internet. They route their mail back through what looks like an on-prem Forcepoint email gateway, which is what actually signs their mail as seen by recipients (based on my looking at some inbound email from people at Agency 1). It uses the selector agency1seg2048a._domainkey.agency1.gov and seems to work as expected.

However, EXO is also configured to DKIM-sign email being sent to the Forcepoint gateway. They have multiple domains in EXO, some with DKIM "disabled" and others with it "enabled". The actual mail samples I saw appear to be signed using a selector associated with the domain "agency1365.onmicrosoft.com", which is marked as enabled in the config. This is apparently what Microsoft does when the custom domain DKIM policy is not enabled (as in this example). This mail is purely internal to Agency 1's infrastructure, between O365 and their on-prem email gateway (which also appears to validate DKIM).

We probably at least need to add some documentation describing this behavior for situations with more complicated mail flow. In theory, we could detect mail routing configs that send outbound email somewhere other than directly to the destination and use that to influence our messages.
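When diagnosing cases like this, the selector actually used on received mail can be checked directly against DNS; the selector below is the one quoted above:

```powershell
# Look up the DKIM public key record for the selector observed in received mail headers
Resolve-DnsName -Name 'agency1seg2048a._domainkey.agency1.gov' -Type TXT |
    Select-Object Name, Strings
```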

Additional Defender products

💡 Summary

Please add baselines and assessments for the other products in the M365 Defender suite, specifically Defender for Endpoint, Defender for Identity and Defender for Cloud Apps.

Motivation and context

Only providing baselines for Defender for Office 365 is not sufficient; there are many other attack paths that don't involve email.

This would be useful because... organizations need help protecting themselves from all of the attack paths.

Implementation notes

It would be similar to the documents and tools provided for Defender for O365

Acceptance criteria

When the baselines for all the M365 Defender suite have been published.

  • Criterion

User interaction required for evaluating SPO when using MFA

🐛 Summary

I'm trying to produce a report using a Docker container as part of a headless workflow. As part of the SharePoint assessment, Get-SPOTenant is used, which requires Connect-SPOService to authenticate first. According to the documentation, that requires user interaction (or a global administrator username/password with MFA turned off).

The Graph API can use certificates to authenticate without user interaction, but as far as I can tell, the baselines for SharePoint & OneDrive aren't exposed via the Graph API.

To reproduce

Steps to reproduce the behavior:

  1. Install dependencies
  2. Connect-MgGraph -ClientID <client id> -TenantId <tenant id> -CertificateThumbprint <thumbprint>
  3. $InitDom = (Get-MgOrganization).VerifiedDomains | Where-Object {$_.isInitial}
  4. $InitDomPrefix = $InitDom.Name.split(".")[0]
  5. Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.com"

Expected behavior

To be able to evaluate baselines in a headless workflow.

Any helpful log output or screenshots

PS C:\> Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.com"
Connect-SPOService : No valid OAuth 2.0 authentication session exists
At line:1 char:1
+ Connect-SPOService -Url "https://$($InitDomPrefix)-admin.sharepoint.c ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Connect-SPOService], AuthenticationException
    + FullyQualifiedErrorId : Microsoft.Online.SharePoint.PowerShell.AuthenticationException,Microsoft.Online.SharePoint.PowerShell.ConnectSPOService

Getting an unable to parse input: yaml error

🐛 Summary

When I run the script with all products, I get a message saying "unable to parse input: yaml: line 184: found unknown escape character" followed by a bunch of "object not found" errors from Orchestrator.psm1. If I run the products individually in the RunScuba script, the issue does not occur.

Basically, if I run this, I get the error:
$ProductNames = @("aad", "defender", "exo", "onedrive", "sharepoint", "teams") # The specific products that you want the tool to assess.

If I run the script with this (tested with each product) it runs fine:
$ProductNames =@("teams")

Any help would be greatly appreciated.


Current Rego does not check for empty values which can result in False Positives

I noticed this in Teams, but this oversight affects EXO and most likely multiple other products as well.
This ties into the provider error handling issues: before error handling was implemented, an error would be thrown and an incorrect report displayed.
After adding error handling, however, we discovered a critical oversight in the current Rego; this can be considered an extension of the error handling issues.

Below is a screenshot of the provider JSON after adding error handling and running the provider with an account that has insufficient permissions. This is expected behavior.


However, below is a screenshot of the report. Notice that everything is passing and some policies are missing; Teams has 16 testable policies.

Below is the Rego logic for Teams 2.3, where we check whether any policies have a property that is listed as SHOULD NOT be allowed.

Since no policies are returned, the Rego check passes (no policies meet the fail condition) and nothing is output to the report.

The Rego logic for Teams 2.2 likewise passes, because no policies meet the fail condition.

This needs to be addressed fully in the Rego, which we expect will take a large amount of rework unless a quicker fix can be found.
The temporary solution for the Coral release is to catch the false positives in CreateReport.psm1 and display the error.
However, TestResults.json will still contain the false positives, which is a critical bug that will need to be addressed in the near future.
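As an interim guard on the PowerShell side, empty provider sections could be flagged before the report is trusted. This is a sketch, not the actual CreateReport.psm1 fix, and the export file name and flat top-level structure are assumptions:

```powershell
# Hypothetical pre-check: warn about provider export sections that came back empty,
# since downstream Rego checks may silently "pass" on empty input
$export = Get-Content .\ProviderSettingsExport.json -Raw | ConvertFrom-Json
foreach ($prop in $export.PSObject.Properties) {
    $v = $prop.Value
    if ($null -eq $v -or ($v -is [array] -and $v.Count -eq 0)) {
        Write-Warning "Section '$($prop.Name)' is empty; related policy results may be false positives"
    }
}
```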

Report is creating errors when run with non ASCII characters

Hello

When I run the runscuba.ps1 script, the report is empty and I get an error that InputObject has a null value.

see extract powershell command

unable to parse input: yaml: invalid trailing UTF-8 octet
ConvertTo-Csv : Impossible de lier l'argument au paramètre « InputObject », car il a la valeur Null.
[English: Cannot bind the argument to parameter "InputObject" because it has a null value.]
Au caractère C:\scubaO365\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:272 : 46
+ ... $TestResultsCsv = $TestResults | ConvertTo-Csv -NoTypeInformation
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData : (:) [ConvertTo-Csv], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.ConvertToCsvCommand
(The same error is emitted a second time.)

Do you have a solution?

thanks

PowerPlatform policy 2.1 is missing a check in the Rego

Problem

PowerPlatform policy 2.1 contains two separate implementation configurations in the baseline instructions, steps 4 and 5. The Rego is currently only checking for step 4, so configuration step 5 is missing.


The Fix

Another Rego check for policy 2.1 needs to be added to the code. The Rego should check that the disableTrialEnvironmentCreationByNonAdminUsers field in the provider export is equal to true.
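The missing check can be sketched against the provider export from PowerShell; only the field name comes from the issue, while the file name and the surrounding JSON path are assumptions:

```powershell
# Hypothetical check for baseline step 5: trial environment creation by non-admins disabled.
# The 'environment_creation' path is an assumed location for the field within the export.
$export = Get-Content .\ProviderSettingsExport.json -Raw | ConvertFrom-Json
$step5 = $export.environment_creation.disableTrialEnvironmentCreationByNonAdminUsers -eq $true
if (-not $step5) { Write-Warning 'PowerPlatform 2.1 step 5 is not configured' }
```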

Adjust exported cmdlet names to be ScubaGear specific

    I'm wondering if, in the future, we rename all exported commands to include the name "Scuba" in them, so as to distinguish them from any other cmdlets on the system? This would be similar to the way MS Graph has "Mg" (Get-MgUser) in its cmdlet names. I don't think this is critical for Coral.

Originally posted by @tkol2022. This is probably most important for exported cmdlets.

e.g. Disconnect-Tenant -> Disconnect-SCuBATenant
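During a rename like this, backward compatibility could be preserved with an alias; a sketch using the old/new names from the example above:

```powershell
# Hypothetical transition shim inside the module's .psm1:
# keep the old cmdlet name working after the rename
Set-Alias -Name Disconnect-Tenant -Value Disconnect-SCuBATenant
Export-ModuleMember -Function Disconnect-SCuBATenant -Alias Disconnect-Tenant
```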

Add GCCHigh Endpoint

I modified the code to connect to my GCC High tenants, but it would be nice to have a more elegant built-in endpoint.

ProviderSettingsExport.json is broken JSON

🐛 Summary

ProviderSettingsExport.json is broken JSON, resulting in the following reporting error:

ConvertFrom-Json : Invalid JSON primitive: .
At C:\Users\Sankgreall\Documents\AzureDevOps\ScubaGear\PowerShell\ScubaGear\Modules\CreateReport\CreateReport.psm1:39
char:44
+ $SettingsExport =  Get-Content $FileName | ConvertFrom-Json
+                                            ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ConvertFrom-Json], ArgumentException
    + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.ConvertFromJsonCommand

Upon investigating ProviderSettingsExport.json and sticking it in a JSON validator, it flagged the following section:

            "conditional_access_policies": ,
    "authorization_policies": {
    "AllowEmailVerifiedUsersToJoinOrganization":  false,
    "AllowInvitesFrom":  "everyone",
    "AllowUserConsentForRiskyApps":  null,
    "AllowedToSignUpEmailBasedSubscriptions":  true,
    "AllowedToUseSspr":  true,
    "BlockMsolPowerShell":  false,
    "DefaultUserRoleOverrides":  null,
    "DefaultUserRolePermissions":  {
                                       "AllowedToCreateApps":  true,
                                       "AllowedToCreateSecurityGroups":  true,
                                       "AllowedToReadBitlockerKeysForOwnedDevice":  true,
                                       "AllowedToReadOtherUsers":  true
                                   },
    "DeletedDateTime":  null,
    "Description":  "Used to manage authorization related settings across the company.",
    "DisplayName":  "Authorization Policy",
    "EnabledPreviewFeatures":  [

                               ],
    "GuestUserRoleId":  "10dae51f-b6af-4016-8d66-8c2a99b929b3",
    "Id":  "authorizationPolicy",
    "PermissionGrantPolicyIdsAssignedToDefaultUserRole":  [
                                                              "ManagePermissionGrantsForSelf.microsoft-user-default-legacy"
                                                          ],
    "AdditionalProperties":  {

                             }
},

I'm assuming this is supposed to be either "conditional_access_policies": "[some value]" or "conditional_access_policies": { ...nested object... }
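A quick way to confirm the malformed export from PowerShell, before reaching for an external validator (file name taken from the error above):

```powershell
# Attempt to parse the provider export; a malformed file throws an exception
try {
    Get-Content .\ProviderSettingsExport.json -Raw | ConvertFrom-Json | Out-Null
    'JSON is well-formed'
} catch {
    "JSON is broken: $($_.Exception.Message)"
}
```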

To reproduce

Steps to reproduce the behavior:

  • I've been conducting my testing on a newly provisioned O365 tenant under the MS Developer Sandbox scheme. This is on E3 licensing and configured with a baseline of features.

  • Running the script against this tenant led to this issue. It should be reproducible, as all sandbox tenants are theoretically the same.

Expected behavior

The report should be populated, but the tool is unable to parse the JSON.

Cannot connect to multiple tenants

🐛 Summary

We have run this on our test tenant and then against a dev environment.
When we get the report from the dev environment, it lists users and settings from the test setup, even after a reboot, deleting the scuba folder, and running again from a new location.

To reproduce

Steps to reproduce the behavior:

  1. Run Scuba against tenant 1
  2. Run Scuba against tenant 2
  3. Review reports and you will see details from tenant 1 in tenant 2.

Expected behavior

Would have expected a sign-in to occur for each tenant (which did seem to occur), with the first run reporting data from tenant 1 only, and tenant 2 then being prompted to set up the PowerShell app and reporting data from tenant 2 only.
See related issues for getting the initial domain.

Any helpful log output or screenshots


Unit Testing Passing Parameters

When using Pester, there is an issue passing command-line parameters to the tests.
This needs to be resolved so that users won't have to edit the tests themselves.
Currently trying to find a workaround.
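One possible workaround, assuming Pester 5 is in use: pass values through a Pester container instead of editing the test file. The test file declares a `param` block and `Invoke-Pester` supplies the data (the parameter name and value below are illustrative):

```powershell
# MyTests.Tests.ps1 would declare:  param($TenantDomain)
# A Pester 5 container passes parameter values in without editing the tests.
$Container = New-PesterContainer -Path '.\MyTests.Tests.ps1' -Data @{
    TenantDomain = 'contoso.onmicrosoft.com'   # illustrative value
}
Invoke-Pester -Container $Container -Output Detailed
```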

Keep track of the latest Microsoft Teams and Microsoft Graph PowerShell module versions

Creating this issue to track the MicrosoftTeams PowerShell module 5.0 release.
SetUp.ps1 installs the latest module versions by default.
For ScubaGear, we are enforcing a maximum version of 4.9999 for MicrosoftTeams. The currently released version is 4.9.1.
Based on the Teams module release pattern, 5.0 should ship within the next few months.

When the Teams module updates to 5.0, this issue covers testing it and raising the maximum and minimum allowable versions in RequiredVersions.ps1 (Teams 4.9.1 added application-based auth support).
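The pinning described above might look something like the following; the hashtable shape is illustrative, not the actual contents of RequiredVersions.ps1:

```powershell
# Illustrative version pin for the MicrosoftTeams module. Raising the
# minimum past 4.9.1 would also guarantee application-based auth support;
# the maximum holds back the 5.0 release until it has been tested.
$ModuleList = @(
    @{
        ModuleName     = 'MicrosoftTeams'
        ModuleVersion  = '4.9.1'    # minimum allowed
        MaximumVersion = '4.9999'   # exclude the 5.0 release for now
    }
)
foreach ($Module in $ModuleList) {
    Install-Module -Name $Module.ModuleName `
        -MinimumVersion $Module.ModuleVersion `
        -MaximumVersion $Module.MaximumVersion `
        -Scope CurrentUser
}
```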

Unexpected exception returned from msal

🐛 Summary

When attempting to scan AAD, multiple AAD prompts occur even though the account being used is a Global Reader or even Global Admin and the enterprise application has the appropriate consent granted for the organization. This occurs during the "Running the AAD Provider; 1 of 1 Product settings extracted" step. If you respond to the repeated authentication prompts about 20 times, one of two things will occur.

  1. PowerShell will eventually return an error saying "Unexpected exception returned from msal".
  2. MS logon will deny the sign-in with an error that says: "We couldn't sign you in, please try again later". Selecting the option for "use another account" and supplying the same credentials will result in the error above from item 1.

To reproduce

Steps to reproduce the behavior:

  1. Run RunSCuBA.ps1 with $LogIn = $true and aad included in $ProductNames.

Expected behavior

Should complete the AAD check

Any helpful log output or screenshots

ERROR when getting the MS "we couldn't sign you in..."

Export-AADProvider : Check the second error message below and if it appears to be related to permissions, your user
account must have a minimum of Global Reader role to run this script. You must also get an administrator to consent to the required MS Graph Powershell application permissions. View the README file for detailed instructions and then try again. At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:154 char:31 + $RetVal = Export-AADProvider | Select-Object -Las ... + ~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Export-AADProvider

Get-MgRoleManagementDirectoryRoleAssignmentScheduleInstance : Code: generalException
Message: Unexpected exception returned from MSAL.
At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Providers\ExportAADProvider.psm1:221 char:34

  • ... gnments = @(Get-MgRoleManagementDirectoryRoleAssignmentScheduleInstan ...
  •             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    • CategoryInfo : NotSpecified: (:) [Get-MgRoleManag...leInstance_List], AuthenticationException
    • FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgRoleManagementDirectoryRoleAssignmentScheduleIns
      tance_List

ERROR when just clicking on the authentication account about 20 times.
PS C:\temp2\ScubaGear-main> .\RunSCuBA.ps1
Export-AADProvider : Check the second error message below and if it appears to be related to permissions, your user
account must have a minimum of Global Reader role to run this script. You must also get an administrator to consent to the required MS Graph Powershell application permissions. View the README file for detailed instructions and then try again. At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Orchestrator.psm1:154 char:31 + $RetVal = Export-AADProvider | Select-Object -Las ... + ~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Export-AADProvider

Get-MgUser : Code: generalException
Message: Unexpected exception returned from MSAL.
At C:\temp2\ScubaGear-main\PowerShell\ScubaGear\Modules\Providers\ExportAADProvider.psm1:120 char:17

  • ... $AADUser = Get-MgUser -ErrorAction Stop -UserId $User.Id
  •              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    
    • CategoryInfo : NotSpecified: (:) [Get-MgUser_Get], AuthenticationException
    • FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgUser_Get


Teams Provider Error after leaving the PowerShell terminal running


Unable to repeat this error consistently; it only seems to happen on Test Tenant 1 (G5) and does not occur on Test Tenant 3 (E5), where I'm also testing. Documenting this odd error here.

As far as I can tell, it only happens when I leave my PowerShell terminal connected to Test Tenant 1 (G5) for some period of time. When I then rerun ScubaGear with teams in $ProductNames, an error affecting only the Get-CsTeamsMeetingPolicy cmdlet pops up from the Teams provider.

Running the disconnect cmdlet (e.g. Disconnect-MicrosoftTeams) or closing and opening new PowerShell terminals doesn't make this error go away. Instead, the error vanishes by itself if I just go do something else for an hour.

Investigate how to consistently alert the user when they do not have the required permissions

I ran the Coral release code for each of the products independently with a test user that does not have any Azure AD roles.

Although each product does alert the user that permissions are missing, the output message is very different for each product. Also, in some cases (EXO/Defender/Teams) the tool still produces a report, whereas for other products (AAD, OneDrive, SharePoint) no report is produced.

The scope of this issue is to investigate whether there is a way to consistently display a message alerting the user about the missing permissions, and to ensure the tool handles the scenario the same way for every product. This is probably not a high-priority issue; it is more of a code standardization/cleanup task.
The screenshots below show how each product handles not having the required user role.


Fix error handling bugs for teams Rego 2.10

Issue found: Teams rego for 2.10 should be modified
Should be an easy fix: just two lines in 2.10 need to change, so the check uses
only "Get-CsTeamsMeetingBroadcastPolicy" instead of the currently referenced "Get-CsTeamsMeetingPolicy".
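The change amounts to reading the broadcast-policy export rather than the meeting-policy export in the 2.10 rule. A hedged Rego sketch; the actual rule and input key names live in the Teams baseline Rego and may differ:

```rego
package teams

# Illustrative sketch only: policy 2.10 should consume the export produced
# by Get-CsTeamsMeetingBroadcastPolicy, not Get-CsTeamsMeetingPolicy.
# The input key name below is an assumption about the provider export shape.
BroadcastPolicies[Policy] {
    Policy := input.broadcast_policies[_]
}
```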

Originally posted by @Dylan-MITRE

Azure Government and GCC High Support

💡 Summary

Added some logic and a new parameter to support targeting Azure Government and GCC High cloud environments. Added examples and details to the readme.

Motivation and context

We work almost exclusively with clients in Azure Government and GCC High. Targeting the various services with different connection strings requires some tedious checking, and beyond the connection strings there are various differences between commercial Azure/M365 and GCC High.

With this change you can set a parameter to target your cloud environment, or further expand to the other Azure clouds (e.g. DOD).

Implementation notes

The forked repository could be pulled into this repo; however, many files were edited to make sure the cloud environment identity is carried through the modules, so a review needs to take place.

https://github.com/sentinelblue/ScubaGear

Acceptance criteria

How do we know when this work is done?

  • Validated against a GCC High tenant. I validated against our tenant in GCC High/Azure Gov.
  • How should errors caused by configuration differences between GCC High and commercial be handled? The Teams provider throws some errors due to missing properties from the 'Get-CsTenant' cmdlet; it appears these values aren't populated in my tenant, or aren't populated in GCC High at all.

Fix EXO deprecated alert policies in MS.EXO.16.1


Defender 2.9 was showing as a fail in the report and highlighted 2 policies.

  • "Malware campaign detected after delivery"
  • "Unusual increase in email reported as phish"

Both of these prebuilt alert policies have disappeared from the Alert Policy list and thus from current Provider exports.
I looked back at an older Provider JSON and found that the policies were still there a little over a month ago.


The names of these policies are listed in EXO 2.16, so this will require both a baseline policy update and a Rego code change.

Improve Cached Tenant Settings cmdlet support

These are my comments on Run-Cached:

  • I tested it with no errors. It only works if there is a ProviderSettingsExport.json in the current directory; see my last bullet.
  • Rename it to make it unique, just like we did for Disconnect-Tenant. FYI, Cassey is referencing Run-Cached in her regression test script. (This may be OBE, as the regression test script is likely replaced by the new testing framework.)
  • When I execute Run-Cached without any parameters, it simply acts like Run-Scuba. I think we should change the default behavior to distinguish it, perhaps by defaulting -ExportProvider to $false.
  • Since we are exporting Run-Cached, do we need to add instructions to the README, or do we let developers figure the script out by examining the code on their own?
  • The script should output a friendly message when there is no ProviderSettingsExport.json file in the current directory or in OutPath. Right now the user gets a bunch of generic errors.

Originally posted by @tkol2022

It should also be renamed from Invoke-RunCached to Invoke-SCuBACached.
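The friendly-message suggestion in the last bullet could be as simple as a Test-Path guard before the cached run. A sketch only; the cmdlet body and message wording are hypothetical:

```powershell
# Hypothetical guard at the top of the cached-run cmdlet: fail fast with a
# clear message instead of a cascade of generic downstream errors.
$CachedExport = Join-Path -Path $OutPath -ChildPath 'ProviderSettingsExport.json'
if (-not (Test-Path -Path $CachedExport)) {
    throw ("No ProviderSettingsExport.json found in '$OutPath'. " +
           "Run the full tool first to generate a provider export, " +
           "or point -OutPath at a folder containing one.")
}
```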

Add note about Unblock-File

💡 Summary

Add note about Unblock-File.

Motivation and context

Needed when running files downloaded from the internet, since Windows marks them as blocked.

Implementation notes

cd ScubaGear-0.1.0
Get-ChildItem -Recurse . | Unblock-File

Acceptance criteria

When this has been added.

Explore how to improve the execution performance of the Azure AD provider

The Azure AD export provider takes by far the longest of all the products and can run for over a minute.
I ran a profiler tool and determined that most of the time in this provider is spent in the functions Get-PrivilegedUser and Get-PrivilegedRole. Both functions contain loops, and inside those loops calls are made to back-end M365 services.

This issue is to explore if there's a way we can make significant improvements to the execution time by exploring alternative provider export algorithms. The ideal outcome of this issue would be a code prototype.
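One direction such a prototype could take: replace the per-user Get-MgUser call inside the loop (visible in the MSAL error trace above) with one filtered request per chunk of IDs. A sketch only, assuming the Microsoft.Graph PowerShell module and a pre-collected `$UserIds` array; Graph's `in` filter operator accepts roughly 15 values per expression, hence the chunking:

```powershell
# Sketch: instead of calling Get-MgUser once per privileged user inside a
# loop, fetch users in chunks with a single filtered request per chunk.
$ChunkSize = 15   # Graph's 'in' operator limit per filter expression
for ($i = 0; $i -lt $UserIds.Count; $i += $ChunkSize) {
    $Chunk  = $UserIds[$i..([Math]::Min($i + $ChunkSize, $UserIds.Count) - 1)]
    # Build a filter like: id in ('guid1','guid2',...)
    $Filter = "id in ({0})" -f (($Chunk | ForEach-Object { "'$_'" }) -join ',')
    Get-MgUser -Filter $Filter -ErrorAction Stop
}
```

Whether this is meaningfully faster than the current loop would need to be measured as part of the prototype.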

Sample Report Output

Is it possible to provide a sample output of the report generated by this tool?

Getting multiple product results when any number of products are assigned to the ProductNames variable.

🐛 Summary


When setting the $ProductNames variable to any subset of products, the script runs against all products anyway.

Steps to reproduce the behavior:

  1. Set the parameters:

     $LogIn = $true
     $ProductNames = @("aad")
     $Endpoint = ""
     $OutPath = "./Reports"
     $OPAPath = "./"

  2. Run .\RunSCuBA.ps1

Expected behavior


The script should only log into AAD and give results for AAD. Instead, all products are logged into and results are given for each.

Add any screenshots of the problem here.

Below is the output from the results of selecting just aad to be tested.

Baseline Conformance Reports    Details
Azure Active Directory          9 tests passed, 3 warnings, 10 tests failed, 8 manual checks needed
Microsoft 365 Defender          67 tests passed, 7 warnings, 3 tests failed, 6 manual checks needed
Exchange Online                 10 tests passed, 4 tests failed, 23 manual checks needed
OneDrive for Business           4 tests passed, 1 warning, 1 test failed, 2 manual checks needed
SharePoint Online               4 tests passed, 1 test failed, 1 manual check needed
Microsoft Teams                 16 tests passed, 9 manual checks needed
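The expected behavior amounts to the orchestrator iterating only over $ProductNames rather than over every provider. A hypothetical sketch of the intended dispatch; the provider function names are illustrative and only Export-AADProvider is known from the error traces above:

```powershell
# Hypothetical dispatch: only the products the user asked for should be
# connected to and exported.
$ProductNames = @("aad")
foreach ($Product in $ProductNames) {
    switch ($Product) {
        'aad'   { Export-AADProvider }          # known from the module
        'teams' { Export-TeamsProvider }        # illustrative name
        default { Write-Warning "Unknown product: $Product" }
    }
}
```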
