
sharpliner's Introduction


Sharpliner is a .NET library that lets you define Azure DevOps pipelines in C# instead of YAML. Exchange YAML indentation problems for the type-safe environment of C#, and let IntelliSense speed up your work!

Getting started

All you have to do is reference our NuGet package in your project, override a class with your definition, and dotnet build the project. Dead simple!
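
For instance, the package reference in your project file looks like this (the version is illustrative):

<ItemGroup>
  <PackageReference Include="Sharpliner" Version="1.*" />
</ItemGroup>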

For more detailed steps, check our documentation.

Example

// Just override prepared abstract classes and `dotnet build` the project, nothing else is needed!
// For a full list of classes you can override
//    see https://github.com/sharpliner/sharpliner/blob/main/src/Sharpliner/AzureDevOps/PublicDefinitions.cs
// You can also generate collections of definitions dynamically
//    see https://github.com/sharpliner/sharpliner/blob/main/docs/AzureDevOps/DefinitionCollections.md
class PullRequestPipeline : SingleStagePipelineDefinition
{
    // Say where to publish the YAML to
    public override string TargetFile => "eng/pr.yml";
    public override TargetPathType TargetPathType => TargetPathType.RelativeToGitRoot;

    public override SingleStagePipeline Pipeline => new()
    {
        Pr = new PrTrigger("main"),

        Variables =
        [
            // YAML ${{ if }} conditions are available with handy macros that expand into the
            // expressions such as comparing branch names. We also have "else"
            If.IsBranch("net-6.0")
                .Variable("DotnetVersion", "6.0.100")
                .Group("net6-keyvault")
            .Else
                .Variable("DotnetVersion", "5.0.202"),
        ],

        Jobs =
        [
            new Job("Build")
            {
                Pool = new HostedPool("Azure Pipelines", "windows-latest"),
                Steps =
                [
                    // Many tasks have helper methods for shorter notation
                    DotNet.Install.Sdk(variables["DotnetVersion"]),

                    // You can also specify any pipeline task in full too
                    Task("DotNetCoreCLI@2", "Build and test") with
                    {
                        Inputs = new()
                        {
                            { "command", "test" },
                            { "projects", "src/MyProject.sln" },
                        }
                    },

                    // Frequently used ${{ if }} statements have readable macros
                    If.IsPullRequest
                        // You can load script contents from a .ps1 file and inline them into YAML
                        // This way you can write scripts with syntax highlighting separately
                        .Step(Powershell.FromResourceFile("New-Report.ps1", "Create build report")),
                ]
            }
        ],
    };
}

Sharpliner features

Apart from the obvious benefits of using a statically typed language with IDE support - never having to deal with indentation problems again, being able to split the code easily, and the ability to generate YAML programmatically - there are several other benefits to using Sharpliner.

Intellisense

One of the best things about using Sharpliner is that you won't have to go to the YAML reference every time you add a new piece to your pipeline. With everything strongly typed, your IDE will give you hints all the way!

(Image: example IntelliSense for pipeline variables)

Nice APIs

Imagine you want to install the .NET SDK. For that, Azure Pipelines has the DotNetCoreCLI@2 task. However, this task's specification is quite long, since the task does many things:

# .NET Core
# Build, test, package, or publish a dotnet application, or run a custom dotnet command
# https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/build/dotnet-core-cli?view=azure-devops
- task: DotNetCoreCLI@2
  inputs:
    command: 'build' # Options: build, push, pack, publish, restore, run, test, custom
    publishWebProjects: true # Required when command == Publish
    projects: # Optional
    custom: # Required when command == Custom
    arguments: # Optional
    publishTestResults: true # Optional
    testRunTitle: # Optional
    zipAfterPublish: true # Optional
    modifyOutputPath: true # Optional
    feedsToUse: 'select' # Options: select, config
    vstsFeed: # Required when feedsToUse == Select
    feedRestore: # Required when command == restore. projectName/feedName for project-scoped feed. FeedName only for organization-scoped feed.
    includeNuGetOrg: true # Required when feedsToUse == Select
    nugetConfigPath: # Required when feedsToUse == Config
    externalFeedCredentials: # Optional
    noCache: false
    restoreDirectory:
    restoreArguments: # Optional
    verbosityRestore: 'Detailed' # Options: -, quiet, minimal, normal, detailed, diagnostic
    packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg' # Required when command == Push
    nuGetFeedType: 'internal' # Required when command == Push. Options: internal, external
    publishVstsFeed: # Required when command == Push && NuGetFeedType == Internal
    publishPackageMetadata: true # Optional
    publishFeedCredentials: # Required when command == Push && NuGetFeedType == External
    packagesToPack: '**/*.csproj' # Required when command == Pack
    packDirectory: '$(Build.ArtifactStagingDirectory)' # Optional
    nobuild: false # Optional
    includesymbols: false # Optional
    includesource: false # Optional
    versioningScheme: 'off' # Options: off, byPrereleaseNumber, byEnvVar, byBuildNumber
    versionEnvVar: # Required when versioningScheme == byEnvVar
    majorVersion: '1' # Required when versioningScheme == ByPrereleaseNumber
    minorVersion: '0' # Required when versioningScheme == ByPrereleaseNumber
    patchVersion: '0' # Required when versioningScheme == ByPrereleaseNumber
    buildProperties: # Optional
    verbosityPack: 'Detailed' # Options: -, quiet, minimal, normal, detailed, diagnostic
    workingDirectory:

Notice how some of the properties are only valid in specific combinations with others. With Sharpliner, we remove some of this complexity using nice fluent APIs:

DotNet.Install.Sdk(parameters["version"]),

DotNet.Restore.FromFeed("dotnet-7-preview-feed", includeNuGetOrg: false) with
{
    ExternalFeedCredentials = "feeds/dotnet-7",
    NoCache = true,
    RestoreDirectory = ".packages",
},

DotNet.Build("src/MyProject.csproj") with
{
    Timeout = TimeSpan.FromMinutes(20)
},

Useful macros

Some very common pipeline patterns, such as comparing the current branch name or detecting pull requests, are cumbersome to express in YAML (long conditions full of complicated ${{ if }} syntax). For many of these, we have handy macros that give you shorter, more readable code.

For example, this YAML

${{ if eq(variables['Build.SourceBranch'], 'refs/heads/production') }}:
    name: rg-suffix
    value: -pr
${{ if ne(variables['Build.SourceBranch'], 'refs/heads/production') }}:
    name: rg-suffix
    value: -prod

can become this C#

If.IsBranch("production")
    .Variable("rg-suffix", "-pr")
.Else
    .Variable("rg-suffix", "-prod")

Re-usable pipeline blocks

Sharpliner lets you re-use code more easily than YAML templates do. Apart from obvious C# code re-use, you can also define sets of C# building blocks and re-use them in your pipelines:

class ProjectBuildSteps : StepLibrary
{
    public override List<Conditioned<Step>> Steps =>
    [
        DotNet.Install.Sdk("6.0.100"),

        If.IsBranch("main")
            .Step(DotNet.Restore.Projects("src/MyProject.sln")),

        DotNet.Build("src/MyProject.sln"),
    ];
}

You can then reference this library in between build steps and it will get expanded into the pipeline's YAML:

...
    new Job("Build")
    {
        Steps =
        [
            Script.Inline("echo 'Hello World'"),

            StepLibrary<ProjectBuildSteps>(),

            Script.Inline("echo 'Goodbye World'"),
        ]
    }
...

More about this feature can be found in DefinitionLibraries.md.

Sourcing scripts from files

When you need to add cmd, PowerShell, or Bash steps to your pipeline, maintaining these bits inside YAML can be error-prone. With Sharpliner you can keep scripts in their own files (.ps1, .sh, ...) where you get the natural environment you're used to, such as syntax highlighting. Sharpliner gives you APIs to load these at build time and include them inline:

Steps =
[
    Bash.FromResourceFile("embedded-script.sh") with
    {
        DisplayName = "Run post-build clean-up",
        Timeout = TimeSpan.FromMinutes(5),
    },
]

Correct variable/parameter types

A frequent struggle people have with Azure Pipelines is using the right type of variable in the right context. Be it a compile-time ${{ parameter }}, a runtime variables['name'], or the $(macro) syntax, with Sharpliner you won't have to worry about which one to pick, as it understands the context and selects the right one for you.
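
For illustration, both indexers shown in the examples above serialize to the appropriate syntax automatically (a short sketch reusing those same calls):

Steps =
[
    // parameters["version"] expands to the compile-time form ${{ parameters.version }}
    DotNet.Install.Sdk(parameters["version"]),

    // variables["DotnetVersion"] expands to the runtime macro form $(DotnetVersion)
    DotNet.Install.Sdk(variables["DotnetVersion"]),
]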

Pipeline validation

Your pipeline definition can be validated during publishing, uncovering issues, such as typos inside dependsOn, that you would otherwise only find by trying to run the pipeline in CI. This gives you a faster dev loop and greater productivity.

We are continuously adding new validations as we find new error-prone spots. Each validation can be individually configured or silenced in case you don't wish to take advantage of it:

class YourCustomConfiguration : SharplinerConfiguration
{
    public override void Configure()
    {
        // You can set severity for various validations
        Validations.DependsOn = ValidationSeverity.Off;
        Validations.Name = ValidationSeverity.Warning;

        // You can also further customize serialization
        Serialization.PrettifyYaml = false;
        Serialization.UseElseExpression = true;
        Serialization.IncludeHeaders = false;

        // You can add hooks that execute during the publish process
        Hooks.BeforePublish = (definition, path) => {};
        Hooks.AfterPublish = (definition, path) => {};
    }
}

Something missing?

If you find a missing feature / API / property / use case, file an issue in the project's repository. We try to be very responsive, and for small asks we can deliver a new version very fast.

If you want to start contributing, either you already know about something missing, or you can choose from the open issues. We will help you with reviewing your first change so that you can continue with something more advanced!

Another way to start is to try Sharpliner on one of your own existing pipelines. This way you can uncover missing features, or introduce shortcuts for definitions of build tasks you use frequently. Contributions like these are also very welcome! In such cases, it is worth describing your intent in an issue first.

Developing Sharpliner

Contributions are very welcome, and if you find yourself opening the codebase, there are a couple of things you should know. The repository layout is quite simple:

.
├── artifacts            # All build outputs go here. Nuke it to clean
├── docs                 # Documentation
├── eng                  # CI/CD for the repo
│   ├── Sharpliner.CI    # C# definitions for pipelines of this repo
│   └── pipelines        # YAML pipelines of the repository
├── src
│   └── Sharpliner       # Main Sharpliner project
│       └── build        # Targets/props for the Sharpliner .nupkg
├── tests
│   ├── NuGet.Tests      # E2E tests using the Sharpliner .nupkg
│   └── Sharpliner.Tests # Unit tests for the main Sharpliner project
└── Sharpliner.sln       # Main solution of the project

Developing is quite easy - open the Sharpliner.sln solution in VS. However, the solution won't build 100% the first time. This is because of the Sharpliner.CI project, which uses Sharpliner itself and defines the pipelines for the Sharpliner repository - the YAML is published to eng/pipelines. This way, we test many Sharpliner features right in the PR build. The Sharpliner.CI project expects a locally built Sharpliner.43.43.43.nupkg package, which it then references, simulating real usage of Sharpliner from nuget.org.

To build the whole solution, you have to build Sharpliner.CI from the console, as building inside VS won't work on a cold checkout. This packs Sharpliner.csproj first and produces the 43.43.43 package:

> dotnet build eng/Sharpliner.CI/Sharpliner.CI.csproj

If you make changes to the main library and want to test them via Sharpliner.CI, clean and then build the CI project from the console:

> dotnet clean eng/Sharpliner.CI/Sharpliner.CI.csproj
> dotnet build eng/Sharpliner.CI/Sharpliner.CI.csproj

sharpliner's People

Contributors

aditnryn, aheiming, iamebonyhope, jrbancel, jshield, mandel-macaque, mathiasi, modernronin, premun, thomhurst, topperdel


sharpliner's Issues

Make it easier to add one-line inline scripts

Motivation

Many times you want to include a very long shell invocation as a script that should fit on one line, e.g.:

dotnet tool install Microsoft.DotNet.XHarness.CLI                                                   \
    --global                                                                                        \
    --add-source https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-eng/nuget/v3/index.json \
    --version "1.0.0-prerelease*"

However, you either need to escape the EOLs or concatenate it as a long C# string, which then makes the YAML harder to read.

Goal

Add an API that tells the YAML serializer to use the folded notation (the > sign), which joins the lines of an array into one, and allow the user to supply an array of strings.

Or some other pleasant way of dealing with this use case.
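
For illustration, the folded block style would serialize the example above roughly like this (a sketch of the desired output, not an existing API):

- script: >
    dotnet tool install Microsoft.DotNet.XHarness.CLI
    --global
    --add-source https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-eng/nuget/v3/index.json
    --version "1.0.0-prerelease*"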

F5-ing Sharpliner projects in VS fails to build

The MSBuild tasks that publish Sharpliner pipelines from user projects only support .NET 5 and 6.

Since VS uses the .NET Framework MSBuild internally, this means that "F5-ing" projects that use Sharpliner in Visual Studio returns this error:

MSB4062 The Sharpliner.PublishPipelines task could not be loaded from the assembly Sharpliner.dll. Could not load file or assembly 'System.Runtime, Version=5.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified. Confirm that the <UsingTask> declaration is correct, that the assembly and all its dependencies are available, and that the task contains a public class that implements Microsoft.Build.Framework.ITask

I am currently not aware of a workaround. I tried targeting netstandard2.1, but that is unfortunately not possible for Sharpliner due to its use of records and other newer language features.

More details: https://natemcmaster.com/blog/2017/07/05/msbuild-task-in-nuget/

Missing Azure pipelines expressions

Context

Currently, we can do expressions such as or() and eq(), plus a few more, but not all of them.

Goals

  • We need to implement all expressions such as containsValue - full list (#147)
  • We need some way to let the user specify any string expression, in case there is an edge case we forget about (#146)
  • All of the current expressions must accept string parameters, because we need to be able to assemble any expression (#144)
  • Make or and and accept a variable number of parameters (#144) - see the sketch below
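
A sketch of what variadic or() accepting plain string expressions might look like (hypothetical usage, extrapolated from the existing If API; the containsValue string is the raw-expression fallback from the second goal):

If.Or(
        IsBranch("main"),
        IsPullRequest,
        "containsValue(variables['tags'], 'nightly')")
    .Variable("RunExtendedTests", "true"),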

Validate checkout repository references

Say, you do something like:

 new RepositoryCheckoutTask("Cloud") { FetchDepth = 1 }

without having defined the repository under resources.

Currently, this compiles without error; only when trying to run the pipeline on DevOps will you get the message:

Checkout of repository 'Cloud' is not supported. Only 'self', 'none', or a repository alias are supported.
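
For context, the checkout is only valid when the repository is declared under resources, e.g. (project/repository names illustrative):

resources:
  repositories:
  - repository: Cloud
    type: git
    name: MyProject/Cloud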

It would be nice (and I think in the spirit of Sharpliner) if this could be caught at compile time.

Cheers,
Markus / MR

Add source generators for template references

When you define a template, it would be nice if you could also get a way to reference/include it for free, with strong typing.

Example template:

class InstallDotNetTemplate : StepTemplateDefinition
{
    public override string TargetFile => "install-dotnet.yml";

    public override List<TemplateParameter> Parameters => new()
    {
        BooleanParameter("fullSdk", defaultValue: true),
        StringParameter("version"),
    };

    public override ConditionedList<Step> Definition => ...
}

When referencing this template, you need to do this:

StepTemplate("install-dotnet.yml", new()
{
    { "fullSdk", true },
    { "version", "7.0.100-preview.4.22252.9" }
}),

It would be nice if, instead, Sharpliner used source generators to generate a method for every template, which would give you this API:

public static Template<Step> InstallDotNetTemplate(string version, bool fullSdk = true);

Requirements

The generated code should

  • Respect parameter order and default values
  • Figure out the location and file name (though this might be difficult)
  • Make sure generated APIs for similar templates do not conflict over names - we should just respect the namespaces then?
  • Theoretically, we could ask template authors for isRequired to decide whether to generate a default value for an argument

Further, the generators could be in a separate NuGet that you would install to turn on this behaviour.


Allow conditional values in Dictionaries (e.g. template parameters)

We should allow the following:

      parameters:
        runtimeFlavor: ${{ parameters.runtimeFlavor }}
        stagedBuild: ${{ parameters.stagedBuild }}
        buildConfig: ${{ parameters.buildConfig }}
        ${{ if eq(parameters.passPlatforms, true) }}:					<---- THIS
          platforms: ${{ parameters.platforms }}
        helixQueueGroup: ${{ parameters.helixQueueGroup }}
        crossBuild: true
        crossrootfsDir: '/crossrootfs/arm64'
        ${{ insert }}: ${{ parameters.jobParameters }}

Validate artifact names in AzDO pipelines

When validating the pipeline before publishing, we could check that all required artifacts are also published.

We could compare the list of published artifacts (gather the artifact names from all Publish tasks) and search for all Download tasks that download from the current build.
If a Download task references an artifact that does not appear in any Publish task (and, ideally, no variables are used in the Publish tasks), we can probably warn or error out, as sketched below.
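
A minimal sketch of the proposed comparison, operating on hypothetical inputs (the artifact names gathered from the Publish and Download tasks):

using System.Collections.Generic;
using System.Linq;

static IEnumerable<string> FindUnpublishedArtifacts(
    IEnumerable<string> publishedArtifacts,   // names from all Publish tasks
    IEnumerable<string> downloadedArtifacts)  // names from Download tasks targeting the current build
{
    var published = new HashSet<string>(publishedArtifacts);

    // Any downloaded artifact that no Publish task produces deserves a warning/error
    return downloadedArtifacts.Where(name => !published.Contains(name));
}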

Create a way to define/consume templates

We need to figure out if:

  • We want to support YAML templates still or if C# will overcome these and we won't need them
  • We should have our own way of consuming C# definitions (C# templates) and adding them into the pipeline
    • We need to define pieces of pipelines in C# and parametrize them
  • We need to figure out how to define a template file
    • We can only define a pipeline now, not a template file
    • Template can start at any level of the pipeline, so we have to handle that in the code somehow

Progress:

  • Consume templates in AzDO pipelines (#18)
  • Define templates (#60)
  • Add documentation (#89)

Stop producing bin/ artifacts from Sharpliner projects

When we build projects using Sharpliner, we produce binaries just like for a regular library.
These binaries might not be needed, as we could theoretically use the obj/ DLL to get the definitions from.
I am not sure it is possible, but it would be nicer not to produce these extra files.
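
One possible approach might be the standard MSBuild switches that suppress copying build output to bin/ (a sketch, untested for this scenario):

<PropertyGroup>
  <!-- Keep the compiled DLL in obj/ only; do not copy the output (or symbols) to bin/ -->
  <CopyBuildOutputToOutputDirectory>false</CopyBuildOutputToOutputDirectory>
  <CopyOutputSymbolsToOutputDirectory>false</CopyOutputSymbolsToOutputDirectory>
</PropertyGroup>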

Guardian build task

Every time we change the C# definition of a pipeline, it needs to be exported into the .yaml file in the repo that the pipeline is defined by. I currently don't see a way to keep the C# definition only.

There should be a build step that runs in PRs and verifies that the C# definition was exported into the YAML file in the repo.
It will export the definition, compare it with the YAML file, and fail if they don't match.
This way you have validation that you did the export before opening a PR.
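
A sketch of what such a guardian step could look like, based on this repository's layout (the exact task shape is up for design):

# Re-export the YAML from the C# definitions and fail if the committed files differ
- script: |
    dotnet build eng/Sharpliner.CI/Sharpliner.CI.csproj
    git diff --exit-code -- eng/pipelines
  displayName: Verify C# pipeline definitions were exported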

[Model] Remove the need of 'new' in branches for a trigger

In the sample code we have:

Pr = new DetailedPrTrigger()
{
    Branches = new()
    {
        Include = { "main", "xcode/*" }
    }
},

I am very lazy, can I remove the need of 'new'?

Pr = new DetailedPrTrigger()
{
    Branches =
    {
        Include = { "main", "xcode/*" }
    }
},

Since the generated code does not need to be efficient (we just serialise), I don't mind putting some pressure on the GC and having an object by default, removing it later. Will this complicate the serialisation? Definitely, but we are here for that :)

Script tasks should allow loading from file/resources

When we have PowerShell/bash scripts that are used as build steps, we should be able to develop them in their native environment - as standalone files - with all intellisense and IDE help we are used to.

This means that we should be able to define the step using an embedded resource or a file path which will load the contents and create the inline step.

Create shorthands for all things

Context

I think it's a bit unclear when people should use new to create objects in the pipeline definition and when they can use shorthands such as Script or Download.

We should consider creating shorthands for all the things that still require new, so that we have options for everything and people don't have to new anything.

Example

Now:

Jobs =
{
    Dotnet.Install("5.0.100"),
    Script.Inline("echo 'foo'"),
    new Job("name", "display name")
    {
        Property = value
    }
}

After:

Jobs =
{
    Dotnet.Install("5.0.100"),
    Script.Inline("echo 'foo'"),
    Job("name", "display name") with
    {
        Property = value
    }
}

The downside is having to use with.

RFC: Provide a declarative syntax

While a fluent syntax was very nice before we had collection initialisers, property initialisers and other new C# syntax, nowadays we can (at the same time) provide a declarative syntax alongside the fluent one. The way I see it is as follows:

Declarative approach

// declare a pipeline with either jobs or stages
Pipeline {
    Name = "Example pipeline",
    Resources = [
        new Repository { Url = "" },
    ],
    Variables = [
        new Variable<string> { Name = "Env", Value = "String" },
        new Variable<bool> { Name = "RunTests", Value = true },
    ],
    Stages = [
        new Stage {
            Variables = [
                new Variable<string> { Name = "StageEnv", Value = "String" },
            ],
            Jobs = [
                new Job {
                    DisplayName = "My naive job",
                    Steps = [
                        new Bash {
                            Name = "Naive bash step",
                            Path = "test-example.sh",
                        }
                    ]
                }
            ]
        }
    ]
}

With the above, we can take advantage of several language features, especially from C# 9, that will make it nice to work with, AND intellisense will work.

  • Object property initializer
  • Collection initializers
  • Variables can use generics; we should be able to infer the type and not even need to specify it.

We will need some validation, for example for loops etc., but we can let our users write very straightforward declarative definitions that are type-safe.

Schedules is written out as singular instead of plural

Example:

public class CreatePulumiImagePipeline : SingleStagePipelineDefinition
{
    public override SingleStagePipeline Pipeline =>
        new()
        {
            Name = "CreatePulumiImages",
            Schedule = new List<ScheduledTrigger> { new("31 01 * * *") },
            Variables = { Group("CommonPipelineVariables") },
            Jobs = ...
        };
    public override string TargetFile => "../Pipelines/CreatePulumiImage.yml";
}

comes out as

name: CreatePulumiImages

schedule:                       # should be "schedules"
- cron: 31 01 * * *

variables:
- group: CommonPipelineVariables

jobs: ...

Thanks for this library! Before, I tried Nuke, but that is really rather ugly. Conversely, Sharpliner has a much cleaner concept (generating YAMLs) and allows one to describe one's pipelines very nicely.

PowershellFileTask generates invalid yaml

Hi,

maybe I'm doing something wrong, but

new PowershellFileTask("tools/scripts/RunDocker.ps1")
{
    Arguments = args,
    DisplayName = "Running docker image",
    Pwsh = true
};

creates this yaml:

name: CreateReleasableImages

trigger:
  branches:
    include:
    - master
    - STABLE

variables:
- group: CommonPipelineVariables

- name: DOTNET_SKIP_FIRST_TIME_EXPERIENCE
  value: true

jobs:
- job: job
  pool:
    vmImage: ubuntu-latest
  steps:
  - checkout: self
    fetchDepth: 1

  - powershell: tools/scripts/RunDocker.ps1
    targetType: filepath
    arguments: -image ourcontainerregistry.azurecr.io/tools/pulumi:latest -command "/usr/bin/pwsh -Command {./vso.ps1; ./LogIntoAzure.ps1; ./RunPulumi.ps1 }" -doEnableDocker $true -$volumeMounts $(Build.SourcesDirectory)/image-maker:~/project
    displayName: Running docker image
    pwsh: true

which DevOps doesn't like:

/tools/Pipelines/CreateReleasableImages.yml (Line: 31, Col: 5): Unexpected value 'targetType'
/tools/Pipelines/CreateReleasableImages.yml (Line: 32, Col: 5): Unexpected value 'arguments'
/tools/Pipelines/CreateReleasableImages.yml (Line: 34, Col: 5): Unexpected value 'pwsh'

Looking at the documentation of the task, I get the impression that powershell is only for inline scripts and if one wants to use a file, one has to use the PowerShell@2 task.
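
For reference, the file-based form needs the PowerShell@2 task with a filePath input instead of the inline powershell shorthand:

- task: PowerShell@2
  displayName: Running docker image
  inputs:
    targetType: filePath
    filePath: tools/scripts/RunDocker.ps1
    arguments: -image ... # the same arguments as above
    pwsh: true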

Docker task can be made better

Only logging this issue for later. Originally taken from here - #162 (comment)

I am also not sure how much this follows the general library style (simply because I've just started using it), but maybe it's of some help:

EDIT: a few things still need generalization; this was just for my use case:

  • currently it assumes you always want to push; there will be situations when one doesn't
  • arguments and dockerfilePath have no defaults/cannot be left out
  • one would need to test with multiple tags - the syntax for this is a bit weird in the Docker task: instead of being a YAML array, it wants a multiline string

and ofc, sorry, I didn't have the time to write an approvals test.

public class DockerBuildAndPushSteps : StepLibrary
{
    public override List<Conditioned<Step>> Steps =>
        new()
        {
            new Login(ServiceConnection).Step,
            new Build(ServiceConnection, Repository, Tags, DockerfilePath, Arguments).Step,
            new Push(ServiceConnection, Repository, Tags).Step,
            new Logout(ServiceConnection).Step
        };
    public IDictionary<string, string> Arguments { get; init; }
    public string DockerfilePath { get; init; }
    public string Repository { get; init; }

    public string ServiceConnection { get; init; }
    public string[] Tags { get; init; }

    record Build(string ServiceConnection, string Repository, string[] Tags, string DockerfilePath, IDictionary<string, string> Arguments)
        : RepositoryTask(ServiceConnection, Repository, Tags)
    {
        protected override string Command => "build";
        protected override string DisplayName => "build image";
        protected override ImmutableDictionary<string, object> Inputs
        {
            get
            {
                var args = string.Join(' ', Arguments.Select(toSecretArg));
                return base.Inputs.Add("Dockerfile", DockerfilePath).Add("arguments", args);

                string toSecretArg(KeyValuePair<string, string> kvp) => $"--build-arg {kvp.Key}={kvp.Value}";
            }
        }
    }

    abstract record DockerTask(string ServiceConnection)
    {
        public AzureDevOpsTask Step
        {
            get
            {
                var inputs = new TaskInputs();
                foreach (var (key, value) in Inputs.Add("command", Command).Add("containerRegistry", ServiceConnection))
                    inputs.Add(key, value);
                return new AzureDevOpsTask("Docker@2")
                {
                    DisplayName = DisplayName,
                    Inputs = inputs
                };
            }
        }
        protected virtual ImmutableDictionary<string, object> Inputs => ImmutableDictionary<string, object>.Empty;
        protected abstract string DisplayName { get; }
        protected abstract string Command { get; }
    }

    record Login(string ServiceConnection) : DockerTask(ServiceConnection)
    {
        protected override string Command => "login";
        protected override string DisplayName => "Login to ACR";
    }

    record Logout(string ServiceConnection) : DockerTask(ServiceConnection)
    {
        protected override string Command => "logout";
        protected override string DisplayName => "Logout from ACR";
    }

    record Push(string ServiceConnection, string Repository, string[] Tags) : RepositoryTask(ServiceConnection, Repository, Tags)
    {
        protected override string Command => "push";
        protected override string DisplayName => "push image";
    }

    abstract record RepositoryTask(string ServiceConnection, string Repository, string[] Tags) : DockerTask(ServiceConnection)
    {
        protected override ImmutableDictionary<string, object> Inputs =>
            ImmutableDictionary<string, object>.Empty.Add("repository", Repository).Add("tags", string.Join(Environment.NewLine, Tags));
    }
}

Usage like this:

StepLibrary(new DockerBuildAndPushSteps
            {
                ServiceConnection = "InternalRegistry",
                Repository = "tools/pulumi",
                Tags = new[] { "latest" },
                DockerfilePath = "$(Build.SourcesDirectory)/tools/docker/Dockerfile.Pulumi",
                Arguments = new Dictionary<string, string>
                {
                    ["AzureLoginApplicationId"] = "$(AzureLoginApplicationId)",
                    ["AzureLoginSecret"] = "$(AzureLoginSecret)",
                    ["AzureLoginTenantId"] = "$(AzureLoginTenantId)"
                }
            })

Create a full Azure DevOps pipeline definition example

We should have a large pipeline definition somewhere that showcases how to use all of the different parts of a pipeline, since it might not be clear to everyone after they start using Sharpliner.

It should showcase all of the weird parts, such as conditions in template parameters and so on.

Fluent definition factory

There should be a fluent factory model that enables definitions on par with, or more readable than, YAML.

Something along the lines of:

Pipelines.Create("pipeline-name")
    .TriggeredBy(...)
    .WithVariableGroup("some-variables")
    .WithVariable("IsPublic", true)
    .AddStage("stage-name", "Stage 1")
        .DependsOn("stage-foo")
        .AddStep(...)
            .OnlyWhenSucceeded()
        .AddStep(...)
    .SaveTo("azure-pipelines.yml")

Additionally, there should be implementations for known steps such as AddNuGetRestoreStep etc.

API Mock-up

This is a living issue. Edit the description directly.

public class MainPipeline : Sharpliner.PipelineDefinition
{
    protected override string TargetFile => "/azure-pipelines.yml";

    protected override Pipeline Pipeline => new()
    {
        Name = "$(Date:yyyMMdd).$(Rev:rr)",

        Triggers = new[]
        {
            new()
            {
                Branches = new[] { "main", "production" },
                IsBatch = true,
                Exclude = new[] { "tools/*" },
            }
        },

        Prs = new[]
        {
            new()
            {
                Branches = new[] { "main" },
                AutoCancel = true,
            }
        },

        Variables = new[]
        {
            Variable("Configuration", "Release"),
            Variable("DOTNET_SKIP_FIRST_TIME_EXPERIENCE", true),
            Group("PR keyvault variables"),

            If.Equal("variables['Build.Reason']", "PullRequest")
                .Variable("TargetBranch", "$(System.PullRequest.SourceBranch)"),

            If.NotEqual("variables['Build.Reason']", "PullRequest")
                .Variable("TargetBranch", "$(Build.SourceBranch)"),

            If(And(Equals("variables['Build.SourceBranch']", "refs/heads/production"), NotEquals("Configuration", "Debug")))
                .Variable("PublishProfileFile", "Prod")
                .Variable("AzureSubscription", "NetHelix"),
        },

        Stages = new[]
        {
            new()
            {
                Jobs = new[]
                {
                    new Job("PublishClient", "Publish wheel packages")
                    {
                        Timeout = TimeSpan.FromMinutes(30),
                        Pool = new HostedPool(vmImage: "windows-2019"),
                        Checkout = "self",
                        Clean = true,
                        Steps = new[]
                        {
                            new BashTask("Install wheel and twine"
                                "pip install wheel==0.33.6 twine==2.0.0"),

                            new TwineAuthenticateTask(artifactFeed: "public/$(HelixClientFeedName)"),

                            new BashTask("Publish Helix Installer Wheel"
                                @"python $(HelixInstallerDir)\setup.py egg_info --tag-build $(HelixScriptsVersion) bdist_wheel
                                  python -m twine upload -r $(HelixClientFeedName) --config-file $(PYPIRC_PATH) $(HelixInstallerDir)\dist\*.whl"),
                        ]
                    },

                    new Job("BuildClient", "Build solution")
                    {
                        Timeout = TimeSpan.FromMinutes(30),
                        Pool = new HostedPool(vmImage: "windows-2019"),
                        Checkout = "self",
                        Clean = true,
                        DependsOn = new[] { "PublishClient" },
                        Steps = new[]
                        {
                            If("and(eq(variables['Build.Reason'], 'PullRequest'), ne(variables['Build.SourceBranch'], 'refs/heads/production'))",
                                new PowerShellTask("Enforce GitHub issue link presence"
                                    "eng/enforce-issue.ps1 -PullRequestId $(System.PullRequest.PullRequestId) -AccessToken $(System.AccessToken)")),

                            new UseDotNetTask("Use .NET 2.1.x SDK", type: DotNetPackageType.SDK, version: "2.1.x"),

                            new CustomDotNetCoreCLITask("Build Helix.Machines.sln", "msbuild") {
                                Projects = new[] { "Helix.Machines.sln" },
                                Arguments = new[] { "-v:m", "-m:1", "-nr:false", "-restore", "-t:Build;Publish" },
                            }
                        }
                    }
                }
            }
        }
    };
}

This is the source pipeline that we are trying to model:

name: $(Date:yyyMMdd).$(Rev:rr)

trigger:
  batch: true
  branches:
    include:
      - main
      - production
  paths:
    exclude:
      - tools/*

pr:
  autoCancel: true
  branches:
    include:
      - main
  paths:
    exclude:
      - tools/*

variables:
  - name: Configuration
    value: Release
  - name: DOTNET_SKIP_FIRST_TIME_EXPERIENCE
    value: true
  - group: PR keyvault variables

  - ${{ if eq(variables['Build.Reason'], 'PullRequest') }}:
      - name: TargetBranch
        value: $(System.PullRequest.SourceBranch)
  - ${{ if ne(variables['Build.Reason'], 'PullRequest') }}:
      - name: TargetBranch
        value: $(Build.SourceBranch)

  - ${{ if ne(variables['Build.SourceBranch'], 'refs/heads/production') }}:
      - name: PublishProfileFile
        value: Prod
      - name: AzureSubscription
        value: NetHelix

stages:
  - stage: Build
    jobs:
      - job: PublishClient
        displayName: Publish wheel packages
        timeoutInMinutes: 30
        pool:
          vmImage: "windows-2019"

        steps:
          - script: pip install wheel==0.33.6 twine==2.0.0
            displayName: Install wheel and twine

          - task: TwineAuthenticate@1
            displayName: Twine Authenticate
            inputs:
              artifactFeed: public/$(HelixClientFeedName)

          - script: |
              python $(HelixInstallerDir)\setup.py egg_info --tag-build $(HelixScriptsVersion) bdist_wheel
              python -m twine upload -r $(HelixClientFeedName) --config-file $(PYPIRC_PATH) $(HelixInstallerDir)\dist\*.whl
            displayName: Create/Publish Helix Installer Wheel

      - job: Build
        displayName: Build solution
        dependsOn: PublishClient
        timeoutInMinutes: 30
        pool:
          vmImage: "windows-2019"

        steps:
          - checkout: self
            clean: true

          - ${{ if and(eq(variables['Build.Reason'], 'PullRequest'), ne(variables['Build.SourceBranch'], 'refs/heads/production')) }}:
              - powershell: eng/enforce-issue.ps1 -PullRequestId $(System.PullRequest.PullRequestId) -AccessToken $(System.AccessToken)
                displayName: Enforce GitHub issue link presence

          - task: UseDotNet@2
            displayName: Use .NET 2.1.x SDK
            inputs:
              packageType: sdk
              version: 2.1.x

          - task: DotNetCoreCLI@2
            inputs:
              command: custom
              custom: msbuild
              projects: Helix.Machines.sln
              arguments: >-
                -v:m
                -m:1
                -nr:false
                -restore
                -t:Build;Publish

Preserve order of properties in YAML

Context

Now, when we generate YAML, the order of properties is set by the YamlOrder attribute or just by the way the members are defined in C#.

Proposal

We could make the YAML get generated in the same order as the user defines the properties, by storing them in a hidden dictionary and serializing only this dictionary for each object.

Something like this:

abstract class ParentOfAllModels
{
    private readonly Dictionary<string, object> _values = new();

    protected T? Get<T>(string name, T? defaultValue = default)
        => _values.TryGetValue(name, out var value) ? (T)value : defaultValue;

    protected void Set<T>(string name, T? value)
    {
        if (value == null)
        {
            _values.Remove(name); // no-op when the key is absent
        }
        else
        {
            _values[name] = value;
        }
    }
}

and then have the properties do this:

    [YamlIgnore]
    public string? Projects
    {
        get => Get<string>("projects");
        init => Set<string>("projects", value);
    }

However, this would require a considerable amount of work to replace all members.

Print `##[error]` when validating YAML changes failed

In the SharplinerValidate task that checks if changes were published, we only exit with 1 when changes are found.

We could also print an ##[error] message, which AzDO understands and can display among the build errors.
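
For example (the exact message wording is illustrative):

// Printed to standard output, the ##[error] prefix makes AzDO render the line as a build error
Console.WriteLine("##[error]Sharpliner: exported YAML is out of date - rebuild and commit the changes.");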

Create a smart NuGet that will compile YAML on build

The developer flow will look like this:

  1. You create a CI project where your definitions will be, e.g. XHarness.CI.csproj
  2. You install these packages:
  <ItemGroup>
    <PackageReference Include="Sharpliner" Version="xy" />
    <PackageReference Include="Sharpliner.Tools" Version="xy" />
  </ItemGroup>
  3. You use the first package to define the pipeline using the PipelineDefinition class
  4. The second automatically adds an AfterTargets="Build" target that will scan the assembly for all PipelineDefinition implementations (a sketch follows below)
  5. It will serialize them to a file based on the settings in the class
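
A sketch of the target such a package could inject via its .targets file (the PublishDefinitions task name appears in the build output elsewhere on this page; the Assembly parameter is hypothetical):

<Target Name="PublishYamlDefinitions" AfterTargets="Build">
  <PublishDefinitions Assembly="$(TargetPath)" />
</Target>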

Enable dynamic definition of pipelines

Currently, it is not possible to generate pipelines dynamically. There are use cases where people might for example want to generate templates from some C# array of values.

We should come up with a way to define these.

Be able to automatically bootstrap new pipelines in AzDO

This is a low-priority request.

It'd be nice to be able to run a single pipeline that validates the pipelines are good and correct during a pull request, and, when merged into the trunk branch, have the running job create/update the pipelines in the project owning the repo using the SYSTEM_ACCESSTOKEN or another env var, based on attributes that can be added or on overridable properties.

i.e. [AutomaticallyRegisterPipeline(string name, string? folder)]

It could potentially be achieved by having a task similar to the ValidateYamlsArePublished in AzureDevOpsDefinition

Dry run mode (local dev workflow)

It might be awesome to have a local developer workflow where you would kick it off locally and it would simulate the run, saying which commands would be executed and when.

Not sure how possible it is, but it's another north-star goal we can think about.

Onboarding story for fast adoption

There should be an onboarding process that helps you get started - some dotnet tool that takes your YAML and creates the C# counterpart.

Figure out a way to define re-usable sets of steps

A common scenario will be defining a series of useful build steps, or maybe even jobs, like we would do using templates, and then re-using them across pipelines.

Currently, it's not possible to define an array of things and then embed it somewhere else, so we should find a way to make this very easy so that people can re-use things more easily.

Improve assembly loading to get type matching

We should load the user's assembly into the correct binding context so that we can cast into a Definition type and call Publish or Validate directly.

Not sure if it is possible, because we are probably loading the DLL from a path outside of the GAC. But if we can make it work, the code for publishing would get much nicer.

Specifically, I think we're having this problem:
https://docs.microsoft.com/en-us/dotnet/framework/deployment/best-practices-for-assembly-loading?redirectedfrom=MSDN#avoid_loading_into_multiple_contexts

If the target assembly must remain outside your application path, you can use the LoadFrom method to load it into the load-from context. If the target assembly was compiled with a reference to your application's Utility assembly, it will use the Utility assembly that your application has loaded into the default load context. Note that problems can occur if the target assembly has a dependency on a copy of the Utility assembly located outside your application path. If that assembly is loaded into the load-from context before your application loads the Utility assembly, your application's load will fail.
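
A sketch of one possible approach using a custom AssemblyLoadContext (the path is illustrative): dependencies the default context already holds, such as Sharpliner itself, resolve from there and so unify with our types, while the user's own dependencies resolve from next to their DLL.

using System.Reflection;
using System.Runtime.Loader;

string assemblyPath = "/path/to/UserDefinitions.dll"; // hypothetical location of the user's DLL

var context = new AssemblyLoadContext("SharplinerUser", isCollectible: true);
var resolver = new AssemblyDependencyResolver(assemblyPath);

// Fires only for assemblies the default context cannot find (e.g. the user's own dependencies);
// Sharpliner itself resolves from the default context, so its Definition types match ours.
context.Resolving += (ctx, name) =>
{
    string? path = resolver.ResolveAssemblyToPath(name);
    return path is null ? null : ctx.LoadFromAssemblyPath(path);
};

Assembly userAssembly = context.LoadFromAssemblyPath(assemblyPath);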


Ability to consume pipelines from other assemblies

Hey guys, really awesome library you have here.

So here is the scenario:

I'd like to be able to publish a set of baseline pipelines that can be consumed through traditional packaging methods, such as a NuGet package, so that a downstream team can extend an abstract class, filling in certain properties and getting a fully rendered pipeline as a result.

So far, a couple of things I have noticed:

  1. The MSBuild task fires by default; I was able to disable the behaviour by excluding build assets in the baseline library's reference to Sharpliner.
  2. And the real killer: it seems that the MSBuild task has issues resolving assembly references other than YamlDotNet and Sharpliner.
// Baseline.Templates assembly

namespace Baseline.Templates;

public abstract class PublishToolsPipeline : SingleStagePipelineDefinition
{

    public abstract string MajorMinor { get; }
    
    public abstract string Project { get;  }

    public override string TargetFile => GetType().GetTargetFileFromAttribute<PipelineOutput>();

    public override SingleStagePipeline Pipeline => new()
    {
        
        Variables =
        {
            Variable("majorMinor",MajorMinor),
            Variable("patch", "$[counter(variables['majorMinor'], 0)]"),
            Variable("version" ,"$(majorMinor).$(patch)")
        },
        Jobs =
        {
            new Job("Publish", "Publish to AWE DevOps")
            {
                Pool = HostedPools.Ubuntu,
                Steps =
                {
                    Tasks.NugetAuthenticate("Authenticating Against Azure Artifact Feeds"),
                    DotNet.Install.Sdk("6.x"),
                    DotNet.Restore.FromNuGetConfig("Nuget.config") with
                    {
                        DisplayName = "Restoring Feeds in Nuget.config",
                    },
                    DotNet.Pack(Project) with
                    {
                        DisplayName  = "Packaging Azure DevOps Pipeline Tools",
                        ConfigurationToPack = "Release",
                        Arguments = "/p:PackageVersion=$(version)"

                    },
                    If.And(IsNotPullRequest, IsBranch("master"))
                        .Step(Tasks.PushToDevOps("Publish to DevOps Feed","$(Build.SourcesDirectory)/**/*.nupkg;!$(Build.SourcesDirectory)/**/*.symbols.nupkg"))
                }
            }
        }
        
    };
}

public static class HostedPools
{
    public static readonly HostedPool Ubuntu = new HostedPool("Azure Pipelines", "ubuntu-latest");
}

public static class Tasks
{
    public static AzureDevOpsTask NugetAuthenticate(string displayName) => new("NuGetAuthenticate@0") {DisplayName = displayName};

    public static AzureDevOpsTask PushToDevOps(string displayName, string packages = "") => new("NuGetCommand@2")
    {
        DisplayName = displayName,
        Inputs = new()
        {

            {"command", "push"},
            {"packagesToPush", packages},
            {"nuGetFeedType", "internal"},
            {"publishVstsFeed", "<redacted to protect the innocent>"},
        }
    };
}
// AzDo.Pipeline.Lint.Pipelines assembly.

namespace AzDo.Pipeline.Lint.Pipelines;

[PipelineOutput("../azure-pipelines.yml")]
public class PublishAzDoPipelineLintPipeline : PublishToolsPipeline
{
    public override string MajorMinor => "1.0";
    public override string Project => "azdo-pipeline-lint.sln";
}
  dotnet build --project AzDo.Pipeline.Lint.Pipelines -c Release
 
  Sharpliner.targets(20, 5): [MSB4018] The "PublishDefinitions" task failed unexpectedly.
System.Reflection.ReflectionTypeLoadException: Unable to load one or more of the requested types.
Could not load file or assembly 'Baseline.Templates, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. General Exception (0x80131500)
   at System.Reflection.RuntimeModule.GetTypes(RuntimeModule module)
   at System.Reflection.Assembly.GetTypes()
   at Sharpliner.PublishDefinitions.FindAllImplementations[T](Boolean isInterface)
   at Sharpliner.PublishDefinitions.Execute()
   at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
   at Microsoft.Build.BackEnd.TaskBuilder.ExecuteInstantiatedTask(ITaskExecutionHost taskExecutionHost, TaskLoggingContext taskLoggingContext, TaskHost taskHost, ItemBucket bucket, TaskExecutionMode howToExecuteTask)
System.IO.FileLoadException: Could not load file or assembly 'Baseline.Templates, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. General Exception (0x80131500)
File name: 'Baseline.Templates, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'
 ---> System.Exception: Failed to find Sharpliner dependency Baseline.Templates, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null
   at Sharpliner.PublishDefinitions.<>c__DisplayClass12_0.<LoadAssembly>g__ResolveAssembly|4(Object sender, ResolveEventArgs e)
   at System.Runtime.Loader.AssemblyLoadContext.InvokeResolveEvent(ResolveEventHandler eventHandler, RuntimeAssembly assembly, String name)
   at System.Runtime.Loader.AssemblyLoadContext.OnAssemblyResolve(RuntimeAssembly assembly, String assemblyFullName)

am I doing something wrong?

AzDO model completeness tracking

The goal is to make all of the pipeline pieces available in C#: https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema

We should probably use C# records so that we can utilize the with expression when defining properties in factories.

Maybe we can first aim at a pipeline we know and try to implement everything to get it working.

  • #69
  • #70
  • #71
  • #72
  • Steps
    • Script
    • Bash
    • pwsh
    • PowerShell
    • Publish
    • Download
    • Checkout
    • Task
  • Variables
  • Template references
  • Parameters
  • #73
  • #82
  • #74
  • #75
  • #76
