Comments (11)
No, this is not blocking me; as I mentioned, it's a low-priority request. I consider it a "Could Have" under MoSCoW prioritization.
I completely agree that getting this done right is much better from a maintenance perspective.
If reflection is not practical, what about exposing a virtual Metadata or Tags bag, similar to how you currently expose Headers? That way I could do the same sort of thing as above with TemplateOutput: load the Metadata bag from my attributes and pull the values out in the hook.
Thanks for pointing out Equals; yeah, I'd basically copied the existing condition verbatim and slapped Condition around it. I'll make sure to clean that up.
Strong types for predefined variables would be fantastic. I'm not sure if there is a schema somewhere you can generate them from, but Build.Reason would definitely be a good starting point.
from sharpliner.
As for the pipeline linter, most of it is specific to our existing template library; I'll see if I can clean out the org-specific stuff and post an example at some point. But as I mentioned, the assemblies containing the PipelineParser are distributed with the agent itself, under the following namespaces:
using Microsoft.TeamFoundation.DistributedTask.Pipelines;
using Microsoft.TeamFoundation.DistributedTask.Pipelines.Artifacts;
using Microsoft.TeamFoundation.DistributedTask.Pipelines.Yaml;
I wrote up a number of mocked interfaces the PipelineParser wants that bridge in some of the services, like ITraceWriter, IArtifactResolver, and IFileProvider. Once I had implemented those, I was able to compile pipelines locally, both to the object graph used by Azure DevOps and to the resulting YAML.
One of the other advantages is that I can redirect repository references to local paths or to repos in our Azure DevOps org, so when I'm working on changes to the baseline templates I can compile pipelines from across the organization, all locally, without having to push, commit, and validate, or call the validate API on Azure DevOps itself.
The only major downside is that it falls into a bit of a grey area, as I believe the assemblies I'm referencing aren't open source and are probably covered by the MS EULA for Azure DevOps Server.
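To give a flavour of what those service bridges look like, here is a minimal sketch of a mocked trace writer. The interface shape below is an assumption based on the public azure-pipelines-agent sources; the actual members in the EULA'd assemblies may differ between agent versions.

```csharp
using System;
using Microsoft.TeamFoundation.DistributedTask.Pipelines;

// Hypothetical sketch: forwards the parser's diagnostics to the console.
// ITraceWriter's real member list may not match this exactly.
internal class ConsoleTraceWriter : ITraceWriter
{
    public void Info(string message) => Console.WriteLine($"[info] {message}");
    public void Verbose(string message) => Console.WriteLine($"[verbose] {message}");
}
```

The other mocks (IArtifactResolver, IFileProvider) follow the same pattern: stub just enough behaviour for the parser to resolve templates from local disk.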
So I call it like:
azdo-pipeline-lint.exe -p examples\azure.functionApp\deploy.yml
and it returns:
Parsed: examples/azure.functionApp/deploy.yml@self
No errors.
Composition:
[1]: examples/azure.functionApp/deploy.yml@self
[2]: pipelines/application-azure-deploy.yml@self
[3]: pipelines/templates/dotNetCore/steps-web-build.yml@self
[4]: pipelines/templates/dotNetCore/steps-console-build.yml@self
[5]: pipelines/templates/dotNetCore/steps-unitTest.yml@self
[6]: pipelines/templates/python/steps-build.yml@self
[7]: pipelines/templates/msBuild/steps-msBuild.yml@self
[8]: pipelines/templates/maven/steps-maven-build.yml@self
[9]: pipelines/templates/nodejs/steps-web-build.yml@self
[10]: pipelines/templates/angular/steps-web-build.yml@self
[11]: pipelines/templates/reactjs/steps-web-build.yml@self
[12]: pipelines/templates/docker/steps-docker-build.yml@self
[13]: pipelines/templates/arm/steps-arm-createTemplate.yml@self
[14]: pipelines/templates/utilities/steps-copyFiles.yml@self
[15]: examples/azure.functionApp/variables-sandpit.yml@self
[16]: pipelines/templates/deployment/steps-azure.yml@self
[17]: pipelines/templates/deployment/../utilities/steps-getKeyVaultSecrets.yml@self
[18]: pipelines/templates/deployment/../utilities/steps-replaceTokensInFile.yml@self
[19]: pipelines/templates/deployment/../utilities/steps-replacePolicyInFile.yml@self
[20]: pipelines/templates/deployment/../utilities/steps-replaceApiSpecificationInFile.yml@self
[21]: pipelines/templates/deployment/../arm/steps-arm-deploy.yml@self
[22]: pipelines/templates/deployment/../utilities/steps-syncAzureBlobs.yml@self
[23]: pipelines/templates/deployment/../steps-webApp-deploy.yml@self
[24]: pipelines/templates/deployment/../steps-containerWebApp-deploy.yml@self
[25]: pipelines/templates/deployment/../steps-functionApp-deploy.yml@self
[26]: pipelines/templates/deployment/../utilities/steps-runScripts.yml@self
[27]: pipelines/templates/deployment/../../../database/migrations/liquibase/templates/steps-liquibase-update.yml@self
[28]: pipelines/templates/deployment/../steps-appService-swap.yml@self
[29]: pipelines/templates/deployment/../dotNetCore/steps-unitTest-binaries.yml@self
Checks:
[OK]: Uses Baseline Templates Check
> extends from pipelines/application-azure-deploy.yml authorized template.
[OK]: Uses Main Branch Check
> Uses main branch.
[OK]: Arm Configuration Exists If Set Check
> Loaded examples/azure.functionApp/armConfig.json, found 9 template references.
[OK]: Arm Template References Have Unique Names Check
> All template names are unique.
[OK]: Arm Templates Exist In Template Repository Check
> All templates exist in the templates repository.
@jshield I have added the publish process hooks:
https://github.com/sharpliner/sharpliner/blob/main/docs/AzureDevOps/GettingStarted.md#5-customize-serialization-or-configure-validations
They are quite basic but let me know if you need more
(also some more goodies https://github.com/sharpliner/sharpliner#intellisense)
Hi,
Thanks for the idea! Do you actually have a use case where you create new pipelines this often?
Hey, thanks for the fast response.
I could potentially manage this through something like Terraform and the Azure DevOps provider, but given I'm mostly a .NET developer, it'd be nice to be able to do it closer to the pipeline definitions themselves.
For me personally it's less about frequency and more about consistency. Another example in the same problem space would be synchronizing a pipeline trigger's path inclusion/exclusion rules with the branch policy build validation path rules.
Although yes, in my case, as I provide DevOps pipeline support for multiple teams across multiple repositories/projects, I do stand up a lot of pipelines regularly, particularly when new services are being stood up.
Cheers for considering it though.
I don't have a good opinion on this, but similar topics have come up in some discussions and issues in this repo already. I am a little bit torn on what the purpose of Sharpliner is.
- One side is that it is a general-purpose, mostly non-opinionated tool that lets you build pipelines and just that, but well. It is however very low-level with respect to the YAML schema (some higher-level exceptions are the fluent APIs like the .NET CLI task).
- On the other side, there is great potential to make opinionated libraries full of functionality which, if generic enough, could be a nice drop-in solution for the usual scenarios (build + test + deploy). Sort of a marketplace approach where you have libraries of steps.
This should probably be clearly separated, maybe into another NuGet with libraries like what you are doing already (#175), so that it's clear we're in a higher layer of concerns than the core Sharpliner lib.
My personal experience is that every project eventually needs some custom step where you diverge from the generic approach, and then things break because you either need to add some configuration into the generic library (making it more complex) or you need to stop using the library and go your own way.
Now, to your specific suggestion.
If I were implementing this, I would not go the way of adding C# code to Sharpliner that calls AzDO APIs to create a pipeline. I would instead create a library of pipeline steps that would do it, the same way ValidateYamlsArePublished is implemented. This means it would become an opinionated library like I am describing above. For this reason, I would probably first prefer to see your implementation for your own use case, and then think about if and how to generalize it and incorporate it into some Sharpliner extension/task library.
That being said, I very much like your idea of tagging pipelines with attributes. I would consider whether it would be valuable to put the following functionality into the core Sharpliner library:
Allow registering custom attributes together with some publish actions (callbacks) in the publish process. You could add custom attributes to definitions, and when these are being published as YAML, your callbacks would be called during the process.
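A rough sketch of what that registration could look like; every name below is hypothetical, and nothing like this exists in Sharpliner today:

```csharp
// Hypothetical attribute users would put on their definitions
[AttributeUsage(AttributeTargets.Class)]
public class RegisterInAzDoAttribute : Attribute
{
    public string? Folder { get; init; }
}

// Hypothetical registration API, called from SharplinerConfiguration.Configure():
// the callback fires for every definition carrying the attribute,
// right after its YAML has been published.
Hooks.RegisterAttributeCallback<RegisterInAzDoAttribute>(
    (definition, attribute, yaml) =>
    {
        Console.WriteLine($"Would register {definition.TargetFile} under {attribute.Folder}");
    });
```

The appeal of this shape is that the core library only needs to know about attribute discovery and callback dispatch; all opinionated behaviour lives in the consumer's callback.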
What are your thoughts on all this? Did I manage to describe my concerns and where I am coming from?
Actually, on another thought - maybe attributes are too much over-engineering. It would just be easier to let people override the Publish() method on pipelines (or add some other methods/events like OnBeforePublish or OnAfterPublish), and this would have the same effect with a simpler approach.
Going in reverse:
Yeah, upon reflection (heh, get it) attributes are probably overkill, and overridable properties are probably the correct approach for sharpliner as a low-level library, although attributes are sometimes useful for providing that layer of metadata in a clear, obvious place.
I sealed TargetFile on my classes and require a TemplateOutput(PipelineActivity activity, Type parameterType) attribute to make it clear from the class declaration where the template gets generated, based on conventions. But yes, you are correct: highly subjective and hard to apply to all cases.
example:
public record PulumiDeployment : IDeployment
{
    [YamlMember(Order = 0)]
    public string Type => GetType().ToTemplatePath();

    [NotUsedByStepTemplate]
    public string DependsOn { get; set; } = "";

    [NotUsedByStepTemplate]
    public bool RunDuringPullRequests { get; set; } = true;

    public string Project { get; set; }
    public string Stack { get; set; }
    public AzureServicePrincipal ServiceConnection { get; set; }

    [NotUsedByStepTemplate]
    public AzureResourceGroup ResourceGroup { get; set; }

    public AzureKeyVault ManagementKeyVault { get; set; }
}
public abstract class TypedStepTemplateDefinitionBase<T> : StepTemplateDefinition
{
    public sealed override string TargetFile => GetType().GetTargetFileFromAttribute();
    public override TargetPathType TargetPathType => TargetPathType.RelativeToGitRoot;
    public sealed override List<TemplateParameter> Parameters => typeof(T).AsTemplateParameters().ToList();

    protected string PipelineWorkspace => "$(Pipeline.Workspace)";

    // Helper method for referencing properties as passed parameters on <T>
    public string Use<TR>(Expression<Func<T, TR>> selector)
    {
        var member = ((MemberExpression)selector.Body).Member;
        var name = member.GetCustomAttribute<JsonPropertyNameAttribute>()?.Name ?? member.Name.ToCamelCase();
        return parameters[name];
    }

    protected TR UseNamed<TR>(Expression<Func<T, TR>> selector) where TR : Named<TR>, new()
    {
        var member = ((MemberExpression)selector.Body).Member;
        var name = member.GetCustomAttribute<JsonPropertyNameAttribute>()?.Name ?? member.Name.ToCamelCase();
        return new TR { Name = parameters[name] };
    }

    protected Condition IsManual => Condition("eq(variables['Build.Reason'], 'Manual')");

    protected readonly AzureBashInlineTaskBuilder AzureCli = new();
}
[TemplateOutput(PipelineActivity.Deploy, typeof(PulumiDeployment))]
internal class DeployPulumiSteps : TypedStepTemplateDefinitionBase<PulumiDeployment>
{
    private string PulumiStateStorageSasToken => $"PULUMI-STATE-{Use(deployment => deployment.Stack)}-STORAGE-SAS-TOKEN";
    private string PulumiStateStorageAccount => $"PULUMI-STATE-{Use(deployment => deployment.Stack)}-STORAGE-ACCOUNT";
    protected string ProjectPath => $"{PipelineWorkspace}/self/{Use(artifact => artifact.Project)}";

    public override ConditionedList<Step> Definition
    {
        get
        {
            return new()
            {
                Extensions.Tasks.NugetAuthenticate("Authenticate nuget against azure artifact feeds"),
                new AzureKeyVaultTask()
                {
                    AzureSubscription = UseNamed(deployment => deployment.ServiceConnection),
                    KeyVaultName = UseNamed(deployment => deployment.ManagementKeyVault),
                    SecretsFilter = new[]
                    {
                        PulumiStateStorageAccount,
                        PulumiStateStorageSasToken
                    }
                },
                AzureCli.FromResourceFile(UseNamed(deployment => deployment.ServiceConnection), $"{GetType().Namespace}.setup-pulumi.sh") with
                {
                    Env = new()
                    {
                        { "AZURE_STORAGE_ACCOUNT", $"$({PulumiStateStorageAccount})" },
                        { "AZURE_STORAGE_SAS_TOKEN", $"$({PulumiStateStorageSasToken})" },
                        { "STACK", Use(deployment => deployment.Stack) }
                    },
                    DisplayName = $"Configuring Pulumi for {Use(deployment => deployment.Stack)}"
                },
                AzureCli.FromResourceFile(UseNamed(deployment => deployment.ServiceConnection), $"{GetType().Namespace}.run-pulumi.sh") with
                {
                    Env = new()
                    {
                        { "AZURE_STORAGE_ACCOUNT", $"$({PulumiStateStorageAccount})" },
                        { "AZURE_STORAGE_SAS_TOKEN", $"$({PulumiStateStorageSasToken})" },
                        { "STACK", Use(deployment => deployment.Stack) },
                        { "AZURE_KEYVAULT_NAME", UseNamed(deployment => deployment.ManagementKeyVault) }
                    },
                    WorkingDirectory = ProjectPath,
                    DisplayName = $"Running Pulumi for {Use(deployment => deployment.Stack)}"
                }
            };
        }
    }
}
# steps/pulumi/steps-deploy.yml
###
### DO NOT MODIFY THIS FILE!
###
### This YAML was auto-generated from DeployPulumiSteps
### To make changes, change the C# definition and rebuild its project
###
parameters:
- name: type
  type: string
  default: pulumi
- name: project
  type: string
- name: stack
  type: string
- name: serviceConnection
  type: string
- name: managementKeyVault
  type: string

steps:
- task: NuGetAuthenticate@0
  displayName: Authenticate nuget against azure artifact feeds

- task: AzureKeyVault@2
  inputs:
    azureSubscription: ${{ parameters.serviceConnection }}
    keyVaultName: ${{ parameters.managementKeyVault }}
    secretsFilter: PULUMI-STATE-${{ parameters.stack }}-STORAGE-ACCOUNT,PULUMI-STATE-${{ parameters.stack }}-STORAGE-SAS-TOKEN

- task: AzureCLI@2
  displayName: Configuring Pulumi for ${{ parameters.stack }}
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: >-
      ...
    azureSubscription: ${{ parameters.serviceConnection }}
  env:
    AZURE_STORAGE_ACCOUNT: $(PULUMI-STATE-${{ parameters.stack }}-STORAGE-ACCOUNT)
    AZURE_STORAGE_SAS_TOKEN: $(PULUMI-STATE-${{ parameters.stack }}-STORAGE-SAS-TOKEN)
    STACK: ${{ parameters.stack }}

- task: AzureCLI@2
  displayName: Running Pulumi for ${{ parameters.stack }}
  inputs:
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: >-
      ...
    azureSubscription: ${{ parameters.serviceConnection }}
    workingDirectory: $(Pipeline.Workspace)/self/${{ parameters.project }}
  env:
    AZURE_STORAGE_ACCOUNT: $(PULUMI-STATE-${{ parameters.stack }}-STORAGE-ACCOUNT)
    AZURE_STORAGE_SAS_TOKEN: $(PULUMI-STATE-${{ parameters.stack }}-STORAGE-SAS-TOKEN)
    STACK: ${{ parameters.stack }}
    AZURE_KEYVAULT_NAME: ${{ parameters.managementKeyVault }} # AzureKeyVault
The events/virtual-methods approach is probably correct for providing an extension framework; that way additional before/after callbacks could be added and provided through external libraries without pulling in the whole world through the sharpliner core library.
That way I could add a method that calls AzDO and performs logic to update service configuration (register new pipelines/update names) without bloating the underlying goals of sharpliner.
One of my side projects is a linter for AzDO pipelines using the PipelineParser from the Azure DevOps Pipeline Agent, so again I could add a post-publish hook to run the linter on the resulting template to ensure it meets our style, naming, etc.
Global and per-pipeline hooks are probably ideal, so being able to do something like the below:
class YourCustomConfiguration : SharplinerConfiguration
{
    public override void Configure()
    {
        // Insert your overrides here
        Serialization.PrettifyYaml = false;
        Serialization.UseElseExpression = true;
        Serialization.IncludeHeaders = false;
        Validations.DependsOn = ValidationSeverity.Off;
        Validations.Name = ValidationSeverity.Error;

        Hooks.Pipeline.BeforePublish = (ISharplinerDefinition definition, Pipeline pipeline) =>
        {
            // Returns bool; on false, fail the build
            return PipelineLinter.Lint(pipeline);
        };

        Hooks.Pipeline.AfterPublish = (ISharplinerDefinition definition, Pipeline pipeline, string yaml) =>
        {
            var registration = definition.GetType().GetCustomAttribute<AutomaticallyRegisterPipeline>();
            if (registration != null && Environment.GetEnvironmentVariable("BUILD_REASON") == "IndividualCI")
            {
                // Find pipelines using TargetPath; if none are found, create one under the folder
                // with the given name pointing to the file, otherwise rename (and move) the
                // existing pipeline.
                var id = AzDoUpdatePipelineRegistration.Register(
                    definition.GetTargetPath(), registration.Name, registration.Folder);

                // Ensure Build Validation on the branch is enabled, and configure its path rules
                // based on the trigger path rules.
                AzDoUpdatePipelineRegistration.EnsureBuildValidationIsPresentOn("main", id, pipeline.Trigger);
            }
        };
    }
}
would be fantastic.
Your concerns are totally valid, and I appreciate and respect your position as to the scope of what sharpliner should do.
Thanks for sharing the code! It is very useful for me to see how people use the library. The sheer number of scenarios people are covering makes me more and more sure that it is very hard to be opinionated in this area.
Couple of points:
- I love the hooks. This shouldn't be too hard to implement and I would like to make it part of Sharpliner. The only problem is that I am off from Friday. I might make a quick version for you if you rely on this, then add documentation and finalize it in April, but I also might not get to it.
- Can you please share this linter if it is public? It sounds very interesting. Is it using some models from AzDO open source code? I didn't find anything like that.
- I can also imagine having the linter as a validation in Sharpliner. I was already thinking about comparing the generated YAML with the YAML/JSON schema that AzDO provides.
- Nitty comment: this piece:
protected Condition IsManual => Condition("eq(variables['Build.Reason'], 'Manual')")
can also be written as
protected Condition IsManual => Condition(Equal(variables["Build.Reason"], "'Manual'"))
I am also thinking of adding known values as strong types - variables.Build.Reason
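A strongly typed surface for a known variable like Build.Reason could be sketched roughly like this. All names here are hypothetical illustrations, not part of Sharpliner; VariableReference stands in for whatever type the variables indexer actually returns:

```csharp
// Hypothetical wrapper over the string-based variables accessor
public static class KnownVariables
{
    public static class Build
    {
        // Intended to expand to variables['Build.Reason'] at serialization time
        public static VariableReference Reason => new("Build.Reason");
    }
}

// Usage in a condition, instead of a raw string:
// Condition(Equal(KnownVariables.Build.Reason, "'Manual'"))
```

The win is compile-time safety and IntelliSense over the predefined variable names, at the cost of keeping the list in sync with Azure DevOps.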
I had a couple spare minutes today so I tried to code the hooks and found a possible friction point.
When you use definition collections, the ISharplinerDefinition that is actually being published is some other internal type that we create from your ISharplinerDefinitionCollection, and then it gets a little hard to match the type.
I think the only way to match the result is by looking at the file name (some template-generated-from-collection.yml). I don't know if this is a deal breaker, but I am also somewhat reluctant to add something into the library quickly and then have to live with it. In case this is blocking you, I could add it without mentioning it in the docs and finish the feature in April?
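If matching by file name is the only option, a hook consumer could work around it like this. This is a sketch against the hook shape proposed earlier in the thread; the file name is the illustrative one mentioned above:

```csharp
Hooks.Pipeline.AfterPublish = (definition, pipeline, yaml) =>
{
    // For definitions coming from a collection, the concrete type is internal,
    // so fall back to matching on the published file name.
    if (Path.GetFileName(definition.TargetFile) == "template-generated-from-collection.yml")
    {
        // collection-specific post-processing here
    }
};
```

Not pretty, but it keeps the hook API uniform across plain definitions and collection-generated ones.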
I am closing this as I think it is resolved. Please re-open in case I forgot or haven't addressed something.