Imported Upstream version 5.2.0.175
Former-commit-id: bb0468d0f257ff100aa895eb5fe583fb5dfbf900
This commit is contained in: parent 4bdbaf4a88, commit 966bba02bb

1 external/buildtools/.cliversion vendored Normal file
@@ -0,0 +1 @@
1.0.0-preview2-1-003182

3 external/buildtools/.toolversions vendored Normal file
@@ -0,0 +1,3 @@
Microsoft.DotNet.BuildTools=1.0.27-prerelease-01205-03
Microsoft.DotNet.BuildTools.Run=1.0.1-prerelease-01205-03
NuGet.CommandLine=3.4.3

1 external/buildtools/BuildToolsVersion.txt vendored
@@ -1 +0,0 @@
1.0.25-prerelease-00225-00

66 external/buildtools/Documentation/Dev-workflow.md vendored Normal file
@@ -0,0 +1,66 @@
Dev Workflow
===============
The dev workflow describes the development process to follow. It is divided into specific tasks that are fast, transparent and easy to understand.

This provides flexibility: with discrete tasks, people can iterate on a specific task without needing to run the full workflow, other processes can reuse the tasks as needed, and anyone can combine the tasks to create a customized workflow.

## Process

![Dev workflow](images/Dev-workflow.jpg)

## Tasks

**Setup**

Set of instructions in order to have an environment ready to build the product.
This means that at least two things need to happen:
* All the external tool dependencies that a repo has should be installed on the machine. Because they are repo specific, each repo should have a list of dependencies with the respective versions and links. Examples of these dependencies are Git, CMake, and VS.
* Download the files/tools that are needed in order to build the repo, for example, running init-tools to have the .NET CLI, NuGet and other tools available in the work environment.
The Setup task should be the first task to be run in a new or in a clean repo.

###### Dependencies
None.

**Clean**

The Clean task is responsible for cleaning and leaving the work environment in a state as close as possible to the repository/code server. This can be accomplished by having a set of cleaning phases (cleaning the bin or NuGet packages folder or cleaning the NuGet packages cache folder) and a full clean-up option.

###### Dependencies
Setup, as it may be required to run some tools in order to clean the work environment.

**Sync**

The Sync task gets the latest source history and the dependencies the build needs in order to build successfully, for example by restoring the NuGet packages.
It may require a pre-step to figure out what set of dependencies is needed for a build to run before restoring or downloading.
It is the task in charge of eliminating all the network traffic when building, allowing a build to run in offline mode. This way we hit the network only when we are intentional about it.

###### Dependencies
The Setup task is required, so Sync needs to verify that the tools are ready to be consumed; otherwise, Sync can be responsible for running the Setup task too.
Also, there are cases where the Clean task should be run before the Sync task to avoid version issues across the different tools installed during the Setup task.

**Build**

Builds the source code. The build order depends on each repo and the situation.

* Build product binaries: Builds the product binaries without hitting the network to restore packages. It doesn't build tests, run tests or build packages.
* Build packages: Builds the NuGet packages from the binaries that were built in the Build product binaries step. If no binaries were produced, Build packages will be in charge of building the product binaries as well.
* Build tests: Builds the tests that are going to be run against the product. The tests to build could be the local tests of the repo or could be tests that are located in an external resource.

###### Dependencies
* The Sync task is required in order to run the Build task. Build could either enforce that Sync always runs as a separate task, causing the build to fail if it wasn't run, or always run Sync as a pre-step before building.
* Build should be able to build all components (binaries, packages, and tests) or just a specific one with its dependencies.

**Run Tests**

Runs a set of tests against a set of product binaries in a specified work environment.

###### Dependencies
* One of the inputs of this task is the test binaries. If there are no test binaries available, the task should fail.
* The product binaries should be available for this task to be able to run. Because the product binaries could come from different sources or work environments, the Run Tests task should fail if they are not provided.
* A work environment should be provided. By default, it would try to run the tests in the current work environment.

**Publish Packages**

Publishes the NuGet packages that were built in the Build packages task to a specified location. If no packages were built, it should fail and let the developer know.

###### Dependencies
Build packages.
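
For illustration, a typical local pass over these tasks could look like the following when driven through a repo's `run.cmd` wrapper (see the [Run Command Tool](RunCommand.md) documentation). Only `clean`, `sync` and `build` are command names taken from these documents; the remaining names are illustrative, since each repo defines its own commands in config.json.

```
run.cmd clean
run.cmd sync
run.cmd build
:: illustrative command names; each repo defines its own commands in config.json
run.cmd build-tests
run.cmd run-tests
run.cmd publish-packages
```
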

289 external/buildtools/Documentation/RepoCompose.md vendored Normal file
@@ -0,0 +1,289 @@
# Composing our repos

In order to have automated builds that compose the output of our ever-growing number of repos, we need to get more structured data about their relationships. Today repos provide nothing in the way of programmatic access to dependencies. All of the information is known by the owners, and as a result the builds and composition are fully manual.

Going forward repos will be given contracts, or APIs, that provide structured data about their dependencies and outputs. This will give us the necessary data to compose our repos in our growing set of scenarios:

- Linux distro builds from source
- Official Windows signed builds
- Builds to quickly test and deploy new packages across repos

These contracts will be included in the `run` command [specification](https://github.com/dotnet/buildtools/blob/master/Documentation/RunCommand.md). This document in particular discusses the commands which allow us to compose the outputs of our repos.

## Commands

This document describes the commands used to compose different repos together. There is a larger set of commands including `build`, `sync` and `clean` which is [described separately](https://github.com/dotnet/buildtools/blob/master/Documentation/Dev-workflow.md).

### consumes

The `consumes` command returns json output that describes all of the external artifacts consumed by this repo. This includes NuGet feeds, packages and arbitrary files from the web or file system. The format of all these items is described in the Artifact Specification section below.

The artifacts are grouped into sections:

- Build Dependencies: artifacts which are referenced by the build output and include NuGet packages, MSI, etc ... In order to support a composed build these are further divided into floating and static dependencies:
  - Floating: dependencies where versions can change as a part of a composed build via the `change` command. This commonly includes CoreFx, CoreClr, etc ...
  - Static: dependencies whose versions do not change via the `change` command. This is commonly used for SDK binaries, legacy tools, etc ...
- Toolset Dependencies: artifacts used in the production of build output. This commonly includes NuGet.exe, compilers, etc ...

These sections are identified respectively by the following JSON sections:

``` json
"dependencies": {
  "floating": {
    // Build artifacts
  },
  "static": {
    // Static artifacts
  },
  "toolset": {
    // Toolset artifacts
  }
}
```

The data in the output is further grouped by operating system. Builds of the same repo on different operating systems can reasonably consume a different set of resources. The output of `consumes` reflects this and allows per-operating-system dependencies:

``` json
{
  "os-windows": {
    "dependencies": { }
  },
  "os-linux": {
    "dependencies": { }
  }
}
```

In the case where the `consumes` output doesn't want to take advantage of OS-specific dependencies, it can specify `"os-all"` as a catch-all.

In addition to artifacts, the `consumes` output can also optionally list any machine prerequisites needed to build or test the repo:

``` json
"prereq": {
  "Microsoft Visual Studio": "2015",
  "CMake": "1.0"
}
```

A full sample output for `consumes` is available in the Samples section.

### produces

The `produces` command returns json output which describes the artifacts produced by the repo. This includes NuGet packages and file artifacts.

The output format for artifacts is special for `produces` because it lacks any hard location information. For example:

- NuGet artifacts lack feeds
- File artifacts lack a `"kind"` and supporting values

This is because the `produces` command represents "what" a repo produces, not "where" the repo produces it. The "where" portion is controlled by the `publish` command. External components will take the output of `produces`, add the location information and feed it back to `publish`.

Like `consumes`, the `produces` output is also grouped by operating system:

``` json
{
  "os-windows": {
    "nuget": { },
    "file": { }
  },
  "os-linux": {
    "nuget": { },
    "file": { }
  }
}
```

A full sample output for `produces` is available in the Samples section.

### change

The `change` command is used to alter the floating build dependencies section. It can establish new versions of NuGet packages, new locations to find file artifacts, different NuGet feeds, etc ...

This is the command which allows us to use the build output of one repo as the input of a dependent repo. The first repo can build using a new output version (say beta5) and dependent repos can be changed to accept this new version.

This command operates by providing json whose format is a subset of the output of `consumes`. In particular it will provide the `"dependencies.floating"` section.

``` json
"dependencies": {
  "floating": {
    "nuget": {
      "packages": {
        "MicroBuild.Core": "0.2.0",
        "Microsoft.NETCore.Platforms": "1.0.1"
      }
    }
  }
}
```

A tool responsible for composing repos would ideally:

1. Execute `run.cmd consumes` and capture the output.
2. Alter the NuGet package versions in the json to have the correct build identifier: beta5, RTM, etc ..
3. Execute `run.cmd change` and pass in the altered output.

### publish

The `publish` command takes a json input that describes the locations artifacts should be published to. The input to this command is the output of `produces` that is augmented with location information.
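
Putting the commands together, a composition tool could drive a single repo roughly as sketched below. The command names come from this document, but how json is passed into and out of `change` and `publish` is an assumption here, not part of the specified contract.

```
run.cmd consumes > consumes.json
:: alter the floating package versions in consumes.json (for example, to the new beta5 build)
run.cmd change < consumes.json
run.cmd sync
run.cmd build
run.cmd produces > produces.json
:: augment produces.json with feed and location information
run.cmd publish < produces.json
```
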

## Artifact Specification

The json describing artifacts is the same between the `consumes`, `produces` and `publish` commands. These items can be used anywhere artifacts are listed above.

### NuGet packages

The description for NuGet artifacts is in two parts:

1. The set of feeds packages are being read from.
2. The set of packages that are being consumed and their respective versions.

Example:

``` json
"nuget": {
  "feeds": [
    {
      "name": "core-clr",
      "value": "https://dotnet.myget.org/F/dotnet-coreclr/api/v3/index.json"
    },
    {
      "name": "dotnet-core",
      "value": "https://dotnet.myget.org/F/dotnet-core/api/v3/index.json"
    }
  ],
  "packages": {
    "MicroBuild.Core": "0.2.0",
    "Microsoft.NETCore.Platforms": "1.0.1"
  }
}
```

### File

Any file which is not a NuGet package should be listed as a file artifact. These can be downloaded from the web or copied from local places on the hard drive. Each type of file entry will have a name uniquely identifying the artifact and a kind property specifying the remainder of the properties:

- uri: a property named `"uri"` will contain an absolute Uri for the artifact.
- filesystem: a property named `"location"` will contain an OS-specific file path for the artifact.

Example:

``` json
"file": {
  "nuget.exe": {
    "kind": "uri",
    "uri": "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
  },
  "run.exe": {
    "kind": "filesystem",
    "location": "c:\\tools\\run.exe"
  }
}
```

## Samples

### consumes

``` json
{
  "os-all": {
    "dependencies": {
      "floating": {
        "nuget": {
          "feeds": [
            {
              "name": "core-clr",
              "value": "https://dotnet.myget.org/F/dotnet-coreclr/api/v3/index.json"
            },
            {
              "name": "dotnet-core",
              "value": "https://dotnet.myget.org/F/dotnet-core/api/v3/index.json"
            }
          ],
          "packages": {
            "MicroBuild.Core": "0.2.0",
            "Microsoft.NETCore.Platforms": "1.0.1"
          }
        },
        "file": {
          "nuget.exe": {
            "kind": "uri",
            "uri": "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"
          }
        }
      },
      "toolset": {
      },
      "static": {
      }
    },
    "prereq": {
      "Microsoft Visual Studio": "2015",
      "CMake": "1.0"
    }
  }
}
```

### produces

``` json
{
  "os-all": {
    "nuget": {
      "packages": {
        "MicroBuild.Core": "0.2.0",
        "Microsoft.NETCore.Platforms": "1.0.1"
      }
    },
    "file": {
      "nuget.exe": { }
    }
  }
}
```

## Implementation Stages

Fully implementing the commands described in this document requires a decent amount of work. The expectation is that it will be implemented in stages. Each stage provides benefit to an existing tool or allows us to write a new tool for a major scenario:

- Stage 1: NuGet build information. NuGet packages are how the majority of repo artifacts are shared today. Getting the NuGet section of the above commands implemented allows us to automate the majority of our composed builds. This is really just a formalization of the commands repos already implement as a part of being inside the Maestro tool.
- Stage 2: Remainder of artifacts. Fill out the build and toolset sections with all of the other file / NuGet dependencies. This output will allow us to fully understand the tools required for composed builds.
- Stage 3: Pre-req information. Having correct prereq information will allow us to understand how to create and provision build machines.
- Stage 4: Change command for all artifacts. The ability to change a dependency from downloading nuget.exe from nuget.org to a place on the local file system. This will give us the ability to implement fully offline builds.

## FAQ

### How would a composer tool match the inputs / outputs of repos?

The goal of this effort is to allow a composer tool to examine an arbitrary set of repos and establish a build order. It can do so by examining the output of `consumes` and `produces` for each repo and establishing a dependency graph based on the outputs. Artifacts from `produces` can be linked to floating build dependencies in `consumes`.

But there is no guarantee that at any given point in time the outputs of one repo will match exactly with the inputs of a dependent repo. The names are likely to match but not the versions. For example, what is the chance for any given commit that dotnet/corefx is outputting System.Collections.Immutable at the exact version that dotnet/roslyn is consuming? Probably fairly unlikely, and getting them to agree requires coordination which is expensive and intended to be avoided by this very design.

This is not an issue though, because composers should not consider version information when building a dependency graph. Versions of packages, both output from `publish` and built against via `consumes`, can be controlled. The `change` command is used to alter floating build dependencies and `publish` takes version information as an input. This means version information is completely controlled by the composition process.

Hence when establishing dependency graphs a composer should link repo artifacts based on their name only.

### Why can't we use project.json + NuGet.config?

At a glance it appears that much of the information described here is simply the contents of NuGet.config and the project.json in the repo. That is possibly true for small repos. For repos of significant size and dependencies, more structure is needed to describe the intent of a given project.json in the repo.

For example, at the time of this writing [dotnet/roslyn](https://github.com/dotnet/roslyn) contains 40+ project.json files. It's not possible to know which of these represent build dependencies, tooling or static dependencies. The repo has to provide a mechanism to discover this.

### This looks a lot like a package manager.

Indeed it does. If there is an existing package manager specification which meets our needs here, I'm happy to see if we can leverage it.

### Your samples have comments in JSON. That's not legal.

Yes they do. It's a sample :smile:

## Open Issues

There are a series of open issues we need to track down here.

- What is the full set of operating system identifiers? The set I have listed above is just a placeholder and was given no real thought. More information is needed here.
- How can we relate file names between repos? It's easy for us to understand that Microsoft.CodeAnalysis.nupkg is the same between repos. How do we know that a repo which produces core-setup.msi is the input for a repo that consumes core-setup.msi? Perhaps we have to say that output file identifiers must be unique across repos? That seems like the simplest approach.
- Can the `change` command also be used to alter `toolset` versions?
- Should the NuGet feed be separated from the packages? It should probably be an entirely different section.

137 external/buildtools/Documentation/RunCommand.md vendored Normal file
@@ -0,0 +1,137 @@
Run Command Tool
===========================
The Run Command Tool has a published contract of inputs and outputs to encapsulate the [dev workflow](Dev-workflow.md). It parses arguments and maps properties to tools in order to run the command given by the user. It also provides documentation of the most common settings we use per repo, and the commands we can execute.

The source code of the tool lives in the Build Tools repo, and it is included in the Build Tools package starting with version 1.0.26-prerelease-00601-01.
In order to onboard the Run Command Tool, every repo should have a [run.cmd/sh](../run.cmd) script file that is in charge of:
- Running init-tools (to download the Build Tools package)
- Executing run.exe.

The Run Command Tool uses a config.json file that has all the information needed in order for the tool to work.

Config.json
---------------------------
The config.json file has three major sections:

**Settings**

Properties, variables and settings that are parsed according to the format of the tool that is going to be run. This section provides documentation about what each setting is, the possible values it can have and the default value.

The structure of it is:
```
"settings": {
  "SettingName": {
    "description": "Brief description about the function of the Setting.",
    "valueType": "Value type name.",
    "values": ["Array of possible values for the Setting"],
    "defaultValue": "Default value for the Setting."
  }
}
```
The value type can be specific to the tool used (under the tools section) or one of the run-tool-specific values, passThrough and runToolSetting.

- passThrough: No parsing is needed for the value of the Setting. The run tool will pass along the value of the Setting as-is.

- runToolSetting: A specific setting for the run tool, e.g. the '[RunQuiet](https://github.com/dotnet/buildtools/blob/master/src/Run/Setup.cs#L176)' Setting.

Note: The Run Command Tool needs a `Project` setting to specify the project that specific commands will apply to. It can be specified per command, per alias, or be set by the user.

**Commands**

The set of actions the tool should execute (clean, sync, build …). Each command has a set of `alias` entries that describe different behaviors that the command can run, a `defaultValues` section which contains the `toolName` and default settings that are always going to be passed to the command, and an optional `defaultAlias` value specifying which alias to call in case the tool is invoked with a command without an accompanying alias.

The structure of it is:
```
"commands": {
  "CommandName": {
    "alias": {
      "aliasName": {
        "description": "Brief description about the function of the alias in the given command",
        "settings": {
          "SettingName": "Value for the Setting. The value needs to be part of the values array of the Setting.",
          "SettingName": "If the value is specified as 'default', the command will use the default value defined for the Setting.",
          "Example": "default"
        }
      }
    },
    "defaultValues": {
      "defaultAlias": "Optional alias to use if no alias is passed",
      "toolName": "Each command needs only one tool, here we specify the name of the tool.",
      "settings": {
        "SettingName": "These settings are always going to be applied when calling the command CommandName."
      }
    }
  }
}
```

**Tools**

Set of tools the run command will run (e.g. msbuild, cmd/sh).

The structure of it is:
```
"tools": {
  "toolName": {
    "osSpecific": {
      "windows": {
        "defaultParameters": "values we always want to pass when using this tool.",
        "path": "Where we can find the tool.",
        "filesExtension": "Extension of the files that the tool is going to use."
      },
      "unix": {
        "defaultParameters": "values we always want to pass when using this tool.",
        "path": "Where we can find the tool.",
        "filesExtension": "Extension of the files that the tool is going to use."
      }
    },
    "valueTypes": {
      "typeName": "Explains how to format the Setting for the specific tool."
    }
  }
}
```
Currently we have scripts for Windows (.cmd) and for Unix (.sh) that for discoverability have the same name. The `filesExtension` property is in charge of appending the corresponding extension to the name of the file specified in the `Project` Setting.

For example, using the following config.json:
```
{
  "settings": {
    "Project": {
      "description": "Project where the commands are going to be applied.",
      "valueType": "passThrough",
      "values": [],
      "defaultValue": ""
    }
  },
  "commands": {
    "build-native": {
      "alias": {},
      "defaultValues": {
        "toolName": "terminal",
        "Project": "src/Native/build-native",
        "settings": {}
      }
    }
  },
  "tools": {
    "terminal": {
      "osSpecific": {
        "windows": {
          "filesExtension": "cmd"
        },
        "unix": {
          "filesExtension": "sh"
        }
      },
      "valueTypes": {}
    }
  }
}
```
One could call `run.exe build-native` and, if running on Windows, this would execute the Windows-specific `src/Native/build-native.cmd` script, while the same command on Unix would execute the `src/Native/build-native.sh` bash script.

To access the information located in the config.json file, call `run.cmd -?`. This helps the commands and settings to be self-documented.

Build Tools [Config.json](../config.json).
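
Given a config.json like the one above, typical invocations from the repo root look like the following (on non-Windows platforms run.sh is used instead); both commands are described earlier in this document:

```
:: list the documented commands and settings
run.cmd -?
:: runs src/Native/build-native.cmd on Windows or src/Native/build-native.sh on Unix
run.cmd build-native
```
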

51 external/buildtools/Documentation/Samples/CloudTest.Helix.targets.sampleproject vendored Normal file
@@ -0,0 +1,51 @@
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

  <!-- The only required item type is HelixWorkItem -->
  <ItemGroup>
    <HelixWorkItem Include="Demo Work item #1">
      <Command>call DoStuff.bat</Command>
      <!-- Needs to be a zip file that contains DoStuff.bat, adjust accordingly -->
      <PayloadFile>Path to zip</PayloadFile>
      <WorkItemId>Demo Work item #1</WorkItemId>
      <TimeoutInSeconds>300</TimeoutInSeconds>
    </HelixWorkItem>

    <HelixWorkItem Include="Demo Work item #2">
      <Command>call DoStuff.bat</Command>
      <!-- Needs to be a zip file that contains DoStuff.bat, adjust accordingly -->
      <PayloadFile>Path to zip</PayloadFile>
      <WorkItemId>Demo Work item #2</WorkItemId>
      <TimeoutInSeconds>200</TimeoutInSeconds>
    </HelixWorkItem>

  </ItemGroup>

  <!-- optionally, we can specify Correlation payload files -->
  <ItemGroup>
    <HelixCorrelationPayloadFile Include="CorrelationPayload.zip" />
  </ItemGroup>

  <!-- Required properties -->
  <PropertyGroup>
    <HelixApiEndpoint>https://helix.int-dot.net/api/2016-06-28/jobs</HelixApiEndpoint>
    <HelixApiAccessKey></HelixApiAccessKey>
    <HelixJobType>unspecified/</HelixJobType>
    <HelixSource>pr/unspecified/</HelixSource>
    <TargetQueues>Windows.10.Amd64;Windows.7.Amd64;Windows.81.Amd64;Windows.10.Core.Amd64;</TargetQueues>
    <BuildMoniker>20170301.0000</BuildMoniker>
    <!-- Use any valid Connection string here -->
    <CloudDropConnectionString>...</CloudDropConnectionString>
    <CloudResultsConnectionString>...</CloudResultsConnectionString>
    <ArchivesRoot>.</ArchivesRoot>
    <HelixJobProperties>{ "architecture" : "x64", "configuration": "Debug", "operatingSystem" : "Windows_NT" }</HelixJobProperties>
  </PropertyGroup>

  <!-- Generally need to have the above stuff defined before import... -->
  <Import Project="$(BuildToolsTaskDir)\CloudTest.Helix.targets" />

  <Target Name="Build">
    <Message Text="Beginning Cloud Build!" />
    <HelixCloudBuild/>
  </Target>

</Project>

156 external/buildtools/Documentation/annotated-dependency-props.md vendored Normal file
@@ -0,0 +1,156 @@
# Annotated dependencies.props

This file is used in CoreFX, CoreCLR, WCF, and BuildTools, located in the repository root. Below is a breakdown of [corefx's master dependencies.props](https://github.com/dotnet/corefx/blob/b57a43bb40fc2099e91d641a8b4f8c76a46afe6a/dependencies.props). It is used for dependency auto-upgrade and dependency verification.

``` xml
<PropertyGroup>
  <CoreFxCurrentRef>450606241ffd24c3c9671cd002955a68e98008a7</CoreFxCurrentRef>
  <CoreClrCurrentRef>450606241ffd24c3c9671cd002955a68e98008a7</CoreClrCurrentRef>
  <ExternalCurrentRef>0db1f9d8996a6a05960f79712299652a4b04147f</ExternalCurrentRef>
  <ProjectNTfsCurrentRef>450606241ffd24c3c9671cd002955a68e98008a7</ProjectNTfsCurrentRef>
</PropertyGroup>
```

Source of truth for dependency tooling: the commit hash of the dotnet/versions master branch as of the last auto-upgrade. These are used with the GitHub raw API to download build-infos.

In `/t:UpdateDependenciesAndSubmitPullRequest`, the task first finds the latest CurrentRef of the dotnet/versions repository, which when used with the raw API will return the latest version of *every* build-info. The update proceeds, using that latest build-info data. After updating, the task determines which build-infos were used and updates only the used build-info `*CurrentRef`s in the above part of dependencies.props.

When doing a manual update with `/t:UpdateDependencies`, you need to change these `CurrentRef`s yourself to whatever dotnet/versions commit you want to update to before executing the target.
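
For example, after editing the `*CurrentRef` values, the manual update can be run through corefx's wrapper script as shown below; other repositories invoke MSBuild targets slightly differently (see [project-nuget-dependencies.md](project-nuget-dependencies.md)).

```
build-managed.cmd -- /t:UpdateDependencies
```
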

During dependency verification (`/t:VerifyDependencies` or automatically during `sync`) the targeted build-infos are downloaded and project files are checked for consistency with the build-infos.

``` xml
<!-- Auto-upgraded properties for each build info dependency. -->
<PropertyGroup>
  <CoreFxExpectedPrerelease>beta-24601-02</CoreFxExpectedPrerelease>
  <CoreClrExpectedPrerelease>beta-24603-02</CoreClrExpectedPrerelease>
  <ExternalExpectedPrerelease>beta-24523-00</ExternalExpectedPrerelease>
  <ProjectNTfsExpectedPrerelease>beta-24603-00</ProjectNTfsExpectedPrerelease>
</PropertyGroup>
```

These are auto-updated by `UpdateDependenciesAndSubmitPullRequest` and `UpdateDependencies`, with values taken from `Latest.txt`. They are only used to flow this info into MSBuild, *not* by project.json validation (as they were in older types of dependency auto-update).

These properties are verified to match the downloaded build-info during `VerifyDependencies`.

``` xml
<!-- Full package version strings that are used in other parts of the build. -->
<PropertyGroup>
  <AppXRunnerVersion>1.0.3-prerelease-00826-05</AppXRunnerVersion>
  <XunitPerfAnalysisPackageVersion>1.0.0-alpha-build0040</XunitPerfAnalysisPackageVersion>
</PropertyGroup>
```

Similar to the `*ExpectedPrerelease` properties, but these are for specific named packages. They are used in MSBuild targets during builds. These versions in particular aren't auto-updated because they are tools that don't regularly change.

``` xml
<!-- Package dependency verification/auto-upgrade configuration. -->
<PropertyGroup>
  <BaseDotNetBuildInfo>build-info/dotnet/</BaseDotNetBuildInfo>
  <DependencyBranch>master</DependencyBranch>
  <CurrentRefXmlPath>$(MSBuildThisFileFullPath)</CurrentRefXmlPath>
</PropertyGroup>
```

The first two properties assemble the path to the build-info files that CoreFX master depends on.

`CurrentRefXmlPath` is used by the auto-update targets to determine where `dependencies.props` is.

``` xml
<ItemGroup>
  <RemoteDependencyBuildInfo Include="CoreFx">
    <BuildInfoPath>$(BaseDotNetBuildInfo)corefx/$(DependencyBranch)</BuildInfoPath>
    <CurrentRef>$(CoreFxCurrentRef)</CurrentRef>
  </RemoteDependencyBuildInfo>
  <RemoteDependencyBuildInfo Include="CoreClr">
    <BuildInfoPath>$(BaseDotNetBuildInfo)coreclr/$(DependencyBranch)</BuildInfoPath>
    <CurrentRef>$(CoreClrCurrentRef)</CurrentRef>
  </RemoteDependencyBuildInfo>
  <RemoteDependencyBuildInfo Include="External">
    <BuildInfoPath>$(BaseDotNetBuildInfo)projectk-tfs/$(DependencyBranch)</BuildInfoPath>
    <CurrentRef>$(ExternalCurrentRef)</CurrentRef>
  </RemoteDependencyBuildInfo>
  <RemoteDependencyBuildInfo Include="ProjectNTfs">
    <BuildInfoPath>$(BaseDotNetBuildInfo)projectn-tfs/$(DependencyBranch)</BuildInfoPath>
    <CurrentRef>$(ProjectNTfsCurrentRef)</CurrentRef>
  </RemoteDependencyBuildInfo>

  <DependencyBuildInfo Include="@(RemoteDependencyBuildInfo)">
    <RawVersionsBaseUrl>https://raw.githubusercontent.com/dotnet/versions</RawVersionsBaseUrl>
  </DependencyBuildInfo>

  <XmlUpdateStep Include="CoreFx">
    <Path>$(MSBuildThisFileFullPath)</Path>
    <ElementName>CoreFxExpectedPrerelease</ElementName>
    <BuildInfoName>CoreFx</BuildInfoName>
  </XmlUpdateStep>
  <XmlUpdateStep Include="CoreClr">
    <Path>$(MSBuildThisFileFullPath)</Path>
    <ElementName>CoreClrExpectedPrerelease</ElementName>
    <BuildInfoName>CoreClr</BuildInfoName>
  </XmlUpdateStep>
  <XmlUpdateStep Include="External">
    <Path>$(MSBuildThisFileFullPath)</Path>
    <ElementName>ExternalExpectedPrerelease</ElementName>
    <BuildInfoName>External</BuildInfoName>
  </XmlUpdateStep>
  <XmlUpdateStep Include="ProjectNTfs">
    <Path>$(MSBuildThisFileFullPath)</Path>
    <ElementName>ProjectNTfsExpectedPrerelease</ElementName>
    <BuildInfoName>ProjectNTfs</BuildInfoName>
  </XmlUpdateStep>
</ItemGroup>
```

Each `RemoteDependencyBuildInfo` indicates a build-info to download, which consists of the `Latest.txt` and `Latest_Packages.txt` at a certain path. `CurrentRef` is flowed from the property into the item metadata rather than hard-coded in the metadata for enhanced visibility in auto-update diffs.

The `XmlUpdateStep`s are rules that match the `*ExpectedPrerelease` properties earlier in this file and link them to the build-infos.

``` xml
<!-- Set up dependencies on packages that aren't found in a BuildInfo. -->
<ItemGroup>
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.5" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.5.1" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.5.2" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.6" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.6.1" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.NETFramework.v4.6.2" />
  <TargetingPackDependency Include="Microsoft.TargetingPack.Private.WinRT" />
  <StaticDependency Include="@(TargetingPackDependency)">
    <Version>1.0.1</Version>
  </StaticDependency>

  <XUnitDependency Include="xunit"/>
  <XUnitDependency Include="xunit.runner.utility"/>
  <XUnitDependency Include="xunit.runner.console"/>
  <StaticDependency Include="@(XUnitDependency)">
    <Version>2.1.0</Version>
  </StaticDependency>

  <StaticDependency Include="Microsoft.xunit.netcore.extensions;Microsoft.DotNet.BuildTools.TestSuite">
    <Version>1.0.0-prerelease-00830-02</Version>
  </StaticDependency>

  <PerformancePackDependency Include="Microsoft.DotNet.xunit.performance" />
  <PerformancePackDependency Include="Microsoft.DotNet.xunit.performance.analysis" />
  <PerformancePackDependency Include="Microsoft.DotNet.xunit.performance.analysis.cli" />
  <PerformancePackDependency Include="Microsoft.DotNet.xunit.performance.runner.cli" />
  <PerformancePackDependency Include="Microsoft.DotNet.xunit.performance.runner.Windows" />
  <StaticDependency Include="@(PerformancePackDependency)">
    <Version>$(XunitPerfAnalysisPackageVersion)</Version>
  </StaticDependency>

  <DependencyBuildInfo Include="@(StaticDependency)">
    <PackageId>%(Identity)</PackageId>
    <UpdateStableVersions>true</UpdateStableVersions>
  </DependencyBuildInfo>

  <DependencyBuildInfo Include="uwpRunnerVersion">
    <PackageId>microsoft.xunit.runner.uwp</PackageId>
    <Version>$(AppXRunnerVersion)</Version>
  </DependencyBuildInfo>

</ItemGroup>
```

These are "local" `DependencyBuildInfo`s created to cover packages that aren't in downloaded build-infos because they are external, not published to dotnet/versions, or don't normally change. Specifically, these allow the package versions to be validated in `project.json` files.

115 external/buildtools/Documentation/dependency-auto-update.md vendored Normal file
@@ -0,0 +1,115 @@
# Dependency auto-update

Buildtools provides tooling for automatically updating a repository's dependencies (project.json, msbuild .props, and arbitrary versions in files). Combined with [Maestro][Maestro], this allows a flow where one repository finishes an official build, and all repositories that depend on that repository get an automatically generated pull request from a bot GitHub account.

As of writing, the Repo API Compose commands are not implemented yet. This document describes auto-update in a pre-Repo-API world.


## Parts of auto-update flow

The [versions repo (dotnet/versions)][dotnet/versions] stores information about the last official build of each orchestrated repository in text files.

[Maestro][Maestro] detects changes to files in GitHub and kicks off tasks depending on the file that changed, according to [subscriptions.json][subscriptions]. For dependency auto-update, the relevant files are the build-info files in dotnet/versions.

VSTS builds are used to execute the auto-update pull request generation task. The naming scheme for these build definitions for CoreFX, CoreCLR, and WCF is `Maestro-<Project>GeneralExecutor`.

[VersionTools][VersionTools] is a library that can update and verify dependencies, commit and push changes, and make pull requests. The main BuildTools package includes "wrapper" targets/tasks that run this library, and the library's DLL.


## Auto-update in action

For example, the flow that updated the upgrade PR [coreclr/pull/7472](https://github.com/dotnet/coreclr/pull/7472) with CoreFX master beta-24604-02:

1. The official build writes its output data to the dotnet/versions [build-info/dotnet/corefx/master](https://github.com/dotnet/versions/tree/0e83ecde3e99f5c13b9532e3b06003b85b83f435/build-info/dotnet/corefx/master).
   1. Latest.txt stores the prerelease specifier for all packages published.
   2. Latest_Packages.txt stores the full id and version identifier for each package published.
2. [Maestro][Maestro] detects the build-info commit and the subscription triggers [Maestro-CoreCLRGeneralExecutor #359499](https://devdiv.visualstudio.com/DefaultCollection/DevDiv/_build/index?buildId=359499).
3. The VSTS build runs `/t:UpdateDependenciesAndSubmitPullRequest /p:GitHubUser=dotnet-bot /p:GitHubEmail=dotnet-bot@microsoft.com /p:GitHubAuthToken=******** /p:ProjectRepoOwner=dotnet /p:ProjectRepoName=coreclr /p:ProjectRepoBranch=master /p:NotifyGitHubUsers=dotnet/coreclr-contrib`.
   1. The latest build-info Latest_Packages.txt is downloaded and used to update project.jsons and msbuild files.
   2. The build searches for an auto-update PR that already exists for the branch and finds [coreclr/pull/7472](https://github.com/dotnet/coreclr/pull/7472).
   3. Auto-update PRs are reused when possible, so changes are committed and pushed (`--force`) to the branch for PR 7472.
   4. The build updates the title of PR 7472 to indicate the update that took place. If no auto-update PR had existed to update, it would create a fresh PR instead.
4. CI runs, and when it's green, project maintainers can merge the auto-update PR when convenient.


## Subscription

A subscription is simply the path that [Maestro][Maestro] should listen to, and what build to run when it detects a change. These are defined in [subscriptions.json][subscriptions]. For example, a subscription to create auto-update PRs in WCF for the latest CoreFX build (comments mine):

```
{
  "path": "https://github.com/dotnet/versions/blob/master/build-info/dotnet/corefx/master/Latest.txt",
  "handlers": [
    {
      // The build definition to run, a key to a dict elsewhere in subscriptions.json.
      "maestroAction": "wcf-general",
      // A delay after the change is detected to allow published artifacts to propagate.
      "maestroDelay": "00:10:00",

      // A queue-time variable sent to the build definition. No special Maestro handling.
      // The definition runs the named script with the args given in the "Arguments" variable.
      "ScriptFileName": "build-managed.cmd",
      // Maestro concatenates the list of arguments and passes them at queue time as a string.
      "Arguments": [
        "--",
        "/t:UpdateDependenciesAndSubmitPullRequest",
        "/p:GitHubUser=dotnet-bot",
        "/p:GitHubEmail=dotnet-bot@microsoft.com",
        "/p:GitHubAuthToken=`$(`$Secrets['DotNetBotGitHubPassword'])",
        "/p:ProjectRepoOwner=dotnet",
        "/p:ProjectRepoName=wcf",
        "/p:ProjectRepoBranch=master",
        "/p:NotifyGitHubUsers=dotnet/wcf-contrib",
        "/verbosity:Normal"
      ]
    }
  ]
}
```

([Link to subscription in context.](https://github.com/dotnet/versions/blob/0e83ecde3e99f5c13b9532e3b06003b85b83f435/Maestro/subscriptions.json#L146-L163))

`wcf-general` is [defined](https://github.com/dotnet/versions/blob/0e83ecde3e99f5c13b9532e3b06003b85b83f435/Maestro/subscriptions.json#L51-L56) as follows, pointing to the definition that should be queued:

```
"wcf-general": {
  "vsoInstance": "devdiv.visualstudio.com",
  "vsoProject": "DevDiv",
  "buildDefinitionId": 4226
}
```

### MSBuild arguments

* `/t:UpdateDependenciesAndSubmitPullRequest`: the target that invokes [VersionTools][VersionTools].
* `/p:GitHubUser=dotnet-bot`: the fork owner and committer name.
* `/p:GitHubEmail=dotnet-bot@microsoft.com`: committer email.
* `/p:GitHubAuthToken=`$(`$Secrets['DotNetBotGitHubPassword'])`: fork owner token. The build definition has this as a secret variable, and the PowerShell in this string finds the secret's value and passes it to msbuild.
* `/p:ProjectRepoOwner=dotnet`: the owner of the repo to submit the PR to.
* `/p:ProjectRepoName=wcf`: the name of the repo to submit the PR to.
* `/p:ProjectRepoBranch=master`: the base branch to PR against.
* `/p:NotifyGitHubUsers=dotnet/wcf-contrib`: a user/group to `@`-mention in the PR description. In theory a semicolon-delimited list, but the layers of JSON and PowerShell escaping make it difficult, so subscriptions have only specified a single user/group so far.


## dependencies.props

For an in-depth look at `dependencies.props`, which stores the current dependencies and configures auto-update's specific behavior for a repository, read [annotated-dependency-props.md](annotated-dependency-props.md).


# Adding or removing a subscription

In general, submit a PR for changes to [subscriptions.json][subscriptions].

When adding a subscription to a new branch, if there isn't an existing "non-master" subscription to examine as an example, note that the `vsoSourceBranch` property (sibling of `maestroAction` and `maestroDelay`) tells Maestro which branch to run the GeneralExecutor on.

When adding an entirely new project, a new build definition is also needed. Look at the `Maestro-*GeneralExecutor` definitions as examples and, if possible, clone an existing one and change the repository pointer in VSTS. Add a corresponding entry to the `actions` object in subscriptions.json.

To remove a subscription, delete the entry in the `handlers` array.

When changing subscriptions.json, match the existing comment style as best as possible, because it makes searching the file more consistent.


[Maestro]: https://github.com/dotnet/versions/tree/master/Maestro
[dotnet/versions]: https://github.com/dotnet/versions
[subscriptions]: https://github.com/dotnet/versions/blob/master/Maestro/subscriptions.json
[VersionTools]: https://github.com/dotnet/buildtools/tree/094e239b8c1e7f1495abf8c7dc96c71e56bf6c96/src/Microsoft.DotNet.VersionTools

BIN external/buildtools/Documentation/images/Dev-workflow.jpg vendored Normal file
Binary file not shown. Size: 68 KiB.

38 external/buildtools/Documentation/project-nuget-dependencies.md vendored Normal file
@@ -0,0 +1,38 @@
# Project NuGet Dependencies

## Dependency version verification

The dependencies in each project.json file are validated by a few rules in `dependencies.props` to ensure package versions across the repository stay in sync. Dependencies are normally verified before the NuGet restore step, but to verify manually, run the `VerifyDependencies` MSBuild target.
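
For example, in corefx the verification target can be run through the repo's wrapper script; other repositories have slightly different ways to run targets.

```
build-managed.cmd -- /t:VerifyDependencies
```
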

Errors from failed dependency version validation look like the following:

    C:\git\corefx\Tools\VersionTools.targets(47,5): error : Dependency verification errors detected. To automatically fix based on dependency rules, run the msbuild target 'UpdateDependencies' [C:\git\corefx\build.proj]
    C:\git\corefx\Tools\VersionTools.targets(47,5): error : Dependencies invalid: In 'C:\git\corefx\src\Common\test-runtime\project.json', 'Microsoft.DotNet.BuildTools.TestSuite 1.0.0-prerelease-00704-04' must be '1.0.0-prerelease-00704-05' (Microsoft.DotNet.BuildTools.TestSuite) [C:\git\corefx\build.proj]
    C:\git\corefx\Tools\VersionTools.targets(47,5): error : Dependencies invalid: In 'C:\git\corefx\src\Common\tests\project.json', 'Microsoft.xunit.netcore.extensions 1.0.0-prerelease-00704-04' must be '1.0.0-prerelease-00704-05' (Microsoft.xunit.netcore.extensions) [C:\git\corefx\build.proj]

To automatically fix these by setting the versions to expected values, use the `UpdateDependencies` target. `UpdateDependencies` can also be used to automatically update package versions, described in the next section.

If an expected value looks wrong, edit `dependencies.props` to change it. See [annotated-dependency-props.md](annotated-dependency-props.md) for an explanation of `dependencies.props`.


## Upgrading a package dependency

To update a package that isn't validated by a rule, simply change the project.json file.

Otherwise, follow these steps:

1. Edit `dependencies.props` to match the new expectations. See [annotated-dependency-props.md](annotated-dependency-props.md).
2. Run the dependency update target in the repository root. In corefx, use this command:

       build-managed.cmd -- /t:UpdateDependencies

   Other repositories have slightly different ways to run targets.

3. Commit the automated updates in an independent commit, isolating them from other changes. This makes pull requests easier to review because updates can change many files.

The `UpdateDependencies` target looks through all dependencies, using the validation rules to update any invalid versions. On `/verbosity:Normal` or higher, it logs which files were changed.


## Dependency auto-update

Dependency auto-update uses the dependency update/verify system to submit automated pull requests for package updates. See [dependency-auto-update.md](dependency-auto-update.md) for details.

87 external/buildtools/Documentation/test-targets-design.md vendored Normal file
@@ -0,0 +1,87 @@
# Test Target Design

This document describes the design of the BuildTools test targets. For documentation on their usage, see [Test Targets](test-targets-usage.md).

The primary entry points are the [`Test` target](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L340) in Microsoft.DotNet.Build.Tasks and the [`CloudBuild` target](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L79) in Microsoft.DotNet.Build.CloudTestTasks, which are each described below.

## `Test` target

The `Test` target is used to run or archive tests locally. At a high level, this target performs four steps:

1. Copies the required files for running the tests to the test execution directory.
2. Generates the test execution script, which was introduced for running the tests in distributed automation systems such as [Helix](https://helix.dot.net/), but is also used for running them locally.
3. (Optional) Runs the tests locally.
4. (Optional) Archives the tests, typically for running in Helix.

When the tests are run in Helix, the common test runtime dependencies which aren't directly referenced by the test project are copied from a common location into each test project's test execution directory at test execution time, so that these files aren't duplicated across each project.

The `Test` target depends on the following targets:

#### [`SetupTestProperties`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L325)
Sets properties and items for the `Test` target. For example, `GetDefaultTestRid` sets the `TestNugetRuntimeId` property if it isn't already set, and `CheckTestPlatforms` disables the execution of the tests if the `TargetOS` is unsupported.

#### [`CopyTestToTestDirectory`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L143)

Copies the test dependencies which are specific to the test project (e.g. the binaries for the test project and the projects it directly references) to the test directory. This copy happens at build time in all cases. These items are calculated in [`DiscoverTestInputs`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L125).

The [`CopySupplementalTestData`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L162) target is executed after this target. It copies the `SupplementalTestData` items, which are files that are shared between multiple projects, to the test execution directory. These are copied in a separate step because, unlike the other files, they cannot be copied using hard links; doing so would result in race conditions between the archiving and copying of different links of the file.

#### [`CopyDependenciesToTestDirectory`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L196)

Copies the common test runtime dependencies which aren't directly referenced by the test project to the test execution directory. These items are calculated in [`DiscoverTestDependencies`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L24). This target is only executed when not archiving the tests.

#### [`GenerateTestBindingRedirects`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L151)

Generates assembly binding redirects when running tests against .NET Framework Desktop.

#### [`GenerateTestExecutionScripts`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L168)

Generates a script for running the tests. This script is either a batch file or a Bash script, depending on the `TargetOS`.

The script performs two high-level steps:

1. Copies the common test runtime dependencies calculated in `DiscoverTestDependencies` to the test execution directory. Each copy command no-ops if the file already exists in the test execution directory.
2. Runs the tests.

#### [`RunTestsForProject`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L265)

Runs the tests by invoking the test execution script. This target can be skipped by setting the `SkipTests` property to `True`.

This target is not executed if the [input files](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L129-L133) (the test project's binaries and direct dependencies) have not changed since the tests were last successfully run. This behavior can be overridden by setting the `ForceRunTests` property to `True`. This is implemented by creating a `TestsSuccessfulSemaphore` file when the tests are successfully run and declaring it as one of the `Outputs` of the `RunTestsForProject` target; this semaphore is deleted in `SetupTestProperties` if `ForceRunTests` is true.

#### [`ArchiveTestBuild`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L221)

Archives the test execution directory. This target is only executed if `ArchiveTests` is set to `True`.
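
These targets are driven from the `Test` entry point described in [Test Targets](test-targets-usage.md). As an illustration of how the properties above combine (the project name here is a placeholder):

```
:: archive the tests for Helix without running them locally
msbuild /t:Test /p:SkipTests=true /p:ArchiveTests=true MyLibrary.Tests.csproj
:: force the tests to run even if the input files have not changed
msbuild /t:Test /p:ForceRunTests=true MyLibrary.Tests.csproj
```
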
## Debugging a test project in Visual Studio
|
||||
|
||||
When building a test project in Visual Studio, the build [sets the project's debug settings](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L117-L123) to directly invoke the test program (e.g. the XUnit executable) to run the tests. This is used instead of the test execution script because the tests can only be debugged by attaching directly to the test program's process.
|
||||
|
||||
It then executes a subset of the `Test` subtargets: `CopyTestToTestDirectory`, `CopyDependenciesToTestDirectory`, and `GenerateTestBindingRedirects`. This is achieved by [adding them to `PrepareForRunDependsOn`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/publishtest.targets#L202-L204).
|
||||
|
||||
## `CloudBuild` target
|
||||
|
||||
The `CloudBuild` target is used to run tests in Helix. At a high level, this target performs four steps:
|
||||
|
||||
1. Gathers the list of test archives to upload to Azure.
|
||||
2. Generates JSON files specifying information used when running the tests in Helix.
|
||||
3. Uploads the test archives and other required files for running the tests in Helix.
|
||||
4. Submits a Helix job which downloads the archives from Azure and runs the tests.
|
||||
|
||||
It depends on the following targets:
|
||||
|
||||
#### [`VerifyInputs`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L84)
|
||||
|
||||
Verifies that all required properties have been specified, and then gathers the test archives for this build, optionally filtering them based on `FilterToTestTFM` and `FilterToOSGroup`.
|
||||
|
||||
#### [`PreCloudBuild`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L187)
|
||||
|
||||
Prepares the other files that must be uploaded to Azure for the Helix test run, such as the runner scripts and, when .NET Native is used, the TestILC folder.
|
||||
|
||||
#### [`CreateTestListJson`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L256)
|
||||
|
||||
Generates JSON files specifying information used when running the tests in Helix.
|
||||
|
||||
#### [`UploadContent`](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L408)
|
||||
|
||||
Uploads the test archives and other required files to Azure, and then submits a Helix job which downloads the archives and runs the tests.
|
||||
103
external/buildtools/Documentation/test-targets-usage.md
vendored
Normal file
@@ -0,0 +1,103 @@
|
||||
# Test Targets
|
||||
|
||||
BuildTools provides test targets and related logic for publishing and running tests. The primary entry points are the [`Test` target](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.Tasks/PackageFiles/tests.targets#L340) in Microsoft.DotNet.Build.Tasks and the [`CloudBuild` target](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L79) in Microsoft.DotNet.Build.CloudTestTasks. The `Test` target is used to run or archive tests locally, and the `CloudBuild` target is used to run tests in Helix.
|
||||
|
||||
## Usage scenarios
|
||||
|
||||
*Tips:*
|
||||
- If building in a non-Windows environment, call `<repo-root>/Tools/msbuild.sh` instead of just `msbuild`.
|
||||
- Dotnet repos typically provide `BuildAndTest` and `RebuildAndTest` targets which can be used to build and run tests from a single command, so you may wish to substitute either of those targets for `Test` in the examples below.
|
||||
|
||||
#### Run tests for a project with the default options
|
||||
|
||||
The following command runs tests for System.Collections.Immutable.Tests.csproj using the default options.
|
||||
```
|
||||
msbuild /t:Test System.Collections.Immutable.Tests.csproj
|
||||
```
|
||||
|
||||
#### Run a single XUnit method
|
||||
|
||||
You can use the `XunitOptions` property to override the options passed to the XUnit command that runs the tests. For example, the following command runs only the `System.Security.Cryptography.Pkcs.Tests.CmsRecipientCollectionTests.Twoary` method.
|
||||
|
||||
```
|
||||
msbuild /t:Test "/p:XunitOptions=-method System.Security.Cryptography.Pkcs.Tests.CmsRecipientCollectionTests.Twoary" System.Security.Cryptography.Pkcs.Tests.csproj
|
||||
```
|
||||
|
||||
#### Debug tests in Visual Studio
|
||||
|
||||
1. Open the project's solution in Visual Studio.
|
||||
2. Set the test project as the Startup Project.
|
||||
3. Build (or rebuild) the solution.
|
||||
4. Press F5 to debug the project.
|
||||
|
||||
To modify the XUnit command which will be executed when debugging, edit the *Command line arguments* in the *Debug* section of the test project's properties, for example to debug just a single test method. Note that these changes will be overwritten when rebuilding the project.
|
||||
|
||||
#### Run tests for a specified TFM and/or RID
|
||||
|
||||
To specify the target framework moniker (TFM) on which to run tests, use the `TestTFM` property. Similarly, the runtime ID (RID) can be specified with the `TestNugetRuntimeId` property.
|
||||
|
||||
For example, the first command below runs tests against .NET Framework Desktop, and the second runs tests against .NET Core 5.0 for Windows 10 x64 (using the UWP runner).
|
||||
```
|
||||
msbuild /t:Test /p:TestTFM=net46 /p:TargetGroup=netstandard1.3 /p:OSGroup=Windows_NT System.Collections.Concurrent.Tests.csproj
|
||||
msbuild /t:Test /p:TestTFM=netcore50 /p:TargetGroup=netstandard1.3 /p:OSGroup=Windows_NT /p:TestNugetRuntimeId=win10-x64 System.Collections.Concurrent.Tests.csproj
|
||||
```
|
||||
|
||||
As the above commands suggest, it is often necessary to specify the `TargetGroup` and/or `OSGroup` together with the `TestTFM`. For this purpose, CoreFX provides a .builds file for each test project that specifies the project's supported configurations. These can be used to build a specified TFM by using just the `FilterToTestTFM` property. See [CoreFX's developer guide](https://github.com/dotnet/corefx/blob/master/Documentation/project-docs/developer-guide.md#running-tests-in-a-different-tfm) for more information on this approach.
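For example, a sketch following the CoreFX convention (the exact target and .builds file name vary by repo; the linked developer guide has the authoritative command lines):

```
msbuild /t:BuildAndTest /p:FilterToTestTFM=netcore50 System.Collections.Concurrent.Tests.builds
```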
|
||||
|
||||
#### Build and run tests with .NET Native (Windows only)
|
||||
|
||||
Tests can be compiled and run with .NET Native by specifying the `UseDotNetNativeToolchain` property.
|
||||
|
||||
```
|
||||
msbuild /t:BuildAndTest /p:TestTFM=netcore50aot /p:TestNugetRuntimeId=win10-x64-aot /p:UseDotNetNativeToolchain=true Microsoft.CSharp.Tests.csproj
|
||||
```
|
||||
|
||||
#### Run code coverage tests
|
||||
|
||||
Use the `Coverage` property to run code coverage tests.
|
||||
|
||||
```
|
||||
msbuild /t:Test /p:Coverage=true System.Collections.Immutable.Tests.csproj
|
||||
```
|
||||
|
||||
#### Run performance tests
|
||||
|
||||
Use the `Performance` property to run performance tests.
|
||||
|
||||
```
|
||||
msbuild /t:Test /p:Performance=true System.Collections.Immutable.Tests.csproj
|
||||
```
|
||||
|
||||
#### Archive tests for running remotely
|
||||
|
||||
Use the `ArchiveTests` property to archive tests. This is typically used to prepare for running the tests in Helix.
|
||||
|
||||
```
|
||||
msbuild /t:Test /p:ArchiveTests=true System.Collections.Immutable.Tests.csproj
|
||||
```
|
||||
|
||||
The [CoreFX official build](https://github.com/dotnet/corefx/blob/021c590e6cb166c4bfc62b2c9966e317c37c1ed6/buildpipeline/DotNet-CoreFx-Trusted-Windows-Build-Test.json#L178-L179) provides an example of this usage.
|
||||
|
||||
#### Run tests in Helix
|
||||
|
||||
The `CloudBuild` target can be used to queue a test run in Helix. The required properties are validated at [the top of the `VerifyInputs` target](https://github.com/dotnet/buildtools/blob/87422f6cb8/src/Microsoft.DotNet.Build.CloudTestTasks/PackageFiles/CloudTest.targets#L85-L98).
|
||||
|
||||
See [the CoreFX official build](https://github.com/dotnet/corefx/blob/021c590e6cb166c4bfc62b2c9966e317c37c1ed6/buildpipeline/DotNet-CoreFx-Trusted-Windows-Build-Test.json#L215-L216) for an example.
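As a hedged sketch, a filtered Helix run might be queued as shown below; the full set of required properties is repo-specific and is checked by `VerifyInputs`, so additional `/p:` values will typically be needed:

```
msbuild /t:CloudBuild /p:FilterToTestTFM=netcore50 /p:FilterToOSGroup=Windows_NT
```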
|
||||
|
||||
## Other common properties
|
||||
|
||||
The following Boolean properties can also be used when running the `Test` target:
|
||||
- `ForceRunTests`: Run tests even if the input files haven't changed since the tests were last successfully run.
|
||||
- `SkipTests`: Skip running tests.
|
||||
- `Outerloop`: Include outerloop tests in the test execution (see the example after this list).
|
||||
- `TestWithLocalLibraries`: Use locally-built libraries for all test dependencies, rather than using packages for the dependencies not directly referenced by the test project.
|
||||
- `TestWithLocalNativeLibraries`: Use locally-built native libraries.
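For example, outerloop tests can be included in a local run of one of the projects from the earlier examples:

```
msbuild /t:Test /p:Outerloop=true System.Collections.Immutable.Tests.csproj
```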
|
||||
|
||||
The following properties, each specified as a semicolon-separated list, can be used to specify which XUnit test categories should be run:
|
||||
- `WithCategories`: Run tests for these categories.
|
||||
- `WithoutCategories`: Do not run tests for these categories.
|
||||
|
||||
For example, tests in the `OuterLoop` and `failing` categories are excluded by default, but you can run only the tests which are in either of those two categories with the following command:
|
||||
```
|
||||
msbuild /t:Test /p:WithCategories="OuterLoop;failing"
|
||||
```
|
||||
2
external/buildtools/DotnetCLIVersion.txt
vendored
@@ -1 +1 @@
|
||||
1.0.0-beta-002173
|
||||
1.0.0-preview2-002733
|
||||
4
external/buildtools/README.md
vendored
@@ -1,6 +1,8 @@
|
||||
# .NET Core Build Tools
|
||||
|
||||
[](http://dotnet-ci.cloudapp.net/job/dotnet_buildtools/job/innerloop/)
|
||||
[](https://ci.dot.net/job/dotnet_buildtools/job/master/job/innerloop/)
|
||||
|
||||
[](https://dotnet.myget.org/gallery/dotnet-buildtools/)
|
||||
|
||||
This repository contains supporting build tools that are necessary for building
|
||||
the [.NET Core][dotnet-corefx] projects. These projects consume the build tools
|
||||
|
||||
27
external/buildtools/THIRD-PARTY-NOTICES
vendored
Normal file
@@ -0,0 +1,27 @@
|
||||
.NET Core uses third-party libraries or other resources that may be
|
||||
distributed under licenses different than the .NET Core software.
|
||||
|
||||
In the event that we accidentally failed to list a required notice, please
|
||||
bring it to our attention. Post an issue or email us:
|
||||
|
||||
dotnet@microsoft.com
|
||||
|
||||
The attached notices are provided for information only.
|
||||
|
||||
License notice for xUnit.net
|
||||
----------------------------
|
||||
|
||||
Copyright (c) .NET Foundation and Contributors
|
||||
All Rights Reserved
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
148
external/buildtools/bootstrap/bootstrap.ps1
vendored
Normal file
@@ -0,0 +1,148 @@
|
||||
param
|
||||
(
|
||||
[Parameter(Mandatory=$false)][string]$RepositoryRoot = $PSScriptRoot,
|
||||
[Parameter(Mandatory=$false)][string]$ToolsLocalPath = (Join-Path $RepositoryRoot "Tools"),
|
||||
[Parameter(Mandatory=$false)][string]$CliLocalPath = (Join-Path $ToolsLocalPath "dotnetcli"),
|
||||
[Parameter(Mandatory=$false)][string]$SharedFrameworkSymlinkPath = (Join-Path $ToolsLocalPath "dotnetcli\shared\Microsoft.NETCore.App\version"),
|
||||
[Parameter(Mandatory=$false)][string]$SharedFrameworkVersion = "<auto>",
|
||||
[Parameter(Mandatory=$false)][string]$Architecture = "<auto>",
|
||||
[Parameter(Mandatory=$false)][string]$DotNetInstallBranch = "rel/1.0.0",
|
||||
[switch]$Force = $false
|
||||
)
|
||||
|
||||
$rootToolVersions = Join-Path $RepositoryRoot ".toolversions"
|
||||
$bootstrapComplete = Join-Path $ToolsLocalPath "bootstrap.complete"
|
||||
|
||||
# if the force switch is specified delete the semaphore file if it exists
|
||||
if ($Force -and (Test-Path $bootstrapComplete))
|
||||
{
|
||||
del $bootstrapComplete
|
||||
}
|
||||
|
||||
# if the semaphore file exists and is identical to the specified version then exit
|
||||
if ((Test-Path $bootstrapComplete) -and !(Compare-Object (Get-Content $rootToolVersions) (Get-Content $bootstrapComplete)))
|
||||
{
|
||||
exit 0
|
||||
}
|
||||
|
||||
$initCliScript = "dotnet-install.ps1"
|
||||
$dotnetInstallPath = Join-Path $ToolsLocalPath $initCliScript
|
||||
|
||||
# blow away the tools directory so we can start from a known state
|
||||
if (Test-Path $ToolsLocalPath)
|
||||
{
|
||||
# if the bootstrap.ps1 script was downloaded to the tools directory don't delete it
|
||||
rd -recurse -force $ToolsLocalPath -exclude "bootstrap.ps1"
|
||||
}
|
||||
else
|
||||
{
|
||||
mkdir $ToolsLocalPath | Out-Null
|
||||
}
|
||||
|
||||
# download CLI boot-strapper script
|
||||
Invoke-WebRequest "https://raw.githubusercontent.com/dotnet/cli/$DotNetInstallBranch/scripts/obtain/dotnet-install.ps1" -OutFile $dotnetInstallPath
|
||||
|
||||
# load the version of the CLI
|
||||
$rootCliVersion = Join-Path $RepositoryRoot ".cliversion"
|
||||
$dotNetCliVersion = Get-Content $rootCliVersion
|
||||
|
||||
if (-Not (Test-Path $CliLocalPath))
|
||||
{
|
||||
mkdir $CliLocalPath | Out-Null
|
||||
}
|
||||
|
||||
# now execute the script
|
||||
Write-Host "$dotnetInstallPath -Version $dotNetCliVersion -InstallDir $CliLocalPath -Architecture ""$Architecture"""
|
||||
Invoke-Expression "$dotnetInstallPath -Version $dotNetCliVersion -InstallDir $CliLocalPath -Architecture ""$Architecture"""
|
||||
if ($LastExitCode -ne 0)
|
||||
{
|
||||
Write-Output "The .NET CLI installation failed with exit code $LastExitCode"
|
||||
exit $LastExitCode
|
||||
}
|
||||
|
||||
# create a junction to the shared FX version directory. this is
|
||||
# so we have a stable path to dotnet.exe regardless of version.
|
||||
$runtimesPath = Join-Path $CliLocalPath "shared\Microsoft.NETCore.App"
|
||||
if ($SharedFrameworkVersion -eq "<auto>")
|
||||
{
|
||||
$SharedFrameworkVersion = Get-ChildItem $runtimesPath -Directory | % { New-Object System.Version($_) } | Sort-Object -Descending | Select-Object -First 1
|
||||
}
|
||||
$junctionTarget = Join-Path $runtimesPath $SharedFrameworkVersion
|
||||
$junctionParent = Split-Path $SharedFrameworkSymlinkPath -Parent
|
||||
if (-Not (Test-Path $junctionParent))
|
||||
{
|
||||
mkdir $junctionParent | Out-Null
|
||||
}
|
||||
if (-Not (Test-Path $SharedFrameworkSymlinkPath))
|
||||
{
|
||||
cmd.exe /c mklink /j $SharedFrameworkSymlinkPath $junctionTarget | Out-Null
|
||||
}
|
||||
|
||||
# create a project.json for the packages to restore
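# the generated file has the shape:
#   { "dependencies": { "<tool-name>": "<version>", ... }, "frameworks": { "netcoreapp1.0": { } } }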
|
||||
$projectJson = Join-Path $ToolsLocalPath "project.json"
|
||||
$pjContent = "{ `"dependencies`": {"
|
||||
|
||||
$tools = Get-Content $rootToolVersions
|
||||
foreach ($tool in $tools)
|
||||
{
|
||||
$name, $version = $tool.split("=")
|
||||
$pjContent = $pjContent + "`"$name`": `"$version`","
|
||||
}
|
||||
$pjContent = $pjContent + "}, `"frameworks`": { `"netcoreapp1.0`": { } } }"
|
||||
$pjContent | Out-File $projectJson
|
||||
|
||||
# now restore the packages
|
||||
$buildToolsSource = "https://dotnet.myget.org/F/dotnet-buildtools/api/v3/index.json"
|
||||
$nugetOrgSource = "https://api.nuget.org/v3/index.json"
|
||||
if ($env:buildtools_source -ne $null)
|
||||
{
|
||||
$buildToolsSource = $env:buildtools_source
|
||||
}
|
||||
$packagesPath = Join-Path $RepositoryRoot "packages"
|
||||
$dotNetExe = Join-Path $cliLocalPath "dotnet.exe"
|
||||
$restoreArgs = "restore $projectJson --packages $packagesPath --source $buildToolsSource --source $nugetOrgSource"
|
||||
$process = Start-Process -Wait -NoNewWindow -FilePath $dotNetExe -ArgumentList $restoreArgs -PassThru
|
||||
if ($process.ExitCode -ne 0)
|
||||
{
|
||||
exit $process.ExitCode
|
||||
}
|
||||
|
||||
# now stage the contents to tools directory and run any init scripts
|
||||
foreach ($tool in $tools)
|
||||
{
|
||||
$name, $version = $tool.split("=")
|
||||
|
||||
# verify that the version we expect is what was restored
|
||||
$pkgVerPath = Join-Path $packagesPath "$name\$version"
|
||||
if (-not (Test-Path $pkgVerPath))
|
||||
{
|
||||
Write-Output "Directory '$pkgVerPath' doesn't exist, ensure that the version restored matches the version specified."
|
||||
exit 1
|
||||
}
|
||||
|
||||
# at present we have the following conventions when staging package content:
|
||||
# 1. if a package contains a "tools" directory then recursively copy its contents
|
||||
# to a directory named the package ID that's under $ToolsLocalPath.
|
||||
# 2. if a package contains a "lib" directory then recursively copy its contents
|
||||
# under the $ToolsLocalPath directory.
|
||||
# 3. if a package contains a file "lib\init-tools.cmd" execute it.
|
||||
|
||||
if (Test-Path (Join-Path $pkgVerPath "tools"))
|
||||
{
|
||||
$destination = Join-Path $ToolsLocalPath $name
|
||||
mkdir $destination | Out-Null
|
||||
copy (Join-Path $pkgVerPath "tools\*") $destination -recurse
|
||||
}
|
||||
elseif (Test-Path (Join-Path $pkgVerPath "lib"))
|
||||
{
|
||||
copy (Join-Path $pkgVerPath "lib\*") $ToolsLocalPath -recurse
|
||||
}
|
||||
|
||||
if (Test-Path (Join-Path $pkgVerPath "lib\init-tools.cmd"))
|
||||
{
|
||||
cmd.exe /c (Join-Path $pkgVerPath "lib\init-tools.cmd") $RepositoryRoot $dotNetExe $ToolsLocalPath | Out-File (Join-Path $RepositoryRoot "Init-$name.log")
|
||||
}
|
||||
}
|
||||
|
||||
# write semaphore file
|
||||
copy $rootToolVersions $bootstrapComplete
|
||||
274
external/buildtools/bootstrap/bootstrap.sh
vendored
Normal file
@@ -0,0 +1,274 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
# Stop script on NZEC (non-zero exit code)
|
||||
set -e
|
||||
# Stop script if unbound variable found (use ${var:-} if intentional)
|
||||
set -u
|
||||
# By default cmd1 | cmd2 returns the exit code of cmd2 regardless of whether cmd1 succeeded,
|
||||
# so use pipefail to make the pipeline fail if any command in it fails
|
||||
set -o pipefail
|
||||
|
||||
# Use in the functions: eval $invocation
|
||||
invocation='say_verbose "Calling: ${FUNCNAME[0]}"'
|
||||
|
||||
# standard output may be used as a return value in the functions
|
||||
# we need a way to write text on the screen in the functions so that
|
||||
# it won't interfere with the return value.
|
||||
# Exposing stream 3 as a pipe to standard output of the script itself
|
||||
exec 3>&1
|
||||
|
||||
say_err() {
|
||||
printf "%b\n" "bootstrap: Error: $1" >&2
|
||||
}
|
||||
|
||||
say() {
|
||||
# using stream 3 (defined in the beginning) to not interfere with stdout of functions
|
||||
# which may be used as return value
|
||||
printf "%b\n" "bootstrap: $1" >&3
|
||||
}
|
||||
|
||||
say_verbose() {
|
||||
if [ "$verbose" = true ]; then
|
||||
say "$1"
|
||||
fi
|
||||
}
|
||||
|
||||
machine_has() {
|
||||
eval $invocation
|
||||
|
||||
hash "$1" > /dev/null 2>&1
|
||||
return $?
|
||||
}
|
||||
|
||||
check_min_reqs() {
|
||||
if ! machine_has "curl"; then
|
||||
say_err "curl is required to download dotnet. Install curl to proceed."
|
||||
return 1
|
||||
fi
|
||||
|
||||
return 0
|
||||
}
|
||||
|
||||
# args:
|
||||
# remote_path - $1
|
||||
# [out_path] - $2 - stdout if not provided
|
||||
download() {
|
||||
eval $invocation
|
||||
|
||||
local remote_path=$1
|
||||
local out_path=${2:-}
|
||||
|
||||
local failed=false
|
||||
if [ -z "$out_path" ]; then
|
||||
curl --retry 10 -sSL --create-dirs $remote_path || failed=true
|
||||
else
|
||||
curl --retry 10 -sSL --create-dirs -o $out_path $remote_path || failed=true
|
||||
fi
|
||||
|
||||
if [ "$failed" = true ]; then
|
||||
say_err "Download failed"
|
||||
return 1
|
||||
fi
|
||||
}
|
||||
|
||||
verbose=false
|
||||
repoRoot=`pwd`
|
||||
toolsLocalPath="<auto>"
|
||||
cliLocalPath="<auto>"
|
||||
symlinkPath="<auto>"
|
||||
sharedFxVersion="<auto>"
|
||||
force=
|
||||
forcedCliLocalPath="<none>"
|
||||
architecture="<auto>"
|
||||
dotNetInstallBranch="rel/1.0.0"
|
||||
|
||||
while [ $# -ne 0 ]
|
||||
do
|
||||
name=$1
|
||||
case $name in
|
||||
-r|--repositoryRoot|-[Rr]epositoryRoot)
|
||||
shift
|
||||
repoRoot="$1"
|
||||
;;
|
||||
-t|--toolsLocalPath|-[Tt]oolsLocalPath)
|
||||
shift
|
||||
toolsLocalPath="$1"
|
||||
;;
|
||||
-c|--cliInstallPath|--cliLocalPath|-[Cc]liLocalPath)
|
||||
shift
|
||||
cliLocalPath="$1"
|
||||
;;
|
||||
-u|--useLocalCli|-[Uu]seLocalCli)
|
||||
shift
|
||||
forcedCliLocalPath="$1"
|
||||
;;
|
||||
-a|--architecture|-[Aa]rchitecture)
|
||||
shift
|
||||
architecture="$1"
|
||||
;;
|
||||
--dotNetInstallBranch|-[Dd]ot[Nn]et[Ii]nstall[Bb]ranch)
|
||||
shift
|
||||
dotNetInstallBranch="$1"
|
||||
;;
|
||||
--sharedFrameworkSymlinkPath|--symlink|-[Ss]haredFrameworkSymlinkPath)
|
||||
shift
|
||||
symlinkPath="$1"
|
||||
;;
|
||||
--sharedFrameworkVersion|-[Ss]haredFrameworkVersion)
|
||||
sharedFxVersion="$1"
|
||||
;;
|
||||
--force|-[Ff]orce)
|
||||
force=true
|
||||
;;
|
||||
-v|--verbose|-[Vv]erbose)
|
||||
verbose=true
|
||||
;;
|
||||
*)
|
||||
say_err "Unknown argument \`$name\`"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
|
||||
shift
|
||||
done
|
||||
|
||||
if [ $toolsLocalPath = "<auto>" ]; then
|
||||
toolsLocalPath="$repoRoot/Tools"
|
||||
fi
|
||||
|
||||
if [ $cliLocalPath = "<auto>" ]; then
|
||||
if [ $forcedCliLocalPath = "<none>" ]; then
|
||||
cliLocalPath="$toolsLocalPath/dotnetcli"
|
||||
else
|
||||
cliLocalPath=$forcedCliLocalPath
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ $symlinkPath = "<auto>" ]; then
|
||||
symlinkPath="$toolsLocalPath/dotnetcli/shared/Microsoft.NETCore.App/version"
|
||||
fi
|
||||
|
||||
rootToolVersions="$repoRoot/.toolversions"
|
||||
bootstrapComplete="$toolsLocalPath/bootstrap.complete"
|
||||
|
||||
# if the force switch is specified delete the semaphore file if it exists
|
||||
if [[ $force && -f $bootstrapComplete ]]; then
|
||||
rm -f $bootstrapComplete
|
||||
fi
|
||||
|
||||
# if the semaphore file exists and is identical to the specified version then exit
|
||||
if [[ -f $bootstrapComplete && ! `cmp $bootstrapComplete $rootToolVersions` ]]; then
|
||||
say "$bootstrapComplete appears to show that bootstrapping is complete. Use --force if you want to re-bootstrap."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
initCliScript="dotnet-install.sh"
|
||||
dotnetInstallPath="$toolsLocalPath/$initCliScript"
|
||||
|
||||
# blow away the tools directory so we can start from a known state
|
||||
if [ -d $toolsLocalPath ]; then
|
||||
# if the bootstrap.sh script was downloaded to the tools directory don't delete it
|
||||
find $toolsLocalPath -type f -not -name bootstrap.sh -exec rm -f {} \;
|
||||
else
|
||||
mkdir $toolsLocalPath
|
||||
fi
|
||||
|
||||
if [ $forcedCliLocalPath = "<none>" ]; then
|
||||
check_min_reqs
|
||||
|
||||
# download CLI boot-strapper script
|
||||
download "https://raw.githubusercontent.com/dotnet/cli/$dotNetInstallBranch/scripts/obtain/dotnet-install.sh" "$dotnetInstallPath"
|
||||
chmod u+x "$dotnetInstallPath"
|
||||
|
||||
# load the version of the CLI
|
||||
rootCliVersion="$repoRoot/.cliversion"
|
||||
dotNetCliVersion=`cat $rootCliVersion`
|
||||
|
||||
if [ ! -e $cliLocalPath ]; then
|
||||
mkdir -p "$cliLocalPath"
|
||||
fi
|
||||
|
||||
# now execute the script
|
||||
say_verbose "installing CLI: $dotnetInstallPath --version \"$dotNetCliVersion\" --install-dir $cliLocalPath --architecture \"$architecture\""
|
||||
$dotnetInstallPath --version "$dotNetCliVersion" --install-dir $cliLocalPath --architecture "$architecture"
    # capture the exit code immediately; $? would otherwise be overwritten by the [ ] test below
    installExitCode=$?
|
||||
if [ $installExitCode != 0 ]; then
|
||||
say_err "The .NET CLI installation failed with exit code $?"
|
||||
exit $installExitCode
|
||||
fi
|
||||
fi
|
||||
|
||||
runtimesPath="$cliLocalPath/shared/Microsoft.NETCore.App"
|
||||
if [ $sharedFxVersion = "<auto>" ]; then
|
||||
# OSX doesn't support --version-sort, https://stackoverflow.com/questions/21394536/how-to-simulate-sort-v-on-mac-osx
|
||||
sharedFxVersion=`ls $runtimesPath | sed 's/^[0-9]\./0&/; s/\.\([0-9]\)$/.0\1/; s/\.\([0-9]\)\./.0\1./g; s/\.\([0-9]\)\./.0\1./g' | sort -r | sed 's/^0// ; s/\.0/./g' | head -n 1`
|
||||
fi
|
||||
|
||||
# create a junction to the shared FX version directory. this is
|
||||
# so we have a stable path to dotnet.exe regardless of version.
|
||||
junctionTarget="$runtimesPath/$sharedFxVersion"
|
||||
junctionParent="$(dirname "$symlinkPath")"
|
||||
|
||||
if [ ! -d $junctionParent ]; then
|
||||
mkdir -p $junctionParent
|
||||
fi
|
||||
|
||||
if [ ! -e $symlinkPath ]; then
|
||||
ln -s $junctionTarget $symlinkPath
|
||||
fi
|
||||
|
||||
# create a project.json for the packages to restore
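# the generated file has the shape:
#   { "dependencies": { "<tool-name>": "<version>", ... }, "frameworks": { "netcoreapp1.0": { } } }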
|
||||
projectJson="$toolsLocalPath/project.json"
|
||||
pjContent="{ \"dependencies\": {"
|
||||
while read v; do
|
||||
IFS='=' read -r -a line <<< "$v"
|
||||
pjContent="$pjContent \"${line[0]}\": \"${line[1]}\","
|
||||
done <$rootToolVersions
|
||||
pjContent="$pjContent }, \"frameworks\": { \"netcoreapp1.0\": { } } }"
|
||||
echo $pjContent > $projectJson
|
||||
|
||||
# now restore the packages
|
||||
buildToolsSource="${BUILDTOOLS_SOURCE:-https://dotnet.myget.org/F/dotnet-buildtools/api/v3/index.json}"
|
||||
nugetOrgSource="https://api.nuget.org/v3/index.json"
|
||||
|
||||
packagesPath="$repoRoot/packages"
|
||||
dotNetExe="$cliLocalPath/dotnet"
|
||||
restoreArgs="restore $projectJson --packages $packagesPath --source $buildToolsSource --source $nugetOrgSource"
|
||||
say_verbose "Running $dotNetExe $restoreArgs"
|
||||
$dotNetExe $restoreArgs
# capture the exit code immediately; $? would otherwise be overwritten by the [ ] test below
restoreExitCode=$?
|
||||
if [ $restoreExitCode != 0 ]; then
|
||||
say_err "project.json restore failed with exit code $?"
|
||||
exit $restoreExitCode
|
||||
fi
|
||||
|
||||
# now stage the contents to tools directory and run any init scripts
|
||||
while read v; do
|
||||
IFS='=' read -r -a line <<< "$v"
|
||||
# verify that the version we expect is what was restored
|
||||
pkgVerPath="$packagesPath/${line[0]}/${line[1]}"
|
||||
if [ ! -d $pkgVerPath ]; then
|
||||
say_err "Directory $pkgVerPath doesn't exist, ensure that the version restore matches the version specified."
|
||||
exit 1
|
||||
fi
|
||||
# at present we have the following conventions when staging package content:
|
||||
# 1. if a package contains a "tools" directory then recursively copy its contents
|
||||
# to a directory named the package ID that's under $ToolsLocalPath.
|
||||
# 2. if a package contains a "lib" directory then recursively copy its contents
|
||||
# under the $ToolsLocalPath directory.
|
||||
# 3. if a package contains a file "lib/init-tools.sh" execute it.
|
||||
if [ -d "$pkgVerPath/tools" ]; then
|
||||
destination="$toolsLocalPath/${line[0]}"
|
||||
mkdir -p $destination
|
||||
cp -r $pkgVerPath/tools/* $destination
|
||||
fi
|
||||
if [ -d "$pkgVerPath/lib" ]; then
|
||||
cp -r $pkgVerPath/lib/* $toolsLocalPath
|
||||
fi
|
||||
if [ -f "$pkgVerPath/lib/init-tools.sh" ]; then
|
||||
"$pkgVerPath/lib/init-tools.sh" "$repoRoot" "$dotNetExe" "$toolsLocalPath" > "init-${line[0]}.log"
|
||||
fi
|
||||
done <$rootToolVersions
|
||||
|
||||
cp $rootToolVersions $bootstrapComplete
|
||||
|
||||
say "Bootstrap finished successfully."
|
||||
|
||||
57
external/buildtools/build.cmd
vendored
@@ -1,55 +1,2 @@
|
||||
@echo off
|
||||
setlocal
|
||||
|
||||
:: Note: We've disabled node reuse because it causes file locking issues.
|
||||
:: The issue is that we extend the build with our own targets which
|
||||
:: means that that rebuilding cannot successfully delete the task
|
||||
:: assembly.
|
||||
|
||||
if not defined VisualStudioVersion (
|
||||
if defined VS140COMNTOOLS (
|
||||
call "%VS140COMNTOOLS%\VsDevCmd.bat"
|
||||
goto :EnvSet
|
||||
)
|
||||
|
||||
if defined VS120COMNTOOLS (
|
||||
call "%VS120COMNTOOLS%\VsDevCmd.bat"
|
||||
goto :EnvSet
|
||||
)
|
||||
|
||||
echo Error: build.cmd requires Visual Studio 2013 or 2015.
|
||||
echo Please see https://github.com/dotnet/corefx/blob/master/Documentation/developer-guide.md for build instructions.
|
||||
exit /b 1
|
||||
)
|
||||
|
||||
:EnvSet
|
||||
|
||||
call %~dp0init-tools.cmd
|
||||
|
||||
:: Log build command line
|
||||
set _buildproj=%~dp0build.proj
|
||||
set _buildlog=%~dp0msbuild.log
|
||||
set _buildprefix=echo
|
||||
set _buildpostfix=^> "%_buildlog%"
|
||||
call :build %*
|
||||
|
||||
:: Build
|
||||
set _buildprefix=
|
||||
set _buildpostfix=
|
||||
call :build %*
|
||||
|
||||
goto :AfterBuild
|
||||
|
||||
:build
|
||||
%_buildprefix% msbuild "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=normal;LogFile="%_buildlog%";Append %* %_buildpostfix%
|
||||
set BUILDERRORLEVEL=%ERRORLEVEL%
|
||||
goto :eof
|
||||
|
||||
:AfterBuild
|
||||
|
||||
echo.
|
||||
:: Pull the build summary from the log file
|
||||
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
|
||||
echo Build Exit Code = %BUILDERRORLEVEL%
|
||||
|
||||
exit /b %BUILDERRORLEVEL%
|
||||
@call run.cmd build-managed %*
|
||||
@exit /b %ERRORLEVEL%
|
||||
40
external/buildtools/build.proj
vendored
@@ -6,6 +6,13 @@
|
||||
<PropertyGroup>
|
||||
<SerializeProjects>true</SerializeProjects>
|
||||
</PropertyGroup>
|
||||
|
||||
<PropertyGroup>
|
||||
<!-- To disable the restoration of packages, set RestoreDuringBuild=false or pass /p:RestoreDuringBuild=false.-->
|
||||
<RestoreDuringBuild Condition="'$(RestoreDuringBuild)'==''">true</RestoreDuringBuild>
|
||||
</PropertyGroup>
|
||||
|
||||
<Import Project="$(ToolsDir)VersionTools.targets" Condition="Exists('$(ToolsDir)VersionTools.targets')" />
|
||||
|
||||
<ItemGroup>
|
||||
<Project Include="src\dirs.proj" />
|
||||
@@ -16,17 +23,19 @@
|
||||
<Import Project="dir.targets" />
|
||||
|
||||
<Import Project="dir.traversal.targets" />
|
||||
|
||||
<Import Project="$(ToolsDir)clean.targets" />
|
||||
|
||||
<PropertyGroup>
|
||||
<TraversalBuildDependsOn>
|
||||
ValidateAllProjectDependencies;
|
||||
BatchRestorePackages;
|
||||
ValidateExactRestore;
|
||||
CreateOrUpdateCurrentVersionFile;
|
||||
$(TraversalBuildDependsOn);
|
||||
</TraversalBuildDependsOn>
|
||||
</PropertyGroup>
|
||||
|
||||
<Target Name="BatchRestorePackages">
|
||||
<Target Name="BatchRestorePackages" Condition="'$(RestoreDuringBuild)'=='true'" DependsOnTargets="VerifyDependencies">
|
||||
<Message Importance="High" Text="Restoring all packages..." />
|
||||
|
||||
<Exec Command="$(DnuRestoreCommand) $(DnuRestoreDirs)" StandardOutputImportance="Low" CustomErrorRegularExpression="^Unable to locate .*" />
|
||||
@@ -37,23 +46,6 @@
|
||||
<Exec Condition="'@(_allPackagesConfigs)' != ''" Command="$(NugetRestoreCommand) "%(_allPackagesConfigs.FullPath)"" StandardOutputImportance="Low" />
|
||||
</Target>
|
||||
|
||||
<!-- Task from buildtools that validates dependencies contained in project.json files. -->
|
||||
<UsingTask TaskName="ValidateProjectDependencyVersions" AssemblyFile="$(BuildToolsTaskDir)Microsoft.DotNet.Build.Tasks.dll" />
|
||||
|
||||
<Target Name="ValidateAllProjectDependencies"
|
||||
Condition="'$(ValidatePackageVersions)'=='true' and '@(ProjectJsonFiles)'!=''">
|
||||
<ValidateProjectDependencyVersions ProjectJsons="@(ProjectJsonFiles)"
|
||||
ProhibitFloatingDependencies="$(ProhibitFloatingDependencies)"
|
||||
ValidationPatterns="@(ValidationPattern)" />
|
||||
</Target>
|
||||
|
||||
<Target Name="UpdateInvalidPackageVersions">
|
||||
<ValidateProjectDependencyVersions ProjectJsons="@(ProjectJsonFiles)"
|
||||
ProhibitFloatingDependencies="$(ProhibitFloatingDependencies)"
|
||||
ValidationPatterns="@(ValidationPattern)"
|
||||
UpdateInvalidDependencies="true" />
|
||||
</Target>
|
||||
|
||||
<!-- Tasks from buildtools for easy project.json dependency updates -->
|
||||
<UsingTask TaskName="UpdatePackageDependencyVersion" AssemblyFile="$(BuildToolsTaskDir)Microsoft.DotNet.Build.Tasks.dll" />
|
||||
|
||||
@@ -63,12 +55,20 @@
|
||||
OldVersion="$(OldVersion)"
|
||||
NewVersion="$(NewVersion)" />
|
||||
</Target>
|
||||
|
||||
<!-- Task from buildtools that uses lockfiles to validate that packages restored are exactly what were specified. -->
|
||||
<UsingTask TaskName="ValidateExactRestore" AssemblyFile="$(BuildToolsTaskDir)Microsoft.DotNet.Build.Tasks.dll" />
|
||||
|
||||
<Target Name="ValidateExactRestore"
|
||||
Condition="'$(AllowInexactRestore)'!='true'">
|
||||
<ValidateExactRestore ProjectLockJsons="@(ProjectJsonFiles->'%(RootDir)%(Directory)%(Filename).lock.json')" />
|
||||
</Target>
|
||||
|
||||
<!-- Override RestorePackages from dir.traversal.targets and do a batch restore -->
|
||||
<Target Name="RestorePackages" DependsOnTargets="BatchRestorePackages" />
|
||||
|
||||
<!-- Override clean from dir.traversal.targets and just remove the full BinDir -->
|
||||
<Target Name="Clean">
|
||||
<Target Name="CleanAllProjects">
|
||||
<RemoveDir Directories="$(BinDir)" />
|
||||
</Target>
|
||||
|
||||
|
||||