We're modifying an app that rebuilds everything in a Visual Studio solution. We were using BuildEngine but since it's deprecated, we're moving to Microsoft.Build.
The code looks like this:
var projectCollection = new Microsoft.Build.Evaluation.ProjectCollection();
var globalProperty = new Dictionary<String, String>();
var buildRequest = new Microsoft.Build.Execution.BuildRequestData(Directory.GetFiles(@"Build\", "*.sln").First(), globalProperty, null, new String[] { "Clean", "Build" }, null);
Microsoft.Build.Execution.BuildManager.DefaultBuildManager.Build(new Microsoft.Build.Execution.BuildParameters(projectCollection), buildRequest);
// 2nd time's a charm?
// Microsoft.Build.Execution.BuildManager.DefaultBuildManager.Build(new Microsoft.Build.Execution.BuildParameters(projectCollection), buildRequest);
But the projects are not being rebuilt. However, if the Build method is run twice (as in the commented-out code above), the projects are rebuilt properly.
I have created a Roslyn analyzer project, which produces a NuGet package and a DLL. I want to use that DLL in a standalone code analysis project. How can I do that? For example, I have the following code:
MSBuildLocator.RegisterDefaults();
var filePath = @"C:\Users\user\repos\ConsoleApp\ConsoleApp.sln";
var msbws = MSBuildWorkspace.Create();
var soln = await msbws.OpenSolutionAsync(filePath);
var errors = new List<Diagnostic>();
foreach (var proj in soln.Projects)
{
var analyzer = // Here I want to load the analyzer from the DLL present locally.
var compilation = await proj.GetCompilationAsync();
var compWithAnalyzer = compilation.WithAnalyzers(analyzer.GetAnalyzersForAllLanguages());
var res = compWithAnalyzer.GetAllDiagnosticsAsync().Result;
errors.AddRange(res.Where(r => r.Severity == DiagnosticSeverity.Error).ToList());
}
I have tried the following:
var analyzer = new AnalyzerFileReference("Path to DLL", new AnalyzerAssemblyLoader());
But AnalyzerAssemblyLoader gives an error because it is inaccessible due to its protection level (the class is internal).
Please suggest whether this can be done.
The .WithAnalyzers() method will allow you to pass an instance of an analyzer. If you're referencing the DLL locally, you can just create the analyzer like you would any other object and pass it to the compilation.
var analyzer = new MyAnalyzer();
var compilation = await proj.GetCompilationAsync();
var compWithAnalyzer = compilation.WithAnalyzers(ImmutableArray.Create<DiagnosticAnalyzer>(analyzer));
If you're not referencing the assembly, but want to load it at runtime, you can use the usual System.Reflection based methods to get an instance of the analyzer:
var assembly = Assembly.LoadFrom(@"<path to assembly>.dll");
var analyzers = assembly.GetTypes()
.Where(t => t.GetCustomAttribute<DiagnosticAnalyzerAttribute>() is object)
.Select(t => (DiagnosticAnalyzer) Activator.CreateInstance(t))
.ToArray();
compWithAnalyzers = compilation.WithAnalyzers(ImmutableArray.Create(analyzers));
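Alternatively, if you want to keep the AnalyzerFileReference approach from the question, the internal AnalyzerAssemblyLoader isn't required: the constructor accepts the public Microsoft.CodeAnalysis.IAnalyzerAssemblyLoader interface, so a small loader of your own should be enough. A minimal sketch (verify the interface members against your Roslyn version):
using System.Reflection;
using Microsoft.CodeAnalysis;

// Minimal loader; fine when the analyzer's dependencies are already resolvable.
class SimpleAnalyzerAssemblyLoader : IAnalyzerAssemblyLoader
{
    public void AddDependencyLocation(string fullPath) { }

    public Assembly LoadFromPath(string fullPath) => Assembly.LoadFrom(fullPath);
}

// Usage (path as in the question):
// var analyzerRef = new AnalyzerFileReference(@"Path to DLL", new SimpleAnalyzerAssemblyLoader());
// var compWithAnalyzer = compilation.WithAnalyzers(analyzerRef.GetAnalyzersForAllLanguages());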
Is there any 'easy' way to rename models in RavenDb when the database already has existing data? I have various models which were originally created in another language, and now I would like to rename them to English as the codebase is becoming quite unmaintainable. If I just rename them, then the data won't be loaded because the properties don't match anymore.
I would like the system to do this automatically on first load. Is there a best way to approach this? My solution would be (a rough sketch follows the list):
Check if a document exists to determine if the upgrade has been done or not
If upgrade has not been done, execute patch scripts to update fields
Update document to know that the upgrade has been done
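Here is a rough sketch of that marker-document flow, assuming the RavenDB 3.x client; the marker type, its ID, and the ExecutePatchScripts helper are hypothetical placeholders for your own patch code (for example, the UpdateByIndex patch in the answer below).
using System;
using Raven.Client;

public class MigrationMarker
{
    public string Id { get; set; }
    public DateTime CompletedUtc { get; set; }
}

public static class RenameMigration
{
    public static void RunIfNeeded(IDocumentStore store)
    {
        const string markerId = "Migrations/RenameModelsToEnglish"; // hypothetical document ID

        using (var session = store.OpenSession())
        {
            // 1. Check whether the upgrade has already been done.
            if (session.Load<MigrationMarker>(markerId) != null)
                return;
        }

        // 2. Execute the patch scripts that copy/rename the documents
        //    (e.g. an UpdateByIndex patch like the one shown below).
        ExecutePatchScripts(store);

        using (var session = store.OpenSession())
        {
            // 3. Record that the upgrade has been done.
            session.Store(new MigrationMarker { Id = markerId, CompletedUtc = DateTime.UtcNow });
            session.SaveChanges();
        }
    }

    static void ExecutePatchScripts(IDocumentStore store)
    {
        // Placeholder: your field-renaming / document-copying patches go here.
    }
}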
I'd recommend you create new documents from the old documents.
This can be done pretty easily using patching via docStore.DatabaseCommands.UpdateByIndex.
Suppose I had an old type name, Foo, and wanted to rename it to the new type name, Bar. And I wanted all the IDs to change from Foos/123 to Bars/123.
It would look something like this:
var patchScript = @"
// Copy all the properties from the old document
var newDoc = {};
for (var prop in this) {
if (prop !== '@metadata') {
newDoc[prop] = this[prop];
}
}
// Create the metadata.
var meta = {};
meta['Raven-Entity-Name'] = newCollection;
meta['Raven-Clr-Type'] = newType;
// Store the new document.
var newId = __document_id.replace(oldCollection, newCollection);
PutDocument(newId, newDoc, meta);
";
var oldCollection = "Foos";
var newCollection = "Bars";
var newType = "KarlCassar.Bar, KarlCassar"; // Where KarlCassar is your assembly name.
var query = new IndexQuery { Query = $"Tag:{oldCollection}" };
var options = new BulkOperationOptions { AllowStale = false };
var patch = new ScriptedPatchRequest
{
Script = patchScript,
Values = new Dictionary<string, object>
{
{ nameof(oldCollection), oldCollection },
{ nameof(newCollection), newCollection },
{ nameof(newType), newType }
}
};
var patchOperation = docStore.DatabaseCommands.UpdateByIndex("Raven/DocumentsByEntityName", query, patch, options);
patchOperation.WaitForCompletion();
Run that code once at startup, and then your app will be able to work with the newly named entities. Your old entities are still around - those can be safely deleted via the Studio.
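If you'd rather delete the old documents from code instead of the Studio, the same DatabaseCommands API should allow something along these lines (a sketch against the RavenDB 3.x client; check the exact overloads in your version):
// Remove everything left in the old collection once the copies have been verified.
var deleteOperation = docStore.DatabaseCommands.DeleteByIndex(
    "Raven/DocumentsByEntityName",
    new IndexQuery { Query = $"Tag:{oldCollection}" },
    new BulkOperationOptions { AllowStale = false });
deleteOperation.WaitForCompletion();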
What are the options for setting a project version with .NET Core / ASP.NET Core projects?
Found so far:
Set the version property in project.json. Source: DNX Overview, Working with DNX projects. This seems to set the AssemblyVersion, AssemblyFileVersion and AssemblyInformationalVersion unless overridden by an attribute (see next point).
Setting the AssemblyVersion, AssemblyFileVersion, AssemblyInformationalVersion attributes also seems to work and override the version property specified in project.json.
For example, including "version": "4.1.1-*" in project.json and setting [assembly:AssemblyFileVersion("4.3.5.0")] in a .cs file will result in AssemblyVersion=4.1.1.0, AssemblyInformationalVersion=4.1.1.0 and AssemblyFileVersion=4.3.5.0.
Is setting the version number via attributes, e.g. AssemblyFileVersion, still supported?
Have I missed something - are there other ways?
Context
The scenario I'm looking at is sharing a single version number between multiple related projects. Some of the projects are using .NET Core (project.json), others are using the full .NET Framework (.csproj). All are logically part of a single system and versioned together.
The strategy we used up until now is having a SharedAssemblyInfo.cs file at the root of our solution with the AssemblyVersion and AssemblyFileVersion attributes. The projects include a link to the file.
I'm looking for ways to achieve the same result with .NET Core projects, i.e. have a single file to modify.
You can create a Directory.Build.props file in the root/parent folder of your projects and set the version information there.
From the MSBuild documentation on Directory.Build.props: However, now you can add a new property to every project in one step by defining it in a single file called Directory.Build.props in the root folder that contains your source. When MSBuild runs, Microsoft.Common.props searches your directory structure for the Directory.Build.props file (and Microsoft.Common.targets looks for Directory.Build.targets). If it finds one, it imports the property. Directory.Build.props is a user-defined file that provides customizations to projects under a directory.
For example:
<Project>
<PropertyGroup>
<Version>0.0.0.0</Version>
<FileVersion>0.0.0.0</FileVersion>
<InformationalVersion>0.0.0.0.myversion</InformationalVersion>
</PropertyGroup>
</Project>
Another option for setting version info when calling build or publish is to use the /p option to pass MSBuild properties.
The dotnet command passes these flags through to MSBuild internally.
Example:
dotnet publish ./MyProject.csproj /p:Version="1.2.3" /p:InformationalVersion="1.2.3-qa"
See here for more information: https://github.com/dotnet/docs/issues/7568
Not sure if this helps, but you can set version suffixes at publish time. Our versions are usually datetime-driven, so that developers don't have to remember to update them.
If your project.json has something like "version": "1.0-*",
then "dotnet publish --version-suffix 2016.01.02" will make it "1.0-2016.01.02".
It's important to stick to the "semver" standard, or else you'll get errors; dotnet publish will tell you.
Why not just change the value in the project.json file? Using Cake you could do something like this (optimizations are probably possible):
Task("Bump").Does(() => {
var files = GetFiles(config.SrcDir + "**/project.json");
foreach(var file in files)
{
Information("Processing: {0}", file);
var path = file.ToString();
var trg = new StringBuilder();
var regExVersion = new System.Text.RegularExpressions.Regex("\"version\":(\\s)?\"0.0.0-\\*\",");
using (var src = System.IO.File.OpenRead(path))
{
using (var reader = new StreamReader(src))
{
while (!reader.EndOfStream)
{
var line = reader.ReadLine();
if(line == null)
continue;
line = regExVersion.Replace(line, string.Format("\"version\": \"{0}\",", config.SemVer));
trg.AppendLine(line);
}
}
}
System.IO.File.WriteAllText(path, trg.ToString());
}
});
Then if you have e.g. a UnitTest project that takes a dependency on the project, use "*" for dependency resolution.
Also, do the bump before doing dotnet restore. My order is as follows:
Task("Default")
.IsDependentOn("InitOutDir")
.IsDependentOn("Bump")
.IsDependentOn("Restore")
.IsDependentOn("Build")
.IsDependentOn("UnitTest");
Task("CI")
.IsDependentOn("Default")
.IsDependentOn("Pack");
Link to full build script: https://github.com/danielwertheim/Ensure.That/blob/3a278f05d940d9994f0fde9266c6f2c41900a884/build.cake
The actual values, e.g. the version, come from importing a separate buildconfig.cake file in the build script:
#load "./buildconfig.cake"
var config = BuildConfig.Create(Context, BuildSystem);
The config file looks like this (taken from https://github.com/danielwertheim/Ensure.That/blob/3a278f05d940d9994f0fde9266c6f2c41900a884/buildconfig.cake):
public class BuildConfig
{
private const string Version = "5.0.0";
public readonly string SrcDir = "./src/";
public readonly string OutDir = "./build/";
public string Target { get; private set; }
public string Branch { get; private set; }
public string SemVer { get; private set; }
public string BuildProfile { get; private set; }
public bool IsTeamCityBuild { get; private set; }
public static BuildConfig Create(
ICakeContext context,
BuildSystem buildSystem)
{
if (context == null)
throw new ArgumentNullException("context");
var target = context.Argument("target", "Default");
var branch = context.Argument("branch", string.Empty);
var branchIsRelease = branch.ToLower() == "release";
var buildRevision = context.Argument("buildrevision", "0");
return new BuildConfig
{
Target = target,
Branch = branch,
SemVer = Version + (branchIsRelease ? string.Empty : "-b" + buildRevision),
BuildProfile = context.Argument("configuration", "Release"),
IsTeamCityBuild = buildSystem.TeamCity.IsRunningOnTeamCity
};
}
}
If you still want to have the solution-level SharedVersionInfo.cs, you can do it by adding these lines to your project.json file:
"buildOptions": {
"compile": {
"includeFiles": [
"../../SharedVersionInfo.cs"
]
}
}
Your relative path may vary, of course.
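For reference, such a SharedVersionInfo.cs is just the usual assembly-level attributes, for example:
using System.Reflection;

[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
[assembly: AssemblyInformationalVersion("1.0.0")]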
Use an external version.txt file containing the version, and a prebuild step to publish this version to the projects.
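A minimal sketch of what such a prebuild step could look like, assuming a version.txt at the repository root and the linked SharedVersionInfo.cs approach above (file names, paths, and the four-part numeric version format are assumptions):
using System;
using System.IO;

// Prebuild step: read the single source of truth (version.txt) and regenerate
// the shared version file that every project links.
class GenerateVersionInfo
{
    static void Main(string[] args)
    {
        var root = args.Length > 0 ? args[0] : Directory.GetCurrentDirectory();

        // Assumes version.txt holds a four-part numeric version (e.g. 1.2.3.4),
        // since AssemblyVersion does not accept semver suffixes.
        var version = File.ReadAllText(Path.Combine(root, "version.txt")).Trim();

        var contents =
            "using System.Reflection;\n\n" +
            $"[assembly: AssemblyVersion(\"{version}\")]\n" +
            $"[assembly: AssemblyFileVersion(\"{version}\")]\n" +
            $"[assembly: AssemblyInformationalVersion(\"{version}\")]\n";

        File.WriteAllText(Path.Combine(root, "SharedVersionInfo.cs"), contents);
    }
}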
RavenDB throws InvalidOperationException when IsOperationAllowedOnDocument is called using embedded mode.
I can see in the IsOperationAllowedOnDocument implementation a clause checking for calls in embedded mode.
namespace Raven.Client.Authorization
{
public static class AuthorizationClientExtensions
{
public static OperationAllowedResult[] IsOperationAllowedOnDocument(this ISyncAdvancedSessionOperation session, string userId, string operation, params string[] documentIds)
{
var serverClient = session.DatabaseCommands as ServerClient;
if (serverClient == null)
throw new InvalidOperationException("Cannot get whatever operation is allowed on document in embedded mode.");
Is there a workaround for this other than not using embedded mode?
Thanks for your time.
I encountered the same situation while writing some unit tests. The solution James provided worked; however, it resulted in having one code path for the unit test and another path for the production code, which defeated the purpose of the unit test. We were able to create a second document store and connect it to the first document store which allowed us to then access the authorization extension methods successfully. While this solution would probably not be good for production code (because creating Document Stores is expensive) it works nicely for unit tests. Here is a code sample:
using (var documentStore = new EmbeddableDocumentStore
{
    RunInMemory = true,
    UseEmbeddedHttpServer = true,
    Configuration = { Port = EmbeddedModePort }
})
{
documentStore.Initialize();
var url = documentStore.Configuration.ServerUrl;
using (var docStoreHttp = new DocumentStore {Url = url})
{
docStoreHttp.Initialize();
using (var session = docStoreHttp.OpenSession())
{
// now you can run code like:
// session.GetAuthorizationFor(),
// session.SetAuthorizationFor(),
// session.Advanced.IsOperationAllowedOnDocument(),
// etc...
}
}
}
There are a couple of other items that should be mentioned:
The first document store needs to be run with UseEmbeddedHttpServer set to true so that the second one can access it.
I created a constant for the port so it would be used consistently and to ensure use of a non-reserved port (a one-line example follows).
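For completeness, the port constant used in the sample above isn't shown there; it can simply be a shared constant in the test project (8079 is only an example of an unreserved port):
private const int EmbeddedModePort = 8079; // example value; pick any free, non-reserved port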
I encountered this as well. Looking at the source, there's no way to do that operation as written. I'm not sure if there's some intrinsic reason why, since I could easily replicate the functionality in my app by making an HTTP request directly for the same info:
HttpClient http = new HttpClient();
http.BaseAddress = new Uri("http://localhost:8080");
var url = new StringBuilder("/authorization/IsAllowed/")
.Append(Uri.EscapeUriString(userid))
.Append("?operation=")
.Append(Uri.EscapeUriString(operation))
.Append("&id=").Append(Uri.EscapeUriString(entityid));
http.GetStringAsync(url.ToString()).ContinueWith((response) =>
{
var results = _session.Advanced.DocumentStore.Conventions.CreateSerializer()
.Deserialize<OperationAllowedResult[]>(
new RavenJTokenReader(RavenJToken.Parse(response.Result)));
}).Wait();
So I'm building a tool that creates a ContentProject on the fly and then builds it (so that it outputs XNBs).
At the moment the problem is that files which are not to be compiled should be copied into the output directory if marked with CopyToOutputDirectory = 'Always' or 'PreserveNewest'. If we were looking at the .contentproj, the section for a file that shouldn't be built but should be copied would look like this:
<ItemGroup>
<None Include="MyFile.file">
<Name>level1</Name>
<Importer>XmlImporter</Importer>
<Processor>PassThroughProcessor</Processor>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
</ItemGroup>
However, I'm building the content project on the fly, so I need the following code to create the project and add the items:
var projectPath = Path.Combine(buildDirectory, "content.contentproj");
_projectRootElement = ProjectRootElement.Create(projectPath);
_projectRootElement.AddImport("$(MSBuildExtensionsPath)\\Microsoft\\XNA Game Studio\\v4.0\\Microsoft.Xna.GameStudio.ContentPipeline.targets");
_contentProject = new Project(_projectRootElement);
_contentProject.SetProperty("XnaPlatform", "Windows");
_contentProject.SetProperty("XnaProfile", "HiDef");
_contentProject.SetProperty("XnaFrameworkVersion", "v4.0");
_contentProject.SetProperty("Configuration", "Debug");
_contentProject.SetProperty("OutputPath", _outputDirectory);
// Register any custom importers or processors.
foreach (string pipelineAssembly in PipelineAssemblies)
{
_contentProject.AddItem("Reference", pipelineAssembly);
}
// Hook up our custom error logger.
_errorLogger = new ErrorLogger();
_buildParameters = new BuildParameters(ProjectCollection.GlobalProjectCollection)
{
Loggers = new ILogger[] { _errorLogger, }
};
// ... removed code that is not required. The following code adds each item to the project. For items that shouldn't compile, I'm using None (as in the XML from the project above).
var itemType = compile ? "Compile" : "None";
var items = _contentProject.AddItem(itemType, filename);
var item = items.SingleOrDefault(x => x.EvaluatedInclude.Equals(filename, StringComparison.InvariantCultureIgnoreCase));
item.SetMetadataValue("Link", Path.GetFileName(filename));
item.SetMetadataValue("Name", Path.GetFileNameWithoutExtension(filename));
if (!compile)
item.SetMetadataValue("CopyToOutputDirectory", "Always");
Finally, the build code:
BuildManager.DefaultBuildManager.BeginBuild(_buildParameters);
var request = new BuildRequestData(_contentProject.CreateProjectInstance(), new string[0]);
var submission = BuildManager.DefaultBuildManager.PendBuildRequest(request);
var execute = Task.Factory.StartNew(() => submission.ExecuteAsync(null, null), cancellationTokenSource.Token);
var endBuild = execute.ContinueWith(ant => BuildManager.DefaultBuildManager.EndBuild());
endBuild.Wait();
In BuildRequestData, the empty array is where the targets go... I've been going through many different targets that do different things, from building without really outputting the files to copying dependency DLLs into the main folder, but none of them do what I need.
Cheers