How to include proto files from one project in another project - asp.net-core

I am not able to include the proto files from project A in project B. The idea is to have the protos compiled with GrpcServices="Server" in project A and, in project B (the test project), include the same protos, but this time as GrpcServices="Client".
ProjectA/Protos/Profile.proto
syntax = "proto3";
package profile;
option csharp_namespace = "ProjectA.Protos";
import "google/protobuf/empty.proto";
service ProfileService {
rpc Get(google.protobuf.Empty) returns (Profile);
}
message Profile {
string profile_id = 1;
string description = 2;
}
ProjectA/Protos/User.proto
syntax = "proto3";
package user;
option csharp_namespace = "ProjectA.Protos";
import "google/protobuf/wrappers.proto";
import "Protos/Profile.proto";
service UserService {
rpc Get(google.protobuf.StringValue) returns (UserDetail);
}
message UserDetail {
string id = 1;
string name = 2;
repeated profile .Profile profiles = 7;
}
Project B .csproj (The test project)
<ItemGroup>
    <Protobuf Include="..\ProjectA\Protos\*.proto" GrpcServices="Client" ProtoRoot="Protos">
        <Link>Protos\*.proto</Link>
    </Protobuf>
</ItemGroup>
With these settings I always end up with this error:
error : File does not reside within any path specified using --proto_path (or -I). You must specify a --proto_path which encompasses this file. Note that the proto_path must be an exact prefix of the .proto file names -- protoc is too dumb to figure out when two paths (e.g. absolute and relative) are equivalent (it's harder than you think).

Following up with the solution I applied.
The problem with importing the same protos from two projects in one solution is that each <Protobuf> inclusion in a .csproj generates C# classes, and these classes are generated in a "global context". That prevents generating the "server" classes in one project and the "client" classes in another, because the two sets would have the same names.
The easiest solution was to set GrpcServices="Both" on the proto include. The generated C# classes then cover both the client and the server side, so they can be used both in project A (the server) and in project B (the client).
<Protobuf Include="Protos\*.proto" GrpcServices="Both" />
The solution suggested by Jan Tattermusch in the comments, a separate project C containing only the proto files and using the same configuration, is also a great alternative.
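For illustration, a minimal sketch of that shared-project variant, with hypothetical names (ProjectC and the paths are assumptions, not from the question): project C compiles the protos once with GrpcServices="Both", and the other projects reference it instead of compiling the protos themselves.
<!-- ProjectC.csproj: the shared proto project (needs a Grpc.Tools package reference) -->
<ItemGroup>
    <Protobuf Include="Protos\*.proto" GrpcServices="Both" />
</ItemGroup>
<!-- ProjectA.csproj and ProjectB.csproj: consume the generated classes -->
<ItemGroup>
    <ProjectReference Include="..\ProjectC\ProjectC.csproj" />
</ItemGroup>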

Related

How to fix third party dll include not being staged correctly by Unreal Build Tool?

I am using a pre-built C++ library in my Unreal project using a dynamic library file (let's say it's called MyPluginLib.dll). The library is contained in a plugin, let’s call it MyPlugin.
Building, packaging, playing in editor works fine. However, a packaged build doesn’t start, giving the following error: Code execution cannot proceed, MyPluginLib.dll was not found.
The packaging process places MyPluginLib.dll file in MyGame\Plugins\MyPlugin\Binaries. However, the execution process is seemingly looking for it in MyGame\Binaries – moving the library there manually solves this issue.
Why is the OS unable to find the dll in the first folder? Is there something wrong in the build.cs, or my folder structure?
The folder structure of the plugin folder is as follows:
Includes in Plugins\MyPlugin\Source\ThirdParty\MyPluginLib\
Binaries in Plugins\MyPlugin\Binaries\(PLATFORM)\
The plugin’s Build.cs looks like this:
using System.IO;
using UnrealBuildTool;

public class MyPlugin : ModuleRules
{
    public MyPlugin(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = ModuleRules.PCHUsageMode.UseExplicitOrSharedPCHs;

        string PluginRoot = Path.GetFullPath(Path.Combine(ModuleDirectory, "..", ".."));
        string PlatformString = Target.Platform.ToString();
        string LibraryDirectory = Path.Combine(PluginRoot, "Binaries", PlatformString);

        PublicIncludePaths.Add(Path.Combine(PluginRoot, "Source", "ThirdParty", "MyPluginLib"));

        if (Target.Platform == UnrealTargetPlatform.Win64)
        {
            PublicAdditionalLibraries.Add(Path.Combine(LibraryDirectory, "MyPluginLib.lib"));
            RuntimeDependencies.Add(Path.Combine(LibraryDirectory, "MyPluginLib.dll"), StagedFileType.NonUFS);
        }
        else if (Target.Platform == UnrealTargetPlatform.Linux)
        {
            // linux binaries...
        }
    }
}
Would appreciate any help.
Check your packaged game's files; Unreal loves to leave certain plugin files out of packaged builds.
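One direction worth trying (a sketch, not a verified fix for this exact setup): stage the DLL next to the game executable rather than into the plugin's Binaries folder, since that is where the Windows loader looks by default. RuntimeDependencies.Add accepts a separate target path, and the $(BinaryOutputDir) variable expands to the directory containing the built executable.
if (Target.Platform == UnrealTargetPlatform.Win64)
{
    PublicAdditionalLibraries.Add(Path.Combine(LibraryDirectory, "MyPluginLib.lib"));

    // Copy the DLL into the same folder as the game binary at staging time,
    // so the packaged build finds it without manual copying.
    RuntimeDependencies.Add("$(BinaryOutputDir)/MyPluginLib.dll",
        Path.Combine(LibraryDirectory, "MyPluginLib.dll"));
}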

Avro generated class: Cannot access class 'Builder'. Check your module classpath for missing or conflicting dependencies

Running
val myAvroObject = MyAvroObject.newBuilder()
results in a compilation error:
Cannot access class 'MyAvroObject.Builder'. Check your module classpath for missing or conflicting dependencies
I am able to access other MyAvroObject variables. More precisely, methods such as
val schema = MyAvroObject.getClassSchema()
val decoder = MyAvroObject.getDecoder()
compile fine. What makes it even stranger is that I can access newBuilder() in my test/ folder, but not in my src/ folder.
Why do I get a compile error when using newBuilder()? Is the namespace of the avro-schema used to generate MyAvroObject of importance?
"Check your module classpath" generally means that your dependencies (which you didn't provide) are messed up. One of them should probably read implementation instead of testImplementation, so that the generated classes are available in the main source set instead of only the test source set. It may also have to do with the input classes, the output location of the generated classes, or annotations like @VisibleForTesting (just check what actually gets generated). gradlew can also list the dependencies per configuration. In this case there's only avro-1.11.0.jar and avro-tools-1.11.0.jar on the classpath. With the "builder" design pattern, newBuilder() returns the nested class MyAvroObject.Builder, which is why that class in particular must be visible from the main source set.
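As an illustration (the coordinates are an example based on the avro-1.11.0.jar mentioned above, not taken from the build in question), the fix in a Gradle build script would be moving the dependency from the test configuration to the main one:
dependencies {
    // Was: testImplementation("org.apache.avro:avro:1.11.0")
    // implementation exposes the classes to src/ as well as test/.
    implementation("org.apache.avro:avro:1.11.0")
}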
Had the same problem today and was able to solve it by adding the following additional source folder
<sourceDir>${project.basedir}/target/generated-sources/avro</sourceDir>
to the kotlin-maven-plugin.
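For context, a sketch of where that line sits, assuming the standard kotlin-maven-plugin compile execution (the surrounding POM is abbreviated and the src/main/kotlin entry is an assumption):
<plugin>
    <groupId>org.jetbrains.kotlin</groupId>
    <artifactId>kotlin-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>compile</id>
            <goals><goal>compile</goal></goals>
            <configuration>
                <sourceDirs>
                    <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
                    <!-- Avro-generated sources, so the Kotlin compiler can see Builder -->
                    <sourceDir>${project.basedir}/target/generated-sources/avro</sourceDir>
                </sourceDirs>
            </configuration>
        </execution>
    </executions>
</plugin>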

How to disable default gradle buildType suffix (-release, -debug)

I migrated a 3rd-party tool's build.gradle configs, so it now uses Android Gradle Plugin 3.5.3 and Gradle 5.4.1.
The build goes smoothly, but when I try to make an .aab archive, things break because the toolchain expects the output .aab file to be named MyApplicationId.aab, while the new Gradle defaults to outputting MyApplicationId-release.aab, with a buildType suffix that wasn't there before.
I tried to search for a solution, but the documentation about product flavors is mostly about adding a suffix. How do I prevent the default "-release" suffix from being added? There weren't any product flavor blocks in the toolchain's gradle config files.
I realized that I had to create custom tasks after reading other questions and answers:
How to change the generated filename for App Bundles with Gradle?
Renaming applicationVariants.outputs' outputFileName does not work because those are for .apks.
I'm using Gradle 5.4.1 so my Copy task syntax reference is here.
I don't quite understand where the "app.aab" name string came from, so I defined my own aabFile name string to match my toolchain's output.
I don't care about the source file so it's not deleted by another delete task.
Also my toolchain seems to be removing unknown variables surrounded by "${}" so I had to work around ${buildDir} and ${flavor} by omitting the brackets and using concatenation for proper delimiting.
tasks.whenTaskAdded { task ->
    if (task.name.startsWith("bundle")) { // e.g. bundleRelease
        def renameTaskName = "rename${task.name.capitalize()}Aab" // renameBundleReleaseAab
        def flavorSuffix = task.name.substring("bundle".length()).uncapitalize() // "release"
        tasks.create(renameTaskName, Copy) {
            def path = "$buildDir/outputs/bundle/" + "$flavorSuffix/"
            def aabFile = "${android.defaultConfig.applicationId}-" + "$flavorSuffix" + ".aab"
            from(path) {
                include aabFile
                rename aabFile, "${android.defaultConfig.applicationId}.aab"
            }
            into path
        }
        task.finalizedBy(renameTaskName)
    }
}
As the original answer said: This will add more tasks than necessary, but those tasks will be skipped since they don't match any folder.
e.g.
Task :app:renameBundleReleaseResourcesAab NO-SOURCE

Programmatically having ivy fetch sources

We have a custom build tool which depends on ivy functionality to resolve dependencies. The configuration of the dependencies is not an ivy.xml file, but a custom configuration that allows for... well, irrelevant. The key is that we're using ivy programmatically.
Given a dependency (group id, artifact id, version), we create a ModuleRevisionId:
ModuleRevisionId id = ModuleRevisionId.newInstance(orgName, moduleName, revisionName);
followed by a ModuleDescriptor. This is, I'm guessing, where I'm not convincing enough to inform ivy that I want both the target library jar file as well as the sources. I'm just not sure what a DependencyConfiguration is vs. just a 'configuration' when creating a ModuleDescriptor.
DefaultModuleDescriptor md =
    new DefaultModuleDescriptor(
        ModuleRevisionId.parse("org#standalone;working"),
        "integration",
        new java.util.Date());
DefaultDependencyDescriptor mainDep =
    new DefaultDependencyDescriptor(id, /* force = */ true);
mainDep.addDependencyConfiguration("compile", "compile");
mainDep.addDependencyConfiguration("compile", "sources");
md.addDependency(mainDep);
md.addConfiguration(new Configuration("compile"));
md.addConfiguration(new Configuration("sources"));
Nor do I really understand the above vs. RetrieveOptions vs. ResolveOptions.
I need a drink.
Ok, so it took a while, but I finally wrapped my head around some of this.
// define 'our' module
DefaultModuleDescriptor md =
    new DefaultModuleDescriptor(
        ModuleRevisionId.parse("org#standalone;working"),
        /* status = */ "integration",
        new java.util.Date());

// add a configuration to our module definition
md.addConfiguration(new Configuration("compile"));

// define a dependency our module has on the (third party, typically) dependee module
DefaultDependencyDescriptor mainDep =
    new DefaultDependencyDescriptor(md, dependeeModuleId, /* force = */ true, false, true);
mainDep.addDependencyConfiguration("compile", "default");
mainDep.addDependencyConfiguration("compile", "sources");

// define which configurations we want to resolve (we only have one in this case anyway)
ResolveOptions resolveOptions = new ResolveOptions();
String[] confs = new String[] {"compile"};
resolveOptions.setConfs(confs);
resolveOptions.setTransitive(true); // default anyway
resolveOptions.setDownload(true);   // default anyway

ResolveReport report = ivy.resolve(md, resolveOptions);
This pulls down both the default jar and the sources artifact. Note that ivy has an issue where it won't transitively pull sources, though it will transitively pull 'main' jars, so you only get the sources for the immediate dependency defined here, not for its sub-dependencies.
One other weakness I'm trying to figure out is this assumes the target dependency has a 'sources' configuration. I'd rather tell it to get any artifacts of type sources/source/src. Haven't figured that one out yet.
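One possible direction for that last point (an untested sketch; the artifact type, extension, and the m:classifier extra attribute are assumptions that depend on your resolver setup) is to request the artifact explicitly on the dependency descriptor instead of relying on a 'sources' configuration:
// Ask for the source artifact by type/classifier rather than by configuration.
java.util.Map<String, String> extra =
    java.util.Collections.singletonMap("m:classifier", "sources");
DefaultDependencyArtifactDescriptor srcArtifact =
    new DefaultDependencyArtifactDescriptor(
        mainDep, dependeeModuleId.getName(), "source", "jar", null, extra);
mainDep.addDependencyArtifact("compile", srcArtifact);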

How to recursively parse xsd files to generate a list of included schemas for incremental build in Maven?

I have a Maven project that uses the jaxb2-maven-plugin to compile some xsd files. It uses the staleFile to determine whether or not any of the referenced schemaFiles have been changed. Unfortunately, the xsd files in question use <xs:include schemaLocation="../relative/path.xsd"/> tags to include other schema files that are not listed in the schemaFile argument so the staleFile calculation in the plugin doesn't accurately detect when things need to be actually recompiled. This winds up breaking incremental builds as the included schemas evolve.
Obviously, one solution would be to list all the recursively referenced files in the execution's schemaFile. However, there are going to be cases where developers don't do this and break the build. I'd like instead to automate the generation of this list in some way.
One approach that comes to mind would be to somehow parse the top-level XSD files and then either sets a property or outputs a file that I can then pass into the schemaFile parameter or schemaFiles parameter. The Groovy gmaven plugin seems like it might be a natural way to embed that functionality right into the POM. But I'm not familiar enough with Groovy to get started.
Can anyone provide some sample code? Or offer an alternative implementation/solution?
Thanks!
Not sure how you'd integrate it into your Maven build -- Maven isn't really my thing :-(
However, if you have the path to an xsd file, you should be able to get the files it references by doing something like:
def rootXsd = new File( 'path/to/xsd' )
def refs = new XmlSlurper().parse( rootXsd ).depthFirst().findAll { it.name()=='include' }.@schemaLocation*.text()
println "$rootXsd references $refs"
So refs is a list of Strings which should be the paths to the included xsds
Based on tim_yates's answer, the following is a workable solution, which you may have to customize based on how you are configuring the jaxb2 plugin.
Configure a gmaven-plugin execution early in the lifecycle (e.g., in the initialize phase) that runs with the following configuration...
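As a sketch of that wiring (the plugin coordinates are for GMaven 1.x; the execution id and phase are assumptions), with the Groovy from the snippets below going inside the source element:
<plugin>
    <groupId>org.codehaus.gmaven</groupId>
    <artifactId>gmaven-plugin</artifactId>
    <executions>
        <execution>
            <id>xsd-stale-check</id>
            <phase>initialize</phase>
            <goals><goal>execute</goal></goals>
            <configuration>
                <source>
                    <!-- script assembled from the snippets below -->
                </source>
            </configuration>
        </execution>
    </executions>
</plugin>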
Start with a function to collect File objects of referenced schemas (this is a refinement of Tim's answer):
def findRefs = { f ->
    def relPaths = new XmlSlurper().parse(f).depthFirst().findAll {
        it.name() == 'include'
    }*.@schemaLocation*.text()
    relPaths.collect { new File(f.absoluteFile.parent + "/" + it).canonicalFile }
}
Wrap that in a function that iterates on the results until all children are found:
def recursiveFindRefs = { schemaFiles ->
    def outputs = [] as Set
    def inputs = schemaFiles as Queue
    def xsd
    // Breadth-first: examine all refs in all schema files
    while ((xsd = inputs.poll()) != null) {
        outputs << xsd
        findRefs(xsd).each {
            if (!outputs.contains(it)) inputs.add(it)
        }
    }
    outputs
}
The real magic then comes in when you parse the Maven project to determine what to do.
First, find the JAXB plugin:
jaxb = project.build.plugins.find { it.artifactId == 'jaxb2-maven-plugin' }
Then, parse each execution of that plugin (if you have multiple). The code assumes that each execution sets schemaDirectory, schemaFiles and staleFile (i.e., does not use the defaults!) and that you are not using schemaListFileName:
jaxb.executions.each { ex ->
    log.info("Processing jaxb execution $ex")
    // Extract the schema locations; the configuration is an Xpp3Dom
    ex.configuration.children.each { conf ->
        switch (conf.name) {
            case "schemaDirectory":
                schemaDirectory = conf.value
                break
            case "schemaFiles":
                schemaFiles = conf.value.split(/,\s*/)
                break
            case "staleFile":
                staleFile = conf.value
                break
        }
    }
Finally, we can open the schema files and parse them using the functions defined earlier:
    def schemaHandles = schemaFiles.collect { new File("${project.basedir}/${schemaDirectory}", it) }
    def allSchemaHandles = recursiveFindRefs(schemaHandles)
...and compare their last modified times against the stale file's modification time, unlinking the stale file if necessary.
    def maxLastModified = allSchemaHandles.collect {
        it.lastModified()
    }.max()
    def staleHandle = new File(staleFile)
    if (staleHandle.lastModified() < maxLastModified) {
        log.info(" New schemas detected; unlinking $staleFile.")
        staleHandle.delete()
    }
}