org.apache.avro.SchemaParseException: Can't redefine: org.apache.avro.reflect.Pair620b9c15f622a7 - serialization

I have a class as given below
class A {
    private Map<Long, Set<Long>> x;
    private Map<Long, Set<Long>> y;
}
When Avro creates the schema via reflection, it generates the same name, Pair620b9c15f622a7, for both map fields, and hence I get the exception from:
Schema s = ReflectData.get().getSchema(A.class);
I am not sure why I am getting this error, since the field names are completely different.
One solution might be to explicitly define the names of x and y in the schema using the @AvroSchema annotation, but that is a very manual task and I would have to do it for every such mapping in my code.

This is pretty similar to what I just answered here: Generate Avro file based on java file. Try using the Kite SDK util class.


Sulu: Entity has no field or association error

I'm following Sulu example here: https://github.com/sulu/sulu-workshop/
trying to set translations for custom entity type.
My entity file has a getter for the field "home_team" defined like this:
/**
 * @Serializer\VirtualProperty(name="home_team")
 */
public function getHomeTeam(): ?string
{
    $translation = $this->getTranslation($this->locale);
    if (!$translation) {
        return null;
    }

    return $translation->getHomeTeam();
}
So the field is not actually part of that entity but of its translation entity, since it is supposed to be translatable.
When I try to create a new object of that entity type, it works well. I can see in the database that the field values are stored correctly and I don't get any error.
But on the overview page, instead of a list of all objects, I get this error:
[Semantical Error] line 0, col 73 near 'home_team AS': Error: Class App\Entity\MatchEvent has no field or association named home_team
Any idea what could be wrong here?
If you want to see the translation in the list view, you have to create a real translation entity, like in the workshop project. This post already explains how to translate a custom entity correctly.
If you have already created your translation entity, you have to configure the relation of the translation to your main entity via a join. Here is an example of this configuration in the workshop.
Sulu uses optimised queries to build the list objects directly from the database, so for performance reasons the entity itself never gets hydrated or serialised. Thus your VirtualProperty is never executed.

RepoDb cannot find mapping configuration

I'm trying to use RepoDb to query the contents of a table (in an existing SQL Server database), but all my attempts result in an InvalidOperationException (There are no 'constructor parameter' and/or 'property member' bindings found between the resultset of the data reader and the type 'MyType').
The query I'm using looks like the following:
public async Task<ICollection<MyType>> GetAllAsync()
{
    var result = new List<MyType>();
    using (var db = new SqlConnection(connectionString).EnsureOpen())
    {
        result = (await db.ExecuteQueryAsync<MyType>("select * from mytype")).ToList();
    }
    return result;
}
I'm trying to run this via a unit test, similar to the following:
[Test]
public async Task MyTypeFetcher_returns_all()
{
    SqlServerBootstrap.Initialize();
    var sut = new MyTypeFetcher("connection string");

    var actual = await sut.GetAllAsync();

    Assert.IsNotNull(actual);
}
The Entity I'm trying to map to matches the database table (i.e. class name and table name are the same, property names and table column names also match).
I've also tried:
putting annotations on the class I am trying to map to (both at the class level and the property level)
using the ClassMapper to map the class to the db table
using the FluentMapper to map the entire class (i.e. entity-table, all columns, identity, primary); a sketch of this attempt is shown below the list
putting all mappings into a static class which holds all mapping and configuration and calling that in the test
providing mapping information directly in the test via both ClassMapper and FluentMapper
From the error message it seems like RepoDb cannot find the mappings I'm providing. Unfortunately I have no idea how to go about fixing this. I've been through the documentation and the sample tutorials, but I haven't been able to find anything of use. Most of them don't seem to need any mapping configuration (similar to what you would expect when using Dapper). What am I missing, and how can I fix this?
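For reference, the FluentMapper attempt looked roughly like the sketch below; the table, column, and property names are just illustrative placeholders, and I'm reproducing the fluent calls from memory:
// Rough sketch of the fluent mapping I tried (placeholder names).
FluentMapper
    .Entity<MyType>()
    .Table("[dbo].[MyType]")
    .Primary(e => e.Id)
    .Identity(e => e.Id)
    .Column(e => e.Name, "[Name]");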

Using Orika in place of Spring Data Commons

There are several Spring Data projects, like Neo4j, that use Spring Data Commons to build up a PersistentEntity/PersistentProperty (basically type info plus property getters and setters) and an EntityConverter to roll from a native store to Java. This is what SDN (Spring Data Neo4j) does, plus it bundles BeanWrapper converters to make sure that certain property types are allowed for the Neo4j data structure.
Basically, Java beans are stamped with a @NodeEntity annotation and the beans are decomposed on writes into nodes (think of a bean with only simple properties) interlinked by relationship objects.
I'm wondering if I can do the same with Orika? That means identifying classes via an annotation and recursively processing each property when it is complex. For example:
@NodeEntity
class Software {
    String name;
    ....
    Organisation organisation;
    ....
}

@NodeEntity
class Organisation {
    String name;
}
These should be rolled into two nodes, each containing the name property, plus a relationship object (denoting Organisation as a member of Software).
Here is an example of an Orika ClassMapBuilder supporting custom annotations; I think you can adapt it to meet your needs.
Gist: AnnotationClassMapBuilder
For Node (or the DBObject of MongoDB) you can use a custom property resolver; take a look at:
http://orika-mapper.github.com/orika-docs/advanced-mappings.html (ElementPropertyResolver)
Edit
Orika builds mappers from class-maps, which are really just collections of property pairs; a property can be anything that has a name, a type, and a setter and/or getter.
You can automatically create, for each attribute in your beans, an equivalent on the Neo4j side, and let Orika build the mapper.
For example, you can create a Person(name) -> PrintStream mapper, in which for each of the person's properties (name) you create an equivalent that prints the data (to System.out).
Example
final Builder name = new Property.Builder()
    .name("name")
    .type(String.class.getName())
    .setter("append(\"My name : \").append(%s).append('\\n')");

factory.classMap(Person.class, PrintStream.class).fieldMap("name", name, false).add().register();
factory.getMapperFacade().map(person, System.out); // This prints "My name : xxxx" to the default output stream.

NHibernate: How to get mapped values?

Suppose I have a class Customer that is mapped to the database and everything is a-ok.
Now suppose that I want to retrieve - in my application - the column name that NH knows Customer.FirstName maps to.
How would I do this?
You can access the database field name through NHibernate.Cfg.Configuration:
// cfg is NHibernate.Cfg.Configuration
// You will have to provide the complete namespace for Customer
var persistentClass = cfg.GetClassMapping(typeof(Customer));
var property = persistentClass.GetProperty("FirstName");
var columnIterator = property.ColumnIterator;
The ColumnIterator property returns IEnumerable<NHibernate.Mapping.ISelectable>. In almost all cases a property is mapped to a single column, so the column name can be found using property.ColumnIterator.ElementAt(0).Text.
I'm not aware that that's doable.
I believe your best bet would be to use .xml files to do the mapping, package them together with the application and read the contents at runtime. I am not aware of an API which allows you to query hibernate annotations (pardon the Java lingo) at runtime, and that's what you would need.
Update:
Judging by Jamie's solution, NHibernate and Hibernate have different APIs, because the Hibernate org.hibernate.Hibernate class provides no way to access a "configuration" property.

NHibernate - How do I change schemas during run time?

I'm using NHibernate to connect to an ERP database on our DB2 server. We have a test schema and a production schema. Both schemas have the same table structure underneath. For testing, I would like to use the same mapping classes but point NHibernate to the test environment when needed and then back when in production. Please keep in mind that we have many production schemas and each production schema has an equivalent test schema.
I know that my XML mapping file has a schema property inside it, but since it's in XML, it's not like I can change it via a compiler directive or change the schema property based on a config file.
Any ideas?
Thank You.
No need to specify schema in the mappings: there's a SessionFactory-level setting called default_schema. However, you can't change it at runtime, as NHibernate pregenerates and/or caches SQL queries, including the schema part.
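To make the setting concrete, here is a minimal sketch of choosing the schema at configuration time, before the session factory is built; the useTestEnvironment flag and the schema names are placeholders:
// Sketch: pick the schema once, before BuildSessionFactory(); it cannot be changed afterwards.
var cfg = new NHibernate.Cfg.Configuration().Configure();
cfg.SetProperty(NHibernate.Cfg.Environment.DefaultSchema,
                useTestEnvironment ? "TESTSCHEMA" : "PRODSCHEMA"); // placeholder schema names
var sessionFactory = cfg.BuildSessionFactory();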
To get what I wanted, I had to use NHibernate.Mapping.Attributes.
[NHibernate.Mapping.Attributes.Class(0, Table = "MyTable", Schema = MySchemaConfiguration.MySchema)]
In this way, I can create a class like MySchemaConfiguration and have a property inside of it like MySchema. I can either set the property's value via a compiler directive or get it through a configuration file. This way I only have to change the schema in one place and it will be reflected throughout all of the other mappings.
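For illustration, a minimal sketch of such a configuration class, assuming the value is switched with a compiler directive (the TEST symbol and the schema names are placeholders):
public static class MySchemaConfiguration
{
    // The attribute argument has to be a compile-time constant, so a const is used here.
#if TEST
    public const string MySchema = "TESTSCHEMA";
#else
    public const string MySchema = "PRODSCHEMA";
#endif
}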
I have found the following link that actually fixes the problem:
How to set database schema for namespace in nhibernate
The sample code could be:
cfg.ClassMappings.Where(cm => cm.Table.Schema == "SchemaName")
.ForEach(cm => cm.Table.Schema = "AnotherSchemaName");
This should happen before you initialize your own data service class.
@Brian, I tried NHibernate.Mapping.Attributes; the attribute value you put inside has to be a constant, so it cannot be updated at run time. How could you have set the property's value using a parameter value from a configuration file?
The code to fix the HBM XML resources:
// This is how you get all the hbm resource names.
private static IList<string> GetAllHbmXmlResourceNames(Assembly assembly)
{
    var result = new List<string>();
    foreach (var resource in assembly.GetManifestResourceNames())
    {
        if (resource.EndsWith(".hbm.xml"))
        {
            result.Add(resource);
        }
    }
    return result;
}

// This is how you get the stream for each resource.
var stream = Assembly.Load(assembly).GetManifestResourceStream(name);

// What you need to do next is to fix the schema name in this stream.
// Replacing the schema name (originalSchemaName and newSchemaName are fields of the containing class).
private Stream FixSchemaNameInStream(Stream stream)
{
    StreamReader strStream = new StreamReader(stream);
    string strCfg = strStream.ReadToEnd();
    strCfg = strCfg.Replace(string.Format("schema=\"{0}\"", originalSchemaName),
                            string.Format("schema=\"{0}\"", newSchemaName));
    return new MemoryStream(Encoding.ASCII.GetBytes(strCfg));
}
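Putting the pieces together, a rough sketch of feeding the fixed streams into NHibernate could look like this; cfg is assumed to be your NHibernate.Cfg.Configuration instance and MyEntity is a placeholder for one of your mapped types:
// Sketch only: load each embedded hbm.xml, rewrite its schema, and feed it to the configuration.
var mappingAssembly = typeof(MyEntity).Assembly;
foreach (var name in GetAllHbmXmlResourceNames(mappingAssembly))
{
    using (var stream = mappingAssembly.GetManifestResourceStream(name))
    {
        cfg.AddInputStream(FixSchemaNameInStream(stream));
    }
}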
Take a look at SchemaUpdate.
http://blogs.hibernatingrhinos.com/nhibernate/archive/2008/04/28/create-and-update-database-schema.aspx