We work with a legacy database that stores integer values in a varchar column. We need to map this column to an Int32 property, and this works well as long as the data in the column is numeric or NULL.
But we have problems when the column contains an empty string instead of NULL: NHibernate throws an error saying it cannot convert the value to an integer.
Is it possible to configure the class mapping to automatically convert every value that raises the exception to some default value (zero in this case)?
It is probably possible to convert the values using an interceptor. You may find some advice by searching for "null value substitution".
But do you really need that? You could map the varchar column to a private or read-only member and use a separate property to control it:
private string _varcharField;

public string VarcharField
{
    get { return _varcharField; }
}

public int IntProperty
{
    get
    {
        // Treat anything non-numeric (including the empty string) as zero
        int result;
        return int.TryParse(_varcharField, out result) ? result : 0;
    }
    set { _varcharField = value.ToString(); }
}
In this example, _varcharField would be mapped in NHibernate using a field access strategy.
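For illustration, a minimal sketch of what the corresponding XML mapping could look like; the column name VARCHAR_COL is a placeholder, and field.camelcase-underscore is the access strategy matching the _varcharField naming above:

<!-- Maps the read-only property to its _varcharField backing field -->
<property name="VarcharField"
          column="VARCHAR_COL"
          access="field.camelcase-underscore" />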
Casting in Apex seems like black magic to me. I don't get when we should make an explicit cast and when it can be implicit. For example:
Recipe.apxc
public virtual class Recipe {
    public String nome;
    protected String instructions;
    private String source = 'Granny';

    public Recipe() {}

    public Recipe(String inpNome, String inpInstrucoes) {
        nome = inpNome;
        instructions = inpInstrucoes;
    }

    public void printDescription() {
        System.debug('Name: ' + nome + ', Instructions: ' + instructions);
        return;
    }
}
DrinkRecipe.apxc
public class DrinkRecipe extends Recipe {
    public String nome = 'Luso';
    private String glassType;

    public DrinkRecipe(String inpNome, String inpInstrucoes) {
        super(inpNome, inpInstrucoes);
    }
}
In the anonymous window:
DrinkRecipe dr = new DrinkRecipe('Water', 'glu, glu');
// Why does this work? Shouldn't I always need to cast an object to make it
// use another constructor, from another class?
Recipe r1 = dr;
System.debug(r1.nome);
// I thought explicit casting like this would be the only way
Recipe r2 = (Recipe) dr;
System.debug(r2.nome);
Thanks
In general, Apex requires you to explicitly convert one data type to another. For example, a variable of the Integer data type cannot be implicitly converted to a String. You must use the string.format method. However, a few data types can be implicitly converted, without using a method.
Numbers form a hierarchy of types. Variables of lower numeric types can always be assigned to higher types without explicit conversion. The following is the hierarchy for numbers, from lowest to highest:
Integer
Long
Double
Decimal
Note
Once a value has been passed from a number of a lower type to a number of a higher type, the value is converted to the higher type of number.
In addition to numbers, other data types can be implicitly converted. The following rules apply:
IDs can always be assigned to Strings.
Strings can be assigned to IDs. However, at runtime, the value is checked to ensure that it is a legitimate ID. If it is not, a runtime exception is thrown.
The instanceOf keyword can always be used to test whether a string is an ID.
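Applied to the example in the question: assigning a DrinkRecipe to a Recipe variable is an implicit upcast, which is always allowed because every DrinkRecipe is a Recipe; only the narrowing direction requires an explicit cast. A short sketch of both directions:

DrinkRecipe dr = new DrinkRecipe('Water', 'glu, glu');

// Upcast (widening): implicit, no cast required
Recipe r = dr;

// Downcast (narrowing): requires an explicit cast and throws a
// System.TypeException at runtime if r is not actually a DrinkRecipe
DrinkRecipe dr2 = (DrinkRecipe) r;

// instanceof lets you test before casting
if (r instanceof DrinkRecipe) {
    System.debug('r is a DrinkRecipe');
}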
In my project I'm using Jersey 2.23.1 with Jackson for JSON support.
When I get a request with something like { "foo":null, "bar":"123" } as JSON, matched to a class A { String foo; String bar; }, Jersey first creates an instance of A (with default values if the constructor sets them), then deserializes the JSON into a temporary object A', then copies all fields that were specified in the JSON from A' to A. If I have default values in the A constructor and the JSON contains fields equal to null, all my default values are erased and replaced by null. So in the example above, if I have a default value for the foo field, it will be replaced by null in the object Jersey passes as the parameter to my @Path-annotated method.
I'm using @JsonInclude(Include.NON_NULL) on the A class to avoid transferring null fields in the Response, but it only works for serialization. What about deserialization? I mean, having { "foo":null } as JSON results in the field foo being null in the new object instance after deserialization.
Here is some code to sum all of this up:
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonInclude(value = Include.NON_NULL)
public class User {

    public enum EUserRole {
        PARENT, STUDENT, PROF, ADMIN
    }

    @Id
    public String id;
    public String firstName;
    public String lastName;
    public EUserRole role;

    public User() {
        id = ObjectId.get().toString();
        role = EUserRole.STUDENT;
        lastName = "RandomLastName";
    }
}
If I'm passing this kind of JSON
{
    "id": null,
    "lastName": null,
    "firstName": "Random First Name",
    "role": "STUDENT"
}
to my method (in controller)
@POST
public Response createUser(final User entity) {
}
the result is that all fields that are null in the JSON are set to null on my entity instead of keeping the constructor default values.
Do you know if there is a way to tell Jackson to ignore null fields during deserialization? Or is this Jersey-related behavior?
There is no way to ignore data from the JSON payload based on the value it contains (you can use ignoral to ignore all values for a given property, but not just the null ones).
So if you want to avoid null assignment, you need to define a setter that simply swallows null values (that is, only assigns non-null ones).
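A minimal sketch of such a setter, reusing the lastName field from the User class above; since Jackson prefers a setter over direct field access when both are visible, the guard takes effect even though the field is public:

// Called by Jackson during deserialization; ignoring nulls here
// preserves the default assigned in the constructor.
public void setLastName(String lastName) {
    if (lastName != null) {
        this.lastName = lastName;
    }
}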
The ability to prevent null assignment might be a useful feature to add via @JsonFormat.Feature, something like:
// hypothetical -- no such feature exists yet
@JsonFormat(without = JsonFormat.Feature.ALLOW_NULL_ASSIGNMENT)
so perhaps this could be a feature request.
And the reason I think this belongs in per-property handling is that it seems like a rule specific to certain properties. Although perhaps there could also be a matching global setting, if it turns out users really do like such null-ignoral.
I have this code:
#Column(name = "foo")
#ReadTransformer(transformerClass=transformer.class)
private Date foo;
public static class transformer implements AttributeTransformer {
#Override
public void initialize(AbstractTransformationMapping atm) {
}
#Override
public Object buildAttributeValue(Record record, Object o, Session sn) {
}
}
My question is: how do I get the value to transform (from column foo) inside buildAttributeValue? It is not in the record.
You need one or more @WriteTransformer annotations to declare the fields you want written (and thus get them selected); @Column is not used with a transformation mapping.
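For illustration, a sketch of what that mapping could look like; the @Transformation annotation is part of EclipseLink, while FooFieldTransformer is a hypothetical class implementing FieldTransformer:

@Transformation
@ReadTransformer(transformerClass = Transformer.class)
@WriteTransformers({
    // Declaring the column here is what gets "foo" included in the SELECT
    @WriteTransformer(column = @Column(name = "foo"),
                      transformerClass = FooFieldTransformer.class)
})
private Date foo;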
However, if you just have a single column, then use a converter instead (@Convert); see:
http://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Basic_JPA_Development/Mapping/Basic_Mappings/Default_Conversions_and_Converters
First check that the generated SQL is reading in the "foo" column by turning on logging. If it is, then check whether the database is returning "foo" or "FOO" - Java is case-sensitive on string lookups. It could be that "FOO" is in the record instead of "foo".
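Along those lines, a minimal sketch of buildAttributeValue; which key is present depends on what the database returns:

@Override
public Object buildAttributeValue(Record record, Object object, Session session) {
    // Record extends Map, so row values are looked up by column name;
    // some drivers return upper-cased names
    Object raw = record.get("foo");
    if (raw == null) {
        raw = record.get("FOO");
    }
    return raw;
}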
I have a mapping that I'm not quite sure how to do. I have a table with a Numeric(9,0) column whose possible values are 0-3. I'd like to represent this as an enumeration within my entity, but there isn't a direct mapping from Numeric(9,0) to an integer. The reason it's numeric is cross-database support (MSSQL and Oracle).
Numeric(9,0) maps directly to a C# decimal, which cannot be the underlying type of an enumeration; it is not one of the allowed types.
Would I need to leverage an IUserType in conjunction with an ITypeConvention here, or is there another way? Also, given the following mapping (LoginType is the type in question), how would I implement this IUserType?
public enum LoginType : int
{
    UNKNOWN = 0,
    COMPANY_LOGIN = 1,
    WINDOWS_LOGIN = 2,
    LDAP_LOGIN = 3
}

public class UserHeader
{
    public virtual Guid UserId { get; set; }
    public virtual LoginType LoginType { get; set; }
}
Try using the CustomType and CustomSqlType specifiers:
public class UserHeaderMap : ClassMap<UserHeader>
{
    public UserHeaderMap()
    {
        ...
        Map(x => x.LoginType).CustomType(typeof(LoginType)).CustomSqlType("Numeric(9,0)");
    }
}
Specifying the enumeration as the custom type tells FNH to persist its numeric value, instead of its default behavior of persisting (and expecting in the table) the ToString() value. The CustomSqlType is more for schema generation purposes, overriding the default int type for the schema column.
If that doesn't work, try adding a Formula() instead of the CustomSqlType to cast or convert the numeric to an int, which will then be picked up by NH and cast to the enum value.
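A sketch of that Formula() variant, assuming the column is also named LoginType (substitute the real column name); note that formula-mapped properties are read-only in NHibernate, so writes would need separate handling:

public class UserHeaderMap : ClassMap<UserHeader>
{
    public UserHeaderMap()
    {
        Id(x => x.UserId);
        // The database casts the numeric to int; NH then maps it onto the enum
        Map(x => x.LoginType)
            .CustomType(typeof(LoginType))
            .Formula("cast(LoginType as int)");
    }
}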
I have created a user-defined type (UDT) in .NET 3.5, as per my blog entry at:
http://jwsadlerdesign.blogspot.com/2009/04/this-is-how-you-register.html
This works fine when using SQL with technologies like NHibernate.
However, when I try to map my LINQ to SQL class to use this UDT (with attribute definitions, not XML) and set the property up as the enumeration, I cannot get LINQ to SQL to map to this type. I have tried Image, Binary, varchar and integer, all of which issue invalid cast errors.
In particular I get the error 'Unable to cast object of type 'ISTD.InstallManager.Common.Classes.SQLUDTTargetType' to type 'System.Byte[]''. Any ideas or help would be much appreciated.
James.
UPDATE: I ran into this myself recently and found that the previous solution wasn't quite complete. Despite what all of the documentation says, it is possible to do this, but somewhat painful.
The first step, for your own convenience, is to implement some conversion operators:
public class MyUDT : INullable, IBinarySerialize
{
    // Class implementation would go here
    // ...

    public static explicit operator MyUDT(byte[] data)
    {
        using (MemoryStream stream = new MemoryStream(data))
        using (BinaryReader reader = new BinaryReader(stream))
        {
            MyUDT result = new MyUDT();
            result.Read(reader);
            return result;
        }
    }

    public static explicit operator byte[](MyUDT x)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            using (BinaryWriter writer = new BinaryWriter(ms))
            {
                x.Write(writer);
            }
            return ms.ToArray();
        }
    }
}
LINQ to SQL will still flat-out refuse to give you the UDT field, no matter how you declare the property, so you have to give it a binary field instead. You don't need a stored procedure or any custom SQL for this; just add a computed column to your table:

ALTER TABLE MyTable
ADD UDTField_Data AS CAST(UDTField AS varbinary(len))

where len is whatever your UDT declares in its MaxByteSize attribute.
Now you can finally get access to the column data. You might be tempted to use your UDT as the return type of the new property, thinking that LINQ to SQL will find your conversion operator and automatically convert from the byte array; don't bother. LINQ to SQL will decide that it's actually a serialized .NET object and spit out a message to the effect of "input stream is not a valid binary format." Instead, you need another layer of indirection:
private MyUDT udtField;

[Column(Name = "UDTField_Data", DbType = "varbinary(len)")]
private byte[] UdtFieldData
{
    get { return (byte[])udtField; }
    set { udtField = (MyUDT)value; }
}

public MyUDT UdtProperty
{
    get { return udtField; }
    set { udtField = value; }
}
A few notes to make it clear what's going on here:
The actual field data (udtField) is declared as the UDT itself, not a byte array. The reason for this is that we only want the conversion to happen when loading from or saving to the database. If you had to convert the byte array to the UDT every time you accessed it, it would not only hurt performance, but it would cause inconsistencies if the UDT declares any mutable fields.
The raw byte[] property (UdtFieldData) is declared private, so consumers only see the UDT itself. LINQ to SQL will still read it as long as it has the [Column] attribute.
The UdtFieldData property does not declare a storage property. This is critical; if you try to use the UDT field as the storage property, you'll just get the same type conversion error.
Finally, the UdtProperty property is how consumers actually get to access the data. To them it looks like any other property.
It's unfortunate that you have to jump through so many hoops to get this to work, but it does work. You'll probably have difficulty doing this kind of massaging through the LINQ surface designer, which is just one of several reasons why I don't use it; better to write the classes yourself and use SqlMetal to help you along if necessary.