Configure Grails 3.0.9 for Oracle 12c with identity PK? - grails-orm

I am trying to map my domain objects to use the new Oracle 12c identity-type primary keys, a.k.a. auto-increment in some other systems.
Hibernate 4 does not have an Oracle12cDialect; it only has Oracle10gDialect.
Oracle10gDialect has a method called supportsIdentityColumns(), which is hard-coded to return false, so mapping my GORM domain object with generator: "identity" results in an error saying that Oracle10gDialect does not support the identity generator.
I cannot use the GORM select generator because I do not have a secondary unique key, and I cannot use a Hibernate-generated key because then Hibernate and other (external) inserts into the tables would generate overlapping keys.
Example of Existing Oracle 12c DDL:
create table person (
    id number(10,0) generated by default as identity,
    version number(10,0) not null,
    home_address_id number(10,0),
    name varchar(255) not null,
    primary key (id)
);
GORM Object:
class Person {
    String name
    Address homeAddress

    static mapping = {
        id column: 'person_key', generator: 'identity'
    }

    static constraints = {
        homeAddress nullable: true
    }
}
In-Memory DB Result (Works Perfectly):
Hibernate: create table person (person_key bigint generated by default as identity, version bigint not null, home_address_id bigint, name varchar(255) not null, primary key (person_key))
Hibernate: alter table person add constraint FK_bemy93e8a8i6nknj4n21m6fub foreign key (home_address_id) references address
Hibernate: insert into person (person_key, version, home_address_id, name) values (null, ?, ?, ?)
Oracle DB Result (Broken):
org.hibernate.MappingException: org.hibernate.dialect.Oracle10gDialect does not support identity key generation
How do I get Grails 3.0.9 to work with the above Oracle table definition?

Hibernate 4 cannot be configured to use Oracle 12c identity key generation.
Creating a custom Oracle12cDialect did not allow us to use identity key generation; it requires additional support in Hibernate 4 that is not there.
What does work is sticking with the Oracle10gDialect and using generator: 'sequence-identity' and then naming the sequence like this:
static mapping = {
    id column: 'person_key', generator: 'sequence-identity', params: [sequence: 'person_seq']
}
This achieves virtually the same result, other than creating the tables with the identity keyword in the DDL. Even if we had been able to get the identity keyword into the table definition, Oracle would simply have created its own sequence in the background and used it every time a record was inserted. Using sequence-identity rather than sequence also avoids the double DB call to insert a new row. With sequence-identity the insert DML is a single call like this:
insert into person (person_key, version, home_address_id, name) values (person_seq.nextval, ?, ?, ?)
With generator: 'sequence' new record inserts become two DB calls like this:
select person_seq.nextval from dual;
insert into person (person_key, version, home_address_id, name) values (?, ?, ?, ?)
So the only downside that I see for 'sequence-identity' compared to 'identity' is that Oracle will not automatically keep track of which sequence to use for which table and apply it when no key value is provided in the insert statement. But even that could probably be handled by a before insert trigger (see the sketch below), at which point you would be almost exactly where you would have been had Hibernate 4 supported generator: 'identity'.
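A rough sketch of such a trigger, reusing the person / person_key / person_seq names from the examples above (untested and not part of the original setup, shown only to illustrate the idea):
create or replace trigger person_key_trg
before insert on person
for each row
when (new.person_key is null)
begin
    -- fall back to the same sequence the sequence-identity generator uses
    :new.person_key := person_seq.nextval;
end;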

Hibernate 5 does have an Oracle 12c Dialect, specifically adding "identity" support: org.hibernate.dialect.Oracle12cDialect. So either use Hibernate 5, or write a custom 12c-based Dialect for Hibernate 4.

Related

H2 refuses to create auto_increment for Postgres emulated database

I created an in memory H2 database with JDBC URL
jdbc:h2:~/test;MODE=PostgreSQL;DATABASE_TO_LOWER=TRUE;DEFAULT_NULL_ORDERING=HIGH
The H2 web console refuses to let me do an auto_increment. I've seen serial for Postgres, but that doesn't work either.
At its simplest, it rejects:
create table test(id bigint auto_increment);
Syntax error in SQL statement "create table test(id bigint [*]auto_increment)"; expected "ARRAY, INVISIBLE, VISIBLE, NOT NULL, NULL, AS, DEFAULT, GENERATED, ON UPDATE, NOT NULL, NULL, DEFAULT ON NULL, NULL_TO_DEFAULT, SEQUENCE, SELECTIVITY, COMMENT, CONSTRAINT, COMMENT, PRIMARY KEY, UNIQUE, NOT NULL, NULL, CHECK, REFERENCES, ,, )"; SQL statement:
create table test(id bigint auto_increment) [42001-214] 42001/42001 (Help)
Why do I care:
My code base was failing with NULL not allowed for column "REV". I'm using JPA/Hibernate + Liquibase. In order to try the suggestions at
Hibernate Envers + Liquibase: NULL not allowed for column "REV"
I'm trying to add an auto_increment to my Liquibase changelog file.
You can use the SQL Standard's generation clause GENERATED ALWAYS AS IDENTITY. For example:
create table test (
    id bigint generated always as identity,
    name varchar(10)
);
See PostgreSQL Example.
It works the same way in H2. For example:
create table test(id bigint generated always as identity, name varchar(10));
insert into test (name) values ('Chicago') ;
select * from test;
Result:
ID NAME
-- -------
1 Chicago

Postgres GENERATED AS IDENTITY column nullability

I want to create a table, which contains a nullable column having GENERATED BY DEFAULT AS IDENTITY option, therefore I run the following query:
CREATE TABLE my_table (
    generated INTEGER NULL GENERATED BY DEFAULT AS IDENTITY,
    data TEXT NOT NULL
);
But once I try to insert a row into the table whose generated field is null, like this:
INSERT INTO my_table(generated, data) VALUES(NULL, 'some data');
I get a null-constraint violation error.
However if I change the order of my_table.generated column properties:
CREATE TABLE my_table (
    generated INTEGER GENERATED BY DEFAULT AS IDENTITY NULL,
    data TEXT NOT NULL
);
It inserts rows whose generated field is NULL without any issues.
Is this the expected behavior for the case?
Postgres developers told me this is a bug since identity columns weren't supposed to be nullable (see the patch file under the response).

Is there a way to prevent a query from setting the serial primary key?

I've got a bunch of tables with the 'serial' keyword on a primary key so that auto-increment will work. The problem is that I can make a query to insert a row using any id number which overrides the auto-increment. Is there a reason for this? Or, is there a way to prevent a user from adding/changing this value?
Here's an example of my table config:
create table if not exists departments (
    department_id serial primary key,
    name varchar(64) not null unique
);
If I run the following query, I can insert any number into the primary key:
insert into departments (department_id, name) values (9001, 'FooBar')
I think I want to prevent this from happening. I'd like to get some opinions.
Use an identity column:
create table if not exists departments (
    department_id integer primary key generated always as identity,
    name varchar(64) not null unique
);
This will prevent an insert to override the generated value. You can still circumvent that by specifying OVERRIDING SYSTEM VALUE as part of your INSERT statement. But unless you specify that option, providing a value for the column will result in an error.
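For example, a quick sketch of both cases against the departments table above (the exact error wording may vary between PostgreSQL versions):
-- rejected: department_id is generated always, so PostgreSQL raises an error
-- and hints that OVERRIDING SYSTEM VALUE is required
insert into departments (department_id, name) values (9001, 'FooBar');
-- accepted: the override is requested explicitly
insert into departments (department_id, name) overriding system value values (9001, 'FooBar');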
Related: PostgreSQL: serial vs identity
As long as '9001' isn't already in the table, it shouldn't cause any trouble.
If the field 'department_id' is already auto-increment, you can just run your insert statement like
INSERT INTO departments (name) VALUES ('FooBar')
I'm using Microsoft SQL Server, where you can control inserting values into an identity column with the following command:
SET IDENTITY_INSERT [ [ database_name . ] schema_name . ] table_name { ON | OFF }
By setting it to OFF you prevent explicit values from being inserted into the identity column.
For more info refer: https://learn.microsoft.com/en-us/sql/t-sql/statements/set-identity-insert-transact-sql?view=sql-server-ver15
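For example, a sketch against a hypothetical departments table with an identity column (SQL Server syntax; the table and column names are illustrative):
-- IDENTITY_INSERT is OFF by default, so explicit values are rejected;
-- turn it ON only for the moment you deliberately need to supply one
SET IDENTITY_INSERT departments ON;
INSERT INTO departments (department_id, name) VALUES (9001, 'FooBar');
SET IDENTITY_INSERT departments OFF;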

Migrating Bugzilla to a non-fresh JIRA: duplicate key

I have a JIRA environment which already has some information, and I'm trying to merge all the Bugzilla bugs into JIRA.
I'm trying to use the importer from JIRA, "BugzillaImportBean.java".
But it's failing when it tries to insert into the OS_CURRENTSTEP table because of a unique key violation; essentially the ID already exists in JIRA in that table.
So it crashes at
final GenericValue issue = createIssue(resultSet, getProductName(resultSet, true), componentName);
Error importing data from Bugzilla: com.atlassian.jira.exception.CreateException: Could not create new current step for #259350: root cause: while inserting: [GenericEntity:OSCurrentStep][id,357430][startDate,2010-07-23 05:32:14.414][status,Open][owner,][finishDate,null][actionId,0][stepId,1][dueDate,null][entryId,259350] (SQL Exception while executing the following:INSERT INTO OS_CURRENTSTEP (ID, ENTRY_ID, STEP_ID, ACTION_ID, OWNER, START_DATE, DUE_DATE, FINISH_DATE, STATUS, CALLER) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?) (Duplicate entry '357430' for key 1))
What is the best way of fixing this?
Bugzilla Database Schema: http://tldp.org/LDP/bugzilla/Bugzilla-Guide/dbschema.html
Jira Database Schema: http://confluence.atlassian.com/display/JIRA/Database+Schema
http://confluence.atlassian.com/display/JIRA/Modifying+the+Bugzilla+Importer
CREATE TABLE `OS_CURRENTSTEP` (
    `ID` decimal(18,0) NOT NULL,
    `ENTRY_ID` decimal(18,0) default NULL,
    `STEP_ID` decimal(9,0) default NULL,
    `ACTION_ID` decimal(9,0) default NULL,
    `OWNER` varchar(60) default NULL,
    `START_DATE` datetime default NULL,
    `DUE_DATE` datetime default NULL,
    `FINISH_DATE` datetime default NULL,
    `STATUS` varchar(60) default NULL,
    `CALLER` varchar(60) default NULL,
    PRIMARY KEY (`ID`),
    KEY `wf_entryid` (`ENTRY_ID`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
The problem could be a duplicate sequence value. Check the SEQUENCE_VALUE_ITEM table and look for a row such as "OSCurrentStep" (if this is not the name, the mapping of tables to entity names is in WEB-INF/classes/entitydefs/entitymodel.xml):
select * from SEQUENCE_VALUE_ITEM where SEQ_NAME='OSCurrentStep'
Check what the maximal used value is:
select MAX(ID) from OS_CURRENTSTEP
Set SEQ_ID bigger than the maximal used value, rounding up to a multiple of 10.
(Described in http://confluence.atlassian.com/display/JIRA/Database+Schema # SEQUENCE_VALUE_ITEM)
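As a concrete sketch (the value 357440 is only an illustration, assuming MAX(ID) came back as 357430; use whatever your own query returns, rounded up to a multiple of 10):
-- bump the stored sequence above the highest ID already in OS_CURRENTSTEP
UPDATE SEQUENCE_VALUE_ITEM SET SEQ_ID = 357440 WHERE SEQ_NAME = 'OSCurrentStep';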
The failed duplicate key '357430' is a multiple of 10, which suggests this is the reason
An easier but less likely solution: are you trying to import the same issue a second time?
If so, then "click the 'Import only new issues' checkbox in the importer" as described here: http://confluence.atlassian.com/display/JIRA/Importing+Data+from+Bugzilla
(You will notice that the failed statement is inside this condition: if (!onlyNewIssues || !previouslyImportedKeys.containsKey...)
Looks like the Bugzilla Importer has got confused about Status and Workflow steps. I can't remember if it tries to create new workflow steps on the fly? That importer is a right dog's breakfast, which is why I wrote my own product to do imports into JIRA. I'm doing another one tomorrow in fact.
Anyway, one way to narrow down the problem is to import a subset of issues. Perhaps you don't have the mapping from Bugzilla states (customized?) to JIRA statuses complete?
There's more info about the guts of this at http://confluence.atlassian.com/display/JIRA/Issue+status+and+workflow
~Matt

NHibernate ids with sequential one-step incremented ids (alternatives to HiLo)

How do I instruct NHibernate to generate sequential, one-step primary keys, like the SQL-generated ones?
The current HiLo algorithm generates keys like 4001, then 5010, 6089, etc. I understand that this is to manage multiple app servers, but I don't have that problem.
I need NHibernate to pick up the highest record value during startup (say 15) and then generate the next record with primary key 16 (very much like generated ids on the SQL side).
Why do you need/expect NHibernate to do this for you?
It's hard for NHibernate to provide a generic solution for scenarios like this, as the requirements can vary ever so slightly; but since you know your particular constraints exactly, it should be relatively straightforward for you to provide your own solution (using manually assigned ids).
On application startup, query the database and get the current max id value. Increment that value every time you do an insert.
Create table:
CREATE TABLE `seq` (
    `ID` varchar(10) COLLATE utf8_unicode_ci NOT NULL,
    `HI` bigint(20) unsigned NOT NULL,
    PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
INSERT INTO `seq` VALUES ('COMMENT', '0');
INSERT INTO `seq` VALUES ('POST', '0');
INSERT INTO `seq` VALUES ('USER', '0');
Add mappings like this with FluentNHibernate:
public class Comment_Map : ClassMap<Comment>
{
    public Comment_Map()
    {
        Table("COMMENT");
        Id(x => x.ID, "ID").GeneratedBy.HiLo("SEQ", "HI", "0", o => o.AddParam("where", "ID = 'COMMENT'"));
    }
}