Calculate volumes based on date - sql

I have this MariaDB table which I would like to use for a bar chart:
CREATE TABLE `payment_transaction_daily_facts` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`date` date DEFAULT NULL,
`year` int(11) DEFAULT NULL,
`month` int(11) DEFAULT NULL,
`week` int(11) DEFAULT NULL,
`day` int(11) DEFAULT NULL,
`volume` int(11) DEFAULT NULL,
`count` int(11) DEFAULT NULL,
`created_at` date DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB;
In my example SQL query I have a single column for the date. How can I calculate the volumes per day for the last 10 days when the date, year, month, week and day are split into separate columns?
The final result should be for example:
Date       | Amount | Number of transactions per day
11-11-2018 | 30     | 3
11-12-2018 | 230    | 13
I tried this:
SELECT SUM(amount) AS sum_volume, COUNT(*) AS sum_Transactions
WHERE (created_at BETWEEN '2018-11-07' AND '2018-11-08')
GROUP BY DATE(created_at)
I want to return the generated data using DTO:
public class DashboardDTO {
private Date date;
private int sum_volume;
private int sum_Transactions;
... getters and setters
}
Rest controller:
@RestController
@RequestMapping("/dashboard")
public class DashboardController {
private static final Logger LOG = LoggerFactory.getLogger(DashboardController.class);
@Autowired
private DashboardRepository dashboardRepository;
@Autowired
private PaymentTransactionsDailyFactsMapper mapper;
@GetMapping("/volumes")
public ResponseEntity<List<DashboardDTO>> getProcessingVolumes(@PathVariable String start_date, @PathVariable String end_date) {
List<DashboardDTO> list = StreamSupport.stream(dashboardRepository.findPaymentTransactionsDailyFacts(start_date, end_date).spliterator(), false)
.map(mapper::toDTO)
.collect(Collectors.toList());
return ResponseEntity.ok(list);
}
}
JPA query:
public List<PaymentTransactionsDailyFacts> findPaymentTransactionsDailyFacts(LocalDateTime start_date, LocalDateTime end_date) {
String hql = "SELECT SUM(amount) AS sum_volume, COUNT(*) AS sum_Transactions " +
" WHERE (created_at BETWEEN :start_date AND :end_date )" +
" GROUP BY DATE(created_at)";
TypedQuery<PaymentTransactionsDailyFacts> query = entityManager.createQuery(hql,
PaymentTransactionsDailyFacts.class).setParameter("start_date", start_date).setParameter("end_date", end_date);
List<PaymentTransactionsDailyFacts> data = query.getResultList();
return data;
}
How should I implement the query properly?
When I receive start_date and end_date as String from Angular, how should I convert them into LocalDateTime?
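On the last point, a minimal sketch of the String-to-LocalDateTime conversion, assuming the Angular client sends ISO-8601 dates such as "2018-11-07" (adjust the pattern if the client uses another format):

```java
import java.time.LocalDate;
import java.time.LocalDateTime;

public class DateParsing {
    // Parses an ISO date string and widens it to a LocalDateTime at midnight,
    // suitable for a BETWEEN-style range query.
    static LocalDateTime startOfDay(String isoDate) {
        return LocalDate.parse(isoDate).atStartOfDay();
    }

    public static void main(String[] args) {
        System.out.println(startOfDay("2018-11-07")); // 2018-11-07T00:00
    }
}
```

For the end date, an exclusive upper bound (the next day's startOfDay) avoids losing rows with times later in the day.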

Well, as I commented, time is a dimension in a data warehouse star schema, and I guess period is as well. So you should have two dimension tables: a TimeDim for LocalDate and a PeriodDim for Period. Then you should have a Fact with an embedded ID made up of the various dimensions in your schema. Then you would have facts for 1-day periods and facts for 10-day periods. If you insisted on summing facts, you have the issue that JPA cannot do a <= or >= comparison against composite keys. Since you are only summing 10 days you could use an IN clause to select 10 keys, but again, you should have facts for the periods you need.
@Entity
public class TimeDim {
@Id
private LocalDate localDate;
@Entity
public class PeriodDim {
@Id
private Period period;
// need this too
@Converter(autoApply = true)
public class LocalDateAttributeConverter implements AttributeConverter<LocalDate, Date> {
@Override
public Date convertToDatabaseColumn(LocalDate locDate) {
return (locDate == null ? null : Date.valueOf(locDate));
}
@Override
public LocalDate convertToEntityAttribute(Date sqlDate) {
return (sqlDate == null ? null : sqlDate.toLocalDate());
}
}
@SuppressWarnings("serial")
@Embeddable
public class DimKey implements Serializable {
private LocalDate localDate;
private Period period;
@Entity
public class Fact {
@EmbeddedId
private DimKey dimKey = new DimKey();
private long amount;
And for example:
tx.begin();
TimeDim td10 = new TimeDim();
td10.setLocalDate(LocalDate.now().minusDays(5));
em.persist(td10);
TimeDim td5 = new TimeDim();
td5.setLocalDate(LocalDate.now().minusDays(10));
em.persist(td5);
PeriodDim pd5 = new PeriodDim();
pd5.setPeriod(Period.ofDays(5));
em.persist(pd5);
PeriodDim pd10 = new PeriodDim();
pd10.setPeriod(Period.ofDays(10));
em.persist(pd10);
Fact f10 = new Fact();
f10.getDimKey().setLocalDate(td10.getLocalDate());
f10.getDimKey().setPeriod(pd10.getPeriod());
f10.setAmount(100);
em.persist(f10);
Fact f51 = new Fact();
f51.getDimKey().setLocalDate(td10.getLocalDate());
f51.getDimKey().setPeriod(pd5.getPeriod());
f51.setAmount(50);
em.persist(f51);
Fact f52 = new Fact();
f52.getDimKey().setLocalDate(td5.getLocalDate());
f52.getDimKey().setPeriod(pd5.getPeriod());
f52.setAmount(50);
em.persist(f52);
tx.commit();
em.clear();
DimKey dk = new DimKey();
dk.setLocalDate(td10.getLocalDate());
dk.setPeriod(pd10.getPeriod());
Fact f = em.createQuery("select f from Fact f where f.dimKey = :dimKey", Fact.class)
.setParameter("dimKey", dk)
.getSingleResult();
System.out.println("From 10 day period: " + f.getAmount());
DimKey dk1 = new DimKey();
dk1.setLocalDate(td10.getLocalDate());
dk1.setPeriod(pd5.getPeriod());
DimKey dk2 = new DimKey();
dk2.setLocalDate(td5.getLocalDate());
dk2.setPeriod(pd5.getPeriod());
Long sum = em.createQuery("select sum(f.amount) from Fact f where f.dimKey in (:dimKey1 , :dimKey2)", Long.class)
.setParameter("dimKey1", dk1)
.setParameter("dimKey2", dk2)
.getSingleResult();
System.out.println("From 2*5 day period: " + sum);

Related

Dart DateTime.toIso8601String() throwing exception when inserting into SQFlite database

I am having trouble inserting a DateTime.toIso8601String() into a SQLite (SQFlite) database in Dart. The issue is with a property_model class whose only job is to interact with the database and hold data. I have an almost identical address_model class that works how I expect, but I am having trouble with the property_model. Whenever I try to call the Property.insert() method, I get this error:
E/flutter: [ERROR:flutter/lib/ui/ui_dart_state.cc(209)] Unhandled Exception: Invalid argument: Instance of 'DateTime'
I don't have this issue with the very similar Address class though, and I am pretty stumped. Here are the property_model, address_model, and database files I am using (the database.dart file is a singleton that I use throughout the application).
property_model.dart
import 'package:flutter/material.dart';
import 'package:sqflite/sqflite.dart';
import 'package:villadex/model/database.dart' as db;
import 'package:villadex/model/address_model.dart';
class Property {
/// Constructors
Property({
required this.name,
required Address address,
required this.owner,
}) : _address = address,
_primaryKey = null,
_dateCreated = DateTime.now();
Property.existing(
{required this.name,
required Address address,
required this.owner,
required int? primaryKey,
required DateTime dateCreated})
: _address = address,
_primaryKey = primaryKey,
_dateCreated = dateCreated;
Property.fromJSON({required Map<String, dynamic> json})
: name = json['name'],
owner = json['owner'],
_address = Address.fromJson(json: json['location']),
_primaryKey = json['property_id'],
_dateCreated = DateTime.fromMillisecondsSinceEpoch(json['dateCreated']);
/// Data
String name;
String? owner;
final Address _address;
/*final List<Event> calendar;
final List<Expenditure> expenditures;
final List<Associate> associates;
final List<Earning> earnings;*/
final int? _primaryKey;
final DateTime _dateCreated;
///Methods
Future<void> insert() async {
String dateCreated = _dateCreated.toIso8601String().trim();
Map<String, dynamic> data = {
// SQFlite sets the primary key
'name': name,
'owner': owner,
'location': address.toJson(),
'dateCreated': dateCreated,
};
await db.DatabaseConnection.database.then((databaseConnection) => {
databaseConnection?.insert('properties', data,
conflictAlgorithm: ConflictAlgorithm.replace)
});
}
static Future<Property?> fetchById(int id) async {
String sql = "SELECT * FROM properties WHERE property_id = $id";
Future<List<Map<String, dynamic>>>? rawData;
await db.DatabaseConnection.database.then(
(databaseConnection) => {rawData = databaseConnection?.rawQuery(sql)});
return rawData?.then((data) {
return Property.fromJSON(json: data[0]);
});
}
/// Getters
Address get address => _address;
}
address_model.dart
import 'package:flutter/material.dart';
import 'package:villadex/model/database.dart' as db;
class Address {
/// Constructors
Address(
{required this.street1,
this.street2 = '',
required this.city,
this.state = '',
this.zip = '',
required this.country})
: _dateCreated = DateTime.now(),
_primaryKey = null,
_propertyId = null,
_associateId = null;
Address.existing({
required this.street1,
this.street2 = '',
required this.city,
this.state = '',
this.zip = '',
required this.country,
required DateTime dateCreated,
required int primaryKey,
int? propertyKey,
int? associateKey,
}) : _dateCreated = dateCreated,
_primaryKey = primaryKey,
_propertyId = propertyKey,
_associateId = associateKey;
Address.fromJson({required Map<String, dynamic> json})
: street1 = json['street1'],
street2 = json['street2'],
city = json['city'],
state = json['state'],
zip = json['zip'],
country = json['country'],
_primaryKey = json['address_id'],
_propertyId = json['property_id'],
_associateId = json['associate_id'],
_dateCreated = DateTime.parse(json['_dateCreated']);
/// Data
final String street1;
final String street2;
final String city;
final String state;
final String zip;
final String country;
final int? _primaryKey;
final int? _propertyId;
final int? _associateId;
final DateTime _dateCreated;
/// Methods
Future<void> insert() async {
Map<String, dynamic> data = {
// SQFlite sets the primaryKey
'property_id': _propertyId,
'associate_id': _associateId,
'dateCreated': _dateCreated.toIso8601String().trim(),
'street1': street1,
'street2': street2,
'city': city,
'zip': zip,
'country': country
};
await db.DatabaseConnection.database.then((databaseConnection) =>
{databaseConnection?.insert('addresses', data)});
}
// Returns an address by ID
static Future<Address?> fetchById(int id) async {
String sql = "SELECT * FROM addresses WHERE address_id = $id";
Future<List<Map<String, dynamic>>>? rawData;
await db.DatabaseConnection.database.then(
(databaseConnection) => {rawData = databaseConnection?.rawQuery(sql)});
return rawData?.then((data) {
return Address.fromJson(json: data[0]);
});
}
Map<String, dynamic> toJson() {
return {
'street1': street1,
'street2': street2,
'city': city,
'state': state,
'zip': zip,
'country': country,
'address_id': _primaryKey,
'property_id': _propertyId,
'associate_id': _associateId,
'dateCreated': _dateCreated
};
}
/// Getters
String get fullAddress =>
street1 +
" " +
street2 +
", " +
city +
" " +
state +
" " +
zip +
", " +
country;
DateTime get dateCreated => _dateCreated;
int get key => _primaryKey ?? 0;
/// Setters
}
database.dart
import 'package:path/path.dart';
import 'package:sqflite/sqflite.dart';
import 'package:path_provider/path_provider.dart';
import 'dart:async';
import 'package:villadex/model/property_model.dart';
class DatabaseConnection {
//static final DatabaseConnection instance = DatabaseConnection.init();
//DatabaseConnection._init();
/// Database variable
static Database? _database;
/// Getter for the database
static Future<Database?> get database async {
// If _database is null, set it equal to the return value of _initDB
_database ??= await _initDB('database3');
return _database;
}
/// Initialize database
static Future<Database?> _initDB(String dbname) async {
final dbPath = await getApplicationDocumentsDirectory();
final path = join(dbPath.toString(), dbname);
var dbInstance = await openDatabase(path, version: 1, onCreate: _createDatabase);
return dbInstance;
}
/// Create the database
static Future _createDatabase(Database database, int version) async {
Batch batch = database.batch();
/// CREATE PROPERTIES TABLE
batch.execute('''CREATE TABLE properties(
property_id INTEGER PRIMARY KEY,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL,
location TEXT NOT NULL,
owner TEXT NOT NULL,
calendar TEXT,
expenditures TEXT,
associates TEXT,
earnings TEXT
);''');
/// CREATE EXPENDITURES TABLE
batch.execute('''CREATE TABLE expenditures(
expenditure_id INTEGER PRIMARY KEY,
property_id INTEGER NOT NULL,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL,
amount REAL NOT NULL,
numberUnits INTEGER NOT NULL,
isPaid INTEGER NOT NULL,
description TEXT,
category TEXT,
date TEXT,
associates TEXT,
FOREIGN KEY (property_id)
REFERENCES properties(property_id)
);''');
/// CREATE EARNINGS TABLE
batch.execute(''' CREATE TABLE earnings(
earning_id INTEGER PRIMARY KEY,
property_id INTEGER NOT NULL,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL,
amount REAL NOT NULL,
description TEXT,
category TEXT,
date TEXT,
associates TEXT,
FOREIGN KEY (property_id)
REFERENCES properties(property_id)
);''');
/// CREATE CATEGORIES TABLE
batch.execute(''' CREATE TABLE categories(
category_id INTEGER NOT NULL,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL
);''');
/// CREATE ASSOCIATES TABLE
batch.execute(''' CREATE TABLE associates(
associate_id INTEGER PRIMARY KEY,
property_id INTEGER NOT NULL,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL,
contact TEXT,
role TEXT,
payments TEXT,
FOREIGN KEY (property_id)
REFERENCES properties (property_id)
);''');
/// CREATE CONTACTS TABLE
batch.execute(''' CREATE TABLE contact (
associate_id INTEGER NOT NULL,
phoneNumber TEXT,
email TEXT,
FOREIGN KEY (associate_id)
REFERENCES associates (associate_id)
);''');
/// CREATE ADDRESSES TABLE
batch.execute(''' CREATE TABLE addresses (
address_id INTEGER PRIMARY KEY,
property_id INTEGER,
associate_id INTEGER,
dateCreated TEXT NOT NULL,
street1 TEXT NOT NULL,
street2 TEXT,
city TEXT NOT NULL,
zip TEXT,
state TEXT,
country TEXT,
FOREIGN KEY (property_id)
REFERENCES properties (property_id),
FOREIGN KEY (associate_id)
REFERENCES associates (associate_id)
);''');
/// CREATE EVENT TABLE
batch.execute(''' CREATE TABLE event (
event_id INTEGER PRIMARY KEY,
property_id INTEGER NOT NULL,
dateCreated TEXT NOT NULL,
name TEXT NOT NULL,
description TEXT,
address TEXT,
associates TEXT,
expenditures TEXT,
earnings TEXT,
FOREIGN KEY (property_id)
REFERENCES properties (property_id)
);''');
batch.commit();
}
Future close() async {
_database?.close();
}
}
Thank you for any help!
I figured it out. The Address object within my Property object did not turn its DateTime into a string in the address.toJson() method, which is why it was giving me that error.

how to use spark sql udaf to implement window counting with condition?

I have a table with columns timestamp, id and condition, and I want to count the occurrences of each id per interval, such as every 10 seconds.
If condition is true, the count increments; otherwise it returns the previous value.
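Stripped of the Spark buffer plumbing, the counting rule described above can be sketched in plain Java (the names here are illustrative, not Spark API):

```java
public class ConditionalWindowCount {
    private long bucketStart = 0L; // start timestamp of the current interval bucket
    private long count = 0L;

    // A new interval resets the counter; the condition gates the increment.
    public long update(boolean condition, long eventTime, int interval) {
        if (eventTime > bucketStart + interval) {
            bucketStart = eventTime - eventTime % interval;
            count = 0;
        }
        if (condition) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        ConditionalWindowCount c = new ConditionalWindowCount();
        // Timestamps from the question, interval 10s: same bucket counts up,
        // a timestamp in a later bucket restarts at 1.
        System.out.println(c.update(true, 1642760594L, 10)); // 1
        System.out.println(c.update(true, 1642760596L, 10)); // 2
        System.out.println(c.update(true, 1642760599L, 10)); // 3
        System.out.println(c.update(true, 1642760610L, 10)); // 1
    }
}
```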
The UDAF code looks like:
public class MyCount extends UserDefinedAggregateFunction {
@Override
public StructType inputSchema() {
return DataTypes.createStructType(
Arrays.asList(
DataTypes.createStructField("condition", DataTypes.BooleanType, true),
DataTypes.createStructField("timestamp", DataTypes.LongType, true),
DataTypes.createStructField("interval", DataTypes.IntegerType, true)
)
);
}
@Override
public StructType bufferSchema() {
return DataTypes.createStructType(
Arrays.asList(
DataTypes.createStructField("timestamp", DataTypes.LongType, true),
DataTypes.createStructField("count", DataTypes.LongType, true)
)
);
}
@Override
public DataType dataType() {
return DataTypes.LongType;
}
@Override
public boolean deterministic() {
return true;
}
@Override
public void initialize(MutableAggregationBuffer mutableAggregationBuffer) {
mutableAggregationBuffer.update(0, 0L);
mutableAggregationBuffer.update(1, 0L);
}
@Override
public void update(MutableAggregationBuffer mutableAggregationBuffer, Row row) {
long timestamp = mutableAggregationBuffer.getLong(0);
long count = mutableAggregationBuffer.getLong(1);
long event_time = row.getLong(1);
int interval = row.getInt(2);
if (event_time > timestamp + interval) {
timestamp = event_time - event_time % interval;
count = 0;
}
if (row.getBoolean(0)) {
count++;
}
mutableAggregationBuffer.update(0, timestamp);
mutableAggregationBuffer.update(1, count);
}
@Override
public void merge(MutableAggregationBuffer mutableAggregationBuffer, Row row) {
}
@Override
public Object evaluate(Row row) {
return row.getLong(1);
}
}
Then I submit a SQL query like:
select timestamp, id, MyCount(true, timestamp, 10) over(PARTITION BY id ORDER BY timestamp) as count from xxx.xxx
the result is:
timestamp id count
1642760594 0 1
1642760596 0 2
1642760599 0 3
1642760610 0 2 --duplicate
1642760610 0 2
1642760613 0 3
1642760594 1 1
1642760597 1 2
1642760600 1 1
1642760603 1 2
1642760606 1 4 --duplicate
1642760606 1 4
1642760608 1 5
When the timestamp is repeated, I get 1,2,4,4,5 instead of 1,2,3,4,5.
How can I fix it?
Another question: when is the merge method of a UDAF executed? I left it empty, yet everything runs normally. I tried adding a log statement in the method but never saw it. Is merge really necessary?
There is a similar question: Apache Spark SQL UDAF over window showing odd behaviour with duplicate input
However, row_number() does not have this problem. row_number() is a Hive UDAF, so I tried to create a Hive UDAF myself, but I hit the same problem. Also, why does the terminate() of the Hive UDAF row_number() return an ArrayList? I created my own row_number2() by copying its code, and I likewise got a list back.
Finally I solved it with Spark's AggregateWindowFunction:
case class Count(condition: Expression) extends AggregateWindowFunction with Logging {
override def prettyName: String = "myCount"
override def dataType: DataType = LongType
override def children: Seq[Expression] = Seq(condition)
private val zero = Literal(0L)
private val one = Literal(1L)
private val count = AttributeReference("count", LongType, nullable = false)()
private val increaseCount = If(condition, Add(count, one), count)
override val initialValues: Seq[Expression] = zero :: Nil
override val updateExpressions: Seq[Expression] = increaseCount :: Nil
override val evaluateExpression: Expression = count
override val aggBufferAttributes: Seq[AttributeReference] = count :: Nil
}
Then use spark_session.functionRegistry.registerFunction to register it.
"select myCount(true) over(partition by window(timestamp, '10 seconds'), id order by timestamp) as count from xxx"

JPA - I get multiple rows instead of 1

I have spent many hours trying to solve my problem, without success. I'd like to achieve something like this (but with ONE row instead of TWO):
My database:
CREATE TABLE odo.d_kryterium_naruszen (
id bigserial primary key,
kryterium text not null,
data_wpr TIMESTAMP not null DEFAULT clock_timestamp(),
opr bigint not null
);
CREATE TABLE odo.d_czynnik_naruszen (
id bigserial primary key,
czynnik text not null,
id_kryterium_naruszen bigint not null references odo.d_kryterium_naruszen(id),
stopien NUMERIC(10,2) not null,
data_wpr TIMESTAMP not null DEFAULT clock_timestamp(),
opr bigint not null
);
CREATE TABLE odo.d_dotkliwosc_naruszenia (
id bigserial primary key,
zakres numrange not null,
ocena text not null,
opis text not null,
wymagane_dzialanie text not null,
data_wpr TIMESTAMP not null DEFAULT clock_timestamp(),
opr bigint not null
);
CREATE TABLE odo.ocena_naruszenia_wynik (
id bigserial primary key,
wartosc_dotkliwosci_naruszenia NUMERIC(10,2) not null,
status_id bigint not null references odo.d_status_oceny_naruszenia(id),
ocena_naruszenia_id bigint not null references odo.ocena_naruszenia(id),
data_wpr TIMESTAMP not null DEFAULT clock_timestamp(),
opr bigint not null
);
create table odo.czynnik_naruszen_wynik(
id bigserial primary key,
ocena_naruszenia_wynik_id bigint not null references odo.ocena_naruszenia_wynik(id),
czynnik_naruszen_id bigint not null references odo.d_czynnik_naruszen(id),
komentarz text,
czynnik_wybrany boolean not null default false,
wartosc_wybrana NUMERIC(10,2) not null,
data_wpr TIMESTAMP not null DEFAULT clock_timestamp(),
opr bigint not null
);
And here my entities:
@Data
@Entity
@Table(schema = "odo", name = "d_kryterium_naruszen")
public class ViolationCriterion extends BaseEntity {
@Column(name = "kryterium")
private String criterion;
@OneToMany(fetch = FetchType.LAZY, cascade = CascadeType.ALL, orphanRemoval = true)
@JoinColumn(name = "id_kryterium_naruszen")
private List<ViolationFactor> violationFactors;
}
@Data
@Entity
@Table(schema = "odo", name = "d_czynnik_naruszen")
public class ViolationFactor extends BaseEntity {
@Column(name = "czynnik")
private String factor;
@Column(name = "stopien")
private float degree;
@OneToMany
@JoinColumn(name = "czynnik_naruszen_id")
private List<IncidentAssessmentFactor> incidentAssessmentFactor;
}
@Data
@Entity
@Table(schema = "odo", name = "czynnik_naruszen_wynik")
public class IncidentAssessmentFactor extends BaseEntity {
@Column(name="komentarz")
private String comment;
@Column(name="czynnik_wybrany")
private Boolean factorIsSelected;
@Column(name = "wartosc_wybrana")
private Float value;
@OneToOne(fetch = FetchType.LAZY)
@JoinColumn(name="ocena_naruszenia_wynik_id", updatable=false, insertable=false)
private IncidentAssessment incidentAssessment;
}
@Data
@Entity
@Table(schema = "odo", name = "ocena_naruszenia_wynik")
public class IncidentAssessment extends BaseEntity {
@Column(name="ocena_naruszenia_id")
private Long incidentAssessmentId;
@Column(name = "wartosc_dotkliwosci_naruszenia")
private Float severityDegreeValue;
My repository:
@Repository
public interface ViolationCriterionRepository extends JpaRepository<ViolationCriterion, Long> {
// @Query("select vc from ViolationCriterion vc inner join vc.violationFactors vf inner join vf.incidentAssessmentFactor iaf inner join iaf.incidentAssessment ia where ia.incidentAssessmentId = ?1 group by vc ")
@Query("select vc from ViolationCriterion vc inner join vc.violationFactors vf inner join vf.incidentAssessmentFactor iaf inner join iaf.incidentAssessment ia where ia.incidentAssessmentId = ?1 group by vc ")
// @Query(value = "select kn.kryterium from odo.d_kryterium_naruszen kn join odo.d_czynnik_naruszen cn on kn.id = cn.id_kryterium_naruszen join odo.czynnik_naruszen_wynik cnw on cnw.czynnik_naruszen_id = cn.id join odo.ocena_naruszenia_wynik onw on cnw.ocena_naruszenia_wynik_id = onw.id where onw.ocena_naruszenia_id = ?1 group by kn.id, cn.id, cnw.id, onw.id", nativeQuery = true)
// @Query(value = "select kn.id, kn.kryterium, kn.data_wpr, kn.opr, cn.id, cn.czynnik, cn.stopien, cn.opr, cn.data_wpr, cnw.id, cnw.data_wpr, cnw.opr, cnw.komentarz, cnw.czynnik_wybrany, cnw.wartosc_wybrana, onw.id, onw.data_wpr, onw.opr, onw.ocena_naruszenia_id, onw.wartosc_dotkliwosci_naruszenia from odo.d_kryterium_naruszen kn join odo.d_czynnik_naruszen cn on kn.id = cn.id_kryterium_naruszen join odo.czynnik_naruszen_wynik cnw on cnw.czynnik_naruszen_id = cn.id join odo.ocena_naruszenia_wynik onw on cnw.ocena_naruszenia_wynik_id = onw.id where onw.ocena_naruszenia_id = ?1 group by kn.id, cn.id, cnw.id, onw.id", nativeQuery = true)
List<ViolationCriterion> findIncidentAssessmentByIncidentAssessmentId(Long incidentId);
// List<ViolationCriterion> findByViolationFactorsIncidentAssessmentFactorIncidentAssessmentIncidentAssessmentIdGroupByViolationCriterionCriterion(Long id);
}
And here I call my repository:
List<ViolationCriterion> violationCriteria = violationCriterionRepository.findIncidentAssessmentByIncidentAssessmentId(id);//vi
In the table czynnik_naruszen_wynik I have 2 different rows because I have 2 rows in the table ocena_naruszenia_wynik. The problem is that I get multiple IncidentAssessmentFactor entities instead of 1.

how to change this sql query to hibernate query?

I want to convert this SQL query to a Hibernate query.
SQL query:
SELECT *
FROM User
WHERE (DATE_SUB(NOW(), INTERVAL 1 DAY) >= AccountStartDate)
AND PINStatus = 'N'
AND PetName IS NULL
AND SchoolName IS NULL
AND AccountType = 'Supervisor'
This is my User class
@Entity
@Table(name = "User")
public class User{
@Id
@GeneratedValue
@Column(name="UserID")
private long userID;
@Column(name="AccountStartDate")
private Date accountStartDate;
@Column(name="PINStatus")
private String pinStatus;
@Column(name="PetName")
private String petName;
@Column(name="SchoolName")
private String schoolName;
@Column(name="AccountType")
private String accountType;
}
Here is part of the answer. Try this:
User user = _session.CreateCriteria<User>()
.Add(Restrictions.Eq("PINStatus", "[VARIABLE PARAMETER]"))
.AddOrder(new Order("_id", false))
.List<User>()
.FirstOrDefault();
That will result in something like:
select * from User where PINStatus = ??? Order By Id desc
Query query=sessionFactory.getCurrentSession().createQuery("FROM User WHERE " +
"pinStatus=:pinStatus AND petName IS NULL AND schoolName IS NULL AND accountType=:accountType")
.setString("accountType", "User").setString("pinStatus", "N");
List<User> userList=query.list();
List<User> userList1 = new ArrayList<>(); // users whose account is at least 1 day old
Date currentDate = new Date();
Iterator<User> itr = userList.iterator();
while(itr.hasNext()) {
User user = (User) itr.next();
long diff=currentDate.getTime() - user.getAccountStartDate().getTime();
long diffHours = diff / (60 * 60 * 1000);
if(diffHours>=24)
{
System.out.println("User added "+user.getFirstName());
userList1.add(user);
}
}
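As an alternative to filtering in Java after the fact, the 24-hour cutoff could be computed once with java.time and bound as a query parameter; a sketch (the parameter name :cutoff is illustrative, not from the original code):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.Date;

public class CutoffDate {
    // Equivalent of SQL's DATE_SUB(NOW(), INTERVAL 1 DAY): any row with
    // accountStartDate <= cutoff has existed for at least 24 hours.
    static Date oneDayAgo(Instant now) {
        return Date.from(now.minus(1, ChronoUnit.DAYS));
    }

    public static void main(String[] args) {
        System.out.println(oneDayAgo(Instant.now()));
    }
}
```

The HQL could then add `AND accountStartDate <= :cutoff`, matching the original `DATE_SUB(NOW(), INTERVAL 1 DAY) >= AccountStartDate` condition without the post-query loop.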

NHibernate: how to select a sorted parent/child and retrieve only specific row_numbers

I have 2 tables: Parent and Child which have the following relation: Parent has many Childs.
public class Parent
{
public DateTime Timestamp;
public IList<Child> Child;
}
public class Child
{
public string Name;
}
I want to select both Parent and Child, sorted by Timestamp, and get only the rows between index x and y.
public IList<Parent> Get(DateTime from, DateTime to, int startRow, int count)
{
QueryOver<Parent>().Where(row => row.Timestamp >= from)
.And(row => row.Timestamp <= to).OrderBy(row => row.Timestamp).Asc.List();
}
I don't know how to get only the required rows.
Should I do it with QueryOver, or is it better to do it in HQL?
Thanks
I changed the relation, and instead of having Parent and Child I use only one table:
public class Info
{
public DateTime Timestamp;
public string Name;
}
In order to get all the records between the dates, sorted, and fetch them from index startRow to startRow + count, I used the following:
public IList<Info> GetInfo (DateTime fromDate, DateTime toDate, int startRow, int count)
{
IList<Info> result =
QueryOver<Info>()
.Where(row => row.Timestamp >= fromDate)
.And(row => row.Timestamp <= toDate)
.OrderBy(row => row.Timestamp).Asc
.Skip(startRow).Take(count).List();
return result;
}
The resulted SQL is:
SELECT * FROM Info WHERE timestamp >= :fromDate AND timestamp <= :toDate
ORDER BY timestamp ASC OFFSET :startRow ROWS FETCH NEXT :count ROWS ONLY