How to filter a date field with a Swift Vapor-Fluent query - sql

To avoid multiple inserts of the same person in a database, I wrote the following function:
func anzahlDoubletten(_ req: Request, nname: String, vname: String, gebTag: Date) async throws -> Int {
    try await Teilnehmer.query(on: req.db)
        .filter(\.$nname == nname)
        .filter(\.$vname == vname)
        .filter(\.$gebTag == gebTag)
        .count()
}
The function always returns 0, even if there are multiple records with the same surname, first name, and birthday in the database.
Here is the resulting SQL query:
[ DEBUG ] SELECT COUNT("teilnehmer"."id") AS "aggregate" FROM "teilnehmer" WHERE "teilnehmer"."nname" = $1 AND "teilnehmer"."vname" = $2 AND "teilnehmer"."geburtstag" = $3 ["neumann", "alfred e.", 1999-09-09 00:00:00 +0000] [database-id: psql, request-id: 1AC70C41-EADE-43C2-A12A-99C19462EDE3] (FluentPostgresDriver/FluentPostgresDatabase.swift:29)
[ INFO ] anzahlDoubletten=0 [request-id: 1AC70C41-EADE-43C2-A12A-99C19462EDE3] (App/Controllers/TeilnehmerController.swift:49)
If I query directly, I obtain:
lwm=# select nname, vname, geburtstag from teilnehmer;
nname | vname | geburtstag
---------+-----------+------------
neumann | alfred e. | 1999-09-09
neumann | alfred e. | 1999-09-09
neumann | alfred e. | 1999-09-09
neumann | alfred e. | 1999-09-09
So count() should return 4, not 0:
lwm=# select count(*) from teilnehmer where nname = 'neumann' and vname = 'alfred e.' and geburtstag = '1999-09-09';
count
-------
4
My DateFormatter is defined like so:
let dateFormatter = ISO8601DateFormatter()
dateFormatter.formatOptions = [.withFullDate, .withDashSeparatorInDate]
And finally, the "birthday" attribute in my model:
...
@Field(key: "geburtstag")
var gebTag: Date
...
I inserted the 4 Alfreds into my database using the model and Fluent, passing the birthday "1999-09-09" as a String, and Fluent inserted all records correctly.
But .filter(\.$gebTag == gebTag) seems to constantly return 'false'.
Is it at all possible to use .filter() with data types other than String?
And if so, what am I doing wrong?
Many thanks for your help
Michael

The problem you've hit is that you're storing only dates whereas you're filtering on dates with times. Unfortunately there's no native way to store just a date. However there are a few options.
The easiest way is to change the date field to a String and then use your date formatter (make sure you remove the time part) to convert the query value to a String.
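A minimal sketch of that approach, assuming the model field is changed to var gebTag: String and the same "yyyy-MM-dd" formatting is used on insert and on query:

// Format the incoming Date the same way the stored values were formatted.
let dateFormatter = ISO8601DateFormatter()
dateFormatter.formatOptions = [.withFullDate, .withDashSeparatorInDate]
let gebTagString = dateFormatter.string(from: gebTag)   // e.g. "1999-09-09"

let anzahl = try await Teilnehmer.query(on: req.db)
    .filter(\.$gebTag == gebTagString)   // compares String to String
    .count()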

I am guessing slightly here, but I suspect that your table was not created by a Migration? If it had been, your geburtstag field would include a time component, as this is the default, and you would have spotted the problem quickly.
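For illustration, a migration creating this table with Fluent 4's schema builder might look like the sketch below (the struct name and the .required constraints are assumptions); a Swift Date property is conventionally backed by a .datetime column, which is where the time component comes from:

struct CreateTeilnehmer: AsyncMigration {
    func prepare(on database: Database) async throws {
        try await database.schema("teilnehmer")
            .id()
            .field("nname", .string, .required)
            .field("vname", .string, .required)
            // Conventional mapping for a Swift Date: a datetime column with a time component.
            .field("geburtstag", .datetime, .required)
            .create()
    }

    func revert(on database: Database) async throws {
        try await database.schema("teilnehmer").delete()
    }
}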
In any event, the filter is actually filtering on the time component of gebTag as well as the date. This is why it is returning zero.
I suggest converting the geburtstag to a type that includes the time and ensuring that the time component is set to 0:00:00 when you store it. You can reset the time component to 'midnight' using something like this:
extension Date {
    var midnight: Date {
        Calendar.current.date(bySettingHour: 0, minute: 0, second: 0, of: self)!
    }
}
Then change your filter to:
.filter(\.$gebTag == gebTag.midnight)
Alternatively, just use Calendar's startOfDay(for:) method:
.filter(\.$gebTag == Calendar.current.startOfDay(for: gebTag))
I think this is the most straightforward way of doing it.
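Putting that together, the original function with the date normalized before filtering might look like this (a sketch; it assumes the stored gebTag values also have their time component at midnight):

func anzahlDoubletten(_ req: Request, nname: String, vname: String, gebTag: Date) async throws -> Int {
    // Normalize the incoming date to midnight so it matches what is stored.
    let tag = Calendar.current.startOfDay(for: gebTag)
    return try await Teilnehmer.query(on: req.db)
        .filter(\.$nname == nname)
        .filter(\.$vname == vname)
        .filter(\.$gebTag == tag)
        .count()
}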

Related

Process fields with nested arrays into strings with strcat_array for output in Kusto

I would like to process Azure AD audit logs into HTML tables/CSV files. The data contains nested sets of arrays that I would like to summarise into a comma-separated string.
E.g. data that looks like this:
{
  "TargetResources": [{
    "displayName": "Policy",
    "modifiedProperties": [
      {"displayname": "PolicySetting1"},
      {"displayname": "PolicySetting2"}
    ]
  }]
}
Would be processed into:
TargetResource | Policy
modifiedProps | PolicySetting1, PolicySetting2
mv-expand doesn't seem to work because some rows do not have modifiedProperties, so those rows get eliminated.
The only solution I have been able to find that gets close to what I am trying to do looks like this:
AuditLogs
| extend TargetResource = tostring(TargetResources[0].displayName)
| extend ModifiedProperty0 = tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[0].displayName)
| extend ModifiedProperty1 = tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[1].displayName)
| extend ModifiedProperty2 = tostring(parse_json(tostring(TargetResources[0].modifiedProperties))[2].displayName)
| extend ModifiedProperties = strcat(ModifiedProperty0,", ",ModifiedProperty1,", ",ModifiedProperty2)
This solution is limited in that it cannot handle arbitrary numbers of modifiedProperty values (it only works properly for exactly 3), which is a requirement for my purposes. I would like the solution to work when modifiedProperties does not exist and when there are 0-15 values.
Thank you for any help you can provide
If I understood your description correctly, you could use mv-apply (twice) to achieve that:
datatable(d: dynamic)
[
    dynamic({"TargetResources":[{"displayName": "Policy0","someOtherProperty":"hello world"}]}),
    dynamic({"TargetResources":[{"displayName": "Policy1","modifiedProperties":[{"displayname":"PolicySetting1"},{"displayname":"PolicySetting2"}]}]}),
    dynamic({"TargetResources":[{"displayName": "Policy2","modifiedProperties":[{"displayname":"PolicySetting3"},{"displayname":"PolicySetting4"}]}, {"displayName":"Policy3","modifiedProperties":[{"displayname":"PolicySetting5"},{"displayname":"PolicySetting6"}]}]}),
]
| mv-apply tr = d.TargetResources on (
    extend TargetResource = tr.displayName
    | mv-apply mp = tr.modifiedProperties on (
        extend propertyName = mp.displayname
        | summarize modifiedProps = strcat_array(make_set(propertyName), ", ")
    )
)
| project TargetResource, modifiedProps
| project TargetResource, modifiedProps
| TargetResource | modifiedProps                  |
|----------------|--------------------------------|
| Policy0        |                                |
| Policy1        | PolicySetting1, PolicySetting2 |
| Policy2        | PolicySetting3, PolicySetting4 |
| Policy3        | PolicySetting5, PolicySetting6 |

Eloquent: get AVG from all rows that have minimum timestamp

I want to get the user ID and the average score taken from the minimum-timestamp row of each category. Here's the table structure:
Skill Table
| id | user_id | category | score | timestamp |
|----|---------|----------|-------|-----------|
| 0  | 10      | a        | 11    | 12        |
| 1  | 10      | a        | 10    | 9         |
| 2  | 10      | b        | 12    | 10        |
| 3  | 10      | c        | 11    | 8         |
| 4  | 11      | a        | 8     | 9         |
| 5  | 11      | b        | 9     | 10        |
| 6  | 11      | c        | 10    | 8         |
| 7  | 11      | c        | 15    | 14        |
I want to get the result like this:
| user_id | AVG(score)                  |
|---------|-----------------------------|
| 10      | 11 (average of ids 1, 2, 3) |
| 11      | 9 (average of ids 4, 5, 6)  |
For now, I use a looping query for every user:
foreach ($userIds as $id) {
    // in some cases I need to get only specified ids, not all of them
    foreach ($category as $cat) {
        // get the minimum timestamp's score for each category
        $rawCategory = Skill::where("user_id", $id)
            ->where("timestamp", "<=", $start)
            ->where("category", $cat->id)
            ->orderBy("timestamp", "desc")
            ->first();
        if ($rawCategory) {
            $skillCategory[$cat->cat_name] = $rawCategory->score;
        }
    }
    // get the average score
    $average = array_sum($skillCategory) / count($skillCategory);
}
I want to write a better Eloquent query to get data like this with good performance (< 60 sec). Has anyone faced a similar problem and solved it? If so, can you please give me a link? Thanks

DDD - Update Value Object in Database with FK dependency

I am reading about DDD and I have learned that a Value Object is immutable; if you want to change it, you have to create a new one.
I have just read the information in How are Value Objects stored in the database?, which works well for an Address class, and I also read https://cargotracker.java.net/ and https://gojko.net/2009/09/30/ddd-and-relational-databases-the-value-object-dilemma/. But I want to do something different.
I am working on a billing system; it has 4 tables/classes:
TPerson - fields: id_person, name -> <<Entity>>
TMobile - fields: id_mobile, number -> <<Entity>>
TPeriod - fields: id_period, id_person, id_mobile, begin_date, end_date -> <<Value Object>> (I think, because the dates can be change)
TCall - field: id_call, id_period, etc... -> <<Value Object>>
The table TCall has many records. If I change the period record's dates (Value Object, table TPeriod), it will create another Period object and id_period will change (delete, insert a record), but the foreign key in table TCall will be violated. How could I implement the Period class? If I implement it as a value object, it will be immutable, and it turns out I will not be able to change anything whatsoever.
Thanks,
Fernando
If it's a value object, you don't have a period table/id.
A value object is just a grouping of certain fields. For example, a call might have a start time and an end time, and you could create a Duration value object from the start time and end time in the call table. In your Java code it would then be more convenient to talk about the call's duration instead of the start/end time separately.
However, it certainly could make sense to make Period an entity, but then period 201601 would probably always have the same start/end time and you wouldn't need to make changes to it. And if you did, you would make the changes to the entity directly, keeping the ids intact.
Thanks for your help.
I have this situation:
TPerson - fields: id_person = 1 , name = "John"
TMobile - fields: id_mobile = 100, number "555-0123"
TPeriod - fields: id_period = 1000, id_person = 1 , id_mobile = 1, begin_date = "2016-01-01", end_date = "2049-12-31"
TCall - field: id_call = 1, id_period = 1000
The period is a relation between TPerson and TMobile; in this example, John has a mobile between "2016-01-01" and "2049-12-31". The table TCall holds John's call records, but if I change the period's end_date (TPeriod table) to "2016-02-01", from my understanding the end_date will be inconsistent; it turns out I can't replace it because it's a value object, not an entity. I considered implementing it like this:
// Create a class DatePeriod
public class DatePeriod {
    private final Date begin_date;
    private final Date end_date;

    private DatePeriod(Date begin_date, Date end_date) {
        this.begin_date = begin_date;
        this.end_date = end_date;
    }

    public static DatePeriod of(Date begin_date, Date end_date) {
        return new DatePeriod(begin_date, end_date);
    }

    // implement equals / hashCode...
}
// Period class
public class Period {
    int id;
    // other mappings: id_person / id_mobile
    DatePeriod datePeriod;
}
Still, I will have to update the datePeriod attribute.
Thank you for your attention to this matter

Generating seed code from existing database in ASP.NET MVC

I wondered if anyone has encountered a similar challenge:
I have a database with some data that was ETL'ed (imported and transformed) into it from an Excel file. In my ASP.NET MVC web application I'm using the Code First approach and dropping/recreating the database every time the model changes:
#if DEBUG
Database.SetInitializer(new DropCreateDatabaseIfModelChanges<MyDataContext>());
#endif
However, since the data in the Database is lost, I have to ETL it again, which is annoying.
Since the DB will be dropped only on a model change, I will have to tweak my ETL anyway, I know that. But I'd rather change my DB seed code.
Does anyone know how to take the contents of the database and generate seed code, assuming that both Models and SQL Tables are up to date?
EDIT 1:
I'm planning to use the auto-generated Configuration.cs and its Seed method, and then use the AddOrUpdate() method to add data to the database. Here is Microsoft's tutorial on migrations (specifically the "Set up the Seed method" section).
Let's say we have a simple database table with 3750 records in it:
| Id | Age | FullName |
|------|-----|-----------------|
| 1 | 50 | Michael Jackson |
| 2 | 42 | Elvis Presley |
| 3 | 48 | Whitney Houston |
| ... | ... | ... |
| 3750 | 57 | Prince |
We want to create this table in our database using the auto-generated Configuration.cs file and its Seed() method.
protected override void Seed(OurDbContainer context)
{
    context.GreatestSingers.AddOrUpdate(
        p => p.Id,
        new GreatestSinger { Id = 1, Age = 50, FullName = "Michael Jackson" },
        new GreatestSinger { Id = 2, Age = 42, FullName = "Elvis Presley" },
        new GreatestSinger { Id = 3, Age = 48, FullName = "Whitney Houston" }
    );
}
This is what you should do. 3750 times!
But you already have this data in your existing database table, so we can use it to generate the Seed() code.
With the help of SQL string concatenation:
SELECT
CONCAT('new GreatestSinger { Id = ', Id ,', Age = ', Age ,', FullName = "', FullName ,'" },')
FROM GreatestSinger
will give us all the code needed to create 3750 rows of data.
Just copy/paste it into the Seed() method. Then, from the Package Manager Console:
Add-Migration SeedDBwithSingersData
Update-Database
Another way of seeding data is to run it as SQL in an Up migration.
I have code that will read a SQL file and run it:
using System;
using System.Data.Entity.Migrations;
using System.IO;

public partial class InsertStandingData : DbMigration
{
    public override void Up()
    {
        var baseDir = AppDomain.CurrentDomain
            .BaseDirectory
            .Replace("\\bin", string.Empty) + "\\Data\\Sql Scripts";
        Sql(File.ReadAllText(baseDir + "\\StandingData.sql"));
    }

    public override void Down()
    {
        // Add delete sql here
    }
}
So if your ETL generates SQL for you, then you could use that technique.
The advantages of doing it in the Up method are:
- It will be quicker than doing it using AddOrUpdate, because AddOrUpdate queries the database each time it is called to get any already existing entity.
- You are normally going from a known state (e.g. empty tables), so you probably don't need to check whether data exists already. NB: to ensure this, you should delete the data in the Down method so that you can tear all the way down and back up again.
- The Up method does not run every time the application starts.
The Seed method provides convenience - and it has the advantage (!?) that it runs every time the application starts.
But if you prefer to run the SQL from there, use ExecuteSqlCommand instead of Sql:
string baseDir = AppDomain.CurrentDomain.BaseDirectory.Replace("\\bin", string.Empty)
    + "\\Data\\Sql Scripts";
string path = Path.Combine(baseDir, "StandingData");

foreach (string file in Directory.GetFiles(path, "*.sql"))
{
    context.Database.ExecuteSqlCommand(File.ReadAllText(file));
}
References:
Best way to incrementally seed data
Preparing for database deployment
Database Initializer and Migrations Seed Methods

Using LINQ to pull collection until aggregate condition met

At a high level, I need a query that can pull a subset of records based on the sum of a column, just like Linq: How to query items from a collection until the sum reaches a certain value.
However, the key difference is that he's already got his records in an object, and I don't and can't. My table can have millions of records. If I build my query the way he did, I get this error:
"A lambda expression with a statement body
cannot be converted to an expression tree"
Which makes sense after researching it: LINQ can't turn the answer in the above-referenced question into valid SQL.
I'm going to make a hypothetical table that represents my situation.
| Order Id | Cookie Name    | Qty |
|----------|----------------|-----|
| 1        | Sugar          | 5   |
| 2        | Snickerdoodle  | 4   |
| 3        | Chocolate chip | 8   |
| 4        | Snickerdoodle  | 10  |
| 5        | Snickerdoodle  | 5   |
Given this sample, I need to write a query that grabs the first X orders of Snickerdoodle until the summed Qty exceeds an input parameter (i.e. if the user chooses 13, it would return records 2 and 4).
I'm using NHibernate.Linq because I'm more comfortable in LINQ. I'm completely open to ICriteria if the need arises.
As a side note, I'm interested in this as a concept as well as a direct problem. Even though I need a Sum, there has to be a way to do something akin to a TakeWhile that executes until a condition is met.
A pragmatic approach:
int needed = ...;
int actual = 0;
int page = 0;
const int pagesize = 20; // set to some sensible value, e.g. the pagesize of the grid shown to the user
var results = new List<CookieOrder>();

while (actual < needed)
{
    var partialResults = session.Query<CookieOrder>()
        .Where(c => c.Name == "Snickerdoodle")
        .OrderBy(c => c.Id)
        .Skip(page * pagesize)
        .Take(pagesize)
        .ToList();

    if (partialResults.Count == 0)
        break; // no more matching orders to read

    for (int i = 0; i < partialResults.Count && actual < needed; i++)
    {
        results.Add(partialResults[i]);
        actual += partialResults[i].Quantity; // accumulate the summed quantity
    }
    page++;
}
return results;