.NET 6 WebAPI - column mapping. How to do it fast and automatically? - orm

I'm looking for a solution for column mapping. The database is not mine; I am just making some additional software for an existing database where the column naming is terrible. I want to handle full CRUD, but right now I'm using DapperExtensions.Mapper, which maps only for update/insert; data selection with QueryAsync is not mapped, so I am getting zeros and nulls unless I add [MyPropertyName] aliases in the select query.
There are a lot of tables that contain a lot of columns:
The Documents table has 262 columns
Payments has two tables, the first with 90 columns, the second with 87
Customers - 171 columns
Is there any way to map these faster? I need a solution that works both ways (add/edit and select) with one configuration.

I think you can try to use EF Core DB First to map the columns and handle CRUD.
Reverse Engineering:
Reverse engineering is the process of scaffolding entity type classes
and a DbContext class based on a database schema. It can be performed
using the Scaffold-DbContext command of the EF Core Package Manager
Console (PMC) tools or the dotnet ef dbcontext scaffold command of
the .NET Command-line Interface (CLI) tools.
For more details you can refer to link1, link2.
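As a rough sketch (the connection string, table, and column names below are placeholders, not from your database), scaffolding produces a context whose fluent mapping translates the legacy database names into clean property names, and that same mapping is used for both queries and saves:
dotnet ef dbcontext scaffold "Server=.;Database=LegacyDb;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -o Models
The generated OnModelCreating then contains entries along these lines, which you can also adjust by hand afterwards:
// Illustrative fragment only - the scaffolder emits one block like this per table.
modelBuilder.Entity<Document>(entity =>
{
    entity.ToTable("TBL_DOC_MAIN");                                    // ugly legacy table name
    entity.Property(e => e.DocumentNumber).HasColumnName("DOC_NR_X");  // ugly legacy column name
    entity.Property(e => e.CreatedOn).HasColumnName("DT_CRT");
});
Because the mapping lives in the model rather than in each query, SELECTs, INSERTs and UPDATEs all go through the same configuration.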

Related

Automatically connect SQL tables based on keys

Is there a method to automatically join tables that have a primary-to-foreign-key relationship, rather than designating the joining columns explicitly?
The out-and-out answer is "no" - no RDBMS I know of will allow you to get away with not specifying columns in an ON clause intended to join two tables in a non-cartesian fashion, but it might not matter...
...because typically multi-tier applications these days are built with data access libraries that DO take into account the relationships defined in a database. Picking on something like Entity Framework: if your database exists already, then you can scaffold a context in EF from it, and it will make a set of objects that obey those relationships on the frontend code side of things.
Technically, you'll never write an ON clause yourself, because if you say something to EF like:
context.Customers.First(c => c.Id == 1)                  // this finds a customer
    .Orders                                              // this gets all the customer's orders
    .Where(o => o.Date > DateTime.UtcNow.AddMonths(-1)); // this filters the orders
You've got all the orders raised by customer id 1 in the last month, without writing a single ON clause yourself... EF has, behind the scenes, written it, but in the spirit of your question, where tables are related by a relation, we've used a framework that uses that relation to relate the data for the purposes that the frontend puts it to. All you have to do is use a data access library that does this, if you have an aversion to writing ON clauses yourself :)
It's a virtual certainty that there will be some similar ORM/mapping/data access library for your frontend language of choice - I just picked on EF in C# because it's what I know. If you're after scouting out what's out there, google for {language of choice} ORM (if you're using an OO language) - you mentioned Python; SQLAlchemy seems to be a popular one (but note, SO answers are not for recommending particular software).
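For illustration only (these class shapes are assumed, not taken from your schema), the scaffolded entities carry the relationship as navigation properties, which is what lets the snippet above avoid an explicit ON clause:
public class Customer
{
    public int Id { get; set; }
    public ICollection<Order> Orders { get; set; }   // one-to-many, inferred from the FK in the database
}

public class Order
{
    public int Id { get; set; }
    public DateTime Date { get; set; }
    public int CustomerId { get; set; }              // foreign key column
    public Customer Customer { get; set; }           // navigation back to the parent
}
When you traverse Customer.Orders, EF generates the JOIN (or a separate keyed query) from this metadata for you.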
If you mean can you write a JOIN at query time that doesn't need an ON clause, then no.
There is no way to do this in SQL Server.
I am not sure if you are aware of dbForge; it may help. It recognises joinable tables automatically in the following cases:
The database contains information that specifies that the tables are related.
Two columns, one in each table, have the same name and data type.
Forge Studio detects that a search condition (e.g. the WHERE clause) is actually a join condition.

Have a table for every basic types or merge them into one table

I have over 100 basic types in my project. Assume that all of them just contain an Id and a Title, so which of these approaches is better to use:
Approach 1: create a separate table for each of them
Approach 2: create a table for all of them and use another field as a discriminator
I am using MS SQL Server with the Entity Framework Code-First approach. I can't decide which approach I should choose.
I think the question is self-explanatory, but let me know if you need more details.
UPDATE1: Please do not refer me to this question. I have checked that one; it wasn't that helpful.
UPDATE2: Many of these tables have many relations to other tables, but some of them won't be used that much.
Use 100 types that inherit from an Id/Title base type and EF TPH (so the DB will have one table with a discriminator and programmers will have 100 types).
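A minimal sketch of that TPH setup, with hypothetical type names; mapping an inheritance hierarchy to a single table with a Discriminator column is the default Code-First behaviour:
public abstract class BasicType          // shared Id/Title base
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class Color : BasicType { }       // one CLR type per basic type...
public class Currency : BasicType { }    // ...roughly 100 of these in your case

public class AppDbContext : DbContext
{
    // A single DbSet for the whole hierarchy; EF creates one table plus a Discriminator column.
    public DbSet<BasicType> BasicTypes { get; set; }
}
Querying one of the types is then just context.BasicTypes.OfType<Color>(); EF adds the discriminator filter for you.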
Approach 1 will keep relational integrity and clean navigation properties in your models.
Also, your IDE will help you complete the right model names.
The tip: create an interface for all tables in order to reuse UI controls.
Edited
If you can find a business name for this table, like customer_data, then use a single table. If the name is only related to technology, like master_tables, split it into fully semantic classes.

Search database using sqlite versus NSPredicate

I'm building a database in SQLite with multiple tables. It will work like a tag-based search where CARS will be compared based on how many TAGS match between them. There will also be one layer used to categorize items, called MANUFACTURER. So a typical use case would be: the user selects MANUFACTURER1 (let's say Ford) as an input and MANUFACTURER2 (let's say Toyota) as an output, enters a CAR [the database compares TAGS of CARS between the two MANUFACTURERS], and fetches a CAR recommendation from MANUFACTURER2. I am using Core Data with entities for each, but this does not involve newly created objects, just what's in the original SQL database.
My question is - is it better to generate the search with SQLite code, or with NSPredicate/NSCompoundPredicate? Are there performance differences?
If you use Core Data with a SQLite store, an NSFetchRequest with a specific predicate will be resolved at the SQL level, so you don't need to add anything to it.
Core Data will abstract this for you. If you use Core Data you cannot use your own queries; just stick with NSFetchRequests and NSPredicates.
Maybe what you need is to import the db you have into the actual Core Data store.
Maybe I don't fully understand your question, but what's your goal?

Splitting the Db model in EF 4.1

Can anyone suggest how to handle the following scenario?
I have some DB tables that are used in all of my projects, so I am creating all these tables in every database (common tables + project tables). Now I have common data and business libraries that depend on the common tables, and I need to split these table entities into two different libs with a single DbContext.
I am using dependency injection to pass the db context.
I am using the following VS tools:
EF 4.1
VS 2010
Regards,
Hareen.
You can either use inheritance, like below:
public class CommonContext : DbContext
{
}
public class ProjectContext : CommonContext
{
}
Or composition, like this (see the answer I have given for your earlier question).
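Since the composition variant isn't shown above, one possible sketch of it (entity names here are made up) is to keep the common mapping in a helper that each project context calls from OnModelCreating, instead of inheriting from a common context:
public class User { public int Id { get; set; } }       // common entity
public class Invoice { public int Id { get; set; } }    // project-specific entity

public static class CommonModel
{
    // Registers the shared/common tables on whichever context is passed in.
    public static void MapCommonTables(DbModelBuilder modelBuilder)
    {
        modelBuilder.Entity<User>().ToTable("Users");
    }
}

public class ProjectContext : DbContext
{
    public DbSet<User> Users { get; set; }
    public DbSet<Invoice> Invoices { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        CommonModel.MapCommonTables(modelBuilder);  // compose the common mapping in
        base.OnModelCreating(modelBuilder);
    }
}
This keeps the common entity configuration in the shared library while each project owns a single DbContext.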

NHibernate: Dynamic Table Mapping

I have a scenario where I want to persist document info records to a table specific to the type of document, rather than a generic table for all records.
For example, records for Invoices will be stored in dbo.Doc_1000 and records for Receipts will be stored in dbo.Doc_2000, where 1000 and 2000 are auto-generated ids stored in a well-known table (dbo.TypeOfDoc).
Furthermore, each dbo.Doc_xxx table has a group of system columns (always the same) and could have a group of dynamic columns (metadata).
The dbo.Doc_xxx tables, and eventually their dynamic columns, are created at runtime.
Is this possible with NHibernate?
Thanks.
I hope I got your point. I am currently looking for a solution to a problem that looks similar: I want to integrate a feature in my application where the admin user can design an entity at runtime.
As far as I know, once the SessionFactory is configured and ready to use, there is no way to modify the mapping used by NHibernate. If you want to use a customized table structure that is configured, created and modified at runtime, you should have a place where a corresponding mapping lives, e.g. as an NHibernate mapping XML file, and you have to set up a new SessionFactory each time you change the database model to reflect those changes.
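As a rough sketch of that approach (the mapping file names and paths are assumptions for illustration), after generating a new dbo.Doc_xxx table and its .hbm.xml mapping you would rebuild the configuration and factory:
// Build (or rebuild) the factory from the current set of mapping files.
var cfg = new NHibernate.Cfg.Configuration();
cfg.Configure();                               // reads hibernate.cfg.xml / app config
cfg.AddFile("Mappings/Doc_1000.hbm.xml");      // mapping generated at runtime for dbo.Doc_1000
cfg.AddFile("Mappings/Doc_2000.hbm.xml");      // mapping generated at runtime for dbo.Doc_2000

var sessionFactory = cfg.BuildSessionFactory(); // expensive: only rebuild when the model changes
Sessions opened from the old factory keep working against the old model, so you also need a strategy for swapping the factory reference in your application.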