If I use SQLAlchemy ORM, does that mean I stop querying the backend DB directly?

If I use SQLAlchemy's ORM to create objects and store them, does that mean I then pretty much only retrieve data from the DB via SQLAlchemy as well? Will the underlying tables created by the SQLAlchemy ORM still be sane? Can I still query the DB directly and get useful results?

The ORM only creates and modifies the database records as they're defined. You'll be able to query them just as you normally would; using SQLAlchemy does not limit your normal database access in any way. SQLAlchemy can also log the queries it runs, so you can see exactly what it's doing. It's nothing like HTML generation, where you then don't want to look at the HTML it created.
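A minimal sketch of that point, assuming SQLAlchemy's 2.0-style declarative mapping (the engine URL, table and column names are made up): echo=True logs every statement the ORM emits, and the table it creates is an ordinary table you can still query with plain SQL.

from sqlalchemy import String, create_engine, text
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class User(Base):
    __tablename__ = "users"  # an ordinary table, nothing ORM-specific in the schema
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(50))

# echo=True writes every SQL statement the ORM runs to the log
engine = create_engine("sqlite:///:memory:", echo=True)
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(User(name="alice"))
    session.commit()

# The same data is reachable with plain SQL, outside the ORM
with engine.connect() as conn:
    print(conn.execute(text("SELECT id, name FROM users")).all())  # [(1, 'alice')]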

Related

Query Builder vs. ORM?

What is the difference between a query builder and an ORM that developers use in their backend server-side code for interacting with their database? It seems like a query builder already fulfills the goal of working with the database in the same language you're writing the backend code in, so I don't understand why an ORM comes into the picture.
A SQL query returns records (scalars) that do not automatically load more data if needed.
An ORM can return a Java object (say, Person) with working methods that can load more data from the database without you writing more queries explicitly.
For example, if you call person.getAddress(), it can return an Address object that is loaded from the database on the fly, without writing a new "select address" query.
Whatever is returned from a SQL query (builder) does not work like that.
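A sketch of the same lazy-loading idea, in Python with SQLAlchemy rather than Java (entity and column names are invented): accessing person.address after the initial query issues its own SELECT behind the scenes, which is exactly what a plain SQL result or query-builder row will not do.

from sqlalchemy import ForeignKey, String, create_engine
from sqlalchemy.orm import (DeclarativeBase, Mapped, Session, mapped_column,
                            relationship)

class Base(DeclarativeBase):
    pass

class Address(Base):
    __tablename__ = "addresses"
    id: Mapped[int] = mapped_column(primary_key=True)
    city: Mapped[str] = mapped_column(String(50))

class Person(Base):
    __tablename__ = "people"
    id: Mapped[int] = mapped_column(primary_key=True)
    address_id: Mapped[int] = mapped_column(ForeignKey("addresses.id"))
    # lazy="select" (the default) defers the load until the attribute is accessed
    address: Mapped[Address] = relationship(lazy="select")

engine = create_engine("sqlite:///:memory:", echo=True)
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Person(address=Address(city="Springfield")))
    session.commit()

    person = session.query(Person).first()  # SELECT hits the people table only
    print(person.address.city)              # a second SELECT fires here, on access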

How to figure out a MongoDB schema?

I'm new to MongoDB. Before I started with MongoDB I was working with SQL databases. I know that MongoDB is different from SQL and that in MongoDB you don't have to define a schema. But if you hand your database to a teammate and ask them to work with it, how can they figure out how to work with a collection? In the same situation in SQL, your teammate would open the database, look at the table, and understand how to work with it. Suppose that, according to your analysis, the User collection should hold the following data model:
But when you have not defined it anywhere, how can you explain it to others?
I hope that I have been able to express my meaning correctly.

Django ORM vs PostgreSQL raw SQL

I have my Django app running on server 1, which is connected to an Amazon RDS (PostgreSQL) database on server 2. On a day-to-day basis I have to interact with the database often to write/fetch/query data.
Currently I am following the Django ORM way across the app. I just wanted to know if there would be any improvement in performance if I used raw PostgreSQL queries through the django.db connection cursor instead of the ORM.
Following are the proposed ways for one sample query.
Django ORM
table_name.objects.all()
Raw PostgreSQL query using the django.db connection cursor
Can someone please suggest which of the above will lead to better performance, given that my database is running on RDS on another server?
Using a raw query will result in close to no performance difference. The ORM indeed uses a few cycles to construct the query, but that is insignificant compared to the work of sending the query, the database receiving, interpreting and running the query, and sending the results back.
In some cases the ORM might indeed generate an (over)complicated query, but that is not the case if you fetch all records. It sometimes requires ORM expertise (just like it requires expertise in SQL) to find a query that performs better.
You can furthermore let the Django ORM do the work for you by using .values(), which makes it more convenient to construct a DataFrame:
res = pd.DataFrame(table_name.objects.values())
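For comparison, a sketch of the raw-cursor route the question mentions alongside the ORM route (the model, app and table names here are placeholders, not from the question): both do one round trip to the database, so the difference is mainly in who builds the SQL string.

import pandas as pd
from django.db import connection

from myapp.models import TableName  # hypothetical model

# Django ORM: .values() returns dicts, which pandas turns into columns directly
orm_df = pd.DataFrame(TableName.objects.values())

# Raw SQL through the connection cursor: same round trip, hand-written SQL
with connection.cursor() as cursor:
    cursor.execute("SELECT * FROM myapp_tablename")
    columns = [col[0] for col in cursor.description]
    raw_df = pd.DataFrame(cursor.fetchall(), columns=columns)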

How should I store SQL queries in my API server?

I'm making an API server that requires SQL queries of something like 20-40 lines each.
I want it to be a simple server, so I'm using Node.js, Express, body-parser and a database connector.
In other environments, I understand that you either use final constants or import SQL files, but I don't know what the best method is in Node.
I think reading in an individual SQL file for every query will be slow, and since JavaScript does not support multiline strings, saving them in a JSON object seems tedious.
(I know ES6/Babel exists, but it seems like overkill for just one feature.)
So the question is: what's the best and most common way to store SQL queries in the context of Node/Express?
20-40 lines seems like a lot... Have you considered making those queries views or stored procedures?
The other thing is that if you are using at least Node.js 4.6.1, you can use template literals out of the box.

Multi-database schema export

I'm writing a project that will make use of MySQL or Derby (it must work in both situations). To solve this I'm using Hibernate and it works nicely, except for one situation: I have a couple of tables that contain cities and towns, with related data, so that if I know the town I can join and get the county, state and ZIP code. These tables hold thousands of rows. I'm not using Hibernate to handle them, but plain JDBC. These tables are not going to change over time; they're just for reference and for autocompletion needs. So what is the best way to reproduce these tables in both MySQL and JavaDB? Specifically, they must be generated on the first start of the app. I thought of creating a special format and saving everything to text files, then inserting them into the DB on first run, but is there a way to save some coding and use a tool that is already there?
I found many people saying to use CSV, but that doesn't fit here, as it doesn't keep information like the column types or lengths. The same goes for the XML that my MySQL tool (SQLyog) produces. Do you have other suggestions or tools?
I would use Hibernate, and map the entities as mutable=false, so Hibernate is not permitted to change anything. Create the schema using the standard Hibernate means, then use a StatelessSession to insert the records, making sure you've enabled batching.