I would really love to simplify a lot of my logic by using a foreign key constraint with ON DELETE CASCADE in one of my table definitions, using Android's native SQLite through the expo-sqlite module.
Through research I've found that you have to set PRAGMA FOREIGN_KEYS = ON before every DB transaction. The tools available in many other SQLite libraries are not available in expo-sqlite. As far as I can tell, I can only execute one query per transaction with this library (though I'd love to be proven wrong).
So currently a simple query might look like this:
export function insertEntity(entity) {
  return new Promise((resolve, reject) => {
    database.transaction(tx => {
      tx.executeSql(
        `INSERT INTO ${tableName}
           (firstName, lastName)
         VALUES (?, ?)`,
        [entity.fn, entity.ln],
        (_, result) => { resolve(result); },
        (_, err) => { reject(err); return true; }
      );
    });
  });
}
But when I try to include the PRAGMA FOREIGN_KEYS = ON; statement as part of that transaction, like this:
export function insertEntity(entity) {
  return new Promise((resolve, reject) => {
    database.transaction(tx => {
      tx.executeSql(
        `PRAGMA FOREIGN_KEYS = ON;
         INSERT INTO ${tableName}
           (firstName, lastName)
         VALUES (?, ?)`,
        [entity.fn, entity.ln],
        (_, result) => { resolve(result); },
        (_, err) => { reject(err); return true; }
      );
    });
  });
}
I start to get errors about my parameters not being able to map...
I also don't see a way to configure any settings when opening the DB, which would ideally be what I want, because I can't imagine why I would create a foreign key constraint and then not use it by default. But apparently that is how this API is designed...
There is this, but without context around where and when to use it I can't figure it out. (Do I just run it once when opening the DB? Do I run it before every transaction? Does my transaction then have to be a callback passed to that method?)
https://docs.expo.dev/versions/latest/sdk/sqlite/#executing-statements-outside-of-a-transaction
From the SQLite documentation (https://www.sqlite.org/foreignkeys.html):
Foreign key constraints are disabled by default (for backwards compatibility), so must be enabled separately for each database connection. (Note, however, that future releases of SQLite might change so that foreign key constraints are enabled by default. Careful developers will not make any assumptions about whether or not foreign keys are enabled by default but will instead enable or disable them as necessary.) The application can also use a PRAGMA foreign_keys statement to determine if foreign keys are currently enabled.
So you can run this any time you create a new connection to the DB.
const db = SQLite.openDatabase('dbName', version);

// once per connection
db.exec(
  [{ sql: 'PRAGMA foreign_keys = ON;', args: [] }],
  false,
  () => console.log('Foreign keys turned on')
);
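Putting that together: run the pragma once, right after opening the database, and reuse that connection everywhere. A minimal sketch (the helper name and warning message are mine, not part of the expo-sqlite API):

import * as SQLite from 'expo-sqlite';

// Hypothetical helper: open the database and enable foreign key
// enforcement once for this connection.
export function openDatabaseWithForeignKeys(name) {
  const db = SQLite.openDatabase(name);
  db.exec(
    [{ sql: 'PRAGMA foreign_keys = ON;', args: [] }],
    false,
    (err) => {
      if (err) console.warn('Could not enable foreign keys:', err);
    }
  );
  return db;
}

// Every transaction on this connection now enforces ON DELETE CASCADE.
const database = openDatabaseWithForeignKeys('dbName');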
I'm building a NestJS application that uses TypeORM to communicate with Postgres.
My tables are created dynamically, and data is inserted dynamically too. That's why I use raw queries instead of entities.
The problem is that some data in the tables is related, and I can't insert new data until the previous insert query has finished.
How do I check that query execution has finished?
Here is an example of the workflow I use. It works with small data sets but fails with big ones (10 000 000 entries and more):
export class Test {
  constructor(
    private readonly connection: Connection,
  ) {}

  public async insertData(table1, table2, arr1, arr2) {
    await this.insertInto(table1, arr1);
    // I want the second insertInto() to be executed only after I get
    // confirmation from the database that the insertInto() above has finished
    await this.insertInto(table2, arr2);
  }

  private async insertInto(table, data) {
    const queryRunner = this.connection.createQueryRunner();
    await queryRunner.connect();
    await queryRunner.startTransaction();
    const preparedData = [];
    // prepare data to be inserted as a raw query
    // ...
    try {
      await queryRunner.query(`INSERT INTO "${table}" VALUES ${preparedData}`);
      await queryRunner.commitTransaction();
    } catch (e) {
      await queryRunner.rollbackTransaction();
      throw new InternalServerErrorException(e, 'Error while executing custom query. Rollback transaction.');
    } finally {
      await queryRunner.release();
    }
  }
}
The desired result is to have some callback for queryRunner.query, like this: queryRunner.query('raw_sql', (err, res) => {})
Is it possible with typeorm?
Thanks
The way your code is written, the transaction commit will only happen after the insert finishes, which means that at that point you can also execute your next query. You don't need a callback, because you're already using the async/await syntax.
However, it seems that with very large inserts something goes wrong (some sort of query/connection timeout, or a server resource failure). Try debugging/printing the error to see what really happened.
I suggest you split the insert into multiple batches (of something like 1k records each), as in the sketch below.
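A minimal sketch of that batching idea, reusing the queryRunner from the question (the chunk helper and the 1000-row batch size are illustrative choices, not TypeORM APIs):

// Illustrative helper: split the prepared value tuples into fixed-size batches.
function chunk(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Insert ~1000 rows per statement instead of millions in one go.
// Each await resolves only after the server confirms that statement.
async function insertInBatches(queryRunner, table, preparedData) {
  for (const batch of chunk(preparedData, 1000)) {
    await queryRunner.query(`INSERT INTO "${table}" VALUES ${batch.join(', ')}`);
  }
}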
I have been trying to use a value from a JSON file that I wrote successfully using the fs.writeFile() function.
There are two test cases in the same fixture: one to create an ID and a 2nd to use that ID. I can write the ID to the JSON file using fs.writeFile(), and I try to use that ID by importing the JSON file, like var myid = require('../../resources/id.json').
The JSON file stores the correct ID of the current execution, but I get the ID of the first test execution during the 2nd execution.
For example, id:1234 is stored during the first test execution and id:4567 is stored during the 2nd. During the 2nd execution I need id:4567, but I get 1234. This is weird, isn't it?
I use it like this:
t.typeText(ele, myid.orid)
My JSON file contains only the id, like {"orid":"4567"}.
I am new to JavaScript and TestCafe; any help would really be appreciated.
Write File class
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile('resources/id.json', jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}

export default WriteIntoFile
I created 2 different classes to separate the create & update operations, and I call the create & update order functions in a single fixture in the test file.
Create Order class
class CreateOrder {
  // ...

  async createNewOrder() {
    // get the text of the created order and save the order id into the json file
    // ...
    const orId = await baseClass.getOrderId();
    new WriteIntoFile(orId)
    console.log(orId)
    // ...
  }
}

export default CreateOrder
Update Order class
var id = require('../../resources/id.json')

class UpdateOrder {
  async searchOrderToUpdate() {
    // Here, I get the old order id that was saved during the previous execution
    await t
      .typeText(baseClass.searchBox, id.orderid)
      .wait(2500)
      .click(baseClass.searchIcon)
      .doubleClick(baseClass.orderAGgrid)
    console.log(id.orderid)
    // ...
  }

  async updateOrder() {
    await this.searchOrderToUpdate()
    await t
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn)
  }
}

export default UpdateOrder
Test file
const newOrder = new CreateOrder();
const update = new UpdateOrder();

const role = Role(`siteurl`, async t => {
  await login('id')
  await t.wait(1500)
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
  await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder();
});
I'll reply to this, but you probably won't be happy with my answer, because I wouldn't go down the same path you proposed in your code.
I can see a couple of problems. Some of them might not be problems right now, but in a month, you could struggle with this.
1/ You are creating separate test cases that are dependent on each other.
This is a problem because of these reasons:
what if Should be able to create an Order doesn't run? Or what if it fails? Then Should be able to update an order fails as well, and this information is useless, because it wasn't the update operation that failed, but the fact that you didn't meet all the preconditions for the test case
how do you make sure Should be able to create an Order always runs before Should be able to update an order? There's no way! You can do it like this when one comes right before the other, and I think it will work, but at some point you'll move one test somewhere else and you'll be in trouble, spending hours debugging it. You have prepared a trap for yourself. I wrote this answer on this very topic; you can read it.
you can't run the tests in parallel
when I read your test file, there's no visible hint that the tests are dependent on each other. Therefore, as a stranger to your code, I could easily mess things up, because I have no way of knowing about the dependency without going deeper into the code. This is a big trap for anyone who comes to your code after you. Don't do this to your colleagues.
2/ Working with files when all you need to do is pass a value around is too cumbersome.
I really don't see a reason why you need to save the id into a file. A slightly better approach (still violating 1/) could be:
const newOrder = new CreateOrder();
const update = new UpdateOrder();

// use a variable to pass the orderId around
// it's also visible that the tests are dependent on each other
let orderId = undefined;

const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
  orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder(orderId);
});
Doing it like this also slightly remedies what I wrote in 1/, that is that it's not visible at first sight that the tests are dependent on each other. Now, this is a bit improved.
Some other approaches for passing data around are mentioned here and here.
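With either variant, UpdateOrder would accept the id as a parameter instead of requiring the JSON file. A minimal sketch, reusing the selectors from the question:

class UpdateOrder {
  async updateOrder(orderId) {
    await t
      .typeText(baseClass.searchBox, orderId)
      .click(baseClass.searchIcon)
      .doubleClick(baseClass.orderAGgrid)
      .typeText(baseClass.phNo, '1234567890')
      .click(baseClass.saveBtn)
  }
}

export default UpdateOrder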
Perhaps an even better approach is to use the t.fixtureCtx object:

const newOrder = new CreateOrder();
const update = new UpdateOrder();

const role = Role(`siteurl`, async t => {
  // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
  t.fixtureCtx.orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
  await update.updateOrder(t.fixtureCtx.orderId);
});
Again, I can at least see the tests are dependent on each other. That's already a big victory.
Now back to your question:
During 2nd test execution I need the id:4567 but I get 1234 this is weird, isn't it?
No, it's not weird. You required the file:
var id = require('../../resources/id.json')
and so it's loaded once; if you write into the file later, you won't read the new content unless you read the file again. require() is Node's function for loading modules, and it makes sense to load them only once.
This demonstrates the problem:
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// it's been loaded once, you won't get any other value here
console.log(idFile); // { id: 5 }
What can you do to solve the problem?
You can use fs.readFileSync():
const idFile = require('./id.json');
const fs = require('fs');
console.log(idFile); // { id: 5 }
const newId = {
'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));
// you need to read the file again and parse its content
const newContent = JSON.parse(fs.readFileSync('id.json'));
console.log(newContent); // { id: 7 }
And this is what I warned you against in the comment section: it's too cumbersome and inefficient, because you write to a file and then read from it just to get one value.
What you created is not very readable either:
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
  constructor(orderID) {
    const OID = {
      orderid: orderID
    }
    const jsonString = JSON.stringify(OID)
    fs.writeFile('resources/id.json', jsonString, err => {
      if (err) {
        console.log('Error writing file', err)
      } else {
        console.log('Successfully wrote file')
      }
    })
  }
}

export default WriteIntoFile
All these operations for writing into a file are in a constructor, but a constructor is not the best place for them; ideally you have only variable assignments in it. I also don't see much reason to create a new class when you are doing only two operations that easily fit on one line of code:
fs.writeFileSync('orderId.json', JSON.stringify({ orderid: orderId }));
Keep it as simple as possible. It's more readable like this than having to go to a separate file with the class and decipher what it does there.
I'm trying to insert data to my Wix collection using the API. I'm using a POST function and am posting a JSON document. It's supposed to simply add a new row to a database containing 1 value.
Here is the http-functions.js which I can trigger without issues (it's more or less a copy of the example from the documentation):
import {created, serverError} from 'wix-http-functions';
import wixData from 'wix-data';

export function post_peopleCount(request) {
  let options = {
    "headers": {
      "Content-Type": "application/json"
    }
  };
  // get the request body
  return request.body.text()
    .then( (body) => {
      // insert the item in a collection
      return wixData.insert("NumberOfPeopleDB", JSON.parse(body));
    } )
    .then( (results) => {
      options.body = {
        "inserted": results
      };
      return created(options);
    } )
    // something went wrong
    .catch( (error) => {
      options.body = {
        "error": error
      };
      return serverError(options);
    } );
}
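For context, I trigger it with a POST like the following sketch (the URL here is a placeholder for my site's /_functions/ endpoint, and the posted field name has to match a field in the collection):

// Placeholder URL; a real site exposes the function at
// https://<your-site>/_functions/peopleCount
fetch('https://example.wixsite.com/mysite/_functions/peopleCount', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ people: 42 }),
})
  .then((res) => res.json())
  .then((data) => console.log('inserted:', data))
  .catch((err) => console.error(err));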
The database, the JSON I am posting, and the permissions I have set for the collection all look correct (screenshots omitted). But the error I am getting is "WD_PERMISSION_DENIED", with a 500 Server Error, and the data does not get entered. Do you know why that might be?
Thanks!
My friend, it's not related to creating the collection from scratch; it's because of the permissions set on this collection once it was created. You fixed that without noticing :).
Permissions need to be given in order to perform such queries.
It turns out that if I create a new collection (= table) from scratch, it works. I also renamed the value field in the collection to people; maybe value is a reserved term. Nevertheless, now it seems to work.
So if you run into the same problem: try recreating the collection from scratch.
The critical thing for me, which has not been mentioned yet, is that you need to give the collection form-like permissions so that anyone has permission to submit data to it.
I want to fetch all items from a table into a collection but am getting an error that the table name is undefined. Here is my store:
db.version(1).stores({
  users: '++id',
  orgs: '++id',
  applications: '++id'
})
Then later here is my call:
db.orgs.toCollection().count(function (count) {
  console.log(count)
})
It gives the following error:
TypeError: Cannot read property 'toCollection' of undefined
But when I stop the debugger at the call and type in db.tables, sure enough:
1:Table {name: "orgs", schema: TableSchema, _tx: undefined, …}
_tx:undefined
hook:function rv(eventName, subscriber) { … }
name:"orgs"
Any help is appreciated - thanks.
UPDATE
I noticed that when I seeded the database on initial creation, I could fetch the data out. So I copied that code into my template. It still fails, however, so there must be something simple I'm missing. Here is that code:
import Dexie from '#/dexie.es.js'

export default {
  name: 'ListOrgs',
  data: () => {
    return {
      orgs: []
    }
  },
  methods: {
    populateOrgs: async function () {
      let db = await new Dexie('myDatabase').open()
      db.orgs.toCollection().count(function (count) {
        console.log(count)
      })
    }
  },
  mounted () {
    this.populateOrgs()
  }
}
Dexie has two modes:
Static - the most common one, used in most samples.
Dynamic - the schema is not specified in code.
Static Mode
//
// Static Mode
//
const db = new Dexie('myDatabase');
db.version(1).stores({myTable1: '++'});
db.version(2).stores({myTable1: '++, foo'});
db.myTable1.add({foo: 'bar'}); // OK - dexie knows about myTable1!
Dynamic Mode
//
// Dynamic Mode
//
const db = new Dexie('myDatabase');

// FAIL: db.myTable1.add({foo: 'bar'}); // myTable1 is unknown to the API.
// Here, you must wait for the db to open, and then access tables using the db.table() method:
db.open().then(db => {
  const myTable = db.table('myTable');
  if (myTable) {
    myTable.add({foo: 'bar'});
  }
}).catch(error => {
  console.error(error);
});
If you omit any version() specification, Dexie will just try to open an existing database with the same name, regardless of version or schema. But it won't create the implicit table properties on the db instance.
When Dynamic Mode is Useful
Dynamic mode can be useful when building an arbitrary database utility that should adapt to any IndexedDB database, such as a DB explorer. Dynamic mode can also be useful when the JavaScript code is by design not aware of the schema (what tables are expected to be queried and what indexes there are).
Benefits with Static Mode
No need to wait for db.open() to complete.
Automatic DB creation when needed. No complex app code to deal with database versioning.
Automatic DB population when needed.
Design Patterns in Static Mode
db.js
import Dexie from 'dexie';

//
// Let this module do several things:
//
// * Create the singleton Dexie instance for your application.
// * Declare its schema (and version history / migrations)
// * (Populate default data http://dexie.org/docs/Dexie/Dexie.on.populate)
//
export const db = new Dexie('myDatabase');

db.version(1).stores({
  users: '++id',
  orgs: '++id',
  applications: '++id'
});

db.on('populate', () => {
  return db.orgs.bulkAdd([
    {'foo': 'bar'},
  ]);
});
app.js
import {db} from './db';

// Wherever you use the database, include your own db module
// instead of creating a new Dexie(). This way your code will
// always make sure to create or upgrade your database, whichever
// of your modules comes first in accessing the database.
//
// You will not have to take care of creation or upgrade scenarios.
//
// Let Dexie do that for you instead.
//
async function countOrgs() {
  return await db.orgs.count();
}
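Applied to the component from the question, the static-mode pattern could look like this sketch (importing the shared db module instead of constructing Dexie inside the component; the toArray() call is just one way to read the rows):

import { db } from './db'

export default {
  name: 'ListOrgs',
  data: () => ({ orgs: [] }),
  methods: {
    async populateOrgs() {
      // No explicit open() needed: Dexie opens the database
      // lazily on first access and knows the declared schema.
      this.orgs = await db.orgs.toArray()
      console.log(await db.orgs.count())
    }
  },
  mounted () {
    this.populateOrgs()
  }
}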
I'm having difficulty coming up with the best possible way of storing todo list items in the backend. I was told that storing arrays and objects in the backend was not a good idea. I'm trying to clone a Google Keep inspired web app.
Some context: as soon as the user submits their todo list, it makes an axios call to the backend that iterates through an array of todo list items and saves them individually to the backend.
That inspired my current setup.
CREATE TABLE TODO (
  ID SERIAL PRIMARY KEY,
  title VARCHAR,
  user_id INTEGER REFERENCES users(ID)
);

CREATE TABLE TODO_ITEM (
  ID SERIAL PRIMARY KEY,
  item VARCHAR,
  complete BOOLEAN,
  todo_list_id INTEGER REFERENCES TODO(id)
);
My frontend call to the backend looks like this:

toDoArray.map(ele => {
  axios.post('users/postToDoListItems', {
    item: ele,
    complete: false,
    todo_list_id: ?
  })
})

axios.post('users/postToDoList', {
  title: title,
  toDoList: toDoList
})
I would like the TODO_ITEM table to reference my TODO table so that when the items are fetched to the frontend they can be grouped with the correct list.
With my current setup, is it possible to pass the referenced TODO ID to the TODO_ITEM table?
I think you made some small mistakes. I don't know how you query on your back-end, but note that before creating your tables you must make a connection to your DB, so it's worth checking this first.
If you already did that, please add more info about your problem. The right way to make queries is this:
var mysql = require('mysql');
var connection = mysql.createConnection({
  host     : 'localhost',
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});

connection.connect();

connection.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
  if (error) throw error;
  console.log('The solution is: ', results[0].solution);
});

connection.end();
And to query from the front-end you should do this:
<< front-end >>
axios.post('users/postToDoListItems', {
  item: ele,
  complete: false,
  todo_list_id: 1
})
<< back-end >>
route : postToDoListItems
// Note: interpolating req.body values directly is open to SQL injection;
// parameterized queries would be safer.
connection.query(`UPDATE todo_item SET complete = ${req.body.complete} WHERE id = ${req.body.id}`, function (error, results, fields) {
  if (error) throw error;
  res.json({ results, fields })
});
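To directly answer the original question for Postgres: yes. Insert the TODO row first, read its generated ID back, and use it when posting the items. A minimal sketch, assuming Express-style routes and the node-postgres (pg) client; the endpoint names follow the question, everything else (pool config, field names) is illustrative:

const { Pool } = require('pg');
const pool = new Pool();

// back-end: create the list first and return its generated ID
app.post('/users/postToDoList', async (req, res) => {
  const result = await pool.query(
    'INSERT INTO todo (title, user_id) VALUES ($1, $2) RETURNING id',
    [req.body.title, req.body.userId]
  );
  res.json({ todoListId: result.rows[0].id });
});

// back-end: each item references the list through that ID
app.post('/users/postToDoListItems', async (req, res) => {
  await pool.query(
    'INSERT INTO todo_item (item, complete, todo_list_id) VALUES ($1, $2, $3)',
    [req.body.item, req.body.complete, req.body.todo_list_id]
  );
  res.sendStatus(201);
});

On the front end, post the list first, then fan the items out with the returned id:

const { data } = await axios.post('users/postToDoList', { title });
await Promise.all(
  toDoArray.map(ele =>
    axios.post('users/postToDoListItems', {
      item: ele,
      complete: false,
      todo_list_id: data.todoListId,
    })
  )
);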