Testing flow using Prisma and GraphQL

I'm using Prisma with GraphQL for my backend server, and Jest & Supertest for testing GraphQL queries.
For the test environment, I'd like to set up an in-memory database to create and retrieve data during the test phase. My questions are:
What is a good way to configure a test DB for the Prisma client? Do I need to configure an in-memory DB in the beforeAll phase, like below?
// setup.js for tests
const sqlite3 = require('sqlite3');

let db;

beforeAll(async () => {
  db = new sqlite3.Database(':memory:', (err) => {
    if (err) {
      return console.error(err.message);
    }
    console.log('Connected to the in-memory SQLite database.');
  });
});
How and when can I run the Prisma commands? For instance, I need to run the commands below against the in-memory DB to create the tables. https://github.com/prisma/prisma/issues/732 seems to suggest that migrating an in-memory database doesn't make sense; in other words, it isn't supported?
npx prisma migrate save --experimental
npx prisma migrate up --experimental
npx prisma generate

I would suggest going for integration tests against an actual DB, as they are a good way to test your code against the queries/mutations that you perform.
I have created a repo here that performs integration tests against an actual DB, and this one is for GraphQL. These contain the entire Jest setup from start to finish.
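For what it's worth, a common pattern is a Jest global setup that points Prisma at a dedicated test database and applies the schema before the suite runs. A minimal sketch, assuming Postgres and a schema.prisma that reads its connection string from env("DATABASE_URL"); the URL and file name here are placeholders, not taken from the repo above:

// jest.globalSetup.js (hypothetical) -- referenced via `globalSetup` in jest.config.js
const { execSync } = require('child_process');

module.exports = async () => {
  // Point Prisma at a throwaway test database (placeholder URL).
  process.env.DATABASE_URL = 'postgresql://user:pass@localhost:5432/testdb';

  // Create the tables before any test runs, using the same commands as above.
  execSync('npx prisma migrate up --experimental', { stdio: 'inherit' });
  execSync('npx prisma generate', { stdio: 'inherit' });
};

Each test file can then instantiate PrismaClient as usual, and a matching globalTeardown can drop or truncate the test tables afterwards.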

Related

Django - creating automatically database in postgresql for tests, and deleting it after it

I wonder if there is a way to trigger the automatic creation of a database directly from the application's code, for test purposes only. In the perfect scenario it would look like this:
Run the tests in Django
The database is automatically created and populated (PostgreSQL)
After the tests pass, the whole database is deleted.
For now I only have populating the database with tables and dropping it after the tests, but I can't figure out whether it is possible to trigger the creation of the database itself, so the user isn't forced to do it manually. I looked for a solution but couldn't find a proper answer.
I tried to find a solution in the Django and PostgreSQL documentation but couldn't find a similar problem described.
Tests that require a database (namely, model tests) will not use your “real” (production) database. Separate, blank databases are created for the tests.
Regardless of whether the tests pass or fail, the test databases are destroyed when all the tests have been executed.
You can specify the test database in settings.py. See the Test Database docs.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'USER': 'mydatabaseuser',
        'NAME': 'mydatabase',
        'TEST': {
            'NAME': 'mytestdatabase',
        },
    },
}

How to achieve data-driven testing using Postman and Newman?

I have a requirement to automate APIs and also integrate them with an Azure DevOps pipeline. I am currently using Cypress and have been successful in doing so.
My client wants to use Postman for automation.
I have to run a single API with multiple combinations, like different sets of query parameters with different request bodies.
I know we can achieve data-driven testing with Cypress fixtures, but can we do the same with Postman? If yes, how can we integrate it with an Azure pipeline to run different combinations of data?
Data-driven testing in Postman is straightforward: the CSV header is the variable name.
Create a CSV file called data.csv:
age
1
2
3
Now reference the variable anywhere with {{age}}, e.g. let the request body be:
{
    "age": "{{age}}"
}
Now run the collection using Newman or the Collection Runner. For CI/CD integration you should use Newman. Export the collection as JSON and run:
newman run collection.json -d data.csv
That's it. You can only use one CSV file per run, but you can run other configurations by rerunning the command with a different data file specified via -d.
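For the Azure pipeline integration, one option is Newman's Node API, which turns the data-driven run into a plain script step. A minimal sketch (the file names are placeholders):

// run-tests.js (hypothetical) -- a pipeline step can just run `node run-tests.js`
const newman = require('newman');

newman.run({
  collection: require('./collection.json'), // the exported Postman collection
  iterationData: './data.csv',              // Newman runs one iteration per CSV row
  reporters: 'cli',
}, (err, summary) => {
  // Exit non-zero so the pipeline step fails when the run errors or an assertion fails.
  if (err || summary.run.failures.length > 0) {
    process.exit(1);
  }
});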

nuxt fully static - nuxt generate not building new data

I have this scenario since I'm looking at the new fully static release (https://nuxtjs.org/blog/going-full-static/). I have some issues upgrading to fully static because of my current workflow, which is as follows:
Currently, I call an API before the build to populate my data -> npx build -> npx export -> delete the stored data. That way, as I understand it, asyncData caches that data on the server side, and it works perfectly fine on the client side. This in turn "builds" my new pages if new data was received from my API during the npx export command.
However, the new nuxt generate only builds when a change is detected in my files. The thing is, my data is populated and then deleted, so nuxt generate will always skip the build phase since no changes are detected -> no new pages are generated from my new data.
I am thinking of the following, but it doesn't sound ideal:
Run a separate JS file to populate my API data -> then call npx generate -> then run another separate JS file to delete the API data, so that whenever npx generate runs, it detects the data from the API. But this will cause npx generate to always run the build phase, which defeats the intended purpose of it (?)
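For illustration, that option would amount to a single wrapper script that shells out to each step (a sketch; the script names are placeholders):

// generate-with-data.js (hypothetical wrapper for the flow described above)
const { execSync } = require('child_process');

execSync('node scripts/populate-data.js', { stdio: 'inherit' }); // fetch the API data to disk
execSync('npx nuxt generate', { stdio: 'inherit' });             // build + generate the pages
execSync('node scripts/delete-data.js', { stdio: 'inherit' });   // remove the fetched data again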
MILLION DOLLAR QUESTION
I am aware that npx generate is supposed to skip the build for quicker "exports" and page generation. I am wondering if there is a better/correct way to avoid the build (and save time, as intended) while still being able to generate my pages as new data comes in from my API.

How do I create a Realm DB for a React Native app?

Maybe I'm thinking of this the wrong way, but how do I create a pre-populated Realm DB? For example, let's say I want to create a dictionary with 1000 words and definitions in it. The user can change the definitions from within the app, but initially the DB will have default definitions.
Can I create a .realm file with the 1000 words and definitions and include it in my app?
FYI: I am using Realm with React Native and I am currently testing using emulator -avd CordovaAVD to launch my Android emulator.
I think I've figured this out, so I'll post my solution in case anyone else finds it useful.
I have a function that will populate a Realm DB. Once I've run that function, however, I want to grab that static DB and use it instead of regenerating the DB every time the app starts. That's what prompted my effort. However, these steps will also be useful if you just want to back up a DB.
I primarily test using an emulator, but I think these steps will also work if you test on an actual device.
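For context, the populate function is nothing special: it is just ordinary Realm writes in a single transaction, something like this minimal sketch (the 'Word' schema and its fields are placeholders, not my actual app code):

// seed.js (hypothetical) -- writes the default entries into a fresh Realm file
const Realm = require('realm');

const WordSchema = {
  name: 'Word',
  properties: { word: 'string', definition: 'string' },
};

function seed(entries) {
  const realm = new Realm({ schema: [WordSchema] });
  realm.write(() => {
    // A single transaction for all entries is much faster than one write per entry.
    entries.forEach((entry) => realm.create('Word', entry));
  });
  realm.close();
}

seed([{ word: 'realm', definition: 'a kingdom or domain' } /* ...remaining defaults */]);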
To grab the Realm DB from the emulator:
Find the path to your realm file within the phone by adding this somewhere in your code:
let YourRealmDB = new Realm({schema: [YourSchema]});
console.log('YourRealmDB path =', YourRealmDB.path);
The path will be something like
/data/data/(package.name)/files/(filename).realm
"filename" will probably be "default"
From the command line, run
adb exec-out run-as (package.name) cat files/(filename).realm > (filename).realm
This will copy your Realm db to your current directory
Now, to use that DB in your app:
Create a realm object:
let YourRealmDB = new Realm({schema: [YourSchema]});
Set the object's path to your DB:
YourRealmDB.defaultPath = 'path/to/your/db/(filename).realm';
NOTE: If you use a 'local' DB like this and your app performs a write operation, it will not write to your local DB. It will write to the DB at YourRealmDB.path. So if you want to copy or view the updated DB, you will need to run adb exec-out run-as (package.name) cat files/(filename).realm > (filename).realm again to get the most current version of your DB.
I hope that helps. It took me quite a while to piece that all together.

What is the fastest way to clear out a database using NHibernate?

I intend to perform some automated integration tests. This requires the db to be put back into a 'clean state'. Is this the fastest/best way to do this?
var cfg = new Configuration();
cfg.Configure();
cfg.AddAssembly("Bla");
new SchemaExport(cfg).Execute(false, true, false);
var se = new SchemaExport(cfg);
se.Drop(false, true);
se.Create(false, true);
Yes, it almost is. You don't have to create a new Configuration object before each test; you can reuse it once it has been created. You can make it faster by using an in-memory database like SQLite for testing.
My integration tests do SessionFactory creation in a base class constructor, and SchemaExport in the test fixture setup. I also test against SQLite running as an in-memory database for extra speed.
Ayende gave a good example of this approach in this blog post. Tobin Harris' article includes timing data for drop/create vs. delete.
I use Proteus, an open-source library. Before each test there is an automatic save of your data set, and the set you want to test with (an empty DB, for example) is loaded. After each test the data set is reloaded, and after the last test the data that was present in the database before the tests is restored.
Since NHibernate is database-independent, another interesting option, if you are having it generate your database, is to run your tests against something like SQLite in memory. Things will run MUCH faster that way.
Here is an article showing how to do this with ActiveRecord, but you can take the concept and use it without ActiveRecord.
And here is a discussion if you're having trouble getting it to work (I did at first).