TypeORM Migration: File must contain a TypeScript / JavaScript code and export a DataSource instance - migration

When trying to autogenerate migrations I get the following error.
File must contain a TypeScript / JavaScript code and export a DataSource instance
This is the command that I am running:
typeorm migration:generate projects/core/migrations/user -d db_config.ts -o
And my db_config.ts file looks like this:
import { DataSource } from "typeorm";
const AppDataSource = new DataSource({
  type: "postgres",
  host: process.env.PGHOST,
  port: 5432,
  username: process.env.PGUSER,
  password: process.env.PGPASSWORD,
  database: process.env.PGDATABASE,
  entities: ["./projects/**/entities/*.ts"],
  migrations: ["./projects/**/migrations/**.js"],
  synchronize: true,
  logging: false,
});
export default AppDataSource
My current file structure looks like this:
back_end
-- projects
--- index.ts
--- db_config.ts
And my index.ts file looks like this:
import express from "express";
import { AppDataSource } from "./data-source";
import budget_app from "./projects/budget_app/routes";
export const app = express();
const port = 3000;
AppDataSource.initialize()
  .then(() => {
    console.log("Data Source has been initialized!");
  })
  .catch((err) => {
    console.error("Error during Data Source initialization", err);
  });
// export default AppDataSource;
app.get("/", (req, res) => {
  res.send("Hello World!!!!");
});
app.use("/budget_app", budget_app);
app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});
I am also running this in a docker container along with my postgres database. I have confirmed that the connection works because if I do synchronize=true it will create the table just fine. I just can't create the migration.
So I'm confused and don't know where to go from here to fix the issue. Thanks for your help in advance!

I had trouble with migrations in TypeORM, and finally found a solution that works consistently.
For me, building and then using the JS datasource didn't work, so I'm sharing my solution for those who still struggle with TypeORM migrations.
Here is my step by step solution:
create your datasource config in some file like datasource.config.ts,
mine is like this:
import * as mysqlDriver from 'mysql2';
import {DataSourceOptions} from 'typeorm';
import dotenv from 'dotenv';
dotenv.config();
export function getConfig() {
  return {
    driver: mysqlDriver,
    type: 'mysql',
    host: process.env.MYSQL_HOST,
    port: parseInt(process.env.MYSQL_PORT, 10),
    username: process.env.MYSQL_USER,
    password: process.env.MYSQL_PASSWORD,
    database: process.env.MYSQL_DB,
    synchronize: false,
    migrations: [__dirname + '/../../typeorm-migrations/*.{ts,js}'],
    entities: [__dirname + '/../**/entity/*.{ts,js}'],
  } as DataSourceOptions;
}
create a file with name like migration.config.ts
the implementation is like this:
import { DataSource } from 'typeorm';
import { getConfig } from './datasource.config'; // the config defined in datasource.config.ts

const datasource = new DataSource(getConfig());
datasource.initialize();
export default datasource;
now you can define your migration commands in package.json file
"migration:up": "./node_modules/.bin/ts-node ./node_modules/.bin/typeorm migration:run -d config/migration.config.ts",
"migration:down": "./node_modules/.bin/ts-node ./node_modules/.bin/typeorm migration:revert -d config/migration.config.ts"
By running yarn run migration:up you will be able to run the migrations defined in your typeorm-migrations folder.
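For reference, a migration file in the typeorm-migrations folder is just a class implementing MigrationInterface; here is a minimal sketch (class, timestamp and table names are illustrative):
import { MigrationInterface, QueryRunner } from 'typeorm';

export class CreateUsersTable1680000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    // forward migration: create the table
    await queryRunner.query(
      'CREATE TABLE users (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255) NOT NULL)'
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    // rollback: drop the table again
    await queryRunner.query('DROP TABLE users');
  }
}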

I was running into the same issues (typeorm 0.3.4). I was just using npx typeorm migration:show -d ./src/data-source.ts and getting the same error as above (File must contain a TypeScript / JavaScript code and export a DataSource instance). Generating the migration file itself somehow worked, but running/showing the migrations did not.
My datasource looks like this
export const AppDataSource = new DataSource({
  type: 'postgres',
  url: process.env.DATABASE_URL,
  logging: true,
  entities: ['dist/entities/*.js'],
  migrations: ['dist/migrations/*.js'],
});
because my tsc output lives in /dist. So based on the comments above I started using the datasource file that was generated from TypeScript and the error message changed:
npx typeorm migration:run -d ./dist/appDataSource.js
CannotExecuteNotConnectedError: Cannot execute operation on "default" connection because connection is not yet established.
So I looked into the database logs and realized it wanted to connect to postgres as the standard unix user; it wasn't honoring the connection string in the datasource code. I had to supply all environment variables to the command as well, and then it worked:
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/tsgraphqlserver npx typeorm migration:run -d ./dist/appDataSource.js

I had the same issue when using a .env file (if you don't have a .env file, this answer probably is irrelevant for you).
It seems that the CLI does not pick up environment variables from dotenv, so you have to load them yourself. E.g., using the dotenv library, put this at the top of your data-source file:
import * as dotenv from 'dotenv';
dotenv.config();
// export const AppDataSource = new DataSource()...
Alternatively, provide real environment variables when running the script:
PGHOST=... PGUSER=... PGDATABASE=... PGPASSWORD=... typeorm migration:generate ...

I am actually running into the same issue.
I was able to resolve it by using *.js instead of *.ts
Please try something like this:
tsc && typeorm migration:generate -d db_config.ts projects/core/migrations/user
My tsconfig.json looks like this.
{
  "compilerOptions": {
    "target": "esnext",
    "module": "CommonJS",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "outDir": "./build",
    "removeComments": false,
    "resolveJsonModule": true,
    "esModuleInterop": true
  }
}
I recommend you open an issue on the typeorm github repo, I think it might be a bug.

Add the following to package.json scripts section:
"typeorm": "typeorm-ts-node-commonjs",
"migration:run": "ts-node ./node_modules/typeorm/cli.js migration:run -d ./src/data-source.ts",
"schema:sync": "npm run typeorm schema:sync -- -d src/data-source.ts",
"migration:show": "npm run typeorm migration:show -- -d src/data-source.ts",
"migration:generate": "npm run typeorm migration:generate -- -d src/data-source.ts",
"migration:create": "npm run typeorm migration:create"
You can then use npm run migration:create -- src/migration for example
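These scripts assume src/data-source.ts exports a DataSource instance; a minimal sketch of such a file (credentials and paths are illustrative):
// src/data-source.ts
import 'reflect-metadata';
import { DataSource } from 'typeorm';

export const AppDataSource = new DataSource({
  type: 'postgres',
  host: 'localhost',
  port: 5432,
  username: 'postgres',
  password: 'postgres',
  database: 'mydb',
  entities: ['src/entities/*.ts'],
  migrations: ['src/migrations/*.ts'],
});

export default AppDataSource;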


TypeORM v0.3.6 NestJS migration

I have Typeorm loaded asynchronously
app.module.ts:
TypeOrmModule.forRootAsync({
  inject: [ConfigService],
  useFactory: getTypeormConfig
}),
typeorm.config.ts:
export const getTypeormConfig = async (config: ConfigService) => ({
  type: 'postgres' as const,
  host: config.get<string>('TYPEORM_HOST'),
  port: config.get<number>('TYPEORM_PORT'),
  password: config.get<string>('TYPEORM_PASSWORD'),
  ....
It seems that the TypeORM documentation is outdated, as this post points out: https://wanago.io/2022/07/25/api-nestjs-database-migrations-typeorm/
I'm trying to follow this example. Created a separate second configuration for CLI migration typeorm-migration.config.ts at the root of the project:
import { BusinessLog } from 'src/yandex-ndd-api-client/entity/business-log.entity';
import { ClientStatus } from 'src/yandex-ndd-api-client/entity/client-status.entity';
import { Item } from 'src/yandex-ndd-api-client/entity/item.entity'; ...
export default new DataSource({
  type: 'postgres' as const,
  host: 'postgres',
  port: 5432,
  username: 'postgres',
  database: 'childrensworld',
  subscribers: [TruckSubscriber, RequestSubscriber],
  entities: [ Request, Item, BusinessLog, Place, ClientStatus, ....
I also wrote in package.json:
"typeorm": "ts-node ./node_modules/typeorm/cli", "typeorm:run-migrations": "npm run typeorm migration:run -- -d ./typeorm-migration.config.ts", "typeorm:generate-migration": "npm run typeorm -- -d ./typeorm-migration.config.ts migration:generate ./migrations/test_migration_name", "typeorm:create-migration": "npm run typeorm -- migration:create ./migrations/test_migration_name", "typeorm:revert-migration": "npm run typeorm -- -d ./typeorm-migration.config.ts migration:revert"
Launching npm run typeorm:generate-migration --name=CreatePost as in the example, I get:
Error during migration run: Error: Unable to open file: "E:\Programming\Nodejs\..........\typeorm-migration.config.ts". Cannot find module 'src/yandex-ndd-api-client/entity/business-log.entity' Require stack: - E:\Programming\Nodejs\LandPro\сhildsworld\Projects\tests\test_migrationTypeOrm\typeorm-migration.config.ts
It's as if it cannot read the entities from typeorm-migration.config.ts. The example says nothing about this. Does the CLI migration config (typeorm-migration.config.ts) need to be hooked up somewhere else?
This is our typeorm cmd. You might need the -r tsconfig-paths/register.
typeorm": "ts-node --transpile-only -r tsconfig-paths/register ./node_modules/typeorm/cli.js --dataSource src/typeorm/typeorm.config.ts",

How to use dotenv in SvelteKit project?

I'm trying to use dotenv.config() in a SvelteKit project.
I can run npm run build successfully. But when I try to start the server (using node build), it throws Error: Dynamic require of "fs" is not supported.
I tried to comment out the dotenv part in src/routes/test.js and build again, and this time the server started without any errors. (I created the project with npm init svelte@next without TypeScript, and except for the code here, nothing else is changed.)
How should I use dotenv here to load environment variables at runtime?
svelte.config.js
import node from '@sveltejs/adapter-node';

const config = {
  kit: {
    adapter: node(),
    target: '#svelte'
  }
};

export default config;
/src/routes/test.js
import dotenv from 'dotenv';
dotenv.config();
export function get() {
  return {
    body: {
      test: process.env.TEST
    }
  }
}
.env
TEST=123
No need to explicitly load dotenv.
Vite uses dotenv
https://vitejs.dev/guide/env-and-mode.html#env-files
You can access your variable via import.meta.env.VITE_MY_VAR
Importantly, your env variables must be prefixed with VITE_ in order to be exposed. And if you are already running npm run dev, quit it and start it again.
That worked for me.
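For example, with the variable renamed to carry the VITE_ prefix (a sketch):
.env:
VITE_TEST=123

/src/routes/test.js:
export function get() {
  return {
    body: {
      // statically replaced by Vite at build time
      test: import.meta.env.VITE_TEST
    }
  }
}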
As of a few weeks ago, SvelteKit has a built-in way to handle environment variables:
https://kit.svelte.dev/docs/modules#$env-dynamic-private
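A minimal sketch of that module in use (the endpoint shape follows the newer SvelteKit routing; adjust to your version):
// src/routes/test/+server.js
import { env } from '$env/dynamic/private';
import { json } from '@sveltejs/kit';

export function GET() {
  // env is read at runtime, not baked in at build time
  return json({ test: env.TEST });
}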
I solved the problem with env-cmd (https://www.npmjs.com/package/env-cmd) by adding env-cmd to the beginning of svelte-kit dev, svelte-kit preview and node build.
Also, use process.env['TEST'] instead of process.env.TEST since process.env.TEST is replaced with ({}) by vite. (https://github.com/vitejs/vite/issues/3176)
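The package.json scripts might then look like this (a sketch, assuming env-cmd picks up ./.env by default):
"scripts": {
  "dev": "env-cmd svelte-kit dev",
  "preview": "env-cmd svelte-kit preview",
  "start": "env-cmd node build"
}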
This is what I did:
vite has a special config option for server port.
// import adapter from '@sveltejs/adapter-static';
import adapter from '@sveltejs/adapter-node';
import preprocess from 'svelte-preprocess';
import path from 'path';
import dotenv from 'dotenv-flow';

dotenv.config();

/** @type {import('@sveltejs/kit').Config} */
const config = {
  // Consult https://github.com/sveltejs/svelte-preprocess
  // for more information about preprocessors
  preprocess: preprocess(),
  kit: {
    // hydrate the <div id="svelte"> element in src/app.html
    // target: '#svelte',
    /*
    adapter: adapter({
      // default options are shown
      pages: 'build',
      assets: 'build',
      fallback: 'index.html'
    }),
    */
    adapter: adapter({
      out: './build',
      precompress: true
    }),
    vite: {
      resolve: {
        alias: {
          $components: path.resolve('./src/components'),
          $stores: path.resolve('./src/stores'),
          $api: path.resolve('./src/api')
        }
      },
      build: {
        minify: true
      },
      server: {
        port: process.env.PORT || 3000
      }
    }
  }
};

export default config;
I have .env for defaults (dev etc) and .env.local that is ignored in .gitignore for production (keys, etc).
When .env.local is present it uses that port.
Edit: this does not work with the node adapter in production; it only works with npm run dev. I think we need to declare PORT some other way.
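For production with adapter-node, the built server reads the PORT environment variable at startup, so something like this should work instead (a sketch):
PORT=3000 node build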

pass environment variables during babel build phase for importing different files

I'm building a web (react with webpack & babel) and mobile apps (react-native with expo) for a project. I therefore created a common library for business logic and redux/api library.
Some code will be slightly different between web and mobile. In my case it's localStorage vs AsyncStorage, which I use for authentication among other things...
I'm trying to pass an environment variable at the build stage to switch imports of certain files, so that the correct file is loaded for each build. The library is simply path-linked (i.e. no pre-build of my library; I just do import '../mylib'). For example:
if (PLATFORM === 'mobile') {
  import StorageModule from './mobile-storage-module'
} else {
  import StorageModule from './web-storage-module'
}

export default StorageModule
Try 1
Using @babel/preset-env to say whether the build is mobile or web, so that it imports different libraries depending on the build, like so:
My .babelrc has this:
{
  "presets": [
    [
      "@babel/preset-env",
      {
        "platform": "mobile"
      }
    ]
  ]
}
And then in local storage file I do this:
export default () => {
  const platform = process.env.platform
  if (platform === 'mobile') {
    return import './storage-modules/storage-mobile'
  }
  return import './storage-modules/storage-web'
}
That didn't work, and this also didn't work for me.
Try 2
I installed react-native-dotenv and created a .env file with:
PLATFORM=mobile
And set the plugin in my .babelrc:
{
  "presets": [
    "babel-preset-expo",
    "react-native-dotenv"
  ]
}
And in my example file, I tried this:
import { PLATFORM } from 'react-native-dotenv'
export default PLATFORM === 'mobile' ? import './storage-modules/storage-mobile' : import './storage-modules/storage-web'
But now my build doesn't work. Any idea how I do dynamic imports during the build process that works for babel in react-native app and webpack build (also uses babel)?
First, @babel/preset-env does not do what you think it does. It is not for specifying your own variables; it is a preset that automatically applies the right transforms and polyfills for the browsers you want to support.
The easiest way to get environment variables is with the webpack define plugin (which is part of webpack, so no need to install anything extra)
Just add this to your webpack config.
plugins: [
  new webpack.DefinePlugin({
    'process.env': {
      // values must be JSON-stringified, otherwise they are injected as raw code
      platform: JSON.stringify('mobile'),
    },
  }),
],
Next, you can't use normal import statements inside of ifs.
import gets resolved before any code runs, either on build by webpack, or in supported environments on script load.
To import something on runtime, you need to use dynamic imports.
Here is an example of how this could look:
export default new Promise(async resolve => {
  resolve(
    process.env.platform === 'mobile'
      ? (await import('./mobile.js')).default
      : (await import('./desktop.js')).default
  );
});
You can now import from this file like you normally would, but be aware that the default export is a promise.
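Consuming it might look like this (file name illustrative):
import storageModulePromise from './storage-module';

(async () => {
  // wait for the platform-specific module to resolve
  const storageModule = await storageModulePromise;
  // use storageModule here
})();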
As your question's title says "during babel build phase", I assume you would like to make different builds for desktop and mobile (not one build for both that loads the needed modules dynamically at runtime). So I would go about it like this:
Define the run scripts in package.json for desktop and mobile:
"scripts": {
"devmobile": "cross-env NODE_ENV=development PLATFORM=mobile webpack --progress",
"dev": "cross-env NODE_ENV=development webpack --progress",
}
... or you can create two different webpack.config.js files for desktop and mobile builds but I think the above is easier...
Then npm run devmobile to build for mobile and npm run dev for desktop.
Since I'm on Windows I use the cross-env package but this is the recommended way to be OS independent.
Then I would use Webpack's NormalModuleReplacementPlugin:
(based on this example)
In your webpack.config.js:
// defining the wanted platform for the build (coming from the npm run script)
const targetPlatform = process.env.PLATFORM || 'desktop';

// then use the plugin like this
plugins: [
  new webpack.NormalModuleReplacementPlugin(/(.*)-PLATFORM(\.*)/, function(resource) {
    resource.request = resource.request.replace(/-PLATFORM/, `-${targetPlatform}`);
  }),
]
...then if you have these two files:
./storage-modules/storage-mobile.js
./storage-modules/storage-desktop.js
import the needed one in your script like this:
import './storage-modules/storage-PLATFORM';
This way the generated build will only contain the needed file for the current PLATFORM used for the build process.
Another possible solution could be the ifdef-loader but I haven't tested it. Maybe worth to try, seems easy.
If you want one build though and import the needed module dynamically, you could do something like this in your app.js (or whatever):
// this needs to be defined when the app is running
const targetPlatform = process.env.PLATFORM || 'desktop';

import(
  /* webpackChunkName: "[request]" */
  `./storage-modules/storage-${targetPlatform}`
).then(storageModule => {
  // use the loaded module
});

or:

(async () => {
  const storageModule = await import(
    /* webpackChunkName: "[request]" */
    `./storage-modules/storage-${targetPlatform}`
  );
  // use the loaded module
})();
For this to work Babel has to be configured.
More on Webpack with dynamic imports here.
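The Babel side is usually just enabling the dynamic import syntax; a sketch of the .babelrc addition (plugin name as of Babel 7; older setups used babel-plugin-syntax-dynamic-import):
{
  "plugins": ["@babel/plugin-syntax-dynamic-import"]
}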
You can use transform-inline-environment-variables to pass PLATFORM to Babel:
"build-mobile": "PLATFORM=mobile ...",
"build-app": "PLATFORM=app ...",

TypeORM with React Native

I created a new dummy app with react-native init test, and then followed the instructions to add typeorm. In my App.js I included import {getManager} from 'typeorm', and then ran react-native run-ios.
I see the following error in metro-bundler:
Error: Unable to resolve module path from /Users/amit/Code/test/node_modules/typeorm/platform/PlatformTools.js: Module path does not exist in the Haste module map
Here's a sample repository to show the problem.
Not sure if I missed something in the setup! Any help is really welcome!
Unfortunately, importing from the 'typeorm' module does not work, because react-native projects do not exactly use the Node platform. Imports from 'typeorm/browser' will work. Here is a sample project: https://github.com/typeorm/react-native-example
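For example, a sketch of the changed import:
// works in react-native, unlike importing from 'typeorm'
import { getManager, createConnection } from 'typeorm/browser';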
Make sure you create a connection object that does not use any references to project file system. Avoid using something like:
import { CountSession } from '../biopro-mobile-database/entities/count_session';

const connection = await createConnection({
  name: 'liteDb_3',
  type: 'react-native',
  database: 'biopro_mobile.sqlite',
  location: 'default',
  synchronize: false,
  logging: true,
  entities: ["../biopro-mobile-database/entities/**/*.ts"],
})
Avoid entities: ["../biopro-mobile-database/entities/**/*.ts"].
Instead use something like:
import { EquipmentCounted } from '../biopro-mobile-database/entities/equipment_counted';
import { CountSession } from '../biopro-mobile-database/entities/count_session';

const connection = await createConnection({
  name: 'liteDb_3',
  type: 'react-native',
  database: 'biopro_mobile.sqlite',
  location: 'default',
  synchronize: false,
  logging: true,
  entities: [
    CountSession,
    EquipmentCounted,
  ],
})

Grunt watch: only upload files that have changed

I was able to set up a Grunt task to SFTP files up to my dev server using grunt-ssh:
sftp: {
  dev: {
    files: {
      './': ['**', '!{node_modules,artifacts,sql,logs}/**'],
    },
    options: {
      path: '/path/to/project',
      privateKey: grunt.file.read(process.env.HOME + '/.ssh/id_rsa'),
      host: '111.111.111.111',
      port: 22,
      username: 'marksthebest',
    }
  }
},
But this uploads everything when I run it. There are thousands of files. I don't have time to wait for them to upload one-by-one every time I modify a file.
How can I set up a watch to upload only the files I've changed, as soon as I've changed them?
(For the curious, the server is a VM on the local network. It runs on a different OS and the setup is more similar to production than my local machine. Uploads should be lightning quick if I can get this working correctly)
What you need is grunt-newer, a task designed especially to update the configuration of any task depending on what file just changed, then run it. An example configuration could look like the following:
watch: {
  all: {
    files: ['**', '!{node_modules,artifacts,sql,logs}/**'],
    tasks: ['newer:sftp:dev']
  }
}
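Remember to load the tasks in your Gruntfile as well (a sketch):
grunt.loadNpmTasks('grunt-newer');
grunt.loadNpmTasks('grunt-contrib-watch');
grunt.loadNpmTasks('grunt-ssh');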
You can do that using the watch event of grunt-contrib-watch.
You basically need to handle the watch event, modify the sftp files config to only include the changed files, and then let grunt run the sftp task.
Something like this:
module.exports = function(grunt) {
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    secret: grunt.file.readJSON('secret.json'),
    watch: {
      test: {
        files: 'files/**/*',
        tasks: 'sftp',
        options: {
          spawn: false
        }
      }
    },
    sftp: {
      test: {
        files: {
          "./": "files/**/*"
        },
        options: {
          path: '/path/on/the/server/',
          srcBasePath: 'files/',
          host: 'hostname.com',
          username: '<%= secret.username %>',
          password: '<%= secret.password %>',
          showProgress: true
        }
      }
    }
  }); // end grunt.initConfig

  // on watch events configure sftp.test.files to only run on changed file
  grunt.event.on('watch', function(action, filepath) {
    grunt.config('sftp.test.files', {"./": filepath});
  });

  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.loadNpmTasks('grunt-ssh');
};
Note the "spawn: false" option, and the way you need to set the config inside the event handler.
Note 2: this code will upload one file at a time; there's a more robust method at the same link.
You can achieve that with Grunt:
grunt-contrib-watch
grunt-rsync
First things first: I am using a Docker container, and I added a public SSH key to it. So with this Grunt task I upload to my "remote" container only the files that have changed in my local environment:
'use strict';
module.exports = function(grunt) {
  grunt.initConfig({
    rsync: {
      options: {
        args: ['-avz', '--verbose', '--delete'],
        exclude: ['.git*', 'cache', 'log'],
        recursive: true
      },
      development: {
        options: {
          src: './',
          dest: '/var/www/development',
          host: 'root@www.localhost.com',
          port: 2222
        }
      }
    },
    sshexec: {
      development: {
        command: 'chown -R www-data:www-data /var/www/development',
        options: {
          host: 'www.localhost.com',
          username: 'root',
          port: 2222,
          privateKey: grunt.file.read("/Users/YOUR_USER/.ssh/id_containers_rsa")
        }
      }
    },
    watch: {
      development: {
        files: [
          'node_modules',
          'package.json',
          'Gruntfile.js',
          '.gitignore',
          '.htaccess',
          'README.md',
          'config/*',
          'modules/*',
          'themes/*',
          '!cache/*',
          '!log/*'
        ],
        tasks: ['rsync:development', 'sshexec:development']
      }
    },
  });

  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.loadNpmTasks('grunt-rsync');
  grunt.loadNpmTasks('grunt-ssh');

  grunt.registerTask('default', ['watch:development']);
};
Good Luck and Happy Hacking!
I recently ran into a similar issue where I wanted to upload only the files that have changed. I'm only using grunt-exec. Provided you have SSH access to your server, you can do this task with much greater efficiency. I also created an rsync.json that is ignored by git, so collaborators can have their own rsync data.
The benefit is that if anyone makes a change it automatically uploads to their stage.
// Watch - runs tasks when any changes are detected.
watch: {
  scripts: {
    files: '**/*',
    tasks: ['deploy'],
    options: {
      spawn: false
    }
  }
}
My deploy task is a registered task that compiles scripts then runs exec:deploy
// Showing exec:deploy task
// Using rsync with ssh keys instead of login/pass
exec: {
  deploy: {
    cmd: 'rsync public_html/* <%= rsync.options %> <%= rsync.user %>@<%= rsync.host %>:<%= rsync.path %>'
  }
}
You see a lot of the <%= rsync %> stuff? I use that to grab info from rsync.json, which is ignored by git. I only have this because this is a team workflow.
// rsync.json
{
  "options": "-rvp --progress -a --delete -e 'ssh -q'",
  "user": "mmcfarland",
  "host": "example.com",
  "path": "~/stage/public_html"
}
Make sure your rsync.json is loaded in your Gruntfile:
module.exports = function(grunt) {
  var rsync = grunt.file.readJSON('path/to/rsync.json');
  var pkg = grunt.file.readJSON('path/to/package.json');

  grunt.initConfig({
    pkg: pkg,
    rsync: rsync,
    // ...the rest of your config (watch, exec, etc.)
  });
};
I don't think it's a good idea to upload everything that changed to the staging server as soon as it changes, and working directly on the staging server is not a good idea either. You should configure your local machine's server to match staging/production.
It's better to upload once, when you do a deployment.
You can archive all the files using grunt-contrib-compress and push them using grunt-ssh as one file, then extract it on the server; that will be much faster.
Here's an example of the compress task:
compress: {
  main: {
    options: {
      archive: 'build/build.tar.gz',
      mode: 'tgz'
    },
    files: [
      {cwd: 'build/', src: ['sites/all/modules/**'], dest: './'},
      {cwd: 'build/', src: ['sites/all/themes/**'], dest: './'},
      {cwd: 'build/', src: ['sites/default/files/**'], dest: './'}
    ]
  }
}
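Pushing the archive with grunt-ssh could then look something like this (a sketch; host, credentials and paths are illustrative):
sftp: {
  deploy: {
    files: {
      // upload the single archive instead of thousands of files
      './': 'build/build.tar.gz'
    },
    options: {
      path: '/path/on/the/server/',
      host: 'example.com',
      username: 'deploy',
      privateKey: grunt.file.read(process.env.HOME + '/.ssh/id_rsa')
    }
  }
}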
PS: I never looked into the rsync Grunt modules.
I understand this might not be what you are looking for, but I decided to post my answer as a standalone one.