SQLite3 Database is not a constructor - sql

In my project I am trying to gather a Discord username and unique identifier and store them in an SQLite database file. I get this error:
let userDB = new sqlite.Database('./disco.db', sqlite.OPEN_READWRITE);
^
TypeError: sqlite.Database is not a constructor
Here is my code in index.js:
// Requirements
const Discord = require('discord.js');
const client = new Discord.Client();
const fs = require('fs');
const ServList = client.guilds.cache.size;
const sqlite = require('sqlite3').verbose();
require('dotenv').config()

// Client login
client.login(process.env.TOKEN);

// Start-up checklist
client.once('ready', () => {
  // Log and set status
  console.log('Bot Online');
  client.user.setActivity(`Proudly in ${client.guilds.cache.size} servers`, {
    type: "WATCHING",
  }, 60000);
  // Database initialization
  let userDB = new sqlite.Database('./disco.db', sqlite.OPEN_READWRITE | sqlite.OPEN_CREATE);
});
Here is my code for the command that is creating the error:
const Discord = require('discord.js');
const sqlite = require('sqlite3').verbose();

module.exports = {
  name: 'create',
  description: "Create your account!",
  use(message, args, client, sqlite){
    // Data to add
    let userDB = new sqlite.Database('./disco.db', sqlite.OPEN_READWRITE);
    userDB.run(`CREATE TABLE IF NOT EXIST usersInfo(userID INTEGER NOT NULL, uNameR TEXT NOT NULL)`);
    let userID = message.author.id;
    let uName = message.author.tag;
    let uQuery = `SELECT * FROM usersInfo WHERE userID = ?`;
    userDB.get(uQuery, [userID], (err, row) => {
      if (err) {
        console.log(err);
        return;
      }
      if (row === undefined){
        userDB.prepare(`INSERT INTO usersInfo VALUES(?,?)`);
        insertdata.run('userID, uName');
        insertdata.finalize();
        userDB.close();
      } else {
        let userID2 = row.userID;
        let yName = row.uNameR;
        console.log(yName, userID);
      }
    });
    message.channel.send('success');
  }
}
Edit: This question was flagged as a possible duplicate, but the suggested solution does not work for me: the suggested answer uses MySQL while I use SQLite3, and it connects to a hosted database while mine is local.
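A plausible culprit, judging only from the code shown: the command's use(message, args, client, sqlite) signature shadows the file's const sqlite = require('sqlite3').verbose();, so inside use the name sqlite refers to whatever the command handler passes as the fourth argument. If that argument is anything other than the sqlite3 module, sqlite.Database is not a constructor and throws exactly this TypeError. A minimal sketch of the fix, assuming the handler does not actually need to inject sqlite:

const Discord = require('discord.js');
const sqlite = require('sqlite3').verbose();

module.exports = {
  name: 'create',
  description: "Create your account!",
  // Dropping the `sqlite` parameter stops it from shadowing the
  // module-level require above, which does have a Database constructor.
  use(message, args, client){
    let userDB = new sqlite.Database('./disco.db', sqlite.OPEN_READWRITE);
    // ... rest of the command unchanged
  }
}

Two smaller issues will surface once that is fixed: IF NOT EXIST should be IF NOT EXISTS, and the prepared statement is never assigned (let insertdata = userDB.prepare(...)) before insertdata.run(...) is called, which should receive the two values userID and uName rather than the single string 'userID, uName'.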

Get all transactions for an NFT on Solana

I want to collect all transactions for an NFT.
For example, you can display all transactions here:
https://explorer.solana.com/address/2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V
or here:
https://solscan.io/token/2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V#txs
But is there any way to do this with an API? I checked solana-py (https://michaelhly.github.io/solana-py/) and the Solscan API (https://public-api.solscan.io/docs/), but I could not find a way to do it.
You can use the getSignaturesForAddress RPC method on the mint address and walk backward to get all the transactions.
Here is an example in JS:
import {
  Connection,
  clusterApiUrl,
  ConfirmedSignatureInfo,
  PublicKey,
} from "@solana/web3.js";

const connection = new Connection(clusterApiUrl("mainnet-beta"));

export const getTxs = async (connection: Connection, pubkey: PublicKey) => {
  const txs: ConfirmedSignatureInfo[] = [];
  // Walk backward: the first call returns the most recent signatures
  let lastTransactions = await connection.getConfirmedSignaturesForAddress2(
    pubkey
  );
  txs.push(...lastTransactions);
  // Guard against an address with no transactions at all
  if (lastTransactions.length === 0) return txs;
  let before = lastTransactions[lastTransactions.length - 1].signature;
  while (true) {
    // Fetch the page that ends just before the oldest signature seen so far
    const newTransactions = await connection.getConfirmedSignaturesForAddress2(
      pubkey,
      { before }
    );
    if (newTransactions.length === 0) break;
    txs.push(...newTransactions);
    before = newTransactions[newTransactions.length - 1].signature;
  }
  return txs;
};

getTxs(
  connection,
  new PublicKey("2Nzt8TYeAfgJDftKzkb7rgYShVvyXTR7cPVvpqaZ2a4V")
);
The equivalent method in solana-py is get_signatures_for_address: https://michaelhly.github.io/solana-py/rpc/api/#solana.rpc.api.Client.get_signatures_for_address

Write rows to BigQuery via nodejs BigQuery Storage Write API

It seems quite new, but I'm hoping someone here has been able to use Node.js to write directly to BigQuery storage using @google-cloud/bigquery-storage.
There is an explanation of how the overall backend API works and how to write a collection of rows atomically using the BigQuery Write API, but no such documentation for Node.js yet. A recent release, 2.7.0, notes the addition of this feature, but there are no examples, and the code is not easily understood.
There is an open issue requesting an example, but I thought I'd try my luck to see if anyone here has been able to use this API yet.
Suppose you have a BigQuery table called student with three columns: id, name and age. The following steps will let you load data into the table with the Node.js Storage Write API.
Define a student.proto file as follows:
syntax = "proto2";

message Student {
  required int64 id = 1;
  optional string name = 2;
  optional int64 age = 3;
}
Run the following at the command prompt:
protoc --js_out=import_style=commonjs,binary:. student.proto
It should generate a student_pb.js file in the current directory.
Write the following JS code in the current directory and run it:
const {BigQueryWriteClient} = require('@google-cloud/bigquery-storage').v1;
const st = require('./student_pb.js');
const type = require('@google-cloud/bigquery-storage').protos.google.protobuf.FieldDescriptorProto.Type;
const mode = require('@google-cloud/bigquery-storage').protos.google.cloud.bigquery.storage.v1.WriteStream.Type;

const storageClient = new BigQueryWriteClient();
// `project` and `dataset` must be set for your environment (see the note below)
const parent = `projects/${project}/datasets/${dataset}/tables/student`;
var writeStream = {type: mode.PENDING};
var student = new st.Student();

// Hand-built descriptor matching student.proto, so BigQuery can decode the rows
var protoDescriptor = {};
protoDescriptor.name = 'student';
protoDescriptor.field = [
  {'name': 'id', 'number': 1, 'type': type.TYPE_INT64},
  {'name': 'name', 'number': 2, 'type': type.TYPE_STRING},
  {'name': 'age', 'number': 3, 'type': type.TYPE_INT64}
];

async function run() {
  try {
    var request = {
      parent,
      writeStream
    };
    var response = await storageClient.createWriteStream(request);
    writeStream = response[0].name;
    var serializedRows = [];
    // Row 1
    student.setId(1);
    student.setName('st1');
    student.setAge(15);
    serializedRows.push(student.serializeBinary());
    // Row 2
    student.setId(2);
    student.setName('st2');
    student.setAge(15);
    serializedRows.push(student.serializeBinary());
    var protoRows = {
      serializedRows
    };
    var proto_data = {
      writerSchema: {protoDescriptor},
      rows: protoRows
    };
    // Construct request
    request = {
      writeStream,
      protoRows: proto_data
    };
    // Insert rows
    const stream = await storageClient.appendRows();
    stream.on('data', response => {
      console.log(response);
    });
    stream.on('error', err => {
      throw err;
    });
    stream.on('end', async () => {
      // API call completed: finalize the pending stream, then commit it atomically
      try {
        var response = await storageClient.finalizeWriteStream({name: writeStream});
        response = await storageClient.batchCommitWriteStreams({parent, writeStreams: [writeStream]});
      } catch (err) {
        console.log(err);
      }
    });
    stream.write(request);
    stream.end();
  } catch (err) {
    console.log(err);
  }
}

run();
Make sure your environment variables are set correctly to point to the file containing your Google Cloud credentials, and change the project and dataset values accordingly.
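For instance, a minimal sketch of that setup (the key path, project ID, and dataset name here are placeholders, not values from this answer):

// Point Application Default Credentials at a service-account key file
// before running, e.g. in the shell:
//   export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
// Then define the values the snippet above interpolates into `parent`:
const project = 'my-gcp-project'; // placeholder project ID
const dataset = 'my_dataset';     // placeholder dataset name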

Querying an array in a DB and comparing with an ID (SQL, TypeScript)

Hello, I'm trying to read data from a SQL table, but the value I want to check is inside an array: I need to check whether a user is in a group. The array only holds the users' IDs, and the specific ID I want to check comes to me through the login.
This code is in TypeScript.
If you need more information, please let me know.
class CompanyController {
  async consultCompanys(req: Request, res: Response) {
    let response: ResponseModel = new ResponseModel(ECodeResponse.Ok, "", []);
    const { UserId } = req.body;
    try {
      const Companies: any = await pool.query(
        `SELECT (CompanyId) From Companies Where Members = '${UserId}'`
      );
      response.Code = ECodeResponse.Ok;
      response.Message = EWarningMessage.Error;
      return res.json(response);
    } catch (error) {
      response.Code = ECodeResponse.Warning;
      response.Message = EWarningMessage.Error;
      return res.json(response);
    }
  }
}
I'm a little rusty with this kind of query.
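A sketch of one way to do this, assuming a MySQL pool (such as mysql2) and that Members stores the IDs as a comma-separated string like '12,34,56': MySQL's FIND_IN_SET matches a single value against such a list, and a placeholder keeps UserId out of the SQL string (the template literal in the original is open to SQL injection):

// Sketch only: FIND_IN_SET(needle, csv_list) returns the 1-based position
// of `needle` in the comma-separated list, or 0 when it is absent.
const Companies: any = await pool.query(
  "SELECT CompanyId FROM Companies WHERE FIND_IN_SET(?, Members) > 0",
  [UserId]
);

If Members is instead a real array column (for example in Postgres), UserId = ANY(Members) would be the analogue.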

Does TypeORM support raw SQL queries for input and output?

I would like to know if there is a feature of TypeORM that supports raw SQL queries for INSERT, UPDATE, DELETE, SELECT, etc.
According to this issue comment, TypeORM enables you to run any queries to your heart's content using entityManager.query(). Here is the documentation.
UPDATE: The link above is outdated; try entity-manager-api instead.
const rawData = await manager.query(`SELECT * FROM USERS`);
2020 UPDATE: entityManager.query(), with the entityManager based on the EntityManager class, was not working for me, so I had to do this:
import { getManager } from 'typeorm';

const entityManager = getManager();
const someQuery = await entityManager.query(`
  SELECT
    fw."X",
    fw."Y",
    ew.*
  FROM "table1" as fw
  JOIN "table2" as ew
  ON fw."X" = $1 AND ew.id = fw."Y";
`, [param1]);
https://orkhan.gitbook.io/typeorm/docs/working-with-entity-manager
Try this out (April 2022, typeorm ^0.3.6):
import { DataSource } from "typeorm";

(async () => {
  const AppDataSource = new DataSource({
    type: "mysql",
    host: "localhost",
    port: 3306,
    username: "root",
    password: "root",
    database: "your-database",
    synchronize: false,
    logging: false,
    entities: ['src/entity/**/*.ts']
  });
  const appDataSource = await AppDataSource.initialize();
  const queryRunner = appDataSource.createQueryRunner();
  var result = await queryRunner.manager.query(
    `SELECT * FROM your-table LIMIT 100`
  );
  console.log(result);
})()
You can use the deprecated getConnection or a repo instance:
const db = this.repo.manager; // or getConnection().manager
const users = await db.query(`SELECT * FROM "users";`);
const [{ total }] = await db.query(`SELECT COUNT(*) as total FROM "users";`);
// users.length === Number(total);
Metadata allows you to get table properties dynamically
// utilities...
const usersTableMeta = db.connection.getMetadata(UserEntity); // or getConnection().getMetadata(UserEntity);
const usersTable = `"${usersTableMeta.tableName}"`
// data...
const users = await db.query(`SELECT * FROM ${usersTable};`);
const admins = await db.query(`
  SELECT id, name FROM ${usersTable}
  WHERE ${usersTable}.role = 'admin';
`);
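One thing worth noting when adapting the examples above: with raw queries, the parameter placeholder syntax follows the underlying driver, so the Postgres-style $1 used earlier becomes ? on MySQL. A small sketch (the users table and userId variable are illustrative):

// Postgres driver: positional $n placeholders
const pgUsers = await db.query(`SELECT * FROM "users" WHERE id = $1;`, [userId]);
// MySQL driver: ? placeholders
const myUsers = await db.query(`SELECT * FROM users WHERE id = ?;`, [userId]);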

Import SQL dump within Node environment

I'd like an npm script to create/configure/etc. and finally import a SQL dump. The creation, configuration, etc. are all working; however, I cannot get the import to work. The data is never inserted. Here's what I have (never mind the nested callbacks, as they'll be turned into promises):
connection.query(`DROP DATABASE IF EXISTS ${config.database};`, err => {
  connection.query(`CREATE DATABASE IF NOT EXISTS ${config.database};`, err => {
    connection.query('use DATABASENAME', err => {
      const sqlDumpPath = path.join(__dirname, 'sql-dump/sql-dump.sql');
      connection.query(`SOURCE ${sqlDumpPath}`, err => {
        connection.end(err => resolve());
      });
    });
  });
});
I also tried the following with Sequelize (ORM):
return new Promise(resolve => {
  const sqlDumpPath = path.join(__dirname, 'sql-dump/sql-dump.sql');
  fs.readFile('./sql/dump.sql', 'utf-8', (err, data) => {
    sequelize
      .query(data)
      .then(resolve)
      .catch(console.error);
  });
});
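A note on the first snippet: SOURCE is a command of the mysql command-line client, not a SQL statement the server understands, so sending it through a driver's query() will not import anything. One common workaround, sketched here under the assumption that the mysql2 (or mysql) driver is in use, is to enable multipleStatements and send the dump's contents as one query:

// Sketch, assuming the mysql2 driver. multipleStatements is off by default
// because it widens the SQL-injection surface, so enable it only for
// trusted input such as a local dump file.
const fs = require('fs');
const path = require('path');
const mysql = require('mysql2');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'secret',       // placeholder credentials
  database: 'DATABASENAME', // placeholder database name
  multipleStatements: true,
});

const dump = fs.readFileSync(path.join(__dirname, 'sql-dump/sql-dump.sql'), 'utf8');
connection.query(dump, err => {
  if (err) throw err;
  console.log('Dump imported');
  connection.end();
});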
Here's how I set up my initial Sequelize import using the migrations framework. There is plenty going on here, but in short I:

- find the latest sql-dump in the migrations folder
- read the file using fs
- split the text into queries
- check if each is a valid query and, if so, apply some cleaning that my data required (see related post)
- push everything into an array of queries, starting with making sure that the database is clean by calling the this.down first
- run everything as a promise (as suggested here) using mapSeries (not map)
Using sequelize-cli, you can create a migration in your shell by writing:
sequelize migration:create
You will automatically get the file where you enter the code below. To execute the migration, you simply write:
sequelize db:migrate
"use strict";
const promise = require("bluebird");
const fs = require("fs");
const path = require("path");
const assert = require("assert");
const db = require("../api/models"); // To be able to run raw queries
const debug = require("debug")("my_new_api");
// I needed this in order to get some encoding issues straight
const Aring = new RegExp(String.fromCharCode(65533) +
"\\" + String.fromCharCode(46) + "{1,3}", "g");
const Auml = new RegExp(String.fromCharCode(65533) +
String.fromCharCode(44) + "{1,3}", "g");
const Ouml = new RegExp(String.fromCharCode(65533) +
String.fromCharCode(45) + "{1,3}", "g");
module.exports = {
up: function (queryInterface, Sequelize) {
// The following section allows me to have multiple sql-files and only use the last dump
var last_sql;
for (let fn of fs.readdirSync(__dirname)){
if (fn.match(/\.sql$/)){
fn = path.join(__dirname, fn);
var stats = fs.statSync(fn);
if (typeof last_sql === "undefined" ||
last_sql.stats.mtime < stats.mtime){
last_sql = {
filename: fn,
stats: stats
};
}
}
}
assert(typeof last_sql !== "undefined", "Could not find any valid sql files in " + __dirname);
// Split file into queries
var queries = fs.readFileSync(last_sql.filename).toString().split(/;\n/);
var actions = [{
query: "Running the down section",
exec: this.down
}]; // Clean database by calling the down first
for (let i in queries){
// Skip empty queries and the character set information in the 40101 section
// as this would most likely require a multi-query set-up
if (queries[i].trim().length == 0 ||
queries[i].match(new RegExp("/\\*!40101 .+ \\*/"))){
continue;
}
// The manual fixing of encoding
let clean_query = queries[i]
.replace(Aring, "Å")
.replace(Ouml, "Ö")
.replace(Auml, "Ä");
actions.push({
query: clean_query.substring(0, 200), // We save a short section of the query only for debugging purposes
exec: () => db.sequelize.query(clean_query)
});
}
// The Series is important as the order isn't retained with just map
return promise.mapSeries(actions, function(item) {
debug(item.query);
return item.exec();
}, { concurrency: 1 });
},
down: function (queryInterface, Sequelize) {
var tables_2_drop = [
"items",
"users",
"usertypes"
];
var actions = [];
for (let tbl of tables_2_drop){
actions.push({
// The created should be created_at
exec: () => db.sequelize.query("DROP TABLE IF EXISTS `" + tbl +"`")
});
}
return promise.map(actions, function(item) {
return item.exec();
}, { concurrency: 1 });/**/
}
};
Based loosely on Max Gordon's answer, here's my code to run a MySQL dump file from Node.js/Sequelize:
"use strict";
const fs = require("fs");
const path = require("path");
/**
* Start off with a MySQL Dump file, import that, and then migrate to the latest version.
*
* #param dbName {string} the name of the database
* #param mysqlDumpFile {string} The full path to the file to import as a starting point
*/
module.exports.migrateFromFile = function(dbName, mysqlDumpFile) {
let sequelize = createSequelize(dbName);
console.log("Importing from " + mysqlDumpFile + "...");
let queries = fs.readFileSync(mysqlDumpFile, {encoding: "UTF-8"}).split(";\n");
console.log("Importing dump file...");
// Setup the DB to import data in bulk.
let promise = sequelize.query("set FOREIGN_KEY_CHECKS=0"
).then(() => {
return sequelize.query("set UNIQUE_CHECKS=0");
}).then(() => {
return sequelize.query("set SQL_MODE='NO_AUTO_VALUE_ON_ZERO'");
}).then(() => {
return sequelize.query("set SQL_NOTES=0");
});
console.time("Importing mysql dump");
for (let query of queries) {
query = query.trim();
if (query.length !== 0 && !query.match(/\/\*/)) {
promise = promise.then(() => {
console.log("Executing: " + query.substring(0, 100));
return sequelize.query(query, {raw: true});
})
}
}
return promise.then(() => {
console.timeEnd("Importing mysql dump");
console.log("Migrating the rest of the way...");
console.time("Migrating after importing mysql dump");
return exports.migrateUp(dbName); // Run the rest of your migrations
}).then(() => {
console.timeEnd("Migrating after importing mysql dump");
});
};