Ask users to input value for npm script - npm

I have an npm script, which is run like this:
npm run start:local -- -target.location https://192.1.1.1:8052/
The URL parameter is the user's local IP.
What I would love is to ask users to input this value, because it is different for everybody.
Is that possible? It would be great to do it with vanilla npm scripts.

Simply speaking, an npm script will run the desired command in your shell environment.
In a shell script, the arguments passed can be accessed using $N, where N is the position of the argument.
In your case, the command you want to run is
npm run start:local -- -target.location USER_INPUT
USER_INPUT needs to be replaced with the argument that the user passes. Assuming the user passes the location as the first argument to the script, it can be accessed using $1.
I have created this gist to demonstrate the same.
As you can see, I have defined start:local to access the first argument and pass it on to the start script, which then echoes out the passed-in argument.
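A minimal sketch of that pattern (the script bodies are illustrative, not copied from the gist, and assume a POSIX shell):
{
  "scripts": {
    "start": "echo Starting with target:",
    "start:local": "forward() { npm run start -- \"$1\"; }; forward"
  }
}
Running npm run start:local -- https://192.1.1.1:8052/ appends the URL to the script, so the shell function receives it as $1 and forwards it to npm start.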
UPDATE:
Here is the script for asking the user to enter a value at a prompt.
Basically, I first ask for user input, store it in a variable, and then pass that variable as an argument to npm start; a sketch of the idea follows.
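A rough sketch of that prompt-based variant (again illustrative, assuming a POSIX shell, since read is a shell built-in and will not work in the default Windows cmd shell):
{
  "scripts": {
    "start": "echo Starting with target:",
    "start:local": "echo 'Enter your local IP (e.g. https://192.1.1.1:8052/):' && read ip && npm run start -- \"$ip\""
  }
}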
References
Accessing Positional Arguments
Asking User Input

Use readline to get the IP value, then use exec to spawn the process. This is a pure JS solution and OS-agnostic.
Example:
package.json
"scripts": {
"start": "npm run start:local -- -target.location",
"prompt": "node prompt.js"
},
prompt.js
const { spawn, execSync } = require('child_process');
const readline = require('readline');

// Run a command synchronously, inheriting stdio so its output is shown live
const exec = commands => {
  execSync(commands, { stdio: 'inherit', shell: true });
};

// Spawn-based (asynchronous) alternative; not used below
const spawnProcess = commands => {
  spawn(commands, { stdio: 'inherit', shell: true });
};

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

rl.question('What is your current ip? example: https://192.168.1.10:9009 ', (ip) => {
  console.log(`Starting server on: ${ip}`);
  // Appends the answer to the "start" script, which forwards it to start:local
  exec(`npm run start -- ${ip}`);
  rl.close();
});

Example: if we want to run the three commands below in sequence with user input:
git add .
git commit -m "With git commit message at run time"
git push
Add the command below to your package.json file under scripts:
"gitPush": "git add . && echo 'Enter Commit Message' && read message && git commit -m \"$message\" && git push"
And run it via npm run gitPush. Note that read is a shell built-in, so this works with sh/bash but not with the default cmd shell on Windows.

Use Node's readline? It has methods for interactive I/O.

If you're trying to ask users to input their response, you could do something like this.
const readline = require("readline");

const reader = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

// Ask a question and resolve with the answer, or with a default value if the answer is empty
const ask = (message, default_value = null) => new Promise(resolve => {
  reader.question(message, (response) => {
    return resolve(response.length >= 1 ? response : default_value);
  });
});

(async () => {
  let ip = await ask(`Please enter your public ip: `);
  console.log(`You entered: ${ip}`);
  reader.close();
})();

Related

Using Deno to interact through SSH hangs on p.output()

I'm looking to create an SSH sub-process and then interact with the server. I'm hung up on a basic step which is to simply wait until the SSH process has connected. I know that this ssh command connects fine because when I run it with inherit instead of piped, the ssh shell shows up as expected.
If I understand correctly, p.output() listens for stdout until it reaches EOF. I'm assuming that when SSH has connected, it streams the stdout, but does not EOF, and so p.output() never gets called.
const encoder = new TextEncoder();
const decoder = new TextDecoder();
const p = Deno.run({
  cmd: ["ssh", "root@mywebsite"],
  stdout: "piped",
  stderr: "piped",
  stdin: "piped"
});
const command = (cmd : string) => p.stdin.write(encoder.encode(cmd))
const getOutput = async () => decoder.decode(await p.output())
await p.output() // <----- Hangs here
await command("cd /home/dev/www")
await command("ls -la")
console.log(await getOutput())
await p.status()
console.log("done")
It hangs because .output will only resolve once the entire process output has been read, meaning that it will not resolve until the ssh command has finished.
Also keep in mind that you need to add \n at the end of each command, otherwise it will never be triggered.
await command("cd /home\n");
await command("ls -la\n");
// if you don't finish the ssh session, .output will never resolve
await command("exit\n");
// now it will work correctly
console.log(await getOutput());
In any case, if you don't want to close the session in order to read the output of a given command, you need to use p.stdout.readable or p.stdout.read(buf) instead.
for await (const chunk of p.stdout.readable) {
  // parse the chunk and do something with it, e.g. decoder.decode(chunk)
}

Unable to run ethers.getSigner() from the (nodeJS?) console

I am following a Hardhat intro tutorial by Reanblock https://www.youtube.com/watch?v=osHk0eEsjDM and I am stuck in the last few minutes of the video.
I have a simple hardhat.config.js file as follows:
require("#nomiclabs/hardhat-waffle");
require("#nomiclabs/hardhat-web3");
require('solidity-coverage');
task("accounts", "Prints the list of accounts", async (taskArgs, hre) => {
const accounts = await hre.ethers.getSigners();
for (const account of accounts) {
console.log(account.address);
}
});
task("azbalance", "-prints a/c balances- ")
.addParam("azaccount", "the accounts address")
.setAction(async (taskArgs) => {
const account = web3.utils.toChecksumAddress(taskArgs.azaccount);
const balance = await web3.eth.getBalance(account);
console.log(web3.utils.fromWei(balance, "ether"), "ETH");
})
/**
* #type import('hardhat/config').HardhatUserConfig
*/
module.exports = {
solidity: "0.8.4",
};
I successfully created a node in another terminal using the command:
npx hardhat node
I am able to deploy the contract using the following command.
npx hardhat run scripts/sample-script.js --network localhost
on the console I type:
npx hardhat console --network localhost
so far it has been working as expected.
On the (Node.js?) console, when I type
acc = await ethers.getSigner()
I get the following error:
Uncaught TypeError: ethers.getSigner is not a function
at REPL1:1:47
my goal is to get the account address using the command:
acc.address
*** Apologies if I used the terms Terminal and Console incorrectly; I'm yet to figure out their proper usage. ***
Where do you run the command npx hardhat console --network localhost?
Make sure you run it from the folder that contains the hardhat.config.js file.
Please try with this simple config first:
require("#nomiclabs/hardhat-waffle");
module.exports = {
solidity: "0.8.4",
defaultNetwork: "localhost",
networks: {
localhost: {
url: "http://127.0.0.1:8545"
}
}
};
You should then be able to achieve what you are after.
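For reference, a quick sketch of what the console session might look like once the config is picked up (getSigners() returns an array of signers, so taking the first element is one way to get a single account; the exact address will differ):
// inside `npx hardhat console --network localhost`
const [acc] = await ethers.getSigners();
acc.address // prints the first account's address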

How to store Terraform provisioner "local-exec" output in local variable and use variable value in "remote-exec"

I am working with Terraform provisioners, and in one scenario I need to execute a local-exec provisioner and use the output of the command [an array of IP addresses] in the next remote-exec provisioner.
I am not able to store the local-exec provisioner output in a local variable to use later. I can store it in a local file, but not in an intermediate variable.
count = "${length(data.local_file.instance_ips.content)}"
This is not working.
resource "null_resource" "get-instance-ip-41" {
provisioner "local-exec" {
command = "${path.module}\\scripts\\findprivateip.bat > ${data.template_file.PrivateIpAddress.rendered}"
}
}
data "template_file" "PrivateIpAddress" {
template = "/output.log"
}
data "local_file" "instance_ips" {
filename = "${data.template_file.PrivateIpAddress.rendered}"
depends_on = ["null_resource.get-instance-ip-41"]
}
output "IP-address" {
value = "${data.local_file.instance_ips.content}"
}
# ---------------------------------------------------------------------------------------------------------------------
# Update the instnaces by installing newrelic agent using remote-exec
# ---------------------------------------------------------------------------------------------------------------------
resource "null_resource" "copy_file_newrelic_v_29" {
depends_on = ["null_resource.get-instance-ip-41"]
count = "${length(data.local_file.instance_ips.content)}"
triggers = {
cluster_instance_id = "${element(values(data.local_file.instance_ips.content[count.index]), 0)}"
}
provisioner "remote-exec" {
connection {
agent = "true"
bastion_host = "${aws_instance.bastion.*.public_ip}"
bastion_user = "ec2-user"
bastion_port = "22"
bastion_private_key = "${file("C:/keys/nvirginia-key-pair-ajoy.pem")}"
user = "ec2-user"
private_key = "${file("C:/keys/nvirginia-key-pair-ajoy.pem")}"
host = "${self.triggers.cluster_instance_id}"
}
inline = [
"echo 'license_key: 34adab374af99b1eaa148eb2a2fc2791faf70661' | sudo tee -a /etc/newrelic-infra.yml",
"sudo curl -o /etc/yum.repos.d/newrelic-infra.repo https://download.newrelic.com/infrastructure_agent/linux/yum/el/6/x86_64/newrelic-infra.repo",
"sudo yum -q makecache -y --disablerepo='*' --enablerepo='newrelic-infra'",
"sudo yum install newrelic-infra -y"
]
}
}
Unfortunately you can't. The solution I have found is to use an external data source block instead. You can run a command from there and retrieve its output(s); the only catch is that the command needs to produce JSON on standard output (stdout). See the documentation here. A rough sketch of the idea follows. I hope this is of some help to others trying to solve this problem.
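A hypothetical sketch of that approach (the script name and the "ips" key are made up for illustration; the program must print a flat JSON object of string values to stdout):
data "external" "instance_ips" {
  # e.g. the script prints {"ips": "10.0.0.1,10.0.0.2"}
  program = ["bash", "${path.module}/scripts/findprivateip.sh"]
}

# the result map can then feed other resources, for example:
# count = "${length(split(",", data.external.instance_ips.result["ips"]))}"
Terraform 0.11-style interpolation is shown here to match the question's code.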

Error parsing triggers: Cannot find module 'firebase/firestore'

I am trying to run some very basic code on the Google API using Firebase.
'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
//const {Card, Suggestion} = require('dialogflow-fulfillment');
var admin = require('firebase-admin');
require("firebase/firestore");

admin.initializeApp(functions.config().firebase);
//var firestore = admin.firestore();

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

//firestore arguments defined
/* var addRef = firestore.collection('Admissions');
   var feeRef = firestore.collection('Fees');
*/

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({ request, response });
  console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
  console.log('Dialogflow Request body: ' + JSON.stringify(request.body));
  console.log("request.body.queryResult.parameters: ", request.body.queryResult.parameters);
  // Run the proper function handler based on the matched Dialogflow intent name
  var intentMap = new Map();
});
It gives me an error that says:
Error parsing triggers: Cannot find module 'firebase/firestore'. Try running "npm install" in your functions directory before deploying.
When I run npm install inside the functions directory, I get:
audited 9161 packages in 25.878s
found 292 vulnerabilities (21 low, 207 moderate, 64 high)
run npm audit fix to fix them, or npm audit for details
It's been a week and I am stuck with these errors; they keep fluctuating depending on the solution I try, but I am not able to overcome this one. Can you please check if there is something wrong I am doing, or anything else I need to try?
Just delete the node_modules folder and run npm install again; I was also stuck on this for a week. It is a corrupt file issue. For example:
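A minimal sketch of that fix (assuming a macOS/Linux shell; on Windows, delete the folder from Explorer instead):
cd functions          # the Cloud Functions directory
rm -rf node_modules   # remove the possibly corrupted install
npm install           # reinstall dependencies from package.json
firebase deploy --only functions   # then try deploying again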

Gulp task to SSH and then mysqldump

So I've got this scenario where I have separate web and MySQL servers, and I can only connect to the MySQL server from the web server.
So basically every time I have to go like:
step 1: 'ssh -i ~/somecert.pem ubuntu@1.2.3.4'
step 2: 'mysqldump -u root -p'password' -h 6.7.8.9 database_name > output.sql'
I'm new to gulp and my aim was to create a task that could automate all this, so running one gulp task would automatically deliver me the SQL file.
This would make developers' lives a lot easier, since it would take just one command to download the latest DB dump.
This is where I got so far (gulpfile.js):
////////////////////////////////////////////////////////////////////
// Run: 'gulp download-db' to get latest SQL dump from production //
// File will be put under the 'dumps' folder                      //
////////////////////////////////////////////////////////////////////

// Load stuff
'use strict'

var gulp = require('gulp')
var GulpSSH = require('gulp-ssh')
var fs = require('fs');

// Function to get home path
function getUserHome() {
  return process.env.HOME || process.env.USERPROFILE;
}

var homepath = getUserHome();

///////////////////////////////////////
// SETTINGS (change if needed)       //
///////////////////////////////////////
var config = {
  // SSH connection
  host: '1.2.3.4',
  port: 22,
  username: 'ubuntu',
  //password: '1337p4ssw0rd', // Uncomment if needed
  privateKey: fs.readFileSync( homepath + '/certs/somecert.pem'), // Uncomment if needed

  // MySQL connection
  db_host: 'localhost',
  db_name: 'clients_db',
  db_username: 'root',
  db_password: 'dbp4ssw0rd',
}

////////////////////////////////////////////////
// Core script, don't need to touch from here //
////////////////////////////////////////////////

// Set up SSH connector
var gulpSSH = new GulpSSH({
  ignoreErrors: true,
  sshConfig: config
})

// Run the mysqldump
gulp.task('download-db', function(){
  return gulpSSH
    // runs the mysql dump
    .exec(['mysqldump -u '+config.db_username+' -p\''+config.db_password+'\' -h '+config.db_host+' '+config.db_name+''], {filePath: 'dump.sql'})
    // pipes output into local folder
    .pipe(gulp.dest('dumps'))
})

// Run search/replace "optional"
SSH-ing into the web server works fine, but I have an issue when trying to get the mysqldump; I'm getting this message:
events.js:85
throw er; // Unhandled 'error' event
^
Error: Warning:
If I try the same mysqldump command manually from the server SSH, I get:
Warning: mysqldump: unknown variable 'loose-local-infile=1'
Followed by the correct mysql dump info.
So I think this warning message is messing up my script. I would like to ignore warnings in cases like this, but I don't know how to do that or whether it's possible.
Also I read that using the password directly in the command line is not really good practice.
Ideally, I would like to have all the config vars loaded from another file, but this is my first gulp task and I'm not really familiar with how I would do that.
Can someone with experience in Gulp orient me towards a good way of getting this thing done? Or do you think I shouldn't be using Gulp for this at all?
Thanks!
As I suspected, that warning message was preventing the gulp task from finalizing. I got rid of it by commenting out the loose-local-infile=1 line in /etc/mysql/my.cnf.
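As for the side note about keeping the connection settings out of the gulpfile: one common approach (a sketch; config.json and the privateKeyPath key are illustrative names, and the file should be git-ignored) is to move the settings into that file and load it with require:
// gulpfile.js -- load settings from a git-ignored file instead of hard-coding them
var fs = require('fs');
var settings = require('./config.json'); // same keys as the inline `config` object above
var config = Object.assign({}, settings, {
  // the private key itself still has to be read from disk
  privateKey: fs.readFileSync(settings.privateKeyPath)
});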