Get Metamask current network name when dashboard is specified as a network option using Truffle - solidity

I'm using Truffle to develop a DApp. I would like to ask if it's possible to dynamically get the network name during the deployment process when dashboard is the specified network. What I mean by that is: I have a deploy-config.js file which holds different configurations for different networks. I also have a 2_deploy_MyContract.js migration file. MyContract expects a struct as a parameter in its constructor.
const MyContract = artifacts.require('MyContract');
const getConfig = require('../deploy-config');

module.exports = async function (deployer) {
  const config = getConfig(currently_selected_network); // <-- The Problem
  await deployer.deploy(
    MyContract,
    {
      ...config.data
    }
  );
};
When I run truffle migrate --reset --network dashboard I can change the selected network using MetaMask at any time. I would like to somehow fetch the network name it deploys to and pass it as currently_selected_network so my JS function can provide the proper config values. I could instead specify the network names in truffle-config.js and deploy only to those predefined networks, but using dashboard lets me keep the mnemonic out of the repo and sign every transaction with the MetaMask extension.
If you have any other ideas on how to achieve this goal, I'd be more than happy to hear them!
This is what deploy-config.js looks like:
const config = {
  network1: {
    paramA: "A",
    paramB: "B"
  },
  network2: {
    paramA: "C",
    paramB: "D"
  }
};

function getConfig(networkName) {
  switch (networkName) {
    case "network1":
      return config.network1;
    case "network2":
      return config.network2;
    default:
      return null;
  }
}

module.exports = getConfig;

In your migration scripts, add network to the parameters of the exported function.
I believe the order and position of the arguments matter.
e.g.
module.exports = async (deployer, network, accounts) => {
  console.log(network);
  console.log(accounts); // also useful to have this at hand
};
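Putting the two together, the migration could look something like the sketch below. This is only a sketch under an assumption: I haven't verified what value Truffle passes as network when you go through dashboard (it may simply be the literal "dashboard"), so it also looks up the chain id via web3, which is available globally in migration scripts; you could then key deploy-config.js on chain ids instead of names.

const MyContract = artifacts.require('MyContract');
const getConfig = require('../deploy-config');

module.exports = async function (deployer, network) {
  // With --network dashboard the `network` argument may just be "dashboard",
  // so also identify the chain MetaMask is pointing at by its id.
  const chainId = await web3.eth.getChainId();
  const config = getConfig(network) || getConfig(chainId);

  if (!config) {
    throw new Error(`No deploy config for network "${network}" (chain id ${chainId})`);
  }

  await deployer.deploy(MyContract, { ...config.data });
};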


How to deploy multiple smart contracts using hardhat-deploy

I have two smart contracts that I want to deploy. I want to deploy the first one, then pass the address of the first into the constructor of the second one. I am new to hardhat-deploy and keep getting caught up with this.
Thanks!
First, create the file "scripts/deploy.js".
const { ethers } = require("hardhat");

async function main() {
  const [deployer] = await ethers.getSigners();
  console.log('Deploying contracts with the account: ' + deployer.address);

  // Deploy First
  const First = await ethers.getContractFactory('FirstContract');
  const first = await First.deploy();

  // Deploy Second, passing the first contract's address to its constructor
  const Second = await ethers.getContractFactory('SecondContract');
  const second = await Second.deploy(first.address);

  console.log("First: " + first.address);
  console.log("Second: " + second.address);
}

main()
  .then(() => process.exit())
  .catch(error => {
    console.error(error);
    process.exit(1);
  });
Then run this command.
npx hardhat run scripts/deploy.js --network ropsten
The original question specifically refers to the hardhat-deploy NPM package (i.e. the community plugin for Hardhat tooling). On that basis the answer provided above is not directly applicable.
hardhat-deploy's documentation is extensive and thorough; the portion relevant to deploying (multiple) contracts is its section on deploy scripts.
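For reference, a minimal sketch of what the two deployments could look like with hardhat-deploy itself. It assumes a deployer entry is configured under namedAccounts in hardhat.config.js, and uses the FirstContract/SecondContract names from the question:

// deploy/01_deploy_contracts.js
module.exports = async ({ getNamedAccounts, deployments }) => {
  const { deploy } = deployments;
  const { deployer } = await getNamedAccounts();

  // Deploy the first contract
  const first = await deploy("FirstContract", {
    from: deployer,
    args: [],
    log: true,
  });

  // Pass the first contract's address into the second one's constructor
  await deploy("SecondContract", {
    from: deployer,
    args: [first.address],
    log: true,
  });
};
module.exports.tags = ["all"];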
The hardhat-deploy documentation may be a little bit cryptic for newcomers, but it is very simple to deploy multiple contracts using hardhat-deploy.
First, create the deployment scripts in the deploy directory, which sits at the same level as the contracts directory.
You can name the deployment scripts like 01-deploy-contract-1.js, 02-deploy-contract-2.js, etc.
A sample deploy script is shown below.
01-deploy-contract-1.js
const { ethers, network } = require("hardhat");
require("dotenv").config();

async function deployFunc(hre) {
  if (process.env.DEPLOY_TOKEN_LIB.toLowerCase() === "true") {
    console.log("Deploying Contract - 1...");
    await deployContract1();
    console.log("-------------------------");
  }
}

async function deployContract1() {
  const networkName = network.name;
  const [owner] = await ethers.getSigners();
  console.log(`Deploying to '${networkName}' as ${owner.address}...`);

  const Contract1 = await ethers.getContractFactory("Contract1");
  const lib = await Contract1.deploy(owner.address);
  await lib.deployed();
  console.log(`Token library contract deployed to ${lib.address}`);
}

module.exports.default = deployFunc;
module.exports.tags = ["all", "tokenlibrary"];
Now run npx hardhat deploy. It will run all the deployment scripts in the deploy folder.
An added advantage of hardhat-deploy scripts is that when you run npx hardhat node, it automatically deploys all the contracts, so your local node comes up with all the contracts ready to test.
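The exported tags also let you run just a subset of the scripts. For example (assuming the tokenlibrary tag from the script above and a goerli network configured in hardhat.config.js):

npx hardhat deploy --tags tokenlibrary --network goerli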

electron.js and sql - correct way to set it up?

I am new to electron.js - been reading the documentation and some similar posts here:
How do I make a database call from an Electron front end?
Secure Database Connection in ElectronJS Production App?
Electron require() is not defined
How to use preload.js properly in Electron
But it's still not super clear how to properly implement a secure SQL integration. Basically, I want to create a desktop database client. The app will connect to a remote DB, users can run all kinds of predefined queries, and the results will show up in the app.
The documentation says that if you are working with a remote connection you shouldn't run Node in the renderer. Should I then require the SQL module in the main process, use IPC to send data back and forth, and expose ipcRenderer through a preload script?
Thanks for the help
Short answer: yes
Long answer:
Allowing Node in your renderer poses a big security risk for your app. Best practice in this case is to pass a function to your renderer through your preload script. There are a few options you can use to do this:
Pass an ipcRenderer.invoke call wrapped in another function to your renderer in your preload script. You can then invoke a call to your main process, which can either send info back via the same function or send it via the window.webContents.send command, listening for it on the exposed API in your renderer. E.g.:
Preload.js:
const { contextBridge, ipcRenderer } = require("electron");

const invoke = (channel, args, cb = () => { return; }) => {
  ipcRenderer.invoke(channel, args).then((res) => {
    cb(res);
  });
};

const handle = (channel, cb) => {
  ipcRenderer.on(channel, function (event, message) {
    cb(event, message);
  });
};

contextBridge.exposeInMainWorld("GlobalApi", {
  invoke: invoke,
  handle: handle
});
Renderer:
let users;
window.GlobalApi.handle("users", (event, data) => { users = data; });
window.GlobalApi.invoke("get", "users");
or:
let users;
window.GlobalApi.invoke("get", "users", (data)=>{users=data})
Main:
ipcMain.handle("get", async (path) => {
let data = dbFunctions.get(path)
window.webContents.send(
path,
data
);
}
Create a DB interface in your preload script that exposes certain functions to your renderer which, when called, return the value you need from your DB. E.g.:
Renderer:
let users = window.myCoolApi.get("users");
Preload.js:
const { contextBridge } = require("electron");

let get = function (path) {
  let data = dbFunctions.readSomeDataFromDB(path);
  return data; // return the data, not the function
  // return dbFunctions.readSomeDataFromDB; // returning the function itself is a no-no -- don't do this
};

contextBridge.exposeInMainWorld("myCoolApi", {
  get: get
});
There are more options, but these should generally ensure security as far as my knowledge goes.
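For completeness, here is a rough sketch of how the main-process side of the first option could be wired up. The BrowserWindow options are the standard secure defaults, and mysql2 plus the example query are only placeholders for whatever SQL client and predefined queries you actually use:

// main.js
const { app, BrowserWindow, ipcMain } = require("electron");
const path = require("path");
const mysql = require("mysql2/promise"); // placeholder -- any SQL client works

let win;
// the DB connection lives only in the main process
const pool = mysql.createPool({ host: "db.example.com", user: "app", database: "mydb" });

app.whenReady().then(() => {
  win = new BrowserWindow({
    webPreferences: {
      preload: path.join(__dirname, "preload.js"),
      contextIsolation: true, // keep Node out of the renderer
      nodeIntegration: false,
    },
  });
  win.loadFile("index.html");

  ipcMain.handle("get", async (event, table) => {
    // placeholder query; substitute one of your predefined queries here
    const [rows] = await pool.query("SELECT * FROM ??", [table]);
    win.webContents.send(table, rows);
    return rows;
  });
});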

using redis in getServerSideProps results in error net.isIP is not a function

Correct me if I am wrong, but getServerSideProps is used to fetch data and pre-render on each request? If I use the standard redis npm module in getServerSideProps I get the error net.isIP is not a function. From what I have researched, this is due to the client trying to use the redis functions.
I am trying to create an application where session data is saved in a redis key based on a cookie token. Based on the user id, a database is called and data is rendered to the component. I can get the cookie token in getServerSideProps, but if I run client.get(token) I get the error net.isIP is not a function at runtime.
Am I not using getServerSideProps correctly or should I be using a different method / function? I am new to the whole Next.js world. I appreciate the help.
If I use the same functionality in an /api route everything works correctly.
import util from 'util';
import client from '../libs/redis'; // using the 'redis' module

// ... my component here

export async function getServerSideProps(context) {
  const get = util.promisify(client.get).bind(client);
  const name = await get('mytoken'); // error: net.isIP is not a function
  return {
    props: {
      name
    },
  };
}

// redis.js
const redis = require("redis");
const client = redis.createClient();

client.on("error", function (error) {
  console.error(error);
});

module.exports = client;
I upgraded from Next 9.3 to version 10.0.6 and I no longer receive the error.
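If upgrading isn't an option, a workaround that is sometimes suggested (I have not verified it against 9.3) is to require the redis client inside getServerSideProps itself, so the module never ends up in the client bundle. A minimal sketch, assuming the same libs/redis helper as above:

export async function getServerSideProps(context) {
  // requiring here keeps the redis dependency strictly on the server
  const util = require('util');
  const client = require('../libs/redis');
  const get = util.promisify(client.get).bind(client);

  const name = await get('mytoken');
  return { props: { name } };
}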

Dependency Injection (for HttpFetch) at setRoot in main.js Aurelia

I am having trouble getting dependency injection working for my AuthorizerService. Obviously, dep-inj is not ready until after Aurelia "starts", but I wasn't sure how to access it.
main.js:
aurelia.container.registerInstance(HttpClient, http.c());
// set your interceptors to take cookie data and put it into the header

return aurelia.start().then(() => {
  let Authorizer = new AuthorizerService();
  aurelia.container.registerInstance(AuthorizerService, Authorizer);
  console.log('Current State: %o', Authorizer.auth);

  Authorizer.checkCookieAndPingServer().then(
    () => {
      console.log('Current State: %o', Authorizer.auth);
      aurelia.setRoot(PLATFORM.moduleName('app'));
    },
    () => {
      aurelia.setRoot(PLATFORM.moduleName('login-redirect'));
    }
  );
});
Now the problem is that if I do "new AuthorizerService()" then "this.http.fetch()" is not available in AuthorizerService.js.
Am I meant to pass "http.c()" (which delivers the HttpClient instance) as a parameter inside:
checkCookieAndPingServer(http.c())
or is there another way?
Can I delete "new AuthorizerService()" and just do (I made this up):
aurelia.container.getInstance(AuthorizerService);
Somehow FORCE it to do dependency-injection and retrieve the "registered Instance" of "http.c()"?
I can't just check the cookie. I have to ping the server for security, and the server will set the cookie.
I think this is all sorts of wrong, because I need a global parameter that is false by default; the app then queries the backend server and sets the root accordingly. Perhaps I should only query the backend in the "login page"? Okay, but then I would need to do "setRoot(backtoApp); aurelia.AlsoSetLoggedIn(true);" inside the login module. But when I setRoot(backtoApp), it just starts all over again.
In other words, when setRoot(login); then setRoot(backToApp); <-- the AuthorizerService instance doesn't have its proper data set (such as loggedIn=true).
EDIT: Better Solution maybe:
main.js:
return aurelia.start().then(() => {
  let Authorizer = aurelia.container.get(AuthorizerService);
  let root = Authorizer.isAuthenticated() ? PLATFORM.moduleName('app') : PLATFORM.moduleName('login');
  console.log('Current State: %o', Authorizer.auth);
  aurelia.setRoot(root);
});
Authorizer.js
constructor(http) {
  this.http = http;
  this.auth = {
    isAuthenticated: false,
    user: {}
  };
}
"this.auth" is no longer static. No longer "static auth = { isAuthenticated: false }" which was some example code I had found.
So now "auth" gets set inside "login" module. But this means the "login" module is displayed every single time the app loads briefly, before being redirected back to "setRoot(backToApp)"
If the class whose instance you want to get is purely made up of service classes and has no dependencies on Aurelia plugins, you don't need to wait until Aurelia has started to safely invoke the container.
For your example:
aurelia.container.getInstance(AuthorizerService);
It can be
aurelia.container.get(AuthorizerService);
And you should not use new AuthorizerService(), as you have noticed in your question.
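For the HttpClient to arrive via dependency injection, AuthorizerService just needs to declare it as a dependency so the container can resolve the instance registered in main.js. A minimal sketch, assuming aurelia-fetch-client and a hypothetical auth/ping endpoint for checkCookieAndPingServer:

// authorizer-service.js
import { inject } from 'aurelia-framework';
import { HttpClient } from 'aurelia-fetch-client';

@inject(HttpClient)
export class AuthorizerService {
  constructor(http) {
    // the container injects the HttpClient instance registered in main.js
    this.http = http;
    this.auth = {
      isAuthenticated: false,
      user: {}
    };
  }

  async checkCookieAndPingServer() {
    // hypothetical endpoint -- replace with your backend's auth check
    const response = await this.http.fetch('auth/ping');
    this.auth.isAuthenticated = response.ok;
  }

  isAuthenticated() {
    return this.auth.isAuthenticated;
  }
}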

Multiple realms in React Native don't query realm object server correctly on first launch of app after install

I am having an issue dealing with multiple realms in React Native. I'm working on an app that lets users use the app without a subscription (using a local realm), and then at any point in their journey they have the option of upgrading to a subscription with syncing (which syncs to a Realm Object Server).
When I start the app I check to see if they are using sync and if so I initialize a synced realm with their user and everything works great. I get all the data I expect.
However, when the app starts on first launch after install (the part about first launch after install is crucial) and I see that they don't use sync, I initialize a local realm which I save data to until they decide to log in to their sync account (if they have one). At that point I attempt to pull information from the synced realm, but it does not have the information that I see when I only initialize the synced realm (the case where on app startup I detect they use sync).
I am able to log in as the sync user, but the data isn't there if I've previously initialized a local realm AND this logic runs on the first launch of the app after install. The data only shows up from the Realm Object Server when I initialize a local and a synced realm on a subsequent launch of the app (with no reinstall before launching).
Here's a simple test script with dummy data in it with which I've been able to replicate the observed behavior:
// assumes `import Realm from 'realm';` at the top of the file
const username = 'testuser2';
const password = 'supersecret';
const tld = 'REALM_OBJECT_SERVER_TLD';

class Test extends Realm.Object {}
Test.schema = {
  name: 'Test',
  properties: {
    id: {
      type: 'string',
    },
  }
};

function initLocalRealm() {
  return new Realm({
    path: 'local.realm',
    schema: [Test],
  });
}

function initSyncedRealmWithUser(user) {
  return new Realm({
    path: 'synced.realm',
    sync: {
      user,
      url: `realm://${tld}:9080/~/data`,
    },
    schema: [Test],
  });
}

function writeTestObjectWithId(realm, id) {
  realm.write(() => {
    realm.create('Test', {
      id,
    });
    alert(`Test object with id: ${id}`);
  });
}

initLocalRealm();

// setup
// uncomment this and comment out the login section to set up the user on the first run
// Realm.Sync.User.register(`http://${tld}:9080`, username, password, (error, user) => {
//   if (error) {
//     return;
//   }
//   const syncedRealm = initSyncedRealmWithUser(user);
//   writeTestObjectWithId(syncedRealm, '1');
// });

// login
Realm.Sync.User.login(`http://${tld}:9080`, username, password, (error, user) => {
  if (error) {
    return;
  }
  const syncedRealm = initSyncedRealmWithUser(user);
  alert(`Synced realm test objects: ${syncedRealm.objects('Test').length}`);
});
If you create a React Native app and add this code to the main component's componentDidMount function, you should see that on the first run of the app (after you've uncommented the register code once) the Test collection length is 0, but when you refresh you will see the Test collection length is 1.
Any help on this would be awesome.
Thanks!
Running your code snippet, I get a length of 1 immediately as soon as I uncomment the login section. Could you try observing your synchronized realm with the Realm Browser and see whether it has the data you are expecting after registering the user?
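One more thing worth checking (an assumption on my part, not something your snippet rules out): new Realm({ sync: ... }) returns immediately with whatever is already on disk, so on a fresh install the first query can run before the initial download has finished. If your realm-js version provides Realm.open, a sketch like the one below would wait for the server data before counting:

Realm.Sync.User.login(`http://${tld}:9080`, username, password, (error, user) => {
  if (error) {
    return;
  }
  // Realm.open resolves only after the initial sync download has completed
  Realm.open({
    path: 'synced.realm',
    sync: { user, url: `realm://${tld}:9080/~/data` },
    schema: [Test],
  }).then((syncedRealm) => {
    alert(`Synced realm test objects: ${syncedRealm.objects('Test').length}`);
  });
});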