Etherscan has no support for network sepolia with chain id 11155111 - solidity

I am trying to verify my contract deployed from Truffle and I am getting an "Etherscan has no support for network sepolia with chain id 11155111" error. I am working with Etherscan and I deployed my contract on the Sepolia testnet.
How can I solve this problem?
My truffle-config.js
const HDWalletProvider = require("@truffle/hdwallet-provider"); // missing in the original config, but used below
const apikeys = require("./chains/apikeys");
const keys = require("./keys.json");

module.exports = {
  plugins: ["truffle-plugin-verify"],
  api_keys: {
    etherscan: "myApiEtherScan"
  },
  contracts_build_directory: "./public/contracts",
  networks: {
    development: {
      host: "127.0.0.1",
      port: 7545,
      network_id: "*",
    },
    sepolia: {
      provider: () =>
        new HDWalletProvider(keys.PRIVATE_KEY, keys.INFURA_SEPOLIA_URL),
      network_id: 11155111,
      gas: 5221975,
      gasPrice: 20000000000,
      confirmations: 3,
      timeoutBlocks: 200,
      skipDryRun: true
    }
  },
  compilers: {
    solc: {
      version: "0.8.16",
      settings: {
        optimizer: {
          enabled: true, // Default: false
          runs: 1000 // Default: 200
        }
      }
    }
  },
};

The truffle-plugin-verify plugin does not have Sepolia chain support yet. I added the Sepolia API URL and explorer URL to its constants and it works:
const API_URLS = {
  ...
  11155111: 'https://api-sepolia.etherscan.io/api',
  ...
}

const EXPLORER_URLS = {
  ...
  11155111: 'https://sepolia.etherscan.io/address',
  ...
}
https://github.com/tafonina/truffle-plugin-verify
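To see why the error appears in the first place: the plugin resolves the API endpoint by looking the chain id up in those maps and bails out when there is no entry. A minimal sketch of that lookup (function name and error text are illustrative, not the plugin's actual internals):

```javascript
// Illustrative chain-id -> API endpoint lookup, mirroring the plugin's
// API_URLS constant. apiUrlFor is a hypothetical name for this sketch.
const API_URLS = {
  1: 'https://api.etherscan.io/api',
  11155111: 'https://api-sepolia.etherscan.io/api', // the entry that was missing
};

function apiUrlFor(networkName, chainId) {
  const url = API_URLS[chainId];
  if (!url) {
    // without the 11155111 entry, verification fails with this kind of message
    throw new Error(`Etherscan has no support for network ${networkName} with chain id ${chainId}`);
  }
  return url;
}

console.log(apiUrlFor('sepolia', 11155111)); // https://api-sepolia.etherscan.io/api
```

So patching the two constants is enough to make verification on Sepolia work.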

Related

Strapi media library image url broken by Digital Ocean

I am trying to upload an image to Digital Ocean Spaces. The upload reaches DO, but after the callback Strapi generates a wrong URL,
for example: https://https//jobsflow/d0e989a489bdc380c55e5846076d07f8.png?updated_at=2022-06-08T17:00:32.934Z, i.e. it starts with https://https//.
//jobsflow is the location of my storage.
Here is my config/plugins.js code:
module.exports = {
  upload: {
    config: {
      provider: "strapi-provider-upload-dos",
      providerOptions: {
        key: process.env.DO_SPACE_ACCESS_KEY,
        secret: process.env.DO_SPACE_SECRET_KEY,
        endpoint: process.env.DO_SPACE_ENDPOINT,
        space: process.env.DO_SPACE_BUCKET,
        directory: process.env.DO_SPACE_DIRECTORY,
        cdn: process.env.DO_SPACE_CDN,
      },
    },
  },
};
And here is my config/middleware.js:
module.exports = [
  "strapi::errors",
  {
    name: "strapi::security",
    config: {
      contentSecurityPolicy: {
        useDefaults: true,
        directives: {
          "connect-src": ["'self'", "https:"],
          "img-src": [
            "'self'",
            "data:",
            "blob:",
            "*.digitaloceanspaces.com"
          ],
          "media-src": ["'self'", "data:", "blob:"],
          upgradeInsecureRequests: null,
        },
      },
    },
  },
  "strapi::cors",
  "strapi::poweredBy",
  "strapi::logger",
  "strapi::query",
  "strapi::body",
  "strapi::favicon",
  "strapi::public",
];
Please help me if you have any idea!
Are you using a custom upload provider for this?
Why not use the official @strapi/provider-upload-aws-s3 plugin?
// path: config/plugins.js
...
upload: {
  config: {
    provider: 'aws-s3',
    providerOptions: {
      accessKeyId: env('DO_ACCESS_KEY_ID'),
      secretAccessKey: env('DO_ACCESS_SECRET'),
      region: env('DO_REGION'),
      endpoint: env('DO_ENDPOINT'),
      params: {
        Bucket: env('DO_BUCKET'),
      }
    },
  },
},
Another nice trick to change the URL of an image and point it to your CDN is adding this:
// src/index.js
async bootstrap({ strapi }) {
  strapi.db.lifecycles.subscribe({
    models: ['plugin::upload.file'],
    // use the CDN URL instead of the Space origin
    async beforeCreate(data) {
      data.params.data.url = data.params.data.url.replace(__ORIGINAL_URL__, __CDN_URL__)
      // you can even do more here, like setting policies for the object you're uploading
    },
  });
}
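As a sanity check, the string replace in that hook behaves like this (the origin and CDN hostnames below are made up for illustration, standing in for __ORIGINAL_URL__ and __CDN_URL__):

```javascript
// Hypothetical Space origin and CDN hostnames for illustration only
const ORIGINAL_URL = 'https://jobsflow.fra1.digitaloceanspaces.com';
const CDN_URL = 'https://jobsflow.fra1.cdn.digitaloceanspaces.com';

// A URL as the upload provider would store it, pointing at the Space origin
const stored = 'https://jobsflow.fra1.digitaloceanspaces.com/d0e989a4.png';

// The hook rewrites it so the saved file record points at the CDN host
const rewritten = stored.replace(ORIGINAL_URL, CDN_URL);
console.log(rewritten); // https://jobsflow.fra1.cdn.digitaloceanspaces.com/d0e989a4.png
```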

Can't deploy smart contract to hardhat network

I set up a project with Hardhat for an NFT app and modified hardhat.config.js like this:
const { ALCHEMY_KEY, ACCOUNT_PRIVATE_KEY } = process.env;

module.exports = {
  solidity: "0.8.0",
  defaultNetwork: "hardhat",
  networks: {
    hardhat: {},
    rinkeby: {
      url: `https://eth-rinkeby.alchemyapi.io/v2/${ALCHEMY_KEY}`,
      accounts: [`0x${ACCOUNT_PRIVATE_KEY}`]
    },
    // ethereum: {
    //   chainId: 1,
    //   url: `https://eth-mainnet.alchemyapi.io/v2/${ALCHEMY_KEY}`,
    //   accounts: [`0x${ACCOUNT_PRIVATE_KEY}`]
    // },
  },
}
Then I created a deploy script in the scripts folder with the deploy task:
// scripts/deploy.js
const { task } = require("hardhat/config");
const { getAccount } = require("./helpers");

task("deploy", "Deploys the TokenV2.sol contract").setAction(async function (taskArguments, hre) {
  const tokenFactory = await hre.ethers.getContractFactory("TokenV2", getAccount());
  const token = await tokenFactory.deploy();
  await token.deployed();
  console.log(`Contract deployed to address: ${token.address}`);
});
The problem is that when I run npx hardhat deploy, the terminal shows this error: Error: unsupported getDefaultProvider network (operation="getDefaultProvider", network="hardhat", code=NETWORK_ERROR, version=providers/5.5.3). What did I miss? I would appreciate any help.
I never used defaultNetwork in my projects; I just had the following hardhat.config file and had no issues at all:
const { ALCHEMY_KEY, ACCOUNT_PRIVATE_KEY } = process.env;

module.exports = {
  solidity: "0.8.4",
  networks: {
    rinkeby: {
      url: `https://eth-rinkeby.alchemyapi.io/v2/${ALCHEMY_KEY}`,
      accounts: [`0x${ACCOUNT_PRIVATE_KEY}`]
    }
  }
}

Truffle contract verify not working on BSC testnet

I am trying to verify my contract deployed from Truffle and I am getting an "Etherscan has no support for network testnet with id 97" error. I am working with BscScan and I deployed my contract on the BSC testnet.
How can I solve this problem?
My truffle-config.js
const HDWalletProvider = require('truffle-hdwallet-provider');
const fs = require('fs');
const mnemonic = fs.readFileSync(".secret").toString().trim();
const BSCSCANAPIKEY = fs.readFileSync("apikey").toString().trim();

module.exports = {
  networks: {
    development: {
      host: "127.0.0.1", // Localhost (default: none)
      port: 8545,        // Standard BSC port (default: none)
      network_id: "*",   // Any network (default: none)
    },
    testnet: {
      provider: () => new HDWalletProvider(mnemonic, `https://data-seed-prebsc-1-s1.binance.org:8545`),
      network_id: 97,
      confirmations: 1,
      timeoutBlocks: 200,
      skipDryRun: true
    },
    bsc: {
      provider: () => new HDWalletProvider(mnemonic, `https://bsc-dataseed1.binance.org`),
      network_id: 56,
      confirmations: 10,
      timeoutBlocks: 200,
      skipDryRun: true
    },
  },

  // Set default mocha options here, use special reporters etc.
  mocha: {
    // timeout: 100000
  },

  // Configure your compilers
  compilers: {
    solc: {
      version: "0.6.12"
    }
  },

  plugins: [
    'truffle-plugin-verify'
  ],

  api_keys: {
    bscscan: BSCSCANAPIKEY
  },
}
Result:
> truffle run verify MyToken@{address} --network testnet
Etherscan has no support for network testnet with id 97
Install the latest version of truffle-plugin-verify. At the time of writing, the latest version is 0.5.4:
npm install truffle-plugin-verify@^0.5.4 -D
Why does this happen?
In this file (https://github.com/rkalis/truffle-plugin-verify/blob/32ab0f698b1e151849ab463357cded664c5cffa3/constants.js)
you can see the last two API_URLS entries (56 & 97). They were added in a newer version of the plugin than the one you had installed.
const API_URLS = {
  1: 'https://api.etherscan.io/api',
  3: 'https://api-ropsten.etherscan.io/api',
  4: 'https://api-rinkeby.etherscan.io/api',
  5: 'https://api-goerli.etherscan.io/api',
  42: 'https://api-kovan.etherscan.io/api',
  56: 'https://api.bscscan.com/api',
  97: 'https://api-testnet.bscscan.com/api'
}

Serverless framework lambda function access denied to S3

Anyone have any ideas why I'm getting "Access Denied" when trying to put an object into S3 inside a Lambda function? I have the Serverless AWS user with AdministratorAccess and allow access to the S3 resource inside serverless.yml:
iamRoleStatements:
  - Effect: Allow
    Action:
      - s3:PutObject
    Resource: "arn:aws:s3:::*"
Edit - here are the files
serverless.yml
service: testtest
app: testtest
org: workx

provider:
  name: aws
  runtime: nodejs12.x
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
      Resource: "arn:aws:s3:::*/*"

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: users/create
          method: get
handler.js
'use strict';
const AWS = require('aws-sdk');

// get reference to S3 client
const S3 = new AWS.S3();

// Upload the content to S3 and allow download
async function uploadToS3(content) {
  console.log('going to upload to s3!');
  const Bucket = 'mtest-exports';
  const key = 'testtest.csv';
  try {
    const destparams = {
      Bucket,
      Key: key,
      Body: content,
      ContentType: "text/csv",
    };
    console.log('going to put object', destparams);
    const putResult = await S3.putObject(destparams).promise();
    return putResult;
  } catch (error) {
    console.log(error);
    throw error;
  }
}

module.exports.hello = async event => {
  const result = await uploadToS3('hello world');
  return {
    statusCode: 200,
    body: JSON.stringify(result),
  };
};
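One detail worth checking with S3 policies like the ones above: object-level actions such as s3:PutObject only match object ARNs (the ones ending in /*), while a bare bucket ARN only matches bucket-level actions like s3:ListBucket. A quick sketch of the distinction, using the bucket name from the handler above:

```javascript
// Object-level actions (s3:PutObject, s3:GetObject) need the "/*" object ARN;
// bucket-level actions (s3:ListBucket) use the bare bucket ARN.
const bucketArn = 'arn:aws:s3:::mtest-exports';
const objectArn = `${bucketArn}/*`;

console.log(bucketArn); // arn:aws:s3:::mtest-exports      -> for s3:ListBucket
console.log(objectArn); // arn:aws:s3:::mtest-exports/*    -> for s3:PutObject
```

That is why the edited serverless.yml uses "arn:aws:s3:::*/*" for the PutObject statement.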
I was using the TypeScript plugin @serverless/typescript. I used it to create a Lambda function that resizes images uploaded to S3 and does some content moderation.
Here is the content of the serverless.ts file:
import type { AWS } from '@serverless/typescript';

import resizeImageLambda from '@functions/resizeImageLambda';

const serverlessConfiguration: AWS = {
  service: 'myservice-image-resize',
  frameworkVersion: '3',
  plugins: ['serverless-esbuild'],
  provider: {
    name: 'aws',
    stage: 'dev',
    region: 'us-east-1',
    profile: 'myProjectProfile', // reference to your local AWS profile created by serverless config command
    // architecture: 'arm64', // to support Lambda w/ graviton
    iam: {
      role: {
        statements: [
          {
            Effect: 'Allow',
            Action: [
              's3:GetObject',
              's3:PutObject',
              's3:PutObjectAcl',
              's3:ListBucket',
              'rekognition:DetectModerationLabels'
            ],
            Resource: [
              'arn:aws:s3:::myBucket/*',
              'arn:aws:s3:::myBucket',
              'arn:aws:s3:::/*',
              '*'
            ]
          },
          {
            Effect: 'Allow',
            Action: [
              's3:ListBucket',
              'rekognition:DetectModerationLabels'
            ],
            Resource: ['arn:aws:s3:::myBucket']
          }
        ]
      }
    },
    // architecture: 'arm64',
    runtime: 'nodejs16.x',
    environment: {
      AWS_NODEJS_CONNECTION_REUSE_ENABLED: '1',
      NODE_OPTIONS: '--enable-source-maps --stack-trace-limit=1000',
      SOURCE_BUCKET_NAME:
        '${self:custom.myEnvironment.SOURCE_BUCKET_NAME.${self:custom.myStage}}',
      DESTINATION_BUCKET_NAME:
        '${self:custom.myEnvironment.DESTINATION_BUCKET_NAME.${self:custom.myStage}}'
    }
  },
  // import the function via paths
  functions: { resizeImageLambda },
  package: { individually: true },
  custom: {
    esbuild: {
      bundle: true,
      minify: false,
      sourcemap: true,
      exclude: ['aws-sdk'],
      target: 'node16',
      define: { 'require.resolve': undefined },
      platform: 'node',
      concurrency: 10,
      external: ['sharp'],
      packagerOptions: {
        scripts:
          'rm -rf node_modules/sharp && SHARP_IGNORE_GLOBAL_LIBVIPS=1 npm install --arch=x64 --platform=linux --libc=glibc sharp'
      }
    },
    myEnvironment: {
      SOURCE_BUCKET_NAME: {
        dev: 'myBucket',
        prod: 'myBucket-prod'
      },
      DESTINATION_BUCKET_NAME: {
        dev: 'myBucket',
        prod: 'myBucketProd'
      }
    },
    myStage: '${opt:stage, self:provider.stage}'
  }
};

module.exports = serverlessConfiguration;
resizeImageLambda.ts
/* eslint-disable no-template-curly-in-string */
// import { Config } from './config';

export const handlerPath = (context: string) =>
  `${context.split(process.cwd())[1].substring(1).replace(/\\/g, '/')}`;

export default {
  handler: `${handlerPath(__dirname)}/handler.main`,
  events: [
    {
      s3: {
        bucket: '${self:custom.myEnvironment.SOURCE_BUCKET_NAME.${self:custom.myStage}}',
        event: 's3:ObjectCreated:*',
        existing: true,
        forceDeploy: true // for existing buckets
      }
    }
  ],
  timeout: 15 * 60, // 15 min
  memorySize: 2048
};
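For reference, handlerPath simply turns the function's absolute directory into a path relative to the project root, normalized to forward slashes. A plain-JS sketch of its behavior (the sample directory is illustrative):

```javascript
// Plain-JS version of the handlerPath helper above: strip the project root
// from an absolute directory and normalize Windows backslashes.
const handlerPath = (context) =>
  `${context.split(process.cwd())[1].substring(1).replace(/\\/g, '/')}`;

// Illustrative directory under the current project root
const dir = `${process.cwd()}/src/functions/resizeImageLambda`;
console.log(handlerPath(dir)); // src/functions/resizeImageLambda
```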
I remember there were a few issues when I wanted to connect it to existing buckets (created outside the Serverless framework), such as the IAM policy not being re-created / updated properly (see the forceDeploy and existing parameters in the function.events[0].s3 properties in the resizeImageLambda.ts file).
Turns out I was an idiot and had the custom config in the wrong place, which ruined the serverless.yml file!

How can we setup the BitGo testnet and livenet?

How can we set up the BitGo testnet and livenet for the code below?
I want to set up a MultiSig wallet and run our own Bitcoin node, but it seems BitGo doesn't open-source the node code for setup.
If anybody has done this, please let me know:
var bitcoin = require('bitcoinjs-lib');

exports.Environments = {
  prod: {
    uri: 'https://www.bitgo.com',
    networks: {
      btc: bitcoin.networks.bitcoin
    },
    network: 'bitcoin',
    ethNetwork: 'ethereum',
    rmgNetwork: 'rmg',
    signingAddress: '1BitGo3gxRZ6mQSEH52dvCKSUgVCAH4Rja',
    serverXpub: 'xpub661MyMwAqRbcEtUgu9HF8ai4ip'
  },
  rmgProd: {
    uri: 'https://rmg.bitgo.com',
    networks: {
      btc: bitcoin.networks.bitcoin
    },
    network: 'bitcoin',
    ethNetwork: 'ethereum',
    rmgNetwork: 'rmg',
    signingAddress: '1BitGo3gxRZ6mQSEH52dvCKSUgVCAH4Rja',
    serverXpub: 'xpub661MyMwAqRbcEtUgu9HF8ai4ipuVKK'
  },
  staging: {
    uri: 'https://staging.bitgo.com',
    networks: {
      btc: bitcoin.networks.bitcoin
    },
    network: 'bitcoin',
    ethNetwork: 'ethereum',
    rmgNetwork: 'rmg',
    signingAddress: '1BitGo3gxRZ6mQSEH52dvCKSUgVCAH4Rja',
    serverXpub: 'xpub661MyMwAqRbcEtUg'
  },
  rmgStaging: {
    uri: 'https://rmgstaging.bitgo.com',
    networks: {
      btc: bitcoin.networks.bitcoin
    },
    network: 'bitcoin',
    ethNetwork: 'ethereum',
    rmgNetwork: 'rmg',
    signingAddress: '1BitGo3gxRZ6mQSEH52dvCKSUgVCAH4Rja',
    serverXpub: 'xpub661MyMwAqRbcEtUgu9HF8ai4ipuVKK'
  },
  test: {
    uri: 'https://test.bitgo.com',
    networks: {
      tbtc: bitcoin.networks.testnet
    },
    network: 'testnet',
    ethNetwork: 'ethereum',
    rmgNetwork: 'rmgTest',
    signingAddress: 'msignBdFXteehDEgB6DNm7npRt7AcEZJP3',
    serverXpub: 'xpub661MyMwAqRbcErFqVXGiUFv9YeoPbh'
  },
All BitGo provides is an API for creating wallets and transacting.
Use BitGo if you either need the BitGo JS SDK or BitGo Express.
For testnet you need to create an account on https://test.bitgo.com, while for livenet you need an account on https://bitgo.com. After creating the accounts you need to create API keys and implement the code (preferably in Node.js).
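Given the Environments map in the question, switching between testnet and livenet comes down to selecting the right entry by name. A minimal sketch (the map below is trimmed to two environments and two fields; the URIs come from the snippet above, and environmentFor is an illustrative helper name):

```javascript
// Trimmed environment map (URIs from the question's snippet); choosing
// 'test' vs 'prod' is how you target BitGo testnet vs livenet.
const Environments = {
  prod: { uri: 'https://www.bitgo.com', network: 'bitcoin' },
  test: { uri: 'https://test.bitgo.com', network: 'testnet' },
};

function environmentFor(name) {
  const env = Environments[name];
  if (!env) throw new Error(`Unknown BitGo environment: ${name}`);
  return env;
}

console.log(environmentFor('test').uri); // https://test.bitgo.com
```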
maybe try this https://github.com/bitpay/bitcore-node