Getting ERR_CERT_AUTHORITY_INVALID with axios - vue.js

I'm trying to do a POST request over HTTPS with vue-axios.
However, since I'm using a self-signed certificate that I created, I'm getting the following error:
net::ERR_CERT_AUTHORITY_INVALID
Upon searching, I found that most people solve this by doing the following:
const instance = axios.create({
  httpsAgent: new https.Agent({
    rejectUnauthorized: false
  })
});
instance.get('https://something.com/foo');

// At request level
const agent = new https.Agent({
  rejectUnauthorized: false
});
axios.get('https://something.com/foo', { httpsAgent: agent });
I tried both options but didn't have any success with them.
I used the Node https module for the https.Agent.
Does anyone know how to solve this problem, or should I just switch from axios to another module?
Edit:
This is the piece of code I'm running that produces the error at the moment:
const axiosInstance = axios.create({
  baseURL: 'https://localhost:5000',
  httpsAgent: new https.Agent({
    rejectUnauthorized: false
  }),
});

axiosInstance.post('/user', LoginRequest,
  { headers: { 'Content-Type': 'application/json' } })
  .then(response => this.assignLogin(response.data));
I also tried switching to a module named needle, and to the plain https module, but got the same error in both cases:
needle:
const headers = { 'Content-Type': 'application/json' };
const options = {
  method: 'POST',
  headers: headers,
  rejectUnauthorized: false,
  requestCert: true,
  agent: false,
  strictSSL: false,
};

needle.post('https://localhost:5000/user', LoginRequest, options).on('end', function() { });
https:
const options = {
  hostname: 'localhost',
  port: 5000,
  path: '/user',
  strictSSL: false,
  rejectUnauthorized: false,
  secureProtocol: 'TLSv1_method',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
};

const req = https.request(options, (res) => {
  console.log('statusCode:', res.statusCode);
  console.log('headers:', res.headers);

  res.on('data', (d) => {
    this.assignLogin(d);
  });
});

req.on('error', (e) => {
  console.error(e);
});

req.write(LoginRequest);
req.end();

Since you mention that you are "using a self-signed certificate that you created", I assume you are using it for local development. I had a similar issue when testing locally in Chrome.
This error message (net::ERR_CERT_AUTHORITY_INVALID) is Chrome's way of blocking a URL with an "unsafe" certificate, so you need to resolve it in Chrome itself, telling it that you trust the certificate.
The solution I use is the old thisisunsafe trick. (ONLY USE THIS SOLUTION IF YOU REALLY TRUST THE CERTIFICATE, I.E., IT'S YOUR OWN CERTIFICATE.)
SOLUTION: Open a tab in Chrome and navigate to your server's address (in your case, https://localhost:5000/). Chrome will display a warning page; click anywhere in the window and type thisisunsafe. Chrome will then allow access to this certificate. When you reload the client and request the server again, it will work.

Related

next.js 404 fail to load resource /api/ on vercel

I'm building a Next.js website that I deploy on Vercel.
I made a Next.js API route, /api/contact, which sends a mail via Nodemailer. It works fine when I run the code on my PC, but when I deploy it to Vercel (with the GitHub integration) I get a "404 failed to load resource" for /api/contact in the console and it doesn't work.
Is there any more configuration needed for a Next.js API route to work on Vercel?
Here is the code for the API call:
fetch("/api/contact", {
method: "POST",
headers: {
Accept: "application/json, text/plain, */*",
"Content-Type": "application/json",
},
body: JSON.stringify(data),
}).then((res) => {
contact.js in api folder
So here are the answers that worked:
The "failed to load resource" error disappeared after several deployments, but I still had a 404.
The problem was that transporter.sendMail needed to be awaited inside an async function, and I also had issues with Gmail, so I ended up using another mail provider (Zoho). For anyone facing the same issues, here is working code (maybe not the best, but it works):
export default async (req, res) => {
  require('dotenv').config();
  const nodemailer = require('nodemailer');

  async function mail() {
    console.log('enter async function');
    const transporter = nodemailer.createTransport({
      name: "smtp.zoho.com",
      port: 465,
      host: "smtp.zoho.com",
      auth: {
        user: process.env.mailsender,
        pass: process.env.mailpw,
      },
      secure: true,
    });

    let mail = await transporter.sendMail({
      from: process.env.mailsender,
      to: process.env.mailreceive,
      subject: `${req.body.message}`,
      text: `${req.body.message}`,
      html: `<div><p>${req.body.message}</p></div>`
    });
  }

  try {
    console.log('sending mail');
    await mail();
    res.status(200);
    console.log('mail should be sent');
  } catch (error) {
    console.log(error);
    console.log('error sending mail');
    res.status(404);
  } finally {
    res.end();
  }
}
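One follow-up note: if the client-side code parses the response with res.json(), an empty 200 body will make that call throw. A small hedged tweak, assuming the standard Next.js API response helpers, is to send a minimal JSON payload on success:

// Assumption: standard Next.js API route response object.
// Sending a small JSON body gives a client-side res.json() call something to parse.
res.status(200).json({ success: true });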

createProxyMiddleware not working on Azure Webapp

I'm running an Angular Universal application that is talking to an API. Now I'm trying to set up a proxy in the Universal server that proxies API requests to the actual API server:
server.use(['/api', '/sitemap.txt'], createProxyMiddleware({
  target: process.env.API_URL,
  onProxyReq: req => {
    console.log('Using origin: ' + getOrigin(req.getHeaders()));
    req.setHeader('origin', getOrigin(req.getHeaders()));
  },
  pathRewrite: {'^/api': ''}
}));
This works perfectly when running locally, but when running it on the server (an Azure Web App) it doesn't. I can see the console log being called in the Web App logs, but the resulting document is the Angular application showing a "page not found" message.
I'm totally out of ideas on where to look for a solution.
Edit:
I tried another proxy middleware and it does do the trick. This code works both locally and on Azure.
import * as proxy from 'express-http-proxy';
// ...
server.use(['/api', '/sitemap.txt'], proxy(process.env.API_URL, {
  proxyPathResolver: req => {
    let url: string = req.url;
    if (url.startsWith('/api')) {
      url = url.substr(4);
    }
    return url;
  },
  proxyReqOptDecorator(proxyReqOpts, srcReq) {
    proxyReqOpts.headers['origin'] = getOrigin(proxyReqOpts.headers);
    return proxyReqOpts;
  }
}));
But it has some other limitations that make it unusable for our project, so I still need this resolved.
I have it working correctly now. This is the current setup:
server.use(
  '/api',
  createProxyMiddleware({
    target: process.env.API_URL,
    changeOrigin: true,
    headers: {
      Connection: 'keep-alive',
    },
    onProxyReq: (proxyReq, req, _res) => {
      proxyReq.setHeader('origin', getOrigin(req.headers));
    },
    pathRewrite: {
      '^/api': '',
    },
  })
);
So I added changeOrigin and the keep-alive header. I'm not sure which of the two resolved the issue; once I got it to work I never bothered to find out. I suspect it's the header, though.
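For reference, the getOrigin helper used above isn't shown in the snippets. A minimal hypothetical version, assuming it only needs to derive an origin string from the incoming headers, could look like this:

// Hypothetical stand-in for the getOrigin helper referenced above; the real
// helper is not shown in the question. Derives an origin from the headers,
// falling back to a configured default.
function getOrigin(headers) {
  if (headers && headers.origin) {
    return headers.origin;
  }
  if (headers && headers.referer) {
    try {
      return new URL(headers.referer).origin; // use the referer's origin
    } catch (e) {
      // malformed referer; fall through to the default
    }
  }
  return process.env.DEFAULT_ORIGIN || 'http://localhost:4200'; // assumed default
}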

Android: Network Request Failed when trying to upload image with fetch

I'm trying to upload an image from storage to a RESTful API, but I keep getting Network Request Failed on Android (which means the request doesn't even go through). I haven't checked on iOS because I don't need that part yet. The API is already working and has been tested with Postman.
The React Native code is:
body.append('vehicles', {
  resource_id: 2,
  resource: 'vehicles',
  cat_file_id: fileId,
  active: 1,
  vehicles: photo, // <- photo value below
  name: 'vehicles',
  type: 'image/jpeg'
})
fetch(`${BASE_URL}/files`, {
  method: 'POST',
  headers: {
    'Content-Type': 'multipart/form-data',
    Accept: "*/*",
    Authorization: 'Bearer ' + auth
  },
  body: body
}).then(response => response.json())
  .then(response => {
    console.log('IMAGE RESPONSE', response)
  })
  .catch(error => console.log('ERROR', error))
The photo value looks like file:///storage/emulated/0/DCIM/...
The response:
ERROR TypeError: Network request failed
at XMLHttpRequest.xhr.onerror (fetch.umd.js:473)
at XMLHttpRequest.dispatchEvent (event-target-shim.js:818)
at XMLHttpRequest.setReadyState (XMLHttpRequest.js:574)
at XMLHttpRequest.__didCompleteResponse (XMLHttpRequest.js:388)
at XMLHttpRequest.js:501
at RCTDeviceEventEmitter.emit (EventEmitter.js:189)
at MessageQueue.__callFunction (MessageQueue.js:436)
at MessageQueue.js:111
at MessageQueue.__guard (MessageQueue.js:384)
at MessageQueue.callFunctionReturnFlushedQueue (MessageQueue.js:110)
On Postman the request looks something like this:
Already tried:
Removing Accept header
Changing Accept value to 'application/json'
Removing file:// from the image url
Added android:usesCleartextTraffic="true" to the manifest
Already checked:
No values are null or undefined
There is a working internet connection, all other network requests on the app are working fine
The Auth is correct
React Native version is 0.61.5
I found one line missing in your code: let formData = new FormData();. I'm not sure whether that is the exact issue here, though.
By the way, here is a working sample from one of my projects, customized to your context.
Add your authentication.
Replace ImageURI with the image path and URL_SAVE_IMAGE with your endpoint URL.
const newImage = {
  resource_id: 2,
  resource: 'vehicles',
  cat_file_id: 1,
  active: 1,
  vehicles: ImageURI,
  name: "my_photo.jpg",
  type: "image/jpg",
};

let formData = new FormData();
formData.append("vehicles", newImage);

return fetch(URL_SAVE_IMAGE, {
  method: 'POST',
  headers: {
    "Content-Type": "multipart/form-data"
  },
  body: formData
}).then(response => response.json());
it should work!
What backend server is your fetch(`${BASE_URL}/files`) call pointing to? This usually happens when trying to connect to a backend API running on the localhost machine. Even if you use the IP address of localhost the error persists, so it is better to use an online server for testing, or to use ngrok (https://ngrok.com/) to expose your local backend over the internet.
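For example, a quick sketch of the ngrok approach (the port and subdomain below are placeholders; adjust them to your backend):

// Run `ngrok http 3000` in a terminal (replace 3000 with your backend's port),
// then point the app at the HTTPS forwarding URL that ngrok prints,
// instead of a localhost or LAN address.
const BASE_URL = 'https://your-subdomain.ngrok.io'; // placeholder forwarding URL
fetch(`${BASE_URL}/files`, { /* same options as above */ });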
In gradle.properties, change the Flipper version to 0.47.0.
Try it with XHR; it works as expected!
const URL = "ANY_SERVER/upload/image";
const xhr = new XMLHttpRequest();
xhr.open('POST', URL); // the address doesn't really matter; the error occurs before the network request is even made
const data = new FormData();
data.append('image', { uri: image.path, name: 'image.jpg', type: 'image/jpeg' });
xhr.send(data);
xhr.onreadystatechange = e => {
  if (xhr.readyState !== 4) {
    return;
  }
  if (xhr.status === 200) {
    console.log('success', xhr.responseText);
  } else {
    console.log('error', xhr.responseText);
  }
};
Nothing worked for me except using the Expo FileSystem uploadAsync
uploadImage = async ({ imageUri }) => FileSystem.uploadAsync(
  apiUrl,
  imageUri,
  {
    headers: {
      // Auth etc
    },
    uploadType: FileSystem.FileSystemUploadType.MULTIPART,
    fieldName: 'files',
    mimeType: 'image/png',
  });
Note: imageUri is in the format file:///mypath/to/image.png
Happy days!

Hapi send request to current local server

I have GraphQL running on my server, and I have an upload route like this:
server.route({
  config: {
    cors: {
      origin: ['*'],
      credentials: true
    },
    payload: {
      output: 'stream',
      parse: true,
      maxBytes: 50869457,
      allow: 'multipart/form-data'
    },
  },
  method: ['POST', 'PUT'],
  path: '/uploadAvatar',
  handler: (request, reply) => {
    const data = request.payload;
    data.identity = options.safeGuard.authenticate(request);
    // REQUEST TO THE SAME SERVER THIS IS RUNNING ON
  }
});
I want to send a request to the same server the handler is running on, if that makes sense. How do I do that?
By the way, I want to call localhost:3004/graphql when it's running on localhost:3004, but in production it's running on port 80.
You can use hapi's built-in server.inject method for handling internal routing; see hapi's documentation for inject.
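A hedged sketch of how that could look in the handler above, assuming hapi v17+ (where inject returns a promise); the GraphQL operation is a placeholder:

handler: async (request, h) => {
  const data = request.payload;
  data.identity = options.safeGuard.authenticate(request);

  // server.inject dispatches the request internally on this same server,
  // so it works regardless of which port the process is listening on.
  const res = await request.server.inject({
    method: 'POST',
    url: '/graphql',
    payload: {
      query: '...', // placeholder: your GraphQL query or mutation
      variables: {}, // placeholder
    },
    headers: { 'content-type': 'application/json' },
  });

  return h.response(res.result).code(res.statusCode);
}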

fetch: Getting cookies from fetch response

I'm trying to implement client login using fetch in React.
I'm using Passport for authentication. The reason I'm using fetch and not a regular form.submit() is that I want to be able to receive error messages from my Express server, like "username or password is wrong".
I know that Passport can send back messages using flash messages, but flash requires sessions and I would like to avoid them.
This is my code:
fetch('/login/local', {
  method: 'POST',
  headers: {
    Accept: 'application/json',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    username: this.state.username,
    password: this.state.password,
  }),
}).then(res => {
  console.log(res.headers.get('set-cookie')); // undefined
  console.log(document.cookie); // nope
  return res.json();
}).then(json => {
  if (json.success) {
    this.setState({ error: '' });
    this.context.router.push(json.redirect);
  } else {
    this.setState({ error: json.error });
  }
});
The server sends the cookies just fine, as you can see in Chrome's dev tools.
But Chrome doesn't set the cookies; in Application -> Cookies -> localhost:8080 it says "The site has no cookies".
Any idea how to make it work?
The problem turned out to be the fetch option credentials: 'same-origin'/'include' not being set.
The fetch documentation mentions that this option is required for sending cookies with a request, but it fails to mention that it is also required for the browser to accept cookies from the response.
So I just changed my code to be like this:
fetch('/login/local', {
  method: 'POST',
  headers: {
    Accept: 'application/json',
    'Content-Type': 'application/json',
  },
  credentials: 'same-origin',
  body: JSON.stringify({
    username: this.state.username,
    password: this.state.password,
  }),
}).then(res => {
  return res.json();
}).then(json => {
  if (json.success) {
    this.setState({ error: '' });
    this.context.router.push(json.redirect);
  } else {
    this.setState({ error: json.error });
  }
});
From the Differences from jQuery section of the Fetch API documentation on MDN:
fetch() won't receive cross-site cookies. You can't establish a cross-site session using fetch(). Set-Cookie headers from other sites are silently ignored.
fetch() won't send cookies, unless you set the credentials init option. (Since Aug 25, 2017, the spec changed the default credentials policy to same-origin. Firefox changed it since 61.0b13.)
I spent a long time on this and nothing worked for me.
After trying several solutions online, this one finally worked.
Hopefully it will work for you too.
{
  method: "POST",
  headers: {
    "content-type": "API-Key",
  },
  credentials: "include",
}
I had to include credentials: 'include' in the fetch options:
fetch('...', {
  ...
  credentials: 'include', // Need to add this option.
});
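One extra note: if the request is cross-origin, credentials: 'include' only works when the server opts in as well. A minimal sketch, assuming an Express backend using the cors middleware (the origin value is a placeholder for your client's address):

const express = require('express');
const cors = require('cors');

const app = express();

// Credentialed cross-origin requests need an explicit (non-wildcard)
// Access-Control-Allow-Origin plus Access-Control-Allow-Credentials: true.
app.use(cors({
  origin: 'http://localhost:8080', // placeholder: your client's origin
  credentials: true,
}));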