Chromium won't load any websites and keeps crashing

All of a sudden my scripts aren't working anymore.
const puppeteer = require("puppeteer");

async function run() {
  const browser = await puppeteer.launch({
    headless: false
  });
  const page = await browser.newPage();
  await page.goto("https://www.google.com");
  // browser.close();
}

run();
When I run node index.js, Chromium launches but the window is all white; then my cursor turns into the spinning rainbow wheel (macOS), the browser crashes, and I get the following error:
(node:37226) UnhandledPromiseRejectionWarning: Error: Navigation failed because browser has disconnected!
Thanks for any help!

I had the same problem on macOS and solved it by adding '--no-sandbox' to the Chrome args.
You can expose it as an env variable like so:
CHROME_ARGS=--no-sandbox
or add it to your puppeteer.launch configuration:
const browser = await puppeteer.launch({
  headless: false,
  args: ['--no-sandbox']
});
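As far as I know, Puppeteer itself doesn't pick up CHROME_ARGS, so if you go the env-variable route you'd have to forward it into the launch call yourself. A minimal sketch, assuming a space-separated list of flags:
// Sketch: forward CHROME_ARGS from the environment into the launch call
// (assumes e.g. CHROME_ARGS=--no-sandbox was exported in the shell)
const args = (process.env.CHROME_ARGS || '').split(' ').filter(Boolean);
const browser = await puppeteer.launch({
  headless: false,
  args
});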

React Native Fetch Blob / React Native Blob Util fails in production but not in development

I am trying to download a PDF generated by my own API. This worked fine until yesterday, when it stopped working without any modification on my side. Reviewing in developer mode through Metro, everything seems to work correctly (the PDF downloads normally), but when the application is deployed to the Play Store it closes unexpectedly, and I don't know why this happens.
First I was using React Native Fetch Blob, then I switched to React Native Blob Util hoping it would solve the problem, but it keeps happening. Does anyone have any idea why this happens?
This function downloads the PDF file:
const generarReciboPdf = async (datosDeuda: FormularioScreenParams, formaPago: ReactText, nroCheque?: string) => {
  const { config, fs } = ReactNativeBlobUtil;
  // A cheque number is required when the payment method is cheque (4)
  if (formaPago === 4 && (nroCheque === "" || nroCheque === undefined)) {
    return console.log('debe llenar todos los campos'); // "you must fill in all the fields"
  }
  const { DownloadDir } = fs.dirs;
  const token = await AsyncStorage.getItem('token');
  let datosDeudaCompleto = {
    ...datosDeuda,
    forma_pago: formaPago,
    nro_cheque: nroCheque
  };
  // Note: the payload is serialized straight into the URL path, unencoded
  let url = 'https://sys.arco.com.py/api/appRecibos/generarReciboPdf/' + JSON.stringify(datosDeudaCompleto);
  // return console.log(url);
  const options = {
    fileCache: true,
    addAndroidDownloads: {
      useDownloadManager: true, // true uses the native manager and shows the download in the notification bar
      notification: true,
      mime: 'application/pdf',
      path: `${DownloadDir}/recibo_${datosDeuda.mes_deuda.replace(/ /g, "")}_${datosDeuda.razon_social.replace(/ /g, "")}.pdf`,
      description: 'Downloading.',
    },
  };
  config(options)
    .fetch('GET', url, { Authorization: 'Bearer ' + token })
    .then((res) => {
      console.log('se imprimió correctamente el pdf'); // "the pdf was generated correctly"
      setErrorPdf(1);
    })
    .catch((error) => {
      console.log(error);
      setErrorPdf(-1);
    });
};
Also, this error appears in the Play Console:
[Play Console error screenshot]
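For what it's worth, one fragile spot in the snippet above is that the JSON payload is interpolated into the URL path unencoded, so spaces or special characters in the data produce an invalid URL. A safer construction (a sketch, same endpoint assumed):
// Sketch: percent-encode the JSON payload before putting it in the URL path
let url = 'https://sys.arco.com.py/api/appRecibos/generarReciboPdf/'
  + encodeURIComponent(JSON.stringify(datosDeudaCompleto));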

Playwright test works fine locally, but fails in the pipeline

I am trying to run this code:
test('should login', async ({ page }) => {
  await page.goto(localhost);
  await page.fill('[name=username]', 'username');
  await page.fill('[name=password]', 'password');
  await page.click('[name=login]');
  await page.waitForURL(`${localhost}/main`);
  const currentUrl = page.url(); // page.url() is synchronous
  expect(currentUrl).toBe(`${localhost}/main`);
});
When I run it locally with npx playwright test, the test passes; but when it runs in CI/CD, it fails:
Timeout of 180000ms exceeded.
page.waitForURL: Navigation failed because page was closed!
=========================== logs ===========================
waiting for navigation to "http://localhost:3000/main" until "load"
============================================================
Any idea what causes this problem?
I would try logging all network requests to see what's going on there:
// Log and continue all network requests
await page.route('**', (route, request) => {
  console.log(request.url());
  route.continue();
});
If it never gets beyond the goto(), perhaps try using the new Playwright request fixture to see if it can even reach http://localhost:3000/main.
await page.request.get("http://localhost:3000/main", { ignoreHTTPSErrors: true });
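As a quick follow-up sketch (assuming the same localhost variable as in the test), you can log the status code to confirm the app server is actually reachable from inside the CI runner:
// Sketch: confirm the app server responds at all from the CI job
const response = await page.request.get(`${localhost}/main`, { ignoreHTTPSErrors: true });
console.log(response.status()); // a connection failure here points at the server, not the test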

Share screen in Safari 13 with Agora SDK

I am using agora-rtc-sdk 3.3.0. When I click on "share screen" and grant permission (the problem is not with getDisplayMedia), Safari reloads the page. The line that causes it is client.publish(stream); if I comment out this line, the stream is created successfully (according to the console), and so is the client, but I can't publish my screen-sharing stream. This bug appears only in Safari 13; other browsers work fine. Here is the relevant part of the code.
const beginShare = async () => {
  agoraShareStream = AgoraRTC.createStream({
    streamID: SHARE_ID,
    audio: false,
    video: false,
    screen: true
  });
  await agoraShareStream.init(() => {
    console.log('init local stream success')
  });
  setShareStream(agoraShareStream);
  enqueueSnackbar("Started screen sharing", { variant: "info" });
}

const createShareClient = async () => {
  const agoraShareClient = AgoraRTC.createClient({
    mode: "rtc",
    codec: "vp8"
  });
  await agoraShareClient.init(appId);
  await agoraShareClient.join(getAgoraToken({ uid: SHARE_ID, channel }), channel, SHARE_ID);
  agoraShareClient.publish(agoraShareStream);
  setShareClient(agoraShareClient);
}
Unfortunately, Agora doesn't support screen sharing on Safari as noted in the documentation here: https://docs.agora.io/en/Video/screensharing_web?platform=Web#introduction
The Agora RTC Web SDK supports screen sharing in the following desktop browsers:
Chrome 58 or later
Firefox 56 or later
Edge 80 or later on Windows 10+
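Until Safari is supported, one workaround is to gate the share-screen UI so client.publish() is never reached there. A minimal sketch (a plain user-agent check, not an Agora API):
// Sketch: in the share-button click handler, skip Safari instead of letting
// client.publish() reload the page (simple user-agent check, not an Agora API)
const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
if (isSafari) {
  enqueueSnackbar("Screen sharing is not supported in Safari", { variant: "warning" });
} else {
  beginShare();
}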

Puppeteer - Connecting to HTTPS website through a proxy with authentication doesn't work

I have been trying to solve this issue for the past two days and haven't been able to. I've looked this up everywhere and still found no solution. Here's the code:
const puppeteer = require('puppeteer-extra');
const StealthPlugin = require('puppeteer-extra-plugin-stealth');
puppeteer.use(StealthPlugin());

const PROXY_SERVER_IP = 'IP.IP.IP.IP';
const PROXY_SERVER_PORT = '1234';
const PROXY_USERNAME = 'username';
const PROXY_PASSWORD = 'password';

(async () => {
  const browser = await puppeteer.launch({
    args: [`--proxy-server=http://${PROXY_SERVER_IP}:${PROXY_SERVER_PORT}`],
  });
  const page = await browser.newPage();
  await page.authenticate({
    username: PROXY_USERNAME,
    password: PROXY_PASSWORD,
  });
  await page.goto('https://www.google.ca/', {
    timeout: 0,
  });
  await page.screenshot({ path: 'test4.png', fullPage: true });
  await browser.close();
})();
I get a navigation timeout error on the page.goto() call because it just hangs, and I can't figure out why. When I use a proxy that doesn't require authentication, it works. I'm thinking of switching to another headless solution because of this one issue, and I would really appreciate some help.
So I figured it out. It turns out the proxy was just really slow. The reason Axios and cURL gave fast responses is that they only fetch the initial HTML and, unlike headless browsers, don't do anything with it. Headless browsers also make all the requests for the page's assets (CSS, images, etc.) and any other network requests, and all of that goes through the proxy, so it's much slower. When I tried a different proxy (one that also requires authentication), it was much faster.
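If you're stuck routing everything through a slow proxy, one mitigation is to abort requests for heavy assets so less traffic goes through it. A minimal sketch using Puppeteer's standard request interception (placed before the page.goto() call):
// Sketch: skip images, stylesheets, fonts and media so less traffic
// has to flow through the slow proxy
await page.setRequestInterception(true);
page.on('request', (request) => {
  if (['image', 'stylesheet', 'font', 'media'].includes(request.resourceType())) {
    request.abort();
  } else {
    request.continue();
  }
});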

Chromium/Chrome running headless: wait for page load when using --screenshot?

Trying to take a screenshot using headless Chrome, but it executes --screenshot before the page is loaded. On macOS running zsh, I'm invoking it like this:
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --headless https://example.com/foo.html --screenshot=/Users/myhome/foo.png
Is there a way to wait until the page is loaded before the screenshot is taken?
The answer, it turns out, is that those command-line options don't work well and have been supplanted by Puppeteer. Here's the complete script I created (values are hardcoded for brevity):
const puppeteer = require('puppeteer');

// Simple fixed delay to give the page time to finish rendering
const sleep = (milliseconds) => {
  return new Promise(resolve => setTimeout(resolve, milliseconds));
};

async function run() {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://twitter.com/gnuman1979/status/1239523796542992387');
  await sleep(2000);
  await page.screenshot({ path: './foo.png' });
  await browser.close();
}

run();
Save as foo.js, then run using node:
$ node foo.js
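If the fixed two-second sleep proves flaky, page.goto also accepts a waitUntil option, so you can wait for the network to go quiet instead of guessing a delay. A minimal variant of the goto line:
// Alternative: resolve goto once there have been no network connections for 500 ms
await page.goto('https://twitter.com/gnuman1979/status/1239523796542992387', {
  waitUntil: 'networkidle0'
});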