Selenium test not loading some specific URLs

I'm using Selenium through Python on an AWS Linux server. When the test starts, it doesn't load the page. The strange thing is that if I run a test against a URL from Google or Facebook, it works. I used the curl and links commands to confirm that I have access from the server, and they work, so I'm not sure what the issue could be.
Any help is appreciated.
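
A minimal debugging sketch, assuming a headless AWS Linux box running Chrome: the extra flags below are commonly needed on display-less servers, and an explicit page-load timeout surfaces the underlying error instead of hanging. The URL is a placeholder for the page that fails.

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.common.exceptions import TimeoutException

options = Options()
options.add_argument("--headless")               # no display on the server
options.add_argument("--no-sandbox")             # often required when run as root
options.add_argument("--disable-dev-shm-usage")  # avoid /dev/shm exhaustion

driver = webdriver.Chrome(options=options)
driver.set_page_load_timeout(30)
try:
    driver.get("https://example.com/problem-page")  # placeholder: the URL that fails
    print(driver.title)
except TimeoutException:
    print("Page load timed out; the site may be blocking the server's IP")
finally:
    driver.quit()
```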

Related

Aliexpress detecting automated browser, human verification fail in Selenium chrome driver

I want to write a script to scrape some data from AliExpress, so I need to be logged in first. When I try to log in, AliExpress sometimes asks for a slide-to-verify check. I wrote a script to do this in an automated way using Selenium actions, but it gives the following error:
Screenshot of error
I also tried doing it manually, but I get the same error, so it seems AliExpress is somehow detecting that it is running in an automated environment. Can you please help me avoid that?
Resources I used:
I used chromedriver with both the Brave browser and Chrome, but neither worked.
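
No answer was posted here, but a commonly suggested first step is to hide the most obvious automation fingerprints. This is only a sketch under that assumption: it masks basic signals such as navigator.webdriver and may well not be enough against AliExpress's server-side checks.

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--disable-blink-features=AutomationControlled")
options.add_experimental_option("excludeSwitches", ["enable-automation"])
options.add_experimental_option("useAutomationExtension", False)

driver = webdriver.Chrome(options=options)
# navigator.webdriver is a well-known flag that bot-detection scripts check
driver.execute_cdp_cmd(
    "Page.addScriptToEvaluateOnNewDocument",
    {"source": "Object.defineProperty(navigator, 'webdriver', {get: () => undefined})"},
)
driver.get("https://www.aliexpress.com")
```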

How we can automate real browser instead of using selenium browser instance

I am trying to scrape a website, but it is not loading in Selenium. When I browse that website in my "real" Chrome browser, everything works fine. Is there any way I can use my real browser with Python to automate things, instead of using Selenium?
Thanks
Selenium does automate real browsers.
If the website is not loading via Selenium, you can check whether adding desired capabilities helps. There you can set a proxy, disable extensions, and so on; many options are available:
https://chromedriver.chromium.org/capabilities
Also, if you can share what error is displayed, that would be helpful.
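
If you really want to drive the Chrome instance you use day to day rather than one launched by Selenium, one approach is to start Chrome with remote debugging enabled and attach to it. A sketch, assuming port 9222 and a scratch profile directory; both are illustrative choices.

```python
# First start Chrome yourself with remote debugging enabled, e.g.:
#   chrome --remote-debugging-port=9222 --user-data-dir=/tmp/real-profile
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_experimental_option("debuggerAddress", "127.0.0.1:9222")

driver = webdriver.Chrome(options=options)  # attaches instead of launching a new browser
driver.get("https://example.com")           # placeholder URL
print(driver.title)
```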

see firefox when executing robot test in docker

I'm using Robot Framework with Selenium and Firefox.
I'm running the tests in Docker.
But for debugging purposes I sometimes want to see what's happening on the UI.
So, is there a way to see the Firefox UI while the tests are running in Docker?
Thanks
You can use VNC to see what is actually happening inside the Docker container. It's like remote access, where you can see everything running in the Docker image.
Download VNC Viewer:
https://www.realvnc.com/en/connect/download/viewer/
A tutorial:
https://medium.com/@shivam.somani09/running-automated-test-cases-on-vnc-viewer-using-docker-16656c3d1d87
Or, if you don't want VNC:
You can take a screenshot if you just want an image at a particular point.
You can also use driver.getPageSource(); to get the HTML source.
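
For the non-VNC route, here is a minimal Python sketch of the two ideas above (the answer's getPageSource() call is the Java spelling; in the Python bindings it is the page_source property). The URL is a placeholder.

```python
from selenium import webdriver

driver = webdriver.Firefox()
driver.get("https://example.com")    # placeholder URL for illustration
driver.save_screenshot("debug.png")  # image of the page at this moment
html = driver.page_source            # Python equivalent of getPageSource()
print(len(html), "bytes of HTML captured")
driver.quit()
```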

Phantomjs - Silent Browser

I was using the PhantomJS driver to execute my application's test scenario.
In a recent update, individual certificates were implemented for each machine, so once I enter my application URL and press the Enter key, the URL is redirected to an internal server where I need to provide a user ID and password to log in.
This scenario works well in ChromeDriver, but when I'm using PhantomJS, the elements are not recognized.
I tried to take screenshots, but only a blank image came out.
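
No answer was posted, but blank PhantomJS screenshots on certificate-protected sites are often an SSL handshake problem. A sketch, assuming the older Selenium 3.x bindings that still ship a PhantomJS driver (support was removed in later versions) and an illustrative internal URL:

```python
from selenium import webdriver

service_args = [
    "--ignore-ssl-errors=true",  # don't abort on the internal certificate
    "--ssl-protocol=any",        # negotiate whatever TLS version the server offers
]
driver = webdriver.PhantomJS(service_args=service_args)
driver.set_window_size(1280, 1024)  # PhantomJS defaults to a tiny viewport
driver.get("https://internal.example.com/login")  # placeholder URL
driver.save_screenshot("after_redirect.png")
driver.quit()
```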

Import.io some crawlers don't have the button for crawl locally

I was creating some crawlers using import.io; however, it seems that for some of them the option to run locally is not showing. Does anyone know why they don't have the run-locally button, or how I can add it to those crawlers?
If you don't see the option to run the crawler [remotely or locally], it means that your crawler is already running locally only.
When you save a crawler, import.io runs a few checks to see whether it can be run remotely on our servers; in some cases this increases the chances of the crawler working, as the servers do additional processing.
If those checks fail, the crawler can only run locally, and therefore your crawler will run locally by default.