Protractor on offline machine - selenium

Angular 4 CLI project.
We have a private network with a private npm repository (there is no connection to the internet), so after all the modules are downloaded I want to run e2e tests.
Protractor uses webdriver-manager to download the latest ChromeDriver, but it can't download the driver, so I get this error:
getaddrinfo ENOTFOUND chromedriver.storage.googleapis.com chromedriver.storage.googleapis.com:443
I tried downloading the driver manually and pointing to it in the protractor config:
{
  chromeDriver: "../../chromedriver.exe", // I also tried "./chromedriver_2.30.exe"
  ....
}
(I don't know whether chromeDriver is a path relative to the protractor config or to the webdriver-manager module inside Protractor.)
But I keep getting this error. How can I deal with it without any internet connection at all?
By the way, something to consider: we develop on Windows, but how will our CI/CD server (Linux) get a driver suitable for Linux?
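For reference, a Protractor config that pins a local driver typically looks like the sketch below. directConnect and chromeDriver are documented Protractor options; the paths and spec pattern are illustrative, and an absolute path built from __dirname sidesteps the relative-path question raised above:

// protractor.conf.js: a sketch for offline use (paths are illustrative)
const path = require('path');

exports.config = {
  // talk to chromedriver directly instead of going through a Selenium server
  directConnect: true,
  // an absolute path avoids any ambiguity about what it is relative to
  chromeDriver: path.resolve(__dirname, 'drivers/chromedriver.exe'),
  specs: ['./e2e/**/*.e2e-spec.ts'],
  capabilities: { browserName: 'chrome' },
};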

I had a similar issue. After trying different approaches, like manually copying the driver or changing the Protractor module, I found that the best workaround is to install a local web server and provide the required driver for download through that local server. This worked, and it is also useful for providing other files (e.g. files that are downloaded directly during npm install). The steps are listed below.
1. Install Apache on the offline system, or on any other system accessible to it.
2. On an online system, download the driver from https://chromedriver.storage.googleapis.com/ (the site the update command tries to access). Opening that site in a browser displays a file (download.xml) that lists the different versions of the web driver for each platform. You can download the required version by appending the "Key" shown in that file to the end of the URL, e.g. https://chromedriver.storage.googleapis.com/2.33/chromedriver_win32.zip to download version 2.33 of the Chrome driver for Windows. I tried newer versions but found that 2.33 worked on Windows 10 (64-bit) with Chrome 61.
3. Manually copy the downloaded zip file to the offline system, into the Apache htdocs folder, using the same path as in the key, e.g. c:\htdocs\2.33\chromedriver_win32.zip.
4. Make a download.xml file similar to the one on the actual site (https://chromedriver.storage.googleapis.com), but listing only one entry, for the driver version you need (see the sketches after these steps).
5. Modify your Apache config file (conf\httpd.conf) to make download.xml the DirectoryIndex file.
6. Run Apache (bin\httpd.exe).
7. Edit your Windows hosts file to map chromedriver.storage.googleapis.com to the IP of the system where Apache is running.
8. Run ng e2e. webdriver-manager update will download the local driver and the tests will continue.

I had a similar issue, found this answer while googling, and tried it. It seems to work.
With recent changes in Protractor you can use:
ng e2e --webdriver-update=false

I had the same problem. My solution is not the best, but it works.
Locally:
run webdriver-manager update (in my case I had to run it with --ignore_ssl)
go to \node_modules\protractor\node_modules\webdriver-manager\selenium\ and copy all the files (except update-config.json) to some folder at the repo root
commit and push the changes (I know, we are pushing web drivers to the repo, which is not the best solution)
On the offline machine (TFS in my case):
run npm install
copy the web drivers back to node_modules\protractor\node_modules\webdriver-manager\selenium\ (this copy-back step can be automated; see the sketch below)
I use Angular CLI, so I run ng e2e --no-webdriver-update
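A minimal sketch of automating that copy-back with an npm postinstall hook, assuming the drivers were committed to a webdrivers/ folder at the repo root (the folder name and script name are illustrative):

// copy-drivers.js: wire up via "postinstall": "node copy-drivers.js" in package.json
const fs = require('fs');
const path = require('path');

const src = path.join(__dirname, 'webdrivers');
const dest = path.join(__dirname, 'node_modules', 'protractor',
  'node_modules', 'webdriver-manager', 'selenium');

// copy every committed driver file back into webdriver-manager's folder
for (const file of fs.readdirSync(src)) {
  fs.copyFileSync(path.join(src, file), path.join(dest, file));
}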

The best way is to put it in your angular.json:
"e2e": {
"builder": "#angular-devkit/build-angular:protractor",
"options": {
"webdriverUpdate": false,
"protractorConfig": "e2e/protractor.conf.ts",
"devServerTarget": "myproject:serve"
},

Related

How to force OpenAPI Generator CLI to use pre-downloaded .jar file?

I have installed (via npm) openapi-generator-cli on my WSL machine running an Ubuntu 22.04 image, with correctly configured HTTP_PROXY and HTTPS_PROXY environment variables.
The problem is that running any command (including sudo openapi-generator-cli help) results in OAG-CLI attempting to download the .jar file from maven.org, which ends with the connection being refused for an unknown reason (SSL cert not listed as trusted? a WSL-exclusive bug? the corporate proxy hitting an edge case?).
Instead of dealing with all that, I realised I could just download the latest .jar file (as listed on the official website):
https://repo1.maven.org/maven2/org/openapitools/openapi-generator-cli/6.2.1/openapi-generator-cli-6.2.1.jar
via a browser and place it manually for OAG-CLI to use.
I have edited the auto-generated openapitools.json like so:
{
  "$schema": "node_modules/@openapitools/openapi-generator-cli/config.schema.json",
  "spaces": 2,
  "generator-cli": {
    "version": "6.2.1", // same version as the .jar
    "storageDir": "." // see below
  }
}
Unfortunately, despite placing two copies of the .jar file (one named openapi-generator-cli-6.2.1.jar, one named openapi-generator-cli.jar) in both the "current" folder and /usr/libs/openapi, and trying the following values for storageDir:
.
./
/usr/libs/openapi
/usr/libs/openapi/
~/usr/libs/openapi
~/usr/libs/openapi/
every single run of sudo openapi-generator-cli help resulted in an immediate "Downloading 6.2.1 ..." message (followed by a connection refused error some time later).
What else do I need to do to make OAG-CLI use the .jar within storageDir instead of trying to download a new copy?
(An answer containing just the structure and contents of a folder created by "storageDir": "~/foo" would allow me to reverse-engineer a working setup.)

ERROR Unable to find the browser. "saucelabs:Chrome@83.0:Windows10" is not a browser alias or path to an executable file

I am trying to run my UI tests using TestCafe and Sauce Labs, and I am facing the above error. Currently I am using testcafe v1.8.3 and testcafe-browser-provider-saucelabs v1.7.0.
I have also tried changing the version of the browser provider, but I am still facing the error. Please help with a solution, as I have been stuck on this for more than a week.
So, it looks like the runner you are using (testcafe-browser-provider) is a very old one; there is a new runner for TestCafe tests called saucectl.
TL;DR:
Install saucectl globally: npm install -g saucectl
Set up saucectl within your project folder with saucectl init. This will create a .sauce/config.yml file
Tweak the settings to run the spec files and the OS/browser of your choice (a sketch follows below)
Use saucectl run
You can see an example project here: https://github.com/saucelabs/saucectl-testcafe-example
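For orientation, a .sauce/config.yml for TestCafe looks roughly like the sketch below; treat the exact keys as assumptions and rely on saucectl init and the example project above for the authoritative schema:

apiVersion: v1alpha
kind: testcafe
sauce:
  region: us-west-1
testcafe:
  version: 1.8.3
suites:
  - name: chrome on windows
    browserName: chrome
    platformName: "Windows 10"
    src:
      - "tests/**/*.test.js"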
It looks like your provider is installed locally, while you are using the global TestCafe installation. You also need to install TestCafe locally or both packages globally. After this, check your browser provider: testcafe -b saucelabs.
I am using testcafe v1.8.3 and testcafe-browser-provider-saucelabs v1.7.0
Please update your testcafe and testcafe-browser-provider-saucelabs versions to the latest ones.

Adding LESS file to HTML [duplicate]

I'm trying to load a 3D model, stored locally on my computer, into Three.js with JSONLoader, and that 3D model is in the same directory as the entire website.
I'm getting the "Cross origin requests are only supported for HTTP." error, but I don't know what's causing it nor how to fix it.
My crystal ball says that you are loading the model using either file:// or C:/, which stays true to the error message, as they are not http://
So you can either install a web server on your local PC, or upload the model somewhere else, use JSONP, and change the URL to http://example.com/path/to/model
Origin is defined in RFC 6454 as
...they have the same scheme, host, and port. (See Section 4 for full details.)
So even though your file originates from the same host (localhost), as long as the scheme is different (http / file), they are treated as different origins.
Just to be explicit - Yes, the error is saying you cannot point your browser directly at file://some/path/some.html
Here are some options to quickly spin up a local web server to let your browser render local files
Python 2
If you have Python installed...
Change directory into the folder where your file some.html or file(s) exist using the command cd /path/to/your/folder
Start up a Python web server using the command python -m SimpleHTTPServer
This will start a web server to host your entire directory listing at http://localhost:8000
You can use a custom port python -m SimpleHTTPServer 9000 giving you link: http://localhost:9000
This approach is built in to any Python installation.
Python 3
Do the same steps, but use the following command instead python3 -m http.server
VSCode
If you are using Visual Studio Code you can install the Live Server extension, which provides a local web server environment.
Node.js
Alternatively, if you demand a more responsive setup and already use nodejs...
Install http-server by typing npm install -g http-server
Change into your working directory, where yoursome.html lives
Start your http server by issuing http-server -c-1
This spins up a Node.js httpd which serves the files in your directory as static files accessible from http://localhost:8080
Ruby
If your preferred language is Ruby ... the Ruby Gods say this works as well:
ruby -run -e httpd . -p 8080
PHP
Of course PHP also has its solution.
php -S localhost:8000
In Chrome you can use this flag:
--allow-file-access-from-files
Ran into this today.
I wrote some code that looked like this:
app.controller('ctrlr', function($scope, $http){
  $http.get('localhost:3000').success(function(data) {
    $scope.stuff = data;
  });
});
...but it should've looked like this:
app.controller('ctrlr', function($scope, $http){
  $http.get('http://localhost:3000').success(function(data) {
    $scope.stuff = data;
  });
});
The only difference was the missing http:// in the first snippet of code.
Just wanted to put that out there in case there are others with a similar issue.
Just change the URL to http://localhost instead of localhost. If you open the HTML file locally, you should create a local server to serve it; the simplest way is to use Web Server for Chrome. That will fix the issue.
I'm going to list 3 different approaches to solve this issue:
Using a very lightweight npm package: install live-server using npm install -g live-server. Then go to the directory, open a terminal, type live-server, and hit enter; the page will be served at localhost:8080. BONUS: it also supports hot reloading by default.
Using a lightweight Google Chrome app developed by Google: install the app, then go to the apps tab in Chrome and open it. In the app, point it to the right folder. Your page will be served!
Modifying the Chrome shortcut on Windows: create a Chrome browser shortcut, right-click the icon, and open Properties. There, edit Target to "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --disable-web-security --user-data-dir="C:/ChromeDevSession" and save. Then open the page in Chrome using Ctrl+O. NOTE: Do NOT use this shortcut for regular browsing.
Note: use http://, like http://localhost:8080, in case you face errors.
Use http:// or https:// when creating the URL.
Error: localhost:8080
Solution: http://localhost:8080
In an Android app — for example, to allow JavaScript to have access to assets via file:///android_asset/ — use setAllowFileAccessFromFileURLs(true) on the WebSettings that you get from calling getSettings() on the WebView.
The fastest way for me was:
for Windows users, run your file in Firefox; problem solved. Or,
if you want to use Chrome, the easiest way for me was to install Python 3, then from the command prompt run python -m http.server, go to http://localhost:8000/, and navigate to your files:
python -m http.server
Easy solution for those using VS Code
I'd been getting this error for a while. Most of the answers work, but I found a different solution. If you don't want to deal with Node.js or any other solution here, and you are working with an HTML file (calling functions from another JS file, or fetching JSON APIs), try using the Live Server extension.
It allows you to open a live server easily, and because it creates a localhost server, the problem is resolved. You can simply start the localhost server by opening an HTML file, right-clicking in the editor, and clicking Open with Live Server.
It basically loads the files using http://localhost/index.html instead of file://....
EDIT
It is not necessary to have a .html file. You can start the Live Server with shortcuts.
Hit (alt+L, alt+O) to open the server and (alt+L, alt+C) to stop it. [On Mac: cmd+L, cmd+O and cmd+L, cmd+C]
Hope it helps someone :)
If you use an old version of Mozilla Firefox (pre-2019), it will work as expected without any issues.
P.S. Surprisingly, old versions of Internet Explorer and Edge work absolutely fine too.
For those on Windows without Python or Node.js, there is still a lightweight solution: Mongoose.
All you do is drag the executable to wherever the root of the server should be and run it. An icon will appear in the taskbar, and it will navigate to the server in the default browser.
Also, Z-WAMP is a 100% portable WAMP that runs in a single folder; it's awesome. That's an option if you need a quick PHP and MySQL server. It hasn't been updated since 2013, though; modern alternatives would be Laragon or WinNMP. I haven't tested them, but they are portable and worth mentioning.
Also, if you only want the absolute basics (HTML+JS), here's a tiny PowerShell script that doesn't need anything to be installed or downloaded:
$Srv = New-Object Net.HttpListener;
$Srv.Prefixes.Add("http://localhost:8080/");
$Srv.Start();
Start-Process "http://localhost:8080/index.html";
While ($Srv.IsListening) {
    $Ctx = $Srv.GetContext();  # block until a request arrives
    # map the request URL to a file under the current directory
    $Buf = [System.IO.File]::OpenRead((Join-Path $Pwd ($Ctx.Request.RawUrl)));
    $Ctx.Response.ContentLength64 = $Buf.Length;
    $Ctx.Response.Headers.Add("Content-Type", "text/html");
    $Buf.CopyTo($Ctx.Response.OutputStream);
    $Buf.Close();
    $Ctx.Response.Close();
};
This method is very barebones; it cannot show directories or other fancy stuff, but it handles these CORS errors just fine.
Save the script as server.ps1 and run it in the root of your project. It will launch index.html from the directory it is placed in.
I suspect it's already mentioned in some of the answers, but I'll slightly modify this into a complete working answer (easier to find and use).
Go to https://nodejs.org/en/download/ and install Node.js.
Install http-server by running npm install -g http-server from the command prompt.
Change into your working directory, where index.html/yoursome.html resides.
Start your HTTP server by running http-server -c-1
Open a web browser to http://localhost:8080 or http://localhost:8080/yoursome.html, depending on your HTML filename.
I was getting this exact error when loading an HTML file in the browser that used a JSON file from the local directory. In my case, I was able to solve it by creating a simple Node server that serves static content; I left the code for it in another answer, and a sketch follows below.
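The linked code isn't reproduced here, but a minimal static server of that sort looks roughly like this (the port and MIME table are illustrative):

// serve the current directory over http://localhost:8080 so file:// restrictions don't apply
const http = require('http');
const fs = require('fs');
const path = require('path');

const mime = { '.html': 'text/html', '.js': 'text/javascript',
               '.json': 'application/json', '.css': 'text/css' };

http.createServer((req, res) => {
  const file = path.join(process.cwd(), req.url === '/' ? 'index.html' : req.url);
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end('Not found'); return; }
    res.writeHead(200, { 'Content-Type': mime[path.extname(file)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(8080);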
It simply means that the application should be run on a web server. I had the same problem with Chrome; I started Tomcat, moved my application there, and it worked.
I suggest using a mini-server to run these kinds of applications on localhost (if you are not using some built-in server).
Here's one that is very simple to set up and run:
https://www.npmjs.com/package/tiny-server
I experienced this when I downloaded a page for offline viewing.
I just had to remove the integrity="*****" and crossorigin="anonymous" attributes from all <link> and <script> tags.
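For example (the filename is illustrative, and the hash placeholder mirrors the one above), a tag like

<script src="app.js" integrity="sha384-*****" crossorigin="anonymous"></script>

becomes

<script src="app.js"></script>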
If you insist on running the .html file locally and not serving it from a web server, you can prevent those cross-origin requests from happening in the first place by making the problematic resources available inline.
I had this problem when trying to serve .js files via file://. My solution was to update my build script to replace <script src="..."> tags with <script>...</script>.
Here's a gulp approach for doing that:
1. Run npm install --save-dev gulp gulp-inline del.
2. After creating a gulpfile.js in the root directory, add the following code (just change the file paths to whatever suits you):
let gulp = require('gulp');
let inline = require('gulp-inline');
let del = require('del');

gulp.task('inline', function (done) {
  gulp.src('dist/index.html')
    .pipe(inline({
      base: 'dist/',
      disabledTypes: 'css, svg, img'
    }))
    .pipe(gulp.dest('dist/').on('finish', function () {
      done();
    }));
});

gulp.task('clean', function (done) {
  del(['dist/*.js']);
  done();
});

gulp.task('bundle-for-local', gulp.series('inline', 'clean'));
Either run gulp bundle-for-local or update your build script to run it automatically.
For all y'all on macOS... set up a simple LaunchAgent to enable these glamorous capabilities in your own copy of Chrome...
Save a plist, named whatever (launch.chrome.dev.mode.plist, for example), in ~/Library/LaunchAgents, with content similar to...
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>launch.chrome.dev.mode</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Applications/Google Chrome.app/Contents/MacOS/Google Chrome</string>
        <string>-allow-file-access-from-files</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
It should launch at startup, but you can force it to do so at any time with the terminal command
launchctl load -w ~/Library/LaunchAgents/launch.chrome.dev.mode.plist
TADA! 😎 💁🏻 🙊 🙏🏾
It is not possible to load static local files (e.g. SVG) without a server. If you have npm/yarn installed on your machine, you can set up a simple HTTP server using http-server:
npm install http-server -g
http-server [path] [options]
Or open a terminal in the project folder and type hs. It will automatically start an HTTP live server.
Er, I just found some official words: "Attempting to load unbuilt, remote AMD modules that use the dojo/text plugin will fail due to cross-origin security restrictions. (Built versions of AMD modules are unaffected because the calls to dojo/text are eliminated by the build system.)" https://dojotoolkit.org/documentation/tutorials/1.10/cdn/
One way I got local files to load was to keep them inside the project folder instead of outside it. Create a folder under your project (for example, files), similar to the way we create one for images, and replace the section using the full local path with a relative URL to the file under the project folder.
It worked for me.
Install a local web server (for Java, e.g. Tomcat; for PHP you can use LAMP, etc.)
Drop the JSON file into the publicly accessible app server directory
Start the app server, and you should be able to access the file from localhost
For Linux Python users:
import webbrowser

# launch Chrome with the flag that allows XHR to file:// URLs
browser = webbrowser.get('google-chrome --allow-file-access-from-files %s')
browser.open(url)
The URL should be like:
createUserURL = "http://www.localhost:3000/api/angular/users"
instead of:
createUserURL = "localhost:3000/api/angular/users"
There are many causes for this problem; mine was a missing '/'. Example:
jquery-1.10.2.js:8720 XMLHttpRequest cannot load http://localhost:xxxProduct/getList_tagLabels/
It must be: http://localhost:xxx/Product/getList_tagLabels/
I hope this helps whoever meets this problem.
I have also been able to recreate this error message when using an anchor tag with an href (the markup was stripped in formatting; its visible text was "Example a tag").
In my case the a tag was only being used to get the pointer cursor, and the event was actually handled by a jQuery on-click handler. I removed the href and added a class that applies:
cursor: pointer;
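Roughly, the after-state looks like this (the class name is illustrative):

<a class="clickable">Example a tag</a>

.clickable { cursor: pointer; }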
Cordova achieves this. I still cannot figure out how Cordova did it; it does not even go through shouldInterceptRequest.
Later I found out that the key to loading any file from local storage is: myWebView.getSettings().setAllowUniversalAccessFromFileURLs(true);
And when you want to access any http resource, the WebView will do a check with the OPTIONS method, which you can grant access to through WebViewClient.shouldInterceptRequest by returning a response; for the following GET/POST methods, you can just return null.
If you are looking for a solution for Firebase Hosting, you can run the firebase serve --only hosting command from the Firebase CLI.
That's what I came here for, so I thought I'd leave it here to help others like me.
If you're using VS Code, just try loading a live server in there. It fixed my problem immediately.

Can't update chromedriver and selenium-release

I'm trying to work with Protractor, so I followed a small tutorial. The first thing I did:
npm install -g protractor
This installs two command line tools, protractor and webdriver-manager.
But now I have to update my webdriver-manager:
webdriver-manager update
So my cmd tries to connect to https://chromedriver.storage.googleapis.com/2.14/chromedriver_win32.zip and https://selenium-release.storage.googleapis.com/2.45/selenium-server-standalone-2.45.0.jar.
But it gives this error:
C:\Program Files (x86)\Jenkins\workspace\testnew>webdriver-manager update
Updating selenium standalone
downloading https://selenium-release.storage.googleapis.com/2.45/selenium-server-standalone-2.45.0.jar...
Updating chromedriver
downloading https://chromedriver.storage.googleapis.com/2.14/chromedriver_win32.zip...
Error: Got error Error: getaddrinfo EAI_AGAIN from https://selenium-release.storage.googleapis.com/2.45/selenium-server-standalone-2.45.0.jar
Error: Got error Error: getaddrinfo EAI_AGAIN from https://chromedriver.storage.googleapis.com/2.14/chromedriver_win32.zip
Sometimes it is the EAI_AGAIN error and sometimes ENOTFOUND.
What I don't understand is that I can download the zip and the jar manually in my browser; when I surf to the URL it all works fine, just not in cmd. Can someone help me?
PS: pinging the URLs isn't possible either.
Update: after configuring proxy settings I get this error:
Error: Got error Error: tunneling socket could not be established, cause=socket hang up from https://chromedriver.storage.googleapis.com/2.14/chromedriver_win32.zip
The same happened to me. The problem was due to a proxy we use inside our company.
webdriver-manager has a --proxy parameter where you can specify the proxy the webdriver command should use.
The proxy configuration you might have in npm (the .npmrc file in your user directory) won't work for webdriver-manager.
Here is the example which worked for me:
webdriver-manager --proxy http://yourproxy:8080 update
If setting your proxy does not work, as happened to me, you can download the files manually from the URLs shown in the console and put them into the selenium folder.
The path on Windows is:
users\username\AppData\Roaming\npm\node_modules\protractor\selenium
That worked for me; I hope it helps.
Read on if your webdriver-manager update doesn't update chromedriver to the latest version.
I lost a few weeks pulling my hair out over an issue I had with "Unable to discover open pages": every time I updated, it would fetch version 2.22 of chromedriver and, I believe, v2.53 of the Selenium server.
My problem wasn't really with the Selenium server, so v2.53 was fine.
The issue was with chromedriver v2.22.
Even though the chromedriver downloads page showed a latest version of 2.24, webdriver-manager update would NOT pick up that latest version; it would only grab version 2.22 of the Chrome driver.
How did I get around this?
Simply run the command below after checking that page for the version of chromedriver you want to update to; for instance, I wanted v2.24, so I ran:
webdriver-manager update --versions.chrome 2.24
If you check your location C:\Users\<USER>\AppData\Roaming\npm\node_modules\webdriver-manager\selenium\, you should see that the desired chromedriver was downloaded there; if it's not, read the command prompt logs and they will tell you where your chromedriver files were downloaded.
Hope that helps someone!
Your web browser is probably using a proxy, or some other indirect access to the wider internet, that the webdriver-manager script isn't configured to use. (webdriver-manager supports a --proxy parameter if you know what to pass to it.)
If you can download the files manually, just put them in the selenium directory yourself. The script also unzips chromedriver_win32.zip in place to get the chromedriver binary contained in it.
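For reference, after a successful online update the selenium folder (wherever your install keeps it; see the paths above) looks roughly like this, with version numbers matching whatever was fetched; newer webdriver-manager versions also keep an update-config.json there:

selenium/
    chromedriver_2.14.zip
    chromedriver_2.14.exe        (chromedriver_2.14 on Linux/macOS)
    selenium-server-standalone-2.45.0.jar
    update-config.json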

Node-Webkit gclient sync error

I want to build Node-Webkit. I followed the instructions on this site, https://github.com/rogerwang/node-webkit/wiki/Building-node-webkit, but the part with the .gclient file didn't work.
After the command gclient sync --nohooks, I get this:
ERROR: client not configured; see 'gclient config'
The solution I found for this problem was the gclient config http://... command with a link like https://src.chromium.org/svn/trunk/src or similar. But with these links gclient doesn't download the Node-Webkit stuff.
So is there a working link for this problem, or an option to download the stuff without gclient?
I did it with a freshly installed Ubuntu 12.04 in a virtual machine, because the install-build-deps.sh I have to execute later does not support my Ubuntu 13.10.
Could the VM be the problem?
You need to run gclient config, which generates a .gclient file in your home folder.
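For reference, a generated .gclient is a small Python-syntax file of roughly this shape; the url here is illustrative, so use whatever the node-webkit wiki specifies:

solutions = [
  {
    "name": "src",
    "url": "https://github.com/rogerwang/node-webkit.git",
    "deps_file": "DEPS",
    "managed": True,
    "custom_deps": {},
  },
]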