I would like to download phantomjs binary within a Gradle build script.
Where are they hosted?
There is an official download repository for PhantomJS: https://bitbucket.org/ariya/phantomjs/downloads/
They are not published to any Maven or Ivy repository; the binaries are only hosted on Bitbucket: https://bitbucket.org/ariya/phantomjs/downloads/
To use them with Gradle, the only option I have found to work so far is the gradle-download-task plugin. To implement caching, you can copy and paste the code below.
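A minimal sketch of that in a build.gradle.kts; the plugin version, archive name, and destination path are assumptions you should adjust:

    import de.undercouch.gradle.tasks.download.Download

    plugins {
        // gradle-download-task; the version here is an assumption, use a current one
        id("de.undercouch.download") version "5.1.0"
    }

    // Archive name is an assumption; pick the one matching your platform.
    val phantomJsArchive = "phantomjs-2.1.1-linux-x86_64.tar.bz2"

    tasks.register<Download>("downloadPhantomJs") {
        src("https://bitbucket.org/ariya/phantomjs/downloads/$phantomJsArchive")
        dest(project.file("build/phantomjs/$phantomJsArchive"))
        // Sends If-Modified-Since, so an unchanged file is not fetched again;
        // this is the simple caching part.
        onlyIfModified(true)
    }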
Otherwise, another potential option is to declare a custom Ivy dependency, but so far I haven't found a way to make it work. The issue is that Bitbucket redirects to an Amazon S3 location, and the HEAD request that Gradle first issues to get the file size fails, probably because of a request-signature problem.
Testers are likely to run into this issue.
Assume we have a test case that should be automated. One of its steps downloads a file from a web page by clicking a link, which saves it to the local machine's Downloads folder. The next step has to verify that the file was downloaded.
On a local machine this is easy to handle using the download path. The problem I have is that this exact same test case fails in Jenkins (cloud run): it returns a null value because the download directory cannot be found on the Jenkins agent.
Does anyone know what kind of solution we could use for this? I have heard about downloading the file with an API request instead; the file is indeed served by a GET request with parameters, but I don't know how to perform that.
Thanks for your time.
I tried the options below:
Changing the download directory for Windows and Linux as per the documentation
Using the Jenkins home directory
What I want to do:
Verify that the file is downloaded
Read the file and check it against the DB (existing methods cover this)
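For context, downloading via a plain GET in the test code itself (rather than through the browser) would avoid depending on a Downloads folder entirely. A minimal JVM/Kotlin sketch, with a hypothetical URL and parameters:

    import java.net.URL
    import java.nio.file.Files
    import java.nio.file.Path

    // Hypothetical URL with parameters; replace with the real request the link triggers.
    fun downloadViaGet(fileUrl: String, target: Path): Path {
        URL(fileUrl).openStream().use { input ->
            Files.copy(input, target) // stream the response body straight to disk
        }
        return target
    }

    fun main() {
        // A temp directory created by the test itself works on any Jenkins agent,
        // so no fixed Downloads folder is needed.
        val dir = Files.createTempDirectory("downloads")
        val file = downloadViaGet("https://example.com/export?format=csv&id=42", dir.resolve("report.csv"))
        check(Files.size(file) > 0) { "Downloaded file is empty" }
        // ...then reuse the existing methods to read the file and compare with the DB.
    }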
I am attempting to load a file from a remote URL during the build so it can be bundled by webpack. This file is an MDX file, and I am using the MDX vue-loader to load it for use within the Vue application.
The system I am deploying is tenanted with a headless CMS powering some pages across the system. I would like to explore the possibilities of loading the MDX files at build time from a remote URL.
I have placed the MDX files on GitHub Pages with the remote URL passed in as an environment variable at build time.
The result is something like this (the idea here is that I can swap the domain during build to satisfy the tenanted site requirement):
import('https://somedomain.com/content/home.mdx');
This fails with your typical error during build of:
dependencies not found please install them using npm --save https://somedomain.com/content/home.mdx
I can have webpack ignore this import, which allows the build to pass, but then it fails to load in the browser, as browsers will only load external modules with a MIME type of JS. Not to mention the fact that the file hasn't been through the MDX loader, so I suspect that even if I could get the browser to load it, the file would not have been parsed into something usable.
I realise I could copy these files in from the remote during the build stage, but I was hopeful that there might be a way either to allow the browser to pull this remote file or to have webpack download it and pack it into the output.
Does anyone have any ideas if this might be possible? Many thanks in advance.
As MDX needs pre-processing during the build, I think integration with webpack is the only way.
You can try the SaveRemoteFilePlugin webpack plugin, which allows you to download a file from a remote location to the local file system. But it may not be what you want, as it seems to push downloaded files directly into the dist folder without passing them through the rest of the webpack pipeline...
So probably the better option is val-loader, which allows executing your own Node scripts during the build. Its documentation includes an example that does almost what you need: fetching remote data during build.
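A rough sketch of the val-loader approach, assuming node-fetch is available and a CONTENT_BASE_URL environment variable carries the tenant domain (the file name is illustrative, not the loader's documented example):

    // home.mdx.val.js — executed by val-loader at build time.
    const fetch = require('node-fetch');

    module.exports = async () => {
      // CONTENT_BASE_URL is the tenant domain passed in at build time.
      const res = await fetch(`${process.env.CONTENT_BASE_URL}/content/home.mdx`);
      if (!res.ok) throw new Error(`Failed to fetch MDX: ${res.status}`);
      // Return the raw MDX as this module's source, so the rest of the
      // loader chain (your existing MDX loader) can compile it as usual.
      return { code: await res.text(), cacheable: false };
    };

In the webpack rule for that file, place val-loader last in the use array so it runs first and feeds the fetched MDX to the MDX loader chain you already use for local .mdx files.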
With runs-on: ubuntu-latest, I quickly found a marketplace Docker action to upload a generated directory to S3 and set up a perfectly working pipeline.
But I need to do the same with runs-on: windows-latest, where Docker actions no longer work.
I tried the other available alternative, building a JavaScript action, but after testing several different npm packages I did not succeed.
I would also like the action to be as self-contained as possible in the .yml file (while of course accessing the needed GitHub secrets for the AWS key, ID, bucket...).
Has anybody already encountered the same problem and can point me to a possible solution?
Thank you in advance.
Check out the stcalica/s3-upload action. It is written in JavaScript, and it seems to do the work that you need.
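I haven't verified that action's exact inputs, so check its README. Alternatively, to stay fully self-contained in the .yml, the AWS CLI preinstalled on the windows-latest runners can do the upload directly; the region, bucket secret, and ./dist path below are placeholders:

    - name: Upload site to S3
      shell: pwsh
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_DEFAULT_REGION: us-east-1
      run: aws s3 sync ./dist "s3://${{ secrets.AWS_S3_BUCKET }}" --delete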
I have downloaded the open-source code at open-ride.com in order to set up a rideshare server and associated services.
I am trying to find the ".ear" file that they talk about in the install manual.
Does it have to be generated somehow from the downloaded source code?
It has to be generated, either manually or through NetBeans. I haven't found out how to do it manually yet, but if you follow the NetBeans instructions you should be able to build and run it. You might run into an error with the YUI Compressor, where you just need to remove the ../ from the path to it in the configuration file. If you still have problems, I may be able to help.
I'm trying to work out how to use DOH to test Dojo modules if I don't install Dojo locally to my project. I'm working in Eclipse and ideally, I'd like something that I can run as part of a Maven build eventually. The Dojo package is 5-20 MB and I don't want to have it stashed in my source control system with each project if possible.
I've tried a few options with the runner.html test runner, but DOH is going to need to find a Dojo somewhere, and then it seems that modules will be found relative to that installation.
Having Dojo installed on my system but not in the project gives me a problem in trying to find the project relative to the location of the dojo.js file. The cross-domain protection prevents me serving up any kind of absolute path as it strips : characters. It also stops me using a Dojo installation served up on a different domain over http.
Is it necessary to have Dojo installed somewhere that I can then define a relative path from dojo.js to the roots of my modules? If not, how do I configure to get around it?
I've not tried this completely cross domain, but yes, you can define paths which may be enough to get you going.
We run our tests using a somewhat complicated deployment (to ensure we don't introduce accidental dojo/doh path dependencies), and our URL looks like this:
http://server/XXX/dev/dohpath/util/doh/runner.html?boot=../../../dojo/dojo.js&dojoUrl=../../../dojo/dojo.js&paths=doh,../dohpath/util/doh;mymodule,../../mymodule&testModule=full.test.module
That is, you fire up the runner, give it both 'boot' and 'dojoUrl' to tell it where Dojo itself lives, and use 'paths' to tell DOH where it lives and how to find your own modules.
blech
Whether those relative paths can be made absolute successfully, and whether it'll work cross-domain, is an entirely different matter, I'm afraid. We'll be hitting that problem ourselves in a couple of months.
I've been able to do this with the runner located at http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html and a gist.
The trick seems to be to use a path alias in the URL, and to use a network-path reference URL (i.e. omit the URL scheme, so the URL starts with //).
I found this out while trying to answer this question without a local copy of DOH.
Here it is:
http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html?testModule=aa&paths=aa,//gist.github.com/gitgrimbo/5406688/raw/e6bc4469ce72dfd6d50e61e885889cb915a3f66b/gistfile1