Jenkins Execution Issue - Selenium

A Selenium script, when executed from Eclipse on a local machine, clicks an UPLOAD button in the browser to upload files from the local machine to the application running in the browser.
The upload fails when the script runs from Jenkins, because the Jenkins server cannot access the local machine.
Any idea how to overcome this issue?

Accessing these files from the server is a bad idea (unless the machine is a Jenkins slave), though it can be done from your code or a command-line tool by providing the machine's address and credentials. Put some effort into this and you'll find out how to do it.
The better solutions are to add these files to the Jenkins server as well, or to commit them to your repository and check them out before the test.
A shared folder on a file server or similar could also do the trick.
Hope it helps. In any case there's no substitute for your own research, e.g. Google and Stack Overflow.
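If the tests drive a browser on a remote machine (a Jenkins agent or a Selenium Grid node), Selenium's LocalFileDetector can stream the file from the machine running the test code to the machine hosting the browser. Here is a minimal Java sketch; the hub URL, the locator, and the file path are placeholders, not values from the question:

import java.net.URL;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.LocalFileDetector;
import org.openqa.selenium.remote.RemoteWebDriver;

public class RemoteUploadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder hub URL -- point this at your grid or agent.
        RemoteWebDriver driver = new RemoteWebDriver(
                new URL("http://your-grid-host:4444/wd/hub"), new ChromeOptions());

        // Stream local files to the node that actually hosts the browser.
        driver.setFileDetector(new LocalFileDetector());

        // sendKeys() on an <input type="file"> element performs the upload.
        WebElement upload = driver.findElement(By.cssSelector("input[type='file']"));
        upload.sendKeys("C:\\data\\file-to-upload.txt");

        driver.quit();
    }
}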

Related

Selenium automation - running from a common folder

I am trying to set up a shared folder on a VM and run the automation from that folder. Is there a way for all the people who have access to that folder to see the automation running, i.e. see the Chrome instance running the automation?
Usually a CI server like Jenkins is used to run automation on a VM and collect the results.
But as for your question: to watch the automated tests running in a browser on the VM, the user has to be logged in to that VM via Remote Desktop Connection.
A shared folder on the VM can't help you see tests running there, because having access to the folder doesn't mean a user has an account on that VM.
I would recommend creating one testing user account on that VM and sharing it with the team.
The run is performed on the agent, and there is no way to watch it in real time without being logged in. You can, however, record the run and save the video to your shared folder, so that everyone with access to the folder can view the saved recordings.
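Short of full video recording, periodic screenshots dropped into the shared folder already give the team something to review. A small Java sketch using Selenium's TakesScreenshot; the UNC path is an assumed placeholder:

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;

public class SharedFolderCapture {
    // Placeholder UNC path to the shared folder.
    private static final Path SHARED = Paths.get("\\\\fileserver\\automation\\runs");

    // Call this at interesting points in a test to leave a visual trail.
    public static void capture(WebDriver driver, String label) throws Exception {
        File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
        Files.createDirectories(SHARED);
        Files.copy(shot.toPath(),
                SHARED.resolve(label + "-" + System.currentTimeMillis() + ".png"),
                StandardCopyOption.REPLACE_EXISTING);
    }
}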

Serve Oracle Service Cloud Customer Portal locally?

I am working on customizing the Oracle Service Cloud Customer Portal, but OSvC provides only WebDAV for connecting to it, so it is very time-consuming to edit files and then upload them to WebDAV, even for a single-word change.
I am looking for a way to serve the portal locally, make the desired changes, and then upload the finished code to WebDAV.
But after searching through the file structure I cannot work out which framework it uses. I tried websites like https://builtwith.com/ and WhatRuns, but they were not able to find anything useful either.
Although I found some CodeIgniter files in the file structure, the layout is quite different from the standard CodeIgniter folder structure.
The short answer is no, you will not be able to run Customer Portal locally. While it is a fork of CodeIgniter from many years ago, there are server-side dependencies that will prevent you from running it in a local sandbox.
That said, it is possible to automate many of the manual WebDAV interactions involved in testing changes. If you edit locally, you can use scripting hooks or even RPA robots to automate some of the manual file movement. Personally, I edit remotely in my test environment with an editor (like VS Code or Nova) that can connect to a remote server via WebDAV and edit files directly in the development area of a site. Then, when finished, I run a script that pulls down the latest version of all files and lets me commit the changes to Git for SCM (sketched below).
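As an illustration of that pull-and-commit step, here is a rough Java sketch using the open-source Sardine WebDAV client (my choice for the example, not part of the original workflow). The host, folder, and credentials are placeholders, and recursion into subfolders is omitted for brevity:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import com.github.sardine.DavResource;
import com.github.sardine.Sardine;
import com.github.sardine.SardineFactory;

public class PullFromWebDav {
    public static void main(String[] args) throws Exception {
        // Placeholder host, folder, and credentials -- substitute your own.
        String host = "https://yoursite.custhelp.com";
        String folder = host + "/dav/cp/customer/development/views/pages/";
        Sardine sardine = SardineFactory.begin("user@example.com", "password");

        Path target = Files.createDirectories(Paths.get("cp-mirror/pages"));
        for (DavResource res : sardine.list(folder)) {
            if (res.isDirectory()) continue; // subfolder recursion omitted
            try (InputStream in = sardine.get(host + res.getHref().toString())) {
                Files.copy(in, target.resolve(res.getName()),
                        StandardCopyOption.REPLACE_EXISTING);
            }
        }
        // A plain "git add -A && git commit" on cp-mirror then records the snapshot.
    }
}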
Another option is RPA. You can develop a robot to automate the manual tasks in your workflow. Personally, I think scripting is a better solution than RPA, since you can automate all of the actions via a script or a shell, but it is another option to consider.
Another way of "live editing" the OSvC CP code is to connect to WebDAV via software that supports it, such as Mountain Duck, which uploads your code to OSvC on save.
Or use the simpler built-in option, Windows Explorer, which can connect to WebDAV and treat it like a network drive: go to My Computer -> Computer -> Map Network Drive, enter https://yoursite.custhelp.com/dav/cp, click Next, and you'll be prompted to log in with your OSvC credentials.

Store Selenium tests on a server but run them on local browsers, through a framework like FitNesse

I've been working on a WebDriver framework for a while now; I guess it is keyword-driven at this point. We would like a central place for users to store tests, preferably on a wiki, but when the tests are run they should open the browser on the user's local machine.
I originally started out using FitNesse, which works great for storing the tests. However, when we hosted it on a server and a user tried to run a test, the browser opened on the server, where the user couldn't see it. Does anyone know a way to force FitNesse to open the user's local browser, or to display the browser to the user? Or do you know another framework or approach for storing tests in a central place but running them locally?
I've been looking at passing the local user's IP through a fixture to start up the framework; I was hoping that FitNesse would already know the IP.
Thanks,
James
You can either find a framework that does what you want, or, as a bare minimum, create a thin wrapper that copies the test DLLs and executable to a machine and uses PsExec to execute the tests there. You could probably write the entire thing in maybe 20 lines of code, as in the sketch below.
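For illustration, a bare-bones Java version of that wrapper could look like this. The machine name, the share, and the runner executable are placeholders; it assumes PsExec and the test build are already available:

import java.io.IOException;

public class RemoteTestLauncher {
    public static void main(String[] args) throws IOException, InterruptedException {
        // 1. Mirror the test build to an admin share on the user's machine
        //    (placeholder machine name and paths).
        run("robocopy", "C:\\tests\\build", "\\\\USER-PC\\c$\\tests", "/MIR");

        // 2. Use PsExec to run the tests there; -i runs the process in the
        //    interactive session so the browser is visible to the user.
        run("psexec", "\\\\USER-PC", "-i", "C:\\tests\\runner.exe");
    }

    private static void run(String... cmd) throws IOException, InterruptedException {
        new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}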

How do I publish PHP source code to a local web server in Rational Team Concert?

I'll be using RTC in the near future here at work. My question is: where does it put the files the team members will be working on? I understand that each programmer will work on the project's files and push changes to the main repository. We have a local web server where we test our work (PHP). So, do we have to configure RTC to publish the files to the web server, or must the RTC server be installed on the web server so it can save the files there?
We use Rational Team Concert almost exactly as you describe, and it works brilliantly. My small team of web developers collaborates on website source code and delivers it to two different streams depending on its readiness: production-stream and staging-stream. Then we have defined two builds that check out the source code, move some things around, and push the files to the web servers via SCP. So, with a few clicks we kick off a staging build, watch it finish in about two minutes and everyone can see the changes on the staging server. When the code is ready for prime-time, the change sets are delivered to production-stream and the production build is kicked off, which is configured to copy the files to the production web server.
But even before a staging or production build is run, any of us can simply configure a local web server in RTC using the Eclipse PDE and Web Tools add-ons and see the site running on localhost as we develop.
All our work is done within Rational Team Concert, from planning, to bug tracking, to source control, to builds. It's very well-suited for website management.
Your understanding is correct - you work on files locally, and they get uploaded to the server when you check in. Bear in mind that check-in in RTC terms really means backing up your files to the server; it is the Deliver command that shares the files with others (it is worth a quick look at the articles on jazz.net that explain how SCM works).
One way to publish to your PHP server is to make that part of a build, or a build in its own right (which RTC also handles, in conjunction with your favourite build tool). The build would copy the files to the PHP server. The advantage of doing this as a build is that you will know exactly which versions of your files are being copied, and you will be able to reproduce that copy at any point in the future.
You do not need to install the RTC server on the PHP server.
You can also try posting on the forums on http://jazz.net/ if you have questions on RTC.
Hope that helps.
Another alternative would be to use the command-line interface to accept all changes into a workspace, and run that from a cron job (a sketch follows below).
To handle discarded change sets, you'd probably want to use something like:
scm workspace replace-components <workspace-name> stream <uuid-of-stream> --all
after you have initially loaded the workspace on your web server.
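A minimal Java wrapper for that cron-driven accept step might look like the following. It assumes the RTC scm command-line tool is on the PATH and that a workspace has already been loaded into /var/www/site (a placeholder path); the exact scm options vary between RTC releases:

import java.io.File;
import java.io.IOException;

public class AcceptChanges {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Run "scm accept" inside the loaded sandbox so incoming
        // change sets are applied to the working copy on the web server.
        ProcessBuilder pb = new ProcessBuilder("scm", "accept");
        pb.directory(new File("/var/www/site")); // placeholder sandbox location
        pb.inheritIO(); // surface scm output in the cron log
        int exit = pb.start().waitFor();
        if (exit != 0) {
            System.err.println("scm accept failed with exit code " + exit);
        }
    }
}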

How would I create a flexible EC2 Windows 2008 boot script?

If you look at the Linux ecosystem (especially the Ubuntu and Alestic EC2 images), there is a common technique where the VMs are pre-configured to look at the EC2 user-data and use it as a boot script. The nice thing about this approach is that you can write a boot script that further provisions your machine, allowing you to avoid making a new image every time the software that runs on the machine changes.
I want to do the same thing for Windows, but given that I'm a Mac and Linux guy, I'm a bit lost on where to start. My requirements are:
This must run on Windows Server 2008
A bootstrap script needs to start when the machine boots up and read the user-data file by pulling down the contents of http://169.254.169.254/1.0/user-data
The bootstrap script then needs to run the contents of that file as if it were a script
The script embedded in the user-data needs to run in such a way that it has access to the desktop environment (i.e. it can launch a browser, etc.)
I'm not quite sure how services work in Windows or if I need to enable auto-login, so any advice here would be appreciated. The ultimate goal is to run a Java program that launches some custom software that in turn launches a web browser (IE, Firefox, etc) and is capable of taking screenshots.
The screenshot part is interesting, because in the past when I've tried this the only way I could get something other than a black screen was to have UltraVNC or RealVNC boot up as a service, though I don't know why that helped.
I'm looking for answers to three specific questions, as well as any general advice:
Should I be focussing on a Windows service or auto-login + bat file in the "Startup" folder?
If I use a Windows service, is there anything special that I need to do to make sure desktop access and/or screenshots are available?
Do you recommend any tools for common Linux commands, like curl or wget? Last time I used Windows I used Cygwin a lot, but is there something more appropriate to use here?
I have not tried auto-login on Windows instances in EC2, but here's the support document on how to enable it.
We bootstrap our Windows instances using a custom AMI with a custom Windows 'install' service already installed. The bootstrap installer reads a URL from user-data at startup. The URL points to a ZIP file stored in S3. The installer then downloads, unzips, and executes the actual application installer -- in our case a simple CMD file.
This setup allows us to have one base AMI and easily overlay 15+ different application configurations (without having to rebuild the AMI). If you only have one application configuration, this may be overkill for your situation.
The only trouble we ran into was our installer service starting too early -- changing the service startup mode to "Automatic (Delayed Start)" fixed that issue.
We wrote our boot-strap installer in Java, launched via YAJSW, because we're comfortable with it. If you just want a few simple Unix tools, most are available pre-compiled for Windows, for example wget.
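To make that flow concrete, here is a rough, self-contained Java sketch of such a bootstrap installer: read a URL from user-data, download the ZIP it points to, unpack it, and run an installer script from the archive. The C:\bootstrap directory and the install.cmd name are assumptions for illustration, not details of the original setup:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class BootstrapInstaller {
    public static void main(String[] args) throws Exception {
        // 1. The instance metadata service serves user-data at a fixed address;
        //    this sketch assumes the first line of user-data is the ZIP's URL.
        String zipUrl;
        try (BufferedReader r = new BufferedReader(new InputStreamReader(
                new URL("http://169.254.169.254/1.0/user-data").openStream()))) {
            zipUrl = r.readLine().trim();
        }

        // 2. Download and unpack the ZIP into a working directory (placeholder path).
        Path workDir = Files.createDirectories(Paths.get("C:\\bootstrap"));
        try (ZipInputStream zip = new ZipInputStream(new URL(zipUrl).openStream())) {
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) {
                Path out = workDir.resolve(e.getName());
                if (e.isDirectory()) {
                    Files.createDirectories(out);
                } else {
                    Files.createDirectories(out.getParent());
                    Files.copy(zip, out, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }

        // 3. Run the application installer shipped inside the archive
        //    (the install.cmd name is an assumption).
        new ProcessBuilder("cmd", "/c", workDir.resolve("install.cmd").toString())
                .inheritIO().start().waitFor();
    }
}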
For something completely different, you could try PsExec to configure the instance after it has booted.
You can try using RightScale's free developer account to create plain PowerShell scripts and associate them with your Windows instances to run at boot time. The RightScale dashboard solves exactly the problems you are trying to solve above.
DISCLAIMER: I work for RightScale.
As for screen capture, CutyCapt is a simple tool you can point at a URL to generate an image.
UnxUtils is a great option for those looking for Unix tools on Windows. It includes the wget.exe you're looking for; however, downloading things with PowerShell is not so bad either:
$wc = new-object system.net.webclient
$wc.DownloadFile("http://stackoverflow.com","test.html")
If you can write a batch file to do your setup, you can run it at startup of the VM as follows:
1. Run REGEDT32.EXE.
2. Modify the following value under HKEY_CURRENT_USER:
Software\Microsoft\Windows NT\CurrentVersion\Winlogon\ParseAutoexec
(1 = autoexec.bat is parsed, 0 = autoexec.bat is not parsed)
As an answer to #3, I would say that you can do just about anything you need in a batch file, including downloading from an FTP server (but not from an HTTP server). I am really interested in this stuff, so if you have questions, try asking me.
If you use Elastic Beanstalk you can use this:
Customizing the Software on EC2 Instances Running Windows
It uses YAML formatting standards, e.g.
packages:
  msi:
    mysql: http://dev.mysql.com/get/Downloads/Connector-Net/mysql-connector-net-6.6.5.msi/from/http://cdn.mysql.com/
or
sources:
  "c:/myproject/myapp": http://s3.amazonaws.com/mybucket/myobject.zip
I know this is a little late to help with the original post, but for anyone still reading, one solution is to use the http://cloudinitnet.codeplex.com/ project. The service is easily installed using a PowerShell script and will create a local administrator account to use while running.
The goal of this project was to replace the cloud-init project used in Amazon Linux and Ubuntu.