Jenkins job on AWS server - API

I have a Windows instance set up on AWS, with JMeter and Ant installed on it to run API test cases, so I can successfully run the tests on the remote server. I need to set up a job on our corporate Jenkins that runs those test cases on the AWS server. I have the server's IP address and a username and password to log in to it.
How do I set up a job on the corporate Jenkins that will run my test cases on the remote AWS server (i.e., execute a Windows command)?
Thank you.

Did you connect this AWS machine as a slave to the master, or is that not possible for you?
I think the best approach is to connect this machine as a slave and create a dedicated job that runs on the Windows machine.
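As a minimal sketch (assuming the AWS node is attached with a label such as windows-aws, which is just an example name, and that Ant is on the node's PATH), a Pipeline job could be pinned to that slave like this:
node('windows-aws') {
    // Everything in this block runs on the Windows AWS slave
    bat 'ant -f C:\\jmeter\\extras\\build.xml'
}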

Setting up a new Ant task to run a JMeter test in Jenkins is fairly easy. Here is a simple Pipeline script which executes the Ant task using the build.xml file in the "extras" folder of the JMeter installation and publishes the resulting HTML report on the build artifacts page:
node {
    // Use JMeter's "extras" folder, which ships with the demo build.xml, as the workspace
    ws('c:\\jmeter\\extras') {
        stage 'Run JMeter Test with Apache Ant'
        // Locate the Ant installation named "ant" configured in Jenkins
        def antHome = tool 'ant'
        // Run the Ant build on the Windows node
        bat "pushd c:\\jmeter\\extras && ${antHome}\\bin\\ant -f build.xml"
        // Archive the generated HTML report so it appears on the build artifacts page
        step([$class: 'ArtifactArchiver', artifacts: 'Test.html', fingerprint: true])
    }
}
You will need to define the "ant" tool (under Jenkins' global tool configuration) so Jenkins knows what "ant" is and where it lives. See the Running a JMeter Test via Jenkins Pipeline - A Tutorial article for details.
Alternative options are:
If you have Jenkins admin access, or your user has the "Job - Configure" role, you have a "Configure" button where you can see (and change) all the build steps.
You can copy the whole Jenkins installation over to your corporate intranet (don't forget the JENKINS_HOME folder).

Thank you both for your responses. I have set up my Windows AWS machine as a slave, and I was able to run the job on the AWS server from the corporate Jenkins.

Related

Azure DevOps SSH Deployment Task Inline Script Hang Problem

When I try to run a script in Inline Script mode in the SSH Deployment Task on Azure DevOps version Dev17.M153.5, it hangs as shown below. We have checked the authorizations of the user running the command and there is no problem there. The operating system version on the target is Solaris 11.4. Is there a way to solve this problem?
##[debug]inline=cd /home/userstfs/workspaces/*****/src;pwd
##[debug]No script header detected. Adding: #!/bin/bash
##[debug]Agent.TempDirectory=D:****_temp
##[debug]failOnStdErr=true
Trying to establish an SSH connection to #...***:22
Successfully connected.
##[debug]remoteScriptPath = "./sshscript_1596017258623._unix"
##[debug]Copying script to remote machine
You have to update the agent in the deployment pools from:
Organization settings - Deployment pools - Update targets

Pipeline with services from a password-protected repository

I want to create a pipeline with services, let's say using the mysql service:
services:
- mysql:latest
My project uses a Docker image from our company repository, which is password protected.
When I run it manually, I must first log in to the repository:
docker login <creadentials> <repository address>
docker pull <some private image>
Is there some way to configure a GitLab pipeline to use a service with credentials?
services:
- <maybe some credentials here???>#<my private host>/modifiedForProductionMysql:latest
I know I can use a shell runner and call all the commands in my shell script. First I wanted to investigate whether it is doable with the GitLab Docker runner and a pipeline job with services.
See Using a private container registry. You can put your credentials into the DOCKER_AUTH_CONFIG variable. The format is the same as ~/.docker/config.json after you log in to your registry.
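For example (a minimal sketch; the registry host and credentials below are placeholders, not values from the question), you could set DOCKER_AUTH_CONFIG as a CI/CD variable and then reference the private image directly in services:
# Value of the DOCKER_AUTH_CONFIG CI/CD variable (Settings > CI/CD > Variables);
# "auth" is the base64 encoding of "username:password" for your registry
{
    "auths": {
        "registry.example.com:5000": {
            "auth": "<base64 of username:password>"
        }
    }
}

# .gitlab-ci.yml - the runner can now pull the private service image
services:
  - registry.example.com:5000/modifiedForProductionMysql:latest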

SQL Server Agent job step

I'm trying to set up a new job step in SQL Server Agent.
I need to do this with PowerShell as administrator. What I am trying is:
$env:Path += ';C:\Program Files\Amazon\AWSCLI'
aws s3 sync D:\MSSQLBackups\ s3://myfolder/DB_backups
If I run it from PowerShell it works well.
But if I create the job step with the PowerShell or CmdExec type, it does not work.
Where is my mistake?
Problem fixed. As I understood, the job step launches as the SQLSERVERAGENT user, so I just copied the .aws folder from C:\Users\Administrator to C:\Users\SQLSERVERAGENT (having run aws configure beforehand; the .aws folder should contain two files: config and credentials).
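For reference (a sketch with placeholder values; the region and keys depend on your setup), the two files that aws configure creates and that the SQLSERVERAGENT user needs look roughly like this:
# C:\Users\SQLSERVERAGENT\.aws\credentials
[default]
aws_access_key_id = <your access key id>
aws_secret_access_key = <your secret access key>

# C:\Users\SQLSERVERAGENT\.aws\config
[default]
region = us-east-1
output = json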

Use SSH to execute Robot Framework tests

I use SSH to execute Robot Framework test cases (Selenium), but no browser is opened; the test cases are executed in the background. How can I solve this issue?
I want to execute Robot Framework test cases on Windows 10 and start the test via Jenkins, which is installed on Linux, so I installed an SSH plugin in Jenkins, then created a job in Jenkins that executes the command below via SSH:
pybot.bat --argumentfile E:\project\robot_framework\Automation\logs\argfile1.txt E:\project\robot_framework\Automation
When I start the job, the test case is executed in the background, but I need the test case to open the browser in the foreground.
ssh by definition executes commands in a different session than the current user's.
Especially considering your target is a Windows machine - imagine if you were logged in and working with desktop apps, and someone started an app through ssh in your session - it would be a little surprising (mildly put :), wouldn't it?
To get what you want - being able to monitor the execution - you could try the runas /user:username command, but a satisfactory end result is not guaranteed (plus you have to provide the target user's password).
Another option is to use psexec, available on TechNet, but YMMV - you have to provide the target session id, some programs might not show up or might not be maximizable, etc. The call format would be:
psexec.exe -i 2 pybot.bat the_rest_of_the_args
I.e. a lot of effort and uncertainties, with minimal ROI :)

How to change the Cassandra Docker config

I installed Cassandra from the Cassandra image on Docker Hub and it's running successfully.
root#localhost$ docker ps | grep cassandra
2925664e3391 cassandra:2.1.14 "/docker-entrypoin..." 5 months ago Up 23 minutes 0.0.0.0:7000-7001->7000-7001/tcp, 0.0.0.0:7199->7199/tcp, 0.0.0.0:9042->9042/tcp, 0.0.0.0:9160->9160/tcp, 0.0.0.0:32779->7000/tcp, 0.0.0.0:32778->7001/tcp, 0.0.0.0:32777->7199/tcp, 0.0.0.0:32776->9042/tcp, 0.0.0.0:32775->9160/tcp
I have connected my application to this Cassandra instance. I need to use password authentication to connect to Cassandra from my application.
I have to enable password authentication for this; the /etc/cassandra/cassandra.yaml file is inside the Docker image, and I have to follow the Authentication Config documentation to enable it.
Is there a way to override these settings with the docker start or docker run command?
Authentication is not included in the piece of the entrypoint script that generates the cassandra.yaml file, so no. You can submit a PR modifying the relevant piece of the generation script to allow specifying auth via environment variables.
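As a workaround (a sketch; the container name and paths are examples), you can bind-mount a customized cassandra.yaml over the default one when starting the container, so the authenticator setting is picked up without modifying the image:
# Copy the default config out of the running container (id taken from docker ps above)
docker cp 2925664e3391:/etc/cassandra/cassandra.yaml ./cassandra.yaml

# Edit the copy to enable password authentication, i.e. set:
#   authenticator: PasswordAuthenticator

# Start a new container with the customized file mounted over the default one
docker run -d --name cassandra-auth \
  -v "$PWD/cassandra.yaml":/etc/cassandra/cassandra.yaml \
  -p 9042:9042 \
  cassandra:2.1.14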