Set up an Ansible development environment with Pipenv for testing

I'm trying to set up a local development environment for an Ansible collection, the Checkmk collection. The goal is to run the jobs from the CI pipeline locally. My idea is to use Pipenv to set up a virtual environment for the ansible package. I have the following Pipfile:
[dev-packages]
ansible = "*"
ansible-lint = "*"
yamllint = "*"
Unfortunately when running the integration tests with
pipenv run ansible-test integration --docker --venv
I always get an error that the test targets cannot be found, even though they exist.
Is this not a supported way to set up a somewhat more isolated development environment?
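For reference, ansible-test integration discovers targets by directory name under tests/integration/targets/ inside the collection root, which itself has to live below an ansible_collections/<namespace>/<name>/ path. A minimal, hypothetical target, with the hello_world name made up here, looks like this:

# tests/integration/targets/hello_world/tasks/main.yml (hypothetical target)
- name: Verify that the target is discovered and executed
  ansible.builtin.assert:
    that:
      - true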

Related

Pipeline passed without any changes (GitLab CI/CD)

I'm a newcomer to GitLab CI/CD. This is the .gitlab-ci.yml in my project, and I want to test how it works. I have registered a GitLab runner, which is fine, but my problem is that when I add a file to my project, the pipelines run and pass successfully, yet nothing changes in my project on the server.
What is the problem? It is not a dockerized project.
image: php:7.2

stages:
  - preparation
  - building

cache:
  key: "$CI_JOB_NAME-$CI_COMMIT_REF_SLUG"

composer:
  stage: preparation
  script:
    - mkdir test5
  when: manual

pull:
  stage: building
  script:
    - mkdir test6
CI/CD pipelines such as GitLab CI/CD are nothing more than isolated environments, usually Docker containers, that operate on your code just as you do on your own host system.
Your mkdir operations definitely have an effect, but the changes remain inside that environment because they are never pushed back to your remote repository. For this to work, you have to configure the repository from inside the CI/CD runner and commit and push to it, just as you would from your own host system. To execute custom commands, GitLab CI/CD has the script parameter. I am sure reading this will get you up and running.
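For illustration, a minimal sketch of what such a job could look like, assuming a project variable named CI_PUSH_TOKEN that holds a token with write access to the repository (the variable name, placeholder file and commit message are made up here):

pull:
  stage: building
  script:
    - mkdir test6
    # git cannot track an empty directory, so add a placeholder file
    - touch test6/.gitkeep
    - git config user.email "ci@example.com"
    - git config user.name "CI pipeline"
    - git add test6
    - git commit -m "Add test6 from CI"
    # push back to the branch that triggered the pipeline; -o ci.skip avoids triggering another pipeline
    - git push -o ci.skip "https://oauth2:${CI_PUSH_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:${CI_COMMIT_REF_NAME}"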

Docker: adding chromedriver to PATH and running dotnet test

We're working on a small Docker container image for Windows to run a SpecFlow test in a .NET Core project. The problem we have is that we can't get chromedriver to work together with running the dotnet test command.
The SpecFlow project we're running contains just one hello-world test case, which we can run without chromedriver, but then we get the error message "OpenQA.Selenium.DriverServiceNotFoundException : The chromedriver.exe file does not exist in the current directory or in a directory on the PATH environment variable."
We're providing an instance of chromedriver in the project so we don't have to download it.
The dockerfile we're running:
FROM mcr.microsoft.com/dotnet/core/sdk:2.2 AS build
WORKDIR /src
COPY . .
ENV PATH="/src/chromedriver:$PATH"
RUN dotnet test
We're using this command to run it: docker build . --build-arg HTTP_PROXY=http://PROXY:8080 --build-arg HTTPS_PROXY=http://PROXY:8080 --rm
We expect the SpecFlow tests to run with chromedriver. When we build the Dockerfile, we get the error message "'dotnet' is not recognized as an internal or external command, operable program or batch file." It seems that chromedriver is not added correctly to the PATH variable. We need it there to be able to run the SpecFlow tests.
Does anybody know how to configure the dockerfile to work correctly with chromedriver?
Thanks for your time.

GitLab continuous integration testing with Selenium

I am working on a project to build, test and deploy an application to the cloud using a .gitlab-ci.yml
1) Build the backend and frontend using pip install and npm install
build_backend:
  image: python
  stage: build
  script:
    - pip install requirements.txt
  artifacts:
    paths:
      - backend/

build_frontend:
  image: node
  stage: build
  script:
    - npm install
    - npm run build
  artifacts:
    paths:
      - frontend
2) Run unit and functional tests using PyUnit and Python Selenium
test_unit:
  image: python
  stage: test
  script:
    - python -m unittest discover

test_functional:
  image: python
  stage: test
  services:
    - selenium/standalone-chrome
  script:
    - python tests/example.py http://selenium__standalone-chrome:4444/wd/hub https://$CI_BUILD_REF_SLUG-dot-$GAE_PROJECT.appspot.com
3) Deploy to Google Cloud using the sdk
deploy:
  image: google/cloud-sdk
  stage: deploy
  environment:
    name: $CI_BUILD_REF_NAME
    url: https://$CI_BUILD_REF_SLUG-dot-$GAE_PROJECT.appspot.com
  script:
    - echo $GAE_KEY > /tmp/gae_key.json
    - gcloud config set project $GAE_PROJECT
    - gcloud auth activate-service-account --key-file /tmp/gae_key.json
    - gcloud --quiet app deploy --version $CI_BUILD_REF_SLUG --no-promote
  after_script:
    - rm /tmp/gae_key.json
This all runs perfectly, except that the Selenium tests are run against the deployed URL, not the current build:
python tests/example.py http://selenium__standalone-chrome:4444/wd/hub https://$CI_BUILD_REF_SLUG-dot-$GAE_PROJECT.appspot.com
I need GitLab to run three things simultaneously:
a) Selenium
b) Python server with the application
c) Test script
Possible approaches to run the Python server:
Run within the same terminal commands as the test script somehow
Docker in Docker
Service
Any advice, or answers would be greatly appreciated!
I wrote a blog post on how I set up web tests for a PHP application. OK, it's PHP, but I guess something similar can be done for a Python project.
What I did was start a PHP development server from within the container that runs the web tests. Because of the artifacts, the development server can access the PHP files. I figure out the IP address of the container, and using this IP address the selenium/standalone-chrome container can connect back to the development server.
I created a simple demo project; you can check out its .gitlab-ci.yml file. Note that I pinned the selenium container to an old version because of an issue with an old version of the PHP webdriver package; today this isn't needed anymore.
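For a Python project, the same idea might look roughly like the job below; the runserver command and port are placeholders for however the application is started, and hostname -i is one way to obtain the job container's IP address that the Selenium container can connect back to:

test_functional:
  image: python
  stage: test
  services:
    - selenium/standalone-chrome
  script:
    # start the application under test in the background (placeholder command)
    - python manage.py runserver 0.0.0.0:8000 &
    # IP of this job container on the shared Docker network, reachable by the selenium service
    - APP_IP=$(hostname -i)
    - python tests/example.py http://selenium__standalone-chrome:4444/wd/hub http://$APP_IP:8000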

How to run a build on the local machine with drone.io

Does the build have to run on the drone.io server? Can I run the build locally? Since developers need to pass the build before pushing code to GitHub, I am looking for a way to run the build on a developer's local machine. Below is my .drone.yml file:
pipeline:
  build:
    image: node:latest
    commands:
      - npm install
      - npm test
      - npm run eslint
  integration:
    image: mongo-test
    commands:
      - mvn test
It includes two Docker containers. How do I run the build against this file in Drone? I looked at the drone CLI, but it doesn't work the way I expected.
@BradRydzewski's comment is the right answer.
To run builds locally you use drone exec. You can check the docs.
Extending his answer: you must execute the command in the root of your local repo, exactly where your .drone.yml file is. If your build relies on secrets, you need to feed these secrets through the command line using the --secret or --secrets-file option.
When running a local build, there is no cloning step. Drone will use your local git workspace and mount it in the step containers. So if you check out some other commit/branch/whatever during the execution of the local build, you will mess things up, because Drone will see those changes. So don't update your local repo while the build is running.
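As a sketch of how that fits together, assuming a secret named npm_token has been registered for the repository (the name is made up here), the step that needs it declares it in the .drone.yml and the same value is then supplied to drone exec through the options mentioned above:

pipeline:
  build:
    image: node:latest
    # exposed to this step as the environment variable NPM_TOKEN
    secrets: [ npm_token ]
    commands:
      - npm install
      - npm test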

How to run integration tests inside a Docker container in a Drone pipeline

I have a Docker image built for MongoDB testing; it can be found at zhaoyi0113/mongo-uat. When a container is started from this image, it creates a few MongoDB instances, which take a few minutes to start up. Now I want to run my integration test cases inside this container with Drone CI. Below is my .drone.yml file:
pipeline:
  build:
    image: node:latest
    commands:
      - npm install
      - npm test
      - npm run eslint
  integration:
    image: zhaoyi0113/mongo-uat
    commands:
      - npm install
      - npm run integration
There are two steps in this pipeline: the first runs the unit tests of a Node.js project; the second one, integration, runs the integration test cases in the MongoDB Docker image.
When I run drone exec I get an error saying it failed to connect to the mongo instance. I think that is because the MongoDB instance needs a few minutes to start up; the commands npm install and npm run integration should only run after the MongoDB instance has launched. How can I delay the build commands?
EDIT1
The image zhaoyi0113/mongo-uat contains the MongoDB environment; it creates a few MongoDB instances. I can run docker run -d zhaoyi0113/mongo-uat to launch the container, and after that I can attach to it to see the MongoDB instances. I am not sure how Drone launches the Docker container.
The recommended approach to integration testing is to place your service containers in the services section of the YAML [1][2].
Therefore, in order to start a Mongo service container, I would create the YAML file below. The Mongo service will start on the default port at 127.0.0.1 and be accessible from your pipeline containers.
pipeline:
  test:
    image: node
    commands:
      - npm install
      - npm run test
  integration:
    image: node
    commands:
      - npm run integration

services:
  mongo:
    image: mongo:3.0
This is the recommended approach for testing services like MySQL, Postgres, Mongo and more.
[1] http://readme.drone.io/usage/getting-started/#services
[2] http://readme.drone.io/usage/services-guide/
As a short addendum to Brad's answer: while the mongo service will run on 127.0.0.1 on the Drone host machine, it will not be possible to reach the service from this IP within the Node app. To access the service, you would reference its service name (here: mongo).
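A short sketch of how the integration step might pick that up, assuming the tests read a MONGO_URL environment variable (the variable name and database name are made up here):

pipeline:
  integration:
    image: node
    environment:
      # the hostname is the service name, not 127.0.0.1
      - MONGO_URL=mongodb://mongo:27017/test
    commands:
      - npm run integration

services:
  mongo:
    image: mongo:3.0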