RabbitMQ messages are not consumed - rabbitmq

I would like to use RabbitMQ to send messages from a webapp backend to a second module. On my laptop it works, but when I deploy the application on a VPS, even in dev mode, it no longer works... Could you please help me figure this out?
Current status:
If I check the queues on the VPS where both modules are installed, it looks OK (messages are added to the queue):
$ rabbitmqctl list_queues
Timeout: 60.0 seconds ...
Listing queues for vhost / ...
MyMessages 2
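A backlog alone does not say whether anything is listening. A quick sketch (the queue name is taken from the question; the sample line and the awk filter are illustrative) that flags queues with messages but zero consumers, using the `name messages consumers` columns `rabbitmqctl list_queues` can print:

```shell
# On the VPS, ask for consumer counts as well:
#   rabbitmqctl list_queues name messages consumers
# Then flag any queue that has a backlog but no consumer attached.
# Here we filter a sample output line matching the question's state:
sample_output='MyMessages 2 0'
echo "$sample_output" | awk '$2 > 0 && $3 == 0 { print $1 ": messages queued but no consumer attached" }'
# -> MyMessages: messages queued but no consumer attached
```

If the consumer column stays at 0 while the second module claims to be waiting, the module is connected to a different host or vhost, or never reached `basicConsume`.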
When I launch the second module, I get the following log:
Waiting for a request on queue : MyMessages, hosted at localhost
This comes from the following Java code:
public static void main(String[] args) throws IOException, TimeoutException {
    RabbitMQConsumer rabbitMQConsumer = new RabbitMQConsumer();
    rabbitMQConsumer.waitForRequests();
    System.out.println("Waiting for a request on queue : " + AppConfig.QUEUE_NAME + ", hosted at " + AppConfig.QUEUE_HOST);
}
public RabbitMQConsumer() throws IOException, TimeoutException {
    mapper = new ObjectMapper();
    ConnectionFactory connectionFactory = new ConnectionFactory();
    connectionFactory.setHost(AppConfig.QUEUE_HOST);
    Connection connection = connectionFactory.newConnection();
    channel = connection.createChannel();
}
public void waitForRequests() throws IOException {
    DefaultConsumer consumer = new DefaultConsumer(channel) {
        @Override
        public void handleDelivery(String consumerTag, Envelope envelope, AMQP.BasicProperties properties, byte[] body) throws IOException {
            try {
                System.out.println("Message received ! ");
                channel.basicAck(envelope.getDeliveryTag(), false);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    };
    channel.queueDeclare(AppConfig.QUEUE_NAME, true, false, false, null);
    channel.basicConsume(AppConfig.QUEUE_NAME, consumer);
}
I think both modules are looking at the same queue, and there are messages in the queue, so... to me it looks like the messages are not consumed. I've looked at the status of RabbitMQ, but I don't know how to interpret it:
$ invoke-rc.d rabbitmq-server status
● rabbitmq-server.service - RabbitMQ broker
Loaded: loaded (/lib/systemd/system/rabbitmq-server.service; enabled; vendor preset: enabled)
Active: active (running) since Sat 2018-04-07 18:24:59 CEST; 1h 38min ago
Process: 17103 ExecStop=/usr/lib/rabbitmq/bin/rabbitmqctl shutdown (code=exited, status=0/SUCCESS)
Main PID: 17232 (beam.smp)
Status: "Initialized"
Tasks: 84 (limit: 4915)
CGroup: /system.slice/rabbitmq-server.service
├─17232 /usr/lib/erlang/erts-9.3/bin/beam.smp -W w -A 64 -P 1048576 -t 5000000 -stbt db -zdbbl 1280000 -K true -- -root /usr/lib/erlang -progname erl -- -home /var/lib/rabbitmq -- -pa /usr/lib/rabbitmq/lib/rabbitmq_server-3.7.4/ebin -noshell -noinput -s rabbit boot -sname rabbit@vps5322 -boot start_sasl -kernel inet_default_connect_options [{nodelay,true}] -sasl errlog_type error -sasl sasl_error_logger false -rabbit lager_log_root "/var/log/rabbitmq" -rabbit lager_default_file "/var/log/rabbitmq/rabbit@vps5322.log" -rabbit lager_upgrade_file "/var/log/rabbitmq/rabbit@vps5322_upgrade.log" -rabbit enabled_plugins_file "/etc/rabbitmq/enabled_plugins" -rabbit plugins_dir "/usr/lib/rabbitmq/plugins:/usr/lib/rabbitmq/lib/rabbitmq_server-3.7.4/plugins" -rabbit plugins_expand_dir "/var/lib/rabbitmq/mnesia/rabbit@vps5322-plugins-expand" -os_mon start_cpu_sup false -os_mon start_disksup false -os_mon start_memsup false -mnesia dir "/var/lib/rabbitmq/mnesia/rabbit@vps5322" -kernel inet_dist_listen_min 25672 -kernel inet_dist_listen_max 25672
├─17319 /usr/lib/erlang/erts-9.3/bin/epmd -daemon
├─17453 erl_child_setup 1024
├─17475 inet_gethost 4
└─17476 inet_gethost 4
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: ## ##
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: ## ## RabbitMQ 3.7.4. Copyright (C) 2007-2018 Pivotal Software, Inc.
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: ########## Licensed under the MPL. See http://www.rabbitmq.com/
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: ###### ##
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: ########## Logs: /var/log/rabbitmq/rabbit@vps5322.log
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: /var/log/rabbitmq/rabbit@vps5322_upgrade.log
Apr 07 18:24:57 vps5322 rabbitmq-server[17232]: Starting broker...
Apr 07 18:24:59 vps5322 rabbitmq-server[17232]: systemd unit for activation check: "rabbitmq-server.service"
Apr 07 18:24:59 vps5322 systemd[1]: Started RabbitMQ broker.
Apr 07 18:24:59 vps5322 rabbitmq-server[17232]: completed with 0 plugins.
Finally, note that the webapp is a Play Framework app, with these dependencies:
libraryDependencies ++= Seq(
    guice,
    "com.rabbitmq" % "amqp-client" % "5.2.0"
)
The second module is plain Java, built with Maven, with the following pom:
<dependency>
    <groupId>com.rabbitmq</groupId>
    <artifactId>amqp-client</artifactId>
    <version>5.2.0</version>
</dependency>
Any idea what the problem is?
Thank you very much !!

Finally I found the problem. This configuration actually works, but I couldn't see that because of a crash in my own app that went unlogged due to an error in my Log4j configuration.
Just in case: the error I had was that a local library, included in my pom with a relative path (${project.basedir}), was found by my IDE but no longer once deployed on the VPS. To solve this, I moved this (fortunately) very small library directly into my project. After fixing that, I had to reset RabbitMQ and then everything was fine:
rabbitmqctl stop_app   # stop the RabbitMQ application (the Erlang VM keeps running)
rabbitmqctl reset      # WARNING: returns the node to a virgin state (queues, users, vhosts are wiped)
rabbitmqctl start_app
Thank you very much,
Regards,

Related

lsyncd setup does not connect to remote

I configured my lsyncd in /etc/lsyncd/lsyncd.conf.lua as follows. How do I configure this file correctly?
settings {
    logfile = "/var/log/lsyncd/lsyncd.log",
    statusFile = "/var/log/lsyncd/lsyncd-status.log",
    statusInterval = 2
}
sync {
    default.rsync,
    source = "/home/john/Documents/reprogramming",
    target = "john.doe@localhost:~/reprogramming",
    rsync = {
        archive = false,
        acls = false,
        chmod = "D2755,F644",
        compress = true,
        links = false,
        owner = false,
        perms = false,
        verbose = true,
        rsh = "ssh -p 2222 -l john -i /home/john/.ssh/id_rsa -o StrictHostKeyChecking=no"
    }
}
This is the error message:
john@john:~$ tail -10 /var/log/lsyncd/lsyncd.log
Disconnected from 127.0.0.1 port 2222
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(228) [sender=3.2.3]
Thu Jun 16 11:11:01 2022 Error: Temporary or permanent failure on startup of /home/john/Documents/reprogramming/ -> john.doe@localhost:~/reprogramming/. Terminating since "insist" is not set.
Thu Jun 16 11:35:46 2022 Normal: --- Startup, daemonizing ---
Thu Jun 16 11:35:46 2022 Normal: recursive startup rsync: /home/john/Documents/reprogramming/ -> john.doe@localhost:~/reprogramming/
ssh: connect to host localhost port 2222: Connection refused
rsync: connection unexpectedly closed (0 bytes received so far) [sender]
rsync error: unexplained error (code 255) at io.c(228) [sender=3.2.3]
Thu Jun 16 11:35:46 2022 Error: Temporary or permanent failure on startup of /home/john/Documents/reprogramming/ -> john.doe@localhost:~/reprogramming/. Terminating since "insist" is not set.
If I run ssh john.doe@localhost -p 2222 myself, it connects without any problem.
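The last log line points at one knob: lsyncd terminates at startup because `insist` is not set. A minimal sketch of the settings block (paths as in the question; whether retrying actually helps depends on why port 2222 refuses connections at the moment lsyncd starts):

```lua
settings {
    logfile        = "/var/log/lsyncd/lsyncd.log",
    statusFile     = "/var/log/lsyncd/lsyncd-status.log",
    statusInterval = 2,
    insist         = true,  -- keep retrying instead of terminating when the startup rsync fails
}
```

Also note that the lsyncd service typically runs as root, so the ssh in `rsh` runs as root even though `-l john -i /home/john/.ssh/id_rsa` is given, while the manual test `ssh john.doe@localhost -p 2222` runs as your own user with your own environment.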

Codeception Acceptance Testing issue using session snapshot

Update 10 Jun, 2021
So when removing the populator from codeception.yml, the session problem goes away.
BUT: nothing in dump.sql touches users, sessions, or cookies. There are only a few tables with demo data, and they are needed!
The relevant part in the file is this:
codeception.yml
...
modules:
  enabled: [Db]
  config:
    Db:
      dsn: "mysql:host=%HOST%;dbname=%DBNAME%"
      user: "root"
      password: "root"
      populate: true
      cleanup: true
      # populator: "mysql -u$user -p$password $dbname < tests/codeception/_data/dump.sql"
...
Original Post
I think I've read almost all similar resources on this issue, but nothing has helped so far.
I am moving our Codeception tests to GitHub Actions. The whole build process runs, but the acceptance tests don't, because the session snapshot can't be restored.
The same workflow works on a local server where I use the Selenium WebDriver. I tried to run Selenium in Actions (commented out in build.yml), but that caused some port problems.
What I'm doing in this short example is installing Joomla (which works) and then creating a content category.
The second step (creating a content category) tries to pick up the session created in the first step.
It's very simple and no problem locally, but on Actions the created session cannot be read.
The relevant report part:
InstallCest: Install joomla
Signature: InstallCest:installJoomla
Test: tests/codeception/acceptance/install/InstallCest.php:installJoomla
... works
InstallCest: createCategory
Signature: InstallCest:createcategory
Test: tests/codeception/acceptance/install/InstallCest.php:createcategory
Scenario --
[Db] Executing Populator: `mysql -uroot -proot test < tests/codeception/_data/dump.sql`
[Db] Populator Finished.
I create category "test 123"
Category creation in /administrator/
I open Joomla Administrator Login Page
[GET] http://127.0.0.1:8000/administrator/index.php
[Cookies] [{"name":"9d4bb4a09f511681369671a08beff228","value":"fail5495jbd01q6dc2nm06i7gf","path":"/","domain":"127.0.0.1","expiry":1623346855,"secure":false,"httpOnly":false},{"name":"8b5558aac8008f05fd8f8e59a3244887","value":"irhlqlj8jabat2n5746ba0sb5r","path":"/","domain":"127.0.0.1","expiry":1623346855,"secure":false,"httpOnly":false}]
[Snapshot] Restored "admin" session snapshot
[GET] http://127.0.0.1:8000/administrator/index.php?option=com_categories
Screenshot and page source were saved into '/home/runner/work/project_b/project_b/tests/codeception/_output/' dir
ERROR
The report:
session not created: No matching capabilities found
The HTML Snapshot:
Warning: session_start(): Failed to read session data: user (path: /var/lib/php/sessions) in /home/runner/work/project_b/project_b/joomla/libraries/joomla/session/handler/native.php on line 260
Error: Failed to start application: Failed to start the session
The php.log part
[Wed Jun 9 18:24:13 2021] 127.0.0.1:41972 Accepted
[Wed Jun 9 18:24:13 2021] 127.0.0.1:41972 [200]: GET /media/jui/fonts/IcoMoon.woff
[Wed Jun 9 18:24:13 2021] 127.0.0.1:41972 Closing
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41982 Accepted
[Wed Jun 9 18:24:16 2021] PHP Warning: session_start(): Failed to read session data: user (path: /var/lib/php/sessions) in /home/runner/work/project_b/project_b/joomla/libraries/joomla/session/handler/native.php on line 260
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41982 [500]: GET /administrator/index.php
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41982 Closing
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41986 Accepted
[Wed Jun 9 18:24:16 2021] PHP Warning: session_start(): Failed to read session data: user (path: /var/lib/php/sessions) in /home/runner/work/project_b/project_b/joomla/libraries/joomla/session/handler/native.php on line 260
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41986 [500]: GET /administrator/index.php?option=com_categories
[Wed Jun 9 18:24:16 2021] 127.0.0.1:41986 Closing
I tried changing session.save_path, without effect.
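The warning points at permissions rather than code: `php -S` runs as the workflow's runner user, which typically cannot read or write the distro default /var/lib/php/sessions. A sketch (the directory name is an arbitrary choice) of pointing the built-in server at a path the runner user owns:

```shell
# Create a session directory the current user owns, then start the
# built-in server with session.save_path overridden. The php -S line
# mirrors the build.yml step and is shown as a comment here:
mkdir -p /tmp/php-sessions
#   php -d session.save_path=/tmp/php-sessions -S 127.0.0.1:8000 -t joomla/ &> php.log.txt &
[ -w /tmp/php-sessions ] && echo "session dir writable"
# -> session dir writable
```

This is the same idea as the commented-out `ini-values: session.save_path=/tmp` in the Setup PHP step.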
Posting relevant pieces:
composer.json
{
    "name": "company/tests",
    "description": "Company Product",
    "license": "GPL-2.0+",
    "require": {},
    "require-dev": {
        "codeception/codeception": "^4",
        "fzaninotto/faker": "^1.6",
        "behat/gherkin": "^4.4.1",
        "phing/phing": "2.*",
        "codeception/module-asserts": "^1.3",
        "codeception/module-webdriver": "^1.2",
        "codeception/module-filesystem": "^1.0",
        "codeception/module-db": "^1.1"
    }
}
build.yml
name: Codeception Tests
on: [push]
jobs:
tests:
runs-on: ${{ matrix.operating-system }}
strategy:
fail-fast: false
matrix:
operating-system: [ubuntu-latest]
php: ["7.4"]
name: PHP ${{ matrix.php }} Test on ${{ matrix.operating-system }}
env:
php-ini-values: post_max_size=32M
DB_DATABASE: test
DB_NAME: test
DB_ADAPTER: mysql
DB_USERNAME: root
DB_PASSWORD: root
DB_HOST: 127.0.0.1
DB_PORT: 3306
APP_URL: http://127.0.0.1:8000
steps:
- name: Checkout
uses: actions/checkout#v2
- name: Checkout Joomla 3
uses: actions/checkout#v2
with:
repository: joomla/joomla-cms
ref: "3.9.27"
path: joomla
- name: Setup PHP
uses: shivammathur/setup-php#v2
with:
php-version: ${{ matrix.php }}
# ini-values: session.save_path=/tmp
extensions: mbstring, intl, zip, json
tools: composer:v2
- name: Start MySQL
run: |
sudo /etc/init.d/mysql start
mysql -e 'CREATE DATABASE test;' -uroot -proot
mysql -e 'SHOW DATABASES;' -uroot -proot
# Composer stuff ...
- name: Run chromedriver
run: nohup $CHROMEWEBDRIVER/chromedriver --url-base=/wd/hub /dev/null 2>&1 &
# - name: Start ChromeDriver (was a try)
# run: |
# google-chrome --version
# xvfb-run --server-args="-screen 0, 1280x720x24" --auto-servernum \
# chromedriver --port=4444 --url-base=/wd/hub &> chromedriver.log &
- name: Run PHP webserver
run: |
php -S 127.0.0.1:8000 -t joomla/ &> php.log.txt &
sleep 1;
- name: Install Tests
run: |
php vendor/bin/codecept run "tests/codeception/acceptance/install/InstallCest.php" -vv --html
env:
DB_PORT: ${{ job.services.mysql.ports[3306] }}
- name: Upload Codeception output
if: ${{ always() }}
uses: actions/upload-artifact#v2
with:
name: codeception-results
# path: Tests/Acceptance/_output/
path: tests/codeception/_output/
- name: Upload PHP log
if: ${{ failure() }}
uses: actions/upload-artifact#v2
with:
name: php-log
path: php.log.txt
Acceptance Suite
class_name: AcceptanceTester
modules:
  enabled:
    - Asserts
    - JoomlaBrowser
    - Helper\Acceptance
    - DbHelper
    - Filesystem
  config:
    JoomlaBrowser:
      url: "http://127.0.0.1:8000/"
      browser: "chrome"
      restart: true
      clear_cookies: true
      # window_size: 1280x1024
      window_size: false
      port: 9515
      capabilities:
        unexpectedAlertBehaviour: "accept"
        chromeOptions:
          args: ["--headless", "--disable-gpu"] # Run Chrome in headless mode
          # prefs:
          #   download.default_directory: "..."
      username: "admin"                # username for the administrator
      password: "admin"                # password for the administrator
      database host: "127.0.0.1:3306"  # where the application is hosted (server address)
      database user: "root"            # MySQL server user ID, usually root
      database password: "root"        # MySQL server password, usually empty or root
      database name: "test"            # DB name on the server
      database type: "mysqli"          # in lowercase, one of: mysql/mysqli/pdo
      database prefix: "jos_"          # DB prefix for tables
      install sample data: "no"        # download the sample data along with the Joomla installation?
      sample data: "Default English (GB) Sample Data" # default sample data
      admin email: "admin@mydomain.com"               # email address of the admin
      language: "English (United Kingdom)"            # language in which the application is installed
    Helper\Acceptance:
      url: "http://127.0.0.1:8000/"    # the url that points to the joomla installation at /tests/system/joomla-cms - we need it twice here
      MicrosoftEdgeInsiders: false     # set this to true if you are on Windows Insiders
error_level: "E_ALL & ~E_STRICT & ~E_DEPRECATED"
InstallCest.php
<?php
/**
 * Install Joomla and create Category
 *
 * @since 3.7.3
 */
class InstallCest
{
    /**
     * Install Joomla, disable statistics and enable error reporting
     *
     * @param AcceptanceTester $I The AcceptanceTester object
     *
     * @since 3.7.3
     *
     * @return void
     */
    public function installJoomla(\AcceptanceTester $I)
    {
        $I->am('Administrator');
        $I->installJoomlaRemovingInstallationFolder();
        $I->doAdministratorLogin();
        $I->disableStatistics();
        $I->setErrorReportingToDevelopment();
    }

    /**
     * Just create a Category
     *
     * @param AcceptanceTester $I The AcceptanceTester object
     *
     * @since 3.7.3
     *
     * @return void
     */
    public function createCategory(\AcceptanceTester $I)
    {
        $I->createCategory('test 123');
    }
}

Automated Testing with Project Piper

We are currently setting up a CI/CD pipeline for our SCP CF environment based on Project Piper. After a lot of trial & error we have made significant progress, but we are still not able to run our Karma tests.
We understood that we need to use the karma-webdriver-launcher, but we are not able to start/connect to Chrome in the sidecar container.
Relevant logs:
[Pipeline] sh
+ cd ./nodejs
+ npm test
> nodejs#1.0.0 test /var/jenkins_home/workspace/CICD_Multibranch_master/nodejs
> karma start ./karma.conf.js
10 07 2019 14:42:53.897:DEBUG [config]: Loading config /var/jenkins_home/workspace/CICD_Multibranch_master/nodejs/karma.conf.js
10 07 2019 14:42:53.906:DEBUG [karma-server]: Final config [{"LOG_DISABLE":"1","LOG_ERROR":"2","LOG_WARN":"3","LOG_INFO":"4","LOG_DEBUG":"5","frameworks":"6","protocol":"7","port":9876,"listenAddress":"8","hostname":"9","httpsServerConfig":"10","basePath":"11","files":"12","browserConsoleLogOptions":"13","customContextFile":null,"customDebugFile":null,"customClientContextFile":null,"exclude":"14","logLevel":"5","colors":false,"autoWatch":false,"autoWatchBatchDelay":250,"restartOnFileChange":false,"usePolling":true,"reporters":"15","singleRun":true,"browsers":"16","captureTimeout":60000,"proxies":"17","proxyValidateSSL":true,"preprocessors":"18","urlRoot":"19","reportSlowerThan":0,"loggers":"20","transports":"21","forceJSONP":false,"plugins":"22","client":"23","defaultClient":"23","browserDisconnectTimeout":2000,"browserDisconnectTolerance":0,"browserNoActivityTimeout":30000,"processKillTimeout":2000,"concurrency":null,"failOnEmptyTestSuite":true,"retryLimit":2,"detached":false,"crossOriginAttribute":true,"browserSocketTimeout":20000,"cmd":"24","configFile":"25","customLaunchers":"26","junitReporter":"27"},"OFF","ERROR","WARN","INFO","DEBUG",["28"],"http:","0.0.0.0","localhost",{},"/var/jenkins_home/workspace/CICD_Multibranch_master/nodejs",["29","30"],{"level":"31","format":"32","terminal":true},["25"],["33"],["34"],{},{},"/",["35"],["36","37"],["38","39","40","41","42"],{"args":"43","useIframe":true,"runInParent":false,"captureConsole":true,"clearContext":true},"start","/var/jenkins_home/workspace/CICD_Multibranch_master/nodejs/karma.conf.js",{"chromeSel":"44"},{"outputFile":"45","suite":"46"},"qunit",{"pattern":"47","served":true,"included":true,"watched":true,"nocache":false,"weight":"48"},{"pattern":"49","served":true,"included":true,"watched":true,"nocache":false,"weight":"50"},"debug","%b %T: 
%m","junit","chromeSel",{"type":"51","layout":"52"},"polling","websocket","karma-qunit","karma-junit-reporter","karma-chrome-launcher","karma-webdriver-launcher",{"launcher:chromeSel":"53"},[],{"base":"54","config":"55","browserName":"56"},"test-results-karma.xml","","/var/jenkins_home/workspace/CICD_Multibranch_master/nodejs/server.js",[1,0,0,0,0,0],"/var/jenkins_home/workspace/CICD_Multibranch_master/nodejs/tests/sampleTest.spec.js",[1,0,0,0,0,0],"console",{"type":"57","pattern":"58"},["59",null],"WebDriver",{"hostname":"60","port":4444},"chrome","pattern","%d{DATE}:%p [%c]: %m","factory","selenium"]
10 07 2019 14:42:53.907:DEBUG [plugin]: Loading plugin karma-qunit.
10 07 2019 14:42:53.908:DEBUG [plugin]: Loading plugin karma-junit-reporter.
10 07 2019 14:42:53.923:DEBUG [plugin]: Loading plugin karma-chrome-launcher.
10 07 2019 14:42:53.931:DEBUG [plugin]: Loading plugin karma-webdriver-launcher.
10 07 2019 14:42:54.179:DEBUG [plugin]: Loading inlined plugin (defining launcher:chromeSel).
10 07 2019 14:42:54.193:DEBUG [web-server]: Instantiating middleware
10 07 2019 14:42:54.194:DEBUG [reporter]: Trying to load reporter: junit
10 07 2019 14:42:54.195:DEBUG [reporter]: Trying to load color-version of reporter: junit (junit_color)
10 07 2019 14:42:54.195:DEBUG [reporter]: Couldn't load color-version.
10 07 2019 14:42:54.224:INFO [karma-server]: Karma v4.1.0 server started at http://0.0.0.0:9876/
10 07 2019 14:42:54.225:INFO [launcher]: Launching browsers chromeSel with concurrency unlimited
10 07 2019 14:42:54.231:INFO [launcher]: Starting browser chrome via Remote WebDriver
10 07 2019 14:42:54.232:DEBUG [launcher]: null -> BEING_CAPTURED
10 07 2019 14:42:54.232:DEBUG [temp-dir]: Creating temp dir at /tmp/karma-89562642
10 07 2019 14:42:54.235:DEBUG [WebDriver]: WebDriver config: {"hostname":"selenium","port":4444}
10 07 2019 14:42:54.235:DEBUG [WebDriver]: Browser capabilities: {"platform":"ANY","testName":"Karma test","tags":[],"version":"","base":"WebDriver","browserName":"chrome"}
10 07 2019 14:43:54.281:WARN [launcher]: chrome via Remote WebDriver have not captured in 60000 ms, killing.
10 07 2019 14:43:54.286:DEBUG [launcher]: BEING_CAPTURED -> BEING_KILLED
10 07 2019 14:43:54.355:INFO [WebDriver]: Killed Karma test.
10 07 2019 14:43:54.355:DEBUG [launcher]: Process chrome via Remote WebDriver exited with code -1 and signal timeout
10 07 2019 14:43:54.356:DEBUG [temp-dir]: Cleaning temp dir /tmp/karma-89562642
10 07 2019 14:43:54.358:INFO [launcher]: Trying to start chrome via Remote WebDriver again (1/2).
karma.conf.js:
// Karma configuration
module.exports = function (config) {
    var webdriverConfig = {
        hostname: 'selenium',
        port: 4444
    }
    config.set({
        // base path that will be used to resolve all patterns (eg. files, exclude)
        basePath: '',
        // frameworks to use
        // available frameworks: https://npmjs.org/browse/keyword/karma-adapter
        frameworks: ['qunit'],
        plugins: ['karma-qunit', 'karma-junit-reporter', 'karma-chrome-launcher', 'karma-webdriver-launcher'],
        // list of files / patterns to load in the browser
        files: [
            'server.js',
            'tests/sampleTest.spec.js'
        ],
        // list of files to exclude
        exclude: [],
        // preprocess matching files before serving them to the browser
        // available preprocessors: https://npmjs.org/browse/keyword/karma-preprocessor
        preprocessors: {},
        // test results reporter to use
        // possible values: 'dots', 'progress'
        // available reporters: https://npmjs.org/browse/keyword/karma-reporter
        reporters: ['junit'],
        // web server port
        //port: 9876,
        //hostname: localhost,
        // enable / disable colors in the output (reporters and logs)
        colors: false,
        // level of logging
        // possible values: config.LOG_DISABLE || config.LOG_ERROR || config.LOG_WARN || config.LOG_INFO || config.LOG_DEBUG
        logLevel: config.LOG_DEBUG,
        // enable / disable watching file and executing tests whenever any file changes
        autoWatch: false,
        customLaunchers: {
            'chromeSel': {
                base: 'WebDriver',
                config: webdriverConfig,
                browserName: 'chrome'
            }
        },
        // start these browsers
        // available browser launchers: https://npmjs.org/browse/keyword/karma-launcher
        browsers: ['chromeSel'],
        // Continuous Integration mode
        // if true, Karma captures browsers, runs the tests and exits
        singleRun: true,
        junitReporter: {
            outputFile: 'test-results-karma.xml',
            suite: ''
        }
    });
};
Jenkinsfile:
#!groovy
@Library('piper-lib-os') _
node() {
    stage('Prepare') {
        deleteDir()
        checkout scm
        setupCommonPipelineEnvironment
    }
    stage('Build') {
        karmaExecuteTests script: this, modules: ['./nodejs'],
            installCommand: "npm install karma karma-qunit karma-junit-reporter karma-chrome-launcher qunit karma-webdriver-launcher",
            runCommand: 'npm test'
        mtaBuild script: this, buildTarget: 'CF', applicationName: 'appLibrary'
    }
    stage('Deploy to QA') {
        testsPublishResults script: this, junit: [updateResults: true, archive: true]
        cloudFoundryDeploy(
            script: this,
            cloudFoundry: [apiEndpoint: 'https://api.cf.eu10.hana.ondemand.com', manifest: 'manifest.yml', org: 'xxx', space: 'xxx', credentialsId: 'xxx'],
            deployTool: 'mtaDeployPlugin'
        )
    }
}
Thanks a lot,
Nico
To run the karmaExecuteTests step on your Jenkins, you need a Docker daemon installed. I assume this is the case.
In the Docker sidecar pattern the containers talk to each other using their container names (karma & selenium). For the browser (config.customLaunchers.chromeSel.config.hostname) you correctly set the hostname to selenium, but you also need to set config.hostname to karma.

traefik with systemd doesn't see docker containers

I want to start traefik through systemd, but I don't get the same results with systemd as with a manual start.
Here is an example of starting traefik manually:
$ traefik --web \
--docker \
--docker.domain=docker
$ docker ps -q
164f73add870
$ # check traefik api
$ http http://localhost:8080/api/providers
HTTP/1.1 200 OK
Content-Length: 377
Content-Type: application/json; charset=UTF-8
Date: Sun, 15 Oct 2017 10:26:09 GMT
{
    "docker": {
        "backends": {
            "backend-rancher": {
                "loadBalancer": {
                    "method": "wrr"
                },
                "servers": {
                    "server-rancher": {
                        "url": "http://172.17.0.2:8080",
                        "weight": 0
                    }
                }
            }
        },
        "frontends": {
            "frontend-Host-rancher-docker": {
                "backend": "backend-rancher",
                "basicAuth": [],
                "entryPoints": [
                    "http"
                ],
                "passHostHeader": true,
                "priority": 0,
                "routes": {
                    "route-frontend-Host-rancher-docker": {
                        "rule": "Host:rancher.docker"
                    }
                }
            }
        }
    }
}
And when I use systemd:
$ sudo systemctl status traefik
● traefik.service - Traefik reverse proxy
Loaded: loaded (/usr/lib/systemd/system/traefik.service; enabled; vendor preset: disabled)
Active: active (running) since Sun 2017-10-15 12:27:35 CEST; 4s ago
Main PID: 12643 (traefik)
Tasks: 9 (limit: 4915)
Memory: 14.6M
CPU: 256ms
CGroup: /system.slice/traefik.service
└─12643 /usr/bin/traefik --web --docker --docker.domain=docker
Oct 15 12:27:35 devbox systemd[1]: Started Traefik reverse proxy.
$ docker ps -q
164f73add870
$ # check traefik api
$ http http://localhost:8080/api/providers
HTTP/1.1 200 OK
Content-Length: 2
Content-Type: application/json; charset=UTF-8
Date: Sun, 15 Oct 2017 10:28:18 GMT
{}
Any idea why I don't see my docker container?
Adding this to the service unit with my user/group made it work!
[Service]
User=...
Group=...
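An equivalent drop-in sketch (the user and group names are placeholders; membership in the docker group is what usually grants access to /var/run/docker.sock, which traefik's docker provider reads):

```ini
# /etc/systemd/system/traefik.service.d/override.conf  (hypothetical drop-in path)
[Service]
User=youruser
Group=docker
```

After adding it, run systemctl daemon-reload and restart the service.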

In custom AMI sshd is not getting started

I created my own AMI, and when I start my instance sshd does not start. What might be the problem?
Please find below the system log snippet
init: rcS main process (199) terminated with status 1
Entering non-interactive startup
NET: Registered protocol family 10
lo: Disabled Privacy Extensions
Bringing up loopback interface: OK
Bringing up interface eth0:
Determining IP information for eth0...type=1400 audit(1337940238.646:4): avc: denied { getattr } for pid=637 comm="dhclient-script" path="/etc/sysconfig/network" dev=xvde1 ino=136359 scontext=system_u:system_r:dhcpc_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
martian source 255.255.255.255 from 169.254.1.0, on dev eth0
ll header: ff:ff:ff:ff:ff:ff:fe:ff:ff:ff:ff:ff:08:00
type=1400 audit(1337940239.023:5): avc: denied { getattr } for pid=647 comm="dhclient-script" path="/etc/sysconfig/network" dev=xvde1 ino=136359 scontext=system_u:system_r:dhcpc_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
type=1400 audit(1337940239.515:6): avc: denied { getattr } for pid=674 comm="dhclient-script" path="/etc/sysconfig/network" dev=xvde1 ino=136359 scontext=system_u:system_r:dhcpc_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
type=1400 audit(1337940239.560:7): avc: denied { getattr } for pid=690 comm="dhclient-script" path="/etc/sysconfig/network" dev=xvde1 ino=136359 scontext=system_u:system_r:dhcpc_t:s0 tcontext=system_u:object_r:file_t:s0 tclass=file
done.
OK
Starting auditd: OK
Starting system logger: OK
Starting system message bus: OK
Retrigger failed udev events OK
Starting sshd: FAILED
The problem was due to SELinux. Once I disabled SELinux at boot by adding selinux=0 to the kernel line in GRUB, the machine booted with the sshd service started and I was able to connect to it.
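For reference, the boot-time selinux=0 argument is a one-off; a sketch of making the change persistent by editing /etc/selinux/config (demonstrated on a throwaway copy so the commands are self-contained; on a real system you would edit the file in place. Alternatively, since the file_t contexts in the AVC denials suggest an unlabeled filesystem, you could keep SELinux and trigger a relabel with touch /.autorelabel and a reboot):

```shell
# Demonstrate the edit on a copy; the real file is /etc/selinux/config.
printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > /tmp/selinux-config-demo
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /tmp/selinux-config-demo
grep '^SELINUX=' /tmp/selinux-config-demo
# -> SELINUX=disabled
```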