Unable to load pinterest-linters (phabricator, arcanist) - lint

I was trying to add pinterest-linters to my project, but I got errors when running arc lint:
ARGV '/Users/lushali/somewhere/arcanist/bin/../scripts/arcanist.php' 'lint' '--trace'
LOAD Loaded "phutil" from "/Users/lushali/somewhere/libphutil/src".
LOAD Loaded "arcanist" from "/Users/lushali/somewhere/arcanist/src".
Config: Reading user configuration file "/Users/lushali/.arcrc"...
Config: Did not find system configuration at "/etc/arcconfig".
Working Copy: Reading .arcconfig from "/Users/lushali/code/nexus_admin/.arcconfig".
Working Copy: Path "/Users/lushali/code/nexus_admin" is part of `git` working copy "/Users/lushali/code/nexus_admin".
Working Copy: Project root is at "/Users/lushali/code/nexus_admin".
Config: Did not find local configuration at "/Users/lushali/code/nexus_admin/.git/arc/config".
Loading phutil library from 'pinterest-linters'...
>>> [0] (+0) <exec> $ git rev-parse --verify HEAD^
<<< [0] (+10) <exec> 10,455 us
>>> [1] (+10) <exec> $ git rev-parse --abbrev-ref --symbolic-full-name '#{upstream}'
<<< [1] (+20) <exec> 9,782 us
>>> [2] (+20) <exec> $ git cat-file -t 'origin/master'
<<< [2] (+34) <exec> 13,470 us
>>> [3] (+34) <exec> $ git merge-base 'origin/master' HEAD
<<< [3] (+46) <exec> 11,523 us
>>> [4] (+46) <exec> $ git diff --no-ext-diff --no-textconv --submodule=short --raw '1bba3081e23c945b3f795eedc73b99e3f509b5ed' HEAD --
<<< [4] (+59) <exec> 12,633 us
>>> [5] (+59) <exec> $ git --version
<<< [5] (+71) <exec> 11,867 us
>>> [6] (+71) <exec> $ git status --porcelain=2 -z
<<< [6] (+88) <exec> 16,486 us
>>> [7] (+88) <exec> $ git diff --no-ext-diff --no-textconv --submodule=short --no-color --src-prefix=a/ --dst-prefix=b/ -U32767 -M -C '1bba3081e23c945b3f795eedc73b99e3f509b5ed' --
<<< [7] (+120) <exec> 31,626 us
Examining paths for linter 'spelling'.
Found 6 matching paths for linter 'spelling'.
Examining paths for linter 'generated'.
Found 6 matching paths for linter 'generated'.
Examining paths for linter 'merge-conflict'.
Found 6 matching paths for linter 'merge-conflict'.
Examining paths for linter 'general-text'.
Found 4 matching paths for linter 'general-text'.
Examining paths for linter 'json'.
Found 4 matching paths for linter 'json'.
[2019-06-03 22:05:07] EXCEPTION: (PhutilProxyException) Error in parsing '.arclint' file, in key 'bin' for linter 'prettier-eslint'. {>} (Exception) None of the configured binaries can be located. at [<arcanist>/src/lint/linter/ArcanistExternalLinter.php:543]
arcanist(head=master, ref.master=7329bc7c32b9), phutil(head=master, ref.master=86ee6e90797c), pinterest-linters(head=master, ref.master=3628e14b6f57)
#0 <#2> ArcanistExternalLinter::setLinterConfigurationValue(string, string) called at [<pinterest-linters>/src/PrettierESLintLinter.php:114]
#1 <#2> PrettierESLintLinter::setLinterConfigurationValue(string, string) called at [<arcanist>/src/lint/engine/ArcanistConfigurationDrivenLintEngine.php:103]
#2 ArcanistConfigurationDrivenLintEngine::buildLinters() called at [<arcanist>/src/lint/engine/ArcanistLintEngine.php:166]
#3 ArcanistLintEngine::run() called at [<arcanist>/src/workflow/ArcanistLintWorkflow.php:337]
#4 ArcanistLintWorkflow::run() called at [<arcanist>/scripts/arcanist.php:394]
I was following the instructions from the GitHub README (https://github.com/pinterest/arcanist-linters#global-installation).
Here is my .arcconfig file:
{
  "phabricator.uri" : "https://code.XXXXX.com/",
  "load": [
    "pinterest-linters"
  ]
}
I couldn't find .bin/prettier-eslint in my node_modules folder, so I suspect something went wrong when Arcanist loaded the module. I spent a lot of time searching online but found nothing useful. Can someone help? Thanks a lot!

From your error message, I think you need to create an .arclint file on top of adding pinterest-linters to your .arcconfig file. For example, an .arclint file for prettier and eslint would look something like this:
{
  "linters": {
    "eslint": {
      "type": "eslint",
      "include": ["(\\.js$)", "(\\.jsx$)"],
      "bin": "./node_modules/.bin/eslint",
      "eslint.config": ".eslintrc",
      "eslint.env": "browser,node"
    },
    "prettier": {
      "type": "prettier",
      "include": ["(\\.js$)", "(\\.jsx$)"],
      "bin": "./node_modules/.bin/prettier",
      "prettier.cwd": "./"
    }
  }
}
However, yours may differ, since your error message shows you are using the prettier-eslint linter.
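If the binary itself is missing from node_modules, installing it is usually the quickest fix. A minimal sketch, assuming your project uses npm and that the linter shells out to the prettier-eslint CLI (the package name below is my assumption, not something taken from your trace):
npm install --save-dev prettier-eslint-cli
ls node_modules/.bin/prettier-eslint
The "bin" entry in .arclint can then point at that local binary.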

Related

Getting a weird "missing output" error when running a Nextflow pipeline

For my data analysis pipeline I am using Nextflow (a workflow management system). I pass all the required arguments in the main command, but I am getting a weird error: the output section declares the output file, yet the error says the output is missing. I have made 3 files to run the pipeline:
1- the module containing the main code to run the tool (ASEReadCounter), named ASEReadCounter.nf:
process ASEReadCounter {
    input:
    file vcf_file
    file bam_file
    path(genome_fasta)

    output:
    file "${vcf_file}.ASE.csv"

    script:
    """
    gatk ASEReadCounter \\
        -R ${params.genome} \\
        -V ${params.vcf_infile} \\
        -O ${params.vcf_infile}.txt \\
        -I ${params.bam_infile}
    """
}
2- the main file used to run the pipeline, named main.nf:
#!/usr/bin/env nextflow
nextflow.preview.dsl=2
include ASEReadCounter from './modules/ASEReadCounter.nf'
genome_ch = Channel.fromPath(params.genome)
vcf_file_ch=Channel.fromPath(params.vcf_infile)
bam_infile_ch=Channel.fromPath(params.bam_infile)
workflow {
    count_ch = ASEReadCounter(genome_ch, vcf_file_ch, bam_infile_ch)
}
3- the config file, named nextflow.config:
params {
    genome = '/hpc/hg38_genome/GRCh38.p13.genome.fa'
    vcf_infile = '/hpc/test_data/test/test.vcf.gz'
    bam_infile = '/hpc/test_data/test/test.sorted.bam'
}

process {
    shell = ['/bin/bash', '-euo', 'pipefail']
    withName: ASEReadCounter {
        container = 'broadinstitute/gatk:latest'
    }
}

singularity {
    enabled = true
    runOptions = '-B /hpc:/hpc -B $TMPDIR:$TMPDIR'
    autoMounts = true
    cacheDir = '/hpc/diaggen/software/singularity_cache'
}
Here is the command I use to run the whole pipeline:
nextflow run -ansi-log false main.nf
Here is the error I am getting:
Error executing process > 'ASEReadCounter (1)'
Caused by:
Missing output file(s) `GRCh38.p13.genome.fa.ASE.csv` expected by process `ASEReadCounter (1)`
Command executed:
gatk ASEReadCounter -R /hpc/hg38_genome/GRCh38.p13.genome.fa \
-V /hpc/test_data/test/test.vcf.gz \
-O /hpc/test_data/test/test.vcf.gz.txt \
-I /hpc/test_data/test/test.sorted.bam
Do you know how I can fix the error?
Your ASEReadCounter process expects a file with the pattern ${vcf_file}.ASE.csv as output, as defined at:
output:
file "${vcf_file}.ASE.csv"
I assume the line -O ${params.vcf_infile}.txt makes your gatk ASEReadCounter command write its output to a file matching the pattern ${params.vcf_infile}.txt. The expected output and the actual output filenames don't match, and Nextflow throws an error because it doesn't want to continue, since a downstream process might need the output file. You can fix it by making the expected and actual output patterns match.
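As a sketch only, the gatk call inside the script block could write to the name the output block declares, assuming the input channels are wired so that vcf_file, bam_file and genome_fasta actually receive the VCF, BAM and FASTA files:
gatk ASEReadCounter \\
    -R ${genome_fasta} \\
    -V ${vcf_file} \\
    -O ${vcf_file}.ASE.csv \\
    -I ${bam_file}
Writing to a relative path inside the task work directory also lets Nextflow pick the file up as an output.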

Creating a Perl 6 module containing Perl 5 utility scripts in bin/

Perl 6 script in a Perl 5 module distribution
I can include a Perl 6 script in a Perl 5 module distribution:
# Create a new module
dzil new My::Dist
cd My-Dist/
# Add necessary boilerplate
echo '# ABSTRACT: boilerplate' >> lib/My/Dist.pm
# Create Perl 6 script in bin directory
mkdir bin
echo '#!/usr/bin/env perl6' > bin/hello.p6
echo 'put "Hello world!";' >> bin/hello.p6
# Install module
dzil install
# Test script
hello.p6
# Hello world!
# See that it is actually installed
which hello.p6
# ~/perl5/perlbrew/perls/perl-5.20.1/bin/hello.p6
Perl 5 script in a Perl 6 module distribution
However, I'm having a hard time including Perl 5 scripts in a Perl 6 distribution.
In the module directory is a META6.json file and a subdirectory called bin. In bin is a Perl 5 file called hello.pl.
zef install . runs without error in the top directory, but when I try to run hello.pl I get an error. It turns out that a Perl 6 wrapper script had been installed for hello.pl, and that wrapper is what gives the error. If I run the original hello.pl directly, it works fine.
META6.json
{
  "perl" : "6.c",
  "name" : "TESTING1234",
  "license" : "Artistic-2.0",
  "version" : "0.0.2",
  "auth" : "github:author",
  "authors" : ["First Last"],
  "description" : "TESTING module creation",
  "provides" : {
  },
  "depends" : [ ],
  "test-depends" : [ "Test", "Test::META" ]
}
bin/hello.pl
#!/usr/bin/env perl
use v5.10;
use strict;
use warnings;
say 'Hello world!';
This installs without error, but when I try to run hello.pl, I get the following error:
===SORRY!===
Could not find Perl5 at line 2 in:
/home/username/.perl6
/path/to/perl6/rakudo-star-2017.07/install/share/perl6/site
/path/to/perl6/rakudo-star-2017.07/install/share/perl6/vendor
/path/to/perl6/rakudo-star-2017.07/install/share/perl6
CompUnit::Repository::AbsolutePath<64730416>
CompUnit::Repository::NQP<43359856>
CompUnit::Repository::Perl5<43359896>
Running which hello.pl from the command line indicated that it was installed as /path/to/perl6/rakudo-star-2017.07/install/share/perl6/site/bin/hello.pl. That file actually contains the following code:
/path/to/perl6/rakudo-star-2017.07/install/share/perl6/site/bin/hello.pl
#!/usr/bin/env perl6
sub MAIN(:$name is copy, :$auth, :$ver, *@, *%) {
    CompUnit::RepositoryRegistry.run-script("hello.pl", :dist-name<TESTING1234>, :$name, :$auth, :$ver);
}
I filed a Rakudo bug report (https://rt.perl.org/Ticket/Display.html?id=131911), but I'm not totally convinced that there isn't a simple work around.
As an example, I created a simple cat replacement in Perl 5 and a Perl 6 module that wraps it (see the GitHub repository if you'd like to download the code and try it yourself).
Below are copies of the relevant files. After creating these files, zef install . runs fine with my Rakudo Star 2017.07 installation and installs a run_cat executable in your Rakudo bin directory.
It seemed like the secret was to make a Perl 6 module file to wrap the Perl 5 script and a corresponding Perl 6 script to use the Perl 6 module.
Perl 5 script
resources/scripts/cat.pl
#!/bin/env perl
use v5.10;
use strict;
use warnings;
while(<>) {
    print;
}
Wrapper scripts
module: lib/catenate.pm6
unit module catenate;

sub cat ($filename) is export {
    run('perl',%?RESOURCES<scripts/cat.pl>,$filename);
}
executable: bin/run_cat
#!/bin/env perl6
use catenate;
sub MAIN ($filename) {
    cat($filename);
}
Boilerplate and tests
META6.json
{
  "perl" : "6.c",
  "name" : "cat",
  "license" : "Artistic-2.0",
  "version" : "0.0.9",
  "auth" : "github:author",
  "authors" : ["First Last"],
  "description" : "file catenation utility",
  "provides" : { "catenate" : "lib/catenate.pm6" },
  "test-depends" : [ "Test", "Test::META" ],
  "resources" : [ "scripts/cat.pl" ]
}
t/cat.t
#!/bin/env perl6
use Test;
constant GREETING = 'Hello world!';
my $filename = 'test.txt';
spurt($filename, GREETING);
my $p5 = qqx{ resources/scripts/cat.pl $filename };
my $p6 = qqx{ bin/run_cat $filename };
is $p6, $p5, 'wrapped script gives same result as original';
is $p6, GREETING, "output is '{GREETING}' as expected";
unlink $filename;
done-testing;
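For reference, this is roughly how I exercise it once the files above are in place (shell commands, assuming zef and the layout shown here):
zef install .
echo 'Hello world!' > test.txt
run_cat test.txt
# Hello world!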
Thanks @moritz and @ugexe for getting me pointed in the right direction!

No mapping using babel and browserify in PlayFramework 2.5

I am writing a project that is based on the seed at git@github.com:maximebourreau/play-reactjs-es6-seed.git
It appears to work with Play 2.3, but I am getting errors with Play 2.5.
When I try to load a page, I get the following error:
RuntimeException: No mapping for /path/to/root/target/web/browserify/main.js
The full stack trace is at the bottom of the message.
This is the file that browserify is writing, but it seems that play does not know how to map it to a URL. I was not able to find anything that says where the file should be written or how to add a new mapping. I am also happy to use a plugin to do the translation.
Where should I be writing the file, or how should I tell play how to map the file?
My build.sbt is
name := """myProject"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  javaJdbc,
  cache,
  javaWs
)
val browserifyTask = taskKey[Seq[File]]("Run browserify")
val browserifyOutputDir = settingKey[File]("Browserify output directory")
browserifyOutputDir := target.value / "web" / "browserify"
browserifyTask := {
  println("Running browserify");
  val outputFile = browserifyOutputDir.value / "main.js"
  browserifyOutputDir.value.mkdirs
  "./node_modules/.bin/browserify -t [ babelify --presets [ es2015 react ] ] app/assets/javascripts/main.jsx -o "+outputFile.getPath !;
  List(outputFile)
}
sourceGenerators in Assets <+= browserifyTask
resourceDirectories in Assets += browserifyOutputDir.value
My routes file is
# Routes
# This file defines all application routes (Higher priority routes first)
# ~~~~
# An example controller showing a sample home page
GET / controllers.HomeController.index
# An example controller showing how to use dependency injection
GET /count controllers.CountController.count
# Map static resources from the /public folder to the /assets URL path
GET /assets/*file controllers.Assets.versioned(path="/public", file: Asset)
POST /assessment controllers.AsyncController.assessment
#
# Play can't find files in node_modules in 2.5+
#GET /node_modules/*file controllers.NodeModulesController.at(file)
#GET /node_modules/*file controllers.NodeModulesController.at(file: String)
GET /node_modules/*file controllers.NodeModulesController.versioned(file: String)
#GET /node_modules/*file controllers.Assets.versioned(path="/node_modules", file:Asset)
The stack trace is
scala.sys.package$.error(package.scala:27)
sbt.Mapper$$anonfun$fail$1.apply(PathMapper.scala:37)
sbt.Mapper$$anonfun$fail$1.apply(PathMapper.scala:37)
sbt.Alternatives$$anon$1$$anonfun$$bar$1$$anonfun$apply$3.apply(PathMapper.scala:117)
sbt.Alternatives$$anon$1$$anonfun$$bar$1$$anonfun$apply$3.apply(PathMapper.scala:117)
scala.Option.orElse(Option.scala:257)
sbt.Alternatives$$anon$1$$anonfun$$bar$1.apply(PathMapper.scala:117)
sbt.Alternatives$$anon$1$$anonfun$$bar$1.apply(PathMapper.scala:117)
sbt.PathFinder$$anonfun$pair$1.apply(Path.scala:135)
sbt.PathFinder$$anonfun$pair$1.apply(Path.scala:135)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
sbt.PathFinder.pair(Path.scala:135)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9$$anonfun$10$$anonfun$apply$4.apply(SbtJsTask.scala:292)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9$$anonfun$10$$anonfun$apply$4.apply(SbtJsTask.scala:286)
com.typesafe.sbt.web.SbtWeb$.withActorRefFactory(SbtWeb.scala:595)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9$$anonfun$10.apply(SbtJsTask.scala:285)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9$$anonfun$10.apply(SbtJsTask.scala:284)
scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:376)
scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:376)
scala.collection.immutable.Stream$Cons.tail(Stream.scala:1085)
scala.collection.immutable.Stream$Cons.tail(Stream.scala:1077)
scala.collection.immutable.Stream.foldLeft(Stream.scala:563)
scala.concurrent.Future$.sequence(Future.scala:491)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9.apply(SbtJsTask.scala:303)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1$$anonfun$9.apply(SbtJsTask.scala:272)
com.typesafe.sbt.web.incremental.package$.syncIncremental(package.scala:228)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1.apply(SbtJsTask.scala:271)
com.typesafe.sbt.jse.SbtJsTask$$anonfun$jsSourceFileTask$1.apply(SbtJsTask.scala:257)
scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
sbt.std.Transform$$anon$4.work(System.scala:63)
sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
sbt.Execute.work(Execute.scala:237)
sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
You must change two lines in build.sbt:
List(outputFile) to Nil
and
resourceDirectories in Assets += browserifyOutputDir.value
to
unmanagedResources in Assets += baseDirectory.value / "target/web/browserify/main.js"
For example:
browserifyTask := {
  println("Running browserify");
  val outputFile = browserifyOutputDir.value / "main.js"
  browserifyOutputDir.value.mkdirs
  "./node_modules/.bin/browserify -t [ babelify --presets [ es2015 react ] ] app/assets/javascripts/main.jsx -o "+outputFile.getPath !;
  Nil
}
sourceGenerators in Assets <+= browserifyTask
unmanagedResources in Assets += baseDirectory.value / "target/web/browserify/main.js"
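Note that the browserify task still shells out to ./node_modules/.bin/browserify with the babelify transform, so the node packages have to be installed locally first. A sketch, assuming npm and the Babel 6 preset packages implied by --presets [ es2015 react ]:
npm install --save-dev browserify babelify babel-preset-es2015 babel-preset-react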
I ran across the same error message, and an almost identical stack trace, after starting a project from the same git repo: git@github.com:maximebourreau/play-reactjs-es6-seed.git
The latest version of that repo uses Play 2.5 and works fine, so I don't think the Play version is the problem.
I noticed that the error mentioned [sbt-jshint] somewhere, and removing this line from project/plugins.sbt fixed the problem for me:
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.3")

Packer.io fails using puppet provisioner: /usr/bin/puppet: line 3: rvm: command not found

I'm trying to build a Vagrant box file using Packer.io and Puppet.
I have this template as a starting point:
https://github.com/puphpet/packer-templates/tree/master/centos-7-x86_64
I added the Puppet provisioner after the shell provisioner:
{
  "type": "puppet-masterless",
  "manifest_file": "../../puphpet/puppet/site.pp",
  "manifest_dir": "../../puphpet/puppet/nodes",
  "module_paths": [
    "../../puphpet/puppet/modules"
  ],
  "override": {
    "virtualbox-iso": {
      "execute_command": "echo 'vagrant' | {{.FacterVars}}{{if .Sudo}} sudo -S -E bash {{end}}/usr/bin/puppet apply --verbose --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}"
    }
  }
}
When I start building the image like
packer-io build -only=virtualbox-iso template.json
Then I get this error:
==> virtualbox-iso: Provisioning with Puppet...
virtualbox-iso: Creating Puppet staging directory...
virtualbox-iso: Uploading manifest directory from: ../../puphpet/puppet/nodes
virtualbox-iso: Uploading local modules from: ../../puphpet/puppet/modules
virtualbox-iso: Uploading manifests...
virtualbox-iso:
virtualbox-iso: Running Puppet: echo 'vagrant' | sudo -S -E bash /usr/bin/puppet apply --verbose --modulepath='/tmp/packer-puppet-masterless/module-0' --manifestdir='/tmp/packer-puppet-masterless/manifests' --detailed-exitcodes /tmp/packer-puppet-masterless/manifests/site.pp
virtualbox-iso: /usr/bin/puppet: line 3: rvm: command not found
==> virtualbox-iso: Unregistering and deleting virtual machine...
==> virtualbox-iso: Deleting output directory...
Build 'virtualbox-iso' errored: Puppet exited with a non-zero exit status: 127
If I log into the box via tty, I can run both rvm and puppet commands as the vagrant user.
What did I do wrong?
I am trying out the exact same route as you are:
Use the relevant scripts from this repo for provisioning the VM.
Use the puppet scripts from a puphpet.com configuration to further provision the VM using the puppet-masterless provisioner in packer.
Still working on it, not a successful build yet, but I can share the following:
Inspect line 50 of puphpet/shell/install-puppet.sh: the puppet command will trigger rvm to be executed.
Inspect your packer output during provisioning. You will read something along the lines of:
...
Creating alias default for ruby-1.9.3-p551
To start using RVM you need to run `source /usr/local/rvm/scripts/rvm` in all
your open shell windows, in rare cases you need to reopen all shell windows.
Cleaning up rvm archives
....
Apparently the command source /usr/local/rvm/scripts/rvm is needed for each user that needs to run rvm. It is executed and added to the bash profiles by the script puphpet/shell/install-ruby.sh. However, this does not seem to affect the context/scope of packer's puppet-masterless execute_command, which is the reason for the line /usr/bin/puppet: line 3: rvm: command not found in your output.
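Before touching the packer template, you can check this manually inside the guest: sourcing the rvm script in the same shell that runs puppet makes the command resolve again. A rough check, mirroring the sudo/echo pattern of the execute_command from your template:
echo 'vagrant' | sudo -S -E bash -c 'source /usr/local/rvm/scripts/rvm; which puppet; puppet --version'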
My current way forward is the following configuration in template.json (the packer template); the second and third lines will get you beyond the point where you are currently stuck:
{
  "type": "puppet-masterless",
  "prevent_sudo": true,
  "execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\"",
  "manifest_file": "./puphpet/puppet/site.pp",
  "manifest_dir": "./puphpet/puppet",
  "hiera_config_path": "./puphpet/puppet/hiera.yaml",
  "module_paths": [
    "./puphpet/puppet/modules"
  ],
  "facter": {
    "ssh_username": "vagrant",
    "provisioner_type": "virtualbox",
    "vm_target_key": "vagrantfile-local"
  }
},
Note the following things:
Running puppet as the vagrant user will probably not complete provisioning due to permission issues. In that case we need a way to run source /usr/local/rvm/scripts/rvm under sudo so that it affects the scope of the puppet provisioning command.
The puphpet.com output scripts have /vagrant/puphpet hardcoded in their puppet scripts (e.g. the first line of puphpet/puppet/nodes/Apache.pp), so you might need a packer file provisioner to upload it to your VM before you execute puppet-masterless, in order for it to find the dependencies in /vagrant/.... My packer.json conf for this:
{
  "type": "shell",
  "execute_command": "sudo bash '{{.Path}}'",
  "inline": [
    "mkdir /vagrant",
    "chown -R vagrant:vagrant /vagrant"
  ]
},
{
  "type": "file",
  "source": "./puphpet",
  "destination": "/vagrant"
},
Puppet will need some Facter variables as they are expected in the puphpet/puppet/nodes/*.pp scripts. Refer to my template.json above.
As said, no success with a complete puppet provisioning yet on my side, but the above got me beyond the point where you are currently stuck. Hope it helps.
Update:
I replaced my old execute_command for the puppet provisioner
"execute_command": "source /usr/local/rvm/scripts/rvm && {{.FacterVars}}{{if .Sudo}} sudo -E{{end}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}"
with a new one
"execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\""
This will ensure puppet (rvm) is running as root and finishes provisioning successfully.
As an alternative to my other answer, I hereby provide my steps and configuration to get this provisioning scenario working with packer & puphpet.
Assuming the following to be in place:
./: a local directory acting as your own repository, containing:
./ops/: a directory inside it which holds the packer scripts and required files
./ops/template.json: the packer template used to build the VM
./ops/template.json expects the following to be in place:
./ops/packer-templates/: a clone of this repo
./ops/ubuntu-14.04.2-server-amd64.iso: the iso for the ubuntu you want to have running in your vm
./puphpet: the output of walking through the configuration steps on puphpet.com (so this is one level up from ops)
The contents of template.json:
{
  "variables": {
    "ssh_name": "vagrant",
    "ssh_pass": "vagrant",
    "local_packer_templates_dir": "./packer-templates/ubuntu-14.04-x86_64",
    "local_puphput_dir": "../puphpet",
    "local_repo_dir": "../",
    "repo_upload_dir": "/vagrant"
  },
  "builders": [
    {
      "name": "ubuntu-14.04.amd64.virtualbox",
      "type": "virtualbox-iso",
      "headless": false,
      "boot_command": [
        "<esc><esc><enter><wait>",
        "/install/vmlinuz noapic preseed/url=http://{{ .HTTPIP }}:{{ .HTTPPort }}/preseed.cfg ",
        "debian-installer=en_US auto locale=en_US kbd-chooser/method=us ",
        "hostname={{ .Name }} ",
        "fb=false debconf/frontend=noninteractive ",
        "keyboard-configuration/modelcode=SKIP keyboard-configuration/layout=USA keyboard-configuration/variant=USA console-setup/ask_detect=false ",
        "initrd=/install/initrd.gz -- <enter>"
      ],
      "boot_wait": "10s",
      "disk_size": 20480,
      "guest_os_type": "Ubuntu_64",
      "http_directory": "{{user `local_packer_templates_dir`}}/http",
      "iso_checksum": "83aabd8dcf1e8f469f3c72fff2375195",
      "iso_checksum_type": "md5",
      "iso_url": "./ubuntu-14.04.2-server-amd64.iso",
      "ssh_username": "{{user `ssh_name`}}",
      "ssh_password": "{{user `ssh_pass`}}",
      "ssh_port": 22,
      "ssh_wait_timeout": "10000s",
      "shutdown_command": "echo '/sbin/halt -h -p' > shutdown.sh; echo '{{user `ssh_pass`}}'|sudo -S bash 'shutdown.sh'",
      "guest_additions_path": "VBoxGuestAdditions_{{.Version}}.iso",
      "virtualbox_version_file": ".vbox_version",
      "vboxmanage": [
        ["modifyvm", "{{.Name}}", "--memory", "2048"],
        ["modifyvm", "{{.Name}}", "--cpus", "4"]
      ]
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "execute_command": "echo '{{user `ssh_pass`}}'|sudo -S bash '{{.Path}}'",
      "scripts": [
        "{{user `local_packer_templates_dir`}}/scripts/base.sh",
        "{{user `local_packer_templates_dir`}}/scripts/virtualbox.sh",
        "{{user `local_packer_templates_dir`}}/scripts/vagrant.sh",
        "{{user `local_packer_templates_dir`}}/scripts/puphpet.sh",
        "{{user `local_packer_templates_dir`}}/scripts/cleanup.sh",
        "{{user `local_packer_templates_dir`}}/scripts/zerodisk.sh"
      ]
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "mkdir {{user `repo_upload_dir`}}",
        "chown -R vagrant:vagrant {{user `repo_upload_dir`}}"
      ]
    },
    {
      "type": "file",
      "source": "{{user `local_repo_dir`}}",
      "destination": "{{user `repo_upload_dir`}}"
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "rm -fR {{user `repo_upload_dir`}}/.vagrant",
        "rm -fR {{user `repo_upload_dir`}}/ops"
      ]
    },
    {
      "type": "puppet-masterless",
      "execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\"",
      "manifest_file": "{{user `local_puphput_dir`}}/puppet/site.pp",
      "manifest_dir": "{{user `local_puphput_dir`}}/puppet",
      "hiera_config_path": "{{user `local_puphput_dir`}}/puppet/hiera.yaml",
      "module_paths": [
        "{{user `local_puphput_dir`}}/puppet/modules"
      ],
      "facter": {
        "ssh_username": "{{user `ssh_name`}}",
        "provisioner_type": "virtualbox",
        "vm_target_key": "vagrantfile-local"
      }
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "echo '{{user `repo_upload_dir`}}/puphpet' > '/.puphpet-stuff/vagrant-core-folder.txt'",
        "sudo bash {{user `repo_upload_dir`}}/puphpet/shell/important-notices.sh"
      ]
    }
  ],
  "post-processors": [
    {
      "type": "vagrant",
      "output": "./build/{{.BuildName}}.box",
      "compression_level": 9
    }
  ]
}
Narration of what happens:
execute the basic provisioning of the VM using the scripts that are used to build puphpet boxes (first shell provisioner block)
create a directory /vagrant in the VM and set permissions for vagrant user
upload local repository to /vagrant (important as puphpet/puppet expects it to exist at that location in its scripts)
remove some unneeded stuff from /vagrant after upload
start puppet provisioner with custom execute_command and facter configuration
process the remaining provisioning scripts (to be extended with the exec once/always and start once/always files)
Note: you might need to prepare some more things before the puppet provisioner kicks off. For example, I need a directory in place that will be the docroot of an Apache vhost. Use shell provisioning to complete the template for your own puphpet configuration.
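With those files in place I start the build from the ops directory, the same way as in your question (the binary may be called packer instead of packer-io depending on how it was installed):
cd ops
packer-io build template.json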

Using translations of Behat predefined steps (Phar install)

I've run some tests with the predefined step definitions of Mink Extension. They work as long as they're in English.
Now I've tried the following scenario with German steps:
# language: de
Funktionalität: Demo

  @javascript
  Szenario: Test 1
    Angenommen I am on "/"
    Angenommen ich bin auf "/"
...
Behat now tells me that the German step definition is undefined, while the English version works.
According to the CLI help, behat --lang de -dl should display the translated definitions, but it only shows me the English ones...
What am I doing wrong here?
Edit:
Here's a script to rebuild the scenario. It follows the install steps from the docs (http://extensions.behat.org/mink/#through-phar) in a temporary directory and runs the test feature file.
#!/bin/bash
set -e
TEMPDIR=/tmp/behat-$$
mkdir $TEMPDIR
cd $TEMPDIR
curl http://behat.org/downloads/behat.phar >behat.phar
curl http://behat.org/downloads/mink.phar >mink.phar
curl http://behat.org/downloads/mink_extension.phar >mink_extension.phar
cat >behat.yml <<EOF
default:
  extensions:
    mink_extension.phar:
      mink_loader: 'mink.phar'
      base_url: 'http://behat.org'
      goutte: ~
EOF
mkdir features
cat >features/test.feature <<EOF
# language: de
Funktionalität: Demo

  Szenario: Öffne Startseite DE + EN
    Angenommen I am on "/"
    Angenommen ich bin auf "/"
EOF
php behat.phar
Basically you didn't do anything wrong.
Although the translation of Behat/Gherkin itself is included in the behat.phar file, the translations of the step definitions from MinkExtension are missing in the mink_extension.phar archive.
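You can confirm this yourself by listing the archive contents; a quick sketch, assuming PHP with the Phar extension and the phar in the current directory (the grep should come back empty):
php -r 'foreach (new RecursiveIteratorIterator(new Phar("mink_extension.phar")) as $f) echo $f->getPathname(), PHP_EOL;' | grep i18n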
This seems to be the case because the build script only includes the files in MinkExtension/src/ but not MinkExtension/i18n/. You could open an issue for MinkExtension to get this fixed.
As a workaround I suggest installing Behat/Mink using Composer instead of working with phar archives.
Create the following composer.json file:
{
  "require": {
    "behat/behat": "2.4.*@stable",
    "behat/mink": "1.4.*@stable",
    "behat/mink-extension": "*",
    "behat/mink-goutte-driver": "*",
    "behat/mink-selenium2-driver": "*"
  },
  "minimum-stability": "dev",
  "config": {
    "bin-dir": "bin/"
  }
}
and then install it with:
curl http://getcomposer.org/installer | php
php composer.phar install
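After the Composer install, the locally installed binary should also list the translated definitions that the phar setup was missing:
bin/behat --lang de -dl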