How to install QUANDL on AWS EC2? - pandas

When I issue the following command
pip install Quandle
I get the following message:
Command "/usr/bin/python2.7 -u -c "import setuptools, tokenize;__file__='/tmp/pi
p-build-7Q6V28/cryptography/setup.py';f=getattr(tokenize, 'open', open)(__file__
);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'ex
ec'))" install --record /tmp/pip-YTWhP8-record/install-record.txt --single-versi
on-externally-managed --compile" failed with error code 1 in /tmp/pip-build-7Q6V
28/cryptography/

The GitHub project page says you should use the command
pip install quandl
to install.
Are you accidentally using an uppercase "Q" in your command?
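If the package name was indeed the issue, a minimal shell session to install and verify would look like the sketch below (assuming pip points at the same Python 2.7 shown in the error above):
# note the all-lowercase package name on PyPI
pip install quandl
# quick sanity check; the names printed should match the dir() output shown further below
python -c "import quandl; print(dir(quandl))"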

It is finally resolved: some folders needed to be created, and once we created them manually it worked.
Just to test whether it is working, we ran the following code and it worked:
=======================================================================
Test code :
import quandl
print(dir(quandl))
Output :
['ApiConfig', 'AuthenticationError', 'ColumnNotFound', 'Data', 'Database', 'Dataset', 'Datatable', 'ForbiddenError', 'InternalServerError', 'InvalidDataError', 'InvalidRequestError', 'LimitExceededError', 'MergedDataset', 'NotFoundError', 'QuandlError', 'ServiceUnavailableError', '__builtins__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', 'api_config', 'bulkdownload', 'connection', 'errors', 'get', 'get_table', 'message', 'model', 'operations', 'util', 'utils', 'version']

Related

React-native-windows installation

I tried to install react-native-windows. I ran into trouble with the command:
npx react-native-windows-init --overwrite
NoLatestReactNativeWindows: Error: No version of react-native-windows#latest found
at getLatestRNWVersion (C:\Users\XXXX\AppData\Local\npm-cache\_npx\966c6a96be6f5a32\node_modules\react-native-windows-init\src\Cli.ts:249:11)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Object.reactNativeWindowsInit (C:\Users\XXXX\AppData\Local\npm-cache\_npx\966c6a96be6f5a32\node_modules\react-native-windows-init\src\Cli.ts:533:36) {
type: 'NoLatestReactNativeWindows',
data: undefined
}
Command failed. Re-run the command with --verbose for more information.

Cannot start a new React Native CLI project in WebStorm

The react-native-cli is generating the project fine when I use this command in terminal:
react-native init myapp
But for some reason WebStorm cannot create a new React Native project. When I try, I get this error:
/home/sagar/.nvm/versions/node/v8.9.1/bin/node
/home/sagar/.nvm/versions/node/v8.9.1/lib/node_modules/react-native-cli/index.js init bookip
/bin/sh: 1: npm: not found
This will walk you through creating a new React Native project in /tmp/1512313270685-0/bookip
Installing react-native...
Consider installing yarn to make this faster: https://yarnpkg.com
{ Error: Command failed: npm install --save --save-exact react-native
at checkExecSyncError (child_process.js:601:13)
at execSync (child_process.js:641:13)
at run (/home/sagar/.nvm/versions/node/v8.9.1/lib/node_modules/react-native-cli/index.js:294:5)
at createProject (/home/sagar/.nvm/versions/node/v8.9.1/lib/node_modules/react-native-cli/index.js:249:3)
at init (/home/sagar/.nvm/versions/node/v8.9.1/lib/node_modules/react-native-cli/index.js:200:5)
at Object.<anonymous> (/home/sagar/.nvm/versions/node/v8.9.1/lib/node_modules/react-native-cli/index.js:153:7)
at Module._compile (module.js:635:30)
at Object.Module._extensions..js (module.js:646:10)
at Module.load (module.js:554:32)
at tryModuleLoad (module.js:497:12)
error: null,
cmd: 'npm install --save --save-exact react-native',
file: '/bin/sh',
args:
[ '/bin/sh',
'-c',
'npm install --save --save-exact react-native' ],
options:
{ stdio: [ [Object], [Object], [Object] ],
shell: true,
file: '/bin/sh',
args:
[ '/bin/sh',
'-c',
'npm install --save --save-exact react-native' ],
envPairs:
[ 'PATH=/home/sagar/bin:/home/sagar/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin',
'LC_MEASUREMENT=ne_NP',
'XAUTHORITY=/home/sagar/.Xauthority',
'XMODIFIERS=#im=ibus',
'LC_TELEPHONE=ne_NP',
'XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share:/usr/share:/var/lib/snapd/desktop:/var/lib/snapd/desktop',
'GDMSESSION=ubuntu',
'MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path',
'LC_TIME=ne_NP',
'GTK_IM_MODULE=ibus',
'DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-Sq2eReOVZr',
'DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path',
'XDG_CURRENT_DESKTOP=Unity',
'LD_LIBRARY_PATH=/home/sagar/apps/WebStorm-172.3757.55/bin:',
'UPSTART_SESSION=unix:abstract=/com/ubuntu/upstart-session/1000/1240',
'QT4_IM_MODULE=xim',
'LC_PAPER=ne_NP',
'QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1',
'LOGNAME=sagar',
'JOB=unity-settings-daemon',
'PWD=/tmp/1512313270685-0',
'IM_CONFIG_PHASE=1',
'LANGUAGE=en_US',
'SHELL=/bin/bash',
'LC_ADDRESS=ne_NP',
'GIO_LAUNCHED_DESKTOP_FILE=/home/sagar/.local/share/applications/jetbrains-webstorm.desktop',
'GTK2_MODULES=overlay-scrollbar',
'INSTANCE=',
'OLDPWD=/home/sagar/apps/WebStorm-172.3757.55/bin',
'GNOME_DESKTOP_SESSION_ID=this-is-deprecated',
'UPSTART_INSTANCE=',
'GTK_MODULES=gail:atk-bridge:unity-gtk-module',
'CLUTTER_IM_MODULE=xim',
'XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0',
'COMPIZ_BIN_PATH=/usr/bin/',
'SESSIONTYPE=gnome-session',
'XDG_SESSION_DESKTOP=ubuntu',
'SHLVL=0',
'LC_IDENTIFICATION=ne_NP',
'LC_MONETARY=ne_NP',
'COMPIZ_CONFIG_PROFILE=ubuntu',
'QT_IM_MODULE=ibus',
'UPSTART_JOB=unity7',
'JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64',
'XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/usr/share/upstart/xdg:/etc/xdg',
'LANG=en_US.UTF-8',
'GNOME_KEYRING_CONTROL=',
'XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0',
'XDG_SESSION_ID=c2',
'XDG_SESSION_TYPE=x11',
'DISPLAY=:0',
'LC_NAME=ne_NP',
'GDM_LANG=en_US',
'XDG_GREETER_DATA_DIR=/var/lib/lightdm-data/sagar',
'UPSTART_EVENTS=xsession started',
'GPG_AGENT_INFO=/home/sagar/.gnupg/S.gpg-agent:0:1',
'DESKTOP_SESSION=ubuntu',
'SESSION=ubuntu',
'USER=sagar',
'XDG_MENU_PREFIX=gnome-',
'GIO_LAUNCHED_DESKTOP_FILE_PID=20103',
'QT_ACCESSIBILITY=1',
'LC_NUMERIC=ne_NP',
'SSH_AUTH_SOCK=/run/user/1000/keyring/ssh',
'XDG_SEAT=seat0',
'QT_QPA_PLATFORMTHEME=appmenu-qt5',
'XDG_VTNR=7',
'XDG_RUNTIME_DIR=/run/user/1000',
'HOME=/home/sagar',
'GNOME_KEYRING_PID=' ],
killSignal: undefined },
envPairs:
[ 'PATH=/home/sagar/bin:/home/sagar/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin',
'LC_MEASUREMENT=ne_NP',
'XAUTHORITY=/home/sagar/.Xauthority',
'XMODIFIERS=#im=ibus',
'LC_TELEPHONE=ne_NP',
'XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share:/usr/share:/var/lib/snapd/desktop:/var/lib/snapd/desktop',
'GDMSESSION=ubuntu',
'MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path',
'LC_TIME=ne_NP',
'GTK_IM_MODULE=ibus',
'DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-Sq2eReOVZr',
'DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path',
'XDG_CURRENT_DESKTOP=Unity',
'LD_LIBRARY_PATH=/home/sagar/apps/WebStorm-172.3757.55/bin:',
'UPSTART_SESSION=unix:abstract=/com/ubuntu/upstart-session/1000/1240',
'QT4_IM_MODULE=xim',
'LC_PAPER=ne_NP',
'QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1',
'LOGNAME=sagar',
'JOB=unity-settings-daemon',
'PWD=/tmp/1512313270685-0',
'IM_CONFIG_PHASE=1',
'LANGUAGE=en_US',
'SHELL=/bin/bash',
'LC_ADDRESS=ne_NP',
'GIO_LAUNCHED_DESKTOP_FILE=/home/sagar/.local/share/applications/jetbrains-webstorm.desktop',
'GTK2_MODULES=overlay-scrollbar',
'INSTANCE=',
'OLDPWD=/home/sagar/apps/WebStorm-172.3757.55/bin',
'GNOME_DESKTOP_SESSION_ID=this-is-deprecated',
'UPSTART_INSTANCE=',
'GTK_MODULES=gail:atk-bridge:unity-gtk-module',
'CLUTTER_IM_MODULE=xim',
'XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0',
'COMPIZ_BIN_PATH=/usr/bin/',
'SESSIONTYPE=gnome-session',
'XDG_SESSION_DESKTOP=ubuntu',
'SHLVL=0',
'LC_IDENTIFICATION=ne_NP',
'LC_MONETARY=ne_NP',
'COMPIZ_CONFIG_PROFILE=ubuntu',
'QT_IM_MODULE=ibus',
'UPSTART_JOB=unity7',
'JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64',
'XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/usr/share/upstart/xdg:/etc/xdg',
'LANG=en_US.UTF-8',
'GNOME_KEYRING_CONTROL=',
'XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0',
'XDG_SESSION_ID=c2',
'XDG_SESSION_TYPE=x11',
'DISPLAY=:0',
'LC_NAME=ne_NP',
'GDM_LANG=en_US',
'XDG_GREETER_DATA_DIR=/var/lib/lightdm-data/sagar',
'UPSTART_EVENTS=xsession started',
'GPG_AGENT_INFO=/home/sagar/.gnupg/S.gpg-agent:0:1',
'DESKTOP_SESSION=ubuntu',
'SESSION=ubuntu',
'USER=sagar',
'XDG_MENU_PREFIX=gnome-',
'GIO_LAUNCHED_DESKTOP_FILE_PID=20103',
'QT_ACCESSIBILITY=1',
'LC_NUMERIC=ne_NP',
'SSH_AUTH_SOCK=/run/user/1000/keyring/ssh',
'XDG_SEAT=seat0',
'QT_QPA_PLATFORMTHEME=appmenu-qt5',
'XDG_VTNR=7',
'XDG_RUNTIME_DIR=/run/user/1000',
'HOME=/home/sagar',
'GNOME_KEYRING_PID=' ],
stderr: null,
stdout: null,
pid: 20527,
output: [ null, null, null ],
signal: null,
status: 127 }
Command `npm install --save --save-exact react-native` failed.
Done
It says npm not found when I know it is there.
I have Node 8.9.1 & npm 5.5.1
Looks like npm is not on your $PATH. Do you launch WebStorm from a terminal, or from the desktop/system menu? In the latter case, WebStorm only sees environment variables configured in .profile (login shell), but not those set in interactive shell configuration files (like ~/.bashrc). Plus, NVM alters interactive shell configuration files only during its installation phase (https://github.com/creationix/nvm/blob/v0.28.0/install.sh#L126).
Possible workarounds:
Workaround 1: make the required variables available in a login shell (i.e. for Bash, move them from .bashrc to .bash_profile or .profile); see the sketch after this list.
Workaround 2: run the IDE from a terminal.
Workaround 3: edit the WebStorm desktop launcher and set the command to /bin/bash -l -i -c "/path/to/webstorm.sh" (for bash).
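A minimal sketch of workaround 1, assuming a default NVM installation in ~/.nvm and Bash as the login shell (adjust the file and paths to your setup):
# append the standard NVM initialization to a login-shell file so that
# GUI-launched applications (like WebStorm) inherit the node/npm PATH entries
cat >> ~/.profile <<'EOF'
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && . "$NVM_DIR/nvm.sh"
EOF

# verify that a login shell now resolves npm (log out and back in for GUI launchers)
bash -lc 'which npm'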
Try pointing to your /usr/local/bin/node folder. These are my settings with WebStorm 2017.3, and it worked for me.
Consider installing yarn to make this faster: https://yarnpkg.com
After yarn was installed, the error was gone.

Packer.io fails using puppet provisioner: /usr/bin/puppet: line 3: rvm: command not found

I'm trying to build a Vagrant box file using Packer.io and Puppet.
I have this template as a starting point:
https://github.com/puphpet/packer-templates/tree/master/centos-7-x86_64
I added the Puppet provisioner after the shell provisioner:
{
  "type": "puppet-masterless",
  "manifest_file": "../../puphpet/puppet/site.pp",
  "manifest_dir": "../../puphpet/puppet/nodes",
  "module_paths": [
    "../../puphpet/puppet/modules"
  ],
  "override": {
    "virtualbox-iso": {
      "execute_command": "echo 'vagrant' | {{.FacterVars}}{{if .Sudo}} sudo -S -E bash {{end}}/usr/bin/puppet apply --verbose --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}"
    }
  }
}
When I start building the image like
packer-io build -only=virtualbox-iso template.json
Then I get this error:
==> virtualbox-iso: Provisioning with Puppet...
virtualbox-iso: Creating Puppet staging directory...
virtualbox-iso: Uploading manifest directory from: ../../puphpet/puppet/nodes
virtualbox-iso: Uploading local modules from: ../../puphpet/puppet/modules
virtualbox-iso: Uploading manifests...
virtualbox-iso:
virtualbox-iso: Running Puppet: echo 'vagrant' | sudo -S -E bash /usr/bin/puppet apply --verbose --modulepath='/tmp/packer-puppet-masterless/module-0' --manifestdir='/tmp/packer-puppet-masterless/manifests' --detailed-exitcodes /tmp/packer-puppet-masterless/manifests/site.pp
virtualbox-iso: /usr/bin/puppet: line 3: rvm: command not found
==> virtualbox-iso: Unregistering and deleting virtual machine...
==> virtualbox-iso: Deleting output directory...
Build 'virtualbox-iso' errored: Puppet exited with a non-zero exit status: 127
If I log into the box via tty, I can run both rvm and puppet commands as the vagrant user.
What did I do wrong?
I am trying out the exact same route as you are:
Use the relevant scripts for provisioning the vm from this repo.
Use the puppet scripts from a puphpet.com configuration to further provision the vm using the puppet-masterless provisioner in packer.
Still working on it, no successful build yet, but I can share the following:
Inspect line 50 of puphpet/shell/install-puppet.sh: the puppet command will trigger rvm to be executed.
Inspect your packer output during provisioning. You will read something along the lines of:
...
Creating alias default for ruby-1.9.3-p551
To start using RVM you need to run `source /usr/local/rvm/scripts/rvm` in all
your open shell windows, in rare cases you need to reopen all shell windows.
Cleaning up rvm archives
....
Apparently the command source /usr/local/rvm/scripts/rvm is needed for each user that needs to run rvm. It is executed and added to the bash profiles by the script puphpet/shell/install-ruby.sh. However, this does not seem to affect the context/scope of packer's puppet-masterless execute_command, which is the reason for the line /usr/bin/puppet: line 3: rvm: command not found in your output.
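For illustration only, a hypothetical reconstruction of the /usr/bin/puppet wrapper that install-puppet.sh generates (the exact contents are not reproduced here, but the error message tells us line 3 invokes rvm, which is not on PATH in packer's non-interactive provisioning shell):
#!/bin/bash
# hypothetical wrapper: runs puppet inside the system rvm environment
rvm default do puppet "$@"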
My current way forward is the following configuration in template.json (the packer template); the second and third lines will help you get beyond the point where you are currently stuck:
{
  "type": "puppet-masterless",
  "prevent_sudo": true,
  "execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\"",
  "manifest_file": "./puphpet/puppet/site.pp",
  "manifest_dir": "./puphpet/puppet",
  "hiera_config_path": "./puphpet/puppet/hiera.yaml",
  "module_paths": [
    "./puphpet/puppet/modules"
  ],
  "facter": {
    "ssh_username": "vagrant",
    "provisioner_type": "virtualbox",
    "vm_target_key": "vagrantfile-local"
  }
},
Note the following things:
Probably running puppet as the vagrant user will not complete provisioning due to permission issues. In that case we need a way to run source /usr/local/rvm/scripts/rvm under sudo so that it affects the scope of the puppet provisioning command.
The puphpet.com output scripts have /vagrant/puphpet hardcoded in their puppet scripts (e.g. the first line of puphpet/puppet/nodes/Apache.pp). So you might need a packer file provisioner to upload files to your vm before you execute puppet-masterless, in order for it to find the dependencies in /vagrant/.... My packer.json conf for this:
{
  "type": "shell",
  "execute_command": "sudo bash '{{.Path}}'",
  "inline": [
    "mkdir /vagrant",
    "chown -R vagrant:vagrant /vagrant"
  ]
},
{
  "type": "file",
  "source": "./puphpet",
  "destination": "/vagrant"
},
Puppet will need some Facter variables, as they are expected by the puphpet/puppet/nodes/*.pp scripts. Refer to my template.json above.
As said, no success with a complete puppet provisioning run yet on my side, but the above got me beyond the point where you are currently stuck. Hope it helps.
Update:
I replaced my old execute_command for the puppet provisioner
"execute_command": "source /usr/local/rvm/scripts/rvm && {{.FacterVars}}{{if .Sudo}} sudo -E{{end}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}"
with a new one
"execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\""
This will ensure puppet (rvm) runs as root and provisioning finishes successfully.
As an alternative to my other answer, I hereby provide my steps and configuration to get this provisioning scenario working with packer & puphpet.
Assuming the following to be in place:
./: a local directory acting as your own repository, containing
./ops/: a directory inside it which holds the packer scripts and required files
./ops/template.json: the packer template used to build the VM
./ops/template.json expects the following is in place:
./ops/packer-templates/: a clone of this repo
./ops/ubuntu-14.04.2-server-amd64.iso: the iso for the ubuntu you want to have running in your vm
./puphpet: the output of walking through the configuration steps on puphpet.com (so this is one level up from ops)
The contents of template.json:
{
  "variables": {
    "ssh_name": "vagrant",
    "ssh_pass": "vagrant",
    "local_packer_templates_dir": "./packer-templates/ubuntu-14.04-x86_64",
    "local_puphput_dir": "../puphpet",
    "local_repo_dir": "../",
    "repo_upload_dir": "/vagrant"
  },
  "builders": [
    {
      "name": "ubuntu-14.04.amd64.virtualbox",
      "type": "virtualbox-iso",
      "headless": false,
      "boot_command": [
        "<esc><esc><enter><wait>",
        "/install/vmlinuz noapic preseed/url=http://{{ .HTTPIP }}:{{ .HTTPPort }}/preseed.cfg ",
        "debian-installer=en_US auto locale=en_US kbd-chooser/method=us ",
        "hostname={{ .Name }} ",
        "fb=false debconf/frontend=noninteractive ",
        "keyboard-configuration/modelcode=SKIP keyboard-configuration/layout=USA keyboard-configuration/variant=USA console-setup/ask_detect=false ",
        "initrd=/install/initrd.gz -- <enter>"
      ],
      "boot_wait": "10s",
      "disk_size": 20480,
      "guest_os_type": "Ubuntu_64",
      "http_directory": "{{user `local_packer_templates_dir`}}/http",
      "iso_checksum": "83aabd8dcf1e8f469f3c72fff2375195",
      "iso_checksum_type": "md5",
      "iso_url": "./ubuntu-14.04.2-server-amd64.iso",
      "ssh_username": "{{user `ssh_name`}}",
      "ssh_password": "{{user `ssh_pass`}}",
      "ssh_port": 22,
      "ssh_wait_timeout": "10000s",
      "shutdown_command": "echo '/sbin/halt -h -p' > shutdown.sh; echo '{{user `ssh_pass`}}'|sudo -S bash 'shutdown.sh'",
      "guest_additions_path": "VBoxGuestAdditions_{{.Version}}.iso",
      "virtualbox_version_file": ".vbox_version",
      "vboxmanage": [
        ["modifyvm", "{{.Name}}", "--memory", "2048"],
        ["modifyvm", "{{.Name}}", "--cpus", "4"]
      ]
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "execute_command": "echo '{{user `ssh_pass`}}'|sudo -S bash '{{.Path}}'",
      "scripts": [
        "{{user `local_packer_templates_dir`}}/scripts/base.sh",
        "{{user `local_packer_templates_dir`}}/scripts/virtualbox.sh",
        "{{user `local_packer_templates_dir`}}/scripts/vagrant.sh",
        "{{user `local_packer_templates_dir`}}/scripts/puphpet.sh",
        "{{user `local_packer_templates_dir`}}/scripts/cleanup.sh",
        "{{user `local_packer_templates_dir`}}/scripts/zerodisk.sh"
      ]
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "mkdir {{user `repo_upload_dir`}}",
        "chown -R vagrant:vagrant {{user `repo_upload_dir`}}"
      ]
    },
    {
      "type": "file",
      "source": "{{user `local_repo_dir`}}",
      "destination": "{{user `repo_upload_dir`}}"
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "rm -fR {{user `repo_upload_dir`}}/.vagrant",
        "rm -fR {{user `repo_upload_dir`}}/ops"
      ]
    },
    {
      "type": "puppet-masterless",
      "execute_command": "{{if .Sudo}}sudo -E {{end}}bash -c \"source /usr/local/rvm/scripts/rvm; {{.FacterVars}} puppet apply --verbose --parser future --modulepath='{{.ModulePath}}' {{if ne .HieraConfigPath \"\"}}--hiera_config='{{.HieraConfigPath}}' {{end}} {{if ne .ManifestDir \"\"}}--manifestdir='{{.ManifestDir}}' {{end}} --detailed-exitcodes {{.ManifestFile}}\"",
      "manifest_file": "{{user `local_puphput_dir`}}/puppet/site.pp",
      "manifest_dir": "{{user `local_puphput_dir`}}/puppet",
      "hiera_config_path": "{{user `local_puphput_dir`}}/puppet/hiera.yaml",
      "module_paths": [
        "{{user `local_puphput_dir`}}/puppet/modules"
      ],
      "facter": {
        "ssh_username": "{{user `ssh_name`}}",
        "provisioner_type": "virtualbox",
        "vm_target_key": "vagrantfile-local"
      }
    },
    {
      "type": "shell",
      "execute_command": "sudo bash '{{.Path}}'",
      "inline": [
        "echo '{{user `repo_upload_dir`}}/puphpet' > '/.puphpet-stuff/vagrant-core-folder.txt'",
        "sudo bash {{user `repo_upload_dir`}}/puphpet/shell/important-notices.sh"
      ]
    }
  ],
  "post-processors": [
    {
      "type": "vagrant",
      "output": "./build/{{.BuildName}}.box",
      "compression_level": 9
    }
  ]
}
Narration of what happens:
execute the basic provisioning of the VM using the scripts that are used to build puphpet boxes (first shell provisioner block)
create a directory /vagrant in the VM and set permissions for vagrant user
upload local repository to /vagrant (important as puphpet/puppet expects it to exist at that location in its scripts)
remove some unneeded stuff from /vagrant after upload
start puppet provisioner with custom execute_command and facter configuration
process the remaining provisioning scripts (to be extended with the exec once/always and start once/always files)
Note: you might need to prepare some more things before the puppet provisioner kicks off. E.g. I need a directory in place that will be the docroot of a vhost in apache. Use shell provisioning to complete the template for your own puphpet configuration.
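With that directory layout in place, kicking off the build from ./ops is then just a matter of running packer against the template; a sketch (the binary may be called packer-io on some distributions, as in the question):
cd ops
# build only the VirtualBox box defined above; the name comes from the builders section of template.json
packer build -only=ubuntu-14.04.amd64.virtualbox template.json
# the resulting .box ends up in ./build/ as configured in the vagrant post-processor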

Deploying a sails.js app via puppet fails on npm install

I'm deploying a sails.js app via puppet, and here is my manifest:
class sails {
  include apt

  class { 'nodejs':
    manage_repo => true,
  }

  vcsrepo { '/var/www':
    ensure   => latest,
    provider => git,
    source   => 'https://www.gitrepo.com',
    revision => 'master',
    require  => Class['nodejs'],
  }

  exec { 'npm install':
    path    => '/usr/bin/',
    cwd     => '/var/www/',
    require => [Vcsrepo['/var/www'], Class['nodejs']],
  }
}
All goes well until the npm install; it gets through most dependencies, but then this happens:
Notice: /Stage[main]/sails/Exec[npm install]/returns: npm http 200 https://registry.npmjs.org/ws/-/ws-0.4.31.tgz
Notice: /Stage[main]/sails/Exec[npm install]/returns:
Notice: /Stage[main]/sails/Exec[npm install]/returns: > bson#0.1.8 install /var/www/node_modules/sails/node_modules/connect-mongo/node_modules/mongodb/node_modules/bson
Notice: /Stage[main]/sails/Exec[npm install]/returns: > (node-gyp rebuild 2> builderror.log) || (exit 0)
Notice: /Stage[main]/sails/Exec[npm install]/returns:
Notice: /Stage[main]/sails/Exec[npm install]/returns: execvp(): No such file or directory
Any ideas on why this is? If I run npm install manually in /var/www it works fine.
I had a similar problem because I hadn't installed git.
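If that is the cause here too, a quick check and fix on the box might look like this (assuming a Debian/Ubuntu guest, since the manifest uses the apt module):
# node-gyp shells out to external tools during `npm install`;
# a missing binary such as git surfaces as the execvp() error above
which git || { sudo apt-get update && sudo apt-get install -y git; }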
I had the same problem. In the end I was able to fix it with:
sudo -i sh -c "cd ${app_directory}; npm install"
As per the discussion here:
https://github.com/TooTallNate/node-gyp/issues/115#issuecomment-7386287

Vagrant / Chef-solo not working after running recipe rvm::vagrant

After adding recipe rvm::vagrant and running vagrant provision I got:
/usr/local/bin/chef-solo: line 23: /opt/vagrant_ruby/bin/chef-solo: No such file or directory
Chef never successfully completed! Any errors should be visible in the output above. Please fix your recipes so that they properly complete.
This issue should have been fixed:
https://github.com/fnichol/chef-rvm/issues/121
Even though I added the lines:
'rvm' => {
  'vagrant' => {
    'system_chef_solo' => '/opt/vagrant_ruby/bin/chef-solo'
  }
}
I am still getting the error. How can I recover from it?
You have to make sure that '/opt/vagrant_ruby/bin/chef-solo' is the actual path of chef-solo. In my case it was /usr/bin/chef-solo. This is the part of my Vagrantfile that fixed it:
config.vm.provision :chef_solo do |chef|
  chef.json.merge! rvm: {vagrant: {system_chef_solo: '/usr/bin/chef-solo'}}
end
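To find the path to put there before hard-coding it, something along these lines works from the host (assuming the VM is already up):
# ask the guest where chef-solo actually lives
vagrant ssh -c 'which chef-solo'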
This has been frustrating, as answers are very time-dependent. I have had this issue too and fixed it today by adding the following to my Vagrantfile. I had to vagrant destroy and vagrant up again (the commands are shown after the configuration below), so hopefully this doesn't break the next time I vagrant provision, but I have checked in the box and it looks like the path to chef-solo is correct. I found this answer on GitHub.
chef.json = {
  rvm: {
    vagrant: {
      system_chef_solo: '/opt/chef/bin/chef-solo'
    },
    user_installs: [
      {
        user: 'vagrant',
        default_ruby: '2.2.1',
        rubies: ['2.2.1'],
        global: '2.2.1'
      }
    ]
  },
  ... rest of Vagrantfile
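For completeness, the rebuild cycle mentioned above (destroying and re-creating the box so the new chef.json takes effect) is just:
# throw away the current VM and build it again from scratch
vagrant destroy -f
vagrant up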