In the root user's crontab I added the following job:
*/1 * * * * /usr/local/bin/forever start /root/MyCode/server.js >> /root/ou1.log 2>&1
I am getting the following error:
info: Forever processing file: /root/MyCode/server.js
/usr/local/lib/node_modules/forever/lib/forever.js:419
monitor.send(JSON.stringify(options));
^
TypeError: Object # has no method 'send'
at Object.startDaemon (/usr/local/lib/node_modules/forever/lib/forever.js:419:11)
at /usr/local/lib/node_modules/forever/lib/forever/cli.js:258:13
at /usr/local/lib/node_modules/forever/lib/forever/cli.js:145:5
at Object.oncomplete (/usr/local/lib/node_modules/forever/lib/forever.js:358:11)
But if I manually run the forever command in a terminal, it works.
Forever "version": "0.11.1"
node version: v0.10.17
It seems like a problem with the installation. In the above example I am using a Vagrant box. I installed forever in a fresh Vagrant box, and it worked.
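Because the command works interactively but not from cron, the most common explanation is cron's minimal environment (PATH is typically just /usr/bin:/bin, and HOME may differ from the login shell's). A sketch of a crontab that makes the environment explicit — the paths here are assumptions, adjust them to your box:

```shell
# crontab fragment: declare the environment cron jobs should see
PATH=/usr/local/bin:/usr/bin:/bin
HOME=/root

# For a daemon that forever keeps alive anyway, @reboot avoids
# re-running "forever start" every minute, which can race the monitor
@reboot /usr/local/bin/forever start /root/MyCode/server.js >> /root/ou1.log 2>&1
```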
I'm running a python script called TGubuntu.py.
I used ls -l, and the permissions of the script are -rwxrwxrwx 1 ubuntu ubuntu 503 Jan 13 19:07 TGubuntu.py, which should mean that anyone can execute the file, right?
But I still get in the log /bin/sh: 1: /home/ubuntu/TestTG/TGubuntu.py: Permission denied for some reason.
When I run the script manually it works perfectly.
Any ideas?
I put it in the sudo crontab like this
* * * * * /home/ubuntu/TestTG/TGubuntu.py
But even in the root (cron) mail log it says Permission Denied!
Couldn't figure out what the problem was, so I accomplished my goal using a different method.
I ran a Python script that uses the schedule module to call my script, then just let that "timer" run indefinitely in a screen session.
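For anyone hitting the same wall: "Permission denied" from /bin/sh despite rwx bits usually points at the exec step itself failing (for example the filesystem being mounted noexec), not at the permission bits. A low-effort workaround is to invoke the interpreter explicitly in the crontab entry (e.g. `* * * * * /usr/bin/python3 /home/ubuntu/TestTG/TGubuntu.py`), which sidesteps execute permission entirely. A minimal sketch of that behavior — the file path and python3 location are assumptions:

```shell
# Hypothetical demo: invoking the interpreter works even when direct
# execution is impossible (here, no execute bit at all)
printf 'print("ok")\n' > /tmp/demo_tg.py
chmod 644 /tmp/demo_tg.py       # read/write only, not executable
python3 /tmp/demo_tg.py         # prints: ok
```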
I have a problem executing selenium-side-runner (installed via npm) from crontab.
The following commands have been executed:
1. npm install -g selenium-side-runner
2. npm install -g chromedriver
I was able to run selenium-side-runner directly from the terminal by calling:
"selenium-side-runner (path to .side file)"
When creating a crontab, I used the following entries:
"0 5 * * * selenium-side-runner <path to .side file> >> /tmp/sel.log 2>&1"
OR
"0 5 * * * /usr/local/bin/selenium-side-runner <path to .side file> >> /tmp/sel.log 2>&1"
Still no luck with either entry.
NOTE: selenium-side-runner can be found under /usr/local/bin --
I checked it out using "which selenium-side-runner"
I also checked $PATH, and /usr/local/bin is there.
Error messages received in sel.log:
1. command not found
2. No such file or directory
Can someone help me out with this, please?
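One thing to check: selenium-side-runner is a Node CLI, typically installed with a `#!/usr/bin/env node` shebang, so "command not found" under cron can mean that node itself is missing from cron's very short default PATH even when the runner's own path is absolute. A sketch of a crontab that sets PATH explicitly — the directories are assumptions, match them against `which node`:

```shell
# crontab fragment: cron's default PATH is typically /usr/bin:/bin,
# which is missing /usr/local/bin where node and the runner live
PATH=/usr/local/bin:/usr/bin:/bin
0 5 * * * /usr/local/bin/selenium-side-runner <path to .side file> >> /tmp/sel.log 2>&1
```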
I'm running a small Python script that scrapes some data from a public website.
When I run the Dockerfile instructions line by line in an interactive terminal using the selenium/standalone-firefox image, the script runs fine.
docker run -it -v /Users/me/Desktop/code/scraper/:/scraper selenium/standalone-firefox bash
As soon as I run it using my Dockerfile and docker-compose file I get this error:
app_1 | selenium.common.exceptions.WebDriverException: Message: invalid argument: can't kill an exited process
I am using the MOZ_HEADLESS=1 env var. It's being passed properly.
I have tried running the script as someone other than root but then I get log errors.
Dockerfile
FROM selenium/standalone-firefox:latest
# https://github.com/SeleniumHQ/docker-selenium/issues/725
USER root
RUN apt-get update -y && apt-get install -y firefox python-pip
WORKDIR /scraper
COPY . /scraper
RUN pip install -r /scraper/requirements.txt
ENV MOZ_HEADLESS=1
CMD ["python", "/scraper/browserscraper.py"]
If I run those instructions in the Dockerfile from an interactive terminal, I have no problems.
It either has to do with the script being run as root via the Dockerfile, or with the script missing a display for output, since I'm not attached to a terminal the way I am when running it interactively with -it.
Any ideas?
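One container-specific difference from the interactive run worth ruling out (an assumption, not confirmed by the error alone): headless Firefox often crashes at startup when the container's /dev/shm is Docker's 64 MB default, which the driver can surface as an exited process. Enlarging shared memory is a low-risk test; the image name below is a placeholder for whatever your compose file builds:

```shell
# One-off test with a larger shared memory segment (image name assumed);
# in docker-compose, the equivalent service key is:  shm_size: "2gb"
docker run --shm-size=2g -e MOZ_HEADLESS=1 \
  -v /Users/me/Desktop/code/scraper/:/scraper my-scraper-image
```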
I have PhantomJS and CasperJS installed and everything works fine, but when I try to add my CasperJS file to a cron job (Ubuntu), I get this error:
/bin/sh: 1: /usr/local/bin/casperjs: not found
My crontab file:
0 */1 * * * PHANTOMJS_EXECUTABLE=/usr/local/bin/phantomjs
/usr/local/bin/casperjs /usr/local/share/casper-test/test.js 2>&1
Any ideas what's wrong?
Your crontab entry is split across two lines, but cron requires each job to be a single line. Note that joining the parts with a semicolon would not work either: a plain VAR=value assignment followed by a semicolon is not exported to the next command. Put the assignment directly in front of the command instead:
0 */1 * * * PHANTOMJS_EXECUTABLE=/usr/local/bin/phantomjs /usr/local/bin/casperjs /usr/local/share/casper-test/test.js 2>&1
If you do need several commands on one line, separate them with semicolons, or use the && operator if each should only run when the previous one succeeded.
For better readability you could just put those commands in a shell script and run that from cron.
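The difference between the two assignment forms can be seen with a minimal sketch (the variable name is arbitrary): `VAR=value; cmd` sets a shell variable that stays local to the shell, while the `VAR=value cmd` prefix form places it in cmd's environment:

```shell
CRON_DEMO=1; sh -c 'echo "child sees: ${CRON_DEMO:-unset}"'  # prints: child sees: unset
CRON_DEMO=1 sh -c 'echo "child sees: ${CRON_DEMO:-unset}"'   # prints: child sees: 1
```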
On our build server (launched by Bamboo) we want to run Selenium tests. To do this we use xvfb-run, which works on our local servers, all of the same type.
If I log on to the build server and run:
xvfb-run echo 'i'
I get the error:
xvfb-run: error: Xvfb failed to start
I have tried running like this:
xvfb-run -a echo 'i'
This time it just hangs and never finishes. Any ideas on things I can try?
Thanks
Run the following commands:
sudo nohup Xvfb :40 -ac &
export DISPLAY=:40
Since it works locally, I suspect a server or permissions issue. Perhaps your user can't open a lock file in /tmp? Try to get more information about the problem by running:
xvfb-run -e /dev/stdout [mycommand]
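If `-e /dev/stdout` still doesn't explain the failure, another cause worth ruling out (an assumption based on it working locally) is a stale display lock: X servers record the display they own in files like /tmp/.X<n>-lock, and a leftover file from a crashed Xvfb blocks that display number. A contrived sketch of what to look for, using a temp directory standing in for /tmp and a hypothetical display :99:

```shell
# Simulate a stale lock file and count files matching the X lock
# naming pattern (.X<display>-lock)
tmp=$(mktemp -d)
touch "$tmp/.X99-lock"
count=$(ls -A "$tmp" | grep -c '^\.X[0-9]*-lock$')
echo "stale lock files: $count"   # prints: stale lock files: 1
rm -r "$tmp"
```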