I want to run scrapyd under its own user on Ubuntu 18.04. Therefore I created a new user "scrapyd" and tried to launch it. This results in an error:
merlin#spider1:~$ sudo -u scrapyd scrapyd
2020-05-04T19:46:03+0200 [-] Loading /usr/local/lib/python3.8/dist-packages/scrapyd/txapp.py...
2020-05-04T19:46:03+0200 [-] Scrapyd web console available at http://127.0.0.1:6800/
2020-05-04T19:46:03+0200 [-] Loaded.
2020-05-04T19:46:03+0200 [twisted.scripts._twistd_unix.UnixAppLogger#info] twistd 20.3.0 (/usr/bin/python3 3.8.2) starting up.
2020-05-04T19:46:03+0200 [twisted.scripts._twistd_unix.UnixAppLogger#info] reactor class: twisted.internet.epollreactor.EPollReactor.
2020-05-04T19:46:03+0200 [-] Failed to unlink PID file:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/twistd.py", line 25, in runApp
runner.run()
File "/usr/local/lib/python3.8/dist-packages/twisted/application/app.py", line 385, in run
self.postApplication()
File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 261, in postApplication
self.removePID(self.config['pidfile'])
File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 288, in removePID
log.err(e, "Failed to unlink PID file:")
--- <exception caught here> ---
File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 283, in removePID
os.unlink(pidfile)
builtins.FileNotFoundError: [Errno 2] No such file or directory: '/home/merlin/twistd.pid'
2020-05-04T19:46:03+0200 [stderr#error] Traceback (most recent call last):
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/bin/scrapyd", line 11, in <module>
2020-05-04T19:46:03+0200 [stderr#error] sys.exit(main())
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/scrapyd/scripts/scrapyd_run.py", line 11, in main
2020-05-04T19:46:03+0200 [stderr#error] run()
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/twistd.py", line 31, in run
2020-05-04T19:46:03+0200 [stderr#error] app.run(runApp, ServerOptions)
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/application/app.py", line 674, in run
2020-05-04T19:46:03+0200 [stderr#error] runApp(config)
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/twistd.py", line 25, in runApp
2020-05-04T19:46:03+0200 [stderr#error] runner.run()
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/application/app.py", line 385, in run
2020-05-04T19:46:03+0200 [stderr#error] self.postApplication()
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 254, in postApplication
2020-05-04T19:46:03+0200 [stderr#error] self.startApplication(self.application)
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 437, in startApplication
2020-05-04T19:46:03+0200 [stderr#error] self.setupEnvironment(
2020-05-04T19:46:03+0200 [stderr#error] File "/usr/local/lib/python3.8/dist-packages/twisted/scripts/_twistd_unix.py", line 330, in setupEnvironment
2020-05-04T19:46:03+0200 [stderr#error] with open(pidfile, 'wb') as f:
2020-05-04T19:46:03+0200 [stderr#error] PermissionError: [Errno 13] Permission denied: '/home/merlin/twistd.pid'
merlin is the user issuing the command. How can I start scrapyd under its own user name?
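For context, both tracebacks point at /home/merlin/twistd.pid: twistd writes its PID file into the current working directory, which under sudo -u is still the invoking user's home, where the scrapyd user cannot write. On Ubuntu 18.04 the usual way to run a service under a dedicated user is a systemd unit, which also fixes the working directory. This is a minimal sketch; the paths (notably /var/lib/scrapyd, which you would create and chown to scrapyd first) are assumptions, not taken from the question:

```ini
# /etc/systemd/system/scrapyd.service -- sketch, paths are assumptions
[Unit]
Description=Scrapyd
After=network.target

[Service]
User=scrapyd
# twistd writes twistd.pid into the working directory, so point it at a
# directory the scrapyd user owns (mkdir -p + chown scrapyd: first)
WorkingDirectory=/var/lib/scrapyd
ExecStart=/usr/local/bin/scrapyd

[Install]
WantedBy=multi-user.target
```

After systemctl daemon-reload && systemctl start scrapyd the service runs as scrapyd; a quick one-off alternative is sudo -u scrapyd sh -c 'cd /var/lib/scrapyd && scrapyd'.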
Related
/usr/local/bin/python3.9 /Users/rabbu/PycharmProjects/pythonProject/seleniumpkg/Basic.py
Traceback (most recent call last):
File "/Users/rabbu/Library/Python/3.9/lib/python/site-packages/selenium/webdriver/common/service.py", line 72, in start
self.process = subprocess.Popen(cmd, env=self.env,
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/subprocess.py", line 947, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/subprocess.py", line 1819, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '../drivers.chromedriver'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/rabbu/PycharmProjects/pythonProject/seleniumpkg/Basic.py", line 4, in <module>
driver = webdriver.Chrome(executable_path="../drivers.chromedriver")
File "/Users/rabbu/Library/Python/3.9/lib/python/site-packages/selenium/webdriver/chrome/webdriver.py", line 73, in __init__
self.service.start()
File "/Users/rabbu/Library/Python/3.9/lib/python/site-packages/selenium/webdriver/common/service.py", line 81, in start
raise WebDriverException(
selenium.common.exceptions.WebDriverException: Message: 'drivers.chromedriver' executable needs to be in PATH. Please see https://sites.google.com/a/chromium.org/chromedriver/home
Process finished with exit code 1
You can use webdriver_manager to download the driver and launch it automatically, without providing the path yourself. Install it with pip3 install webdriver_manager:
from webdriver_manager.chrome import ChromeDriverManager
from selenium import webdriver
driver = webdriver.Chrome(executable_path=ChromeDriverManager().install())
driver.get("Your URL")
I am getting this error while installing TensorFlow 1.3.0 in the terminal on macOS:
Installing collected packages: tensorflow-tensorboard, backports.weakref, tensorflow
Found existing installation: tensorflow-tensorboard 0.4.0rc3
Uninstalling tensorflow-tensorboard-0.4.0rc3:
Successfully uninstalled tensorflow-tensorboard-0.4.0rc3
Exception:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/basecommand.py", line 215, in main
status = self.run(options, args)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/commands/install.py", line 342, in run
prefix=options.prefix_path,
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/req/req_set.py", line 784, in install
**kwargs
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/req/req_install.py", line 851, in install
self.move_wheel_files(self.source_dir, root=root, prefix=prefix)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/req/req_install.py", line 1064, in move_wheel_files
isolated=self.isolated,
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/wheel.py", line 345, in move_wheel_files
clobber(source, lib_dir, True)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/pip/wheel.py", line 323, in clobber
shutil.copyfile(srcfile, destfile)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/shutil.py", line 97, in copyfile
with open(dst, 'wb') as fdst:
IOError: [Errno 13] Permission denied: '/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/backports/weakref.py'
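The IOError here is pip trying to copy files into the framework Python's global site-packages, which your user account cannot write to. A quick diagnostic sketch (nothing TensorFlow-specific, just a writability check):

```python
import os
import site

# Print each global site-packages directory and whether the current user
# may write to it; "[Errno 13] Permission denied" during 'pip install'
# means this is False for the directory pip is installing into.
for d in site.getsitepackages():
    print(d, os.access(d, os.W_OK))
```

If it prints False, the usual options are pip install --user tensorflow==1.3.0, installing inside a virtualenv, or (least recommended) sudo pip install.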
I've cloned depot_tools following the instructions from here, on Ubuntu 17.04.
When I run fetch chromium I get the following error.
./fetch chromium
Running: gclient root
Traceback (most recent call last):
File "./fetch.py", line 299, in <module>
sys.exit(main())
File "./fetch.py", line 294, in main
return run(options, spec, root)
File "./fetch.py", line 280, in run
if not options.force and checkout.exists():
File "./fetch.py", line 82, in exists
gclient_root = self.run_gclient('root').strip()
File "./fetch.py", line 78, in run_gclient
return self.run(cmd_prefix + cmd, **kwargs)
File "./fetch.py", line 68, in run
return subprocess.check_output(cmd, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 212, in check_output
process = Popen(stdout=PIPE, *popenargs, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 390, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1024, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
I have come across similar errors on Stack Overflow while searching for a resolution; however, none of them helped resolve the problem.
I'm also not clear on whether there are any steps required after cloning to install depot_tools. The readme
Any thoughts on what is required to solve the problem?
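One detail worth knowing: an [Errno 2] raised by subprocess.Popen refers to the program being launched, not to a file it opens. Here it means the gclient command itself was not found, which typically just means depot_tools is not on PATH. A small reproduction with a deliberately fake command name:

```python
import subprocess

# Launching a program that does not exist on PATH raises the same
# "[Errno 2] No such file or directory" seen in the fetch traceback
# (OSError on Python 2; its FileNotFoundError subclass on Python 3).
try:
    subprocess.check_output(["no-such-program-xyz"])
except OSError as e:
    print(e.errno)  # → 2
```

So the likely fix is export PATH="$PATH:/path/to/depot_tools" (path hypothetical, wherever you cloned it) before running fetch chromium.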
I have a Python application that uses numpy running on my DigitalOcean droplet. I am trying to pip install numpy into my virtual environment, and each time I am getting an error like this:
Collecting numpy
Downloading numpy-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl (16.6MB)
99% |████████████████████████████████| 16.6MB 40.5MB/s eta 0:00:01
Exception:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 209, in main
status = self.run(options, args)
File "/usr/lib/python2.7/dist-packages/pip/commands/install.py", line 328, in run
wb.build(autobuilding=True)
File "/usr/lib/python2.7/dist-packages/pip/wheel.py", line 748, in build
self.requirement_set.prepare_files(self.finder)
File "/usr/lib/python2.7/dist-packages/pip/req/req_set.py", line 360, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "/usr/lib/python2.7/dist-packages/pip/req/req_set.py", line 577, in _prepare_file
session=self.session, hashes=hashes)
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 810, in unpack_url
hashes=hashes
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 649, in unpack_http_url
hashes)
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 871, in _download_http_url
_download_url(resp, link, content_file, hashes)
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 595, in _download_url
hashes.check_against_chunks(downloaded_chunks)
File "/usr/lib/python2.7/dist-packages/pip/utils/hashes.py", line 46, in check_against_chunks
for chunk in chunks:
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 563, in written_chunks
for chunk in chunks:
File "/usr/lib/python2.7/dist-packages/pip/utils/ui.py", line 139, in iter
for x in it:
File "/usr/lib/python2.7/dist-packages/pip/download.py", line 552, in resp_read
decode_content=False):
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/response.py", line 344, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/response.py", line 301, in read
data = self._fp.read(amt)
File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/filewrapper.py", line 54, in read
self.__callback(self.__buf.getvalue())
File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/controller.py", line 224, in cache_response
self.serializer.dumps(request, response, body=body),
File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/serialize.py", line 81, in dumps
).encode("utf8"),
MemoryError
Can anyone help me figure out how to solve this problem? I have also tried installing numpy outside the virtualenv, but it still refuses to download and install.
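The traceback bottoms out in CacheControl's serializer, i.e. pip ran out of memory while buffering the downloaded wheel for its HTTP cache. On a small droplet, the standard workaround is to skip the cache:

```shell
# The MemoryError occurs while pip caches the ~16 MB wheel in RAM;
# bypassing the download cache avoids the in-memory copy.
pip install --no-cache-dir numpy
```

If that still fails, adding a temporary swap file to the droplet is the other common fix for low-memory pip installs.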
I cannot schedule a spider run
Deploy seems to be ok:
Deploying to project "scraper" in http://localhost:6800/addversion.json
Server response (200):
{"status": "ok", "project": "scraper", "version": "1418909664", "spiders": 3}
I schedule a new spider run:
curl http://localhost:6800/schedule.json -d project=scraper -d spider=spider
{"status": "ok", "jobid": "3f81a0e486bb11e49a6800163ed5ae93"}
but on scrapyd I get this error:
2014-12-18 14:39:12+0100 [-] Process started: project='scraper' spider='spider' job='3f81a0e486bb11e49a6800163ed5ae93' pid=28565 log='/usr/scrapyd/logs/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.log' items='/usr/scrapyd/items/scraper/spider/3f81a0e486bb11e49a6800163ed5ae93.jl'
2014-12-18 14:39:13+0100 [Launcher,28565/stderr] Traceback (most recent call last):
File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 39, in <module>
2014-12-18 14:39:13+0100 [Launcher,28565/stderr] main()
File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 36, in main
execute()
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 143, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 89, in _run_print_help
func(*a, **kw)
File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
cmd.run(args, opts)
File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 58, in run
spider = crawler.spiders.create(spname, **opts.spargs)
2014-12-18 14:39:13+0100 [Launcher,28565/stderr] File "/usr/local/lib/python2.7/dist-packages/scrapy/spidermanager.py", line 48, in create
return spcls(**spider_kwargs)
File "build/bdist.linux-x86_64/egg/scraper/spiders/spider.py", line 104, in __init__
File "/usr/lib/python2.7/os.py", line 157, in makedirs
mkdir(name, mode)
OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI.egg/logs/'
2014-12-18 14:39:14+0100 [-] Process died: exitstatus=1 project='scraper'
Any ideas? :(
You are trying to create a directory inside an egg, which is a zip archive rather than a real directory on disk.
OSError: [Errno 20] Not a directory: '/tmp/scraper-1418909944-dKTRZI ---->.egg<----- /logs/'
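Since the spider is deployed via scrapyd, its module lives inside the zipped .egg, so any logs path built relative to the module (as in line 104 of spider.py) cannot be created. A sketch of the usual fix, with the base directory purely illustrative (here a temp dir): anchor the logs path somewhere writable outside the package.

```python
import os
import tempfile

# Build the log directory from an external, writable location instead of
# a path relative to __file__, which points inside the zipped .egg when
# the spider runs under scrapyd.
log_dir = os.path.join(tempfile.gettempdir(), "scraper-logs")
if not os.path.isdir(log_dir):  # makedirs guard, works on Python 2.7 too
    os.makedirs(log_dir)
print(os.path.isdir(log_dir))  # → True
```

In a real deployment you would take the base directory from a setting or environment variable (e.g. a configured data dir) rather than the temp dir used here.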