How to or what's the best way to run (not call with curl) a REST API service using purely shell commands or scripts?
I would typically run APIs with Python using FastAPI; however, I was wondering whether it is possible to do it in a less Pythonic, more Linux-native way.
Sampo is most likely what you're looking for.
Written in Bash, it is a shell-script API server you can run directly in your terminal or in a container on Kubernetes.
It exposes endpoints that can trigger different actions, such as running a script.
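To make the idea concrete, here is a minimal sketch of the core of such a server — not Sampo's actual code, and the endpoint names are illustrative: parse the HTTP request line, dispatch on the path, and print an HTTP response.

```shell
#!/bin/sh
# Minimal sketch (not Sampo's actual code) of a shell API server's core:
# parse the HTTP request line, dispatch on the path, emit an HTTP response.

handle_request() {
  # $1 is a raw request line such as "GET /date HTTP/1.1"
  path=$(printf '%s' "$1" | awk '{print $2}')
  case "$path" in
    /date)   body=$(date) ;;
    /uptime) body=$(uptime) ;;
    *)       printf 'HTTP/1.1 404 Not Found\r\n\r\n'; return ;;
  esac
  printf 'HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n\r\n%s\n' "$body"
}

# To put this on a socket you can pair it with a listener such as nc
# (netcat flags vary between variants), along the lines of:
#   while true; do handle_request "$(head -n1)" | nc -l 8080; done
```

The listener line is only indicative; a real server like Sampo also handles request bodies, concurrent connections, and proper connection teardown.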
Related
I want to fetch some big data from a database API. It takes time (maybe more than 5 minutes), but I want to keep the connection to the API open even if I or the user closes the web browser.
Can someone help me out and show me the way, please?
By the way, I'm calling the API from JavaScript; the API server is Java-based.
You can use a curl command and run it in the background with nohup on a Unix machine, like this:
nohup curl https://xxxxxx &
You will need to work out how to make the same request with curl that you are currently making from the browser.
This keeps the request running in the background even after you close your shell terminal, as long as the Unix server itself keeps running.
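As a slightly fuller sketch of the same idea (the URL is the placeholder from above and the file names are illustrative), it helps to redirect the output to files so you can inspect the result after reconnecting:

```shell
# Start the long-running request detached from the terminal. nohup makes the
# process ignore the hangup signal sent when the terminal closes; stdout and
# stderr go to a log file, and the response body to result.json.
nohup curl -sS -o result.json 'https://xxxxxx' > curl.log 2>&1 &

# Note the PID so you can check on the job (or kill it) later:
echo "curl running as PID $!"
```

When you log back in, `tail curl.log` shows any errors and `result.json` holds the downloaded data once the request finishes.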
At the moment I use BeanShell to write application extensions; I just use BshMethod.invoke() to integrate it with the Java application.
Could this be done with JShell too? That is, load a script and execute a method in the script via a JShell API, not via the command line.
I want to execute vSphere PowerCLI scripts that can make my tasks easier (such as configuring the firewall via a script). However, I want to know whether it is possible to run such scripts against the vSphere client trial version, since it only offers a read-only API, unlike the full version, which has both read and write APIs.
If you are looking to use PowerCLI to get information about your environment, you will still be able to do that. However, you will not be able to execute any commands that create or modify values.
I found this link on remotely running my tests on different machines
http://performancetestingwithjmeter.blogspot.in/2012/09/distributed-load-testing-in-jmeter.html
but this link describes the process using the UI.
I want to do distributed load testing from the console.
What should I do to make distributed (master/slave) testing work from the console?
You can run jmeter from the master with the -r command-line option.
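For example (the test plan, result file, and slave addresses below are placeholders), a non-GUI distributed run looks like this:

```shell
# Non-GUI distributed JMeter run; plan name, result file, and slave
# addresses are placeholders.
#   -n  non-GUI (console) mode
#   -t  test plan to run
#   -l  file to write results to
#   -r  run the test on every remote_hosts entry in jmeter.properties
#   -R  alternative to -r: target a specific comma-separated slave list
run_distributed() {
  jmeter -n -t testplan.jmx -r -l results.jtl
  # or, targeting specific slaves instead of all configured ones:
  # jmeter -n -t testplan.jmx -R 192.168.0.10,192.168.0.11 -l results.jtl
}

# Invoke with: run_distributed
```

The slaves must already be running jmeter-server and be reachable from the master for either form to work.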
The scrapy doc says that:
Scrapy comes with a built-in service, called “Scrapyd”, which allows you to deploy (aka. upload) your projects and control their spiders using a JSON web service.
Are there any advantages to using Scrapyd?
Scrapyd allows you to run Scrapy on a different machine than the one you are using, via a handy web API, which means you can use curl or even a web browser to upload new project versions and run them. Otherwise, if you wanted to run Scrapy in the cloud somewhere, you would have to copy the new spider code over with scp, then log in with ssh and run scrapy crawl myspider yourself.
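As a sketch of that curl-driven workflow — the host, project, spider, and egg names are placeholders, while the endpoint paths are Scrapyd's standard JSON API — wrapped in small shell functions:

```shell
# Talking to Scrapyd's JSON web API from the shell.
# Host, project, spider, and egg file names below are placeholders.
SCRAPYD=http://localhost:6800   # Scrapyd's default address

# Upload (deploy) a packaged project version:
deploy()   { curl "$SCRAPYD/addversion.json" \
                  -F project=myproject -F version=r1 -F egg=@myproject.egg; }

# Schedule a spider run:
schedule() { curl "$SCRAPYD/schedule.json" -d project=myproject -d spider=myspider; }

# List pending, running, and finished jobs for the project:
jobs_for() { curl "$SCRAPYD/listjobs.json?project=myproject"; }
```

In practice the `scrapyd-deploy` tool builds and uploads the egg for you, so you rarely call addversion.json by hand.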
Scrapyd will also manage processes for you if you want to run many spiders in parallel. But if you have Scrapy on your local machine, have access to the command line, and just want to run one spider at a time, you're better off running the spider manually.
If you are developing spiders, you certainly don't want to use Scrapyd for quick compile/test iterations, as it just adds a layer of complexity.