Can I use wget to click/execute a URL link, specifically a URL that includes a CGI command? - cgi

I need to activate a CGI command on a camera without using a browser. I'm not actually interested in downloading anything; the command just needs to be activated, the same as when I enter it into a browser.
Would wget do the trick, or would it not actually execute the command and instead try to download something? I've only ever used wget to download files.

You can certainly do this using wget's --spider option.
For example:
wget --spider http://example.com
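If the camera's CGI URL includes query parameters, quote it so the shell does not interpret the & characters. Also note that --spider typically issues a HEAD request; if the device only reacts to a full GET, you can fetch the URL and discard the body instead. A minimal sketch, using a hypothetical camera endpoint (replace the host and parameters with your device's actual CGI URL):
wget --spider "http://192.168.1.64/cgi-bin/ptz.cgi?action=start&code=Up"
wget -q -O /dev/null "http://192.168.1.64/cgi-bin/ptz.cgi?action=start&code=Up"
Either way the request is sent, which is all that's needed to trigger the CGI, and nothing useful is saved to disk.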

Related

Downloading PDF report from kibana/elasticsearch using API call

I am trying to generate PDF reports and download them using a script. I followed the instructions here:
https://github.com/elastic/kibana/blob/master/docs/user/reporting/automating-report-generation.asciidoc
I am able to queue the report and I also got a download URL ()/api/.../download/xyzdrfd, but when I try wget on that URL, it doesn't work. I have no idea how to download the report through the API, so I just tried wget.
Can anyone tell me how to download the reports from an API call?
The download is probably not happening because of redirects on that endpoint. Use the -L option with curl to follow them; I got this working specifically against the Kibana endpoint to download a PDF file. Replace the username and password with your own basic-auth credentials, and use the -o option to name the downloaded file. Below is a complete example of the command:
curl -L -u username:password -o download.pdf https://endpoint.com:9244/s/bi-/api/reporting/jobs/download/ktl8n95q001edfc210feaz0r
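If you prefer wget, a roughly equivalent invocation (same assumed credentials and URL as above; wget follows HTTP redirects by default) would be:
wget --user=username --password=password -O download.pdf https://endpoint.com:9244/s/bi-/api/reporting/jobs/download/ktl8n95q001edfc210feaz0r
If the server does not answer with a 401 challenge, adding wget's --auth-no-challenge flag makes it send the credentials up front.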

How to Use luminati.io with Phantomjs

I have been using proxy IPs with PhantomJS to scrape data. Has anyone used luminati.io with PhantomJS? Luminati routes requests through end-user computer IPs to read pages; it's costly, and I'd like to know whether anyone has tried it before I do.
Thanks,
You need to pass --proxy-auth and --proxy to the command line, for example:
phantomjs --proxy-auth=lum-customer-YOUR_CUSTOMER-zone-YOUR_ZONE-country-YOUR_COUNTRY --proxy=zproxy.luminati.io:22225 script.js
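Note that PhantomJS expects --proxy-auth in username:password form, so if your Luminati zone has a password you would typically append it after a colon, and you can state the proxy type explicitly. A sketch with placeholder credentials (the zone password and --proxy-type are assumptions, not part of the original answer):
phantomjs --proxy-auth=lum-customer-YOUR_CUSTOMER-zone-YOUR_ZONE:YOUR_ZONE_PASSWORD --proxy=zproxy.luminati.io:22225 --proxy-type=http script.js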

Text editor that can edit using sudo over ssh?

I'm trying to edit files on a remote Amazon EC2 Linux instance. I'm currently just sshing in and using nano, but I would really like a graphical text editor. I have two problems:
I have to use sudo to edit these server files when I ssh in.
I can only log in with the key Amazon gave me, e.g.: ssh -i Andrew.pem ec2-user@55.55.44.33
Please help! I'm not picky; any graphical text editor will do, since using nano is a huge pain.
For remote editing there are lots of options. This answer, like any other, is sure to become outdated as more options enter the field.
For vim, the netrw module meets this need, and is shipped with the editor by default.
For emacs, this is available with TRAMP.
For the ATOM editor, see the remote-files plugin.
For IntelliJ, editing files on remote hosts is supported in the commercial edition.
For Eclipse, see the Remote System Explorer from the Target Management project.
I'd suggest starting with the editor you prefer and evaluating options from there. If you set up your SSH session to be able to authenticate directly to root (password auth is best disabled for root, but if you have sudo you can install RSA keys), then you'll be able to specify root as a target user for any of the above.
By contrast, if you really do need sudo, you still have options:
See Using tramp to open files sudoed to root on the Emacs wiki. New versions also support an ssh+sudo transport, meaning this wiki entry may already be out of date; a concrete example of that transport is sketched below.
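As an illustration of that ssh+sudo transport (the host, user and file path are placeholders, not values from the original answer), opening a root-owned remote file from a local Emacs uses TRAMP's multi-hop syntax:
C-x C-f /ssh:ec2-user@55.55.44.33|sudo::/etc/nginx/nginx.conf
The empty host in the sudo:: hop means "sudo to root on the host reached by the previous hop".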
To help anyone who just needs a quick command line text editor:
you can use vi:
vi file-name.txt
or nano:
nano file-name.txt
optionally prefixed with sudo if the file requires root privileges, e.g.:
sudo nano file-name.txt
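Another standard option on most Linux systems is sudoedit (sudo -e), which copies the file to a temporary location, opens it as your own user in the editor named by SUDO_EDITOR or EDITOR, and writes it back as root when you save (file path below is just an example):
sudoedit /etc/nginx/nginx.conf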
Just modify the appropriate files on your local machine and scp the file onto the remote machine:
scp <local_machine_path_to_file> remoteUser@remoteHostName:<filePath>
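Since the question involves key-based login, you can pass the same key to scp (the key name and address below reuse the question's example and are illustrative):
scp -i Andrew.pem <local_machine_path_to_file> ec2-user@55.55.44.33:<filePath>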
Amazon has now acquired Cloud9, a browser-based IDE that can edit files on your EC2 instance.
https://aws.amazon.com/cloud9/
Today I found two products that can use sudo: MobaXterm (free version) and SmarTTY.
MobaXterm has a button in the file browser that enables sudo mode. You can view, create and edit files as a sudo user. Use this switch when necessary.
Unfortunately, this only works over the SCP protocol.
SmarTTY works differently: when you try to save a file that requires sudo, it throws an error and immediately offers to save the file with sudo.
Of the two products, I recommend MobaXterm.
sudo grants root privileges for that particular command, and you need root privileges to edit system files, even on a local machine. If you don't like typing sudo every time, you can run sudo -s: you change to the root user, the prompt shows it (i.e. root@ip...), and the $ sign changes to #. Honestly, I prefer not going root, because it is easier to make irreversible mistakes with root privileges. I've made some mistakes and I'm talking from experience...
As for the second part of your question, you can configure various text editors, such as Sublime Text, to SFTP into your instance.
You will have to use the .pem key file every time you ssh using terminal. This is because AWS takes security very seriously. You can put the key file in your home directory. That way you don't have to change directories every time you open up terminal.
You can also edit a local copy of files and then use FileZilla to transfer. Setting up FileZilla to work with your EC2 instance is straightforward. You can give vim a try since it colors your code and is more advanced than nano. Use the command vi or vim from terminal.
Happy SSH'ing ;).
ssh -X user@server.
You have to make the appropriate settings for X forwarding.
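A minimal sketch of how that looks in practice (assumes X11Forwarding is enabled in the server's /etc/ssh/sshd_config and that some X-capable editor, gedit here purely as an illustration, is installed on the instance):
ssh -X -i Andrew.pem ec2-user@55.55.44.33
gedit /path/to/remote/file
The editor runs on the server, but its window is displayed on your local machine.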
I use SFTP Net Drive, which lets you create a virtual drive on your local computer that maps the remote file system over the SFTP protocol. Once the drive is mapped, you can use the editor of your preference.
You can use nano, vim, vi or many others. However, if you want to edit with a graphical text editor you will have to use SFTP, since Amazon does not support plain FTP. One way is to use FileZilla to upload your files; here is a video on using FileZilla: https://www.youtube.com/watch?v=VawBMj29g0o I suggest SSH though, it's fast and easy; here is a video on that: https://www.youtube.com/watch?v=O2-3HoRjBH4
I found a somewhat unusual workaround for a GUI-based text editor on AWS: I used Jupyter Notebook. If you have Anaconda installed on your instance, follow these steps:
ssh onto your instance using ssh -i <location of your private key> <username>@<public DNS>
Start jupyter notebook on your instance using jupyter notebook --no-browser --port=8888
Open a new terminal window and tunnel to your jupyter notebook using ssh -i <location of your private key> -L 8212:localhost:8888 <username>@<public DNS>
Now you can open Jupyter Notebook at localhost:8212
Using the Jupyter Notebook environment, you can not only launch and run IPython notebooks but also create and edit any file, like a text editor.
would really like a graphical text editor
You cannot have a graphical editor; you need to use an editor such as nano, as you said, or vim or emacs. sudo is required when you have to edit configuration files owned by root.
To assist others with this same question, I would suggest jEdit. It is very capable, and it has a very rich plugin environment, language parsing, etc.
http://www.jedit.org
It has "always" supported sftp read and write of files with the sshConsole plugin.
I use it now on my AWS EC2 instance with the key pair supplied by AWS.
Lastly, it is not a good idea to edit files owned by root in the "production" environment.
Do your dev work in the AWS user's home folder so that you have full control of the source files. Then use a symlink to the actual server's file tree so you can serve it to yourself for testing. There are lots of controls in nginx and apache to limit who can view your dev site.
EDIT/UPDATE:
The NppFtp plugin for Notepad++ provides SFTP access to AWS. I just tested it with the .pem file that they provided for my login at AWS.
For this, I'd suggest one of:
Learn and use emacs; it's quite powerful as far as textmode editors go.
Install your favourite graphical editor on the server and use X forwarding, 'ssh -X server.com'. This will allow you to launch the editor remotely, but have it display locally.
Most elegant, in my opinion: use sshfs (https://github.com/libfuse/sshfs) to mount the remote directory locally, so you can work on the files directly with your favourite text editor; a short sketch follows.
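A minimal sshfs sketch (the mount point, key name and paths are placeholders; install sshfs from your package manager first):
mkdir -p ~/ec2
sshfs -o IdentityFile=~/Andrew.pem ec2-user@55.55.44.33:/ ~/ec2
# edit files under ~/ec2 with any local editor, then unmount:
fusermount -u ~/ec2
Note that files you could only edit with sudo on the server will still be read-only this way, since sshfs acts as the SSH user you connect with.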

Saving the result of a telnet request as a text document

So, say I run "telnet google.com 80\rGET / HTTP/1.0\r", is there any way I could save the ensuing HTTP data?
Is it possible to do in bash? If not, Perl?
Use tee.
telnet google.com 80 | tee output.txt
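If you also want the request itself to be non-interactive, you can feed it in on stdin while still capturing the output with tee (the sleep gives the server time to answer before telnet's input closes; output.txt is just an example name):
{ printf 'GET / HTTP/1.0\r\n\r\n'; sleep 2; } | telnet google.com 80 | tee output.txt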
The script command seems to meet your requirements. Once you start it, all the terminal output from that session gets saved to a file. But for the specific task you mentioned,
I'd use wget or curl rather than messing around with script.
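For example, either of these saves the full response to a file without an interactive telnet session (the output file names are just illustrations):
curl -i -o output.txt http://google.com/
wget -S -O output.txt http://google.com/
curl's -i includes the response headers in the saved output; wget's -S prints the server headers to the terminal while -O names the saved body.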

Plesk Cron jobs and FTP - who is the owner for file access?

I'm trying to set up a cron task that gets a file via FTP, but it seems to fail due to file permissions.
The code runs perfectly in the browser, i.e. when apache is the owner, but fails when cron runs the same page.
I'm assuming this is a directory/file permission error; if so, who should I set the directory owner to for cron jobs?
Most likely Dan's thought is going to be your problem. However, if it works from a browser, you can also call the page like this:
wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1
If you still get errors, you can remove the >/dev/null 2>&1 part and, provided your email address is set correctly in the domain administrator account, the output, including errors, should get emailed to you.
As for the correct permissions, don't change the default Plesk ones or you will get issues with normal FTP.
Defaults are:
everything under httpdocs = ftpuser.psacln
anything written by php/apache = apache.apache ~ unless you are running PHP as a CGI on that domain, in which case those files will belong to the FTP user as well.
-sean
cron jobs will run as the user that created them. More likely than a permissions error is a path error. If you're not specifying full absolute paths to the program/script to run, and to any files you reference, you'll likely have problems as cron won't have the same PATH in its environment as Apache does or you do at your shell prompt.
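A hedged sketch of what a safer crontab entry might look like, with absolute paths throughout (the every-15-minutes schedule and the wget/php binary locations are assumptions; check them with which wget or which php on your server):
*/15 * * * * /usr/bin/wget -q "http://www.domain.com/path/to/script/script.whatever" >/dev/null 2>&1
or, calling the script directly instead of going through the web server:
*/15 * * * * /usr/bin/php /var/www/vhosts/domain.com/httpdocs/path/to/script.whatever >/dev/null 2>&1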