I need to check my Apache error log, but the file contains 14 GB of data. Downloading will take ages. Is there any way I can see what's in the file without downloading it?
If you have SSH enabled on the server, just log in remotely and either tail or grep the log files.
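For example (the hostname and log path are placeholders; on Red Hat-style systems the log is often /var/log/httpd/error_log instead):
# show the last 100 lines without transferring the file
ssh user@yourserver 'tail -n 100 /var/log/apache2/error.log'
# search the whole log remotely, keeping only the most recent matches
ssh user@yourserver "grep -i segfault /var/log/apache2/error.log | tail -n 50"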
Do you have SSH access to your server? In that case it is easy: use an SSH client like PuTTY and read the file directly.
If you only have FTP access, you can try to use byte ranges as defined here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35
Of course, this only works if your server supports it.
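For instance, curl can request a byte range, which also works over FTP if the server supports the REST command (credentials, path, and offsets below are placeholders):
# fetch just the first 1 MB of the log; adjust the offsets to grab a later slice
curl --range 0-1048575 ftp://user@yourserver/logs/error.log -o head.log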
Hi, I can log in to the GCE VM with WinSCP using my own username, but cannot log in as root. This is the default according to Google, and can be changed.
I changed it like this:
Step 1: Log in over SSH and su to root
# sudo su root
Step 2: Change the root password
# passwd root
Step 3: Configure sshd to allow root login
# nano /etc/ssh/sshd_config
PermitRootLogin yes
PasswordAuthentication yes
# service sshd restart (I used "service ssh restart" as I'm on Ubuntu and sshd wouldn't work)
I tried to log in as root via WinSCP but I get:
"Received too large (1349281121 B) SFTP packet. Max supported packet size is 1024000 B. The error is typically caused by message printed from startup script (like .profile). The message may start with 'Plea'."
"Cannot initialize SFTP protocol. Is the host running a SFTP server?"
Any ideas?
Received too large SFTP packet. Max supported packet size is 102400 B
Cause:
This problem can arise when your .bashrc file prints data to the screen (e.g. archey, screenfetch). The .bashrc file runs every time any shell is initialized, including the non-interactive one that SFTP uses.
Solution:
Simply move any scripts that generate output from your .bashrc file to your .bash_profile. The .bash_profile only runs when you start an interactive login session.
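If you'd rather keep those commands for interactive logins, one common pattern is to guard them in .bashrc with an interactivity check (a sketch; screenfetch stands in for whatever prints output):
# in ~/.bashrc: only print when the shell is interactive
case $- in
  *i*) screenfetch ;;   # interactive shell: safe to print
  *)   ;;               # non-interactive (e.g. SFTP): stay silent
esac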
NOTE: Just for anyone who comes across this and simply wants to copy files, no matter which file protocol they use: you can just switch the file protocol from SFTP to SCP to avoid this issue. Thought it might be worth a mention.
If you use Ubuntu Linux and get "Please login as the Ubuntu user" when you try to connect to the server, you should sftp as the ubuntu user, not as root.
Try that, hope it will work for you!
Thanks!
Hmmm, I added this in WinSCP in the advanced settings under "Protocol options":
sudo /usr/lib/openssh/sftp-server
I can log in with my own username and move files now. I'm not exactly sure how this works; I think it somehow changes you to the root user at login?
More info: https://winscp.net/eng/docs/faq_su
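For what it's worth, the FAQ's trick relies on sudo not prompting for a password (WinSCP can't answer the prompt), which usually means a sudoers entry roughly like this (the username and binary path are assumptions for your system):
# e.g. in /etc/sudoers.d/winscp: let this user run the SFTP server as root, no password
yourname ALL=(root) NOPASSWD: /usr/lib/openssh/sftp-server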
See WinSCP article on Received too large (... B) SFTP packet. Max supported packet size is 102400 B
If … (from the subject [error message]) is a very large number, then the problem is typically caused by a message printed from some profile/logon script. It violates the SFTP protocol. Some of these scripts are executed even for non-interactive (no TTY) sessions, so they cannot print anything (nor ask the user to type something).
To add to @ThatOneCoder's answer about the cause being too much output from .bashrc: in e.g. Ubuntu there is also the system-wide /etc/bash.bashrc, which might be "too wordy" and cause the Received too large SFTP packet error.
It's a "system-wide .bashrc", and if you want to execute code for everyone logging in, that's one location to place it. If you nixed ~/.bashrc and still get the error, check the contents of /etc/bash.bashrc.
This happens because you haven't given shell access permission to the user.
I faced the same issue trying to log in to my Ubuntu 16.04 EC2 server as "root" via WinSCP. I spent a lot of time trying to fix it, but in the end a simple workaround worked for me.
I SSH into the instance using PuTTY with the username "ubuntu". After this I typed
sudo -i
and with this the user was changed to root.
Using a cPanel server, setting up a simple "lynx http://www.domain.com/script.php" cron command gives the following error, and I am unable to understand it.
Lynx file "/etc/lynx.lss" is not available.
The problem is the SHELL.
You can solve this problem in two ways:
1] I simply changed the line
SHELL="/usr/local/cpanel/bin/jailshell"
in /var/spool/cron/account
to SHELL="/bin/bash"
2] You can copy the file /etc/lynx.lss
to the directory /home/virtfs/account/etc
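For the second way, a minimal sketch (with "account" standing for your cPanel account name):
# make sure the jailed etc directory exists, then copy the style sheet in
mkdir -p /home/virtfs/account/etc
cp /etc/lynx.lss /home/virtfs/account/etc/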
Both worked for me!
Wilhelm
You can create an empty (or not) style sheet file in a directory where you have write access, then explicitly point to that file on the lynx command-line:
lynx -lss=/path/to/my/lynx.lss ...
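A minimal sketch, assuming your home directory is writable:
# create an empty style sheet, then point lynx at it explicitly
touch "$HOME/lynx.lss"
lynx -lss="$HOME/lynx.lss" http://www.domain.com/script.php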
I enabled shell access for the account and it started working. The above answer seems to assume you have access to the entire server and can modify those files; if so, just enable shell access and you are set. But if you are on a shared hosting account with basic cPanel and FTP access, you may not be able to do it. Ask your hosting company if you can have shell access, then decide what you can do depending on the answer they give you.
You can solve this issue by following the process below:
Open this file:
root@server [~]# vi /var/cpanel/exim.conf.deps
then append the entry below and save it.
/etc/lynx.lss
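Equivalently, without opening an editor:
# append the dependency entry in one step (as root)
echo '/etc/lynx.lss' >> /var/cpanel/exim.conf.deps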
I need to copy a file from a remote machine to my local machine, and I need to automate it.
I've tried the scp command and it works; however, I could not automate the part where it asks for the passwords of the users on the local machine and the remote machine.
Based on this article, I can perform SSH login without a password using ssh-keygen & ssh-copy-id.
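For reference, the sequence from that kind of article is roughly this (a sketch; key type and host are placeholders):
# generate a key pair (accept the defaults; empty passphrase for unattended use)
ssh-keygen -t rsa
# install the public key on the remote host
ssh-copy-id lalala@XXX.XXX.XXX.XXX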
After following all the instructions written there, I tried to access the remote machine using this:
ssh lalala@XXX.XXX.XXX.XXX
It works; it doesn't ask for the password anymore. But when I tried copying a file from that machine using the command below,
scp lalala@XXX.XXX.XXX.XXX:'/a/b/c.txt' lelele@XXX.XXX.XXX.YYY:'/b/c/'
it still asks for the password of the local machine, which is lelele@XXX.XXX.XXX.YYY.
I wonder if I did something wrong? What could it be? Is there something wrong with the format of the command?
BTW, I'm using CentOS, and I'm planning to script it in Python.
If you are copying to your local machine, why don't you just do
scp lalala@XXX.XXX.XXX.XXX:'/a/b/c.txt' /b/c/
?
I tried your line on some machine with a similar setup and didn't get asked for a password; I got an error instead, but this is probably due to differences in our configurations. I tried mine and it worked.
Regarding whether your connection succeeds on the remote machine, you could tail this file there:
tail -f /var/log/secure
If you see no error there, you can be sure (well, never say always) that your layout with the generated keys is working.
In this case I bet you'll see no error there.
I think you may have multiple SSH keys and have set IdentitiesOnly to yes. If so, please check this answer: https://askubuntu.com/a/999306/398861
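If that's the case, you can pin the right key per host in ~/.ssh/config, roughly like this (host, user, and key path are placeholders):
# ~/.ssh/config: use exactly this key for this host
Host XXX.XXX.XXX.XXX
    User lalala
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes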
I need to transfer one file from an old host with no SSH access to a new host on which I do have SSH access. I'm having a hard time figuring this out; looking for a simple answer if there is one. I'm also trying to avoid the slow upload times from my local machine, hence the need for a server-to-server transfer.
Are you able to use FTP? You could use that to transfer the files.
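For example, if the old host offers FTP, you could pull the file from an SSH session on the new host using wget (hypothetical credentials and path):
# run on the new host; wget fetches directly from the old host over FTP
wget 'ftp://user:password@old-host.example.com/path/to/file.zip'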
If you have the URL of the file you want to move from your old host, you can use the wget command in your SSH terminal. You can use this for any file extension, or folders if you want.
For example, if you want to move http://www.yourhost.com/file.zip to your new host, you would SSH in, cd to the folder you want to move this to, and type:
wget http://www.yourhost.com/file.zip
I have got a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may create issues if the connection is broken.
SFTP will have the same issue as well.
Do you have your own computer available through a public internet IP as well?
In that case you may try to set up a simple HTTP server (if you have Windows, just set up IIS) and then use some download manager on the dedicated server (depending on its OS) to download the file over HTTP (it can use multiple streams for that), or do this through a torrent.
There are trackers, like http://openbittorrent.com/, which will allow you to keep the file on your computer and then use some torrent client to upload the file to the dedicated server.
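If your computer runs Linux or macOS rather than Windows, a quick stand-in for IIS is Python's built-in HTTP server (a sketch; the port and file name are assumptions):
# on your computer, in the directory containing the 4 GB file:
python3 -m http.server 8000
# then on the dedicated server (resumable if the connection drops):
wget -c http://YOUR.PUBLIC.IP:8000/bigfile.dat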
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux, Windows, macOS, and DOS:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download