Block remote CVS users from physically logging in to repo server - ssh

Our users run all their CVS activity over SSH. Is it possible to block them from logging in to the CVS server interactively? For example, they use the following command to check out files:
cvs -z3 -d:ext:username@cvsserver:/cvsroot/projects checkout -d project1 .
Since they are part of the group that has read/write permissions on the CVS repository file system, they could SSH in to the repo server and delete the physical files. Can I block them? I have tried DenyUsers in sshd_config and setting a nologin shell in /etc/passwd, but both of those block the CVS commands too.
Thanks

Yes, it is possible. I'm using a custom shell that executes cvs only.
Here's the code:
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Restricted login shell: whatever command the SSH client asked for,
   always exec "cvs server", so users can speak the CVS protocol but
   never get an interactive shell. */
int main(int argc, char** argv)
{
    /* Provide a default repository root if the client did not set one. */
    if (getenv("CVSROOT") == NULL)
        setenv("CVSROOT", "/home/cvs", 0);
    execl("/usr/bin/cvs", "cvs", "server", (char *)NULL);
    /* execl only returns on failure. */
    printf("No permission.\n");
    return 1;
}
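To put it to use, you might compile the wrapper and make it the user's login shell; a sketch, assuming the source file name and install path below:
# compile and register the wrapper as a valid login shell
gcc -o /usr/local/bin/cvs-shell cvs-shell.c
echo /usr/local/bin/cvs-shell >> /etc/shells
# assign it to the CVS user
usermod -s /usr/local/bin/cvs-shell username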

You could install rssh and set this as the user's login shell.
http://www.pizzashack.org/rssh/
rssh has a list of commands (including cvs) that you can restrict users to. If they try to ssh directly into the machine they will be denied and receive a message like this:
This account is restricted by rssh.
Allowed commands: cvs
If you believe this is in error, please contact your system administrator.
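For reference, a minimal sketch of the setup, assuming the usual config location and a placeholder user name:
# /etc/rssh.conf: allow only the cvs command
allowcvs
# make rssh the user's login shell
usermod -s /usr/bin/rssh username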

Related

SSH with command Bat file [duplicate]

I have a scenario where I need to run a Linux shell command frequently (with different filenames) from Windows. I am using PuTTY and WinSCP to do that (which requires a login name and password). The file is copied to a predefined folder on the Linux machine through WinSCP, and then the command is run from PuTTY. Is there a way I can automate this through a program? Ideally I would like to right-click the file in Windows and issue the command, which would copy the file to the remote machine and run the predefined command (in PuTTY) with the filename as an argument.
PuTTY usually comes with the "plink" utility.
This is essentially the "ssh" command line tool implemented as a Windows .exe.
It is pretty well documented in the PuTTY manual under "Using the command line tool plink".
You just need to wrap a command like:
plink root@myserver /etc/backups/do-backup.sh
in a .bat script.
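For the right-click workflow from the question, the .bat file might combine pscp (for the copy) and plink (for the command); a sketch, assuming key-based authentication and these placeholder server and script names:
@echo off
rem upload-and-run.bat: copy the file passed as %1 to the server,
rem then run a predefined command on it
pscp "%~1" username@myserver:/tmp/
plink username@myserver /etc/backups/process.sh "/tmp/%~nx1"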
You can also use common shell constructs, like semicolons, to execute multiple commands. E.g.:
plink read@myhost ls -lrt /home/read/files;/etc/backups/do-backup.sh
There could be security issues with common methods for auto-login.
One of the easiest ways is documented below:
Running Putty from the Windows Command Line
And as for the part that executes the command:
In the PuTTY UI, under Connection > SSH, there's a field for a remote command.
4.17 The SSH panel
The SSH panel allows you to configure options that only apply to SSH sessions.
4.17.1 Executing a specific command on the server
In SSH, you don't have to run a general shell session on the server. Instead, you can choose to run a single specific command (such as a mail user agent, for example). If you want to do this, enter the command in the "Remote command" box.
http://the.earth.li/~sgtatham/putty/0.53/htmldoc/Chapter4.html
In short, your answer might end up looking much like the one linked below:
let Putty run command in remote server
You can write a TCL script and establish SSH session to that Linux machine and issue commands automatically. Check http://wiki.tcl.tk/11542 for a short tutorial.
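If you go the Tcl route, Expect (a Tcl extension) is the usual tool for this; a minimal sketch, assuming password authentication and these placeholder credentials:
#!/usr/bin/expect -f
# log in over ssh, run one command, then exit
# (host, user, password and prompt are assumptions)
spawn ssh username@example.com
expect "password:"
send "yourpassword\r"
expect "$ "
send "/etc/backups/do-backup.sh\r"
expect "$ "
send "exit\r"
expect eof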
You can create a PuTTY session that automatically runs the script on the server when the session starts:
putty -load "sessionName"
In the session's "Remote command" field, point to the remote script.
You can do both tasks (the upload and the command execution) using WinSCP. Use a WinSCP script like:
option batch abort
option confirm off
open your_session
put %1%
call script.sh
exit
Reference for the call command:
https://winscp.net/eng/docs/scriptcommand_call
Reference for the %1% syntax:
https://winscp.net/eng/docs/scripting#syntax
You can then run the script like:
winscp.exe /console /script=script_path\upload.txt /parameter file_to_upload.dat
Actually, you can put a shortcut to the above command into Windows Explorer's Send To menu, so that you can then just right-click any file and go to Send To > Upload using WinSCP and Execute Remote Command (the name of the shortcut).
For that, go to the folder %USERPROFILE%\SendTo and create a shortcut with the following target:
winscp_path\winscp.exe /console /script=script_path\upload.txt /parameter %1
See Creating entry in Explorer's "Send To" menu.
Here is a totally out of the box solution.
Install AutoHotKey (ahk)
Map the script to a key (e.g. F9)
In the ahk script,
a) FTP the command (.ksh) file to the Linux machine
b) Use plink as below. plink should be installed if you have PuTTY.
plink sessionname -l username -pw password test.ksh
or
plink -ssh example.com -l username -pw password test.ksh
All the steps will be performed in sequence whenever you press F9 in Windows.
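A minimal AutoHotkey sketch of those steps (host, credentials, and file names are assumptions; pscp is used here for the upload step instead of FTP):
; F9: upload the script, then execute it remotely via plink
F9::
Run, pscp test.ksh username@example.com:/tmp/
Run, plink -ssh example.com -l username -pw password /tmp/test.ksh
return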
Code:
using System;
using System.Diagnostics;

namespace playSound
{
    class Program
    {
        public static void Main(string[] args)
        {
            Console.WriteLine(args[0]);
            // launch plink against the host given on the command line;
            // commands.txt holds the commands to run remotely
            Process amixerMediaProcess = new Process();
            amixerMediaProcess.StartInfo.CreateNoWindow = false;
            amixerMediaProcess.StartInfo.UseShellExecute = false;
            amixerMediaProcess.StartInfo.ErrorDialog = false;
            amixerMediaProcess.StartInfo.RedirectStandardOutput = false;
            amixerMediaProcess.StartInfo.RedirectStandardInput = false;
            amixerMediaProcess.StartInfo.RedirectStandardError = false;
            amixerMediaProcess.EnableRaisingEvents = true;
            amixerMediaProcess.StartInfo.Arguments = string.Format("{0}", "-ssh username@" + args[0] + " -pw password -m commands.txt");
            amixerMediaProcess.StartInfo.FileName = "plink.exe";
            amixerMediaProcess.Start();
            Console.Write("Press any key to continue . . . ");
            Console.ReadKey(true);
        }
    }
}
Sample commands.txt:
ps
Link: https://huseyincakir.wordpress.com/2015/08/27/send-commands-to-a-remote-device-over-puttyssh-putty-send-command-from-command-line/
Try MtPutty; you can automate the SSH login with it. It's a great tool, especially if you need to log in to multiple servers many times.
Another tool worth trying is TeraTerm. It's really easy to use for SSH automation. But my favorite one is always MtPutty.
If you are using key-based authentication, using a saved PuTTY session seems to work great, for example to run a shell script on a remote server (in my case, an EC2 instance). The saved configuration takes care of authentication:
C:\Users> plink saved_putty_session_name path_to_shell_file/filename.sh
Please note that if you save your session with a name like user@hostname, this command will not work, as it will be treated as part of the remote command.

Using "runas" command for a program that writes a file (Windows Server 2008)

I wrote this simple console program (writeTxt.exe):
#include "stdafx.h"
using namespace std;
int _tmain(int argc, _TCHAR* argv[])
{
char *fileName = "test.txt";
ofstream outStream(fileName, ios::out);
outStream << "This is a very simple test." << endl;
return 0;
}
Then I ran it in a console on Windows Server 2008 using the runas command (logged in as a different user from User1):
runas /user:User1 writeTxt.exe
But the program doesn't produce the file test.txt. If I log in to the server as User1 and run the program, it works correctly. Do I have to set something for this to run correctly?
I believe that runas always launches programs with their working directory set to C:\Windows\System32 (or moral equivalent) rather than whatever your current working directory is when you invoke runas.
If User1 has permissions to write to that directory, that's where the file will be. If they don't have such a permission, then the program will fail.
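One way around it, then, is to have the launched command set the working directory itself; a sketch (the directory is an assumption):
runas /user:User1 "cmd /c cd /d C:\Users\Me\work && writeTxt.exe"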

Teradata client on Unix Solaris

I deploy some .bteq and .sql scripts on a Teradata database. For doing this, I use a client on my desktop called BTEQWin, version 13.10.0.03.
I get the .bteq/.sql files from a version control system like PVCS/SVN, and once the files are in my workspace folder, all I do is drag and drop them from the Windows browser into the BTEQWin client (which I connect to a database before the drag and drop, for running those scripts).
Now, I have to automate this whole process in UNIX.
I have written a shell (ksh/bash) script which gets all the .bteq/.sql files from a tag/label in the version control tool into a given UNIX folder. Now all I need to do is pass these files one by one (I'll take care of the order) to a Teradata client.
My question: what client do I need to ask the Unix admin team to install on the Unix server, so that I can run something like the below?
someTeraDataCommand -u username -p password -h hostname -d database -f filenametoexecute | tee output_filename.log
Here, someTeraDataCommand is the client/executable which will let me run Teradata scripts (like I was doing with BTEQWin on my desktop in a GUI session). The other parameters would be the username, the password, which database to connect to on which server, and which file to run, or the file could be fed to the command with the "<" operator on the command line.
Any idea what client?
Assuming the complete Teradata Tools and Utilities package is installed on your UNIX server (which will have the connectivity tools to talk to Teradata), you should have access to bteq from the command line. Something like this:
bteq < script_file > output_file
Your script file should contain a .LOGON statement to establish the connection:
.LOGON yourTDPID/your_account,your_pw
You might also need to use other commands to set your default database or non-default session values.
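For example, a sketch of a script file with placeholder names (the database and query are assumptions):
.LOGON yourTDPID/your_account,your_pw;
DATABASE your_database;
SELECT CURRENT_DATE;
.QUIT 0;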
Another option would be to combine the SQL and call to BTEQ in a Korn shell script:
#!/usr/bin/ksh
##############
SHELL_NAME=`basename $0`
PRG_NAME=`basename ${SHELL_NAME} .ksh`
LOG_FILE=${PRG_NAME}.log
OUT_FILE=${PRG_NAME}.out
#
bteq <<EOBTQ > ${LOG_FILE} 2>&1
.LOGON {TDPID}/{USERID},{PWD};
--.RUN file=${LOGON}
/* Add your SQL/BTEQ commands here */
.QUIT 0;
EOBTQ
Edit
The double hyphen indicates a single-line comment. Typically you do not leave your password in the plain text of a ksh script. Instead, the .RUN command would reference a text file, kept in a suitably secured location, containing the .LOGON {TDPID}/{USERID},{PWD}; command.
The .RUN command in BTEQ allows you to reference another text file containing a series of valid BTEQ commands that you want to run in the current BTEQ script.
The easiest way to set up the Solaris TTU is to request root (sudo) access and run an interactive installation with the defaults, as root. That should cure all client issues.

ssh scripting and copying files

I am writing a BASH deployment script on RH 5. The script runs great and sends out an email at the end of the run. However, if I detect any failure at the end of the script, I need to copy log files back to the local server to attach to the email.
The script can detect failures fine; how do I copy the log files back? I don't want to just cat the log files, as they can be huge.
Any suggestions?
Thanks
S
If I understand your problem correctly, you should use scp:
http://linux.die.net/man/1/scp
and here you can find how to automate the login so you can use it in a script
http://linuxproblem.org/art_9.html
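With key-based login in place, the copy step in the failure branch of your script might look like this sketch (host and paths are assumptions):
# pull the remote log back so it can be attached to the email
scp deployuser@remoteserver:/var/log/deploy.log /tmp/deploy.log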
I can't see any easy way of avoiding a second login with scp/sftp. If you're sure that it's only the log file that will be returned, you could do something like the following:
ssh -e none REMOTE SCRIPT | gzip -dc > LOGFILE
Inside SCRIPT you have something like gzip -c LOGFILE when it fails.
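A sketch of what that remote SCRIPT might contain (the deploy command and log path are assumptions):
#!/bin/sh
# run the deployment; on failure, stream the log to stdout gzip-compressed
# so the calling ssh pipeline can capture it
if ! /opt/deploy/run_deploy.sh; then
    gzip -c /var/log/deploy.log
    exit 1
fi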

Copy files from remote server to local, ignoring existing files (rsync not available)

I would like to copy a directory of files from a remote server. As it is a large number of files, the option of ignoring existing files on the destination server is desirable.
Unfortunately, rsync is not available for some reason (the remote server is from a CDN service, and beyond my control).
So I think I am stuck using scp -r on the folder in question.
Is there any way of doing this while ignoring existing files?
thanks
You could also create a *.tar.gz or *.tar.bz2 archive, scp it, and then unpack it. I don't know if scp -r uses any compression. If not, compressing everything first might, potentially, make it faster.
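With GNU tar on both ends, you can even stream the archive over ssh without a temporary file and skip files that already exist locally; a sketch (paths are assumptions, and --skip-old-files is a GNU tar option):
# pack the remote directory, stream it over ssh, unpack locally,
# silently skipping any file that already exists
ssh user@remote 'tar -C /remote/dir -czf - .' | tar -C /local/dir --skip-old-files -xzf -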
It's easy to write a script in Perl to do that using the module Net::SFTP::Foreign:
#!/usr/bin/perl

use Net::SFTP::Foreign;

my $sftp = Net::SFTP::Foreign->new('user@host');
$sftp->die_on_error;

# recursively fetch the remote tree, resuming partial transfers and
# reporting (but not aborting on) per-file errors
$sftp->rget('/remote/path', '/local/path',
            resume => 'auto',
            on_error => sub { my ($sftp, $e) = @_;
                              warn "error processing $e->{filename}: "
                                   . $sftp->error;
                            });
scp needs a file to be writable in order to replace it.
So, for the files you do not want replaced, remove the write permission on the local copies, then run your scp for all files.
https://unix.stackexchange.com/a/51932/284063
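A sketch of that approach (the local destination path is an assumption):
# make existing local copies read-only so scp cannot overwrite them
find /local/path -type f -exec chmod a-w {} +
# copy everything; scp will report errors for the protected files and move on
scp -r user@remote:/remote/path/. /local/path/
Afterwards you can restore write permission with chmod u+w where needed.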