Pausing sed output - scripting

I have file names that map to directories.
For example:
test ---> /to/path/test/program.c
I have a line that formats the output of sed into this currently:
test0
test1
test3
These are all unique directories. I now need to add the leading path and copy their respective .c files. Is there a way to pause the output of sed while I carry out these processes?
Please and thank you.

Send sed a SIGTSTP (such as is produced by pressing CTRL-Z) to pause it. Send a SIGCONT (the fg command in Bash) to continue it.
Or you could just let it run, and do your processing afterwards...
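For the second suggestion, a minimal sketch might look like this; your_sed_command, names.txt and backup/ are placeholders I'm assuming, since the actual sed command isn't shown in the question:
your_sed_command > names.txt               # whatever currently produces test0, test1, test3
while IFS= read -r name; do
    cp "/to/path/$name/program.c" "backup/$name-program.c"   # backup/ is an assumed destination
done < names.txt
Reading the names from a file (or straight from the pipe with a while read loop) means sed can simply run to completion and you never have to pause it.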


Using sed over ssh to add item to list

I need to modify a Python file on a remote server, and I'm stuck formatting a sed command inside an ssh.
The file to be modified has this line
my_list = ["item1"]
and I need to change it to include another item:
my_list = ["item1", "item2"]
Here's what I have:
ssh user@host 'sed -i \'s/my_list = \[\\"item1\\"]/my_list = \[\\"item1\\", \\"item2\\"]/\' path/to/file'
The number of escapes required for the quotes and opening brackets is throwing me off, since it's all inside an ssh command.
I'd appreciate a hand if anyone can help!
You can't nest single quotes, and you can't escape single quotes inside single quotes. The simplest solution by far in this particular case is to just quote less; there is nothing in sed or -i which requires quoting. But because both your local shell and the remote shell process the command line, you need two layers of quoting.
ssh user@host sed -i "'s/my_list = \\[\"item1\"]/my_list = [\"item1\", \"item2\"]/'" path/to/file
Perhaps notice also that the replacement string is just a string, so there is no need to escape the [ there.
For debugging these things, try
ssh user@host printf '%s\\n' sed -i "'s/my_list = \\[\"item1\"]/my_list = [\"item1\", \"item2\"]/'" path/to/file
to see the command line split up into one token per line on the remote host.
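Another way to sidestep the nested quoting entirely, if a POSIX shell is available on the remote side (an assumption), is to ship the command in a quoted here-document so the sed script only needs its normal single layer of quoting; a rough sketch:
ssh user@host 'sh -s' <<'EOF'
sed -i 's/my_list = \["item1"\]/my_list = ["item1", "item2"]/' path/to/file
EOF
Because the here-document delimiter is quoted ('EOF'), the local shell passes the body through untouched and only the remote shell interprets it.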
Fundamentally, you should probably change the remote Python script to read its input in a standard format like JSON or YAML. Programs which write programs are a powerful tool, but unsophisticated programs which modify existing programs are often going to end up brittle and hard to debug.

How to Input Redirect Two Files to Standard Input?

Is it possible to redirect two or more files to standard input in one command? For example
$ myProgram < file1 < file2
I tried that command; however, it seemed like the OS was only taking the first file and ignoring the other...
If not, how can I achieve that?
NOTE: concatenating the two files will not help in my case.
When you do this from bash, it isn't a way of feeding multiple files to standard input; what bash offers instead is called process substitution.
For each <(...) substitution, the output is exposed as a file-descriptor path under /dev/fd/<n>, which the program receives as an ordinary argument.
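For instance, a sketch using the names from the question (whether this helps depends on whether myProgram will accept file names as arguments rather than reading standard input):
myProgram <(cat file1) <(cat file2)    # myProgram sees two /dev/fd/<n> paths
If myProgram strictly reads standard input and you genuinely need both streams kept separate, process substitution is usually the closest bash can get, since standard input is a single stream.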

Accurev: How to keep/promote with a multi line comment from the command line?

How to keep/promote with a multi line comment from the accurev command line?
For example if I try:
accurev stat -n -fl | xargs accurev keep -c "git log 1234..4311"
I simply get the error:
You can not use non-printable characters on the command line: # On
branch master\x0a... AccuRev was unable to understand your command.
I can of course strip out the new lines but then the comment is not really useful.
For AccuRev commands that take a -c option, the comment must currently be enclosed in quotes and contain no line breaks.
As for the output from git log 1234..4311, that could be captured in a manifest file and kept along with the other files.
Dave
I'm not sure about doing it directly from the command-line without any extra step, and I'm hesitant to try anything on my client's AccuRev setup. That said, according to the entry on accurev keep from the CLI manual:
-c <comment>
Specify a comment for the transaction. The next command-line argument should be
a quoted string. Alternatively, the next argument can be in the form
#<comment-file>, which uses the contents of text-file <comment-file> as the
comment.
Default: enter a comment interactively, using the text editor named in
environment variable EDITOR (or a system-dependent default editor).
Reading this, I see two ways you can do what you want from the command line (meaning, not using the GUI).
1.) Pipe or cat your comment text into a file, then use the #<comment-file> syntax to get it into your keep (see the sketch after this list).
2.) Get your comment text into your clipboard, then don't give an argument to the keep command; let your editor open up, paste, save, and close.
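A rough sketch of option 1, using the commands from the question; the temp-file path is my own placeholder, and the "#<comment-file>" form is the one quoted from the manual above:
git log 1234..4311 > /tmp/keep-comment.txt
accurev stat -n -fl | xargs accurev keep -c "#/tmp/keep-comment.txt"
The quotes around #/tmp/keep-comment.txt matter, since an unquoted # would start a shell comment.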
There may be a way to get this all done via CLI without these middle-steps (perhaps you need to format the \x0a into \r\n or something?), but as I said, I'm unwilling to try it on my AccuRev setup as AccuRev gives me (and everyone else) enough trouble as it is.
HTH

ssh tail output lines only with keyword

I'm trying to tail a large file in an ssh command prompt, but I need to filter it so it only displays lines that contain a particular keyword in them.
I'm currently using this command to tail it:
# tail /usr/local/apache/logs/access_log
If possible please let me know what I would add to this command to accomplish this.
You can pipe the output of tail into grep. To filter it so it only displays lines that contain a particular keyword, you could do:
tail /usr/local/apache/logs/access_log | grep "keyword"
where you'd replace keyword with your keyword.
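If you also want to keep watching the log as new entries arrive (an assumption about your use case), the same pipe works with follow mode; --line-buffered stops GNU grep from holding matches in its output buffer while the pipe stays open:
tail -f /usr/local/apache/logs/access_log | grep --line-buffered "keyword"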

add prefix to files with rename - error argument too long

I have thousands of files inside a directory that I need to rename by adding a prefix like "th_", so that the files will be named like th_65461516846.jpg,
but I can't due to the error "argument too long".
I have used this command
rename 's/^/th_/' *
thanks!
The xargs program is used to break a long argument list into multiple command invocations, to avoid exceeding the system's limit on command-line length. In your case, you'd use:
ls | xargs rename 's/^/th_/'
This repeatedly executes rename with a portion of the output of ls until the list of files is exhausted. Do be aware that this idiom needs special attention if the file names contain spaces or other unusual characters (which I'm assuming isn't the case, based on your example).
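If there is any chance of such characters in the names, a null-delimited variant is safer; this sketch assumes GNU xargs and bash, whose printf builtin is not subject to the kernel's argument-length limit:
printf '%s\0' * | xargs -0 rename 's/^/th_/'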
This one did the job:
for f in *; do mv "$f" "th_$f"; done
or, equivalently:
for f in * ; do mv "$f" "th_${f}" ; done
Both simply prepend th_ to each name; looping avoids the "argument too long" error because each mv handles a single file instead of the whole expansion at once.