How can I ask the user to enter a value during the execution of a GnuPlot script?
I would like to use stdin to initialize some plotting parameters.
I have tried calling:
a=system("read")
and
pause mouse keypressed
without success.
Any help is welcome. Thanks.
I assume you are on Linux. Your shell command must write something to stdout which gnuplot can read from stdin. Something like this should work:
print "Pleaser enter a number: "
a = system("read a; echo $a")
plot a*sin(x)
pause mouse close
An alternative would be to write a script in bash, python, etc. that reads the user input and calls gnuplot afterwards.
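For example, a minimal bash wrapper could look like this (the prompt text and the parameter name a are just placeholders; the value is handed to gnuplot via -e):
#!/bin/bash
# Ask the user for a value before gnuplot starts
read -p "Please enter a number: " value
# Pass the value into an inline gnuplot script
gnuplot -e "a=${value}; plot a*sin(x); pause mouse close"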
I have a data file a.dat that is updated every few seconds. I wish to plot it in gnuplot every few seconds to see the changes:
plot "a.dat"
What is the easiest way to do it? Thanks.
Make a script with a loop:
while (1) {
plot "a.dat"
pause 1 # waiting time in seconds
}
Execute it with gnuplot script.gp.
For purposes of code structure and debugging, you might prefer the following alternative:
plot "a.dat"
while (1) {
replot
pause 1
}
This has the advantage that you do not have to put a complicated plot command inside the loop, and you do not suffer from incorrect line numbers for the plot command in error messages (which happens in at least some versions of gnuplot).
Finally, if your Gnuplot is so old that it does not yet support loops, there is the alternative:
plot "a.dat"
pause 1
reread
The reread command makes the script interpreter jump back to the beginning of the file.
If gnuplot is called with the plot commands on the command line (option -e) instead of from a command script file, only the version
gnuplot -e "...plot command(s)...; while (1) { pause 1; replot; }"
worked in my case, the other version
gnuplot -e "...plot command(s)...; pause 1; reread;"
did not.
On Windows 10, I have to kill the gnuplot task in the Task Manager, because if I close the gnuplot window with the close-window button, the window opens again after one second at the latest. Does anybody have an idea of how to handle this in a more comfortable way?
I'm working with Torch7 and the Lua programming language. I need a command that redirects the output of my console to a file, instead of printing it to my shell.
For example, in Linux, when you type:
$ ls > dir.txt
The system will print the output of the command "ls" to the file dir.txt, instead of printing it to the default output console.
I need a similar command for Lua. Does anyone know it?
[EDIT] A user suggested to me that this operation is called piping. So the question should be: "How to do piping in Lua?"
[EDIT2] I would like to use a # command to do this:
$ torch 'my_program' # printed_output.txt
Have a look here -> http://www.lua.org/pil/21.1.html
io.write seems to be what you are looking for.
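If you only need everything your own code writes to end up in a file, a minimal sketch is to make a file the default output target and use io.write (the file name printed_output.txt is just an example):
-- Make a file the default output target
io.output("printed_output.txt")
-- Everything written with io.write now goes to the file, not the console
io.write("this line ends up in printed_output.txt\n")
-- Close the file and restore the console as the default output
io.output():close()
io.output(io.stdout)
Note that print always writes directly to stdout, so lines you want in the file have to go through io.write; alternatively, redirect the whole process from the shell with something like torch my_program > printed_output.txt.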
Lua has no default function to create a file from the console output.
If your application logs its output (which is probably what you are trying to do), this is only possible by modifying the Lua C source code.
If your internal system has access to the output of the console, you could do something similar to this (and set it on a timer, so it runs every 25ms or so):
dumpoutput = function()
    -- io.open (not io.write) opens the dump file for writing
    local file = io.open([path to file dump here], "w+")
    for i, line in ipairs([console output function]) do
        file:write("\n" .. line)
    end
    file:close()
end
Note that the console output function has to store the output of the console in a table.
To clear the console at the end, just do os.execute( "cls" ).
I have a script which interacts with the user (prints some questions to stderr and gets input from stdin) and then prints some data to stdout. I want to put the output of the script into a variable in Vimscript. It should probably look like this:
let a = system("./script")
The intended behavior is that the script runs, interacts with the user, and afterwards a is assigned its stdout output. But instead a is assigned both the stdout and the stderr output, so the user sees no prompts.
Could you help me fix it?
Interactive commands are best avoided from within Vim; especially with GVIM (on Windows), a new console window pops up, and you may not have a fully functional terminal, ...
Better to query any needed arguments in Vimscript itself (with input(), or pass them on from a custom Vim :command), and just use the external script non-interactively, feeding it everything it needs, for example:
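A minimal sketch of that approach, assuming ./script can accept the answer as a command-line argument instead of prompting for it:
" Ask the user inside Vim instead of letting the script prompt on stderr
let answer = input('Please enter a value: ')
" Hand the value to the script as an argument; system() then only captures its stdout
let a = system('./script ' . shellescape(answer))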
What gets captured by system() (as well as :!) is controlled by the 'shellredir' option. Its usual value, >%s 2>&1 captures stdout as well as stderr. Your script needs to choose one (e.g. stdout) for its output, and the other for user interaction, and the Vimscript wrapper that invokes it must (temporarily) change the option.
:let save_shellredir = &shellredir
:set shellredir=>
:let a = system('./script') " The script should interact via stderr.
:let &shellredir = save_shellredir
Call the script from within the other one as:
. ./script.sh
I think this is what you meant.
I'm trying to implement a simple terminal GUI using bash's interactive mode. I successfully invoke bash, get its stdout, and print everything to a text view. I forward the user input from the text view to bash's stdin to be able to run commands. It works great, except I don't get any error messages.
However, when I proceeded to print bash's stderr to my text view, I noticed something strange. In addition to now receiving error messages, bash seems to pass everything from stdin to stderr. Because of this, every character I type is printed twice (once normally because I enter it, and once because I print everything from stderr).
It also seems to print the prompt via stderr (bash-3.2$). Is this the expected behavior? Can this be suppressed?
I also tried to just capture the user input (and not let the user type directly into the text view) and rely on bash to print the user input. This almost works, except the order of the output via stdout and stderr is random:
If I type a command like echo test and hit enter, sometimes I get this:
(the second test is the output, I didn't type testtest)
bash-3.2$ echo testtest
bash-3.2$
Sometimes I get:
bash-3.2$ echo test
bash-3.2$ test
The order in which I receive the final \n, the output and the next bash-3.2$ is obviously mixed up.
There is no way to read stdout and stderr in the "correct" order, because there is no notion of ordering between different pipes. But you can ensure that both are sent to the same pipe (i.e. the same file descriptor) instead of having each one go to a separate pipe. To do that, look at the options of whatever you use to start the bash subprocess, or start a command line like bash -c 'bash 2>&1'.
I have a shell script which asks for user input and, depending on the input, opens a db connection using sqlplus and runs some SQL queries like drop table/create table/select/update. Is it possible for the SQL part to run as a background job, so that even if I lose VPN connectivity to the network, all the SQL queries still get executed?
Also, when the SQL part completes and the user is prompted for another input, can the shell script come to the foreground and, after getting the input, go to the background again?
I have found some questions which explain how to run a whole script in the background, but I want to run ONLY some parts of the same script in the background if possible (and come to the foreground for user input). Though I could use multiple scripts to handle it (dividing the script into parts which need to be called in the background and calling them from another script), I would rather do it in a single script if possible.
You can break your main script up into functions / smaller scripts to achieve the desired behavior of a mix of background processes and foreground processes.
For example, in your main script:
#!/bin/sh
echo "Starting script..."
# do some more stuff here, maybe ask the user for input
./run_background_process_1 &
# ask the user for some more input
./run_background_process_2 &
...
Use the & symbol at the end of script calls to denote that they should be run in the background.
(Updated) If you'd like to keep everything in 1 script, use functions to break up / encapsulate the parts of logic that you would like to run in the background. Call these functions by suffixing the call with &, same as above.
You can try the following example to see that it works:
#!/bin/bash
hello() {
    condition="yes"
    while [[ $condition == "yes" ]]
    do
        echo "."
        sleep 1
    done
}
# Script main starts here
echo "Start"
hello &
echo "Finish"
Remove the & after hello and you'll see that it behaves differently.
There are tools which allow you to keep scripts running despite loss of connection. For example, check out GNU Screen (http://www.gnu.org/software/screen/) - one of its features is that programs continue to run when their window is currently not visible, and even when the whole screen session is detached from the user's terminal.
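A minimal example of that workflow (the session name sqljob and the script name are placeholders):
# Start a named screen session and run the script inside it
screen -S sqljob ./myscript.ksh
# Detach with Ctrl-a d; the script keeps running on the server
# After reconnecting, list the existing sessions and reattach
screen -ls
screen -r sqljob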
After searching on the internet, I found out I can use three methods to put the script in the background:
1) using bg: How do I put an already-running process under nohup? But unfortunately, this didn't work for me in the ksh shell.
2) using coprocesses
3) using nohup
I decided to go with nohup as it was easier to implement. I made a separate script out of the sqlplus part which needed to run in the background and called it from the main script using nohup:
nohup script-name.ksh ${parameter1} ${parameter2} &
This worked for me.