How to log Command Line activity? [duplicate] - sql

This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
DOS command to Display result on console and redirect the output to a file
I've tried various Google searches, but nothing seemed to solve my problem.
Basically, I'm working for a company who need me to work with their existing database and extract the various data needed for reports. They are using SQLite (please, I've heard enough comments about how it might not be the best choice for a DB, so leave them out), and I either want all my activity at the Windows command prompt to be logged, or at least everything I do from the SQLite command line to appear in a .txt file, just in case I need to refer back to it later.
Can anybody here explain to me how to do this? I'm a bit of a beginner and need this stuff broken down step by step; I've not done anything like this before.
Cheers!

I'm reasonably certain you can't do this directly -- i.e., the Windows command prompt doesn't provide a way to log the input you provide to it. You can capture outputs (e.g., from commands you run), but for your purposes that's probably not adequate.
You probably need to create a "shell" of your own that takes inputs from the user, logs each one, sends it on to the command prompt, captures the output from the command prompt, and logs that as well.
In an answer to a previous question, I posted some code that handles most of what you need to do. The big difference is that you'll want to look at its handle_output (for example) and, instead of just displaying the captured output on the console, write it to a file as well. As it stands right now, that example redirects the child's standard input to come from a file, but changing it to read from the console instead should be fairly straightforward: you'll basically use a function much like the handle_output and handle_error it already includes, except that instead of displaying output, you'll read input from the user and, for each line, 1) write it to the log, and 2) send it to the child via an anonymous pipe (much as handle_output and handle_error read from anonymous pipes).
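To make that idea more concrete, here is a minimal sketch of such a wrapper in C#, using the .NET Process class rather than the raw anonymous-pipe code from the linked answer. The names sqlite3.exe, mydatabase.db and session_log.txt are placeholders; adjust them to your setup.

using System;
using System.Diagnostics;
using System.IO;

class LoggingShell
{
    static void Main()
    {
        using (var log = new StreamWriter("session_log.txt", append: true))
        {
            // Start the SQLite command line with its standard streams redirected.
            var psi = new ProcessStartInfo("sqlite3.exe", "mydatabase.db")
            {
                UseShellExecute = false,
                RedirectStandardInput = true,
                RedirectStandardOutput = true,
                RedirectStandardError = true
            };
            using (var child = Process.Start(psi))
            {
                // Echo to the console and log whatever the child prints.
                child.OutputDataReceived += (s, e) => { if (e.Data != null) { Console.WriteLine(e.Data); log.WriteLine(e.Data); } };
                child.ErrorDataReceived  += (s, e) => { if (e.Data != null) { Console.Error.WriteLine(e.Data); log.WriteLine(e.Data); } };
                child.BeginOutputReadLine();
                child.BeginErrorReadLine();

                // Read the user's input line by line, log it, and forward it to the child.
                string line;
                while ((line = Console.ReadLine()) != null)
                {
                    log.WriteLine("> " + line);
                    log.Flush();
                    child.StandardInput.WriteLine(line);
                }
                child.StandardInput.Close();
                child.WaitForExit();
            }
        }
    }
}

Run the wrapper instead of sqlite3.exe directly; end the session with Ctrl+Z followed by Enter (end of input), and session_log.txt will then contain both the commands you typed and SQLite's responses.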

Related

Show current command values upon exec in CS:GO

I'm currently working on a lot of my in-game autoexec files, and one config I'd like to create is one that shows the state of each command listed when executed. I cannot for the life of me think of or find the command or set of commands that does this.
Example of what I'm going for here.
I mainly need this for movement videos, because I'd like to show that all of my settings are default upon executing the file.

How to use Variables in Automator

Please bear with me, I haven't been using Automator for long.
I have good experience in PHP (totally different) and some small scripting knowledge (AppleScript, shell, etc.).
I'm trying to replicate this workflow logic in Automator:
Ask User to insert value (set $variable_a)
Ask User to insert one more value (set $variable_b)
Submit
This triggers a script that uses both values submitted above. A dummy example:
echo $variable_a
echo $variable_b
Seems simple, and it's amazing how fast you can set up this logic with Automator.
The problem is, at stage 2 above, my $variable_a is suddenly a mixed value of $variable_a and $variable_b.
Why does this happen?
They do not seem to behave the way I understand variables to work in other languages or programming environments.
In other systems, a variable usually keeps the value it was assigned (unless you use variable variables or modify it deliberately in the code).
I attached an Automator workflow file that replicates exactly the workflow logic described above.
It's a ZIP file; unzip it and open it in Automator for a test.
You will see (in the results section of the last step) how the values become (IMHO) wrong.
Does anyone have a hint?
This happens because the output of one action in the workflow is fed as input into the next action of the workflow. As inputs are received by actions, they can also aggregate in some cases, such as when setting and getting variables.
The reason it does this is so that you can send multiple variables directly into, say, a Run Shell Script action, and reference them using $1, $2, etc. If Automator only ever took the most recent input, you'd never be able to feed more than one variable into a shell script without first combining them into a list yourself.
The solution is simple. Every action has an Options button that you can press, which in turn reveals a checkbox called Ignore this action's input. This needs to be checked for those actions that you want to operate independently of previous results.
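As an illustration of the $1/$2 behaviour described above (assuming your two variables feed into a Run Shell Script action whose "Pass input" option is set to "as arguments"), the script body could look like this:

# Run Shell Script action, "Pass input: as arguments"
# $1 is the first value passed in, $2 the second.
echo "variable_a: $1"
echo "variable_b: $2"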
Here's a screenshot of your workflow with the appropriate checkboxes ticked against the actions that require it:

Process.Start does not always work

I created a protocol generation tool that reads some data from a web source, allows the user to filter some of the fields, and generates a protocol based on the given filter data. The protocol is generated as a Word document that is edited multiple times, on multiple layers, before being shown to the user.
For some users the line:
Process.Start(pathtowordfile)
does not open Word, for others it works fine.
Even stranger: when the users generate the protocol the first time, it opens. If they change one of the filters and generate again, the file does not open. But it is generated correctly; you are able to open it manually.
We are using Windows 7 on all machines and, in general, the users have no administrative privileges on the machine.
Are there any alternatives to
Process.Start()
?
Not sure how the code can sometimes work, but the critical bit I think would be to make sure you set UseShellExecute=true when you are trying to Process.Start a file that is not an executable.
https://msdn.microsoft.com/en-us/library/system.diagnostics.processstartinfo.useshellexecute%28v=vs.110%29.aspx
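For reference, a minimal sketch of the explicit form in C# might look like the following (pathToWordFile stands in for the path your tool already builds):

using System.Diagnostics;

class Launcher
{
    // "pathToWordFile" is a placeholder for the document path the protocol tool generates.
    static void OpenWithAssociatedApp(string pathToWordFile)
    {
        var psi = new ProcessStartInfo
        {
            FileName = pathToWordFile,
            UseShellExecute = true  // let the Windows shell pick the application registered for .doc/.docx
        };
        Process.Start(psi);
    }
}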
I haven't looked into it in quite a while, but last I checked, shell execute relies on the Word application being properly registered with DDE so Windows knows what to do with a .doc(x) file. Word may not be installed "properly".
After trying various things, including Wonko's hint, I decided to use the Interop.Word Application class to display the document. It doesn't explain why Process.Start doesn't do the job, but anyway, now all of the users are happy ^^

PRO*C: Why should the SQL user & password be mentioned in the make file?

I have a weird question and I am trying to trace it back to the root cause.
The scenario is,
I have a bunch of C, C++ & Pro*C code scattered across multiple folders [a huge number of files], and dozens of makefiles. We regularly run the makefiles to create the latest executables when we change something in the code/config/libs.
The problem is:
My SQL user ID and passwords are expiring at regular intervals.
These IDs are used in makefiles and in Pro*C code to connect to the DB. We have to get them reset regularly, and it has become a problem to request this again and again.
The question is,
Why are the SQL credentials failing after getting them reset?
After every reset, the ID works 3 or 4 times, after which it fails again.
In the makefiles, the IDs are used to check the semantics, and Pro*C by default checks for syntactical errors. The exe cannot fail if the IDs are working, so why are the IDs failing? Please advise on how this could be fixed.
Should I request that the DBA change any settings for this ID? Or could there be something in my code that makes the IDs fail by trying wrongly? We copy the IDs into an env variable, which is read by all the EXEs. Could this be causing the problem? Should we take any precautions when using the data from an ENV variable? [The EXEs work 3 times perfectly, after which the SQL ID fails.]
Please advise on what precautions should be taken on my side.

How do you write a ksh script that automatically fills in user input?

Is there any way to automate the user input in a ksh script?
I realize that in many instances you can create an Expect script to do exactly what I am saying, but Expect is not installed on the server and there is no chance of that occurring in the near future. But I still need a method of automating user input so that the only action required is to start the script.
If you have the complete set of "user" input, you can redirect stdin:
script.ksh <userinputfile
If you have some of it, or generate it on the fly, you can use here documents ("heredocs").
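For example, a here document lets the script supply the answers inline. The tool name and answer lines below are placeholders for whatever your script actually prompts for:

#!/bin/ksh
# Feed canned answers to an interactive program's prompts via a here document.
some_interactive_tool <<'EOF'
first answer
second answer
EOF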
If you are going to be parsing prompts, the easiest way, as you mention, is to use Expect. Even if Expect isn't available on the server, it'll be easier for you to include as much Tcl/Expect as necessary to do your parsing than to rewrite and re-debug it.