how to read variables from a running script in a different script (PowerShell)

I want a script to get a variable from a different script while both are running on the system. Is that possible?
I have two scripts running at the same time, and I want one script to pull a user-defined variable from the other instead of asking the user to input the same data twice.

Assuming both scripts are running concurrently, MSMQ would be one option:
What's the best way to pass values to a running/background script block?
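That linked approach boils down to one script writing messages to a queue and the other reading them. A minimal sketch, assuming the MSMQ Windows feature is installed (the queue name and payload are illustrative):

# Both scripts reference the same private local queue
Add-Type -AssemblyName System.Messaging
$queuePath = '.\private$\scriptvars'
if (-not [System.Messaging.MessageQueue]::Exists($queuePath)) {
    [void][System.Messaging.MessageQueue]::Create($queuePath)
}
$queue = New-Object System.Messaging.MessageQueue $queuePath
$queue.Formatter = New-Object System.Messaging.XmlMessageFormatter -ArgumentList (,[string[]]('System.String'))

# Script A: publish the user-supplied value
$queue.Send('value entered by the user', 'UserInput')

# Script B: block until a message arrives, then read its body
$value = $queue.Receive().Body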
Global variables are only "global" in the context of scopes in the current session. They aren't visible to a script running in a different session.

Write it to a text file and have the other script read it. You can load up the text into a variable, then execute it as code using iex (Invoke-Expression).
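A minimal sketch of that file handoff, with illustrative paths and variable names:

# Script A: persist the user's value as PowerShell source
$UserPath = Read-Host 'Enter the path'
"`$UserPath = '$UserPath'" | Set-Content C:\temp\shared-vars.ps1

# Script B: load the text into a variable and execute it as code
$code = Get-Content C:\temp\shared-vars.ps1 -Raw
Invoke-Expression $code    # $UserPath now exists in this session

Note that Invoke-Expression runs whatever is in the file, so only do this with a file location other users can't write to.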

Related

how to start only one 3dsmax instance at a time

Whenever a command like "3dsmax -silent -U PythonHost F:/code/3dmax/dsmax_snapshot.py" is executed, a new instance is started. I want each invocation to run the script in the already-open instance instead.
The simplest way is to call registerOLEInterface to expose python.ExecuteFile (and/or python.Execute) from the script you pass on the command line; then you can use win32com to execute Python commands in that running max instance. For this to work, you have to first register the OLE server. Once the keys are added to the registry, you can expose the needed functions and call them from outside of max.
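On the calling side, the win32com half might look like this. It is only a sketch: the ProgID "Max.Application" and the ExecuteFile method name are assumptions that must match whatever you registered and exposed with registerOLEInterface inside max:

# Drive the already-running 3ds Max instance instead of spawning a new one
import win32com.client  # pywin32

mx = win32com.client.Dispatch("Max.Application")    # ProgID assumed; use your registered name
mx.ExecuteFile(r"F:/code/3dmax/dsmax_snapshot.py")  # function exposed via registerOLEInterface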

run command after finishing protobuf_generate

I'm trying to patch PROTOBUF_FINAL out of generated protobuf files. To do this I've created a simple bash script, but the problem I'm now facing is that I haven't been able to get it to run after protobuf_generate itself: it runs only before it, or not at all.
Internally, protobuf_generate seems to run add_custom_command for each file provided to it, and after all of that it sets an output variable containing all the generated files (_generated_srcs_all). I tried making a custom command depend on that variable, but it simply never runs.
Did you try depending on the generated files list stored in that variable?
Note: assuming you are talking about the protobuf_generate function:
https://github.com/protocolbuffers/protobuf/blob/ee35b2da4b6d633eadcc105e5232319b47b494a7/cmake/protobuf-config.cmake.in#L141
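A sketch of that dependency wiring, with illustrative target and script names (OUT_VAR is the protobuf_generate parameter that receives the generated-files list):

# Generate the sources and capture the list of generated files
protobuf_generate(
  LANGUAGE cpp
  OUT_VAR PROTO_SRCS
  PROTOS myapi.proto)

# The patch step depends on the generated files, so it runs after them
add_custom_command(
  OUTPUT patched.stamp
  COMMAND bash ${CMAKE_CURRENT_SOURCE_DIR}/strip_final.sh ${PROTO_SRCS}
  COMMAND ${CMAKE_COMMAND} -E touch patched.stamp
  DEPENDS ${PROTO_SRCS})

# Anything consuming the sources should also depend on the patch step
add_custom_target(patch_protos ALL DEPENDS patched.stamp)
add_library(myproto ${PROTO_SRCS})
add_dependencies(myproto patch_protos)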

Setting a variable from a shell script in Pentaho Kettle that can be accessed by further jobs

I want to know how I can set a variable from the shell job available in Pentaho Kettle so that it can be accessed by further jobs (Simple evaluation) in the workflow.
I am trying to create a workflow with a Start element that triggers a shell job to check for a folder's presence; if the folder is present, it sets a variable. The next job is a Simple evaluation that checks whether the variable (set by the shell job) is true, and then either proceeds with the workflow or terminates it.
Start-->ShellJob(check folder created and set variable)-->SimpleEvaluation Job.
--MIK
Good question. I'm not aware of such a capability, as the "Execute a shell script..." step isn't designed to be a data pipeline. Furthermore, what values should/can a script return to you? Is it the result of an echo? A shell script could be essentially anything. I would say there's a reason why there is no built-in functionality for that in PDI.
Having said that, what you could do is something like this:
Execute a script, and at the end of it write the variables into a text file on the file system
Create a sub-transformation that reads the variables from the file written by the shell script step, and then stores them in globally scoped variables
Evaluate the variables in the job
It may seem a bit cumbersome, but it should do the job for you, since you're asking to use the Shell Script step in a way it's not really designed to be used.
Here's an example of a high-level implementation; the sub-transformation itself should be very simple.
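For the shell script step, the tail of the script could write the result like this (the folder, file path, and variable name are all illustrative):

#!/bin/sh
# Record the folder check in a properties file for the sub-transformation to read
if [ -d /data/incoming ]; then
  echo "FOLDER_PRESENT=Y" > /tmp/job_vars.properties
else
  echo "FOLDER_PRESENT=N" > /tmp/job_vars.properties
fi

The sub-transformation then just needs a step that reads /tmp/job_vars.properties and a Set Variables step that promotes the value to job scope.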
I hope it helps.

MS Access: doing repetitive processes with VBA/SQL

I have an Access database backend that contains three tables, and I have distributed the front end to several users. This is a very simple database with minimal functionality. I need to import certain rows from a file every hour into one of the tables in the database. I would like to know the best way to automate this process so that it runs hourly, essentially as a service in the background. Can you tell me how you would do this?
You could have for example:
an MS Access file with all necessary code to run the import procedure
a BAT file containing the command line(s) that will run this MS Access file with all requested parameters; check the MS Access command-line parameters to see the available options (an example command line follows this list)
task-scheduler software to launch the BAT file: depending on the task scheduler and the command line to be sent, you could even avoid the BAT file step
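For example, the scheduled command line could be as simple as this (paths and macro name are illustrative; /x runs the named macro on startup):

rem Illustrative paths; runs the RunImport macro in the import database
"C:\Program Files\Microsoft Office\root\Office16\MSACCESS.EXE" "C:\jobs\import.accdb" /x RunImport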
If all you want to do is run some queries, I would not do this by automating all of Access, but instead by writing a VBScript that uses DAO to execute the SQL directly. That's a much more efficient way to do it, and will run without a console logon (which may or may not be required for full Access to be run by the task scheduler).
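A minimal sketch of such a VBScript, with illustrative paths, table name, and SQL:

' hourly_import.vbs - scheduled hourly; runs the SQL without opening Access
Dim dbe, db
Set dbe = CreateObject("DAO.DBEngine.120")  ' ACE engine; use DAO.DBEngine.36 for old Jet MDBs
Set db = dbe.OpenDatabase("C:\data\backend.accdb")
' Append rows from a delimited text file via the Text ISAM
db.Execute "INSERT INTO tblImports SELECT * FROM [Text;HDR=Yes;DATABASE=C:\data\drop].[latest#csv]", 128  ' 128 = dbFailOnError
db.Close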

Is it possible to set PowerShell profile code to run after every script?

I have set up a PowerShell profile to run at PowerShell startup.
Is it possible to configure PowerShell to call the profile after every .ps1 script so I don't have to call . $profile at the end of each script?
I don't know of a way to do what you're asking without hacking at PowerShell internals (and I'm not even sure it's possible then). I would just do as you suggest: put this stuff in a separate script and then create a simple alias for it, like a or s, e.g.:
New-Alias s c:\users\john\bin\reset.ps1
You could put it in your prompt function to reload after every command:
function prompt
{
    # Re-run the profile before each prompt is drawn
    . $profile
    # Return the usual prompt string so any dot-source output doesn't become the prompt
    "PS $($executionContext.SessionState.Path.CurrentLocation)> "
}
I don't know of any event you can hook into that would fire after a script is run.