Any way to auto-modify an action? - photoshop

Is there any way to auto-modify an action in Photoshop? For example, multiple workers use our internal action pack, and I want to be able to modify something without having them replace/reload the actions each time. Is that possible?
Maybe an action that runs a script that runs the actions from a folder? So changing the actions in the folder would change the actions when they run?
I basically found what I need: http://www.tonton-pixel.com/scripts/utility-scripts/play-actions-file-action/index.html
But it opens a small dialog each time asking which action I want to play... is there any way to make it play directly?

You need a script for that. If you still want actions, then an action that runs a script that plays the actions from a folder would work.
Ideally you'd abandon actions altogether and redo them as scripts (there's a script that converts actions to script files); your actions can then call those scripts on a server, so you can modify every step without telling people to reload .atn files (a rough sketch of the idea follows below).
The advantages of scripts are:
as I mentioned, they're easy to modify;
you can have one undo for a whole long script (in actions, 1 undo = 1 step);
it's much easier to trace and fix errors.
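For what it's worth, the same "play actions from a shared file" idea can also be driven from outside Photoshop through its COM automation interface. A rough C# sketch, with heavy caveats: the path and the action/set names are placeholders, DoAction comes from the Photoshop scripting reference, and loading an .atn via Load() is an assumption based on the equivalent ExtendScript app.load() trick.

using System;

class PlaySharedAction
{
    static void Main()
    {
        // Late-bound COM automation; assumes Photoshop is installed on this machine.
        var psType = Type.GetTypeFromProgID("Photoshop.Application");
        dynamic app = Activator.CreateInstance(psType);

        // Load the current action set from the shared folder (assumption: Load()
        // accepts an .atn file, mirroring ExtendScript's app.load()), then play one.
        app.Load(@"\\server\share\InternalPack.atn");    // hypothetical network path
        app.DoAction("Resize For Web", "Internal Pack"); // action name, action set name
    }
}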

Related

Multiple users executing the same workflow

Are there guidelines regarding how to share a Snakemake workflow among multiple users on the same data under Linux, or is the whole thing considered bad practice?
Let me explain in case it's not clear:
Suppose user A executes a workflow in directory dir/. Assume the workflow terminates successfully, and he/she then properly sets file/directory permissions recursively on all output and intermediate files and the .snakemake/ subdirectory for other users to read/write, of course.
User B subsequently navigates to dir/, adds input files to the workflow, then executes it. Can anything go wrong?
TL;DR: I'm asking about non-concurrent execution of the same workflow by distinct users on the same system, and on the same data on disk. Is Snakemake designed for such use cases?
It's possible to run snakemake --nolock, which prevents locking of the directory, so multiple runs can be made from inside the same directory. However, without the lock there's an opening for errors caused by concurrent runs trying to modify the same files. It's probably OK if you are certain that this will be avoided, e.g. if you are in constant communication with the other user about which files will be modified.
An alternative is to create a third directory/path and put all the data there. That way you can each work from separate directories/paths and avoid costly recomputes.
I would say that from the point of view of snakemake, and workflow management in general, it's ok for user B to add or update input files and re-run the pipeline. After all, one of the advantages of a workflow management system is to update results according to new input. The problem is that user A could find her results updated without being aware of it.
Off the top of my head, and without more detail, this is what I would suggest: make snakemake read the list of input files from a table (pandas comes in handy for this) or from some configuration file. Keep this sample sheet under version control (with git/github) together with the Snakefile and other source code.
When users update the working directory with new files, they will also need to update the sample sheet in order for snakemake to "see" the new input, and other users will learn about it via version control. I prefer this setup over dumping files in a directory and letting snakemake process whatever is in there.

How to create a Logic/Script for a Data Extension?

I always implement scripts in a CloudPage or directly in a newsletter, but I've never created a script that runs on its own at a set interval. Is that possible? Maybe every night?
There is a Script Activity that allows you to do that. However, it's for Server-Side JavaScript as opposed to AMPscript. Once you save the script in the Script Activity, you can add it to an automation just like any other activity and execute it at the required interval.
The feature isn't typically on by default, so you will likely need to ask support to enable it. You should then see it listed as an option alongside the other activities.

TFS Api - trigger test run conditionally (when new files come)

I'm trying to get acquainted with test automation using the Microsoft TFS API.
I've created a program which runs my test set - it uses code similar to the one described here, e.g.:
var testRun = _testPoint.Plan.CreateTestRun(false); // create a new run for the test point's plan
testRun.DateStarted = DateTime.Now;
// ...
testRun.Save(); // saving submits the run so an agent can pick it up
I believe this forces them to start as soon as any of the agents can run them, instead of being delayed to a certain time. Am I wrong? Anyway, it works all right.
But I was told by my lead that the task should be started each time new input files are copied to a certain folder (on the network, I think, or perhaps in TFS).
So I'm searching for a way to trigger the tests on some condition - but so far without any luck. I'm probably missing the proper keywords.
I only found something vaguely related here, but it seems to say it isn't possible in a proper way.
So are there any facilities in TFS / MTM, or any approaches, to achieve my goal? Thanks in advance for any hints / links.
You would need to write a system service (or similar) that uses a FileSystemWatcher. When a file changes, you can run your code above.
There is no built-in feature in TFS to watch a folder for changes.
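A minimal sketch of that approach, hosted as a console app rather than a full Windows service; the watched path is a placeholder and RunTestSet() stands in for the CreateTestRun/Save code from the question:

using System;
using System.IO;

class TestRunTrigger
{
    static void Main()
    {
        // Watch the drop folder for new input files (placeholder path).
        using (var watcher = new FileSystemWatcher(@"\\server\input-files"))
        {
            watcher.Created += (sender, e) =>
            {
                Console.WriteLine("New file: " + e.FullPath);
                RunTestSet(); // create and save the test run here, as in the question
            };
            watcher.EnableRaisingEvents = true;
            Console.WriteLine("Watching... press Enter to quit.");
            Console.ReadLine();
        }
    }

    static void RunTestSet()
    {
        // Placeholder for the TFS API code above (CreateTestRun, Save, etc.).
    }
}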

How to close/stop a .NET application and re-execute it?

My application updates an Excel shared workbook (by running a VBA script), and since the workbook is shared, there shouldn't be problems when someone else is using the same file at the same time. But for some reason it sometimes simply freezes, with no error message; it just freezes.
Is there a way to programmatically make the application stop/close automatically when it freezes, or after some number of minutes (under normal conditions this update shouldn't take more than 1 minute)?
And, if possible, to relaunch the app automatically after some minutes, for at least 5 attempts?
That would ensure the process completes successfully.
I have had to do the same thing before, though in my case it was because I had an application that would look for updates to itself on the network and then update itself locally. The problem is, you cannot update an exe while it is running.
What I did to get around it was to create another program that would wait a second, update the exe, then run the exe again.
Because I did this with a few different apps, I made my "Updater" generic, so I could pass it some command-line parameters and it would use those to copy and run.
If you want to try something else, you might be able to accomplish the same thing by creating a BAT file and running it. I'm not very good with BAT files, so I can't help you there, but it is another way to handle it.
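For the freeze/timeout/retry part of the question, a small watchdog process in the same spirit might look like this. It's only a sketch: the exe path is a placeholder, while the timeout and the 5-attempt limit come from the question.

using System;
using System.Diagnostics;

class Watchdog
{
    static void Main()
    {
        const int maxAttempts = 5;
        const int timeoutMs = 2 * 60 * 1000; // normally done in ~1 minute, so 2 is generous

        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            using (var proc = Process.Start(@"C:\Tools\UpdateWorkbook.exe")) // placeholder
            {
                if (proc.WaitForExit(timeoutMs))
                    return; // finished normally

                proc.Kill(); // frozen: kill it and try again
                Console.WriteLine("Attempt " + attempt + " timed out; retrying...");
            }
        }
        Console.WriteLine("Gave up after " + maxAttempts + " attempts.");
    }
}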

Check for multiple files

Okay, I'll try to explain as well as I can... It's quite a particular case.
Tools: SSIS 2008
We have a control flow that now needs to be triggered by an event: the presence of one or more files (1, 2 or 3).
The variables used:
BO_FileLocation_1
BO_FileLocation_2
BO_FileLocation_3
BO_FileName_1
BO_FileName_2
BO_FileName_3
There can be one, two or three files, defined in the variables above. When they are filled in, the files should be processed. When they are empty (meaning there's just one file), the process should ignore them and jump to the next (file watcher?) task.
For example:
BO_FileLocation_1 = "C:\"
BO_FileLocation_2 = NULL
BO_FileLocation_3 = NULL
BO_FileName_1 = "test.csv"
BO_FileName_2 = NULL
BO_FileName_3 = NULL
The report only needs one file.
I need a generic concept that checks for the presence of these files - probably more generic than my SSIS knowledge can handle right now. It would be handy, for example, if a 4th file were added in the future. I was also thinking of using a single script to handle all the logic.
Thanks in advance
If all you want is to trigger the Copy Source File task when one or more of the files is present, just use the OR constraint in your flow:
First connect all the tasks to the destination.
Then double-click one of the green arrows to open its properties window, and select Logical OR instead of Logical AND.
If everything went well, you should now see the connections as dashed lines.
There are several possible solutions:
Create a sequence container and include all the file imports in it. Add int variables RowCountFile1, RowCountFile2, and RowCountFile3 and set their values to 0 (the default when you create an int variable). Add a RowCount transformation to each of the data flows. Create a precedence constraint from the sequence container to the "Do something" task. Set the precedence constraint to success and expression, with the expression @RowCountFile1 > 0 || @RowCountFile2 > 0 || @RowCountFile3 > 0. The advantage of this approach is that you can act as soon as the files are detected, you import all available files, and you only take the action after all the files have been imported. You could then schedule this SSIS package as a SQL Server Agent job step and run it as frequently as you want.
A variant on solution 1 is to use Foreach Loop containers with file enumerators inside the sequence container. This is useful if you don't know the exact name of the file and you expect to import more than one under some circumstances. For instance, if you get a file every few minutes with a timestamp in its name and your process doesn't run for some reason, then you may have to process multiple files to catch up, and only then take the action.
You could use the File Watcher Task as you outlined in your question. The only problem I have with the File Watcher Task is that the package has to be in a constantly running state, which makes problems and performance hard to troubleshoot. It can also introduce other issues; I remember having problems with the File Watcher Task years ago when it first came out. It may well be a totally stable task now, but I prefer other methods after having been burned previously. If you really want the package to run continuously instead of having it be called by a job, you could always use a Script Task to check for the file, sleep the thread if it's not found, check again, and so on. I'm sure that's what the File Watcher Task does, but I would trust my own C# over the task. Power to anyone who has had better experiences than me with the File Watcher...
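For reference, that polling loop is only a few lines in a Script Task; a sketch, with the path and interval as placeholders:

using System;
using System.IO;
using System.Threading;

class WaitForFile
{
    static void Main()
    {
        const string path = @"\\server\drop\test.csv"; // placeholder path
        // Poll until the file shows up, sleeping between checks.
        while (!File.Exists(path))
        {
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
        Console.WriteLine("File found - continue with the import.");
    }
}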
Use PowerShell. If you just want to take an action when a file appears and you aren't importing the data, then a PowerShell script could do this just as well as an SSIS package. The drawbacks are that you have to learn some basic PowerShell, it may be hard to maintain if PowerShell isn't your bread-and-butter language, and you may have to rewrite the code as an SSIS package anyway if you later want to import the data. You would probably call the PowerShell script from a SQL Server Agent job step, so scheduling is handled pretty easily.
There are more options than what I listed, so let me know if you still want more suggestions.