How Can I Execute an *.exe From a Table Trigger? - sql

I would like to start an .exe or DLL once a record is updated. If the field of interest is updated, I would like to call an *.exe with parameters. Is this possible in MySQL and Oracle?

A similar issue came up recently in this question. While that poster was trying to make a web call within a trigger, the same general response ("this is not a good idea") is probably true here as well.
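If the real goal is "do something outside the database when a row changes", the usual workaround is to avoid launching anything from the trigger itself: have the trigger write a row into a queue table and let an external job poll that table and start the executable. A minimal sketch in MySQL syntax, with illustrative table and column names:

CREATE TABLE exe_call_queue (
    id         INT AUTO_INCREMENT PRIMARY KEY,
    source_id  INT NOT NULL,
    new_value  VARCHAR(255),
    queued_at  TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    processed  TINYINT(1) NOT NULL DEFAULT 0
);

DELIMITER //
CREATE TRIGGER trg_my_table_after_update
AFTER UPDATE ON my_table
FOR EACH ROW
BEGIN
    -- Only queue a call when the field of interest actually changed.
    IF NOT (NEW.field_of_interest <=> OLD.field_of_interest) THEN
        INSERT INTO exe_call_queue (source_id, new_value)
        VALUES (NEW.id, NEW.field_of_interest);
    END IF;
END//
DELIMITER ;

A scheduled task or service can then select the unprocessed rows, call the *.exe with the queued values as parameters, and mark the rows processed. In Oracle the same queue-table pattern applies, or the external work can be handed to DBMS_SCHEDULER rather than the trigger.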

Related

SAP B1 DI-API replaces character on save

I wrote a small C# service importing tracking numbers into a single UDF, separated by a , (comma). The problem is that occasionally (maybe every 200th document) a comma is saved as a semicolon. A somewhat similar issue I have is with the Amazon importer where I add a comment: with maybe the same frequency, the comment ends up with whitespace between every pair of original characters. What they have in common is that the error does not appear to be in my code. There is no difference between the correct documents (ca. 95%) and the others.
Does anybody have an idea how I can work around these issues so that they no longer appear?
Or why this can happen?
I know I have an outdated SAP B1 at version 9.2 PL 10 Hotfix 3. The DI-API is linked to the install folder. Is this issue fixed in any later version?
(My current workaround is a cron job that checks for wrong entries in the db and updates those documents. Very uncool.)
Definitely sounds like a DI-API bug. If you posted your code it would help confirm this.
Assuming it IS a DI-API bug, I would "dark side" it and just do a regular SQL update (bypass the DI-API), since it's just a UDF and there's probably not any business logic you need SAP to perform on these updates.
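A hedged sketch of that direct update (SQL Server syntax; ODLN and the UDF name U_TrackingNumbers are placeholders for wherever your tracking numbers actually live):

UPDATE ODLN
   SET U_TrackingNumbers = REPLACE(U_TrackingNumbers, ';', ',')
 WHERE U_TrackingNumbers LIKE '%;%';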
Alternatively, you could normalize your data and create a separate table, linked via FK to your current table, to house a single UDF value per row (therefore not having to deal with the weird comma character issue).
As a third alternative, you could make use of the SBO Post-Transaction Notification SP to monitor for your error case and perform the "fix" there, instead of in your cron job.
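For that third alternative, the shape of it would be something like the snippet below, placed inside the body of the notification stored procedure. The parameter names follow the standard SBO_SP_TransactionNotification / post-transaction notification pattern, and the object type ('15' = delivery notes) and UDF name are assumptions to adapt:

IF @object_type = N'15' AND @transaction_type IN (N'A', N'U')  -- delivery add/update
BEGIN
    UPDATE ODLN
       SET U_TrackingNumbers = REPLACE(U_TrackingNumbers, ';', ',')
     WHERE DocEntry = CAST(@list_of_cols_val_tab_del AS INT)
       AND U_TrackingNumbers LIKE '%;%';
END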
Disclaimer: I have not worked with SAP in 4+ years.

Get list of files and programs touched by AS400/iSeries service account

I am trying to get a list of the programs (RPG/CL/SQL) and files that a service account on the iSeries has touched. The idea is that, having this list, we can tie specific permissions (I know this will really complicate things) to the user account in order to achieve a more secure, application-specific service account. Is there any way to do this and maybe get a report by running a command? Maybe there is a SQL statement?
Please excuse me if my terms are not appropriate, I am still new to the iSeries.
The audit journal will have what you are looking for....if so configured.
http://pic.dhe.ibm.com/infocenter/iseries/v7r1m0/topic/rzarl/rzarlusesecjnl.htm
The newest 7.1 TR includes stored procedures to allow easy read of journals.
https://www.ibm.com/developerworks/community/wikis/home/wiki/IBM%20i%20Technology%20Updates/page/DISPLAY_JOURNAL%20(easier%20searches%20of%20Audit%20Journal)
Charles
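As a hedged sketch of what reading the audit journal through that interface could look like (this assumes object auditing to QAUDJRN is configured and that the DISPLAY_JOURNAL table function from the TR linked above is available; 'SVCACCT' is a placeholder for the service account profile, and the parameter names should be verified against IBM's documentation for your release):

SELECT *
  FROM TABLE(QSYS2.DISPLAY_JOURNAL(
                 'QSYS', 'QAUDJRN',              -- the system audit journal
                 JOURNAL_CODES       => 'T',     -- audit entries
                 JOURNAL_ENTRY_TYPES => 'ZR ZC', -- object read / object change
                 USER_NAME           => 'SVCACCT')) AS audit_entries
 ORDER BY ENTRY_TIMESTAMP DESC;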
Though Charles' answer is probably the one to set up for a thorough report, I wound up doing the following, as suggested by one of my peers.
Please note that my goal, though not explained properly above, was to create an application-specific user/service account for a program. This is to avoid using one with many privileges and thus gain some security.
1. Go through the source code (in my case classic ASP) and jot down all the names of the procedures used by that program.
2. Create a CL program that writes the program references to an output file. Then export the file's contents to Excel and massage where necessary.
PGM
DSPPGMREF PGM(MYLIB/PGM001) OUTPUT(*OUTFILE) OUTFILE(MYLIB/DSPPGMREF) OUTMBR(*FIRST *REPLACE)
DSPPGMREF PGM(MYLIB/PGM002) OUTPUT(*OUTFILE) OUTFILE(MYLIB/DSPPGMREF) OUTMBR(*FIRST *ADD)
ENDPGM
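Once the outfile exists, it can also be queried directly with SQL instead of (or before) exporting to Excel. A hedged sketch; the column names below come from the QADSPPGM model file that DSPPGMREF outfiles are based on, so verify them on your release before relying on this:

-- Distinct objects referenced by each program (assumed columns:
-- WHPNAM = program, WHLNAM/WHFNAM = referenced library/object, WHOBJT = object type)
SELECT DISTINCT WHPNAM, WHLNAM, WHFNAM, WHOBJT
  FROM MYLIB.DSPPGMREF
 ORDER BY WHPNAM, WHFNAM;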
I was told, however, that service program references cannot be displayed with DSPPGMREF. So the following was done for those.
PGM
ADDLIBLE LIB(ABSTRACT) POSITION(*LAST)
MONMSG MSGID(CPF0000)
WRKOBJR OBJ(SRVPGM01) OBJTYPE(*SRVPGM) OUTPUT(*OUTFILE) OUTFILE(MYLIB/WRKOBJR) MBROPT(*REPLACE)
WRKOBJR OBJ(SRVPGM02) OBJTYPE(*SRVPGM) OUTPUT(*OUTFILE) OUTFILE(MYLIB/WRKOBJR) MBROPT(*ADD)
WRKOBJR OBJ(SRVPGM03) OBJTYPE(*SRVPGM) OUTPUT(*OUTFILE) OUTFILE(MYLIB/WRKOBJR) MBROPT(*ADD)
ENDPGM
Thank you for all your help. I apologize that my answer is a little more specific than my question, but in the end this was what I wanted to achieve; I had to generalize to ask the question. I thought I'd post my answer anyway in case it helps someone in the future.

How to determine if a script is DML? OCI?

My goal is to send an SQL script to a process (maybe via OCI) and determine whether it is a pure DML script (none of the statements are DDL or DCL) or a hybrid containing other statement types.
Is it possible? If so, how?
Let's say it can't be done with OCI. How would you automatically run a check to validate this?
Use OCI_ATTR_STMT_TYPE as explained by https://stackoverflow.com/a/13528133/103724 and you will get one of the values in this table http://docs.oracle.com/cd/E14072_01/appdev.112/e10646/oci04sql.htm#CIHEHCEJ.
There's also OCI_ATTR_SQLFNCODE which provides more detailed information, but as the question above showed, it can be tricky to use. --DD

Executing SQLStrings after ApplicationPool is created WIX

I need to add a user to my database for IIS AppPool\MyAppPool, so I need to execute a simple query:
CREATE LOGIN [IIS AppPool\MyAppPool] FROM WINDOWS
I use <sql:SQLString> element in WiX.
I use <iis:WebAppPool> extension to create ApplicationPool.
But the application pool is created after the SQL strings have been executed, so I get the error "User or group doesn't exist" from SQL Server.
Is it possible to execute SQL strings after ApplicationPool creation? Or maybe it is possible to sequence ExecuteSqlStrings manually?
It is strange, but if I add my own custom action (which calls sqlcmd.exe and executes the query) after ConfigureIIs, everything works fine. But I don't like such a solution; I suppose using the built-in <sql:SqlString> elements etc. is a better solution.
I had the same problem as you, and attempted to schedule ExecuteSqlStrings after ConfigureIIs to no avail. I searched around in some old installers we had made and managed to find one that needed to accomplish the same thing. Instead of scheduling ExecuteSqlStrings, it scheduled InstallSqlData after ConfigureIIs. I tried that instead, and now the install works correctly. I don't know the specifics of what's different between InstallSqlData and ExecuteSqlStrings (logically you would think you need to use the latter), but this worked for me and hopefully works for you too.
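For reference, a hedged sketch of the SQL that typically has to run once the app pool identity exists (SQL Server syntax; the database name and the roles are placeholders, and each statement would normally be its own SqlString):

CREATE LOGIN [IIS AppPool\MyAppPool] FROM WINDOWS;

USE MyDatabase;
CREATE USER [IIS AppPool\MyAppPool] FOR LOGIN [IIS AppPool\MyAppPool];
ALTER ROLE db_datareader ADD MEMBER [IIS AppPool\MyAppPool];
ALTER ROLE db_datawriter ADD MEMBER [IIS AppPool\MyAppPool];

(On SQL Server versions before 2012, use sp_addrolemember instead of ALTER ROLE ... ADD MEMBER.)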

iOS Rolling out app updates. Keeping user data intact when DB update required

I have just done a quick search and nothing too relevant came up so here goes.
I have released the first version of an app. I have made a few changes to the SQLite db since then, in the next release I will need to update the DB structure but retain the user's data.
What's the best approach for this? I'm currently thinking that on app update I will never replace the user's database file (in the Documents folder, not in the bundle) but rather alter its structure using SQL queries.
This would involve tracking changes made to the database since the previous release. Script all these changes into SQL queries and run these to bring the DB to the latest revision. I will also need to keep a field in the database to track the version number (keep in line with app version for simplicity).
Unless there are specific hooks or delegate methods that are fired at first run after an update, I will put calls to this logic at the very beginning of the app delegate, before anything else is run.
While doing this I will display "Updating app" or something to the user.
Next thing: what happens if there is an error somewhere along the line and the update fails? The DB will be out of date and the app won't function properly, as it expects a newer version.
Should I take it upon myself to just delete the user's DB file and replace it with the new version from the app bundle? Or should I just test, test, test until everything is solid on my side, and if an error occurs on the user's side it's something else, in which case I can't do anything about it other than discard the data.
Any ideas on this would be greatly appreciated. :)
Thanks!
First of all, the approach you are considering is the correct one. This is known as database migration. Whenever you modify the database schema on your end, you should collect the appropriate ALTER TABLE... etc. statements into a migration script.
Then the next release of your app should run this code once (as you described) to migrate all the user's data.
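A hedged sketch of what a single migration step might look like (SQLite syntax; the table and column names are illustrative, and the user_version pragma is one way to track the schema revision instead of, or alongside, the version field you mentioned):

BEGIN TRANSACTION;

ALTER TABLE notes ADD COLUMN archived INTEGER NOT NULL DEFAULT 0;
CREATE INDEX IF NOT EXISTS idx_notes_archived ON notes(archived);

PRAGMA user_version = 2;  -- record that the schema is now at revision 2

COMMIT;

Running each step inside a transaction also helps with the failure case below: if anything fails, roll back and the database stays usable at the previous revision.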
As for handling errors, that's a tough one. I would be very wary of discarding the user's data. Better would be to display an error message and perhaps let the user contact you with a bug report. Then you can release an update to your app which hopefully can do the migration with no problems. But ideally you test the process well enough that there shouldn't be any problems like this. Of course, it all depends on the complexity of the migration process.