How can I make sysdate variable in VS 2019 SSIS - vb.net

I want to make a variable within SSIS that holds the current date so that I can reference it in a Script Task, but I have only been able to do this with the start date and creation date instead of sysdate. Can anyone help?

SSIS has two states: design-time and run-time. Design-time is the experience in Visual Studio/BIDS/SSDT. There are artifacts on the screen, interactive windows, and the Variables window shows the values of the package "at rest".
The run-time is the experience in the debugger (or an unattended execution). In the debugger it looks much like design-time - you see the objects, the data flow components light up, and you can see data flowing between components - but you can find discrepancies between the two. For example, the Variables window won't show you what the value of a variable is "RIGHT NOW." Instead, it shows the design-time value. If you want to see what the internals look like right now, that's the Debug menu, Locals window. There you'd see the current values of all the variables that were defined at design time.
System::StartTime has its run-time value set when the package begins (the OnPackageStart event). The start time is constant for the run of a package: whether the run lasts a minute or three days, it is the time the package started. The design-time value won't ever be passed to a consumer of that variable because the value is updated when the package starts, and SSIS does not write the previous run's values back to the design-time values. E.g. a design-time start time of 2021-02-18 will always be the at-rest value, even if the package runs every day.
You cannot control this behavior, nor do you need to worry about it being inaccurate; it is simply part of how the run-time works.
An expression function exists, GETDATE(), which is evaluated every time it is inspected (design and run time). I usually advise against it, because I am typically using the current time to correlate database activities.
E.g. I created these 10, 100, or 1000000 records at 2021-02-22T11:16:32.123. If I inserted in batches of ten, the first scenario would be recorded under a single timestamp. The second would look something like: the first 10 at 2021-02-22T11:16:32.123, the next 10 at 2021-02-22T11:16:32.993, the next ten at 2021-02-22T11:16:33.223, etc. Maybe more, maybe less. Why that matters: I can't prove to the business "these 10/100/1000000 rows are from load X because they all have the same timestamp." Instead, I need to find all the rows from 2021-02-22T11:16:32.123 to 2021-02-22T11:16:38.532 - and oops, a different process also ran in that timeframe, so my range query now identifies 10/105/1000003 rows.
GETDATE() in longer-running processes that start just before the midnight boundary can also result in frustrating explanations to the business.
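That said, if you do want an always-current value, it is wired up on a package variable (the name User::CurrentTime here is illustrative):

EvaluateAsExpression = True
Expression           = GETDATE()

Every read of the variable then re-evaluates the expression.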
Finally, since you're referencing a Script Task, you're already in .NET space, so you can use DateTime.Now/DateTime.Today in your methods and not worry about passing an SSIS variable into the environment.
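As a minimal VB.NET sketch of that inside the Script Task (User::RunDate is a hypothetical variable you would add to the task's ReadWriteVariables list; System::StartTime would go in ReadOnlyVariables):

Public Sub Main()
    ' Current date/time straight from .NET - no SSIS variable required
    Dim currentDate As DateTime = DateTime.Now

    ' Optionally surface it to the rest of the package
    Dts.Variables("User::RunDate").Value = currentDate

    ' For contrast: System::StartTime is fixed for the entire run
    Dim startTime As DateTime = CType(Dts.Variables("System::StartTime").Value, DateTime)

    Dts.TaskResult = ScriptResults.Success
End Sub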

Related

How to set KOFAX KTM Server global variable value which will be initialized in Batch open, updated in SeparateCurrentPage & used in BatchClose?

I am trying to count a specific barcode value from Project.Document_SeparateCurrentPage and use it in BatchClose to check whether the count is greater than 1; if it is, the batch should be sent to a specific queue with a specific priority. I used a global variable in the KTM Project Script to hold the count, initialized to 0 in Batch open. It worked fine in unit testing, but our automation team found that out of 20 similar batches, a few were sent to the queue that a batch should only go to when the count satisfies the greater-than-one condition, even though they used only one barcode.
I googled and found that KTM Server script events do not allow sharing information across different processes (https://docshield.kofax.com/KTM/en_US/6.4.0-uuxag78yhr/help/SCRIPT/ScriptDocumentation/c_ServerScriptEvents.html). I then tried to use a batch field to hold the barcode count, but I was unable to update its value from the Project.Document_SeparateCurrentPage function using pXRootFolder.Fields.ItemByName("BatchFieldName").Text = "GreaterThanOne". The logs show that the batch reads the first page three times and then errors out.
Any links would help. Thanks in advance.
As you mentioned, the different phases of batch/document processing can execute in different processes, so global variables initialized in one event won’t necessarily be available in others. Ideally you should only use global variables if their content can be set from Application_InitializeScript or Application_InitializeBatch, because these events occur in each separate process. As you’ve found out, you shouldn’t use a global variable for your use case, because Document_SeparateCurrentPage and Batch_Close for one batch may occur in different processes, just as the same process will likely execute those events for multiple batches.
Also, you cannot set batch fields from document level events for a related reason: any number of separate processes could be processing documents of a batch in parallel, so batch level data is read-only to document events. It is a bit unintuitive, but separation is a document level event even though it seems like it is acting on the whole batch. (The three times you saw is just an error retry mechanism.)
If it meets your needs, the simplest answer might be to use a barcode locator as part of normal extraction (not just separation), and assign to a field if needed. While you cannot set batch fields from document events, you can read document data from batch events. So instead of trying to track something like a count over the course of document events, just make sure whatever data you need is saved at a document level. Then in a Batch_Close you can iterate the documents and count/calculate whatever you need. (In your case maybe the number of locator alternatives for the barcode locator, across each document.)
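If it helps, a rough sketch of that counting in Batch_Close (untested; the locator name BL_Barcode is illustrative, and the signature follows the standard KTM batch events):

Private Sub Batch_Close(ByVal pXRootFolder As CASCADELib.CscXFolder, ByVal CloseMode As CscBatchCloseMode)
   Dim i As Long
   Dim barcodeCount As Long
   Dim xdoc As CASCADELib.CscXDocument
   ' Batch events may read document-level data saved during extraction
   For i = 0 To pXRootFolder.DocInfos.Count - 1
      Set xdoc = pXRootFolder.DocInfos(i).XDocument
      barcodeCount = barcodeCount + xdoc.Locators.ItemByName("BL_Barcode").Alternatives.Count
   Next i
   If barcodeCount > 1 Then
      ' route the batch to the special queue / adjust its priority here
   End If
End Sub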

ABAP Program to notify Users X amount of days before user account will be disabled

I'm currently learning ABAP and trying to make an enhancement, but I've gotten stuck on how to go about building on top of existing code. I have a program, run periodically via a background job, that disables user accounts after X days of inactivity (in this case 90 days, based on USR02~TRDAT).
I want to add an enhancement that notifies users via their e-mail address when their last logon date is 30, 15, 7, 5, 3, and 1 day away from the 90-day inactivity threshold, with a link to the SAP system to help them reactivate the account with ease. The address comes from matching USR02~BNAME to USR21~BNAME, then using USR21~PERSNUMBER and USR21~ADDRNUMBER to read ADR6~SMTP_ADDR (i.e. the USR02~BNAME -> ADR6~SMTP_ADDR relationship).
I'm beginning to think that an enhancement might not be a good idea, and that I should rather create a new program and schedule it as a daily background job. Any guidance or information would be greatly appreciated...
Extract
CLASS cl_inactive_users_reader DEFINITION.
  PUBLIC SECTION.
    TYPES:
      BEGIN OF ts_inactive_user,
        user_name          TYPE syst_uname,
        days_of_inactivity TYPE int1,
      END OF ts_inactive_user.
    TYPES tt_inactive_users TYPE STANDARD TABLE OF ts_inactive_user WITH EMPTY KEY.
    CLASS-METHODS read_inactive_users
      IMPORTING
        min_days_of_inactivity TYPE int1
      RETURNING
        VALUE(result)          TYPE tt_inactive_users.
ENDCLASS.
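A minimal implementation sketch to go with it (untested; it assumes the last logon date is USR02~TRDAT, as in your description, and that a TRDAT of '00000000' means the user never logged on):

CLASS cl_inactive_users_reader IMPLEMENTATION.
  METHOD read_inactive_users.
    DATA threshold TYPE d.
    threshold = sy-datum - min_days_of_inactivity.
    SELECT bname, trdat FROM usr02
      WHERE trdat <> '00000000' AND trdat <= @threshold
      INTO TABLE @DATA(last_logons).
    LOOP AT last_logons INTO DATA(last_logon).
      " note: int1 caps at 255 days of inactivity
      APPEND VALUE #( user_name          = last_logon-bname
                      days_of_inactivity = sy-datum - last_logon-trdat )
             TO result.
    ENDLOOP.
  ENDMETHOD.
ENDCLASS.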
Then refactor
REPORT block_inactive_users.

DATA(inactive_users) = cl_inactive_users_reader=>read_inactive_users( 90 ).
LOOP AT inactive_users INTO DATA(inactive_user).
  " block user
ENDLOOP.
And add
REPORT warn_inactive_users.

DATA(inactive_users) = cl_inactive_users_reader=>read_inactive_users( 60 ).
LOOP AT inactive_users INTO DATA(inactive_user).
  CASE inactive_user-days_of_inactivity.
    " choose urgency
  ENDCASE.
  " send e-mail
ENDLOOP.
and run both reports daily.
Don't create a big ball of mud by squeezing new features into existing code.
From SAP wiki:
The enhancement concept allows you to add your own functionality to SAP's standard business applications without having to modify the original applications. To modify the standard SAP behavior as per customer requirements, we can use enhancement framework.
As per your description, it doesn't sound like a use case for an enhancement. It isn't an intervention in an existing process. The original process and your new requirement are two different processes with a shared logical part - the selection of users by days of inactivity. The two shouldn't rely on each other.
Structurally I think it is best to have a separate program for computing which e-mails need to be sent and when, and a separate program for actually sending them.
I would copy your original program to a new one and modify it a little bit, so that instead of disabling a user it records, for each user, into some table: 1) the e-mail address, 2) the date when to send, 3) how many days are left (30, 15, 7, etc.), and 4) a status flag for whether the e-mail was sent. Initially you can even have multiple such jobs, one for each period (30, 15, 7, etc.), and pass the period as a parameter (which you use inside instead of 90).
This program you run daily as a job and it populates that table with e-mail "tasks" of what needs to be sent today. It just adds new lines, so lines from yesterday should stay in there.
The 2nd program should just read that table and send actual e-mails and update the statuses. You run that program daily as well.
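As a sketch, a row of that table could look like this (all names illustrative; AD_SMTPADR is the data element behind ADR6~SMTP_ADDR):

TYPES: BEGIN OF ts_mail_task,
         bname     TYPE syst_uname,  " user to notify
         smtp_addr TYPE ad_smtpadr,  " target e-mail address
         send_date TYPE d,           " the date on which to send
         days_left TYPE int1,        " 30, 15, 7, 5, 3 or 1
         status    TYPE c LENGTH 1,  " ' ' = open, 'S' = sent, 'E' = error
       END OF ts_mail_task.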
This way you have:
overview: just check the table to see what's going on
control: if the e-mailer dies or hangs, you can restart it and it will continue where it left off; with statuses you avoid sending duplicate mails
you can make sure that you don't send outdated e-mails if, in your mailer program, you ignore all tasks older than, say, 2 days
I want to clarify your confusion about the use of enhancements:
You would want to use an enhancement when 'something' happens or is about to happen in the system and you want to change that standard behavior.
That something - let's call it an event or process - could for example be: an order is placed, a certain user logs onto the system, or a material has been or is about to be changed.
The change could be notifying another system of the order, or checking the logged-on user with additional checks - for example, his GUI version - and warning him/her if it is not up to date.
Ask yourself: what process in the system does the execution of your program or code depend on? Does anything need to happen before the program is executed? No - only time needs to elapse.
Even if you had found an enhancement you would want to use: if the process carrying the enhancement were not run within 90 days, your mails would not be sent, because the enhancement would never be called.
Edit: That being said, if by 'enhancement' you mean 'building on your existing program' instead of 'creating a new one', that would absolutely not be the right terminology for an enhancement in the SAP universe.
I would extend the functionality of your existing program, since you already compute how many days are left and you would have only one job to maintain.

SQLAgent job with different schedules

I am looking to see if it's possible to have one job that runs on different schedules, with the catch being that one of the schedules needs to pass in a parameter.
I have an executable that will run some functionality when there is no parameter, but if there is a parameter present it will run some additional logic.
Setting up my job, I created a schedule (every 15 minutes) and a job step of type Operating system (CmdExec):
runApplication.exe
For the other schedule I would like it to run once per day, however the executable would need to be: runApplication.exe "1"
I don't think I can create a different step with a separate schedule, or can I?
Anyone have any ideas on how to achieve this without having two separate jobs?
There's no need for 2 jobs. What you can do is update your script so the parameter for the run is stored in a table, then update your secondary logic to reference that table: if there's a value for the parameter, run the additional logic; if there's no value, have the secondary logic return 0 or not run at all. All in one script.
Just make sure you either truncate the parameter table on every run or store a date in it, so you know which row to reference.
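A minimal T-SQL sketch of that idea (table and column names are illustrative):

-- Hypothetical parameter table, written by the once-a-day schedule
CREATE TABLE dbo.RunParameter
(
    RunDate    date        NOT NULL,
    ParamValue varchar(10) NULL
);

-- The daily schedule stores the parameter before its run
TRUNCATE TABLE dbo.RunParameter;
INSERT INTO dbo.RunParameter (RunDate, ParamValue)
VALUES (CAST(GETDATE() AS date), '1');

-- Inside the job logic: run the extra branch only when today's parameter exists
IF EXISTS (SELECT 1 FROM dbo.RunParameter
           WHERE RunDate = CAST(GETDATE() AS date)
             AND ParamValue IS NOT NULL)
BEGIN
    PRINT 'parameter present - running additional logic';
END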
Good luck.

SSRS Data-Driven Subscription [based on static Subscription table] Not Picking Up Changes Made to Subscription Table

I have a .RDL report which I designed in BIDS and have deployed to my report server. The report asks for three parameters before viewing: Year, Month, and Customer ID. The report works great and does exactly what it is supposed to do.
While I used to run each report individually because there were 2-3 customers, there are now 30+ customers receiving the report, so I wanted to switch to a more automated fulfillment method. After doing some research, it appears that using Report Manager to create a "Data Driven Subscription" (DDS) with the "Windows File Share" option gives me the capabilities I need.
As part of creating the DDS, I created a table called [Subscription], containing one row for each customer receiving the report, with the following columns:
Year
Month
CustomerID
FileName
FileLocation
Overwrite
Format
...so using the DDS Wizard in Report Manager, I was able to successfully set up a Data Driven Subscription (linked to various columns in the [Subscription] table) which creates a new report for each customer in the [Subscription] table, saves it [overwriting, if necessary] as a PDF in a location of my choosing (specified in [Subscription].[FileLocation], i.e. the FileLocation column of my table for each row), and runs every minute (I plan on changing the frequency to once a week, eventually).
This works flawlessly, giving me a new set of 30 reports in the directory of my choosing, with each report having a name I assigned in the FileName column of my table. Exactly what I was looking for.
HERE'S THE PROBLEM: When I update the FileLocation or FileName (or anything, really) in the [Subscription] table, it doesn't pick up the changes right away. Sometimes it doesn't pick them up at all. For example, I updated the [FileName] column for one customer from Report_711622 to SpecialReport_711622, so that the output file for that customer should be named SpecialReport_711622 while all of the other reports should be called Report_XXXXX [no Special prefix]. But the file name of the report for customer 711622 remains the same!
It's almost like the job only sees what it needs to do once a day, and then does not go back and reference the [Subscription] table until I leave for the night; when I come back in the morning, it has picked up the change.
Since I am about to scale this process out to a larger customer base using a different report, I need to be able to make edits to the [Subscription] table and have them picked up by the Data Driven Subscription immediately (and if not immediately, at least at a fixed interval that I can adjust, so that I know 100% when the change will be picked up).
Does anyone know what's causing the lag? How do I change it so that updates to the [Subscription] table get picked up regularly? I'm also having issues creating new DDSs on other reports (following the exact process outlined above): I've created the subscriptions to run every minute, and it says they are running, the number of outputs matches the number of customers with 0 errors, but there are no files in the drive I specified (or anywhere else I've looked, for that matter).
Any help would be greatly appreciated!
I think the answer lies in the mechanism SSRS uses. There are a few places "lag" can occur.
The subscription is in fact a SQL Agent job which creates a record in the Event table. This table is a queue that SSRS checks for scheduled tasks to run.
There is a small amount of time between the moment the subscription creates the Event record and the moment SSRS reads it and starts creating the dataset for your DDS. The creation of the DDS dataset takes some time, too. During this window the subscription is in the Pending state, and if you change anything in the data during this time, the subscription will still use the old data as report parameters. So you will not notice your change until the next scheduled run.
Which brings me to the following: if a subscription is still running when the next schedule kicks in (chances are, since yours runs every minute), the engine will not execute it but waits for the next subscription schedule, and so on. That is another possible source of lag - and a cause of missing reports for a given scheduled minute. The subscription processes reports sequentially, one row from your DDS recordset at a time. Again, this takes some time; you can see it in the subscription window when it says: # of # processed.
I suggest you look at the Event table in the ReportServer database during an execution. The ExecutionLog views (there are 3) may also be interesting. A scheduled run shows up as RequestType = 1 and generates one record for each report, so you can see the exact timing and parameters of each report run by the subscription. You may be able to extract the data you need to resolve your other issues.
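For example (run against the ReportServer catalog database; Event is the queue table, and ExecutionLog3 is the friendliest of the three views on a default install - in the oldest view the same column is numeric, hence RequestType = 1 above):

-- Pending scheduled work: rows appear here when a subscription fires
SELECT * FROM dbo.[Event];

-- Recent executions; subscription runs show RequestType = 'Subscription'
SELECT TOP (50) ItemPath, TimeStart, TimeEnd, [Status], [Parameters], RequestType
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;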
EDIT: Here is a more elaborate guide to DDS data and events
http://blogs.msdn.com/b/deanka/archive/2009/01/13/diagnosing-and-troubleshooting-subscriptions.aspx
http://blogs.msdn.com/b/deanka/archive/2010/02/16/troubleshooting-subscriptions-part-ii-using-the-report-services-trace-log-file.aspx
Could this "Double-Hop" problem be the source of my issues? I'm so stuck on this one!
The Double-Hop Problem - MSDN Knowledgecast

Access 97: table entry to drop off form when a date/time reached

I like to think I'm not completely useless at creating MS Access databases, but I'm definitely a failure at the SQL code side... so as a result I'm not sure whether this is a stupid question or not!
At work I'm trying to add a "news feed" type thing to a form on the front screen of a database used to find useful information stored in various places. At the moment my workplace is using Office 2007, but Access is the 97 version, as they're only recently realizing it can be used to solve a few of their problems. We're expecting to upgrade the whole of Office, including Access, to 2010 soon.
On this database (created using Access 97) there is a "refresh" type button which simply closes and re-opens the form, thus showing the latest entries on the "news feed" together with the current time and date. What I'd like is for specific entries to drop off once a date/time is reached (which probably won't happen unless refreshed). For example, an entry might be added regarding server amendments being made within the workplace, saying "certain systems will not be working between 8am and 5pm GMT on 9/1/12", and preferably the person who created the entry could enter a date into the form so that 24 hours, or even a few days, later, when that date is reached, the entry would disappear. I understand this is something that may be achieved using a query, but I have no idea where to start.
If anyone can help give me an idea of how to do this it would be greatly appreciated.
I apologise if this is poorly worded or not completely clear; I can elaborate if questions are asked.
Many thanks,
Kris
You can make a query like this to return only the entries that are less than 3 days old:
SELECT *
FROM MyTable
WHERE CreateTime > DateAdd("d", -3, Now())
In your form, you can define a timer interval and a timer event handler (see the "Event" tab on the properties window). You could use it to requery your list (Me!lstNews.Requery).
In the table you can define the column "CreateTime" as data type "Date/Time" and define its default value as =Now(). This way no text box is required to enter this data.
EDIT:
How to configure the timer in an Access form (the interval is in milliseconds): set the form's Timer Interval property on the "Event" tab; Access then raises the form's Timer event at that interval.
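A minimal sketch of the handler (assuming the list box from above is named lstNews):

' In the form's property sheet (Event tab), set Timer Interval to e.g. 60000
' (one minute); Access then calls Form_Timer at that interval.
Private Sub Form_Timer()
    ' Re-run the list's row source so entries older than 3 days drop off
    Me!lstNews.Requery
End Sub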