How can I fetch a dynamic file, generated every day, from an FTP server? - mosaic-decisions

There are transactional input CSV files arriving daily on an FTP location. I need to read these input files and process them in a daily batch execution. The file names remain the same every day, but the current date is appended to the end of each filename. For example:
Day1
General_Ledger1_2020-07-01,
General_Ledger2_2020-07-01,
General_Ledger3_2020-07-01,
General_Ledger4_2020-07-01,
General_Ledger5_2020-07-01
Day2
General_Ledger1_2020-07-02,
General_Ledger2_2020-07-02,
General_Ledger3_2020-07-02,
General_Ledger4_2020-07-02,
General_Ledger5_2020-07-02
How can I append this Date information to the input file name every time the job runs?

I have faced a similar problem earlier, and it can be solved using a calculated parameter in the file path. There you can create an expression that retrieves the file dynamically, for example:
CONCAT(UPPER(lit('$(Prefix)')), ADD_DAYS(TODATE(lit('$(currentTime)'), 'yyyy-mm-dd'), 'yyyy-mm-dd', -1), '.csv')
Breaking down the expression:
$(currentTime): this system parameter returns the current date (including the timestamp).
TODATE(lit('$(currentTime)'), 'yyyy-mm-dd'): TODATE extracts only the date from the full timestamp, using the format 'yyyy-mm-dd'.
ADD_DAYS(TODATE(lit('$(currentTime)'), 'yyyy-mm-dd'), 'yyyy-mm-dd', -1): ADD_DAYS adds -1 day to the date retrieved from TODATE(). Hence (2020-04-24) + (-1) gives 2020-04-23.
$(Prefix): a user-defined input parameter of type String that the user provides at runtime, since the prefix is always dynamic.
CONCAT(): finally, CONCAT() combines all the results to form the exact file path. Some static text (here the '.csv' extension) is also added, as it is fixed for every file to be read.
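Outside of Mosaic Decisions, the same idea can be sketched in plain Python: compute yesterday's date, append it to each fixed prefix, and pull the files over FTP. The host, credentials, and remote directory below are placeholders; the prefixes come from the example above.

from datetime import date, timedelta
from ftplib import FTP

# Yesterday's date in the same yyyy-mm-dd form used in the file names
run_date = (date.today() - timedelta(days=1)).isoformat()
prefixes = [f"General_Ledger{i}" for i in range(1, 6)]

# Placeholder connection details -- replace with the real FTP host and credentials
with FTP("ftp.example.com") as ftp:
    ftp.login(user="batch_user", passwd="secret")
    ftp.cwd("/inbound")  # hypothetical remote directory
    for prefix in prefixes:
        filename = f"{prefix}_{run_date}.csv"
        with open(filename, "wb") as local_file:
            ftp.retrbinary(f"RETR {filename}", local_file.write)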

Related

Pentaho kettle convert date to unix

I'm trying to convert a date string in the format "2019-05-14 13:30:00" to a UNIX timestamp.
In plain JavaScript it works, but in the Kettle JavaScript step I am not able to return the numeric value 1557833442.
The line of code is this:
const tests = new Date("2019-05-14 13:30:00").getTime() / 1000;
It looks like the Date() constructor doesn't like the format you are using.
If you want the current date, use a Get System Info, it has a number of useful date options.
If you are converting an incoming field, use the Select Values step to change the metadata, using the format string that matches your string field's format.
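For comparison, here is the same conversion outside of Kettle as a minimal Python sketch; it parses the string with an explicit format instead of relying on a constructor to guess it (the resulting epoch value depends on the timezone the naive string is interpreted in, here the local one):

from datetime import datetime

# Parse the string with an explicit format
dt = datetime.strptime("2019-05-14 13:30:00", "%Y-%m-%d %H:%M:%S")

# timestamp() interprets a naive datetime in the local timezone
unix_seconds = int(dt.timestamp())
print(unix_seconds)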

Searching date and time in Lucene query string in Cloudant

I am trying to write an index and then search by date and time on that index in a Cloudant NoSQL database.
When I pass only the date in the query string, it works fine:
created_date:[2015-08-16 TO 2015-08-27]
This returns the correct results, but when I include the time in the parameter:
created_date:[2015-08-16 07:38:00 TO 2015-08-27 07:38:02]
I get an error:
Cannot parse 'created_date:[2015-08-16 07:38:00 TO 2015-08-27 07:38:02]': Encountered " "TO" "TO "" at line 1, column 50. Was expecting one of: "]" ... "}"
I have some more query parameters before this but the above is the gist of the error.
This is an Apache Lucene query string. What is causing this to happen?
According to the Lucene Java doc, the date format should look like this:
A date field shall be of the form 1995-12-31T23:59:59Z. The trailing "Z" designates UTC time and is mandatory.
This format was derived to be standards compliant (ISO 8601) and is a more restricted form of the canonical representation of dateTime from XML Schema Part 2. Examples:
1995-12-31T23:59:59Z
1995-12-31T23:59:59.9Z
1995-12-31T23:59:59.99Z
1995-12-31T23:59:59.999Z
So, you are missing the 'T' between the date and the time.
For more information: https://lucene.apache.org/solr/4_10_4/solr-core/org/apache/solr/schema/DateField.html
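If the query string is built programmatically, the key point is emitting the full ISO 8601 form with the 'T' separator and the trailing 'Z'. A minimal Python sketch, reusing the created_date field name from the question:

from datetime import datetime, timezone

def lucene_ts(dt: datetime) -> str:
    # Lucene/Solr expect ISO 8601 in UTC with a trailing 'Z', e.g. 1995-12-31T23:59:59Z
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

start = datetime(2015, 8, 16, 7, 38, 0, tzinfo=timezone.utc)
end = datetime(2015, 8, 27, 7, 38, 2, tzinfo=timezone.utc)

query = f"created_date:[{lucene_ts(start)} TO {lucene_ts(end)}]"
print(query)  # created_date:[2015-08-16T07:38:00Z TO 2015-08-27T07:38:02Z]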
I did it the following way:
created_date:["2015-08-16 07:38:00" TO "2015-08-27 07:38:02"]
and used the keyword analyzer in Cloudant.
This link explains it all:
https://lucene.apache.org/core/2_9_4/queryparsersyntax.html

Is there a standard way to get previous login date of SAP user?

I've got a requirement to check whether some objects were modified since the last logon of the current user. There is a table USR02 that contains the last logon date, but it is updated at the moment of logon, so here "last" actually means "current".
For example, I logged in on 2014.11.21 and then on 2014.11.26, so the date range I want to get is 21…26; but when I enter the system, the date 2014.11.21 in USR02 is overwritten with 2014.11.26.
Of course, I could follow the Z-way and create my own table containing the user name and previous login date, but maybe there is a standard way to achieve this?
I noticed that you can see the current as well as the last logon date and time in the dialog you can open with System --> Status. I went through the code of the function pool SHSY that contains this dialog and found the following implementation:
DATA: BEGIN OF last_logon,
        date     LIKE sy-datum,
        time     LIKE sy-uzeit,
        date_now LIKE sy-datum,
        time_now LIKE sy-uzeit,
      END OF last_logon.
* ...
* Date and time of the current and the last logon
GET PARAMETER ID 'US2' FIELD last_logon.
Certainly not the standard API one would expect, but apparently it's all there is...

splunk search query returns entries with a variable value greater than some number

I've this log entry:
"2014-11-22 02:42:10,545 .. - average:2.74425 , min:1.43 , max:4.007..."
I want to create a search query that returns all log entries with average > 5.
I want to select the date of the log entry and the average value.
Can this be done? How can I do this?
Thanks,
It is quite simple to do in Splunk and you'll have to do it in two steps:
Parse your log to get each of the fields in your log files. To do this, use the props.conf and transforms.conf files on your indexer server, or on your client if you are using the heavy forwarder. Another option is to send your fields in the key=value format that Splunk knows how to parse by default. Example: "2014-11-22 02:42:10,545 .. - average=2.74425 min=1.43 max=4.007..."
After getting your fields into Splunk, just search for average>5 and you'll get these search results easily.
Answer from Splunk:
Did you already extract the average field?
If not, go to Settings -> Fields -> Field Extractions -> New, enter "average" as the name, fill in your sourcetype, and use this as the inline extraction:
average:(?<average>\d+\.?\d*)
It worked. :)
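The inline extraction above is an ordinary named-group regex, so you can sanity-check it outside of Splunk against the sample log line from the question; note that Python's re module spells the named group (?P<average>...) rather than (?<average>...):

import re

line = "2014-11-22 02:42:10,545 .. - average:2.74425 , min:1.43 , max:4.007"

# Same pattern as the inline extraction, with Python's (?P<name>...) syntax
match = re.search(r"average:(?P<average>\d+\.?\d*)", line)
if match and float(match.group("average")) > 5:
    print("average over threshold:", match.group("average"))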

Getting view options in Microsoft Project using VBA

If I want to change the current date format to 20, for example, I can use the command
OptionsViewEx DateFormat:=20
but how can I get the current date format (or any other view option for that matter)?
DefaultDateFormat should be the property to use.
oldvalue = Application.DefaultDateFormat
Application.DefaultDateFormat = 20 ' or = pjDate_mm_dd_yyyy
This gets or sets the default date format. (technet)
The pjDateFormat enumeration gives the complete list of format types.
The Date function returns the date in the current format; if you need a different one, use Format(Date, "yyyy-mmmm-dd"), for example.