My MySQL database inserts a timestamp when I upload a record, so what's stored is something like 2020-04-02 16:59:29. Is there a Vue.js way to convert that into something like "10 days ago"? If so, can anyone share the code for the conversion in Vue.js?
I hold the fetched DB records in an object called data_local, as in my code below.
Last Activity : {{data_local.updated_at}} days ago
Use moment.js; it can format the date the way you want. First install moment, then import and use it.
Below are two ways to get the relative days:
Ex-1: This gives you "N days ago" (e.g. 10 days ago):
moment("2020-04-02 16:59:29").fromNow()
Ex-2: If you want only "10 days" without the "ago" suffix, pass true:
moment("2020-04-02 16:59:29").fromNow(true)
For more information visit https://momentjs.com/
As the title says, I'm trying to create a live dashboard in Tableau that updates every day, showing the data for the last 7 days. I'm querying through SQL and then importing the result into Tableau. Do I have to specify this requirement in my query, or is there some way to do it in Tableau itself? Thank you so much; I would really appreciate the help.
Disclaimer: I'm a novice in Tableau and SQL.
If you have a date field in your table, you can use it as a filter: select Relative dates as the filter option, and in the dialog that appears enter the number of days in the days field. Since you want live data for the last 7 days, enter 7 and you'll get the updated data each time.
If you are querying through SQL, put the date/timestamp filter in the WHERE clause itself, like so:
DATE(date_column_filter) >= (DATE(NOW()) - INTERVAL 7 DAY)
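For example, assuming the table is named events and the timestamp column is created_at (both names are illustrative), the full query would be:

SELECT *
FROM events
WHERE DATE(created_at) >= DATE(NOW()) - INTERVAL 7 DAY;

With a live connection, Tableau re-runs the query on each refresh, so the dashboard always shows the trailing 7 days.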
I am saving the data in one function call, sending the date as 2019-02-01T00:00:00.000Z, but Fiddler shows a different date in the JSON: 2019-01-31T18:30:00.000Z. I don't understand why there is a difference of 5 hours 30 minutes.
Locally, the date saved in the database is 01-02-2019, while in the production environment the previous date gets saved in the table records.
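A minimal sketch of what I suspect is happening (the 5 hr 30 min difference matches a UTC+05:30 timezone such as IST, and JavaScript Date objects serialize to UTC):

// run in a UTC+05:30 timezone
const d = new Date(2019, 1, 1);   // 2019-02-01 00:00 local time (months are 0-based)
console.log(d.toISOString());     // "2019-01-31T18:30:00.000Z", shifted back 5h30m to UTC

// a possible workaround: send a date-only string so no timezone conversion applies
const payload = { date: "2019-02-01" };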
Thanks in advance. Please help me with the solutions.
I have a Pentaho Data Integration job which has the following steps:
A Generate Rows step which has an initial date (e.g. 2010-01-01) and the limit set to 10*366 = 3660 rows for 10 years.
The next step has an incrementer to increment the number of days.
The next step uses the initial date, the limit, and the incremented value to generate a date for each day of the 10 years starting 2010-01-01, using JavaScript functions (sketched below).
A final step loads a table with the generated dates.
All this works fine.
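For reference, the JavaScript step does roughly the following (a sketch; field names are illustrative, and in PDI this lives in a Modified Java Script Value step):

// initial_date comes from the Generate Rows step, increment from the incrementer
var generated = new Date(initial_date.getTime());
generated.setDate(generated.getDate() + increment); // add N days to the initial date
var generated_date = generated;                     // new field written to the output row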
Now I have a requirement that the table not stay static with only 10 years of dates. If the max date in the date table is within 2 years of today, I want to load 10 more years of dates into the table.
For the above example, where the first load covers the 10 years from 2010, I should be able to load 10 more years in 2018, the next 10 in 2028, and so on.
What will be the best way to achieve this?
How can I:
1) Read the max date from my date table? (I know how to do this.)
2) Compare the date I read against today, and if the max date is within 2 years of today, populate the table with the next 10 years?
I don't know how to do step 2 in Pentaho Data Integration, so I would really appreciate any pointers on a way to resolve this.
You need to read the current date (today) into a field, for example with a Get system info step.
Then you can compare the two fields, max date and today, with a Filter Rows step.
As the previous steps may give you more than one row, you also need either a Unique rows step (with no fields to provide) or a Group by step (with no group-by field), so that a single row goes through.
If a row gets through, you launch your generate-10-years process. As you cannot draw a hop from a step into that second Generate Rows step, use a Transformation Executor to launch your currently existing transformation.
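If you prefer to compute the condition in code rather than in the Filter Rows dialog, a Modified Java Script Value step could set a flag along these lines (a sketch; field names are illustrative):

// max_date: max date read from the date table; today: from Get system info
var threshold = new Date(max_date.getTime());
threshold.setFullYear(threshold.getFullYear() - 2); // max_date minus 2 years
// true when today has come within 2 years of the max date, i.e. time to extend
var needs_extension = (today.getTime() >= threshold.getTime());

You can then filter on needs_extension before the Transformation Executor.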
Now, if your requirement gets even a little more complex than that, I strongly suggest you use jobs to orchestrate your transformations.
I used to have a number of queries running on the past 40 days of data using a table decorator like [dataset.table@-4123456789-].
However, since September 15 all the decorators return at most 10 days of data.
By the way, [dataset.table@0] returns the whole table and not the past 7 days as stated in the documentation.
Does anyone know what is going on? Do I have to move my table to a partitioned table in order to query data for a limited period of time longer than a week?
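For reference, my understanding is that the equivalent standard SQL against an ingestion-time partitioned table would look roughly like this (table name is illustrative):

SELECT *
FROM `mydataset.mytable`
WHERE _PARTITIONTIME >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 40 DAY);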
Thanks
I'm trying to create a search in Splunk that will show me only the results during non-working hours, i.e. I want Splunk to filter out from the logs all the events that occur from 8am to 5pm.
Currently the query I'm using is earliest=-1mon, so I get all the events from the last month, but I only need the events that occurred outside working hours.
Is it possible?
There are probably multiple ways of doing this in Splunk; below is one. I am extracting the hour (24-hour format) into c_time and then limiting the results to events outside working hours, i.e. from 5pm to 8am (the two conditions must be joined with OR, since no single hour can satisfy both). You can add other filters like earliest and latest to be more specific. Hope this helps.
... | convert timeformat="%H" ctime(_time) AS c_time | search c_time>=17 OR c_time<8
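Putting it together with the one-month window from the question, the full search might look roughly like this (index and sourcetype are placeholders):

index=your_index sourcetype=your_sourcetype earliest=-1mon
| convert timeformat="%H" ctime(_time) AS c_time
| where tonumber(c_time) >= 17 OR tonumber(c_time) < 8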
-Neeraj.