Bar Chart in lucidworks/Banana

Is it possible to display data in a bar chart as in the picture below (a non-time-series dashboard)?
[screenshot of the desired bar chart]
I have tried many times. Using the bar panel, I can only select one field; for example, if I select the "Month" field, the bar chart shows the frequency of each month rather than the Sentiment_Score values. Below is the sample data.
Month | Sentiment_Score
Jan | 378
Feb | 267
Mar | 247
Apr | 179
May | 945
Jun | 477
Jul | 923
Aug | 348
Sep | 868
Oct | 435
Nov | 328
Dec | 395

Related

Create a calculated column based on two columns in SQL

I have the table below and need to create a calculated column (RA) based on the category and month columns.
Oa Sa Ai month MDY
5 10 2 Jan J302022
16 32 38 Jan J302022
15 14 4 Feb J302022
46 32 81 Jan J302022
3 90 0 Mar J302022
51 10 21 Jan J302021
19 32 3 Jan J302021
45 16 41 Feb J302021
46 7 81 Jan J302022
30 67 14 Mar J302021
45 16 41 Apr J302021
46 7 81 Apr J302021
30 67 0 Jan J302021
56 17 0 Mar J302022
First, it needs to consider a category, for example J302022, and then calculate the "RA" column based on the month within that category. For example, for J302022 and Jan: ((5+16+46+46)+(10+32+32+7)) / (2+38+81+81) = 0.96.
Below is what the expected output looks like.
Oa Sa Ai month category RA
5 10 2 Jan J302022 0.96
16 32 38 Jan J302022 0.96
15 14 4 Feb J302022 7.25
46 32 81 Jan J302022 0.96
3 90 0 Mar J302022 0
51 10 21 Jan J302021 8.70
19 32 3 Jan J302021 8.70
45 16 41 Feb J302021 1.48
46 7 81 Jan J302022 0.96
30 67 14 Mar J302021 6.92
45 16 41 Apr J302021 1.48
46 7 81 Apr J302022 0.65
30 67 0 Jan J302021 8.70
56 17 0 Mar J302022 0
Is it possible to do it in SQL?
Thanks in advance!
select Oa, Sa, Ai, month, category,
       coalesce((ra1 + ra2) / ra3, 0) as RA
from (
    select Oa, Sa, Ai, month, mdy as category,
           sum(Oa) over (partition by month, mdy) as ra1,
           sum(Sa) over (partition by month, mdy) as ra2,
           sum(Ai) over (partition by month, mdy) as ra3
    from WhateverYourTableNameIs
) as t;
Output on MySQL 8.0.29:
+------+------+------+-------+----------+--------+
| Oa | Sa | Ai | month | category | RA |
+------+------+------+-------+----------+--------+
| 45 | 16 | 41 | Apr | J302021 | 0.9344 |
| 46 | 7 | 81 | Apr | J302021 | 0.9344 |
| 45 | 16 | 41 | Feb | J302021 | 1.4878 |
| 15 | 14 | 4 | Feb | J302022 | 7.2500 |
| 51 | 10 | 21 | Jan | J302021 | 8.7083 |
| 19 | 32 | 3 | Jan | J302021 | 8.7083 |
| 30 | 67 | 0 | Jan | J302021 | 8.7083 |
| 5 | 10 | 2 | Jan | J302022 | 0.9604 |
| 16 | 32 | 38 | Jan | J302022 | 0.9604 |
| 46 | 32 | 81 | Jan | J302022 | 0.9604 |
| 46 | 7 | 81 | Jan | J302022 | 0.9604 |
| 30 | 67 | 14 | Mar | J302021 | 6.9286 |
| 3 | 90 | 0 | Mar | J302022 | 0.0000 |
| 56 | 17 | 0 | Mar | J302022 | 0.0000 |
+------+------+------+-------+----------+--------+
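If you want the two-decimal figures shown in the expected output (0.96, 7.25, 8.70, ...), a small variation of the same query, wrapping the expression in ROUND, should do it. This is just a sketch of that tweak, not a different approach:
-- Same window sums as above, rounded to two decimals
select Oa, Sa, Ai, month, category,
       round(coalesce((ra1 + ra2) / ra3, 0), 2) as RA
from (
    select Oa, Sa, Ai, month, mdy as category,
           sum(Oa) over (partition by month, mdy) as ra1,
           sum(Sa) over (partition by month, mdy) as ra2,
           sum(Ai) over (partition by month, mdy) as ra3
    from WhateverYourTableNameIs
) as t;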
For SQL Server:
select Oa, Sa, Ai, [Month], Category,
       case when (sum(Ai) over (partition by [Month], Category) * 1.0) = 0 then 0
            else (sum(Oa) over (partition by [Month], Category) +
                  sum(Sa) over (partition by [Month], Category)) /
                 (sum(Ai) over (partition by [Month], Category) * 1.0)
       end as Ra
from #temp
order by Category desc
The denominator is multiplied by 1.0 to force floating-point division; otherwise integer division would truncate the result.

How to get weekly data starting from the first date of the month and do the SUM calculation accordingly in BigQuery?

I have an issue pulling this kind of data. I need to pull weekly data with these specifications:
- The data pull will be scheduled, so it will span multiple months
- The very first week of each month starts from the first date (the 1st of every month) -- green in the pic
- The last week of a month doesn't involve dates from the next month -- red in the pic
The raw data and the desired output will look more or less like this. Is there any workaround to do this in BigQuery? Thanks (the data is attached below).
+-------------+-------+
| date | sales |
+-------------+-------+
| 1 Oct 2021 | 5 |
+-------------+-------+
| 2 Oct 2021 | 13 |
+-------------+-------+
| 3 Oct 2021 | 75 |
+-------------+-------+
| 4 Oct 2021 | 3 |
+-------------+-------+
| 5 Oct 2021 | 70 |
+-------------+-------+
| 6 Oct 2021 | 85 |
+-------------+-------+
| 7 Oct 2021 | 99 |
+-------------+-------+
| 8 Oct 2021 | 90 |
+-------------+-------+
| 9 Oct 2021 | 68 |
+-------------+-------+
| 10 Oct 2021 | 97 |
+-------------+-------+
| 11 Oct 2021 | 87 |
+-------------+-------+
| 12 Oct 2021 | 56 |
+-------------+-------+
| 13 Oct 2021 | 99 |
+-------------+-------+
| 14 Oct 2021 | 38 |
+-------------+-------+
| 15 Oct 2021 | 6 |
+-------------+-------+
| 16 Oct 2021 | 43 |
+-------------+-------+
| 17 Oct 2021 | 45 |
+-------------+-------+
| 18 Oct 2021 | 90 |
+-------------+-------+
| 19 Oct 2021 | 64 |
+-------------+-------+
| 20 Oct 2021 | 26 |
+-------------+-------+
| 21 Oct 2021 | 24 |
+-------------+-------+
| 22 Oct 2021 | 4 |
+-------------+-------+
| 23 Oct 2021 | 36 |
+-------------+-------+
| 24 Oct 2021 | 68 |
+-------------+-------+
| 25 Oct 2021 | 4 |
+-------------+-------+
| 26 Oct 2021 | 16 |
+-------------+-------+
| 27 Oct 2021 | 30 |
+-------------+-------+
| 28 Oct 2021 | 89 |
+-------------+-------+
| 29 Oct 2021 | 46 |
+-------------+-------+
| 30 Oct 2021 | 28 |
+-------------+-------+
| 31 Oct 2021 | 28 |
+-------------+-------+
| 1 Nov 2021 | 47 |
+-------------+-------+
| 2 Nov 2021 | 75 |
+-------------+-------+
| 3 Nov 2021 | 1 |
+-------------+-------+
| 4 Nov 2021 | 26 |
+-------------+-------+
| 5 Nov 2021 | 26 |
+-------------+-------+
| 6 Nov 2021 | 38 |
+-------------+-------+
| 7 Nov 2021 | 79 |
+-------------+-------+
| 8 Nov 2021 | 37 |
+-------------+-------+
| 9 Nov 2021 | 83 |
+-------------+-------+
| 10 Nov 2021 | 97 |
+-------------+-------+
| 11 Nov 2021 | 56 |
+-------------+-------+
| 12 Nov 2021 | 83 |
+-------------+-------+
| 13 Nov 2021 | 14 |
+-------------+-------+
| 14 Nov 2021 | 25 |
+-------------+-------+
| 15 Nov 2021 | 55 |
+-------------+-------+
| 16 Nov 2021 | 16 |
+-------------+-------+
| 17 Nov 2021 | 80 |
+-------------+-------+
| 18 Nov 2021 | 66 |
+-------------+-------+
| 19 Nov 2021 | 25 |
+-------------+-------+
| 20 Nov 2021 | 62 |
+-------------+-------+
| 21 Nov 2021 | 36 |
+-------------+-------+
| 22 Nov 2021 | 33 |
+-------------+-------+
| 23 Nov 2021 | 19 |
+-------------+-------+
| 24 Nov 2021 | 47 |
+-------------+-------+
| 25 Nov 2021 | 14 |
+-------------+-------+
| 26 Nov 2021 | 22 |
+-------------+-------+
| 27 Nov 2021 | 66 |
+-------------+-------+
| 28 Nov 2021 | 15 |
+-------------+-------+
| 29 Nov 2021 | 96 |
+-------------+-------+
| 30 Nov 2021 | 4 |
+-------------+-------+
Consider the approach below:
with temp as (
  select parse_date('%d %B %Y', date) as date, sales
  from your_table
)
select format_date('%d %B %Y', weeks[ordinal(num)]) as start_week,
       sum(sales) as total_sales
from (
  select sales, weeks, range_bucket(date, weeks) as num
  from temp,
       unnest([struct(generate_date_array(date_trunc(date, month), last_day(date, month), interval 7 day) as weeks)])
)
group by start_week
Applied to the sample data in your question (as is), the output is as shown in the screenshot (not included here).
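For reference, here is a hedged alternative sketch that produces the same monthly-restarting weeks without range_bucket: each row is bucketed by how many full 7-day blocks have passed since the 1st of its month. It reuses the parse_date step and the your_table name from the answer above, and returns the week start as a DATE rather than a formatted string.
with temp as (
  select parse_date('%d %B %Y', date) as date, sales
  from your_table
)
select date_add(date_trunc(date, month),
                -- number of complete 7-day blocks since the 1st of the month
                interval 7 * div(extract(day from date) - 1, 7) day) as start_week,
       sum(sales) as total_sales
from temp
group by start_week
order by start_week;
Because the offset is always computed from the 1st of the row's own month, the last bucket of a month can never spill into the next month.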

How to create churn table from transactional data?

Currently my Transaction Table holds each customer's transaction data for each month. Account_ID identifies the customer. Order_ID is the number of orders the customer has made. Reporting_week_start_date is the Monday-starting week in which each transaction (Date_Purchased) falls.
How do I create a new table that identifies the customer_status after each transaction has been made? Note that the new table carries Reporting_week_start_date up to the current date even when no transactions have been made.
Customer_Status
- New : customers who made their first paid subscription
- Recurring : customers with continuous payment
- Churned : when customers' subscriptions had expired and there's no renewal within the next month/same month
- Reactivated : customers who had churned and then returned to re-subscribe
Transaction Table
Account_ID | Order_ID | Reporting_week_start_date| Date_Purchased | Data_Expired
001 | 1001 | 31 Dec 2018 | 01 Jan 2019 | 08 Jan 2019
001 | 1001 | 07 Jan 2019 | 08 Jan 2019 | 15 Jan 2019
001 | 1001 | 14 Jan 2019 | 15 Jan 2019 | 22 Jan 2019 #Transaction 1
001 | 1001 | 21 Jan 2019 | 22 Jan 2019 | 29 Jan 2019
001 | 1001 | 28 Jan 2019 | 29 Jan 2019 | 31 Jan 2019
001 | 1002 | 28 Jan 2019 | 01 Feb 2019 | 08 Feb 2019
001 | 1002 | 04 Feb 2019 | 08 Feb 2019 | 15 Feb 2019 #Transaction 2
001 | 1002 | 11 Feb 2019 | 15 Feb 2019 | 22 Feb 2019
001 | 1002 | 18 Feb 2019 | 22 Feb 2019 | 28 Feb 2019
001 | 1003 | 25 Feb 2019 | 01 Mar 2019 | 08 Mar 2019
001 | 1003 | 04 Mar 2019 | 08 Mar 2019 | 15 Mar 2019
001 | 1003 | 11 Mar 2019 | 15 Mar 2019 | 22 Mar 2019 #Transaction 3
001 | 1003 | 18 Mar 2019 | 22 Mar 2019 | 29 Mar 2019
001 | 1003 | 25 Mar 2019 | 29 Mar 2019 | 31 Mar 2019
001 | 1004 | 27 May 2019 | 01 Jun 2019 | 08 Jun 2019
001 | 1004 | 03 Jun 2019 | 08 Jun 2019 | 15 Jun 2019 #Transaction 4
001 | 1004 | 10 Jun 2019 | 15 Jun 2019 | 22 Jun 2019
001 | 1004 | 17 Jun 2019 | 22 Jun 2019 | 29 Jun 2019
001 | 1004 | 24 Jun 2019 | 29 Jun 2019 | 30 Jun 2019
Expected Output
Account_ID | Order_ID | Reporting_week_start_date| Customer_status
001 | 1001 | 31 Dec 2018 | New
001 | 1001 | 07 Jan 2019 | New #Transaction 1
001 | 1001 | 14 Jan 2019 | New
001 | 1001 | 21 Jan 2019 | New
001 | 1001 | 28 Jan 2019 | New
001 | 1002 | 28 Jan 2019 | Recurring
001 | 1002 | 04 Feb 2019 | Recurring #Transaction 2
001 | 1002 | 11 Feb 2019 | Recurring
001 | 1002 | 18 Feb 2019 | Recurring
001 | 1003 | 25 Feb 2019 | Churned
001 | 1003 | 04 Mar 2019 | Churned #Transaction 3
001 | 1003 | 11 Mar 2019 | Churned
001 | 1003 | 18 Mar 2019 | Churned
001 | 1003 | 25 Mar 2019 | Churned
001 | - | 1 Apr 2019 | Churned
001 | - | 08 Apr 2019 | Churned
001 | - | 15 Apr 2019 | Churned
001 | - | 22 Apr 2019 | Churned
001 | - | 29 Apr 2019 | Churned
001 | - | 29 Apr 2019 | Churned
001 | - | 06 May 2019 | Churned
001 | - | 13 May 2019 | Churned
001 | - | 20 May 2019 | Churned
001 | - | 27 May 2019 | Churned
001 | 1004 | 27 May 2019 | Reactivated
001 | 1004 | 03 Jun 2019 | Reactivated #Transaction 4
001 | 1004 | 10 Jun 2019 | Reactivated
001 | 1004 | 17 Jun 2019 | Reactivated
001 | 1004 | 24 Jun 2019 | Reactivated
...
...
...
current date
I think you just want window functions and case logic. Assuming the date you are referring to is Reporting_week_start_date, then the logic looks something like this:
select t.*,
       (case when Reporting_week_start_date = min(Reporting_week_start_date) over (partition by account_id)
                  then 'New'
             when Reporting_week_start_date < date_add(lag(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval 1 month)
                  then 'Recurring'
             when Reporting_week_start_date < date_add(lead(Reporting_week_start_date) over (partition by account_id order by Reporting_week_start_date), interval -1 month)
                  then 'Churned'
             else 'Reactivated'
        end) as status
from transactions t;
These are not exactly the rules you have specified, but they seem like very reasonable interpretations of what you want to do. (The date arithmetic above uses MySQL-style date_add; adjust it to your database's equivalent.)
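To also get the weeks with no transactions up to the current date (the '-' rows in the expected output), one option is to generate the week spine first and left-join the transactions onto it. A minimal sketch, assuming MySQL 8 syntax and the transactions table name used above:
with recursive weeks as (
  select date '2018-12-31' as week_start            -- first Reporting_week_start_date in the sample
  union all
  select week_start + interval 7 day
  from weeks
  where week_start + interval 7 day <= curdate()    -- extend the spine up to today
)
select w.week_start as Reporting_week_start_date,
       t.Account_ID,
       t.Order_ID
from weeks w
left join transactions t
  on t.Reporting_week_start_date = w.week_start
order by w.week_start;
Weeks whose Order_ID comes back NULL are the candidate 'Churned' rows; the case expression above can then be applied on top of this spine.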

SQL Server : Insert blank line in query

This may be so last year but I'm using SQL Server 2005
stmpdate intime
----------------------
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
I have this table. It keeps the check-in time of the employee, but this employee didn't check in every day of the month. So what I want is something like this:
stmpdate intime
1 2014-10-01
2 2014-10-02
3 2014-10-03
4 2014-10-04
5 2014-10-05
6 2014-10-06
7 2014-10-07
8 2014-10-08 08:04:43
9 2014-10-09 07:57:13
10 2014-10-10 07:57:14
11 2014-10-11
12 2014-10-12
13 2014-10-13
14 2014-10-14
15 2014-10-15
16 2014-10-16 07:59:56
17 2014-10-17 07:45:56
18 2014-10-18
19 2014-10-19
20 2014-10-20
21 2014-10-21
22 2014-10-22
23 2014-10-23
24 2014-10-24
25 2014-10-25
26 2014-10-26
27 2014-10-27
28 2014-10-28
29 2014-10-29
30 2014-10-30
31 2014-10-31
I tried to make a temp table which contains every date in the month, and then left join it with the first table I mentioned, but it didn't seem to work.
create table #datetemp (
    stmpdate varchar(10)
);
insert into #datetemp
SELECT '2014-10-01'
UNION ALL
SELECT '2014-10-02'
UNION ALL
SELECT '2014-10-03'
....
and
SELECT dtt.stmpdate, intime
FROM #datetemp dtt left join v_dayTimesheet
on dtt.stmpdate=v_dayTimesheet.stmpdate
WHERE (emp_no = '001234567')
Here is the result of the query above:
stmpdate intime
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
and here is the result of select * from #datetemp
2014-10-01
2014-10-02
2014-10-03
2014-10-04
2014-10-05
2014-10-06
2014-10-07
2014-10-08
2014-10-09
2014-10-10
2014-10-11
2014-10-12
2014-10-13
2014-10-14
2014-10-15
2014-10-16
2014-10-17
2014-10-18
2014-10-19
2014-10-20
2014-10-21
2014-10-22
2014-10-23
2014-10-24
2014-10-25
2014-10-26
2014-10-27
2014-10-28
2014-10-29
2014-10-30
2014-10-31
You're filtering for rows where emp_no has a value. If the employee didn't check in, that row won't be returned, because it only has date info and no employee number. So you have to allow for the value being equal or null:
SELECT dtt.stmpdate, intime
FROM #datetemp dtt
left outer join v_dayTimesheet
on dtt.stmpdate=v_dayTimesheet.stmpdate
WHERE emp_no = '001234567' or emp_no is null
Also, for your dates, check this out: http://www.sqlservercurry.com/2010/03/generate-start-and-end-date-range-using.html
DECLARE @StartDate datetime = '2010-01-01',
        @EndDate   datetime = '2010-03-01';

WITH datetemp AS
(
    SELECT @StartDate AS stmpdate
    UNION ALL
    SELECT DATEADD(day, 1, stmpdate)
    FROM datetemp
    WHERE DATEADD(day, 1, stmpdate) <= @EndDate
)
SELECT stmpdate
FROM datetemp;
You would then select from datetemp as a normal table. Beware, though: a common table expression can only be referenced once, in the single statement that immediately follows the WITH definition.
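One caveat worth noting (not in the linked post): SQL Server caps recursive CTEs at 100 recursion levels by default, so a range longer than about three months will fail unless you add the MAXRECURSION hint to the statement that consumes the CTE. A hedged sketch of the same generator with the cap lifted:
DECLARE @StartDate datetime = '2010-01-01',
        @EndDate   datetime = '2010-12-31';   -- a longer range than the example above

WITH datetemp AS
(
    SELECT @StartDate AS stmpdate
    UNION ALL
    SELECT DATEADD(day, 1, stmpdate)
    FROM datetemp
    WHERE DATEADD(day, 1, stmpdate) <= @EndDate
)
SELECT stmpdate
FROM datetemp
OPTION (MAXRECURSION 0);   -- default limit is 100 recursions; 0 removes the cap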
Just trust me on this one... run this query and see how your blank lines occur:
SELECT dtt.stmpdate, intime, emp_no
FROM #datetemp dtt
left outer join v_dayTimesheet
on dtt.stmpdate=v_dayTimesheet.stmpdate
WHERE emp_no = '001234567' or emp_no is null
All these lines will be returned with emp_no = 001234567:
stmpdate intime
2014-10-08 08:04:43
2014-10-09 07:57:13
2014-10-10 07:57:14
2014-10-16 07:59:56
2014-10-17 07:45:56
And all your blank lines will have NULL as emp_no.
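Another option, instead of the "or emp_no is null" workaround: put the employee filter in the join condition itself, so unmatched calendar dates still survive the left join. A minimal sketch using the same #datetemp and v_dayTimesheet names:
SELECT dtt.stmpdate, v.intime
FROM #datetemp dtt
LEFT OUTER JOIN v_dayTimesheet v
    ON dtt.stmpdate = v.stmpdate
   AND v.emp_no = '001234567'   -- filter applied during the join, not after it
ORDER BY dtt.stmpdate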
I got my answer!!
SELECT dtt.stmpdate, intime
FROM #datetemp dtt
LEFT JOIN
(
    SELECT stmpdate, intime
    FROM v_dayTimesheet
    WHERE emp_no = '001234567'
) AS vdayTimesheet
    ON dtt.stmpdate = vdayTimesheet.stmpdate
ORDER BY dtt.stmpdate
This is what I want. Thanks, everyone!
SQL Query (SQL Fiddle example):
SELECT t2.dt,
isnull(t1.intime, '') intime
FROM
(
SELECT DATEADD(day,number,'2014-10-01') dt
FROM master..spt_values
WHERE Type = 'P'
AND DATEADD(day,number,'2014-10-01') >= '2014-10-01'
AND DATEADD(day,number,'2014-10-01') < '2014-11-01'
) t2
LEFT JOIN Table1 t1
ON t1.stmpdate = t2.dt
Result:
| DT | INTIME |
|--------------------------------|----------|
| October, 01 2014 00:00:00+0000 | |
| October, 02 2014 00:00:00+0000 | |
| October, 03 2014 00:00:00+0000 | |
| October, 04 2014 00:00:00+0000 | |
| October, 05 2014 00:00:00+0000 | |
| October, 06 2014 00:00:00+0000 | |
| October, 07 2014 00:00:00+0000 | |
| October, 08 2014 00:00:00+0000 | 08:04:43 |
| October, 09 2014 00:00:00+0000 | 07:57:13 |
| October, 10 2014 00:00:00+0000 | 07:57:14 |
| October, 11 2014 00:00:00+0000 | |
| October, 12 2014 00:00:00+0000 | |
| October, 13 2014 00:00:00+0000 | |
| October, 14 2014 00:00:00+0000 | |
| October, 15 2014 00:00:00+0000 | |
| October, 16 2014 00:00:00+0000 | 07:59:56 |
| October, 17 2014 00:00:00+0000 | 07:45:56 |
| October, 18 2014 00:00:00+0000 | |
| October, 19 2014 00:00:00+0000 | |
| October, 20 2014 00:00:00+0000 | |
| October, 21 2014 00:00:00+0000 | |
| October, 22 2014 00:00:00+0000 | |
| October, 23 2014 00:00:00+0000 | |
| October, 24 2014 00:00:00+0000 | |
| October, 25 2014 00:00:00+0000 | |
| October, 26 2014 00:00:00+0000 | |
| October, 27 2014 00:00:00+0000 | |
| October, 28 2014 00:00:00+0000 | |
| October, 29 2014 00:00:00+0000 | |
| October, 30 2014 00:00:00+0000 | |
| October, 31 2014 00:00:00+0000 | |

How can I obtain the sum of each row in PostgreSQL?

I have some rows of results like those below. If the sum of each row is the same, I can assume the results are correct. How can I write a SQL clause to calculate the sum of each row? Thanks.
27 | 29 | 27 | 36 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 27 | 29 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 27 | 14 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 16 | 37 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
27 | 29 | 16 | 36 | 33 | 29 | 16 | 17 | 35 | 28 | 34 | 15
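A minimal sketch of one way to do this, assuming the twelve values are columns named c1 through c12 in a table (or subquery) called results; both names are placeholders, not from the original post:
-- Sum the twelve columns for every row; identical row_sum values suggest the results are consistent
SELECT c1 + c2 + c3 + c4 + c5 + c6 + c7 + c8 + c9 + c10 + c11 + c12 AS row_sum,
       *
FROM results;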