Access SQL query to mail merge

How can I transform this table from this:
id   name
1    sam
2    nick
3    ali
4    farah
5    josef
6    fadi
to this:
id1  name1  id2  name2  id3  name3  id4  name4
1    sam    2    nick   3    ali    4    farah
5    josef  6    fadi
The reason I need this: I have a database and I need to do a mail merge using Word, printing every four rows on one page. MS Word can only print one record per page, so with an SQL query I want one row to represent four rows.
Thanks in advance
Ali

You don't need to create a query for this in Access. Word has a merge field called <<Next Record>> which forces moving to the next record. If you look at how label documents are created using the Mail Merge Wizard, you'll see that's how it's done.
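For four records per page, the main merge document can be laid out like a label sheet. A rough sketch, assuming the merge fields are named after your columns id and name:
<<id>> <<name>>
<<Next Record>><<id>> <<name>>
<<Next Record>><<id>> <<name>>
<<Next Record>><<id>> <<name>>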
Updated - Doing this in SQL
The columns in simple SELECT statements are derived from the columns from the underlying table/query (or from expressions). If you want to define columns based on the data, you need to use a crosstab query.
First create a query with a running count for each person (say your table is called People), and calculate the row and column position from the running count:
SELECT People.id,
       Count(*) - 1 AS RunningCount,
       Int(RunningCount / 4) AS RowNumber,
       RunningCount Mod 4 AS ColumnNumber
FROM People
LEFT JOIN People AS People_1 ON People.id >= People_1.id
GROUP BY People.id;
(You won't be able to view this in the Query Designer, because the JOIN isn't comparing with = but with >=.)
This query returns the following results:
id  RunningCount  RowNumber  ColumnNumber
1   0             0          0
2   1             0          1
3   2             0          2
4   3             0          3
5   4             1          0
6   5             1          1
Assuming this query is saved as Positions, the following query will return the results:
TRANSFORM First(Item) AS FirstOfItem
SELECT RowNumber
FROM (
    SELECT ID AS Item, RowNumber, "id" & (ColumnNumber + 1) AS ColumnHeading
    FROM Positions
    UNION ALL
    SELECT Name, RowNumber, "name" & (ColumnNumber + 1)
    FROM Positions
    INNER JOIN People ON Positions.id = People.id
) AS AllValues
GROUP BY AllValues.RowNumber
PIVOT AllValues.ColumnHeading In ("id1","name1","id2","name2","id3","name3","id4","name4");
The UNION is there so each record in the People table contributes two rows - one carrying the id and one carrying the name - which the pivot then turns into the idN and nameN columns.
The PIVOT ... In (...) clause forces the columns into the specified order rather than alphabetical order (which would be id1, id2 ... name1, name2 ...).
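For the sample data in the question, the crosstab should then return something like:
RowNumber  id1  name1  id2  name2  id3  name3  id4  name4
0          1    sam    2    nick   3    ali    4    farah
1          5    josef  6    fadi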

Related

Need to find out if all columns in a SQL Server table have the same value

I have the task of finding out whether all columns in a SQL Server table have exactly the same value. The table content is created by a stored procedure and can vary in the number of columns. The first column is an ID; the second and following columns must be compared to check whether they all have exactly the same value.
At the moment I do not have a clue how to achieve this.
The best solution would be to display only the rows which have different values in one or more columns, excluding the first column with the ID.
Thank you so much for your help!!
Edit: The table looks like this:
ID  Instance1  Instance2  Instance3  Instance4  Instance5
==========================================================
A   1          1          1          1          1
B   1          1          0          1          1
C   55         55         55         55         55
D   Driver     Driver     Driver     Co-driver  Driver
E   90         0          90         0          50
F   On         On         On         On         On
The result should look like this; only the rows with one or more differing column values should be displayed.
ID  Instance1  Instance2  Instance3  Instance4  Instance5
==========================================================
B   1          1          0          1          1
D   Driver     Driver     Driver     Co-driver  Driver
E   90         0          90         0          50
My table has more than 1000 rows and 40 columns
You can achieve this by using row_number().
Try the following code:
With c as (
    Select id
         , field_1
         , field_2
         , field_3
         , field_n
         , row_number() over (partition by field_1, field_2, field_3, field_n order by id asc) as rn
    From YourTable
)
Select *
From c
Where rn = 1
row_number() with PARTITION BY shows you whether a field combination is repeated by assigning a number to each row based on field_1, field_2, field_3, field_n. For example, if you have 2 rows with the same field values, the inner query will show you:
rn  field_1  field_2  field_3  field_n  id
1   x        y        z        a        5
2   x        y        z        a        9
After that, the outer part of the query picks rn = 1, and you obtain a result without repetitions based on those fields.
Also, if you want to delete the repeated rows from your table, you can apply:
With c as (
    Select id
         , field_1
         , field_2
         , field_3
         , field_n
         , row_number() over (partition by field_1, field_2, field_3, field_n order by id asc) as rn
    From YourTable
)
Delete
From c
Where rn > 1
The best solution would be to display only the rows, which have different values in one or multiple columns except the first column with ID.
You may be looking for the following simple query, whose WHERE clause filters out rows where all fields have the same value (I assumed 5 fields, id not included).
SELECT *
FROM mytable t
WHERE NOT (
field1 = field2
AND field1 = field3
AND field1 = field4
AND field1 = field5
);
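Applied to the column names from the question, that would look roughly like this (a sketch; it assumes the Instance columns are never NULL, since a NULL makes the comparisons evaluate to UNKNOWN and the row would be dropped):
SELECT *
FROM mytable t
WHERE NOT (
    Instance1 = Instance2
    AND Instance1 = Instance3
    AND Instance1 = Instance4
    AND Instance1 = Instance5
);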

PostgreSQL - Get all subgroups from text array

I have a table with Job Titles in it
e.g.
Need a Barista on the Weekend
Need a Barista, 24$ an hour
Needed on the weekend, baby sitter, 24$ an hour
I am trying to get a count of unique phrases
e.g.
- 2 x Need a
- 2 x Need a Barista
- 2 x on the Weekend
- 2 x on the
- 2 x 24$ an hour
I have created a table to turn my text into an array of words:
CREATE TABLE IF NOT EXISTS job_words (
source VARCHAR,
title VARCHAR,
words VARCHAR[]
)
I have split my titles and inserted them as words into this table:
insert into job_words
select 'job-title', raw_title, string_to_array(raw_title, ' ') from jobs
The longest sentences have 49 words in them
I would like to find any phrase that is between 2 and 10 words long
Happy to use another table to write into or just a direct query if it is possible
Sample Query to Get some Sample Data
select cardinality(words) no_of_words, words, title from job_words
where cardinality(words) > 4 and cardinality(words) < 10 and title ilike 'need a%'
order by title limit 100
Sample Data
8;"{Need,a,baby,sitter,for,an,amazing,girl}"
7;"{Need,a,baby,sitter,for,casual,sitting}"
8;"{Need,a,babysitter,for,our,19,months,old}"
9;"{Need,a,babysitter,for,our,4,year,old,son}"
9;"{Need,a,babysitter,for,our,little,reyon,19,months}"
7;"{Need,a,babysitter,-,look,no,further}"
5;"{Need,a,babysitter,or,tutor?}"
9;"{Need,a,baby,sitter,tonight,kids,are,already,sleeping}"
6;"{Need,a,Baker,now?,I'm,available!}"
6;"{Need,a,barista,all,rounder,ASAP}"
8;"{NEED,A,BARISTA???,full,time,or,part,time.}"
5;"{Need,a,brick,labourer,urgently}"
9;"{Need,a,care,giver,for,a,Month,old,baby}"
7;"{Need,a,Carer,-,After,School,hours}"
7;"{Need,a,Carpenter,-,build,a,cubby}"
5;"{Need,a,Carwash,staff,asap}"
5;"{Need,a,catering,assistant,job}"
9;"{Need,a,change,from,customer,service?,Look,no,further!}"
5;"{Need,a,change,of,scenery?}"
6;"{NEED,A,CLEANER,-,asap,start}"
6;"{Need,a,cleaner,for,daily,work}"
6;"{Need,a,cleaner,for,daily,work}"
9;"{Need,a,Cleaner,for,hotel,in,Belmont,near,Geelong}"
9;"{Need,a,Cleaner,for,hotel,in,Fyansford,near,Geelong}"
9;"{Need,a,Cleaner,for,hotel,in,Queenscliff,near,Geelong}"
5;"{Need,a,cleaner,for,mcdownal}"
7;"{Need,a,cleaner,for,tomorrow,pay,cash}"
7;"{Need,a,cleaner,for,tomorrow,pay,cash}"
5;"{Need,a,cleaner,in,Brisbane}"
6;"{Need,a,cleaner,in,Roxburghpark,Area}"
7;"{Need,a,cleaner,on,a,weekly,basis}"
7;"{Need,a,cleaner,on,Sunday,18th,June}"
9;"{Need,a,cleaning,team,for,your,building,or,office?}"
8;"{Need,a,concreter,to,start,full,time,/paving}"
6;"{Need,a,contract,climber,on,Tuesday}"
7;"{Need,a,cook,for,Road,Trip,Film}"
6;"{Need,a,delivery,driver,in,kew}"
8;"{Need,a,dishwasher,-,Wetherill,Park,6,days}"
7;"{Need,admin,done,for,hair,salon,asap}"
7;"{Need,admin,done,for,hair,salon,asap}"
7;"{Need,admin,done,for,hair,salon,asap}"
6;"{Need,a,driver,at,8:00,tonight}"
8;"{Need,a,driver,for,my,4.5,tomne,truck}"
7;"{Need,a,driver,in,a,Korean,restaurant}"
6;"{NEED,A,EXPERIENCED,CAR,WASH,STUFF}"
8;"{Need,a,flexible,babysitter,to,suit,shift,work}"
7;"{Need,a,fridge,picked,up,tommorow,Saturday}"
7;"{Need,after,school,care,with,pick,up}"
5;"{need,a,fulltime,female,cleaner}"
5;"{Need,a,full,time,job}"
8;"{Need,a,full,time,nanny,at,Baulkham,Hills}"
6;"{Need,a,"fun,","reliable,",interactive,babysitter}"
9;"{Need,a,gardener,/,labourer,tomorrow,for,5,hours}"
6;"{Need,a,gardener,or,labourer,tomorrow}"
6;"{Need,a,girl,for,sharing,room}"
8;"{Need,a,girl,or,boy,for,cleaning,job}"
6;"{Need,a,good,Barista,in,putney}"
8;"{Need,a,good,painter,for,the,next,month}"
7;"{Need,a,gyprock,setter,for,monday,23/01/17}"
9;"{Need,a,handy,person,in,our,new,work,shop}"
8;"{Need,a,helper,for,a,house,removals,truck}"
5;"{Need,a,house,Cleaner,?}"
6;"{Need,a,house,cleaner?,Call,now}"
6;"{Need,a,house,cleaner?,CALL,NOW!}"
8;"{Need,a,house,cleaner,for,this,afternoon,$30p/h}"
5;"{Need,a,housekeeper,tomorrow,morning}"
9;"{Need,a,HR,driver,for,one,day,a,week}"
8;"{need,a,invester,for,the,new,restaurant,Urgently}"
7;"{Need,a,job,asap,will,start,tomorrow}"
9;"{Need,a,job,?,Backpackers,wanted,+,FREE,ACCOMODATION}"
5;"{Need,a,job,for,weekend.}"
8;"{need,a,job,in,day,time,and,weekends}"
7;"{Need,a,job,of,cleaning,or,handkitchen}"
5;"{NEED,A,JOB?!,Start,immediately!!}"
5;"{need,a,job,(student,here)}"
6;"{Need,a,job,to,start,asap}"
9;"{Need,a,kitchen,hand,for,Indian,take,away,shop}"
8;"{Need,a,labourer.,Easy,work.,Monday,or,Tues}"
6;"{need,a,labourer,for,7,weeks}"
5;"{Need,a,labourer,for,today}"
5;"{Need,a,labourer,next,week}"
9;"{Need,a,last,minute,barista,or,chef?!,Staff,cancelled?!}"
9;"{Need,a,live,in,nanny,for,our,2,sons}"
8;"{Need,a,local,Electrician?,Look,no,further,:)}"
8;"{Need,a,male,cleaner,for,a,busy,restaurant}"
6;"{Need,a,man,and,a,ute!!!}"
9;"{NEED,A,MAN,AND,UTE,MONDAY,26th,AFTER,5PM}"
9;"{NEED,A,MANSPOWER,TO,HELP,US,IN,OUR,MOVING}"
8;"{Need,an,after-school,nanny,for,month,of,October}"
5;"{Need,a,Nanny,/,Babysitter}"
6;"{Need,a,nanny,for,2,kids}"
9;"{Need,a,nanny,for,3,days,after,school,care}"
7;"{Need,a,Nanny,for,7,year,old}"
9;"{Need,a,nanny,for,a,few,days,a,week}"
9;"{Need,a,nanny,for,after,school,pick,and,care}"
8;"{Need,a,nanny,for,immediate,start,on,Thursdays}"
9;"{Need,a,nanny,for,my,3,year,old,daughter}"
9;"{Need,a,nanny,for,my,3,year,old,daughter}"
6;"{Need,a,nanny,for,one,day.}"
7;"{Need,a,nanny,for,"Rydges,",Campbelltown,5.45-10.30pm}"
I got this. Is this the result you expected?
phrase_part count
------------------------------------------------------------------------- -----
{"Need","a"} 32
{"Need","a","cleaner"} 9
{"a","cleaner"} 9
{"cleaner","for"} 5
{"Need","a","babysitter"} 5
{"a","babysitter"} 5
{"a","cleaner","for"} 5
{"Need","a","cleaner","for"} 5
{"for","hotel","in"} 3
{"near","Geelong"} 3
{"hotel","in"} 3
{"Cleaner","for","hotel"} 3
{"for","hotel"} 3
{"babysitter","for","our"} 3
{"for","our"} 3
{"babysitter","for"} 3
{"a","Cleaner","for"} 3
{"Need","a","babysitter","for"} 3
...
{"NEED","A"} 2
{"a","cleaner","for","daily"} 2
{"Need","a","change"} 2
{"daily","work"} 2
{"pay","cash"} 2
{"a","cleaner","in"} 2
{"Need","a","cleaner","for","tomorrow","pay","cash"} 2
{"for","casual","sitting"} 1
{"concreter","to","start","full"} 1
{"a","change","from","customer","service?","Look"} 1
{"a","contract","climber"} 1
{"Roxburghpark","Area"} 1
{"Need","a","Carwash"} 1
...
If this is your expected result, here's the query for it. But I am not sure whether you should do this with a huge data set!
I began with the plain phrases instead of your sample data with the arrays. Additionally, I added an id column for each phrase:
WITH phrases AS (
    SELECT
        *,
        row_number() OVER (PARTITION BY id) AS nth_word                     -- B
    FROM (
        SELECT
            id,
            unnest(string_to_array(phrase, ' ')) AS word                    -- A
        FROM testdata.phrases
    ) s
)
SELECT
    phrase_part,
    count(phrase_part) FILTER (WHERE cardinality(phrase_part) >= 2)         -- E
FROM (
    SELECT
        *,
        array_agg(b.word) OVER (PARTITION BY a.id, a.nth_word
                                ORDER BY a.id, a.nth_word, b.nth_word) AS phrase_part  -- D
    FROM phrases a
    JOIN phrases b                                                          -- C
        ON a.id = b.id AND a.nth_word <= b.nth_word
) s
GROUP BY phrase_part
ORDER BY count DESC;
A: splits the plain phrases into single words and expands the table to one word per row
B: adds a word counter with a window function to identify the nth word of each phrase
C: joins the phrases with themselves; more precisely: joins each word with every following word of the same phrase
D: this window function aggregates the phrase words. It creates a result like:
a.id  a.word  a.nth_word  b.id  b.word   b.nth_word  phrase_part
----  ------  ----------  ----  -------  ----------  ------------------------------------------------------
1     Need    1           1     Need     1           {"Need"}
1     Need    1           1     a        2           {"Need","a"}
1     Need    1           1     baby     3           {"Need","a","baby"}
1     Need    1           1     sitter   4           {"Need","a","baby","sitter"}
1     Need    1           1     for      5           {"Need","a","baby","sitter","for"}
1     Need    1           1     casual   6           {"Need","a","baby","sitter","for","casual"}
1     Need    1           1     sitting  7           {"Need","a","baby","sitter","for","casual","sitting"}
1     a       2           1     a        2           {"a"}
1     a       2           1     baby     3           {"a","baby"}
1     a       2           1     sitter   4           {"a","baby","sitter"}
E: groups by phrase and counts the occurrences. The FILTER clause lets you restrict the count to certain cardinalities (here: phrases of at least two words).
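As a rough sketch of how the same idea could be applied to your job_words table, which already stores the word arrays: unnest ... WITH ORDINALITY can replace the string_to_array/row_number step. This is an untested adaptation; it assumes PostgreSQL 9.4+ and generates a surrogate id per title row:
WITH numbered AS (
    SELECT row_number() OVER () AS id, words          -- surrogate id per job title
    FROM job_words
),
phrases AS (
    SELECT n.id, u.word, u.nth_word                   -- one row per word with its position
    FROM numbered n,
         unnest(n.words) WITH ORDINALITY AS u(word, nth_word)
)
SELECT phrase_part, count(*) AS cnt
FROM (
    SELECT array_agg(b.word) OVER (PARTITION BY a.id, a.nth_word
                                   ORDER BY b.nth_word) AS phrase_part
    FROM phrases a
    JOIN phrases b ON a.id = b.id AND a.nth_word <= b.nth_word
) s
WHERE cardinality(phrase_part) BETWEEN 2 AND 10       -- only phrases of 2 to 10 words
GROUP BY phrase_part
ORDER BY cnt DESC;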

result table description

I want to write a query in SQL. Can someone help me write an Oracle SQL query for the result table below?
Table 1 Data
prodno  description
1       Laptop
2       Charger
3       Mouse
Table 2 Data
prodno  prodset_no
1       1
2       1
3       1
1       2
3       2
1       3
2       3
Result Table
prodset_no  prodset_desc
1           Laptop,Charger,Mouse
2           Laptop,Mouse
3           Laptop,Charger
JOIN the two tables and then use LISTAGG to produce comma-separated output:
select t2.prodset_no,
       listagg(t1.description, ',') within group (order by t1.prodno) prodset_desc
from table2 t2
join table1 t1 on t2.prodno = t1.prodno
group by t2.prodset_no;
Also worth noting: LISTAGG has a limit of 4000 bytes. If you hit that limit, you can either use XMLAGG or rethink the problem and not do it in SQL at all, but rather handle it in your application code.
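If you do hit the limit, the XMLAGG route could look roughly like this (a sketch, not a drop-in solution; it builds a CLOB, so the 4000-byte limit does not apply, but XML special characters in description would be escaped and may need extra handling):
select t2.prodset_no,
       rtrim(
         xmlagg(xmlelement(e, t1.description || ',') order by t1.prodno)
           .extract('//text()').getclobval(),
         ','
       ) as prodset_desc
from table2 t2
join table1 t1 on t2.prodno = t1.prodno
group by t2.prodset_no;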

SQL Query to find which group does not have a given value

I am using T-SQL.
Say I have the following:
Value Nbr
----- ---
one 6
one 7
one 8
two 6
two 7
three 5
three 3
three 2
In the above table, I need to find which group does not have 6 in it.
In this case, it is three as it does not have 6 in it.
What would be the best approach to do this?
I tried:
select Value from tbl1
where nbr <> 6
group by Value
but did not get the intended result.
select distinct value
from tbl1
where value not in
(
select distinct value
from tbl1
where nbr = 6
)
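The WHERE nbr <> 6 attempt doesn't work because a group such as one still has rows with 7 and 8, so it is returned anyway. As a sketch of an alternative that gives the same result with GROUP BY and HAVING:
select Value
from tbl1
group by Value
having sum(case when Nbr = 6 then 1 else 0 end) = 0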

Returning several rows from a single query, based on a value of a column

Let's say I have this table:
Fld  Number
1    5
2    2
And I want to make a SELECT that returns each Fld as many times as its Number value:
Fld
1
1
1
1
1
2
2
How can I achieve this? I was thinking about making a temporary table and inserting data based on the Number, but I was wondering if this could be done with a single SELECT statement.
PS: I'm new to SQL
You can join with a numbers table:
SELECT Fld
FROM yourtable
JOIN Numbers
ON Numbers.Number <= yourtable.Number
A numbers table is just a table with a list of numbers:
Number
1
2
3
etc...
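If you don't already have one, a minimal sketch for creating and filling such a Numbers table (multi-row VALUES as shown works in most RDBMSs; extend the range to at least the largest Number you expect):
CREATE TABLE Numbers (Number INT PRIMARY KEY);
INSERT INTO Numbers (Number)
VALUES (1), (2), (3), (4), (5), (6), (7), (8), (9), (10);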
Not a great solution (since it still queries your table twice), but maybe you can work from it:
SELECT t1.fld, t1.number
FROM yourtable t1, (
    SELECT ROWNUM num FROM dual
    CONNECT BY LEVEL <= (SELECT MAX(number) FROM yourtable)
) t2
WHERE t2.num <= t1.number
It generates the maximum number of rows needed and then filters them per row.
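For reference, the inline view works as a row generator because CONNECT BY LEVEL against dual produces consecutive numbers; a small illustrative query:
SELECT LEVEL FROM dual CONNECT BY LEVEL <= 5   -- returns 1, 2, 3, 4, 5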
I don't know if your RDBMS version supports it (although I rather suspect it does), but here is a recursive version:
WITH remaining (fld, times) AS (
    SELECT fld, 1
    FROM <table>
    UNION ALL
    SELECT a.fld, a.times + 1
    FROM remaining AS a
    JOIN <table> AS b
        ON b.fld = a.fld
        AND b.number > a.times
)
SELECT fld
FROM remaining
ORDER BY fld
Given your source data table, it outputs this (the times column is included for verification):
fld times
=============
1 1
1 2
1 3
1 4
1 5
2 1
2 2
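A side note, in case this runs on SQL Server (an assumption about your environment): recursive CTEs stop at 100 recursion levels by default, so if Number can exceed 100 you would need to append something like OPTION (MAXRECURSION 0) to the final SELECT; other RDBMSs have their own limits.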