Normalization homework - SQL

I was told to put this data into UNF/1NF/2NF/3NF. Is this correct?
Show the above data as a relation in UNF (unnormalised data).
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode, Qty, ProductNo, Description, UnitPrice, Total, Subtotal, Shipping, TaxRate, Date, OrderNo)
Show the data as a relation/s in 1NF. (Indicate any keys.)
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode)
Product (ProductNo, Qty, Description, UnitPrice, Total, Subtotal, Shipping, TaxRate, CustomerID (FK))
Order (OrderNo, Date, ProductNo (FK))
Show the data as a relation/s in 2NF. (Indicate any keys.)
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode)
Product (ProductNo, Qty, Description, UnitPrice, CustomerID (FK), Total (FK))
Order (OrderNo, Date, CustomerID (FK), ProductNo (FK))
Total (Total, Subtotal, Shipping, TaxRate, ProductNo (FK), CustomerID (FK))
Show the data as a relation/s in 3NF. (Indicate any keys.)
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode)
Product (ProductNo, Description, UnitPrice, CustomerID (FK), Total (FK))
Order (OrderNo, Date, CustomerID (FK), ProductNo (FK))
Total (Total, Subtotal, ProductNo (FK), CustomerID (FK))
Shipping (Shipping, TaxRate, Total (FK), OrderNo (FK))
Qty (QtyID, Qty, ProductNo (FK), OrderNo (FK))

It looks good to me, but you are missing one crucial piece of the design. You haven't defined any Primary Keys on your tables, although you have identified the foreign keys (use the foreign keys you have to work out the primary keys on each of the tables :)).
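For example, working backwards from the foreign keys you already listed, two of the tables might be declared like this (a minimal, partial sketch; the column types are my assumptions):

-- CustomerID is what the other tables' FKs point at, so it must be Customer's primary key
CREATE TABLE Customer (
    CustomerID INT PRIMARY KEY,
    FirstName  VARCHAR(50),
    LastName   VARCHAR(50),
    Address    VARCHAR(100),
    City       VARCHAR(50),
    Phone      VARCHAR(20),
    State      VARCHAR(20),
    Postcode   VARCHAR(10)
);

-- "Order" is a reserved word in SQL, so the table name has to be quoted;
-- the Date attribute is renamed OrderDate for the same reason
CREATE TABLE "Order" (
    OrderNo    INT PRIMARY KEY,
    OrderDate  DATE,
    CustomerID INT REFERENCES Customer (CustomerID)
);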

Show the above data as a relation in UNF (unnormalised data).
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode, Qty, ProductNo, Description, UnitPrice, Total, Subtotal, Shipping, TaxRate, Date, OrderNo)
No, that's not right. There doesn't seem to be any customer ID number on the invoice. Normalization doesn't involve introducing new attributes. As an unnormalized collection of attributes, labeling that list as "Customer" is premature.
Show the data as a relation/s in 1NF. (Indicate any keys.)
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode)
Product (ProductNo, Qty, Description, UnitPrice, Total, Subtotal, Shipping, TaxRate, CustomerID (FK))
Order (OrderNo, Date, ProductNo (FK))
Drop CustomerID. (See above.) I'm guessing that one of the candidate keys for the "Product" table is "ProductNo". If that's the case, why does that table include "CustomerID"?
Show the data as a relation/s in 2NF. (Indicate any keys.)
Customer (CustomerID, FirstName, LastName, Address, City, Phone, State, Postcode)
Product (ProductNo, Qty, Description, UnitPrice, CustomerID (FK), Total (FK))
Order (OrderNo, Date, CustomerID (FK), ProductNo (FK))
Total (Total, Subtotal, Shipping, TaxRate, ProductNo (FK), CustomerID (FK))
2NF has to do with removing partial key dependencies. What partial key dependency did you identify that justified creating the table "Total"? (Hint: there isn't any justification for that.) Do this thought experiment (or build it in SQL): If "Total" is the primary key for the table "Total", what will you do if two orders result in the same total?
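Here is a minimal sketch of that experiment in SQL (the column types are assumptions):

CREATE TABLE Total (
    Total    DECIMAL(10,2) PRIMARY KEY,  -- the design under test: Total as the key
    Subtotal DECIMAL(10,2),
    Shipping DECIMAL(10,2),
    TaxRate  DECIMAL(5,2)
);

INSERT INTO Total VALUES (59.95, 49.95, 10.00, 0.00);
-- a second order that happens to add up to the same amount cannot be stored:
INSERT INTO Total VALUES (59.95, 54.95, 5.00, 0.00);  -- fails with a primary key violation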
I'll stop there for now, because you've really gotten off on the wrong foot. You need to start with a list of all attributes, then identify the candidate keys and functional dependencies. Without starting there, you're unlikely to find 3NF.
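For instance (guessing at the invoice's semantics, since the invoice itself isn't reproduced here), the functional dependencies might look something like:

ProductNo -> Description, UnitPrice
OrderNo -> Date, FirstName, LastName, Address, City, State, Postcode, Phone
OrderNo, ProductNo -> Qty

From dependencies like these, (OrderNo, ProductNo) falls out as the candidate key for the line-item data, and the 3NF tables follow almost mechanically.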

An interesting thing about invoices: J Frompton orders a rake today, but some time in the future the price will change. However, that does not change the price Frompton paid today.
Once invoices are fulfilled, they really should be moved to a table that is 1NF.
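One common way to handle that is to copy the price onto the order line at the time of sale, so later price changes in the product table don't rewrite history. A minimal sketch, with hypothetical names:

CREATE TABLE OrderLine (
    OrderNo   INT,
    ProductNo INT,
    Qty       INT,
    UnitPrice DECIMAL(10,2),  -- the price actually paid, copied from Product at order time
    PRIMARY KEY (OrderNo, ProductNo)
);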

Related

How to convert to BCNF

I am confused about how to convert to BCNF. I believe I have all of my tables in 2NF:
Library (ID, location_name, location_address, phone)
Patron (ID, first_name, last_name, email, phone, joindate)
Book (ID, genre_id, name, author_id, ISBN, edition, copynr)
Author (ID, first_name, last_name)
Card (cardID, patronId, issuedate)
Checkouts (ID, cardID, bookID, patronID, libraryID, date)
Checkins (ID, cardID, bookID, patronID, libraryID, date)
I'm thinking I can eliminate the joindate and issuedate.

Merge array with previous value (under defined conditions)

Here's the initial table's structure:
yearquarter,user_id,gender,generation,country,group_id
2019-03,zfuzhfuzh,M,Y,FR,Group_1
2019-04,zfuzhfuzh,M,Y,FR,Group_1
2020-04,zfuzhfuzh,M,Y,FR,Group_1
2019-03,ggezegz,F,Y,FR,Group_2
2019-04,ggezegz,F,Y,FR,Group_2
2020-04,ggezegz,F,X,FR,Group_2
....
I want to be able to know the cumulative number of user_id values quarter after quarter, grouped by gender, generation and country. Expected result: for a given combination of gender, generation and country, I need the cumulative number of users quarter after quarter.
I started with this :
SELECT yearquarter, gender, generation, country, ARRAY_AGG(DISTINCT user_id IGNORE NULLS) AS users
FROM mytable
WHERE group_id = "mygroup"
GROUP BY 1, 2, 3, 4
But I don't know how to go from this to the result I'm looking for...
You can use aggregation to count the number of users per gender, generation, country and period, and then compute a window sum over the periods:
select
  gender,
  generation,
  country,
  yearquarter,
  sum(count(distinct user_id)) over(partition by gender, generation, country order by yearquarter) cnt
from mytable
where group_id = 'mygroup'
group by gender, generation, country, yearquarter
order by gender, generation, country, yearquarter
I am unsure whether BigQuery supports distinct in window functions. If it doesn't, then we can use a subquery:
select
  gender,
  generation,
  country,
  yearquarter,
  sum(count(*)) over(partition by gender, generation, country order by yearquarter) cnt
from (
  select distinct gender, generation, country, yearquarter, user_id
  from mytable
  where group_id = 'mygroup'
) t
group by gender, generation, country, yearquarter
order by gender, generation, country, yearquarter
If you want each user to be counted only once, for their first appearance period:
select
  gender,
  generation,
  country,
  yearquarter,
  sum(count(*)) over(partition by gender, generation, country order by yearquarter) cnt
from (
  select gender, generation, country, user_id, min(yearquarter) yearquarter
  from mytable
  where group_id = 'mygroup'
  group by gender, generation, country, user_id
) t
group by gender, generation, country, yearquarter
Below is for BigQuery Standard SQL, built purely on top of your initial query, with ARRAY_AGG replaced by STRING_AGG:
#standardSQL
SELECT yearquarter, gender, generation, country,
  (SELECT COUNT(DISTINCT id) FROM UNNEST(SPLIT(cumulative_users)) AS id) AS cumulative_number_of_users
FROM (
  SELECT *,
    STRING_AGG(users) OVER(PARTITION BY gender, generation, country ORDER BY yearquarter) AS cumulative_users
  FROM (
    SELECT
      yearquarter, gender, generation, country,
      STRING_AGG(DISTINCT user_id) AS users
    FROM `project.dataset.table`
    WHERE group_id = "mygroup"
    GROUP BY yearquarter, gender, generation, country
  )
)
-- ORDER BY yearquarter, gender, generation, country
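To see how the variants differ, take the Group_1 rows from the sample data: user zfuzhfuzh appears in 2019-03, 2019-04 and 2020-04 for (M, Y, FR). The first query returns a running total of 1, 2, 3 (the same user is counted again in every quarter they appear in), the STRING_AGG version returns 1, 1, 1 (a true cumulative distinct count), and the first-appearance variant returns a single row, 2019-03 with cnt 1.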

Which SQLite query should I use in this case?

I have two tables (and their columns) in my DB:
CUSTOMERS(ID, FIRSTNAME, LASTNAME, ADDRESS);
ORDERS (ID, PRODUCT_NAME, PRODUCT_PRICE, DATE_ORDER DATE, ID_CUSTOMER, AMOUNT);
Here is exactly what I should do:
List the first and last names of the customers along with the count
of their orders.
List the first and last names of the customers and calculate the total sum of their orders.
Please make a series of SELECTs and sort each one by FIRSTNAME and LASTNAME.
You could join the customers table with an aggregate query on the orders table:
SELECT firstname, lastname, num_orders, sum_orders
FROM customers
JOIN (SELECT id_customer, COUNT(*) AS num_orders, SUM(amount) AS sum_orders
      FROM orders
      GROUP BY id_customer) AS o ON o.id_customer = customers.id
ORDER BY 1, 2
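Note that an inner join drops customers who have no orders at all; if those should show up with zero counts, a LEFT JOIN variant works (a sketch along the same lines):

SELECT firstname, lastname,
       COALESCE(num_orders, 0) AS num_orders,
       COALESCE(sum_orders, 0) AS sum_orders
FROM customers
LEFT JOIN (SELECT id_customer, COUNT(*) AS num_orders, SUM(amount) AS sum_orders
           FROM orders
           GROUP BY id_customer) AS o ON o.id_customer = customers.id
ORDER BY 1, 2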

How to create an insert trigger to add a new row into another table in Sybase?

I have three tables: customer, customer-log and address.
I want to add a new row consisting of a foreign key (the primary key of the customer table) and all columns from the address table into the customer-log table.
Can somebody correct this trigger?
customer columns:
customerId, addressId, name, auto, type, date
customer-log columns:
customerId, addressId, street, number, city, country
address columns:
addressId, street, number, city, country
Trigger:
create trigger insertCustomerLog
on customer
for insert
if INSERTED.type = 'B'
begin
INSERT INTO customer-log
SELECT INSERTED.customerId,(SELECT * FROM address AS a WHERE INSERTED.addressId = a.addressId)
end
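For reference, a corrected version might look like the sketch below. It assumes Sybase ASE: the trigger body needs the as keyword, inserted is a multi-row pseudo-table that must be selected from (so the type check moves into the WHERE clause), and the hyphenated name customer-log has to be double-quoted, which requires set quoted_identifier on:

create trigger insertCustomerLog
on customer
for insert
as
begin
    insert into "customer-log" (customerId, addressId, street, number, city, country)
    select i.customerId, a.addressId, a.street, a.number, a.city, a.country
    from inserted i
    join address a on a.addressId = i.addressId
    where i.type = 'B'  -- only log customers of type 'B', as in the original IF
end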

Inserting multiple rows using SQL - issue with manually incrementing numbers

Someone else designed this table and I am not allowed to modify it, so bear with me.
I am trying to insert multiple rows from one table into another. The table where I am inserting the rows has an ID but it does not auto-increment. I cannot figure out how to manually increment the id as I insert rows. The current code throws an error:
Error running query. Page15.CaseSerial is invalid in the select list
because it is not contained in either an aggregate function or the
GROUP BY clause.
I've tried adding a GROUP BY clause with no success.
Here's the code:
insert into page4 (serial, caseserial, linkserial, type, add1, add2, city, state, orgname, prefername, email, firstname, lastname, salutation, contactstatus, workphone, notes, cellphone, nametype, homephone, fax, zip, payments)
select id = max(serial), caseserial, linkserial, type, add1, add2, city, state,
orgname, prefername, email, firstname, lastname, salutation, contactstatus,
workphone, notes, cellphone, nametype, homephone, fax, zip, payments
from page16
It would be nice if I could write something to get the highest id from page4 and insert the next highest.
Thanks!
declare @maxId int
select @maxId = max(yourIdColumn)
from YourTable
set @maxId = @maxId + 1
insert into YourTable (yourIdColumn, ....)
values (@maxId, ....)
Disclaimer: not sure how this would transpose to other RDBMSs, but this is with SQL Server in mind. Also, this handles inserting only one value. If you need to insert a set of values, then please let me know.
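For the multi-row case in the question, one option (a sketch, not tested against the real schema, and race-prone if other sessions insert concurrently) is to offset ROW_NUMBER() by the current maximum:

insert into page4 (serial, caseserial, linkserial, type, add1, add2, city, state,
                   orgname, prefername, email, firstname, lastname, salutation, contactstatus,
                   workphone, notes, cellphone, nametype, homephone, fax, zip, payments)
select m.maxSerial + row_number() over (order by p.caseserial),
       p.caseserial, p.linkserial, p.type, p.add1, p.add2, p.city, p.state,
       p.orgname, p.prefername, p.email, p.firstname, p.lastname, p.salutation, p.contactstatus,
       p.workphone, p.notes, p.cellphone, p.nametype, p.homephone, p.fax, p.zip, p.payments
from page16 p
cross join (select coalesce(max(serial), 0) as maxSerial from page4) m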