How to add lines from text file to sqlite db rows that already exist? - sql

I have 12 columns with +/- 2000 rows in an SQLite DB.
Now I want to add a 13th column with the same number of rows.
If I import the text from a CSV file, it gets appended after the existing rows (so I end up with a 4000-row table).
How can I avoid adding it underneath these rows?
Do I need to create a script that runs through each row of the table and adds the text from the CSV file to each row?

If you have the code that imported the original data, and if the data has not changed in the meantime, you could just drop the table and reimport it.
Otherwise, you indeed have to create a script that looks up the corresponding record in the table and updates it.
You could also import the new data into a temporary table, and then copy the values over with a command like this:
UPDATE MyTable
SET NewColumn = (SELECT NewColumn
                 FROM TempTable
                 WHERE ID = MyTable.ID);
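For completeness, a minimal sketch of the whole round trip in the sqlite3 shell, assuming the CSV carries an ID column that matches the table's key (the file name and column names here are illustrative, not from the question):
.mode csv
CREATE TABLE TempTable (ID INTEGER, NewColumn TEXT);   -- staging table for the CSV
.import new_column.csv TempTable                        -- load the CSV into the staging table
ALTER TABLE MyTable ADD COLUMN NewColumn TEXT;          -- the new 13th column
UPDATE MyTable                                          -- copy the values across by key
SET NewColumn = (SELECT NewColumn
                 FROM TempTable
                 WHERE TempTable.ID = MyTable.ID);
DROP TABLE TempTable;                                   -- clean up the staging table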

I ended up using Razor SQL, a great program.
http://www.razorsql.com/

Related

How to import daily csv data into table with generated columns postgres

I'm new to PostgreSQL and am looking for some guidance and best practices.
I have created a table by importing data from a csv file. I then altered the table by creating multiple generated columns like this:
ALTER TABLE master
ADD office VARCHAR(50)
GENERATED ALWAYS AS (
  CASE
    WHEN LEFT(location, 4) = 'Chic' THEN 'CHI'
    ELSE LEFT(location, strpos(location, '_') - 1)
  END
) STORED;
But when I try to import new data into the table I get the following error:
ERROR: column "office" is a generated column
DETAIL: Generated columns cannot be used in COPY.
My goal is to be able to import new data each day to the table and have the generated columns automatically populate in order to transform the data as I would like. How can I do so?
CREATE TEMP TABLE master (location VARCHAR);
ALTER TABLE master
ADD office VARCHAR
GENERATED ALWAYS AS (
  CASE
    WHEN LEFT(location, 4) = 'Chic' THEN 'CHI'
    ELSE LEFT(location, strpos(location, '_') - 1)
  END
) STORED;
--INSERT INTO master (location) VALUES ('Chicago');
--INSERT INTO master (location) VALUES ('New_York');
COPY master (location) FROM $$d:\cities.csv$$ CSV;
SELECT * FROM master;
Is this the structure and the behaviour you are expecting? If not, please provide more details regarding your table structure, your importable data and your importing commands.
Also, maybe when you try to import the csv file, the columns are not linked properly, or maybe the delimiter is not set correctly. Try to specify each column in the exact order in which it appears in your csv file.
https://www.postgresql.org/docs/12/sql-copy.html
Note: d:\cities.csv contains:
Chicago
New_York
EDIT:
If column positions are mixed up between the table and the csv, the following operations may come in handy:
1. create temporary table tmp (csv_column1 <data_type>, csv_column_2 <data_type>, ...); (including ALL csv columns)
2. copy tmp from '/path/to/file.csv';
3. insert into master (location, other_info, ...) select csv_column_3 as location, csv_column_7 as other_info, ... from tmp;
Importing data using an intermediate table may slow things down a little, but gives you a lot of flexibility.
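Spelled out, a minimal sketch of that staging pattern (every name and the file path below are placeholders; the generated office column is deliberately left out of the INSERT so it populates itself):
CREATE TEMP TABLE tmp (
  csv_column_1 TEXT,
  csv_column_2 TEXT,
  csv_column_3 TEXT   -- the column that will become master.location
);
COPY tmp FROM '/path/to/file.csv' CSV;
INSERT INTO master (location)   -- list only the non-generated columns
SELECT csv_column_3
FROM tmp;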
I was getting the same error when importing to PG from a CSV. I found that even though my column was generated, I still had to have it in the imported data; I just left it empty. It worked fine once the column name was in the file and mapped to my DB column name.

Append data on one row to another row in ms sql

Can you help me with this one? I'm trying to pull data from the database of a CAD software, and I wish to make a temporary table from the table given below (the output temptable is also shown below) so that I can join it to my already created table1. I'm new to SQL, and it seems that a temporary table could work, but I don't know how to append the data from the other rows onto the first row so that the behavior is similar to a SUM() function but working with text. Since I cannot post pictures yet, bear with me on the formatting of the original table and the temptable I wish to make. Thanks in advance.
original table
Oid     Cable Tray
0010f   mv001
0010f   mv002
0010f   mv003
020ab   lv001
020ab   lv002
output temptable
Oid     Cable Tray Route
0010f   mv001, mv002, mv003
020ab   lv001, lv002
This is my sample code:
select *
from table1
join temptable on temptable.oid=table1.oid
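The collapsing described above is typically done with string aggregation. A minimal sketch, assuming SQL Server 2017 or later so STRING_AGG is available (the source table name original_table is an assumption; older versions would need the FOR XML PATH trick instead):
-- Collapse every Cable Tray value per Oid into one comma-separated string.
SELECT Oid,
       STRING_AGG([Cable Tray], ', ') WITHIN GROUP (ORDER BY [Cable Tray]) AS [Cable Tray Route]
INTO #temptable
FROM original_table
GROUP BY Oid;
-- Then join the aggregated rows back onto table1, as in the sample code above.
SELECT *
FROM table1
JOIN #temptable ON #temptable.Oid = table1.Oid;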

update data in existing table

I want to import data regularly into a table on SQL Server. Some existing entries in the table will need to be updated with future imports, e.g. the date of 'file closure' and the 'outcome'.
I'm hoping someone could point me in the direction of SQL that would append new data to the existing table, as well as update any existing entries that have changed.
Thanks,
Shaun
You append data with, e.g.:
INSERT INTO table_name (column1,column2,column3,...)
VALUES (value1,value2,value3,...);
and you update with
UPDATE table_name
SET column1=value1,column2=value2,...
WHERE some_column=some_value;
If you always do an UPDATE first, you can tell from its outcome (the number of affected rows) whether the row exists; if it does not, you then do an INSERT.
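A minimal T-SQL sketch of that update-then-insert pattern (the table, key and column names are placeholders, not from the question):
UPDATE dbo.cases
SET file_closure = @file_closure,
    outcome      = @outcome
WHERE case_id = @case_id;
-- @@ROWCOUNT is 0 when the UPDATE matched nothing, i.e. the row does not exist yet.
IF @@ROWCOUNT = 0
    INSERT INTO dbo.cases (case_id, file_closure, outcome)
    VALUES (@case_id, @file_closure, @outcome);
On SQL Server the same logic can also be written as a single MERGE statement.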

Dynamically export SQL table to CSV file every second using SSIS

I am new to SQL and SSIS work. I'd like to dynamically export a SQL table to a CSV file every second using SSIS. One column, RECIPE STATUS, has a value of 1 or 2 (int): 1 means a new recipe, 2 means an old recipe that already exists. Another column, RECIPE NAME, has a value such as AAABBB (varchar), and there are some more columns with values. The database table only ever holds one row of data, which changes every second, so we are trying to export it to a CSV file to log the different/unique RECIPE NAMEs and their data for analysis.
Table Schema is
SELECT TOP 5 [RowID]
      ,[RowInsertTime]
      ,[TransId]
      ,[RecipeStatus]
      ,[RecipeName]
      ,[RecipeTagName]
      ,[Value]
      ,[ReadStatus]
      ,[sData1]
      ,[sData2]
      ,[sData3]
      ,[nData1]
      ,[nData2]
      ,[nData3]
FROM [MES].[dbo].[MESWrite_RecipeData]
While exporting, if the RECIPE STATUS value is 1, insert/create a new row in the CSV and export it (conceptually a simple INSERT statement: insert into the CSV file the values from the database table).
If the value is 2, a row with that RECIPE NAME (e.g. AAABBB) already exists in the CSV, so find it and update the other columns' values instead of creating a new row (conceptually: update all other columns with the values from the database table where RECIPE NAME = 'AAABBB', like an UPDATE query in T-SQL).
Finally, if we have to send this SSIS package to a customer, can it be an executable file, or how can it be secured? Please help me.
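One way to sketch the insert-or-update half of that logic is to maintain a staging table in SQL Server and let the SSIS package export that staging table to the CSV on each cycle. Everything below is a hypothetical illustration (the dbo.RecipeLog staging table is not part of the question, and only a few of the real columns are shown):
-- Update rows whose RecipeName has already been logged (the status-2 case).
UPDATE dst
SET    dst.RecipeStatus = src.RecipeStatus,
       dst.Value        = src.Value,
       dst.ReadStatus   = src.ReadStatus
FROM   dbo.RecipeLog AS dst
JOIN   [MES].[dbo].[MESWrite_RecipeData] AS src
       ON src.RecipeName = dst.RecipeName;
-- Insert rows whose RecipeName has not been logged yet (the status-1 case).
INSERT INTO dbo.RecipeLog (RecipeName, RecipeStatus, Value, ReadStatus)
SELECT src.RecipeName, src.RecipeStatus, src.Value, src.ReadStatus
FROM   [MES].[dbo].[MESWrite_RecipeData] AS src
WHERE  NOT EXISTS (SELECT 1 FROM dbo.RecipeLog AS l
                   WHERE l.RecipeName = src.RecipeName);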

sql dump of data based on selection criteria

When extracting data from a table (schema and data), I can do this by right-clicking on the database and going to Tasks -> Generate Scripts, which gives me all the data from the table, including the CREATE script, which is good.
This, though, gives me all the data from the table. Can this be changed to give me only some of the data, e.g. only rows after a certain dtmTimeStamp?
Thanks,
I would recommend extracting your data into a separate table using a query and then using Generate Scripts on this table. Alternatively, you can extract the data separately into a flat file using the Export Data wizard (include your column headers and use comma separators with double-quote field delimiters).
To make a copy of your table:
SELECT Col1, Col2
INTO CloneTable
FROM MyTable
WHERE Col3 = @Condition
(Thanks to @MarkD for adding that)
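Applied back to the question, a minimal sketch of the timestamp filter (the cutoff value is a placeholder):
-- Copy only rows newer than a given point in time, then run Generate Scripts on CloneTable.
SELECT *
INTO CloneTable
FROM MyTable
WHERE dtmTimeStamp > '2020-01-01';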