How do I import a CSV file into SQL*Plus?

I have created a database and have my tables, but I need to load some data into them. To avoid writing out 5,000+ entries by hand, I have decided to download dummy (fake) data for names, numbers, addresses, etc.
I have tried using BULK INSERT, but SQL*Plus does not accept it. Is there any other way of loading the fake data?

SQL*Plus can only interpret commands entered at the console or read from a script. You have to create a script containing a set of INSERT commands and run it from SQL*Plus. I have used this approach many times; it's not that difficult.
Write your script like this:
declare
  procedure change_data(your_data_in_parameters) is
  begin
    set_of_your_commands_to_handle_one_record;
  end;
begin
  change_data(your_data_from_CSV_row1);
  change_data(your_data_from_CSV_row2);
  ...
  commit;
end;
/
Open your CSV in a spreadsheet editor and add a new formula column that builds a change_data call from each row.
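For instance, a filled-in version of that script for a hypothetical people(name, phone) table might look like this (the table, columns, and values are assumptions for illustration):

declare
  -- one procedure call handles one CSV record
  procedure change_data(p_name varchar2, p_phone varchar2) is
  begin
    insert into people (name, phone) values (p_name, p_phone);
  end;
begin
  change_data('Alice Example', '555-0100');
  change_data('Bob Example', '555-0101');
  commit;
end;
/

In the spreadsheet, a formula such as ="change_data('"&A2&"', '"&B2&"');" filled down the sheet then produces one change_data call per CSV row.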

Related

How to load csv file to multiple tables in postgres (mainly concerned about best practice)

I'm new to DB/postgres SQL.
Scenario:
I need to load a CSV file into a Postgres DB. The CSV data needs to be loaded into multiple tables according to the DB schema. I'm looking for a good design using a Python script.
My thought:
1. Load the CSV file into an intermediate (staging) table in Postgres.
2. Write a trigger on the intermediate table that inserts the data into the multiple target tables on each insert event.
3. Have the trigger truncate the staging data at the end.
Any suggestions for a better design or other approaches without ETL tools? Also, any info on useful Python 3 modules would be appreciated.
Thanks.
Rather than using a trigger, use an explicit INSERT or UPDATE statement after the load. That is probably faster, since it runs once over the whole set instead of being invoked per row.
Apart from that, your procedure is fine.
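A minimal sketch of that approach, assuming a staging table and two hypothetical target tables (all object names here are illustrative, not from the question):

CREATE TABLE staging (
    name  text,
    email text,
    city  text
);

-- Load the CSV into the staging table. COPY reads a server-side path;
-- from psql or a Python client you would use \copy or COPY ... FROM STDIN.
COPY staging FROM '/tmp/data.csv' WITH (FORMAT csv, HEADER true);

-- Set-based statements run once over the whole staging table,
-- instead of a trigger firing for every inserted row.
INSERT INTO cities (city)
SELECT DISTINCT city FROM staging;

INSERT INTO persons (name, email)
SELECT name, email FROM staging;

TRUNCATE staging;

From Python 3, psycopg2's copy_expert() can drive the COPY step.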

Export and insert CLOB data containing SQL statements with SQL Developer

I have a table with a CLOB column containing SQL code. Now I want to transfer the content from the development database into the production one. I use the following script to export the content of the dev table:
set long 100000
set lines 1000
spool d:\export.sql
select /*insert*/* from myTable;
spool off
However, the import into the prod table is not working due to ' characters in the SQL code. A generated insert statement looks like this:
insert into myTable (id, name, sql)
values (1, 'John', 'select * /* this is a hint */
from table1
where attr1 = 'hi,you' and attr2 = 'me, too')
How can I insert this CLOB, or how do I export it in a better way?
I'd use Data Pump if it's available.
If not, I'd use SQL*Loader.
What you can do is use SQL Developer to unload your table to a SQL*Loader setup; each CLOB will be written to its own file, and they can then be loaded without the quoting issues you're seeing.
I have written up how to do this with BLOBs, but it would be the same process.
The output will be all the files you need to move your table over to the new system, the control file, the data stream, and all the LOBS.
Once you have your files, you will need to make sure you have an Oracle Client installed, or have the full Instant Client.
This will give you access to SQL*Loader.
It's a command-line utility, no GUI; it works much like SQL*Plus does. You'll want to make sure your Oracle environment is set up so you can start it and connect.
Everything you need is in the ZIP that SQLDev put together for you; the biggest piece is the .ctl (control file).
Docs
sqlldr scott CONTROL=ulcase1.ctl LOG=ulcase1.log
'scott' is the database username; it'll prompt you for a password. You'll substitute ulcase1.ctl with the .ctl file you got from SQLDev. The LOG part is optional, but IMPORTANT.
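The heart of that ZIP is the control file. A rough sketch of what it looks like for the table above, with each CLOB in its own secondary file (the data file and field names here are assumptions):

LOAD DATA
INFILE 'mytable.dat'
INTO TABLE myTable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  id,
  name,
  sql_fname FILLER,                            -- holds the per-row CLOB file name
  "SQL" LOBFILE(sql_fname) TERMINATED BY EOF   -- loads the whole file into the CLOB
)

Because each CLOB arrives TERMINATED BY EOF from its own file, the embedded ' characters never need escaping.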
By the way, this should run FAST.
If you're running this on your PC, your connect string will be more like
sqlldr hr@server:port/service

How to retrieve the insert statements from an SQL Database that I imported from an Excel file

I recently imported an MS Excel spreadsheet into a database, but I want to get all the INSERT statements that the import generated.
Is it possible? All I can see is the resulting database in my Object Explorer, but I want all the INSERT statements for the imported data.
OK, let me try to explain again :) I have thousands of rows of data in an Excel spreadsheet; how do I convert those rows of data into a database table? I'm asking like this now because I think I just messed up the previous attempt :(
Export the table in SQL format.
The output .sql file is in the form of CREATE TABLE statements followed by one INSERT statement per entry. So open that .sql file and copy all the INSERT statements.
Hope this was your requirement.
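For example, the exported .sql file will contain something along these lines (the table and rows shown are purely illustrative):

CREATE TABLE Customers (
  id INT,
  name VARCHAR(100)
);
INSERT INTO Customers (id, name) VALUES (1, 'Alice');
INSERT INTO Customers (id, name) VALUES (2, 'Bob');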

How to create a dynamic table structure in SQL from multiple delimited text files?

I have about 100,000+ delimited text files (they don't have the same number of columns in each file; e.g., some files have 10 columns, some have 20, and so on). I need to upload all of them to SQL Server. Please suggest how I can do this.
I also have an Excel spreadsheet listing the names/paths where the files are stored and the number of columns in each text file. I am clueless about how to go about it.
Thanks in advance.
I assume you are able to use C# (or another programming language) to create an app which will help you complete the task. The program should do the following:
Run through all the files and determine all the columns you need.
Create a table on SQL Server with the columns the program found, setting datatype varchar(max) for each column (see the sketch after this list).
Run through all the files again and insert the data into the table. There are two ways you can go:
a) Insert data row by row. Pretty slow, but simple.
b) If you use C#, you can implement your own DataReader and use the SqlBulkCopy class to bulk insert the data into the table.
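As a sketch of step 2, if the widest file had, say, five columns, the generated table would look like this (table and column names are assumptions):

CREATE TABLE dbo.ImportedText (
    Col1 VARCHAR(MAX),
    Col2 VARCHAR(MAX),
    Col3 VARCHAR(MAX),
    Col4 VARCHAR(MAX),
    Col5 VARCHAR(MAX)
);

Files with fewer columns simply leave the trailing columns NULL.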
Here are some links which may help you:
http://www.sqlteam.com/article/use-sqlbulkcopy-to-quickly-load-data-from-your-client-to-sql-server
http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/
http://daniel.wertheim.se/2010/11/10/c-custom-datareader-for-sqlbulkcopy/

What is the easiest way to add a bunch of content to a SQL database?

Nothing technical here. Suppose I have a lot of different categorized data, and I would like to create a database out of it. Would someone literally hand plug in all that info with SQL code itself? Or do some people make a mock website just to input data? What are some of your strategies?
If there is no way to do it automatically, then a mock website would be the way to go: you can even use it with several people at once, actually multiplying the input speed (as long as you remember to assign each of them a different part of the data).
What format is your data in? And how much of it is there? If it's Excel, then SQL Server has tools to import it. I'm not sure if MySQL has anything similar. Even if it doesn't, one other technique I have used with Excel data is to use a formula to concatenate cell values as required to generate the INSERT statements. Then just paste those into a query window and run them.
I wouldn't do a website for it unless I was building an admin site for it already and wanted to test that with the initial load.
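For example, with names in column A and ages in column B of the sheet, a formula like this (the column positions and table name are assumed) can be filled down to produce one INSERT per row:

="INSERT INTO people (name, age) VALUES ('" & A2 & "', " & B2 & ");"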
Most databases have a way to do bulk inserts or have tools for data import.
My strategies normally involve such tools.
Here is an example of importing a CSV file to SQL Server.
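For instance, SQL Server's BULK INSERT can load a CSV directly (the table name and file path here are assumptions):

BULK INSERT dbo.People
FROM 'C:\data\people.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);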
Most database servers provide a way to import data from a variety of formats, you could look into that first.
If not, you could write a simple script or console application to parse your input data, and write out a SQL script to insert the data into appropriate tables.
For example, if your data was in a CSV file, you would parse each line in the file and generate an INSERT statement to write out to a .sql file.
MyData.csv
1,2,3,'Test',4
2,3,4,'Test2',6
GeneratedInsert.sql
insert into table (col1,col2,col3,col4,col5) values (1,2,3,'Test',4)
insert into table (col1,col2,col3,col4,col5) values (2,3,4,'Test2',6)
MySQL has a statement LOAD DATA INFILE that is intended for loading bulk data from flat files. It's easy to use and much faster than alternative methods.
But first you do have to use SQL to design tables with fields that match the fields of your import data. That is, if you have a file with semicolon-delimited data:
Titanic;1997;4 stars
Batman Begins;2005;5 stars
"Harry Potter and the Sorcerer's Stone";2001;3 stars
...
You would create a table:
CREATE TABLE Movies (
  title VARCHAR(100) NOT NULL,
  year YEAR NOT NULL,
  rating VARCHAR(10)
);
Then load data:
LOAD DATA INFILE 'movies.txt' INTO TABLE Movies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"';
Most web languages have some sort of auto-scaffolding that you can set up quickly. It's useful for admin work as well, if your site is hosted without direct access to the DB.
Otherwise, yeah, write the SQL statements. It's useful to be able to bring a database up as part of your build process.