I want to move the data from a spreadsheet into a database. The program I am using is called SQL Workbench/J. I am kind of lost and don't really know where to start. Are there any tips that might point me in the right direction?
SQL Workbench/J provides the WbImport command to load a text file into a DB table. So if you save your spreadsheet file in CSV (comma-separated values) format, you can then load it into a table using this command.
Here is an example that loads the text file CLASSIFICATION_CODE.csv, which has , as the field delimiter and ^ as the quoting character, into the CLASSIFICATION_CODE DB table.
WbImport -type=text
-file='C:\dev\CLASSIFICATION_CODE.csv'
-delimiter=,
-table=CLASSIFICATION_CODE
-quoteChar=^
-badfile='C:\dev\rejected'
-continueOnError=true
-multiLine=true
-emptyStringIsNull=false;
You might not need all the parameters of the example. Refer to the documentation to find the ones you need.
If the data in your spreadsheet is heterogeneous (e.g. your spreadsheet holds two distinct data sets), split it into two files so you can load them into separate DB tables.
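For instance, a sketch of such a split as two WbImport calls (the file and table names here are made up):
WbImport -type=text -file='C:\dev\BOOKS.csv' -delimiter=, -table=BOOKS;
WbImport -type=text -file='C:\dev\AUTHORS.csv' -delimiter=, -table=AUTHORS;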
I'm trying to find a specific string in my database. I'm currently using FlameRobin to open the FDB file, but this software doesn't seem to have a proper feature for this task.
I tried the following SQL query, but it didn't work:
SELECT *
FROM *
WHERE * LIKE '126278'
What is the best solution for this? Thanks in advance.
You can't do such a thing directly. But you can convert your FDB file to a text format like CSV, so you can search for your string in all the tables/files at the same time.
1. Download a database converter
First, you need software to convert your database file. I recommend Full Convert: just download the free trial. It is really easy to use, and it will export each table to a different CSV file.
2. Find your string in multiple files at the same time
For that task you can use the Find in Files feature of Notepad++ to search for the string in all the CSV files located in the same folder.
3. Open the desired table on FlameRobin
When Notepad++ highlights the string, it shows which file it is in and the line number. Full Convert saves each CSV with the same name as the original table, so you can find it easily in whatever database manager you are using.
Here is the Firebird documentation: https://www.firebirdsql.org/file/documentation/reference_manuals/fblangref25-en/html/fblangref25.html
You need to read about:
stored procedures of the "selectable" kind,
the EXECUTE STATEMENT command, including the FOR EXECUTE STATEMENT variant,
and the system tables having "RELATION" in their names.
Then in your SP you enumerate all the tables, then you enumerate all the columns in those tables, and then for each of them you run a plain
select 'tablename', 'columnname', columnname
from tablename
where columnname containing '12345'
over every field of every table, as sketched below.
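Putting those pieces together, a minimal sketch of such a selectable procedure could look like this (Firebird 2.5 syntax; the procedure and variable names are mine, and columns whose type cannot be searched are silently skipped):
SET TERM ^ ;
CREATE PROCEDURE SEARCH_ALL_FIELDS (SEARCH_FOR VARCHAR(100))
RETURNS (TABLE_NAME VARCHAR(31), FIELD_NAME VARCHAR(31))
AS
DECLARE VARIABLE CNT INTEGER;
BEGIN
  -- enumerate every column of every user table via the system tables
  FOR SELECT TRIM(rf.RDB$RELATION_NAME), TRIM(rf.RDB$FIELD_NAME)
      FROM RDB$RELATION_FIELDS rf
      JOIN RDB$RELATIONS r ON r.RDB$RELATION_NAME = rf.RDB$RELATION_NAME
      WHERE COALESCE(r.RDB$SYSTEM_FLAG, 0) = 0 AND r.RDB$VIEW_BLR IS NULL
      INTO :TABLE_NAME, :FIELD_NAME
  DO
  BEGIN
    CNT = 0;
    -- probe the current column with a dynamically built query
    EXECUTE STATEMENT 'select count(*) from ' || TABLE_NAME ||
        ' where ' || FIELD_NAME || ' containing ''' || SEARCH_FOR || ''''
        INTO :CNT;
    IF (CNT > 0) THEN
      SUSPEND; -- emit one result row per matching table/column
    WHEN ANY DO CNT = 0; -- ignore columns that cannot be searched this way
  END
END^
SET TERM ; ^
You would then call it as SELECT * FROM SEARCH_ALL_FIELDS('126278'); and it lists every table/column pair containing the string.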
But practically speaking, it would most probably be better to avoid SQL commands and just extract the WHOLE database into one long SQL script, then open that script in Notepad (or any other text editor) and search there for the string you need.
I have to export a table from a SQL Server; the table contains a column with large text content, the length of which goes up to 100,000 characters.
When I use Excel as the export destination, I find that the text is capped and truncated at 32,765 characters.
Is there an export format that preserves the length?
Note:
I will eventually be importing this data into another SQL Server
The destination SQL Server is in another network, so linked servers and other local options are not feasible
I don't have access to the actual server, so generating a backup is difficult
As documented in the Excel specifications and limits, the maximum number of characters that can be stored in a single Excel cell is 32,767; hence your data is being truncated.
You might be better off exporting to a CSV; note that CSV files with quoted fields aren't supported within bcp/BULK INSERT until SQL Server 2017. You can use a character sequence like || to denote a field delimiter, but if you have any line breaks you'll need to choose a different row delimiter too. SSIS and other ETL tools, however, do support CSV files with quoted fields, so you can use something like that.
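For instance, a minimal sketch of loading such a quoted CSV on SQL Server 2017 or later (the file path and table name are illustrative):
BULK INSERT dbo.MyTable
FROM 'C:\export\mytable.csv'
WITH (
    FORMAT = 'CSV',         -- honours quoted fields
    FIELDQUOTE = '"',
    FIRSTROW = 2,           -- skip the header row
    ROWTERMINATOR = '0x0a'
);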
Otherwise, if you need to export such long values and want to use Excel as much as you can (which I personally don't recommend, due to those awful ACE drivers), I would suggest exporting the (n)varchar(MAX) values to something else, like a text file, naming each file with the value of your primary key. Then, when you import the data back, you can retrieve the (n)varchar(MAX) value from each individual file again.
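One possible way to do that per-row export is bcp's queryout mode (the server, database, table, column, and key value below are all illustrative); you would run one such command per primary key value, and use -w instead of -c for nvarchar data:
bcp "SELECT LongText FROM MyDb.dbo.MyTable WHERE Id = 42" queryout C:\export\42.txt -c -S MyServer -T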
A .sql script is the best format for a SQL table. It is the native format for a SQL table, so you don't have to convert the export.
First, a grumble: MS builds SQL Server Management Studio AND Excel, but can't make one save in the standard format of the other?
OK, I'm a data analyst, but not allowed to change/mod either the data or structures directly. So full READ, but no WRITE.
I'm trying to do a dump so I can do some of this analysis offline, as I have no remote access either.
So one VARCHAR2 column in this table is for comments on the purchase of the asset being described/tracked. Of course, there are commas. The only export types built into SQL Server Management Studio are .csv and .txt, and .csv just turns into a mess when comma is included as a delimiter.
So after an hour or so of screwing around with this (including reading a thread on methods for excluding the one column from a SELECT while still exporting the other 221 columns in the table without having to write them all out manually; fun reading, impressive, but it means I'd have to figure out which of those methods actually works, and then still export the one column separately and insert it into the Excel file separately), I am throwing this problem on the pile at Stack Overflow.
Someone else must have worked around this frustration of the .csv export format versus the commas embedded in 'comment' text.
Any help would be appreciated.
Why don't you simply select all the data in the SSMS results grid, then copy and paste it into a blank Excel file?
It should paste all the data in the correct format, keeping comma-containing fields in a single column.
Try that.
So if you replace the ' with some special character, you can export it:
SELECT REPLACE(columnName, '''', '`')
FROM Table
Another solution, if you use Management Studio, is the SQL Server Import and Export Wizard:
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/start-the-sql-server-import-and-export-wizard
I have several hundred Excel files that are not normalized enough for me to efficiently import into the tables of my database. Information on this is hard to find, but from what I have seen it is possible to index xlsx files with FTS. I am not really looking to implement an alternate database for this, as it is a one-time thing that will not receive new data.
Would it be possible to do this with FTS? If so, could someone point me in the right direction, as the info I've found on MSDN is quite vague.
Thanks.
I have done something similar using BULK INSERT. I would suggest taking a look at it:
http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file
It works because Excel data can be saved as a text file in which each column is separated by ";" and each row by "\n". You can then use BULK INSERT to crawl through your Excel sheet and insert it into a table.
Note that all the values coming from BULK INSERT are text values. So if your table contains int values, for example, you will need a temporary table first (the column names here are placeholders):
CREATE TABLE #TEMPORARYTABLE (
    Id   NVARCHAR(MAX), -- every staged column is text
    Name NVARCHAR(MAX)
);
The # creates a table that only exists until you disconnect from SQL Server.
All values in that table should be nvarchars.
You can then insert the contents of #TEMPORARYTABLE into your real table and CAST the nvarchar values to int values or whatever else you need, as sketched below.
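A minimal sketch of the whole flow, assuming a semicolon-delimited text file C:\data\sheet1.txt and a real table dbo.RealTable (Id INT, Name NVARCHAR(100)); all of these names are illustrative:
BULK INSERT #TEMPORARYTABLE
FROM 'C:\data\sheet1.txt'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');

-- cast the staged text values to the real column types
INSERT INTO dbo.RealTable (Id, Name)
SELECT CAST(Id AS INT), Name
FROM #TEMPORARYTABLE;

DROP TABLE #TEMPORARYTABLE;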
FTS is a feature of SQL Server; the data you want to create FTS indexes for needs to be in a SQL Server database.
With your data being in Excel and not in SQL Server, you will not be able to create FTS indexes on those Excel sheets.
But if you import that data into SQL Server, then you will be able to make use of the FTS features; until then, unfortunately, FTS is not an option for you.
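For illustration, once the data has been imported into a table, enabling FTS might look like this (the catalog, table, key index, and column names are all hypothetical; the table needs a unique, non-nullable key index such as its primary key):
CREATE FULLTEXT CATALOG ImportCatalog;

CREATE FULLTEXT INDEX ON dbo.ImportedSheets (SheetText)
    KEY INDEX PK_ImportedSheets   -- unique index required by FTS
    ON ImportCatalog;

-- then the imported data can be searched:
SELECT * FROM dbo.ImportedSheets
WHERE CONTAINS(SheetText, 'budget');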
I have about 100,000+ delimited text files (they don't have the same number of columns in each file; e.g. some files have 10 columns, some have 20, and so on). I need to upload all of them to SQL Server. Please suggest how I can do this.
I also have an Excel spreadsheet listing the names/paths where the files are stored, as well as the number of columns in each text file. I am clueless how to go about it.
Thanks in advance
I assume you are able to use C# (or another programming language) to create an app which will help you complete the task. The program should do the following:
Run through all the files and determine all the columns you need.
Create a table on SQL Server with the columns the program found. Set datatype varchar(max) for each column (see the sketch after this list).
Run through all the files again and insert the data into the table. There are two ways you can go:
a) Insert data row by row. Pretty slow, but simple.
b) If you use C#, you can implement your own DataReader and use the SqlBulkCopy class to bulk insert the data into the table
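As a sketch of step 2, the DDL the program generates might look like this (the table and column names are hypothetical; files with fewer columns simply leave the trailing ones NULL):
CREATE TABLE dbo.ImportStaging (
    SourceFile VARCHAR(260) NOT NULL, -- which input file the row came from
    Col01 VARCHAR(MAX) NULL,
    Col02 VARCHAR(MAX) NULL,
    Col03 VARCHAR(MAX) NULL  -- continue up to the widest file's column count
);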
Here are some links which may help you:
http://www.sqlteam.com/article/use-sqlbulkcopy-to-quickly-load-data-from-your-client-to-sql-server
http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/
http://daniel.wertheim.se/2010/11/10/c-custom-datareader-for-sqlbulkcopy/