How can I store the result of a SQL query in a CSV file using Squirrel?

Version 3.0.3. It's a fairly large result set, around 3 million rows.

Martin pretty much has this right.
The TL;DR version is that you need the "SQLScripts" plugin (which is one of the "standard" plugins), and then you can select these menu options: Session > Scripts > Store Result of SQL in File
I'm looking at version 3.4. I don't know when this feature was introduced, but you may need to upgrade if you don't have it and cannot install the SQLScripts plugin.
Instructions for installing a new plugin can be found at: http://squirrel-sql.sourceforge.net/user-manual/quick_start.html#plugins
But if you're performing a fresh install of Squirrel you can simply select the "SQLScripts" plugin during the installation.
Here's the long version:
Run the query
Connect to the database. Click on the SQL tab. Enter your query. Hit the run button (or Ctrl-Enter).
You should see the first 100 rows or so in the results area in the bottom half of the pane (depending upon how you've configured the Limit Rows option).
Export the full results
Open the Session menu. Select the Scripts item (nearly at the bottom of this long menu). Select Store Result of SQL in File.
This opens a dialog box where you can configure your export. Make sure you check Export the complete result set to get everything.
I haven't tried this with a 3 million row result set, but I have noticed that Squirrel seems to stream the data to disk (rather than reading it all into memory before writing), so I don't see any reason why it wouldn't work with an arbitrarily large file.
Note that you can export directly to a file by using Ctrl-T to invoke the tools popup and selecting sql2file.

I have found a way to do this; there is nice support for it in Squirrel. Run the SQL SELECT (don't worry, the 100-row limit will be ignored by the exporter). Then, in the main menu, choose Session > Scripts > Store Result of SQL in File. This functionality may not be present by default; it may come from one of the standard plugins that is not installed by default. I don't know which plugin, though.

I also wanted to export the results of an SQL query to a CSV file using SquirrelSQL. However, according to the changes file, it seems that this functionality is not supported even in SquirrelSQL 3.3.0.
So far I was able to export only the data shown in the result table of the SQL query, by right-clicking the table > Export to CSV. The table size is 100 rows by default, and so is the CSV export. You may change the table size in Session Properties > SQL > Limit rows. E.g., change the size to 10000 and your export will also contain 10000 rows. The question is how SquirrelSQL will deal with really big result sets (millions of rows)...

Run this from your GUI (this is PostgreSQL syntax; note that COPY TO writes the file on the database server, so the server process needs write access to that path):
COPY (SELECT * FROM some_table) TO '/some/path/some_table.csv' WITH CSV HEADER
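If the server account cannot write where you need the file, psql's client-side variant of the same command may help; a minimal sketch (path is a placeholder):
\copy (SELECT * FROM some_table) TO '/some/path/some_table.csv' WITH CSV HEADER
\copy runs the query on the server but writes the file through the psql client, so only your local permissions matter.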

Using Squirrel 3.5.0
The "Store SQL result as file" is great if you only have a simple Select query. A more complex one with parameters wont work.
Even trying to export a result of 600,000+ rows to a CSV file can also fail.

Related

SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file, and then we have a job that imports it into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I would then run a select * from myspecialstoragetable and get a .sql file they can ship to me, which we can then import into our copy of the db with some job/SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
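As a rough sketch, such a wrapper could shell out to something like this (server name, database, table, and output path are placeholders; the exact flags depend on your environment):
bcp "mydb.dbo.myspecialstoragetable" out "C:\export\storage.dat" -c -t, -S YOURSERVER -T
Here -c exports character data, -t, sets a comma field terminator, and -T uses Windows authentication; the same utility with in instead of out loads the file into your blank copy at the other end.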
Since it is a function within your application, in what language is the application front-end written? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export. In the last step, you can save the steps into a VB/.NET module. If necessary, modify this file to change table names etc., and integrate this DTS module into your application. While doing the sample export, export to a suitable format such as .CSV or Excel, whichever format you will be able to import into a blank database.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data in the desired format. He can then mail such a file to you.
If your application is not written in .NET, whichever language it is written in will have options to read data from SQL Server and dump the data to a .CSV or delimited text file. If it is a primitive language, you may have to do it by looping through the records and concatenating the fields of each one before writing them to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suited if the number of records is too large.
You are probably thinking of a .sql file as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of the management interface contain table structures/DDL rather than INSERT statements carrying each record's values.
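If you do want a file of INSERT statements, you can generate them with a query; a minimal sketch, assuming a table with an int id and a varchar name (column names are illustrative):
-- Emits one INSERT statement per row; doubles any embedded single quotes
SELECT 'INSERT INTO myspecialstoragetable (id, name) VALUES ('
    + CAST(id AS varchar(20)) + ', '''
    + REPLACE(name, '''', '''''') + ''');'
FROM myspecialstoragetable;
Save the result set to a file and you have a crude but portable .sql export.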

Exporting Data - IBM Data Studio DB2

I am trying to export a large query from a DB2 database to a text file on my desktop using IBM Data Studio, and I can't seem to get anything to work. When I run the query and right-click on the results tab > Export > All Results, it only gives me the first 500 records. This table is going to be in the range of 15 million records, so I can't just change the display settings to allow it to display that many records. I cannot use the unload utility because it won't let me save the file to my desktop. Any ideas?
The row limit in the SQL Results view is probably active.
Click the small triangle in the upper-left corner of the "SQL Results" view, then "Preferences...".
However, this is not a good way to export data, because all the rows have to be fetched into Data Studio and then exported via the GUI. It is better to use the "export" command.
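For example, from the DB2 command line processor the export command looks something like this (a sketch; path, schema, and table are placeholders):
EXPORT TO /home/user/results.csv OF DEL MODIFIED BY NOCHARDEL SELECT * FROM myschema.mytable
OF DEL produces a delimited text file, and the MODIFIED BY clause is optional; adjust it to control character and column delimiters.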
In IBM Data Studio: right-click the table/view, then Data > Extract, and browse to the destination where the file should be written.

SSIS - Any other solution apart from Script Task

Team,
My objective is to load data from Excel into SQL tables using SSIS. However, the Excel files are quite dynamic, i.e. their column count could vary, or the order of the existing columns may change. But the destination table will be the same...
So I was contemplating on few options like:
1) Using a SQL command in the "Excel Source" - but unfortunately I have to keep the "first row as header" setting as false (to work around the Excel Connection Manager sensing the data type as numeric based on the first few records). So querying based on the header doesn't work here.
2) The other option in my mind is a Script Task with C# code that reads the Excel file based on the columns I know. In this case the order and the insertion/deletion of columns won't matter.
Is the Script Task the only option available to me? Is there any other simple way to achieve this in SSIS? If possible, please also suggest a reference for it.
Thanks,
Justin Samuel.
If you need to automate the process, then I'd definitely go with a script component / OleDbDataAdapter combo (you can't use a StreamReader because Excel is a proprietary format). If not, go with the Import Wizard.
If you try to use a connection manager based solution, it's going to fail when the file layout changes. With the script component / OleDbDataAdapter combo, you can add logic in to interpret the fields and standardize the record layout before loading. You can also create an error buffer and gracefully push error values to it with Try / Catch.
Here's some links on how to use the script component as a source in the data flow task:
http://microsoft-ssis.blogspot.com/2011/02/script-component-as-source-2.html
http://beyondrelational.com/modules/2/blogs/106/posts/11126/ssis-script-component-split-single-row-to-multiple-rows.aspx
This could be done easily using "Import and Export Data" tool available with SQL Server.
Step 1: Specify your Excel as source and your SQL Server DB as destination.
Step 2: Provide necessary mappings.
Step 3: In the final screen, you can choose "Save as SSIS Package" to the file system. A relevant .dtsx SSIS package will be created for you.
After the SQL Server Import and Export Wizard has created the package and copied the data, you can use the SSIS Designer to open and change the saved package by adding tasks, transformations, and event-driven logic.
(Since it maps based on headers, column order should not matter. And if a particular column is missing, it should automatically take NULL for it.)
Reference: http://msdn.microsoft.com/en-us/library/ms140052.aspx

importing a text file using pgAdmin

I have just downloaded pgAdmin 1.14.3 in an effort to import, query, and manage large text files. These text files are either quote-comma-quote delimited or tab delimited (they come as quote-comma-quote, and I edited many for use with another piece of software). While version 1.16 adds an import function, it has not been released yet, and I am wondering how to import data into a newly created table using pgAdmin.
The text files range from 12 MB to 2 GB, so I'm looking for a comprehensive solution that doesn't involve importing row by row. I tried this with phpPgAdmin, but ran into file size limitations embedded in the php.ini file (separate post) and am trying this as a possible workaround. I'm a little new to SQL, so I'm not really sure of all the commands at my fingertips. Any help is appreciated - thanks!
You can issue a COPY statement, like this:
COPY table_name (column_name)
FROM 'd:\test.sql';
Query returned successfully: 6 rows affected, 31 ms execution time.
See the documentation here:
http://www.postgresql.org/docs/9.1/static/sql-copy.html
Note that I did not test this in pgAdmin with large files, but using psql I have never seen a case where a file was too big for COPY.
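Since your files are quote-comma-quote or tab delimited, you will probably want to spell out the format options; a sketch (table, columns, and paths are placeholders):
-- comma-separated, double-quoted fields, first line is a header
COPY my_table (col1, col2) FROM 'd:\data.csv' WITH (FORMAT csv, HEADER true);
-- tab-delimited (tab is the default delimiter of the text format)
COPY my_table (col1, col2) FROM 'd:\data.txt' WITH (FORMAT text);
Remember that COPY FROM reads the file on the database server; if the file lives on your own machine, psql's \copy is the client-side equivalent.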

Export to excel from SQL Server 2000 using Query Analyzer code

What's the easiest way to export data to Excel from SQL Server 2000?
I want to do this from commands I can type into query analyzer.
I want the column names to appear in row 1.
In Query Analyzer, go to the Tools -> Options menu. On the Results tab, choose to send your output to a CSV file and select the "Print column headers" option. The CSV will open in Excel and you can then save it as a .XLS/.XLSX
Manual copy and paste is the only way to do exactly what you're asking. Query Analyzer can include the column names when you copy the results, but I think you may have to enable that somewhere in the options first (it's been a while since I used it).
Other alternatives are:
Write your own script or program to convert a result set into a .CSV or .XLS file
Use a DTS package to export to Excel
Use bcp.exe (but it doesn't include column names, so you have to kludge it; see the sketch after this list)
Use a linked server to a blank Excel sheet and INSERT the data
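The usual kludge for the missing bcp column names is to UNION a literal header row onto the query; a sketch (server, table, and columns are placeholders):
bcp "SELECT 'id', 'name' UNION ALL SELECT CAST(id AS varchar(20)), name FROM mydb.dbo.mytable" queryout "c:\export.csv" -c -t, -S YOURSERVER -T
Every column has to be cast to a string so the two halves line up, and strictly speaking you need an ORDER BY trick to guarantee the header row comes out first.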
Generally speaking, you cannot export data from MSSQL to a flat file using pure T-SQL, because T-SQL cannot manipulate anything outside the database (using a linked server is sort of cheating). So you usually need some sort of client application anyway, whether it's bcp.exe, dtswiz.exe, or your own program.
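For completeness, the linked-server cheat mentioned above can also be done ad hoc with OPENROWSET; a sketch, assuming the Jet 4.0 provider is available on the server and the workbook already exists with the column names in row 1 of Sheet1 (path and columns are placeholders):
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=c:\export.xls;',
    'SELECT id, name FROM [Sheet1$]')
SELECT id, name FROM mytable
It writes straight into the worksheet, but the pre-created workbook requirement is why this is usually more trouble than bcp.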
And as a final comment, MSSQL 2000 is no longer supported (unless your company has an extended maintenance agreement) so you may want to look into upgrading at some point.