Any way to import a CSV file into Firebird from a SQL command?

I need to import a few columns from a CSV file into a Firebird temporary table, but without using any tool, only from a SQL editor (something like OPENROWSET in SQL Server).
Is this supported?
Thanks

Firebird doesn't have a LOAD statement, but several tools have an import module. The easiest approach is to edit the file into a set of INSERT statements.
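A hedged sketch of that hand-edited approach for Firebird (the temporary table and its columns are made-up placeholders, not anything from the question):

    -- Assumed Firebird global temporary table matching the CSV columns
    CREATE GLOBAL TEMPORARY TABLE IMPORT_TMP (
        ID     INTEGER,
        NAME   VARCHAR(100),
        AMOUNT NUMERIC(15,2)
    ) ON COMMIT PRESERVE ROWS;

    -- Each CSV line becomes one INSERT statement
    INSERT INTO IMPORT_TMP (ID, NAME, AMOUNT) VALUES (1, 'Alpha', 10.50);
    INSERT INTO IMPORT_TMP (ID, NAME, AMOUNT) VALUES (2, 'Beta', 7.25);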

You can have a look at Data Transformer (disclaimer: I'm its developer). It can convert CSV (and other formats) to SQL. The generated SQL contains an INSERT statement for each line and a CREATE TABLE statement.
It works offline - the data never leaves your computer.
You can get it from the Mac App Store or the Microsoft Store.

Related

How to transfer SQL query results to a new csv file with headers?

I want to transfer SQL query results to a new CSV file. I have placed my SQL query inside a loop that should export the query results to a CSV file on each iteration. I'm using MS SQL Server 2012. I don't want to use a GUI option.
SQL Server is not really designed to import and export files. You can use the bulk copy program (bcp), but I don't think it works inside T-SQL code (looping). You can use OPENROWSET, but you need to set a special flag that opens up your surface area of attack, which some do not want to do.
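For reference, the flag in question is presumably Ad Hoc Distributed Queries; a minimal sketch of turning it on with sp_configure, which is exactly the surface-area widening mentioned above:

    -- Enables ad hoc OPENROWSET/OPENDATASOURCE queries; widens the attack surface,
    -- which is why many DBAs leave it off.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
    RECONFIGURE;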
The answer is SSIS (or a tool like Talend). It comes with SQL Server and is designed by MS as the go-to tool for importing to and exporting from SQL Server. If you right-click on the database, choose Tasks and then Export, the wizard eventually creates and executes an SSIS package.
I recommend you reconsider a GUI option.
PS: another answer was to use Save Results As. I have heard of problems using this method, including problems with delimiters or text-qualified fields.
There are multiple ways to attain this. You can export the result set using BCP, using the Import/Export Wizard, or using Ctrl+Shift+S (this will change the result set to Save As). Hope this may help.

Insert data using PowerShell

We have a static SQL file which consists of INSERT statements; basically test data.
Is it possible to execute this script using PowerShell on an Azure SQL DB where Always Encrypted is enabled? We use Key Vault to store the certs.
Unfortunately, PowerShell (Invoke-SqlCmd) does not support insert statements against encrypted columns at this point. The only SQL tool from Microsoft that supports such statements at this point is SSMS - please see: https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-using-sql-server-management-studio#param .
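For context, the SSMS route described in that link relies on parameterization: with Parameterization for Always Encrypted enabled, values bound for encrypted columns are declared as variables rather than written as literals. A hedged sketch (the table and column names are assumptions):

    -- SSMS converts these DECLAREs into parameters it can encrypt client-side.
    DECLARE @SSN CHAR(11) = '795-73-9838';
    DECLARE @Salary MONEY = 45000;

    INSERT INTO dbo.Patients (SSN, Salary)  -- assumed table with encrypted columns
    VALUES (@SSN, @Salary);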
An alternative could be to put your test data into a CSV file and use the Import Export Wizard to import the data into the database. You could save the import job as an SSIS package, which you could execute from the command line. Here is a blog article on using the I/E Wizard for importing (and encrypting) data from a database (importing from a file would be similar). https://blogs.msdn.microsoft.com/sqlsecurity/2015/10/31/ssms-encryption-wizard-enabling-always-encrypted-in-a-few-easy-steps/
Jakub

How can I spool a csv formatted file using an SQL Command?

I have an SQL query that generates a result covering the day's data from the database. I want a CSV-formatted file to be generated every day from this query and saved in a folder. Is there any way I can do this?
NOTE: I am using SQL Server Management Studio 2008 for the database.
This is a question about bulk export, which is well documented in MSDN: Importing and Exporting Bulk Data.
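As a hedged sketch of what the documented bcp route can look like when driven from T-SQL (for instance from a daily SQL Agent job); the database, query, server name, and output path are placeholders, and xp_cmdshell has to be enabled first:

    -- xp_cmdshell must be enabled via sp_configure before this will run.
    EXEC xp_cmdshell
        'bcp "SELECT * FROM MyDb.dbo.DailyData" queryout "C:\exports\daily.csv" -c -t, -T -S .';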

import csv to sql

I have to import a CSV file into a SQL database table which has already been created (it is empty and has the same number of named columns). It would be great if you could suggest any tutorials or give some tips.
I assume you are using Microsoft SQL Server. Do you need to do this in a program or manually? There is a tutorial on using the bcp command for that, or alternatively a SQL command. If you need to parse the CSV file for your own code, see this previous SO question.
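If the SQL-command route fits, a minimal BULK INSERT sketch could look like this; the table name, file path, and delimiters are assumptions about your setup:

    -- Assumes the target table already exists and the CSV has a header row.
    BULK INSERT dbo.TargetTable
    FROM 'C:\import\data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2   -- skip the header row
    );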

BCP utility to create a format file, to import Excel data to SQL Server 2008 for BULK insertion

I am trying to import Excel 2003 data into a SQL table on SQL Server 2008.
I tried to add a linked server but have met with little success.
Now I am trying to check if there's a way to use the BCP utility to do a BULK INSERT or a BULK operation with OPENROWSET, using a format file to get the Excel mapping.
First of all, how can I create a format file for a table that has differently named columns than the Excel spreadsheet columns?
Next, how do I use this format file to import data from, say, a file at C:\Folder1\Excel1.xls into table Table1?
Thank you.
There are some examples here that demonstrate what the data file should look like (CSV) and what the format file should look like. Unless you need to do this a lot, I'd just hand-craft the format file, save the Excel data to CSV, then try using bcp or OPENROWSET.
The format file specifies the column names for the destination. The data file doesn't have column headings, so you don't need to worry about the Excel (source) columns being different.
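A hedged sketch of the OPENROWSET(BULK ...) variant once the Excel data has been saved as CSV and a format file written by hand; the paths, table, and column names are placeholders:

    -- The format file maps the CSV fields onto the destination column names.
    INSERT INTO dbo.Table1 (Col1, Col2)
    SELECT t.Col1, t.Col2
    FROM OPENROWSET(
             BULK 'C:\Folder1\Excel1.csv',
             FORMATFILE = 'C:\Folder1\Excel1.fmt'
         ) AS t;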
If you need to do more mapping etc., create an SSIS package. You can use the data import wizard to get you started, then save it as an SSIS package, then edit it to your heart's content.
If it's a one-off, I'd use the SQL data import wizard, from right-clicking on the database in Management Studio. If you just have a few rows to import from Excel, I typically open a query to Edit Top 200 Rows, edit the query to match the columns I have in Excel, then copy and paste the rows from Excel into SQL Management Studio. It doesn't handle errors very well, but it's quick.