SSMS import/export wizard with today's date in filename

I am exporting data using the Task, Export Data menu in SSMS. I want to save the export as an SSIS package. My only issue is I need today's date in the filename. I know in SSIS you can do this in an expression.
But when typing the filename in the box, how can I write this out?
PtSurveyList_'getdate()'?
What would be the correct syntax so the system knows I want the getdate() function, not the literal word?

The import/export wizard is a streamlined editor that creates an SSIS package. It's no different than using BIDS/SSDT to create a package (unless you're using SQL Server Express Edition wherein you are not licensed to save the resultant SSIS package, only execute it).
To answer your question, you cannot accomplish what you are asking for by directly using the import/export wizard.
Option 1
Use the import/export wizard, save your SSIS package, and then edit it with BIDS/SSDT. Unless you're on 2005/2008, you should be able to download the correct version of SSDT from the Microsoft website and edit your package locally. The screenshot provided is not where you would apply the logic; you'd need to apply an Expression to the Flat File Connection Manager's ConnectionString property.
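For reference, that ConnectionString expression would look something like the sketch below (the folder path is just an example). SSIS expressions have no date-format function, so you build the yyyymmdd stamp from date parts:
"C:\\Export\\PtSurveyList_"
+ (DT_WSTR, 4)YEAR(GETDATE())
+ RIGHT("0" + (DT_WSTR, 2)MONTH(GETDATE()), 2)
+ RIGHT("0" + (DT_WSTR, 2)DAY(GETDATE()), 2)
+ ".csv"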
An even better approach is to not build the date logic into the expression at all. That becomes brittle when the server was down for a day due to patching, the file was delivered late, etc., and now you have two files to process but what you built only looks for "today's" file. Now what: rename the file, change the server time, edit your package? Instead, use a Foreach (file) Enumerator to pick up the file for processing. You can use a wildcard in the File Specification to restrict it to only the PTSurveyList files. Then, obviously, use a File System Task to archive/move the processed file out of the source folder so you don't process it twice.
Option 2
Use the import/export wizard as is. It always looks for a file called PTSurveyList.csv. If you need today's date attached to the data you import, add a column to the target table with a default constraint of GETDATE(). That will ensure you have the processed/today's date in the table.
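For example, something like this (table and column names are illustrative):
ALTER TABLE dbo.PTSurveyList
    ADD LoadDate datetime NOT NULL
    CONSTRAINT DF_PTSurveyList_LoadDate DEFAULT (GETDATE());
-- Rows inserted by the package now carry the processing date automatically.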
You then need to use OS-level tooling/scripting/SQL Agent to identify the current day's file and handle any file manipulation from there. I'd go with PowerShell, but you can accomplish this with DOS batch scripting, although it'll be uglier.
Pseudologic
Find most recent PTSurveyList_yyyymmdd.csv
Copy with overwrite to PTSurveyList.csv
Run SSIS package
Move to Archive/PTSurveyList_yyyymmdd.csv
Delete PTSurveyList.csv
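A minimal PowerShell sketch of that pseudologic (folder, file, and package paths are placeholders; yyyymmdd names sort lexically, so a descending name sort finds the latest file):
# Assumes files land in C:\CentOS_Share and the package path is a placeholder
$source  = 'C:\CentOS_Share'
$archive = Join-Path $source 'Archive'
New-Item -ItemType Directory -Path $archive -Force | Out-Null

# Find the most recent PTSurveyList_yyyymmdd.csv
$latest = Get-ChildItem -Path $source -Filter 'PTSurveyList_*.csv' |
    Sort-Object Name -Descending |
    Select-Object -First 1

if ($latest) {
    # Copy with overwrite to the fixed name the package expects
    Copy-Item $latest.FullName (Join-Path $source 'PTSurveyList.csv') -Force

    # Run the SSIS package
    dtexec /File 'C:\SSIS\PTSurveyList.dtsx'

    # Archive the dated file and remove the working copy
    Move-Item $latest.FullName (Join-Path $archive $latest.Name) -Force
    Remove-Item (Join-Path $source 'PTSurveyList.csv')
}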
A "trick" to solving the dynamic date via xp_cmdshell is to dynamically build the execution string. Approximately
DECLARE @Command nvarchar(4000);
-- Build the dated file name at run time, then execute the whole string
SELECT @Command = N'EXEC master.dbo.xp_cmdshell ''rename "C:\CentOS_Share\PTSurveyList_.csv" "PTSurveyList_' + CONVERT(char(10), CURRENT_TIMESTAMP, 121) + '.csv"''';
EXEC(@Command);
Of course, you can then run into the double-quote issue with xp_cmdshell.

Related

Modify my existing SSIS package to perform this specific operation

I have an SSIS package created using SSDT and running as a job on SQL Server 2014.
This SSIS package retrieves an Excel file (.xlsx) from a specific folder and exports its content into a specific table on my SQL Server database. The package runs fine.
My Data Flow is in the following sequence:
Import Excel file from folder
Apply a Conditional Split to split data with today's date
Export the data into the SQL Server table in the database
Here is my problem:
I will now have 4 additional Excel files in that folder, and they will need to be exported into that same SQL Server table.
So what is the best way forward to achieve this (assuming all of them are possible solutions):
Rewrite 4 additional SSIS packages from scratch?
Use “Save As” existing package with a new name (4 times) and modify the file name to be retrieved?
Modify my existing SSIS package to accommodate for the additional 4 Excel files?
Any help would be appreciated.
Assuming the 4 Excel files have the same structure and are going to the same table, you'll want to use the Foreach loop to process each file in the folder.
SentryOne has a good example of looping through each file in a folder and archiving. I imagine it can be adjusted for your use case.
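If it helps, the usual wiring is: the Foreach (file) enumerator writes each matching path into a string variable, and the Excel connection manager gets a property expression on its ExcelFilePath property pointing at that variable (the variable name here is just an example):
ExcelFilePath:  @[User::CurrentExcelFile]
Inside the loop, your existing data flow then runs once per file, against whichever workbook the variable currently points at.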

SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I then run a select * from myspecialstoragetable and get a .sql file they can ship to me, which we can then import into our copy of the db using some job/SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
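A rough sketch of the round trip (database, table, and server names are placeholders; -n keeps native format, which is safe here since the schemas match):
bcp MyDb.dbo.myspecialstoragetable out C:\export\survey.dat -S CustomerServer -T -n
bcp MyDb.dbo.myspecialstoragetable in C:\export\survey.dat -S YourServer -T -n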
Since it is a function within your application, in what language is the application front-end written? If it is .NET, you can use Data Transformation Services in SQL Server to do a sample export. In the last step, you could save the steps into a VB/.NET module. If necessary, modify this file to change table names, etc. Integrate this DTS module into your application. While doing the sample export, export to a suitable format such as .CSV or Excel, whichever format you will be able to import into a blank database from.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data to the desired format. He can then mail that file to you.
If your application is not written in .NET, whatever language it is written in will have options to read data from SQL Server and dump it to a .CSV or delimited text file. If it is a primitive language, you may have to do it yourself by looping through the records, concatenating the fields of each one, and writing them to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suitable if the number of records is very large.
You are probably thinking of a .sql file, as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of the management interface are used for table structures/DDL rather than for generating INSERT statements containing each record's hard values.

Execute service builder generated sql file on postgresql

I would like to execute the sql files generated by the service builder, but the problem is that the sql files contain types like LONG, VARCHAR, etc.
Some of these types don't exist in PostgreSQL (for example, LONG corresponds to BIGINT).
Is there a simple way to convert the sql files' structures so that they can run on PostgreSQL?
Execute ant build-db on the plugin and you will find an sql folder with the various vendor-specific scripts.
Daniele is right; using the build-db task is obviously the right way to do it.
But... I remember a similar situation some time ago: I had only the Liferay pseudo-SQL file and needed to create proper DDL. I managed to do it the following way:
You need to have Liferay running on your desktop (or on the machine where the source sql file is), as this operation requires the portal Spring context to be fully wired.
Go to Configuration -> Server Administration -> Script
Change language to groovy
Run the following script:
import com.liferay.portal.kernel.dao.db.DB
import com.liferay.portal.kernel.dao.db.DBFactoryUtil

// Get the PostgreSQL-specific DB implementation and translate the pseudo-SQL file
DB db = DBFactoryUtil.getDB(DB.TYPE_POSTGRESQL)
db.buildSQLFile("/path/to/folder/with/your/sql", "filename")
Where the first parameter is obviously the path and the second is the filename without the .sql extension. The file on disk should have the proper extension: it must be called filename.sql.
This will produce a tables folder next to your filename.sql, which will contain a single tables-postgresql.sql with your PostgreSQL DDL.
As far as I remember, Service Builder uses the same method to generate database-specific code.

SSIS - Any other solution apart from Script Task

Team,
My objective is to load data from Excel into SQL tables using SSIS. However, the Excel files are quite dynamic, i.e. their column count could vary or the order of existing columns may change. But the destination table will be the same...
So I was contemplating on few options like:
1) Using a SQL Command in the "Excel Source" - but unfortunately I have to keep the "first row as header" setting as false (to resolve the issue of the Excel Connection Manager sensing the data type as numeric based on the first few records). So querying based on the header doesn't work here.
2) The other option in my mind is a Script Task, writing C# code to read the Excel file based on the columns I know. In that case the order and insertion/deletion of columns won't matter.
Is the Script Task the only option available to me? Is there any other simple way to achieve this in SSIS? If possible, please also suggest a reference for it.
Thanks,
Justin Samuel.
If you need to automate the process, then I'd definitely go with a script component / OleDbDataAdapter combo (you can't use a StreamReader because Excel is a proprietary format). If not, go with the import wizard.
If you try to use a connection-manager-based solution, it's going to fail when the file layout changes. With the script component / OleDbDataAdapter combo, you can add logic to interpret the fields and standardize the record layout before loading. You can also create an error buffer and gracefully push error values to it with Try / Catch.
Here are some links on how to use the script component as a source in the data flow task:
http://microsoft-ssis.blogspot.com/2011/02/script-component-as-source-2.html
http://beyondrelational.com/modules/2/blogs/106/posts/11126/ssis-script-component-split-single-row-to-multiple-rows.aspx
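And a rough sketch of the reading side (the connection string, sheet name, and column name are assumptions; the script component's output-buffer plumbing is omitted):
using System.Data;
using System.Data.OleDb;

// Connection string and sheet name are assumptions; HDR=YES reads row 1 as headers.
string connStr = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                 @"Data Source=C:\Files\PTSurvey.xlsx;" +
                 "Extended Properties=\"Excel 12.0 XML;HDR=YES\"";

var table = new DataTable();
using (var conn = new OleDbConnection(connStr))
using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn))
{
    adapter.Fill(table); // pulls the whole sheet regardless of column order
}

foreach (DataRow row in table.Rows)
{
    // Map by header name so reordered or extra columns don't break the load
    object surveyId = table.Columns.Contains("SurveyId")
        ? row["SurveyId"]
        : DBNull.Value;
    // ...assign values to the output buffer here...
}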
This could be done easily using the "Import and Export Data" tool available with SQL Server.
Step 1: Specify your Excel as source and your SQL Server DB as destination.
Step 2: Provide necessary mappings.
Step 3: On the final screen, you can choose "Save as SSIS Package" and save it to the file system. A corresponding .dtsx SSIS package will be created for you.
After the SQL Server Import and Export Wizard has created the package and copied the data, you can use the SSIS Designer to open and change the saved package by adding tasks, transformations, and event-driven logic.
(Since it maps based on headers, column order should not matter. And if a particular column is missing, it should automatically use NULL for it.)
Reference: http://msdn.microsoft.com/en-us/library/ms140052.aspx

Export to excel from SQL Server 2000 using Query Analyzer code

What's the easiest way to export data to Excel from SQL Server 2000?
I want to do this from commands I can type into query analyzer.
I want the column names to appear in row 1.
In Query Analyzer, go to the Tools -> Options menu. On the Results tab, choose to send your output to a CSV file and select the "Print column headers" option. The CSV will open in Excel and you can then save it as a .XLS/.XLSX
Manual copy and paste is the only way to do exactly what you're asking. Query Analyzer can include the column names when you copy the results, but I think you may have to enable that somewhere in the options first (it's been a while since I used it).
Other alternatives are:
Write your own script or program to convert a result set into a .CSV or .XLS file
Use a DTS package to export to Excel
Use bcp.exe (but it doesn't include column names, so you have to kludge it)
Use a linked server to a blank Excel sheet and INSERT the data
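On that last point, a sketch of the idea using OPENROWSET rather than a persistent linked server (the workbook must already exist with headers in Sheet1, the Jet provider must be available on the server, and the file/table/column names are placeholders):
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\Export\Book1.xls;HDR=YES',
    'SELECT Col1, Col2 FROM [Sheet1$]')
SELECT Col1, Col2
FROM dbo.SourceTable;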
Generally speaking, you cannot export data from MSSQL to a flat file using pure TSQL, because TSQL cannot manipulate anything outside the database (using a linked server is sort of cheating). So you usually need to use some sort of client application anyway, whether it's bcp.exe, dtswiz.exe or your own program.
And as a final comment, MSSQL 2000 is no longer supported (unless your company has an extended maintenance agreement) so you may want to look into upgrading at some point.