How to set up SSIS to extract data from a Postgres database

I have a PostgreSQL database in the AWS cloud. I would like to use SSIS to extract tables and move them over to a local SQL Server.
Has anyone attempted to do this? Is it possible?
Ultimately I would like to move the tables from Postgres to SQL Server without having to purchase a tool.

As per the documentation, you would need to follow these steps to connect SSIS to a Postgres database:
get the PostgreSQL ODBC driver (psqlODBC), for example via Stack Builder or the standalone installer
connect to PostgreSQL with the psqlODBC driver, using the proper connection string, typically:
Driver={PostgreSQL ODBC Driver(UNICODE)};Server=<server>;Port=<port>;Database=<database>;UID=<user id>;PWD=<password>
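For example, with purely hypothetical values for an AWS RDS instance, the string might look like this:
Driver={PostgreSQL ODBC Driver(UNICODE)};Server=mydb.abc123.us-east-1.rds.amazonaws.com;Port=5432;Database=sales;UID=ssis_user;PWD=secret
The host, database, and credentials above are placeholders; substitute the values of your own RDS instance.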

You can use the Postgres OLE DB Provider to connect to Postgres from an OLE DB Source. The following link contains a step-by-step guide to importing data from Postgres into SQL Server:
Export data from Postgres to SQL Server using SSIS

Related

Azure SQL Data Migration Assistant (DMA) Error - Three or Four Part Names

I'm using the MS Data Migration Assistant tool to move a SQL Server 2016 DB to Azure. I'm getting the following error on 80+ stored procedures:
Queries or references using three- or four-part names not supported in Azure SQL Database. Three-part name format, [database_name].[schema_name].[object_name], is supported only when the database_name is the current database or the database_name is tempdb and the object_name starts with #.
All of these stored procedures are using the current database and referencing the current database name. For example, this instruction is causing the error:
DELETE FROM [STDR].[dbo].[report] WHERE [report_id] = @xid
and when I run the command:
SELECT DB_NAME();
I get:
STDR
Could this be an error in the DMA tool? It's preventing me from executing the migration. I'd rather not have to modify all of these procedures. Thanks.
1. Queries or references using three- or four-part names not supported in Azure SQL Database.
This is not an error in the DMA tool. Cross-database queries using three- or four-part names are not supported in Azure SQL Database.
You can read more in the official documentation: Resolving Transact-SQL differences during migration to SQL Database.
2. Three-part name format, [database_name].[schema_name].[object_name], is supported only when the database_name is the current database or the database_name is tempdb and the object_name starts with #.
For this one, there is an idea you can try: give the target Azure SQL Database the same database name and the same schema objects as your on-premises SQL Server. Otherwise, when your SQL Server 2016 DB is migrated to Azure, the current database is not [STDR], and the three-part references cause the error.
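A minimal sketch of that idea, assuming you are connected to the master database of the target Azure logical server (the database name [STDR] comes from the question):
-- Create the target database with the same name as on-premises, so that
-- three-part references to [STDR] resolve to the current database.
CREATE DATABASE [STDR];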
Reference: Migrate on-premises SQL Server or SQL Server on Azure VMs to Azure SQL Database using the Data Migration Assistant.
Hope this helps.
It's just the four-part and three-part names that are not compatible with Azure SQL Database. You can script all your programming objects and then change the three-part name format to the two-part name format (dbo.[NameOfTheObject]) using Find and Replace in a text editor like Notepad++, then run that script on your Azure SQL Database to migrate your programming objects.
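For example, applied to the statement from the question (a sketch; @xid stands for the procedure's parameter):
-- Three-part name: rejected by Azure SQL Database
DELETE FROM [STDR].[dbo].[report] WHERE [report_id] = @xid;
-- Two-part name: migrates cleanly
DELETE FROM [dbo].[report] WHERE [report_id] = @xid;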
After that you can use DMA only to migrate the schema and data of your tables.

Configuration file in Netezza

Is there a configuration file in Netezza like tnsnames.ora in Oracle which contains database names and their connect string names?
If so, what is the default location of the file?
I'm using Informatica PowerCenter to load to a target Netezza table. I want to know the database details behind the connect string Informatica uses to connect to the Netezza DB. In Oracle, I could have got that information from the tnsnames.ora file.
Netezza doesn't have an equivalent to Oracle TNSNames.
ODBC Connection String Example:
Driver={NetezzaSQL};servername=myServerAddress;port=myPortNumber;database=myDataBase;username=myUsername;password=myPassword;
ODBC connection string reference: ConnectionStrings.com
ODBC configuration: IBM documentation
JDBC configuration: IBM documentation
You can check the DSN entry (the connect string name in the Informatica connection) in the odbc.ini file under the LD_LIBRARY_PATH location, which is defined at the time of the Netezza ODBC driver installation.
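A minimal sketch of what such a DSN entry in odbc.ini might look like (every name, path, and value here is hypothetical; the keys mirror the connection string above, and the driver path is merely typical of a Linux Netezza ODBC install):
[NZ_PROD]
Driver = /usr/local/nz/lib64/libnzodbc.so
Servername = netezza-host.example.com
Port = 5480
Database = SYSTEM
Username = infa_user
Password = secret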
In PowerCenter, a developer can check the connection details only if a dedicated connector is used. For ODBC, the only information available in Workflow Manager is the name of the ODBC DSN. The details can be checked in the ODBC definition on the server.
A small addition to @Marciejg:
We have only a few ODBC connections compared to PowerCenter connections. Each ODBC entry points to the 'system' database, and in each PowerCenter connection that targets a specific database on that server we run a 'set current_catalog PROD_EDW' in the pre-SQL. That way things are mostly visible and manageable in PowerCenter, and the ODBC entry only points to the server.
And slightly off topic: the pre-SQL has additional 'set CLIENT_*_NAME' statements that record the PowerCenter workflow/session etc. dynamically, based on PowerCenter built-in variables (they are named $PMWorkflowName and similar).
That way we can trace back to the PowerCenter code immediately from a plan file, the pg.log or, most interestingly, the HISTDB.
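A minimal sketch of such a pre-SQL block (the current_catalog value and the $PMWorkflowName variable come straight from this answer; the exact CLIENT_*_NAME settings are listed in the IBM SET reference linked below):
-- pre-SQL configured on the PowerCenter session/connection (sketch):
set current_catalog PROD_EDW;
-- followed by the SET CLIENT_*_NAME statements described above, with
-- built-in variables such as $PMWorkflowName expanded by PowerCenter at run time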
Follow these links if you want to play with it:
- https://www.ibm.com/support/knowledgecenter/SSULQD_7.2.1/com.ibm.nz.dbu.doc/r_dbuser_set.html
- http://dwhlaureate.blogspot.dk/2012/09/built-in-variables-in-informatica.html

How to export a database from MySQL Workbench to SQL Server

I designed a schema in MySQL Workbench. I want to generate a script from MySQL Workbench and use it in SQL Server.
I was able to transfer the database from MySQL Workbench to SQL Server using a database converter, which was very helpful.
I used https://www.spectralcore.com/fullconvert/ to do the conversion.

Generate an "INSERT INTO" script from SQL Server to Postgres

I need to generate a script with the data from one DB in SQL Server to Postgres. It seems the "Generate Scripts" wizard in SQL Server Management Studio doesn't produce the ANSI-format INSERT INTO statements that I need.
How can I do it?
I suggest:
Fire up an instance of Postgres on an accessible device.
Set up a linked server from Microsoft SQL Server to the PostgreSQL server, using the PostgreSQL 64-bit ODBC driver.
Start writing T-SQL to transfer the data, as sketched below.
https://www.google.com.au/search?q=postgresql+odbc+sql+server+2012
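A hedged sketch of that approach (the linked server name PGLINK, the DSN PostgreSQL35W, the login, and the table/column names are all hypothetical):
-- Create a linked server over the psqlODBC driver via the MSDASQL provider
EXEC master.dbo.sp_addlinkedserver
    @server = N'PGLINK',          -- hypothetical linked server name
    @srvproduct = N'PostgreSQL',
    @provider = N'MSDASQL',       -- OLE DB provider for ODBC
    @datasrc = N'PostgreSQL35W';  -- hypothetical system DSN for psqlODBC

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'PGLINK',
    @useself = N'False',
    @rmtuser = N'postgres',
    @rmtpassword = N'secret';

-- Push rows from the local SQL Server table into the Postgres table
INSERT INTO OPENQUERY(PGLINK, 'SELECT id, name FROM public.customers')
SELECT id, name FROM dbo.customers;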

Optimal way to Load Data from SQL Server to DB2

We have 40+ tables in a SQL Server DB and we need to copy the data to an IBM DB2 database. What methods do you recommend to accomplish this?
My analysis so far:
BCP and data import - the team is trying to avoid any BCP files.
Write a stored procedure and use a linked server in SQL Server to insert the data into DB2.
SSIS packages to move the data.
Please let us know if you have any better way to approach this issue.
Have you considered Information Integration, known in DB2 as federation? You can run a SELECT against SQL Server directly from DB2, and with this feature you can define a cursor over that SELECT and then just use the LOAD command.
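A minimal sketch of that approach, assuming federation is already configured and a nickname MSSQL_ORDERS has been created over the SQL Server table (all object names hypothetical):
-- Run in a DB2 command line processor session:
-- declare a cursor over the federated nickname, then load from it
DECLARE src CURSOR FOR SELECT order_id, amount FROM mssql_orders;
LOAD FROM src OF CURSOR INSERT INTO edw.orders;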