Postgres, plpgsql: Is there a way to connect to another DB from inside a stored procedure?

I have two DBs, one fed with filtered data from the other. Currently I'm using a Perl script which executes a query on the foreign DB, stores the result in a CSV file, and loads it into the local DB using the \COPY syntax.
Is there a way to write a plpgsql function which will connect to the foreign DB and load the filtered data into the local DB? (I know it can be done in e.g. plperl, but I'm looking for a more "native" way.)

There is also DBI-LINK, which supports many more databases :)

Currently, PostgreSQL has dblink, but it only supports connecting to other PostgreSQL instances - not any other database, sadly.
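For the original use case that can collapse the whole fetch-and-load into one statement. A minimal sketch, assuming the dblink module is installed and with illustrative connection parameters and table/column names:

-- Pull filtered rows from the remote Postgres and load them locally
INSERT INTO local_table (id, payload)
SELECT id, payload
FROM dblink('host=remotehost dbname=remotedb user=reader',
            'SELECT id, payload FROM source_table WHERE active')
     AS t(id integer, payload text);  -- dblink returns records, so the column list is required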

I would recommend PL/Proxy, which is significantly easier to use - just write the desired stored procedure on the target database (with some minor caveats, like not using enumerated types), and declare the same function on the source, PL/Proxy will handle the communications. It is the basis for Skype's distributed database architecture and is production-ready.
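As a rough sketch of how that pairing looks (function and connection details are illustrative, and PL/Proxy is assumed to be installed on the source side):

-- On the target database: the real function
CREATE FUNCTION get_filtered(p_min_id integer)
RETURNS SETOF text AS $$
    SELECT payload FROM source_table WHERE id > p_min_id;
$$ LANGUAGE sql;

-- On the source database: same signature, but PL/Proxy forwards the call
CREATE FUNCTION get_filtered(p_min_id integer)
RETURNS SETOF text AS $$
    CONNECT 'host=remotehost dbname=remotedb';
$$ LANGUAGE plproxy;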

Related

Can I create an unrestricted schema with a SQL database like the one used in Django?

In MongoDB or Firebase I can create {title: "something", content: "something"}, and then I can create another object with only a title, for example {title: "second title"}, or even an object with new fields that I never defined before.
Is it possible to do this with Django SQL databases?
If you are using PostgreSQL as your DB, then you can use JSONField in Django. Otherwise, check out the vendor-specific fields available in Django. This will tie you to whichever database you end up using; however, PostgreSQL is a very good multi-purpose relational DB, so I highly recommend it.
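Under the hood, a JSONField on PostgreSQL is a jsonb column; in SQL terms it boils down to something like this (table and key names are illustrative):

-- Objects need not share a shape, much as in MongoDB/Firebase
CREATE TABLE post (
    id   serial PRIMARY KEY,
    data jsonb NOT NULL
);
INSERT INTO post (data) VALUES
    ('{"title": "something", "content": "something"}'),
    ('{"title": "second title"}');
-- ...and the documents remain queryable
SELECT data->>'title' FROM post WHERE data ? 'content';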

Migrate data from SQL Server to PostgreSQL

I have a stored procedure as well as a table in SQL Server Enterprise 2014, with data in the table. Now I need the same table and data in PostgreSQL (pgAdmin 4).
Can anyone suggest an approach for migrating the data to PostgreSQL, or an idea for creating a SQL script that I can run with psql?
Depending on how much data you have, you could script out the table and data. Then you could tweak the script as needed for PostgreSQL:
Right-click on the SQL Server database > Tasks > Generate Scripts
On the "Choose Objects" screen, select your specific table then select "Next>"
On the "Set Scripting Options" screen, select "Advanced"
Find the option called "Types of data to script", then select "Schema and data" and select "OK"
Set the filename and continue through the dialog until the file is generated
Tweak the SQL script for any PostgreSQL-specific syntax (see the sketch below)
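As a rough idea of the kind of tweaks the last step involves (the table below is illustrative, not taken from an actual generated script):

-- Generated by SQL Server:
--   CREATE TABLE [dbo].[Customer](
--       [Id] [int] IDENTITY(1,1) NOT NULL,
--       [Name] [nvarchar](100) NULL)
--   INSERT [dbo].[Customer] ([Id], [Name]) VALUES (1, N'Alice')
-- PostgreSQL version:
CREATE TABLE customer (
    id   integer GENERATED ALWAYS AS IDENTITY,
    name varchar(100)
);
INSERT INTO customer (id, name) OVERRIDING SYSTEM VALUE VALUES (1, 'Alice');
-- Also drop the [brackets], N'' prefixes, and GO batch separators.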
If there is a larger amount of data, you might look into some type of data transfer tool like SSIS.
Exporting the table structure and data as Josh Jay describes will likely require some fixes where the syntax doesn't match, but it should be doable, if tedious. Luckily there are existing conversion tools available to help.
You could also try using a foreign data wrapper to map the tables in SQL Server to a running instance of PostgreSQL. Then it's just a matter of copying tables. Depends on your needs and where each database server is located relative to one another.
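One such wrapper is tds_fdw; a minimal sketch of the setup, assuming that extension and with placeholder connection details:

CREATE EXTENSION tds_fdw;
CREATE SERVER mssql_svr FOREIGN DATA WRAPPER tds_fdw
    OPTIONS (servername 'mssql.example.com', port '1433', database 'SourceDb');
CREATE USER MAPPING FOR CURRENT_USER SERVER mssql_svr
    OPTIONS (username 'reader', password 'secret');
-- Recent tds_fdw versions can import the remote schema wholesale
IMPORT FOREIGN SCHEMA dbo FROM SERVER mssql_svr INTO public;
-- After that, copying a table is a plain CREATE TABLE ... AS
CREATE TABLE local_copy AS SELECT * FROM some_imported_table;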
The stored procedures will unfortunately be far more difficult to handle. While Oracle's PL/SQL is substantially similar to PostgreSQL's PL/pgSQL, MS SQL Server/Sybase's Transact-SQL dialect is different enough to require rewrites. If the Transact-SQL functions also access .NET objects, the migration may end up far more difficult as you reimplement dependencies or find logical equivalents.
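To give a feel for the flavor of rewrite involved, here is the same trivial procedure in both dialects (names invented for illustration):

-- Transact-SQL:
--   CREATE PROCEDURE dbo.AddCustomer @name NVARCHAR(100) AS
--   BEGIN
--       INSERT INTO dbo.Customer (Name) VALUES (@name);
--   END
-- PL/pgSQL equivalent:
CREATE OR REPLACE FUNCTION add_customer(p_name varchar)
RETURNS void AS $$
BEGIN
    INSERT INTO customer (name) VALUES (p_name);
END;
$$ LANGUAGE plpgsql;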

How to insert R dataframe into existing table in SQL Server

After trying a few different packages and methods found online, I am yet to find a solution that works for inserting a dataframe from R into an existing table in SQL Server.
I've had great success doing this with MySQL, but SQL Server seems to be more difficult.
I have managed to write a new table using the DBI package, but I can't find a way to insert into using this method. Looking at the documentation, there doesn't seem to be a way of inserting.
As there are more than 1000 rows of data, using sqlQuery from the RODBC package also seems infeasible.
Can anybody suggest a working method for inserting large amounts of data from a dataframe into an existing SQL table?
I've had similar needs using R and PostgreSQL with the R Postgres-specific drivers. I imagine similar issues exist with SQL Server. The best solution I found was to write to a temporary table in the database, using either dbWriteTable or one of the underlying functions that write from a stream, to load very large tables (for Postgres, postgresqlCopyInDataframe, for example). The latter usually requires more work in defining and aligning SQL data types with R classes, whereas dbWriteTable tends to be a bit easier. Once written to the temporary table, you can then issue a SQL statement to insert into your real table, just as you would within the database environment. Below is an example using high-level DBI library calls:
dbExecute(conn,"start transaction;")
dbExecute(conn,"drop table if exists myTempTable")
dbWriteTable(conn,"myTempTable",df)
dbExecute(conn,"insert into myRealTable(a,b,c) select a,b,c from myTempTable")
dbExecute(conn,"drop table if exists myTempTable")
dbExecute(conn,"commit;")

Oracle has OWA. What is the PostgreSQL equivalent?

I'd like to write stored procedures in PL/pgSQL that dynamically generate web-ready data. I need a pure SQL-to-HTML or SQL-to-XML gateway. Oracle has OWA: in Oracle you can set up a RAC frontend to a SAN and connect a large set of OWA hosts to your RAC, so you can layer your web requests and spread your queries.
What is the PostgreSQL or MySQL equivalent? I'm not looking at getting the data out of the DB and then processing it via Python or Ruby. Is something like this even possible in PostgreSQL?
From experience, stored SQL procedures are better at moving/calculating large datasets than piping the SQL query/cursor/proc result set to a middleware Python/Ruby/Perl/PHP layer that then processes the data and sends it to the web browser.
PostgreSQL has a host of functions that allow you to convert from tables/schemas/queries to XML ( http://www.postgresql.org/docs/8.4/static/functions-xml.html ) as well as a built-in XML datatype ( http://www.postgresql.org/docs/8.4/static/datatype-xml.html ).
Unfortunately, I can't give much more information than that as I'm not an XML guy but hopefully these references will make sense to you. ;)
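For instance, the built-in query_to_xml function turns a result set straight into an XML document ('article' is an illustrative table name):

SELECT query_to_xml('SELECT id, title FROM article',
                    true,   -- nulls: emit xsi:nil for NULL columns
                    false,  -- tableforest: false wraps rows in a single <table> document
                    '');    -- no target namespace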

Move Data from Oracle to SQL Server

I would like to copy parts of an Oracle DB to a SQL Server DB. I need to move the data because the Oracle box is being decommissioned. I only need the data for reference purposes, so I don't need indexes, stored procedures, constraints, etc. All I need is the data.
I have a link to the Oracle DB in SQL Server. I have tested the following query, which seemed to work just fine:
select *
into NewTableName
from linkedserver.OracleTable
I was wondering if there are any potential issues with using this approach?
Using SSIS (SQL Server Integration Services) may be a good alternative, especially if your table names are the same on both servers. Use the import wizard and it should create the destination tables for you and let you edit any mappings.
The only issue I see with your approach is that you will need to execute it for each and every table you need. Glad you are decommissioning the Oracle server :-). Otherwise, if you are not concerned with indexes or any of the existing sprocs, I don't see any issue in what you are doing.
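If there are many tables, you could even try generating the statements from Oracle's data dictionary over the same link. A hypothetical sketch (the linked-server name, four-part naming, and schema filter are placeholders and may need adjusting for your setup):

SELECT 'select * into ' + TABLE_NAME +
       ' from linkedserver..' + OWNER + '.' + TABLE_NAME + ';'
FROM linkedserver..SYS.ALL_TABLES
WHERE OWNER = 'APP_SCHEMA';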
The "select " approach could be very slow if tables are large. Consider writing pro*C in that case or use Fastreader http://www.wisdomforce.com/products-FastReader.html
A faster and easier approach might be to use Data Transformation Services (DTS), depending on the number of objects you're trying to copy over.