Transfer Data from Oracle Database 11g to MongoDB

I want to have an automatic, timed transfer from an Oracle database to MongoDB. In a typical RDBMS scenario, I would have established a connection between the two databases by creating a dblink and transferred the data using PL/SQL procedures.
But I don't know what to do in the MongoDB case: what should I implement, and how, so that I can have an automatic transfer from the Oracle database to MongoDB?
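For reference, the dblink/PL-SQL setup I mean looks roughly like this (the link, credential, and table names are placeholders):
CREATE DATABASE LINK target_db
  CONNECT TO app_user IDENTIFIED BY app_password
  USING 'TARGET_TNS_ALIAS';

BEGIN
  -- Push rows across the link; in practice this would run from a scheduled job
  INSERT INTO customers@target_db
  SELECT * FROM customers;
  COMMIT;
END;
/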

I would look at using Oracle GoldenGate. It has a MongoDB Handler.
https://docs.oracle.com/goldengate/bd123110/gg-bd/GADBD/using-mongodb-handler.htm#GADBD-GUID-084CCCD6-8D13-43C0-A6C4-4D2AC8B8FA86
https://oracledb101.wordpress.com/2016/07/29/using-goldengate-to-replicate-to-mongodb/

What type of data do you want to transfer from the Oracle database to MongoDB? If you just want to export/import a small number of tables on a set schedule, you could use something like UTL_FILE on the Oracle side to create a .csv export of the table(s) and use DBMS_SCHEDULER to schedule the export to happen automatically based on your desired time frame.
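A minimal sketch of that UTL_FILE/DBMS_SCHEDULER combination, assuming a directory object named EXPORT_DIR already exists and using a hypothetical EMPLOYEES table (adjust names and columns to your schema):
CREATE OR REPLACE PROCEDURE export_employees_csv IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- EXPORT_DIR must point to a server directory the database can write to
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'employees.csv', 'w');
  FOR rec IN (SELECT employee_id, first_name, last_name FROM employees) LOOP
    UTL_FILE.PUT_LINE(l_file,
      rec.employee_id || ',' || rec.first_name || ',' || rec.last_name);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/

BEGIN
  -- Run the export every night at 01:00
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'EXPORT_EMPLOYEES_CSV_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'EXPORT_EMPLOYEES_CSV',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=1;BYMINUTE=0;BYSECOND=0',
    enabled         => TRUE);
END;
/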
You could also use an application like SQL Developer to export tables as .csv files by browsing to the table in the schema list, then Right Click -> Export and choosing the .csv format. You may also find it a little easier to use UTL_FILE and DBMS_SCHEDULER through SQL Developer instead of relying on SQL*Plus.
Once you have your .csv file(s), you can use mongoimport to import the data, though I'm not sure whether MongoDB supports scheduled jobs the way Oracle does (I work primarily with the latter). If you are using Linux, you could use cron to schedule a script that imports the .csv file at your desired interval.

Related

How to create a local copy of Oracle data to avoid querying over a slow link

I have a need to frequently run a large-ish query against a remote Oracle DB, which, with my link speed, takes 10+ minutes. Is there a technique that I can use to create a local copy of the data in order to improve performance?
A few notes:
I would just need a local copy of a predetermined set of tables
Being able to schedule an update to run nightly would be a huge bonus
The data generally doesn't need to be refreshed throughout the day, though being able to do a delta update would be nice
I do have remote machines that can access the data much more quickly, but I'm not able to install Excel on them to perform the actual work that needs to be done (using SQL Developer is not a problem). It would, however, be possible to set up an automatic download of the data on those machines and then create a task to copy the files to my local machine.
I've considered a few ideas so far, such as configuring SQL Developer to automatically pull the data that I need and dump it to Excel (or some other format that I can use to pull the data in from another Excel file), but I thought there might be a better way.
One way is to use the expdp and impdp tools to dump (export) only a subset of the tables:
https://oracle-base.com/articles/10g/oracle-data-pump-10g
But this solution could be quite hard to implement since you must have the tools on your local server and access to the remote server to launch the export.
I think the simplest solution is to use CTAS (Create Table As Select). This will make a copy of the data from the distant server to your local server. For example, if you use a database link called DistantServer, issue this on your local server:
DROP TABLE MyTable;
CREATE TABLE MyTable AS SELECT * FROM MyTable@DistantServer;
You can search for Oracle CTAS for more information.
Then, if the CTAS script is correct, you can schedule it to run every night by creating an Oracle job on your local server. See DBMS_JOB for older releases of Oracle RDBMS or, better, the DBMS_SCHEDULER package.
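A minimal DBMS_SCHEDULER sketch of that nightly refresh, reusing the hypothetical MyTable/DistantServer names from above:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'REFRESH_MYTABLE',
    job_type        => 'PLSQL_BLOCK',
    -- DDL inside the job has to go through EXECUTE IMMEDIATE
    job_action      => 'BEGIN
                          EXECUTE IMMEDIATE ''DROP TABLE MyTable'';
                          EXECUTE IMMEDIATE ''CREATE TABLE MyTable AS SELECT * FROM MyTable@DistantServer'';
                        END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2;BYMINUTE=0;BYSECOND=0',  -- every night at 02:00
    enabled         => TRUE);
END;
/
A production version would also handle the case where MyTable does not yet exist when the DROP runs.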

Best way to set up a new database on a new server which periodically refreshes tables from a live SQL Server?

I need to create a database solely for analytical purposes. The idea is for it to start off as a 1:1 replica of a current SQL Server database, to which we will then add additional tables. The goal is to have read-write access to a database without inadvertently dropping anything in production.
We would ideally like to set up a daily refresh schedule to update all tables in the new db to match the tables in the live environment.
In terms of the DBMS for the new database, I am flexible - MySQL, SQL Server, or PostgreSQL would be great. I am not hugely familiar with the Google Storage/BigQuery stack, but if this is an easy option, I'm open to it.
You could use a standard HA/DR solution with a readable secondary (Availability Groups, mirroring, or log shipping), then have a second database on the new server for your additional tables.
Cloud Storage and BigQuery are not RDBMS services themselves, but could be used in this case to store the backups/exports/dumps from the replica, and then have the analytical work performed on those backups.
Here is an example workflow:
Perform a backup and restore in a different database
Add the new tables in the new database
Export the tables you need as CSV files on your local machine
From there, you could either load the CSV files directly into BigQuery, or upload them to a Cloud Storage bucket created beforehand
Query the data
I suggest taking a look at the multiple methods for loading data into BigQuery, as well as the methods for querying against external data sources, which may help determine which database replication/export method is best for your use case.
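As a rough sketch of the load-and-query steps above, assuming the CSV ends up in a bucket called my-analytics-bucket and a BigQuery dataset called analytics already exists (the bucket, dataset, and columns are all hypothetical):
-- External table that reads the CSV straight from Cloud Storage
CREATE EXTERNAL TABLE analytics.orders_ext (
  order_id    INT64,
  customer_id INT64,
  order_date  DATE,
  amount      NUMERIC
)
OPTIONS (
  format = 'CSV',
  uris = ['gs://my-analytics-bucket/exports/orders.csv'],
  skip_leading_rows = 1
);

-- Optionally materialize it as a native table for faster querying
CREATE OR REPLACE TABLE analytics.orders AS
SELECT * FROM analytics.orders_ext;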

Oracle 11g Script database with data

Is there a way in Oracle 11g to dump the database to a SQL script that, when run, will create the database, users, tables and data?
In Microsoft SQL Server there's the SSMS Toolpack, which is capable of such a thing (Script all data from SQL Server database). I'm interested in whether the same is possible in Oracle 11g.
To extract metadata and data you should look at Data Pump, specifically the expdp and impdp export and import tools. This will be the simplest, fastest and most supported way to move everything.
You will need to have already created the database, but I'm not sure if you're confusing that with the Oracle schema, which you will also have to create in advance by creating the user(s) that will own all the objects. You can extract a script to create the user/schema, e.g. from Toad or SQL Developer, or using the DBMS_METADATA package.
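A rough DBMS_METADATA sketch for that user/schema script, assuming the owning schema is called APP_OWNER (a hypothetical name) and running from SQL*Plus as a privileged user:
SET LONG 100000
SET LONGCHUNKSIZE 100000
SET PAGESIZE 0
SPOOL create_app_owner.sql
SELECT DBMS_METADATA.GET_DDL('USER', 'APP_OWNER') FROM dual;
-- Each call below errors out if APP_OWNER has no grants of that type
SELECT DBMS_METADATA.GET_GRANTED_DDL('SYSTEM_GRANT', 'APP_OWNER') FROM dual;
SELECT DBMS_METADATA.GET_GRANTED_DDL('ROLE_GRANT', 'APP_OWNER') FROM dual;
SPOOL OFF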
Most client applications also have options to export pretty much everything as scripts. In SQL Developer, for example, go to the Tools menu and there's an 'Export database' option where you can choose what you want to include, which will be pretty much everything in your case. You'll still need to pre-create the new database to run those scripts against.

How to access a database when access is restricted to a particular place

There is a student database at Some College. Some Organization wants to access it from their headquarters,
but access is restricted to within the college only.
Is it possible to extract the data?
How, and with what SQL queries and functions, could this be done?
In network programming I can do it by connecting via TCP or UDP and extracting the information, but is that possible if the database is large?
How can we do it using SQL functions?
One thing you can do is dump the data and re-import it into your own database. It depends on how much data you require. At work I have similar problems and sometimes have to do the same.
If your admin dumps the data for you, then it is easier. You can also export it with SQL commands, but how depends on which database you are using. Once you dump it to CSV format, you can easily import it into an SQLite database (or others such as MySQL), if you don't have a local version of your own database.
An alternative is to export the data yourself into a CSV file. How to do this depends on the DB that you use, which you didn't mention. Under Oracle you can use the SET and SPOOL commands in SQL*Plus to achieve this.
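A minimal SQL*Plus sketch of that SET/SPOOL export, assuming a hypothetical STUDENTS table (adjust the column list to the real schema):
SET HEADING OFF
SET FEEDBACK OFF
SET PAGESIZE 0
SET LINESIZE 1000
SET TRIMSPOOL ON
SPOOL students.csv
SELECT student_id || ',' || first_name || ',' || last_name FROM students;
SPOOL OFF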

How to import data from eXist database to PostgreSQL database?

Is there any extension/tool/script available to import data from eXist database to PostgreSQL database automatically?
From the tag description it's pretty clear that you're going to need to use an ETL tool or some custom code. Which is easier depends on the nature of the data and how you want to migrate it.
I'd start by looking at Talend Studio and Pentaho Kettle. See if either of them can meet your needs.
If you can turn the eXist data into structured CSV exports, then you can probably just hand-define tables for it in PostgreSQL and COPY the data into them, or use pgloader.
If not, then I'd suggest picking up the language you're most familiar with (Python, Java, whatever) and using the eXist data connector for that language along with the PostgreSQL data connector for the language. Write a script that fetches data from eXist and feeds it to PostgreSQL. If using Python I'd use the Psycopg2 database connector, as it's fast and supports COPY for bulk data loading.
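For the CSV-plus-COPY route above, a minimal PostgreSQL sketch might look like this (the table layout and file path are hypothetical and depend on what the eXist export produces):
CREATE TABLE documents (
    doc_id  integer PRIMARY KEY,
    title   text,
    body    text
);
-- Server-side load; use psql's \copy instead if the CSV sits on the client machine
COPY documents (doc_id, title, body)
FROM '/var/lib/postgresql/import/documents.csv'
WITH (FORMAT csv, HEADER true);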