I have a local (read and write) SQLite database and a remote (read-only) Oracle database. I use ODBC to access both DBs (I use an application that accesses the DBs via ODBC and queries like this: EXECUTE-QUERY SQLITE "SELECT ..." or EXECUTE-QUERY ORACLE "SELECT ..."). I tried searching the net for a way to perform one query joining tables from the 2 databases, but all I find is how to create a database link from Oracle to other DBs. That doesn't help me because I have no write privileges for the Oracle DB, so creating database links, databases, tables, or views is not allowed in Oracle; all I can do is query there. Is there an efficient way to do this with the constraints that I have?
How big are the tables in Oracle? Given the limits of the access you have and the technology you are working with (SQLite and Oracle are worlds apart), your best bet would probably be to export the tables from Oracle into SQLite, then do your queries entirely within that.
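As a rough illustration of that approach (the table and column names here are made up; substitute whatever your schema actually uses), the SQLite side might look like this:

-- SQLite: staging table shaped like the Oracle table (hypothetical columns)
CREATE TABLE oracle_clients_copy (
    client_id   INTEGER PRIMARY KEY,
    client_name TEXT
);

-- The application reads rows from Oracle and bulk-inserts them here, e.g.
-- INSERT INTO oracle_clients_copy (client_id, client_name) VALUES (?, ?);

-- Once the copy exists, the join runs entirely inside SQLite:
SELECT o.*, c.client_name
FROM local_orders AS o
JOIN oracle_clients_copy AS c ON c.client_id = o.client_id;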
Finally I installed Oracle Express Edition and created a database link to the other (read-only) Oracle database. That worked great.
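For anyone taking the same route, a minimal sketch of the link setup on the local Oracle XE instance (the link name, credentials, TNS alias, and table names are placeholders for whatever your environment uses):

-- On the local Oracle XE database, pointing at the remote read-only Oracle DB
CREATE DATABASE LINK remote_ro
    CONNECT TO remote_user IDENTIFIED BY remote_password
    USING 'REMOTE_TNS_ALIAS';

-- Remote tables can then be joined with tables kept in the local XE database
-- via the @link suffix
SELECT l.some_column, r.other_column
FROM local_table l
JOIN remote_table@remote_ro r ON r.id = l.id;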
I'm playing around with a table in an MS Access database. The table has a primary key of CLIENT_NUMBER. My corporation maintains an Oracle database that has a table containing clients' contact information (address, phone numbers, emails, etc.). It also has the CLIENT_NUMBER field. I got to thinking that maybe I can join the 2 tables from the different databases and run some queries. I dug around on the net and couldn't really find any reference, so I think this is a long shot and a silly question, but is that possible? Maybe through a DB link or something? For reference, I use SQL Developer 3.2.xx for SQL development.
I would copy the table in Oracle to Access using what's called a pass-through query in Access. Linked tables to Oracle, in my experience, perform very poorly, and if you are also thinking about joining to a local table in Access, probably much worse.
Pass-through queries are very quick, since Access simply sends the query for execution to the target server/database based on the connection you identify for the pass-through query, hence the name "pass-through".
The driver in the connect string may not work for you, and it may need more info depending on how things are set up in your environment, so you will have to work that out.
'Creates the pass-through query to Oracle
With CurrentDb.CreateQueryDef("qOracleConn")
    .Connect = "ODBC;Driver={Microsoft ODBC for Oracle};Server=oracleservername;Uid=oracledbusername;Pwd=oracledbpassword;"
    .SQL = "SELECT * FROM tableinoracle"
End With

'Creates the local table in Access from the pass-through query's results
CurrentDb.Execute "SELECT * INTO OracleClients FROM qOracleConn"
Ok so I have a little problem...
In my project we have an Oracle database server. In the database I have access to some of another user's tables:
Tables:
|- bla
|- bla
Users:
|- otherUser (let's just call him that)
   |- Tables:
      |- aTable
In Oracle, to access the aTable table I use SELECT * FROM otherUser.aTable
Now, we also have an MS SQL CE database to which I sync the data from the Oracle DB using the MS Sync Framework. And in the CE db - after sync - I get a table otherUser.aTable. This sounds good: even though CE doesn't have the user concept, it just adds the same table.
BUT the problem is that when calling the same SQL query on CE as on Oracle I get a "The table name is not valid" error. If I want to get the content of the table, the two ways that I have found to work are surrounding otherUser.aTable with either [] or "".
However, neither of them seems to work with Oracle. The [] seems to be an illegal name, and the "" seems to search for a table called just that (not one belonging to another user).
So why don't I just use the one way on Oracle and the other on CE? Well, I also use NHibernate as an ORM, and it kind of needs the same table name for both databases...
Is there a third way to encapsulate the table name that works with users in Oracle and just works in CE? Or do you have any other way to fix this issue?
I have no experience with MS SQL, but it seems like a problem that might be solved with synonyms on the Oracle side.
Try to create a synonym "otherUser.aTable" for otherUser.aTable in Oracle.
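A minimal sketch of what that might look like (assuming you are allowed to create synonyms in your own schema; the quotes make the dot part of a single identifier, matching the literal name the CE database ended up with):

-- Quoted, so "otherUser.aTable" is one identifier containing a dot
CREATE SYNONYM "otherUser.aTable" FOR otherUser.aTable;

-- The same quoted name should now work against both databases
SELECT * FROM "otherUser.aTable";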
I have two different databases (on the same server) and I want to join tables across the databases. I am using Hibernate; is it possible to create a query in Hibernate which can join two tables in those databases?
Hibernate will create an SQL query for your HQL or Criteria query, and this SQL will be sent through JDBC to the database. This means that Hibernate has no support for what you are trying to do.
However, you can achieve the same result in some cases. Some databases give you the option to create an alias for a table that resides in a different database. So you would be able to write an SQL query that joins the two tables and execute it on the database.
We are doing that with DB2.
Whether you can do that depends on your database.
I guess it would be impossible if you have two databases from different vendors (for example DB2 and MySQL), but if both databases are from the same vendor, then it may be achievable.
You should try to find more info in your database server's documentation.
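As an illustration of the general idea (not the exact DB2 mechanism referred to above), on SQL Server two databases on the same instance can be joined with three-part names, and a synonym can hide the cross-database reference so the mapped entity sees an ordinary table name; all object names below are made up:

-- Hide the cross-database reference behind a synonym in the "local" database
CREATE SYNONYM dbo.remote_orders FOR OtherDb.dbo.orders;

-- A plain SQL join that Hibernate could issue as a native query,
-- or that the mapping can target via the synonym
SELECT c.customer_id, o.order_id
FROM dbo.customers AS c
JOIN dbo.remote_orders AS o ON o.customer_id = c.customer_id;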
I have a PostgreSQL database that stores real-time data from sensors in a specific table (every 30sec).
What I want to do is periodically pull the data from the remote PostgreSQL database (for instance every 30 sec) and store it in SQL Server 2005 so I can manipulate it locally. I don't mind having duplicate tables in the two databases; actually, this is what I want to achieve!
So far, I have the PostgreSQL database set up as a Linked Server in SQL Server, and I can query and retrieve the sensor data. However, I would prefer to store it in my SQL Server for performance reasons.
Solution so far:
Run SELECT ... FROM OPENQUERY(...) statements against the linked PostgreSQL server and insert the results into my table in SQL Server. Repeat this periodically and store only fresh data (e.g. rows with a larger timestamp).
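Something along these lines (the linked server name PG_LINK, the table names, and the ts column are placeholders for whatever your setup actually uses):

-- SQL Server side: pull only rows newer than what we already have
DECLARE @last_ts datetime;
SELECT @last_ts = ISNULL(MAX(ts), '1900-01-01') FROM dbo.sensor_data_local;

INSERT INTO dbo.sensor_data_local (sensor_id, ts, value)
SELECT sensor_id, ts, value
FROM OPENQUERY(PG_LINK, 'SELECT sensor_id, ts, value FROM sensor_data')
WHERE ts > @last_ts;

Filtering outside OPENQUERY keeps the sketch simple; for a large table you would build the remote query string dynamically so the timestamp filter runs on the PostgreSQL side instead of pulling everything over the link each time.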
I assume that my proposed solution is not ideal. I would like to know the best practices for achieving this synchronization between the two databases.
Thank you in advance!
If you don't want to write your own code to do that, you can use SymmetricDS to sync the table from PostgreSQL to MS SQL Server.
Is there a SQL command that will list all the tables in a database and which is provider-independent (works on MS SQL Server, Oracle, and MySQL)?
The closest option is to query the INFORMATION_SCHEMA for tables.
SELECT *
FROM INFORMATION_SCHEMA.Tables
WHERE table_schema = 'mydatabase';
The INFORMATION_SCHEMA is part of standard SQL, but not all vendors support it. As far as I know, the only RDBMS vendors that support it are:
MySQL
PostgreSQL
Microsoft SQL Server 2000/2005/2008
Some brands of database, e.g. Oracle, IBM DB2, Firebird, Derby, etc. have similar "catalog" views that give you an interface where you can query metadata on the system. But the names of the views, the columns they contain, and their relationships don't match the ANSI SQL standard for INFORMATION_SCHEMA. In other words, similar information is available, but the query you would use to get that information is different.
(footnote: the catalog views in IBM DB2 UDB for System i are different from the catalog views in IBM DB2 UDB for Windows/*NIX -- so much for the Universal in UDB!)
Some other brands (e.g. SQLite) don't offer an INFORMATION_SCHEMA-style interface at all; SQLite keeps its metadata in its own sqlite_master table instead.
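For comparison, the equivalent lookups on a few of the systems mentioned above look roughly like this (the schema names are placeholders):

-- Oracle: tables visible to the current user
SELECT owner, table_name FROM all_tables WHERE owner = 'MYSCHEMA';

-- DB2 (LUW): system catalog
SELECT tabschema, tabname FROM syscat.tables WHERE tabschema = 'MYSCHEMA';

-- SQLite: its own master table
SELECT name FROM sqlite_master WHERE type = 'table';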
No. They all love doing it their own little way.
No, the SQL standard does not constrain where the table names are listed (if at all), so you'll have to perform different statements (typically SELECT statements on specially named tables) depending on the SQL engine you're dealing with.
If you are OK with using a non-SQL approach and you have an ODBC driver for the database that implements the SQLTables entry point, you may be able to get the information you want!
Details on the API at: http://msdn.microsoft.com/en-us/library/ms711831.aspx