Oracle - copying stored procedures to remote database - sql

An assignment I have as part of my PL/SQL studies requires me to create a remote database connection, copy all of my local tables down to it, and then also copy my other objects that reference data, such as my views and triggers.
The idea is that at the remote end, the views etc should reference the local tables provided the local database is online, and if it is not, then they should reference the tables stored on the remote database.
So I've created a connection, and a script that creates the tables at the remote end.
I've also made a PL/SQL block to create all the views and triggers at the remote end. A simple SELECT query is run against the local database to check whether it is online. If it is, a series of EXECUTE IMMEDIATE statements creates the views etc. with reference to table_name@local; if it isn't, the block drops into the exception section, where a similar series of EXECUTE IMMEDIATE statements creates the same views, but referencing the remote tables.
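Roughly, the block looks like this (the link name local and the table employees are just placeholders):
    DECLARE
      v_dummy NUMBER;
    BEGIN
      -- probe the local database over the link; this raises an error if it is unreachable
      SELECT 1 INTO v_dummy FROM dual@local;
      -- local DB is online: build the views against the linked tables
      EXECUTE IMMEDIATE 'CREATE OR REPLACE VIEW v_employees AS SELECT * FROM employees@local';
    EXCEPTION
      WHEN OTHERS THEN
        -- local DB unreachable: build the same views against the remote copies
        EXECUTE IMMEDIATE 'CREATE OR REPLACE VIEW v_employees AS SELECT * FROM employees';
    END;
    /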
OK so this is where I become unsure.
I have a package that contains a few procedures and a function, and I'm not sure what's the best way to create that at the remote end so that it behaves in a similar way in terms of where it picks up its reference tables from.
Is it simply a case of enclosing the whole package-creating block within an 'execute immediate', in the same way as I did for the views, or should I create two different packages and call them something like pack1 and pack1_remote?
Or is there as I suspect a more efficient method of achieving the goal?
cheers!

This is absolutely not how any reasonable person in the real world would design a system. Suggesting something like this in the real world will, in the best case, get you laughed out of the room.
The least insane approach I could envision would be to have two different schemas. Schema 1 would own the tables. Schema 2 would own the code. At install time, create synonyms for every object that schema 2 needs to reference. If the remote database is available when the code is installed, create synonyms that refer to objects in the remote database. Otherwise, create synonyms that refer to objects in the local database. That lets you create a single set of objects without using dynamic SQL by creating an extra layer of indirection between your code and your tables.
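As a rough sketch of that idea (the link name remote_db and the table employees are placeholders, not anything from your assignment), the install script for schema 2 could decide once, at install time, where each synonym points:
    DECLARE
      v_dummy NUMBER;
    BEGIN
      -- probe the remote database once at install time
      SELECT 1 INTO v_dummy FROM dual@remote_db;
      -- reachable: point the synonym at the remote table
      EXECUTE IMMEDIATE 'CREATE OR REPLACE SYNONYM employees FOR schema1.employees@remote_db';
    EXCEPTION
      WHEN OTHERS THEN
        -- unreachable: fall back to the local table
        EXECUTE IMMEDIATE 'CREATE OR REPLACE SYNONYM employees FOR schema1.employees';
    END;
    /
Every view, trigger and package in schema 2 then references employees through the synonym, so none of that code needs dynamic SQL or a duplicate "remote" version.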

Related

Creating a view that can be called from any database but runs against local tables?

We create databases for each of our clients so that their data is segregated. For tables that are shared (e.g. product names), we use synonyms to a common database. For stored procedures that are common, we create them in the master database and mark them as system procedures so that they'll run in the customer database context. But for views, we're stuck. A view set as a synonym will not run in the local database and we can't find anything comparable to the system procedure for a view.
So, how do we create a common view that when run, will run in the context of the local customer database?
Here is what I would do. Create a script (or optionally a proc) that uses dynamic SQL to create the view in every database in a list that you either supply at run-time or keep in a table.
Then when you want to make a change to the view for every customer, just change the script and run it.
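A minimal sketch of such a script, assuming the client database names live in a made-up table dbo.CustomerDatabases and the view being deployed is called dbo.vw_Common:
    DECLARE @db sysname, @sql nvarchar(max);
    DECLARE db_cursor CURSOR FOR
        SELECT DatabaseName FROM dbo.CustomerDatabases;  -- hypothetical list of client databases
    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- CREATE VIEW must be alone in its batch, hence the nested EXEC
        SET @sql = N'USE ' + QUOTENAME(@db) + N';
            IF OBJECT_ID(''dbo.vw_Common'', ''V'') IS NOT NULL
                EXEC(''DROP VIEW dbo.vw_Common'');
            EXEC(''CREATE VIEW dbo.vw_Common AS
                      SELECT ProductId, ProductName FROM dbo.Products'');';
        EXEC (@sql);
        FETCH NEXT FROM db_cursor INTO @db;
    END
    CLOSE db_cursor;
    DEALLOCATE db_cursor;
Change the body of the inner CREATE VIEW, re-run the script, and every customer database is updated in one pass.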

AS400 SQL query similar to CLRLIB (clear library) in native AS400

I'm working on a AS400 database and I need to manipulate library/collection with sql.
I need to recreate something similar to the CLRLIB command but I don't find a good way to do this.
Is there a way to delete all the table from a library with a sql query ?
Maybe I can drop the collection and create a new one with the same name. But I don't know if this is a good way to clear the library.
RESOLVED:
Thanks to Buck Calabro for his solution.
I use the following query to call the CLRLIB in SQL :
CALL QSYS.QCMDEXC('CLRLIB LIB_NAME ASPDEV(ASP_NAME)', 0000000032.00000)
Where LIB_NAME is the name of the library I want to clear, ASP_NAME is the name of the ASP where the library is, and 0000000032.00000 is the command length.
(note that the term COLLECTION has been deprecated, SCHEMA is the current term)
Since a library can contain both SQL and non-SQL objects, there's no SQL way to delete every possible object type.
Dropping the schema and recreating it might work. But note that if the library is in a job's library list, it will have a lock on it and you will not be able to drop it. Also, unless the library was originally created via CREATE SCHEMA (or CREATE COLLECTION) you're going to end up with differences.
CRTLIB creates an empty library, CREATE SCHEMA creates a library plus objects needed for automatic journaling and a dozen or so SQL system views.
Read Charles' answer - there may be objects in your schema that you want to keep (data areas, programs, display and printer files, etc.) If the problem is to delete all of the tables so you can re-build all of the tables, then look at the various system catalog tables: SYSTABLES, SYSVIEWS, SYSINDEXES, etc. The system catalog 'knows' about all of the SQL tables, indexes, views, stored procedures, triggers and so on. You could read the catalog and issue the appropriate SQL DROP statements.
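For example, here is a sketch that generates the DROP statements for every SQL table in a schema (MYLIB is a placeholder name):
    -- list a DROP statement for each SQL table in MYLIB
    SELECT 'DROP TABLE ' || TRIM(TABLE_SCHEMA) || '.' || TRIM(TABLE_NAME) AS DROP_STMT
      FROM QSYS2.SYSTABLES
     WHERE TABLE_SCHEMA = 'MYLIB'
       AND TABLE_TYPE = 'T'   -- 'T' = SQL tables; views, logical files, aliases etc. have other codes
You could then run the generated statements with EXECUTE IMMEDIATE inside a small SQL procedure, or paste them into a Run SQL Scripts session.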

connecting to remote oracle database in SQL

I need to do some data migration between two Oracle databases that are on different servers. I've thought of some ways to do it, like writing a JDBC program, but I think the best way is to do it in SQL itself. I could also copy the entire tables over to the database I am migrating to, but these tables are big and that doesn't seem like an elegant solution.
Is it possible to open a connection to one DB in SQL developer then connect to the other one using SQL and writing update/insert functions on tables as if they were both in the same connection?
I have read some examples on creating linked tables but none seem to be oracle specific or tell me how to open the external connection by supplying it the server hostname/port/SID/user credentials.
thanks for the help!
If you create a database link, you can just select from the other database by querying TABLENAME@dblink.
You can create such a link using the CREATE DATABASE LINK statement.
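A sketch, with placeholder host, service name and credentials:
    -- run on the database you are migrating INTO, pointing at the source database
    CREATE DATABASE LINK src_link
      CONNECT TO scott IDENTIFIED BY "tiger"
      USING '(DESCRIPTION=
                (ADDRESS=(PROTOCOL=TCP)(HOST=source-host.example.com)(PORT=1521))
                (CONNECT_DATA=(SERVICE_NAME=SRCDB)))';
    -- then the remote table can be queried as if it were local
    SELECT COUNT(*) FROM employees@src_link;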
It depends whether it's a one-time thing or a regular process, and whether you need to do ETL (Extract, Transform and Load) or not, but I'll help you out based on what you explained.
From what I can gather from your explanation, what you're trying to accomplish is to copy a couple of tables from one DB to another. If they can reach one another then it's really simple: you could just create a DB link (http://www.dba-oracle.com/t_how_create_database_link.htm) and then do an INSERT ... SELECT from either side, using the DB link for one of the tables and the local table as the receiver or sender. It's pretty straightforward.
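For example, once a link such as the hypothetical src_link sketched above exists, the copy itself is a single statement run on the receiving side (the table names are placeholders):
    -- copy rows into an existing local table
    INSERT INTO employees SELECT * FROM employees@src_link;
    COMMIT;
    -- or create the local table and fill it in one go
    CREATE TABLE employees AS SELECT * FROM employees@src_link;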
But if it's a one-time thing, I would just move the tables with expdp and impdp, since that will be a lot faster and put a lot less strain on the DB.
If it's something you need to maintain and keep updated, why not just add the DB link and use that on both sides? This will be dependent on network performance, though.
If this is a bit out of your depth, or you can't create DB links due to restrictions, SQL Developer has had a database copy option for a while and you can go as far as copying individual tables, but it's very heavy on the system where it's being run (http://deepak-sharma.net/2014/01/12/copy-database-objects-between-two-databases-in-oracle-using-sql-developer/).

Directing ADO queries away from the database

I've got a large number of old Delphi apps accessing a remote SQL Server database using ADO. I would like to direct those queries to a middleware layer instead of that database. The Delphi clients must run unchanged; I am not the owner of most of them.
Is it possible to do this? If so, how should I go about it?
Don't worry about parsing the T-SQL (both raw T-SQL and stored proc calls, incidentally).
Create a new SQL database, and use a combination of views, T-SQL and managed code to fake up enough database objects for the application to work.
Technique 1: Use tables, but populate them asynchronously from the new data source.
Technique 2: Fake the tables and procedures
E.g. you can have a stored procedure which calls managed code to your middleware, to replace the existing stored procedure.
Where the application reads directly from a table, you can use a view, which references a managed table-valued function.
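A rough sketch of that shape; the assembly, class, column and procedure names below are all made up, and the managed code itself is not shown:
    -- CLR stored procedure that forwards the call to the middleware,
    -- installed under the same name the Delphi apps already call
    CREATE PROCEDURE dbo.usp_SaveOrder
        @OrderId int,
        @Total   money
    AS EXTERNAL NAME MiddlewareBridge.[Middleware.OrderProcs].SaveOrder;
    GO
    -- CLR table-valued function that fetches rows from the middleware
    CREATE FUNCTION dbo.fn_GetOrders()
    RETURNS TABLE (OrderId int, CustomerId int, Total money)
    AS EXTERNAL NAME MiddlewareBridge.[Middleware.OrderFunctions].GetOrders;
    GO
    -- the apps keep selecting from "Orders"; the view hides the managed function
    CREATE VIEW dbo.Orders
    AS
        SELECT OrderId, CustomerId, Total FROM dbo.fn_GetOrders();
    GO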
You should have no trouble wherever stored procedures are used. If the application sends dynamic SQL you have more of an uphill struggle however.

Clone entire database with a SP

I'm trying to find out if this is possible, but so far I haven't found out any good solutions. What I would like to achieve is write a stored procedure that can clone a database but without the stored data. That means all tables, views, constraints, keys and indexes should be included but without any data. Can it be done?
Sure - your stored proc would have to read the system catalog views to find out what objects are in the database, determine their potential dependencies, and then create a single or a collection of SQL scripts which re-create the database, and execute those.
It's possible, but not very nice or easy to do. Especially the dependencies between objects might cause more headaches than first meets the eye...
You could also:
use something like SQL Server Management Studio (if you're on SQL Server - you didn't specify) and create the scripts manually, and just re-execute them on a separate server
use a "diff" tool like Redgate SQL Compare to compare two servers and have the second one brought up to date
I've successfully used the Microsoft SQL Server Database Publishing Wizard for this purpose. It's pretty straightforward, no coding needed. Here's a sample call:
sqlpubwiz script -d DatabaseName -S ServerName -schemaonly C:\Projects2\Junk\DatabaseName.sql
I believe the default is to create both data and schema, but you can use the schemaonly parameter.
Download it here
In SQL Server you can roll through the system tables (sys.tables, sys.columns, etc.) and construct things one at a time. It's going to be very manual and error prone at the beginning, but it should become systematic pretty quickly.
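A stripped-down sketch of that first step, generating bare CREATE TABLE statements from the catalog (keys, constraints and indexes would come from sys.key_constraints, sys.indexes and friends):
    -- emit one bare CREATE TABLE statement per user table
    SELECT 'CREATE TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ' (' +
           STUFF((
               SELECT ', ' + QUOTENAME(c.name) + ' ' + ty.name
                    + CASE
                        WHEN ty.name IN ('varchar', 'char', 'varbinary')
                          THEN '(' + CASE WHEN c.max_length = -1 THEN 'max'
                                          ELSE CAST(c.max_length AS varchar(10)) END + ')'
                        WHEN ty.name IN ('nvarchar', 'nchar')   -- max_length is in bytes
                          THEN '(' + CASE WHEN c.max_length = -1 THEN 'max'
                                          ELSE CAST(c.max_length / 2 AS varchar(10)) END + ')'
                        WHEN ty.name IN ('decimal', 'numeric')
                          THEN '(' + CAST(c.precision AS varchar(10)) + ',' + CAST(c.scale AS varchar(10)) + ')'
                        ELSE ''
                      END
                    + CASE WHEN c.is_nullable = 1 THEN ' NULL' ELSE ' NOT NULL' END
               FROM sys.columns c
               JOIN sys.types  ty ON ty.user_type_id = c.user_type_id
               WHERE c.object_id = t.object_id
               ORDER BY c.column_id
               FOR XML PATH('')), 1, 2, '') + ');' AS create_stmt
      FROM sys.tables t
      JOIN sys.schemas s ON s.schema_id = t.schema_id;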
Another way to do it is to write something in .Net using SMO. Check out this link:
http://www.sqlteam.com/article/scripting-database-objects-using-smo-updated