Is there any way to namespace PL/SQL packages? - sql

I have several different packages, one for each logical part of my application. Some packages are getting huge, but I would like to keep all the procedures/functions grouped in some way rather than breaking them into separate packages. Is there any way to nest, or namespace, my packages?
So if I have MYSCHEMA.PKG_PEOPLE and it has 10 procedures and 10 functions, is there any way I can, for instance, move the CRUD procedures to MYSCHEMA.PKG_PEOPLE.CRUD? I want to keep all these items inside PKG_PEOPLE but further sub-divide them.

Beyond Schema and Package, there is no multi-level namespace handling for PL/SQL packages in Oracle.
Within a package body you can define nested procedures but I would guess this isn't what you need.
I think the closest you'll get is to enforce a naming rule on your packages. For example:
MYSCHEMA.PKG_PEOPLE
MYSCHEMA.PKG_PEOPLE_CRUD
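A minimal sketch of that convention, assuming the CRUD sub-area gets its own package whose name extends the parent's (all names here are hypothetical):

```sql
-- The "CRUD" sub-area of PKG_PEOPLE lives in a sibling package whose
-- name encodes the hierarchy, since real nesting isn't available.
CREATE OR REPLACE PACKAGE pkg_people_crud AS
  PROCEDURE create_person(p_name IN VARCHAR2);
  PROCEDURE delete_person(p_id IN NUMBER);
END pkg_people_crud;
/
-- Callers still qualify with schema and package as usual:
-- BEGIN myschema.pkg_people_crud.create_person('Alice'); END;
```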

Related

SSIS package dependent objects on SQL Server

I need to create a localized environment for an SSIS package that would contain only the objects and entities needed by the tasks inside it.
It's a large and complex SSIS package and it connects to eight databases on the same SQL Server.
Is there a quick way to list all the tables, stored procedures, functions, etc. that will be utilized (directly or indirectly) when I execute this package?
Good thought by Nick, but even that will only get you objects that are used directly. There is no quick easy way to get all the objects that will be used indirectly.

Moving Oracle procs out of packages [duplicate]

I searched google but did not find any satisfying answer as to why I should use packages.
I know that a package is a bundle of procedures, functions and different variables. As I understand it sort of corresponds to object in OOP. But of course there's nothing like instantiating different instances of a package so that each instance would have different property values and behave differently.
Then what is the advantage of using packages when I can just create a standalone procedure and call it independently?
Packages provide the following advantages:
Cohesion: all the procedures and functions relating to a specific sub-system are in one program unit. This is just good design practice but it's also easier to manage, e.g. in source control.
Constants, sub-types and other useful things: there's more to PL/SQL than stored procedures. Anything we can define in a package spec can be shared with other programs, for instance user-defined exceptions.
Overloading: the ability to define a procedure or function with the same name but different signatures.
Security: defining private procedures in the package body which can only be used by the package because they aren't exposed in the specification.
Sharing common code: another benefit of private procedures.
We only need to grant EXECUTE on a package rather than on several procedures.
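To illustrate two of these points (overloading, and private procedures that share common code), here is a small sketch; the package and procedure names are made up:

```sql
CREATE OR REPLACE PACKAGE pkg_demo AS
  -- Overloading: same name, different signatures.
  PROCEDURE log_event(p_msg IN VARCHAR2);
  PROCEDURE log_event(p_code IN NUMBER);
END pkg_demo;
/
CREATE OR REPLACE PACKAGE BODY pkg_demo AS
  -- Private: not declared in the spec, so only this package can call it.
  FUNCTION fmt(p_msg IN VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    RETURN TO_CHAR(SYSDATE, 'YYYY-MM-DD') || ' ' || p_msg;
  END fmt;

  PROCEDURE log_event(p_msg IN VARCHAR2) IS
  BEGIN
    DBMS_OUTPUT.PUT_LINE(fmt(p_msg));
  END log_event;

  PROCEDURE log_event(p_code IN NUMBER) IS
  BEGIN
    log_event('code ' || TO_CHAR(p_code));  -- reuses the common code path
  END log_event;
END pkg_demo;
/
```

A single GRANT EXECUTE ON pkg_demo then covers every public procedure in it.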
As described in Oracle docs, packages are good because of:
modularity
easier application design
information hiding
added functionality
better performance
Details on each reason are explained in docs.

Oracle database dependencies in PL/SQL

I need to find dependencies between functions/procedures (defined inside package bodies) and the tables they use.
I've tried all_dependencies, but it works only at the package level, not at the inner function/procedure level.
Is there any possibility to find these dependencies using e.g. all_source?
Thanks in advance for your help.
It is not possible to find the dependencies between procedures (in a package) and tables.
There are several tools to examine dependencies. As you've already discovered, *_DEPENDENCIES only tracks object dependencies at the package level. There is a neat tool, PL/Scope, that tracks dependencies between parts of a package, but it does not track all table references.
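For what it's worth, here is roughly how PL/Scope is switched on and queried (Oracle 11g and later; substitute your own package name). It shows identifier usages with line numbers, but, as noted, table references inside embedded SQL statements are exactly what it misses:

```sql
-- Enable identifier collection for this session, then recompile the
-- package body so PL/Scope gathers data for it.
ALTER SESSION SET plscope_settings = 'IDENTIFIERS:ALL';
ALTER PACKAGE pkg_people COMPILE BODY;

-- List identifiers declared or referenced inside the package body.
SELECT name, type, usage, line
FROM   all_identifiers
WHERE  object_name = 'PKG_PEOPLE'
AND    object_type = 'PACKAGE BODY'
ORDER  BY line;
```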
Theoretically you could use *_SOURCE. In practice, this is impossible unless your code uses a limited set of features. For any moderately complicated code, forget about using string functions or regular expressions to parse code. Unfortunately there does not seem to be any PL/SQL parser that is both programmable and capable of accurately parsing complex code.
Saying "it's not possible" isn't a great answer. But in this case it might save you a lot of time. This is one of those tasks where it's very easy to hit a dead end and waste a lot of effort.

Common function / stored procedures and incompatible updates

We have a number of modules within a larger suite that all use a common set of stored procedures and functions, due in large part to the fact that they all use a common set of data and tables. This approach ensures that all modules get the same answers when making the same calls - a very good thing (especially in the financial industry).
However, the downside of this approach is that when we update one module in the suite in a way that requires a change to one of the shared stored procedures or functions, we have to update almost the entire suite, which is a bad thing due to time and cost.
What kind of strategies can be employed to mitigate this suite-wide upgrade issue every time we update a single stored procedure, while minimizing management complexity?
This is somewhat similar to the versioning issue Microsoft had with DLLs/APIs, where a signature change on an API would necessitate an xxx2 version. That is not ideal, because then you have basically two versions which both need to be maintained and upgraded, with the potential that they get out of sync (i.e. two different answers for the same question).
Any strategies or best practices in this regard would be greatly appreciated.
Thanks in advance.
Whatty
Two strategies are views and stored procedures.
Use views to access the data in the tables. This way, you can change the underlying tables (for one module) and not have to change other modules immediately. Eventually, you can get around to changing that module as well. For instance, you might split one table into two different tables for one module. Other modules will never notice, because they access a view, and you just modify the view to return the original data.
Along these lines, you never want to use select *, because the columns, their names, or their ordering might change.
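As a sketch of the view idea, assuming an Oracle-style dialect and hypothetical table names: suppose one module needed the people table split in two, but the other modules should keep seeing the old shape.

```sql
-- The original "people" table has been split into two tables for one
-- module; a view with the old name and column list shields the rest.
CREATE OR REPLACE VIEW people AS
SELECT c.person_id, c.name, d.department
FROM   people_core c
JOIN   people_detail d ON d.person_id = c.person_id;

-- Other modules keep issuing
--   SELECT person_id, name, department FROM people
-- and never notice the split. Note the explicit column list - no SELECT *.
```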
The second strategy is to wrap all insert, update, and delete into stored procedures. This has a second advantage that you can do checking, logging, and notifications in the procedure. You can try to mimic this with triggers and constraints, but I find the stored procedure approach much more maintainable.
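A hedged sketch of such a wrapper, again with made-up table and sequence names: modules call the procedure instead of issuing INSERTs directly, so the checking and logging live in one place.

```sql
CREATE OR REPLACE PROCEDURE insert_person(
  p_name IN VARCHAR2,
  p_dept IN VARCHAR2
) AS
  v_id NUMBER;
BEGIN
  -- Checking: one rule enforced for every module.
  IF p_name IS NULL THEN
    RAISE_APPLICATION_ERROR(-20001, 'name is required');
  END IF;

  INSERT INTO people_core (person_id, name)
  VALUES (people_seq.NEXTVAL, p_name)
  RETURNING person_id INTO v_id;

  INSERT INTO people_detail (person_id, department)
  VALUES (v_id, p_dept);

  -- Logging: every write is recorded, whichever module made it.
  INSERT INTO audit_log (action, detail, logged_at)
  VALUES ('INSERT_PERSON', v_id || ':' || p_name, SYSDATE);
END insert_person;
/
```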

Find tables used from a VB.net application to remove unused tables

We are presently developing an application, let's call it APP1, which uses a SQL database that has about 800 stored procedures, 600 tables, etc. APP1 was originally created to replace another application, APP0, for which we have no source code but only the SQL tables, stored procedures, views, etc. Previous programmers of APP1 reused some DB objects from this same database and added other objects specific to APP1, because it grew bigger than APP0. And we do not need APP0 anymore, as APP1 does everything we want, and more.
So, now, we are thinking about a way to find out which objects are used by APP1 in order to remove objects which are ONLY used by APP0.
What is the best approach to discover all objects used by APP1 without having to open every single class and form?
Once we have a complete list of these objects, it will be easy to use a program we bought that detects all dependencies for the SQL objects specified directly in SQL, and then remove the objects that don't appear in any dependency chain. Any ideas on how I could get this list without going through all of our program's many, many, many classes and forms?
Thanks,
Note : I know, in a perfect world, all calls to PSs and tables should be in a DAL but in the case of the application we're presently working on ... this is not our case! Yippy! (sarcastic yippy) ;)
Note 2: This application is not using any ORM, so all queries go directly through SqlCommand. Every call to a DB object is therefore in string form.
You mentioned you have all the Tables, Sprocs & etc from APP0. Presumably there is a BAK of them or you can grab the original SQL objects by installing APP0 on a fresh PC.
Then use SQL Compare from RedGate to compare the Database that APP1 uses to the original APP0 Database, then you can see which objects you've added and can strip out all the redundant APP0 db objects.
You could run a trace on the database whilst the application is in use. This is likely to create a rather large amount of data, but from that you can reduce it to the procedures and or SQL statements executed by your application.
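On newer SQL Server versions, an Extended Events session is the usual way to run such a trace (SQL Profiler is the older equivalent); the session and file names below are made up:

```sql
-- Capture every procedure call and ad-hoc batch while the app is exercised.
CREATE EVENT SESSION app_trace ON SERVER
ADD EVENT sqlserver.rpc_completed,
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.event_file (SET filename = 'app_trace.xel');

ALTER EVENT SESSION app_trace ON SERVER STATE = START;
-- ... use the application, then stop and read the .xel file ...
ALTER EVENT SESSION app_trace ON SERVER STATE = STOP;
```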
Can you guarantee that you will, or can, use all the functionality? You might want to also run something like NCover to check how much of the application code you've exercised whilst using it.
I don't have an easy answer, but here's how I'd attack it. I admit up front this would take a fair amount of time, so I'll readily yield to someone who has a better answer.
It's a two-step problem.
Step 1: Find all the dependencies within SQL. That is, find all the tables that are used to make views, and find all the tables and views that are used in stored procedures and functions. MS SQL server has a function to do this for you. With other DBs you could write some queries against information_schema (or whatever their proprietary equivalent is).
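On SQL Server, step 1 can be sketched with the sys.sql_expression_dependencies catalog view, which records what each view, procedure, and function references:

```sql
-- List referencing object -> referenced object pairs for procs,
-- views, and functions (scalar, table-valued, inline).
SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS obj_schema,
       OBJECT_NAME(d.referencing_id)        AS obj_name,
       d.referenced_entity_name             AS referenced_object
FROM   sys.sql_expression_dependencies AS d
JOIN   sys.objects AS o ON o.object_id = d.referencing_id
WHERE  o.type IN ('P', 'V', 'FN', 'TF', 'IF')
ORDER  BY obj_schema, obj_name;
```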
Step 2: Get a list of all the SQL statements and stored procedures executed from within your code. This should be relatively straightforward if you do not build SQL statements on the fly. Just search for all your SqlCommand objects and find what you set the query to. You could write a little program to scan your source and dump this out.
Then parse through this dump and make a list of referenced sprocs, tables, and views. Sort alphabetically and eliminate duplicates. Then add any tables or views referenced from sprocs and any tables referenced from views. Sort and eliminate duplicates again. Then you have your list.
If you do generate SQL on the fly, I think the complexity level multiplies greatly. Then you have to work your way through the code that generates SQL and pick out the table names. If there are places where table names are passed from function to function, this could get very hard. (I can imagine real nightmare scenarios, like asking the user to type in a table name, or building a table name from pieces: Dim tablename = If(dept = "B17", "accounting", "manufacturing") & "_" & year.)