I'll start off by saying I'm in no way a PL/SQL or T-SQL expert and I only really know the basics, but I have been asked to undertake a project to migrate about 1,700 PL/SQL procedure packages to T-SQL. Even if I knew PL/SQL well, the best approach would be an automated tool to cover at least the majority of the translation. I've been using Microsoft's SQL Server Migration Assistant. After reading this article, I am under the impression it is possible to convert individual pieces of PL/SQL.
When I migrate schemas, it gives me the same three errors every time, and it basically doesn't do the migration at all. It just seems to comment the whole thing out.
I just want to know what is going wrong and why it is not migrating. From the second error, my impression is that it is not converting because it is looking for 'stage_sendup_nb', but that doesn't exist in the database because I just pasted the SQL in.
"Unparsed SQL" means that SSMA didn't recognize the statement you tried to convert. Most likely CREATE PROCEDURE is not supported in SQL-statement mode. When you have a procedure to convert, it's better to find it under Procedures and convert it from there. "Snippet" conversion is very limited; it's designed to convert a statement or two, usually some specific query you want to try on the migrated DB (e.g. SELECT or UPDATE).
Any conversion of SQL (including procedures and SQL statements) relies on proper metadata being available in Oracle. So you have to connect to an Oracle database containing all referenced tables, procedures, and so on to convert even one procedure. That way SSMA knows the details of the referenced objects.
I've been looking over various questions on loading data into Firebird, and apparently it's possible to run multiple INSERT statements in a batch... if you're running a script in ISQL and all the values are written inline.
If you're trying to do this from code, of course, you have two problems here: 1) ISQL syntax doesn't work, and 2) any developer with 20 minutes of SQL experience knows that writing data values inline into your query is an unholy abomination that will summon nasal demons, cause your hair to fall out, and oh by the way leave your system open to SQL Injection vulnerabilities.
But I haven't found any solution whatsoever about running bulk inserts from application code. I haven't even found anyone discussing it. Apparently there's a mechanism for quick-loading data from "external tables" if you write it out to a file in the right format, but there's precious little information available on how that works, and what is available claims that it has problems with concepts as simple as blobs and even nulls!
So I'm just about at my wits' end here. Does any mechanism at all exist to allow 3rd-party application code to bulk-load any and all data supported by Firebird into a FB database?
A prepared, parameterized statement executed in a loop.
The IBatch class in the Firebird 4 OO API.
The IReplicator class in the Firebird 4 OO API, which is tricky to use but the fastest possible option.
In any case, parsing the source data format and transforming values into types supported by Firebird is up to the application programmer. There is no silver bullet that will "load anything".
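The first option above can be sketched with Python's DB-API. To keep the example self-contained it uses sqlite3 as a stand-in; with Firebird you would connect via the firebird-driver package and use the same executemany pattern. The table and column names here are made up for illustration.

```python
import sqlite3

# Stand-in for a Firebird connection; with Firebird you would call
# firebird.driver.connect(...) and use the same DB-API methods below.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")

rows = [(1, "Alice"), (2, "Bob"), (3, "Carol")]

# The statement is prepared once and re-executed per row; only the
# parameter values vary, so no data is written inline into the SQL
# and there is no injection risk.
conn.executemany("INSERT INTO customers (id, name) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(count)  # 3
```

The same loop-over-a-prepared-statement shape applies in any language binding; the batch and replicator interfaces are faster but Firebird-4-specific.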
I have to make my stored procedures generic so that they can be used with different SQL dialects (MySQL too, if possible). I think it can be done if the scripts are written to the ANSI standard, but I have large procedures to convert and there is no tool for direct conversion. Is there a set of rules that can be followed to perform this conversion?
I have found a tool to validate scripts at http://developer.mimer.com
But this will be very time-consuming, as I have large SPs, and I think the task could be done faster with some kind of rule book.
There is no generic stored procedure language.
If you need something to work across database platforms, you would be better to implement the SP functionality in code, using ANSI standard SQL for the database access.
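Moving the procedure logic into application code might look roughly like this: the function below replaces what would otherwise be a small stored procedure, issuing a plain parameterized query. It's a minimal sketch; the table, column, and function names are invented, sqlite3 stands in for a real portable connection, and note that the placeholder style (`?` here) is a DB-API convention that varies by driver, not part of ANSI SQL itself.

```python
import sqlite3

def total_for_customer(conn, customer_id):
    """What might otherwise be a stored procedure, expressed as an
    application-side function over plain ANSI SQL (SUM, COALESCE)."""
    cur = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer_id = ?",
        (customer_id,),
    )
    return cur.fetchone()[0]

# Demo data so the example runs standalone.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount NUMERIC)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

print(total_for_customer(conn, 1))  # 15.0
```

The trade-off is extra round trips for multi-statement logic, but the SQL itself stays portable across databases.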
We have a big portal built using SharePoint 2007, ASP.NET 3.5, and SQL Server 2005. Many developers have worked on it since 01/2008, and we are now doing a thorough analysis of the current SQL databases [not the SharePoint DBs] to optimize and enhance them.
The main DB has about 330 tables and 1,720 stored procedures (SPs) created from 01/2008 until now.
Many table and column names are very long and we want to shorten them.
We found the SP names are written in 25 different formats :( and some of them are very complex, so we also want to rename those.
Many SP parameters need to be renamed.
One of the biggest tables is the registered-user table, which will be split into more than one table for optimization; many column names will be changed.
I searched for a way to rename tables and columns and found the SQL Refactor tool, but I'm still trying it out.
My questions:
Is SQL Refactor the best tool for renaming, or is there another one?
If I want to do it manually, are there any references or best practices for that?
How can I make such changes in a fast and stable way? I'm looking for recommendations and case studies, if any exist.
This is why people have written coding standards (with defined naming conventions) and hold code reviews! Make sure you implement those practices right now, to prevent this from getting any worse in the future.
Also, for around $300, SQL Refactor™ is an excellent tool. If you were to use search and replace, you'd have countless errors and spend hours and hours editing code. I wouldn't even consider using anything other than SQL Refactor, and would never try a manual search-and-replace method on something as large as you describe.
You can use Visual Studio 2005 Database Edition, 2008 Database Edition or 2010 ultimate to load up your DB schema. This provides refactor capabilities, as well as database "builds" that check references in stored procedures, views and functions to ensure all tables and columns referenced actually exist.
We have a massive number of stored procedures to convert from SQL Server 2000 to Oracle 10g. Does anyone know of a tool that would achieve this?
We used Oracle SQL Developer for this, but it seems to create extra cursors for every IF/ELSE condition that existed on the SQL Server side.
Has anyone successfully used a tool that would do this?
Sorry, no answer, and you have my sympathies. I've been through this before and it was all manual. We ended up making distinct migration & test plan tasks for it.
Oracle will use cursors in places that look odd for people used to SQL server. I am not aware of any (simple) way around this.
There seem to be a number of companies out there now offering services or tools to help: This Google search shows a bunch.
Don't forget to plan for functional equivalence testing. Datatype differences may cause issues, and your application development tool(s) may interact differently with Oracle than they do with SQL server. I did this conversion a number of years ago for a PowerBuilder application, and a lot more of that needed updating than we expected.
We are developing a Visual Studio .NET 2008 application with Oracle 9i as the backend. Is it true that stored procedures are faster than inline SQL? This is a debate among my co-programmers.
We are using ODP.NET (Oracle Data Provider for .NET) from Oracle.
Thanks.
While I am tempted to answer "no" or "I don't think so", the only real way to know the answer to this question is to go and measure it yourself. Use a profiler like JetBrains dotTrace, or TOAD's own profiler, to find out.
All other answers are speculative.
It should be. When you send inline SQL to the database, the engine must parse it and then execute it. Stored procedures are parsed (and compiled) at creation time, so at the very least you save the parsing time.
I would expect stored procedures to be faster in almost all cases.
But it mainly depends on your code.
If for example you get a resultset from your database and then use this data to perform other queries, you'll end up with a lot of overhead in your network traffic. This is especially true if you forget to use a single connection for the requests.
It could then be more efficient to use a single stored procedure to return the aggregated data (if possible, of course).
It will be faster if you build your stored procs (PL/SQL packages) in such a way that less data will be transferred between the database and the client.
Jon Limjap said: "All other answers are speculative."
There's much truth to this statement. There are so many factors: how's the DB server set up? How's the network speed/reliability? How's the SQL? How's the PL/SQL? I could write really slow SQL and/or PL/SQL if I wanted to (and have, inadvertently, on past projects!). So if you can, try both out.
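A minimal way to "try both out" is a small timing harness like the one below. It's a generic sketch: `run_inline_query` and `call_stored_proc` in the commented usage are hypothetical placeholders for your own database calls, and taking the best of several repeats reduces noise from network and caching effects.

```python
import time

def measure(fn, repeats=5):
    """Time a callable several times and return the best wall-clock run.
    In practice, fn would wrap either the inline-SQL call or the
    stored-procedure call against your real database connection."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Hypothetical usage against a real connection:
#   inline_time = measure(lambda: run_inline_query(conn))
#   proc_time   = measure(lambda: call_stored_proc(conn))
# Self-contained demo with a CPU-bound stand-in workload:
t = measure(lambda: sum(range(10000)))
print(t >= 0.0)  # True
```

Whichever approach wins, measure with your real data volumes and network path; a micro-benchmark on a developer machine can easily point the wrong way.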