SQL Server (T-SQL) BAIv2 Banking File Importing Nightmare

I would like to know if anyone has a good solution for importing BAIv2 banking files into SQL Server. First of all, the files have "continuation records" which have to be considered along with the parent records. Also, T-SQL doesn't have a pleasant way of parsing comma-separated strings. Finally, one hierarchy in the file has a varying number of elements, which makes loading directly into a table difficult because the columns would not line up.
This is the file from hell. If anyone has any insights into how to import and parse BAIv2 banking files I would be most appreciative.
Thank you,

You're best off handling this with a dedicated application server and a real (general-purpose) programming language. T-SQL is ill-suited for this task.
When that's not an option, you can use C# for a SQL CLR stored procedure to parse the files. I did something similar for banking flat-files when I didn't have the option of an application server.
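The fiddly part is folding the "88" continuation records back into their parent records before you split anything on commas. Below is a minimal C# sketch of just that step, with the SQL CLR plumbing (SqlContext, result-set shaping, and so on) left out; it assumes, per BAI version 2, that continuation records carry the 88 type code and that records are terminated with '/'.

    // Minimal sketch: fold BAI2 "88" continuation records into their parent
    // record, then split each logical record into comma-separated fields.
    // Joining continuations with "," is a simplification; strictly, the
    // continuation text resumes exactly where the parent record left off.
    using System.Collections.Generic;
    using System.IO;

    static class Bai2Reader
    {
        public static IEnumerable<string[]> ReadLogicalRecords(string path)
        {
            string current = null;
            foreach (var rawLine in File.ReadLines(path))
            {
                var line = rawLine.TrimEnd('/', ' ');   // records end with '/'
                if (line.StartsWith("88,"))             // continuation record
                {
                    current += "," + line.Substring(3);
                    continue;
                }
                if (current != null)
                    yield return current.Split(',');    // emit completed parent record
                current = line;
            }
            if (current != null)
                yield return current.Split(',');
        }
    }

Each string[] is then one logical record (01 file header, 02 group header, 03 account, 16 transaction detail, and so on), which is much easier to route into staging tables than the raw lines.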

Related

Is it possible at all to bulk-load data from an application into Firebird?

I've been looking over various questions on loading data into Firebird, and apparently it's possible to run multiple INSERT statements in a batch... if you're running a script in ISQL and all the values are written inline.
If you're trying to do this from code, of course, you have two problems here: 1) ISQL syntax doesn't work, and 2) any developer with 20 minutes of SQL experience knows that writing data values inline into your query is an unholy abomination that will summon nasal demons, cause your hair to fall out, and oh by the way leave your system open to SQL Injection vulnerabilities.
But I haven't found any solution whatsoever about running bulk inserts from application code. I haven't even found anyone discussing it. Apparently there's a mechanism for quick-loading data from "external tables" if you write it out to a file in the right format, but there's precious little information available on how that works, and what is available claims that it has problems with concepts as simple as blobs and even nulls!
So I'm just about at my wits' end here. Does any mechanism at all exist to allow 3rd-party application code to bulk-load any and all data supported by Firebird into a FB database?
You have a few realistic options:
A prepared parameterized statement executed in a loop (see the sketch below).
The IBatch class in the Firebird 4 OO API.
The IReplicator class in the Firebird 4 OO API, which is tricky to use but is the fastest possible option.
In any case, parsing the source data format and transforming the values into types supported by Firebird is up to the application programmer. There is no silver bullet that loads "anything".
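To illustrate the first option, here is a minimal sketch using the FirebirdSql.Data.FirebirdClient ADO.NET provider; the table, columns, and row shape are placeholders to adapt to your schema:

    using System.Collections.Generic;
    using FirebirdSql.Data.FirebirdClient;

    static class FbBulkLoad
    {
        // rows: values already parsed and converted by the application
        public static void InsertRows(string connectionString,
                                      IEnumerable<(int Id, string Name)> rows)
        {
            using var con = new FbConnection(connectionString);
            con.Open();
            using var tx = con.BeginTransaction();
            using var cmd = new FbCommand(
                "INSERT INTO TARGET_TABLE (ID, NAME) VALUES (@id, @name)", con, tx);
            cmd.Parameters.Add("@id", FbDbType.Integer);
            cmd.Parameters.Add("@name", FbDbType.VarChar);
            cmd.Prepare();                      // prepare once, execute many times
            foreach (var (id, name) in rows)
            {
                cmd.Parameters["@id"].Value = id;
                cmd.Parameters["@name"].Value = name;
                cmd.ExecuteNonQuery();
            }
            tx.Commit();                        // one transaction around the whole batch
        }
    }

Keeping the whole loop inside one transaction and preparing the statement once is what makes this approach acceptably fast.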

How can I fire the contents of a CSV file through a stored procedure (to get it into a database in the right format)?

OK, so the background to the story: I am largely self-taught in the bits of SQL I do know, and it tends to be just enough to make the things work that need to work, albeit with a fair bit of research for even the most basic jobs!
I am using a piece of software which grabs a string of data and passes it straight to a SQL stored procedure, which moves the data around, performs a few tasks on the string to get it into the format I need, and then takes lumps of this data and places it in various SQL tables as outlined by the SP. I get maybe half a million lines of data each day, and this process works perfectly well and quickly. However, should data be lost, or fail to make it through to the SQL database correctly, I do still have a log of the 500,000 lines of raw data in CSV format.
I can't seem to find a way to simply bulk import this data into the various tables in the various formats it needs to be in. Assuming there is no way to re-pass this data through the 3rd-party software (I have tried and failed), what is the best (read: easiest for a relative layman) way to send each line of this CSV file through my existing stored procedure, which can then process and import the data as normal? I have looked at the bcp utility, but that didn't seem to be viable (or I am not well enough informed to make it do what I need). I have also done the usual trawling of the web (and these forums) to see if anything jumped out at me as the obvious way forward, but came up a bit dry.
Apologies if I am asking something a bit 101, but I would certainly be grateful if anyone could help me out with this - if I missed out any salient bits of information, let me know! :)
Cheers.
The SQL Server Import/Export Wizard is a point-and-click solution that can be used to import CSV files into SQL Server.
The wizard builds an SSIS package behind the scenes, which can be saved and scheduled to run as needed. The wizard doesn't give you much in the way of data transformation, but the data could be loaded into a staging table and then processed by your existing stored procedure.
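Alternatively, since the existing procedure already accepts one raw line of data at a time, a small C# loop can replay the log file through it directly. A minimal sketch, assuming the procedure takes the raw line as a single varchar parameter; the procedure and parameter names (dbo.ProcessRawLine, @rawLine) are placeholders for whatever yours are called:

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static class CsvReplay
    {
        public static void Replay(string connectionString, string csvPath)
        {
            using var con = new SqlConnection(connectionString);
            con.Open();
            using var cmd = new SqlCommand("dbo.ProcessRawLine", con)
            {
                CommandType = CommandType.StoredProcedure
            };
            var p = cmd.Parameters.Add("@rawLine", SqlDbType.VarChar, 8000);
            foreach (var line in File.ReadLines(csvPath))   // one call per logged line
            {
                p.Value = line;
                cmd.ExecuteNonQuery();
            }
        }
    }

Half a million single calls is not fast, but for a one-off replay of a lost day's log it is usually good enough, and it reuses the logic you already trust.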

Where to put Stored Procedures & Swapping DB Tech?

IS THIS ABSOLUTELY BONKERS.....
I have just started working on a new project and I'm shocked at what I have just seen. This project is a C# web app that sits on top of an Oracle database. Now, all the stored procedures are not actually stored procedures... they are just SQL scripts stored in text files in a directory on the server. When the application starts, it looks in the directory, goes through each file, reads out the text and saves it in a dictionary. It also runs a regex over the text, replacing special sequences like [PARAM] with the correct parameter symbol, e.g. ':' in Oracle's case or '@' for SQL Server. Then, when the code wants to execute one of these statements, it calls a method which finds the correct one in the dictionary and runs it.
Now, this appears to have been done in case they ever wanted to swap the underlying DB technology. They say they would just swap the SQL files in the directory for files in the appropriate syntax and it would work.
I would normally expect the stored procedures to actually be stored procedures and live on the database, with a separate project (layer) that talks to the DB. Then, if the DB technology changes, just add another data layer project and swap the DLLs out....
I see massive problems with the way it's been done currently:
No execution plan being created on the DB server.
Massive overhead in reading hundreds of text files, building up a string for each, and running a regex over it.
No checking of SQL syntax.
Big memory footprint from holding all these "stored procedures" in memory.
What do you lot think?
Is this really bad or am I just moaning because I have never seen anything like this before?
What else is wrong with this approach?
Any comments will be much appreciated as I'm trying to get across to colleagues that this is crazy....
Why don't they use the scripts at build time to create stored procedure scripts in the native syntax (PL/SQL, T-SQL), which can then be deployed to the database? I can't see that it would be much more work, and you get all the benefits of compiled stored procedure code, etc.
I have personal experience of run-time compilation of stored procedures (SQL Server) being a big performance overhead, and on a production system this was a real problem.
I can sort of see the reasoning behind this design:
Stored procedure code is too database-specific, so we won't use stored procedures; we will use SQL statements instead.
Even SQL statements can have database-specific syntax in them, so we'll have some hokey method for converting them on the fly at run-time.
Even if you don't use stored procedures, I still think the conversion should be done at build time (e.g., to generate C# code), not run-time.
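As an illustration of the build-time alternative, the existing [PARAM] substitution could be run once as a pre-deployment step rather than on every application start. This sketch simply rewrites each template into a native script for one target database; the directory layout and one-output-file-per-template arrangement are assumptions:

    using System.IO;
    using System.Text.RegularExpressions;

    static class SqlTemplateCompiler
    {
        // paramPrefix: ":" for Oracle, "@" for SQL Server
        public static void Compile(string sourceDir, string outputDir, string paramPrefix)
        {
            Directory.CreateDirectory(outputDir);
            foreach (var file in Directory.GetFiles(sourceDir, "*.sql"))
            {
                var text = File.ReadAllText(file);
                var native = Regex.Replace(text, @"\[PARAM\]", paramPrefix);
                File.WriteAllText(Path.Combine(outputDir, Path.GetFileName(file)), native);
            }
        }
    }

The output scripts can then be reviewed, syntax-checked, and deployed as real stored procedures, which addresses the execution plan, syntax checking, and memory points above.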
What do you lot think?
I think it's a classic case of over-engineering: who changes their DB provider? Is it really bringing anything to the table?
Is this really bad or am I just moaning because I have never seen anything like this before?
A bit of both in my opinion: it's quite bad but if they've been running like this for some time it's got to be working.
What else is wrong with this approach?
As it is, re-calculating the execution plan every time is probably the biggest issue: so much performance loss! I'm saying probably because performance is not always a requirement (over-optimization isn't any better ;)
What is really wrong is that they reinvented the wheel: LINQ does that natively.
And that means as LINQ gets improved over time, the company will steadily fall behind since they won't benefit from the enhancements.
If you have any leverage power, try to talk to them about LINQ.

Code legibility/maintenance: Where to put SQL statements?

In the language file itself?
In a language file with all SQL statements?
In different .sql files?
Another way?
Share your code style. :)
Even if you don't use a framework, MVC helps keep you sane by separating your data access, logic and presentation into separate language files.
I guess you will get as many answers as there are answerers. Anyway, let's ask why one approach could be preferable over another:
In the language file itself (I don't know what you mean by "the language file", but let me assume you meant the programming language itself): well, this approach was taken by Microsoft with LINQ. It was also taken, e.g., in GemStone, where the "query" language is Smalltalk (not SQL).
If you put it in some .sql file, then there must be a way to address the code. I think this is what is done with stored procedures; examples of that can be found, e.g., in the Postgres database software.
Whether you put it in one file or many is probably an open question. E.g., you could have one query per file. Is that better or worse than having a hash table of SQL statements identified by some key?
I see the following approaches every day in Access software:
1) embedded in VBA as "just strings"
2) put into the queries section of Access
3) I have even read about putting the SQL statements in an extra SQL-statement table.
Regards
Friedrich
It all depends - e.g.:
In small DB maintenance scripts with a simple sequential control flow, it's nice to see the statements where they are used.
Programs with loops/callbacks should prepare the statements early; then a list of all statements near an init/prepare function makes sense.
A special case: I use a set of tool scripts written in different languages that do 'the same thing'; they all get their statements from .txt files containing SQL statements tagged by name (see the sketch after this list).
'Big' applications (should) use stored procedures - then the problem vanishes.
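For the tagged .txt file case above, a minimal loader might look like the following; the "-- name:" tag convention here is invented for the example, not part of any standard:

    using System;
    using System.Collections.Generic;
    using System.IO;

    static class SqlCatalog
    {
        // Returns statement text keyed by the name given on its "-- name:" tag line.
        public static Dictionary<string, string> Load(string path)
        {
            var statements = new Dictionary<string, string>();
            string current = null;
            foreach (var line in File.ReadLines(path))
            {
                if (line.StartsWith("-- name:", StringComparison.OrdinalIgnoreCase))
                {
                    current = line.Substring("-- name:".Length).Trim();
                    statements[current] = "";
                }
                else if (current != null)
                {
                    statements[current] += line + Environment.NewLine;
                }
            }
            return statements;
        }
    }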
In COBOL, I put the SQL in the language file, but in separate procedures. That way, I separated the business logic from the database logic.
In Python, I put the SQL in its own .py module. That way, I separated the business logic from the database logic.
In Java, I put the SQL in a separate package. That way, I separated the business logic from the database logic.
I've not used other languages, but I'd probably still separate the business logic from the database logic.

Which tool to use to export SQL schema from ODBC database?

I have a database in a format which can be accessed via ODBC. I'm looking for a command-line tool to generate an SQL file with DROP/CREATE statements from it, preferably with all the information, including table/field comments and table relations. (Possibly also a tool to parse the file and import the schema, but I guess that would be relatively easy to find.) I need this to automate my workflow: to be able to design the database visually but store it in SVN in code form.
Which tool should I use?
If this helps, the database in question is MS Access, but I guess there's a higher chance of finding a generic ODBC tool...
Okay, I wrote the tool to export the Access schema / parse SQL files myself; it's available here:
https://bitbucket.org/himselfv/jet-tool
Feel free to use if anyone needs it.
Adding this because I wanted to search an ODBC schema and came across this post. This tool lets you dump a CSV of the schema itself:
http://sagedataobjects.blogspot.co.uk/2008/05/exploring-sage-data-schema.html
And then you can grep away..
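If you would rather produce a dump like that programmatically, ADO.NET's generic GetSchema call over ODBC gives you similar raw material. A rough sketch (tables and columns only; comments and relations depend on what the driver exposes, and the returned column names follow the driver's SQLColumns output):

    using System;
    using System.Data;
    using System.Data.Odbc;

    static class OdbcSchemaDump
    {
        public static void Dump(string connectionString)   // e.g. "DSN=MyAccessDb"
        {
            using var con = new OdbcConnection(connectionString);
            con.Open();
            DataTable columns = con.GetSchema("Columns");
            foreach (DataRow row in columns.Rows)
            {
                Console.WriteLine("{0}.{1} {2}",
                    row["TABLE_NAME"], row["COLUMN_NAME"], row["TYPE_NAME"]);
            }
        }
    }

The output can then be diffed or grepped in much the same way as the CSV dump above.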
This script may work for you with some modifications. Access (the application) is required though.