Plain SQL output from NHibernate

I need:
Plain SQL that I can run, without modification, with sqlcmd.exe to insert test data into a test database.
I have:
Service calls and entities that generate the insert operations with NHibernate.
What doesn't work:
Logging the output to a text file. NHibernate generates parameterized SQL but logs it in a format that sqlcmd.exe can't run.
Is there any way to force NHibernate to generate SQL without parameters?
Or is there a better solution to the problem?

Depending on your schema, it may be easier to just generate the INSERTs from the actual database. Try using a utility like:
Procedure to script your data (to generate INSERT statements from the existing data)
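As a rough sketch of that approach (the dbo.Customer table and its columns are made up; a real script-your-data procedure would read the column list from sys.columns and handle data types and NULLs generically):

-- Build one INSERT statement per row of a hypothetical dbo.Customer table.
SELECT 'INSERT INTO dbo.Customer (CustomerId, Name, City) VALUES ('
       + CAST(CustomerId AS varchar(20)) + ', '
       + '''' + REPLACE(Name, '''', '''''') + ''', '
       + '''' + REPLACE(City, '''', '''''') + ''');'
FROM dbo.Customer;

Run it with results-to-text (or sqlcmd with -o) and the output is a plain script that sqlcmd.exe can execute against the test database.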

You could record a transaction log; SQL Server Profiler provides something like this.
In our application, we wrote factories in C# which generate the entities. We don't have SQL scripts to create test data. An executor (.exe) picks up assemblies, creates the entities, and stores them in the database. This way we don't have to maintain scripts. The factories are compile-time safe.

Related

Write non-SQL dataset to SQL table in DataIku

I don't seem to find a way to write the output from a previous step in the flow into a SQL table using the SQL recipes. When I read the documentation, it seems both types of SQL recipe can only take a SQL dataset as input? This can't be right, as you would imagine you would want to create datasets in the flow and then commit them to a database?
https://doc.dataiku.com/dss/latest/code_recipes/sql.html
In the docs above, it describes the in/out parameters as needing to be SQL.
Indeed, it doesn't seem possible with a SQL recipe which executes fully in the database.
That being said you can probably use a sync recipe to put your non-SQL dataset in your SQL db so that you can execute a SQL recipe.

Stored Procedure vs direct SQL command in SSIS data flow source

I'm providing maintenance support for some SSIS packages. The packages have some data flow sources with complex embedded SQL scripts that need to be modified from time to time. I'm thinking about moving those SQL scripts into stored procedures and call them from SSIS, so that they are easier to modify, test, and deploy. I'm just wondering if there is any negative impact for the new approach. Can anyone give me a hint?
Yes, there are issues with using stored procs as data sources (though not with using them in Execute SQL tasks in the control flow).
You might want to read this:
http://www.jasonstrate.com/2011/01/31-days-of-ssis-no-more-procedures-2031/
Basically the problem is that SSIS cannot always figure out the result set, and thus the columns, from a stored proc. I personally have run into this when a stored proc uses a temp table.
I don't know that I would go as far as the author of the article and not use procs at all, but be careful that you are not trying to do too much with them, and if you have to do something complicated, do it in an Execute SQL task before the data flow.
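To make that concrete, here is a hedged sketch (the procedure and table names are invented): a proc whose final SELECT reads from a temp table gives the SSIS source nothing to infer column metadata from at design time, and on SQL Server 2012+ one common workaround is to declare the shape explicitly with WITH RESULT SETS.

-- Hypothetical proc: SSIS often can't work out the output columns because
-- the final SELECT reads from a temp table that only exists at run time.
CREATE PROCEDURE dbo.usp_GetOrders
AS
BEGIN
    SELECT OrderId, CustomerId, OrderTotal
    INTO #Staged
    FROM dbo.Orders
    WHERE OrderTotal > 0;

    SELECT OrderId, CustomerId, OrderTotal FROM #Staged;
END
GO

-- SQL Server 2012+ workaround: use this as the SSIS source query so the
-- column metadata is declared explicitly rather than inferred.
EXEC dbo.usp_GetOrders
WITH RESULT SETS ((OrderId int, CustomerId int, OrderTotal money));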
I can honestly see nothing but improvements. Stored procedures will offer better security, the possibility of better performance due to cached execution plans, and easier maintenance, as you pointed out.
Refactor away!
You will not face issues using only simple stored procedures as a data source. If a procedure uses temp tables or CTEs, there is no guarantee you will avoid issues: even when you can preview the results at design time, you may get errors at run time.
My experience has been that trying to get a sproc to function as a data source is just not worth the headache. Maybe some simple sprocs are fine, and in some cases TVFs will work well instead, but if you need to do some complex operations there's no alternative to a sproc.
The best workaround I've found is to create an output table for each sproc you need to use in SSIS.
Modify the sproc to truncate the new output table at start, and to write its output to this instead of (or in addition to) ending with a SELECT statement.
Call the sproc with an Exec SQL task before your data flow.
Have your data flow read from the output table - a much simpler task.
If you want to save space, truncate the output table again with another Exec SQL. I prefer to leave it, as it lets me examine the data later and lets me rerun the output data flow if it fails without calling the sproc again.
This is certainly less elegant than reading directly from a sproc's output, but it works. FWIW, this pattern follows the philosophy (obligatory in Oracle) that a sproc should not try to be a parameterized view.
Of course, all this assumes that you have privs to adjust the sproc in question. If necessary, you could write a new wrapper sproc which truncates the output table, then calls the old sproc and redirects its output to the new table.
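A minimal T-SQL sketch of that pattern (all object names here are hypothetical):

-- One dedicated output table per sproc used by SSIS.
CREATE TABLE dbo.OrderExtract (OrderId int, CustomerId int, OrderTotal money);
GO

-- The sproc truncates the output table and writes to it instead of
-- (or in addition to) ending with a SELECT.
CREATE PROCEDURE dbo.usp_BuildOrderExtract
AS
BEGIN
    TRUNCATE TABLE dbo.OrderExtract;

    INSERT INTO dbo.OrderExtract (OrderId, CustomerId, OrderTotal)
    SELECT OrderId, CustomerId, OrderTotal
    FROM dbo.Orders
    WHERE OrderTotal > 0;
END
GO

-- In the package: an Execute SQL task runs EXEC dbo.usp_BuildOrderExtract;
-- before the data flow, and the data flow source is simply
-- SELECT OrderId, CustomerId, OrderTotal FROM dbo.OrderExtract;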

(SQL Server 2005) Good way to manage a large number of insert statements?

I have a stored procedure that takes a table name and writes out a series of INSERT statements, one for each row in the table. It's being used to provide sample, "real world" data for our test environment.
It works well, but some of these sample rowsets are 10-20k records. The stored proc writes them out using the PRINT statement, and it's hard to copy that many lines and paste them into Management Studio to run them. Is there a SQL redirect feature I might be able to use - perhaps a way to write this output to a table and then loop through and run each statement that way? Just a thought.
I'd like to do this all from within Management Studio and not have to write a C# program to create a dataset, loop over it, etc. I'm essentially looking for suggestions on a good approach. Thanks very much.
Use EXEC:
PRINT @INSERT_statement
EXEC (@INSERT_statement)
...to run the query.
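If you capture the generated statements in a table first, a minimal sketch of the "loop through and run each statement" idea looks like this (the holding table dbo.GeneratedInserts is hypothetical):

-- Assumes the generating proc fills dbo.GeneratedInserts (Id int, Stmt nvarchar(max)).
DECLARE @stmt nvarchar(max);

DECLARE stmt_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT Stmt FROM dbo.GeneratedInserts ORDER BY Id;

OPEN stmt_cursor;
FETCH NEXT FROM stmt_cursor INTO @stmt;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@stmt);   -- run each generated INSERT
    FETCH NEXT FROM stmt_cursor INTO @stmt;
END

CLOSE stmt_cursor;
DEALLOCATE stmt_cursor;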
But I'd recommend looking at bulk insertion to make the data load faster:
BULK INSERT
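For reference, a minimal BULK INSERT call looks roughly like this (the table name, file path, and file format are assumptions):

-- Load a comma-delimited file straight into the target table.
BULK INSERT dbo.SampleData
FROM 'C:\testdata\sampledata.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);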
Where is your stored procedure getting this data?
You may want to look into importing it as a table and then running your stored procedure against that imported table. SQL Server Management Studio has many options for importing data.
If your stored proc is generating the data - then that's a whole other issue.

How to get the sql query from a NHibernate save

I am trying to create a type of recorded data transaction that I can replay on a different database.
For example, I am capturing an order into a system; when I save it, I want to be able to "export" a SQL script that I can run on another database to create the same order.
I am using NHibernate, and I am trying to catch the SQL query string for the save operation so I can save it to a file, but with no success.
Check out this question: Get executed SQL from nHibernate
I'm not sure if there is a better alternative like an event listener; if not, the IInterceptor approach seems to be the best.

best way in producing a master script for SQL

I want to extract specific database tables & stored procedures into one master script. Do you know of any software that can help me do this faster? I've tried using the SQL Database Publishing Tool, but it's not that efficient since it gathers tables that I didn't select.
In SQL Server 2005, right click on the database, then select Tasks, and then select Generate Scripts.
Generating SQL Scripts in SQL Server 2005
As mentioned in that link, I'm fairly sure you have to generate the DROP and CREATE statements separately.
Try DBSourceTools. http://dbsourcetools.codeplex.com
It's open source, and specifically designed to script databases - tables, views, procs - to disk.
It also allows you to select which tables, views, and db objects to script.
I use Redgate SQL Compare for this (by comparing to an empty DB), as well as for doing upgrades between all my DB versions (I save a copy of the DB for each released version, and then just do a compare between current and previous to get a change script for that version).
I have found that "Generate Scripts" does a bad job in some cases with dependencies - e.g., it will try to create a stored procedure that uses a table before the table is created, causing the script to fail. I'll accept I'm possibly using it wrong, but SQL Compare "just works". The scripts it generates are also enclosed in a transaction - so if something fails, the whole change is rolled back, and you don't end up with a half-populated or half-upgraded database.
Downside is that this is a commercial tool, but IMHO worth the money.