Stored Proc with multiple parameters - sql

I need an SP which can take some 24 input parameters to insert a record into a table. One way of sending multiple parameters is by using the XML datatype. Is there any other best practice for sending multiple input parameters to a SQL SP?
Any advice is appreciated!

If you're inserting only a fixed number of records, then you can define 24 parameters in your SP. This way you get some compile-time checking, and you can also define not-null, null or a default value for each parameter for greater flexibility.
I wouldn't use the XML datatype unless I had a variable number of arguments or had to simulate parameter arrays (like inserting multiple order lines at the same time).
If you're using SQL Server 2008 or higher, there is support for Table-Valued Parameters. You can check this link for using table-valued params with the .NET SqlClient, too.
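For illustration, a minimal sketch of a table-valued parameter, assuming SQL Server 2008 or later; the type, table, and procedure names (OrderLineType, OrderLines, InsertOrderLines) are invented for the example:

-- Hypothetical table type holding one row per "order line"
CREATE TYPE dbo.OrderLineType AS TABLE
(
    ProductId INT NOT NULL,
    Quantity  INT NOT NULL,
    Price     DECIMAL(10, 2) NOT NULL
);
GO

-- Procedure that accepts the table type; TVPs must be declared READONLY
CREATE PROCEDURE dbo.InsertOrderLines
    @Lines dbo.OrderLineType READONLY
AS
BEGIN
    INSERT INTO dbo.OrderLines (ProductId, Quantity, Price)
    SELECT ProductId, Quantity, Price
    FROM @Lines;
END
GO

-- Calling it from T-SQL
DECLARE @Lines dbo.OrderLineType;
INSERT INTO @Lines VALUES (1, 2, 9.99), (7, 1, 24.50);
EXEC dbo.InsertOrderLines @Lines;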

Vasile Bujac's answer is excellent and I agree with everything. But it may be worth adding that Sommarskog, a luminary MVP, has some very good articles on mimicking an array in SQL Server that may be very applicable to your situation. You can find them here: http://www.sommarskog.se/arrays-in-sql.html

Related

Why should "ordinary string formatting" be used for table/field names in psycopg queries?

From the documentation on properly passing SQL parameters to psycopg2:
"Only variable values should be bound via this method: it shouldn’t be used to set table or field names. For these elements, ordinary string formatting should be used before running execute()."
http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
Why is this? Why would setting a field name not have the same SQL injection problems as setting values?
Because you don't usually receive the names of your tables or fields from users; you specify them yourself in your code. If you generate your request based on values taken from user input, then yes, you should do some kind of escaping to avoid SQL injection.
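Purely as an illustration (the table and column names are made up): if the identifier slot is filled by plain string formatting from user input, the statement that reaches the server can be rewritten entirely, which is why names need whitelisting or proper quoting:

-- Intended query, with the column name filled in by string formatting:
--   SELECT <user_supplied_column> FROM accounts WHERE id = %s
-- If the "column name" arrives as:  password FROM accounts; --
-- the statement that actually runs becomes:
SELECT password FROM accounts; -- FROM accounts WHERE id = 1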

SQL injection if brackets and semicolons are filtered

I have a statement like this:
SELECT * FROM TABLE WHERE COLUMN = 123456
123456 is provided by the user, so it is vulnerable to SQLi, but if I strip all semicolons and brackets, is it possible for the hacker to run any other statements (like DROP, UPDATE, INSERT, etc.) besides SELECT?
I am already using prepared statements, but I am curious: if the input is stripped of the statement terminator (semicolons) and brackets, can the hacker modify the DB in any way?
Use SQL parameters. Attempting to "sanitize" input is an extremely bad idea. Try googling some complex SQL injection snippets; you won't believe how creative black-hat hackers are.
In general it's very difficult to be 100% certain that you are safe from this type of attack by trying to strip out specific characters - there are just too many ways to get around your code (by using character encodings etc.)
A better option is to pass parameters to a stored procedure, like this:
CREATE PROCEDURE usp_MyStoredProcedure
    @MyParam int
AS
BEGIN
    SELECT * FROM [TABLE] WHERE [COLUMN] = @MyParam
END
GO
That way SQL will treat the value passed in as a parameter, and nothing else, no matter what it contains. And in this case it would only accept a value of type int anyway.
If you don't want to, or can't, use a stored procedure, then I'd suggest changing your code so that the input parameter can only contain a pre-defined list of characters - in this case numeric characters. That way you can be certain that the value is safe to use.
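If a stored procedure is not an option, a parameterized ad-hoc query gives the same protection. A minimal T-SQL sketch using sp_executesql (the table and column names are the placeholders from the question, bracketed only because they are reserved words):

DECLARE @UserInput INT;
SET @UserInput = 123456;   -- value that came from the user

-- The value travels as a typed parameter and is never concatenated into the SQL text
EXEC sp_executesql
    N'SELECT * FROM [TABLE] WHERE [COLUMN] = @Value',
    N'@Value INT',
    @Value = @UserInput;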

Error Inserting Entry With Text Column That Contains New Line And Quotes

I have an Informix 11.70 database. I am unable to successfully execute this insert statement on a table.
INSERT INTO some_table(
col1,
col2,
text_col,
col3)
VALUES(
5,
50,
CAST('"id","title1","title2"
"row1","some data","some other data"
"row2","some data","some other"' AS TEXT),
3);
The error I receive is:
[Error Code: -9634, SQL State: IX000] No cast from char to text.
I found that I should execute this statement in order to allow new lines in text literals, so I added it before the same query I had already written:
EXECUTE PROCEDURE IFX_ALLOW_NEWLINE('t');
Still, I receive the same error.
I have also read in the IBM documentation that, alternatively, I could set the ALLOW_NEWLINE parameter in the ONCONFIG file to allow new lines. I suppose that requires administrative access to the server to alter the config file, which I do not have, and I would prefer not to rely on that setting.
Informix's TEXT (and BYTE) columns pre-date any standard, and are in many ways very peculiar types. TEXT in Informix is very different from TEXT found in other DBMS. One of the long-standing (over 20 years) problems with them is that there isn't a string literal notation that can be used to insert data into them. The 'No cast from char to text' is saying there is no explicit conversion from string literal to TEXT, either.
You have a variety of options:
Use LVARCHAR in the table (good if your values won't be longer than a few KiB, because the total row length is approximately 32 KiB). Maximum size of an LVARCHAR column is just under 32 KiB.
Use a programming language which can handle Informix 'locator' structures — in ESQL/C, the type used to hold a TEXT is loc_t.
Consider using CLOB instead. However, this has the same limitation (no string-to-CLOB conversion), though you'd be able to use the FILETOCLOB() function to get the information from a file on the client into the database (and LOTOFILE transfers information from the DB to a file on the client).
If you can use LVARCHAR, that is by far the simplest alternative.
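A minimal sketch of the LVARCHAR route, assuming the server lets you modify the column type in place and that 20,000 bytes is enough for your data (both are assumptions; otherwise add a new LVARCHAR column and copy the data across):

-- Change the column from TEXT to LVARCHAR; the size is an assumption
ALTER TABLE some_table MODIFY (text_col LVARCHAR(20000));

-- A plain quoted string literal can then be inserted without a CAST
-- (embedded newlines in a literal may still need IFX_ALLOW_NEWLINE)
INSERT INTO some_table (col1, col2, text_col, col3)
VALUES (5, 50, '"id","title1","title2"', 3);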
I forgot to mention an important detail in the question - I use Java and the Hibernate ORM to access my Informix database, so some of the approaches suggested in Jonathan Leffler's answer (the loc_t handling in particular) are unfortunately not applicable. Also, I need to store large data of dynamic length, and I fear an LVARCHAR column would not be sufficient to hold it.
The way I got it working was to follow Michał Niklas's suggestion from his comment and use a PreparedStatement. This could potentially be explained by Informix handling the TEXT data type in its own manner.

SQL Server 2005 separate stored procedure CSV value into multiple columns?

I'm a SQL Server 2005 newb and I have to do something that should be easy but I'm having difficulties.
For the time being my stored procedure is storing the CSV in one column in my db. I need to break apart the CSV into multiple columns. The number of values in the CSV parameter is static: there are always going to be 8 values and they are always going to be in the same order. Here is an example of the data:
req,3NxWZ7RYQVsC4chw3BMeIlywYqjxdF5IUX8GMUqgJlJTztcXQS,192.168.208.78,getUserInfo,AssociateService,03303925,null,M042872,
What is the best function to use to separate this parameter in a stored proc so I can then insert it into separate columns? I was looking at SUBSTRING, but that seems to be positional and not regex-aware?
Thanks,
Chris
SQL Server doesn't have this functionality natively; you have to build a function (generally referred to as "split"). This thread provides a number of T-SQL options, looking for the same answer you're after--what performs best. Where they lack is in testing large amounts of comma-delimited data, but then your requirement is only for eight values per column anyway...
SQL Server 2005+ supports SQLCLR, where functionality can be handed off to .NET code. The .NET code has to be deployed to the SQL Server instance as an assembly, and TSQL functions/procedures need to be created to expose the functionality within the assembly.
SUBSTRING is positional, but you can use the CHARINDEX function to find the position of each delimiter, and grab the SUBSTRINGs between each comma.
It's a clunky solution, to be sure, but the impact is minimized given a small static number of fields that always appear in order...
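A rough sketch of that CHARINDEX/SUBSTRING approach, using the sample data from the question (variable and table names are invented; runs on SQL Server 2005):

-- Illustrative only: peels one value off the front of the string per step.
-- The same three statements repeat for each of the eight fields.
DECLARE @csv VARCHAR(500), @pos INT,
        @field1 VARCHAR(100), @field2 VARCHAR(100);

SET @csv = 'req,3NxWZ7RYQVsC4chw3BMeIlywYqjxdF5IUX8GMUqgJlJTztcXQS,192.168.208.78,getUserInfo,AssociateService,03303925,null,M042872,';

-- field 1
SET @pos    = CHARINDEX(',', @csv);
SET @field1 = SUBSTRING(@csv, 1, @pos - 1);
SET @csv    = SUBSTRING(@csv, @pos + 1, LEN(@csv));

-- field 2
SET @pos    = CHARINDEX(',', @csv);
SET @field2 = SUBSTRING(@csv, 1, @pos - 1);
SET @csv    = SUBSTRING(@csv, @pos + 1, LEN(@csv));

-- ...repeat for fields 3 through 8, then insert into the real columns, e.g.
-- INSERT INTO dbo.SomeTable (col1, col2 /* , ... */) VALUES (@field1, @field2 /* , ... */)
SELECT @field1 AS field1, @field2 AS field2;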
SQL Server isn't very strong in string handling - especially if you cannot use SQL-CLR to help.
You might be better off doing this beforehand, in your client application, using something like FileHelpers, a C# library which very easily imports a CSV and breaks it apart into an object for each row.
Once you have that, you can easily loop through the array of objects you have and insert those individual object properties into a relational table.

SQL Server Stored Proc Argument Type Conversion

Suppose I have a bunch of varchar(6000) fields in a table and want to change them to text fields. What are the ramifications for the stored procedures whose arguments are of type varchar(6000)? Does each stored procedure also need those argument data types changed?
Text fields are deprecated in SQL Server 2005 and above. You should use varchar(MAX), if possible. If you expect to have more than 6000 characters passed in the arguments to your stored procedures, you will need to change them as well.
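A minimal sketch of the change (the table, column, and procedure names are made up for the example):

-- Column changes from VARCHAR(6000) (or TEXT) to VARCHAR(MAX)
ALTER TABLE dbo.SomeTable ALTER COLUMN BigColumn VARCHAR(MAX) NULL;
GO

-- Matching stored procedure parameter widened so longer input is not silently truncated
ALTER PROCEDURE dbo.SaveBigValue
    @BigValue VARCHAR(MAX)
AS
BEGIN
    INSERT INTO dbo.SomeTable (BigColumn) VALUES (@BigValue);
END
GO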
Text fields are rough to work with in SQL Server. You can't actually declare local variables of type text (except as parameters to a stored procedure) and most of the string manipulation functions no longer work on text fields.
Also, if you have triggers, the text fields will not appear in the INSERTED or DELETED tables.
Basically, if the field is just holding data from a program and you aren't manipulating it, then it's no big deal. But if you have stored procedures that manipulate the string, then your task will be way more difficult.
As tvanfosson mentioned, if you have SQL Server 2005, use VARCHAR(MAX): you get the length of a text field with the ability to manipulate it like a VARCHAR.
The other answers are right, but they don't answer your question. Varchar(max) is the way to go. If you made the fields varchar(max)/text but kept the stored proc arguments the same, any value that came in through the stored proc would be truncated to 6000 characters. Since you say it will never exceed that, you will be fine, until, of course, that isn't the case. It doesn't throw an error; it just truncates.
I'm not sure of the exact behavior of varchar(max) versus text, but I'm pretty sure that once you start putting a lot of them in one table, you can get some crazy performance hits. Why so many big fields in one table?
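To see the silent truncation described above, a tiny illustration (no table needed; the same thing happens when the value is passed to a varchar(6000) parameter):

DECLARE @Long VARCHAR(MAX);
SET @Long = REPLICATE(CONVERT(VARCHAR(MAX), 'x'), 7000);   -- 7000-character value

DECLARE @Narrow VARCHAR(6000);
SET @Narrow = @Long;   -- silently cut to 6000 characters, no error raised

SELECT LEN(@Long) AS original_length, LEN(@Narrow) AS after_assignment;   -- 7000 vs 6000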
The reason for using text fields is that all of the varchar(6000) fields in one row would exceed the maximum row length. Text fields just store a pointer in the row, thus not exceeding the SQL Server maximum row length of roughly 8,060 bytes. At the moment the database cannot be normalized. The data is not manipulated by the stored procedures; it's just inserted, updated and deleted.
Does VARCHAR(MAX) behave like a text field and only store a pointer to the data in the row?