I'm new to SQL Server 2005 and I have to do something that should be easy, but I'm having difficulties.
For the time being my stored procedure is storing the CSV in a single column in my db. I need to break the CSV apart into multiple columns. The number of values in the CSV parameter is fixed: there will always be 8 values, and they will always be in the same order. Here is an example of the data:
req,3NxWZ7RYQVsC4chw3BMeIlywYqjxdF5IUX8GMUqgJlJTztcXQS,192.168.208.78,getUserInfo,AssociateService,03303925,null,M042872,
What is the best function to use to separate this parameter in a stored proc so I can then insert it into separate columns? I was looking at SUBSTRING, but that seems to be positional rather than regex-aware.
Thanks,
Chris
SQL Server doesn't have this functionality natively; you have to build a function (generally referred to as a "split" function). This thread provides a number of T-SQL options for the same question you're asking--which performs best. Where they fall short is in testing large amounts of comma-delimited data, but your requirement is only eight values per row anyway...
SQL Server 2005+ supports SQLCLR, which lets functionality be handed off to .NET code. The .NET code has to be deployed to the SQL Server instance as an assembly, and T-SQL functions/procedures need to be created to expose the functionality in the assembly.
SUBSTRING is positional, but you can use the CHARINDEX function to find the position of each delimiter and grab the SUBSTRINGs between the commas.
It's a clunky solution, to be sure, but the impact is minimized given a small, fixed number of fields that always appear in order...
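As an untested sketch (variable names are invented, the sample token is abbreviated, and SQL Server 2005 requires DECLARE and SET on separate statements):

```sql
DECLARE @csv varchar(500);
SET @csv = 'req,3NxWZ7RY,192.168.208.78,getUserInfo,AssociateService,03303925,null,M042872,';

-- Find the position of each delimiter in turn.
DECLARE @p1 int, @p2 int, @p3 int;
SET @p1 = CHARINDEX(',', @csv);            -- end of field 1
SET @p2 = CHARINDEX(',', @csv, @p1 + 1);   -- end of field 2
SET @p3 = CHARINDEX(',', @csv, @p2 + 1);   -- end of field 3

-- Grab the SUBSTRINGs between the commas.
SELECT SUBSTRING(@csv, 1, @p1 - 1)              AS field1,
       SUBSTRING(@csv, @p1 + 1, @p2 - @p1 - 1) AS field2,
       SUBSTRING(@csv, @p2 + 1, @p3 - @p2 - 1) AS field3;
-- Repeat the pattern for the remaining five fields; the trailing comma
-- in the sample data means the final CHARINDEX still finds a delimiter.
```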
SQL Server isn't very strong in string handling - especially if you cannot use SQL-CLR to help.
You might be better off doing this beforehand, in your client application, using something like FileHelpers, a C# library which very easily imports a CSV and breaks it apart into an object for each row.
Once you have that, you can easily loop through the array of objects you have and insert those individual object properties into a relational table.
I'm trying to extract data (using SPUFI) from a DB2 table to a file, with one of the output fields converting a decimal field to the same format as a COBOL comp field.
So e.g. today's date (20141007) would be ..ëõ
The SQL HEX function converts 20141007 to 013353CF, and doing a SELECT of x'013353CF' gives me the desired result, but obviously that's a constant, I'm trying to find an equivalent function.
Basically an inverse of the HEX function.
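To illustrate the two directions described above (untested sketch; SYSIBM.SYSDUMMY1 is the usual one-row dummy table on DB2):

```sql
-- Forward direction: HEX() exposes the internal bytes of the value,
-- so a 4-byte integer 20141007 comes out as '013353CF'.
SELECT HEX(20141007)
FROM SYSIBM.SYSDUMMY1;

-- The desired output, but only as a hard-coded hexadecimal constant;
-- what is missing is a function that does this for a column value.
SELECT X'013353CF'
FROM SYSIBM.SYSDUMMY1;
```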
I've come across a couple of suggestions using user defined functions. Problem is, we've only recently upgraded to DB2 10 and new function mode isn't enabled yet, which means I don't have access to any control functions in a UDF.
I suspect I'm out of luck, but wondering if anyone has any suggestions.
I appreciate this is completely the wrong tool for the job, and it would be easier to just write a COBOL program to do it, but various constraints are preventing that. I'm limited to just SQL functions (and possibly JCL).
I thought I had a solution using a recursive UDF to get around the lack of control functions, but that's not allowed either.
Due to a bug in one of our applications, a certain character was duplicated 2^n times in many CLOB fields, where n is anywhere between 1 and 24. For the sake of simplicity, let's say the character is X. It is safe to assume that any adjacent occurrence of two or more of these characters identifies broken data.
We've thought of running over every CLOB field in the database and replacing the value where necessary. We quickly found that you can easily replace the value using REGEXP_REPLACE, e.g. like this (might contain syntax errors, typed from memory):
SELECT REGEXP_REPLACE( clob_value, 'XX*', 'X' )
FROM someTable
WHERE clob_value LIKE 'XX%';
However, even when changing the WHERE part to WHERE primary_key = 1234, for a data set which contains around four million characters in two locations within its CLOB field, this query takes more than fifteen minutes to execute (we aborted the attempt after that time, not sure how long it would actually take).
As a comparison, reading the same value into a C# application, fixing it there using a similar regular expression approach, and writing it back into the database only takes 3 seconds.
We could write such a C# application and execute that, but due to security restrictions it would just be a lot easier to send our customer a database script which they could execute themselves.
Is there any way to do a replacement like this much faster on an Oracle 10g (10.2.0.3) database?
Note: There are two configurations, one running the database on a Windows 2003 Server with Windows XP clients, and another running both the database and the client on a standalone Windows XP notebook. Both configurations are affected.
How does your client access the Oracle server? If it is via a Unix environment (which is most likely the case), then maybe you can write a shell script to extract the value from the database, fix it using sed, and write it back to the database. Replacing text in Unix should be really quick.
You may be facing a problem with LOB segment space fragmentation, since after the fix each of your LOBs will be shorter than before. Try creating a new table and copying the modified CLOBs into it.
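Another avenue that avoids the regex engine entirely (untested sketch; someTable, primary_key and clob_value are the names from the question): since the broken data is always a run of 2^n identical characters, repeatedly applying a plain REPLACE roughly halves every run, so at most about 24 passes collapse even the longest runs, and plain REPLACE is far cheaper than REGEXP_REPLACE on large CLOBs:

```sql
DECLARE
  v_clob CLOB;
BEGIN
  FOR r IN (SELECT primary_key, clob_value
            FROM   someTable
            WHERE  clob_value LIKE '%XX%') LOOP
    v_clob := r.clob_value;
    -- Each pass halves every run of X's, so a run of 2^24
    -- collapses after 24 passes.
    WHILE DBMS_LOB.INSTR(v_clob, 'XX') > 0 LOOP
      v_clob := REPLACE(v_clob, 'XX', 'X');
    END LOOP;
    UPDATE someTable
    SET    clob_value = v_clob
    WHERE  primary_key = r.primary_key;
  END LOOP;
  COMMIT;
END;
/
```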
As we didn't find any way to make it faster on the database, we delivered the C# tool within an executable patch.
I have an Informix 11.70 database. I am unable to successfully execute this insert statement on a table.
INSERT INTO some_table(
col1,
col2,
text_col,
col3)
VALUES(
5,
50,
CAST('"id","title1","title2"
"row1","some data","some other data"
"row2","some data","some other"' AS TEXT),
3);
The error I receive is:
[Error Code: -9634, SQL State: IX000] No cast from char to text.
I found that I should add this statement to allow using new lines in text literals, so I added it above the query I had already written:
EXECUTE PROCEDURE IFX_ALLOW_NEWLINE('t');
Still, I receive the same error.
I have also read the IBM documentation, which says that new lines can alternatively be allowed by setting the ALLOW_NEWLINE parameter in the ONCONFIG file. I suppose that requires administrative access to the server to alter the config file, which I do not have, and I'd prefer not to rely on that setting anyway.
Informix's TEXT (and BYTE) columns pre-date any standard, and are in many ways very peculiar types. TEXT in Informix is very different from TEXT found in other DBMS. One of the long-standing (over 20 years) problems with them is that there isn't a string literal notation that can be used to insert data into them. The 'No cast from char to text' is saying there is no explicit conversion from string literal to TEXT, either.
You have a variety of options:
Use LVARCHAR in the table (good if your values won't be longer than a few KiB, since the maximum size of an LVARCHAR column is just under 32 KiB, which is also approximately the limit on the total row length).
Use a programming language which can handle Informix 'locator' structures — in ESQL/C, the type used to hold a TEXT is loc_t.
Consider using CLOB instead. It has the same limitation (no string-to-CLOB conversion), but you'd be able to use the FILETOCLOB() function to get the information from a file on the client into the database (and LOTOFILE() transfers information from the DB to a file on the client).
If you can use LVARCHAR, that is by far the simplest alternative.
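For example, if the table were redefined with text_col as LVARCHAR, the original statement reduces to a plain string literal (untested sketch, reusing the table and values from the question; IFX_ALLOW_NEWLINE is still needed for the embedded line breaks):

```sql
EXECUTE PROCEDURE IFX_ALLOW_NEWLINE('t');

-- No CAST required: a string literal inserts directly into LVARCHAR.
INSERT INTO some_table(col1, col2, text_col, col3)
VALUES(5, 50, '"id","title1","title2"
"row1","some data","some other data"
"row2","some data","some other"', 3);
```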
I forgot to mention an important detail in the question - I use Java and the Hibernate ORM to access my Informix database, thus some of the suggested approaches (the loc_t handling in particular) in Jonathan Leffler's answer are unfortunately not applicable. Also, I need to store large data of dynamic length and I fear the LVARCHAR column would not be sufficient to hold it.
The way I got it working was to follow Michał Niklas's suggestion from his comment and use a PreparedStatement. This could potentially be explained by Informix handling the TEXT data type in its own manner.
I need an SP which can take some 24 input parameters to insert a record into a table. One way of sending multiple parameters is by using the XML datatype. Is there any other best practice for sending multiple input parameters to a SQL SP?
Any advice is appreciated!
If you're inserting only a fixed number of records, then you can define 24 parameters in your SP. This way you get some compile-time checking, and you can also define not-null, null, or a default value for each parameter for greater flexibility.
I wouldn't use the XML datatype unless I had a variable number of arguments or had to simulate parameter arrays (like inserting multiple order lines at the same time).
If you're using SQL Server 2008 or higher, there is support for table-valued parameters. You can check this link for using table-valued parameters with the .NET SqlClient, too.
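A minimal sketch of the SQL Server 2008+ approach (the type, table and procedure names here are invented for illustration): define a table type once, then the procedure takes a whole set of rows as one parameter instead of dozens of scalars.

```sql
CREATE TYPE dbo.OrderLineType AS TABLE
(
    ProductId int NOT NULL,
    Quantity  int NOT NULL
);
GO

CREATE PROCEDURE dbo.InsertOrderLines
    @Lines dbo.OrderLineType READONLY   -- TVPs must be READONLY
AS
BEGIN
    INSERT INTO dbo.OrderLines (ProductId, Quantity)
    SELECT ProductId, Quantity
    FROM   @Lines;
END
```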
Vasile Bujac's answer is excellent and I agree with everything, but it may be worth adding that Sommarskog, a luminary MVP, has some very good articles on mimicking arrays in SQL Server that may be very applicable to your situation. You can find them here: http://www.sommarskog.se/arrays-in-sql.html
I'm selecting columns from one table, and where a related table has a matching value, I'd like to take all values of a column from that second table, separate them by commas, and display them in a single column alongside my results from the first table.
I'm fairly new to this and apologize ahead of time if I'm not wording it correctly.
It sounds like what you're trying to do is to take multiple rows and aggregate them into a single row by concatenating string values from one or more columns. Yes?
If that's the case, I can tell you that it's a more difficult problem than it seems if you want to do it using portable SQL - especially if you don't know ahead of time how many items you may get.
The Oracle-specific solution often used in such cases is to implement a custom aggregate function, STRAGG(). Here's a link to an article that describes exactly how to do so and has examples of its usage.
If you're on Oracle 9i or later and are willing to live with using undocumented functions (that could change in the future), you can also look at the WM_CONCAT() function - which does much the same thing.
You want a row aggregation or concatenation function; the choices are:
If you are using Oracle 11gR2, there is a built-in function to aggregate strings with a delimiter called LISTAGG(column, delimiter).
If you are using any earlier release of Oracle Database, you can use the WM_CONCAT(column) function; however, you have no choice of delimiter, and if your data does not contain commas you can use something like the TRANSLATE(string, string_to_replace, replacement_string) function to change the delimiter afterwards.
As mentioned by LBushkin, you can create a custom function in your schema to perform row aggregation for you. Here is PL/SQL code example for one: http://www.oracle-base.com/articles/misc/StringAggregationTechniques.php#user_defined_aggregate_function
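As an untested sketch of the 11gR2 LISTAGG approach (the parent/child table and column names are invented for illustration): one row per parent, with the related child values comma-separated in a single column.

```sql
SELECT p.id,
       p.name,
       -- Aggregate the matching child rows into one comma-separated string.
       LISTAGG(c.val, ',') WITHIN GROUP (ORDER BY c.val) AS child_values
FROM   parent p
JOIN   child  c ON c.parent_id = p.id
GROUP BY p.id, p.name;
```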