I'm in a bit of a situation with outputting data from a table where the only column I'm selecting is a VARBINARY(MAX) type.
In management studio, when I execute the query, I get back what I expect in the format:
0x1FABCDEFG......etc
Now, when the same query is executed in PowerShell via a simple setup of a SqlCommand, SqlDataAdapter, and DataSet, a file is eventually output using the following command:
$dataSet.Tables[0] | Select-Object * | Export-Csv $outputFileName -Force -NoTypeInformation;
...it returns a bunch of rows that simply contain:
System.Byte[]
A byte array isn't what I want, just the same output that's shown when you execute it in SQL Management Studio...
Is there some kind of cast/convert magic I can do in the query (SQL 2005, btw), or do I need to go through the DataSet object and mess around with the data there (which is not ideal for the situation, but can be done)?
Any help or a pat on the back letting me know that I should just take my whole idea out back and shoot it would be greatly appreciated. :)
I have found something that may help: a Microsoft blog article which explains how to convert hex to string and vice versa. If you add it to the SQL command, it might do the trick.
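For what it's worth, on SQL Server 2005 a common server-side approach is the undocumented function master.dbo.fn_varbintohexstr; here is a minimal sketch (the table and column names are placeholders):

-- Sketch: render the varbinary column as a 0x... string server-side.
-- fn_varbintohexstr is undocumented and may truncate very large values.
SELECT master.dbo.fn_varbintohexstr(MyBinaryColumn) AS HexValue
FROM dbo.MyTable;

Since the query would then return a plain string column, Export-Csv should write the 0x... text as-is instead of System.Byte[].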
Since I am new to PowerShell, I am looking for some assistance with my problem.
What I need is to count the number of files in a network folder and use that number for a Bulk-Insert loop. Currently, the number of loop cycles is stored in a "status-table" (in SQL). I want to be able to update that status table with the actual number of files.
The PowerShell code to count the number of files looks like this:
$FileCount = @(Get-ChildItem -Path '\\YS001UVE\Download\MIS\MIS FactData 20??m?1.csv' -Name -File | sort Name).count
But then I need to update the existing "status-table" with the outcome of the PowerShell script.
What is the correct SQL coding for this?
Thanks in advance and regards
Peter
There are many ways to do that. This post contains an example of the task. For some database connections you may need to install something extra on your computer (for MySQL, you need the MySQL Connector).
I've done this myself and know how confusing it can be at first. Under this link you can find my script (sorry for the German comments and text). The database connection starts at about line 270, in the function Import-ToDatabase. Be aware that my script does not update a single row, but writes a whole new table.
Let me know if you need some more help
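To give you a concrete starting point, here is a minimal sketch of the update itself using System.Data.SqlClient; the server, database, table, and column names are placeholders you would replace with your own:

# Minimal sketch: update a status table with the file count.
# Server, database, table, and column names are placeholders.
$FileCount = @(Get-ChildItem -Path '\\YS001UVE\Download\MIS\MIS FactData 20??m?1.csv' -File).Count

$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = 'Server=MyServer;Database=MyDatabase;Integrated Security=True'
$connection.Open()

$command = $connection.CreateCommand()
$command.CommandText = 'UPDATE dbo.StatusTable SET FileCount = @FileCount WHERE JobName = @JobName'
[void]$command.Parameters.AddWithValue('@FileCount', $FileCount)
[void]$command.Parameters.AddWithValue('@JobName', 'MISFactDataImport')
[void]$command.ExecuteNonQuery()

$connection.Close()

Using parameters (rather than concatenating the count into the SQL string) keeps the statement safe and reusable.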
Here are a few more links:
https://www.joseespitia.com/2016/06/03/connecting-and-writing-to-a-sql-db-with-powershell/
https://www.sqlservercentral.com/scripts/insert-data-into-a-sql-server-table-using-powershell-using-invoke-sqlc
https://www.sqlservercentral.com/blogs/performing-an-insert-from-a-powershell-script
I've got a roughly 400 MB .txt file here that is delimited by '|'. Using a Windows Form with C#, I'm inserting each row of the .txt file into a table in my SQL Server database.
What I'm doing is simply this (shortened by "..." for brevity):
while ((line = file.ReadLine()) != null)
{
string[] split = line.Split(new Char[] { '|' });
SqlCommand cmd = new SqlCommand("INSERT INTO NEW_AnnualData VALUES (@YR1984, @YR1985, ..., @YR2012)", myconn);
cmd.Parameters.AddWithValue("@YR1984", split[0]);
cmd.Parameters.AddWithValue("@YR1985", split[1]);
...
cmd.Parameters.AddWithValue("@YR2012", split[28]);
cmd.ExecuteNonQuery();
}
Now, this is working, but it is taking a while. This is my first time doing anything with a huge amount of data, so I need to make sure that A) I'm doing this in an efficient manner, and that B) my expectations aren't too high.
Using a SELECT COUNT(*) while the loop is going, I can watch the number go up and up over time. So I used a clock and some basic math to figure out the speed that things are working. In 60 seconds, there were 73881 inserts. That's 1231 inserts per second. The question is, is this an average speed, or am I getting poor performance? If the latter, what can I do to improve the performance?
I did read something about SSIS being efficient for exactly this purpose. However, I need this action to come from clicking a button in a Windows Form, not from going through SSIS.
Oooh - that approach is going to give you appalling performance. Try using BULK INSERT, as follows:
BULK INSERT MyTable
FROM 'e:\orders\lineitem.tbl'
WITH
(
FIELDTERMINATOR ='|',
ROWTERMINATOR ='\n'
)
This is the best solution in terms of performance. There is a drawback, in that the file must be present on the database server. There are two workarounds for this that I've used in the past, if you don't have access to the server's file system from where you're running the process. One is to install an instance of SQL Server Express on the workstation, add the main server as a linked server to the workstation instance, and then run "BULK INSERT MyServer.MyDatabase.dbo.MyTable...". The other is to reformat the CSV file as XML (which can be done very quickly), pass the XML to the query, and process it using OPENXML. Both BULK INSERT and OPENXML are well documented on MSDN, and you'd do well to read through the examples.
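If you go the XML route, here is a minimal sketch of the OPENXML shredding; the element, attribute, and column names are placeholders, and only two of the year columns are shown:

-- Minimal OPENXML sketch: shred attribute-centric XML into the table.
-- Element/attribute names are placeholders; only two columns shown.
DECLARE @xml nvarchar(max), @handle int
SET @xml = N'<rows><row YR1984="1.1" YR1985="2.2" /></rows>'

EXEC sp_xml_preparedocument @handle OUTPUT, @xml

INSERT INTO NEW_AnnualData (YR1984, YR1985)
SELECT YR1984, YR1985
FROM OPENXML(@handle, '/rows/row', 1)
     WITH (YR1984 varchar(20), YR1985 varchar(20))

EXEC sp_xml_removedocument @handle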
Have a look at SqlBulkCopy on MSDN, or the nice blog post here. For me that goes up to tens of thousands of inserts per second.
I'd have to agree with Andomar. I really quite like SqlBulkCopy. It is really fast (you need to play around with the BatchSize to find one that suits your situation).
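To make that concrete, here is a minimal SqlBulkCopy sketch; the connection string and file path are placeholders, and it assumes the target table's 29 columns line up with the split fields:

// Minimal SqlBulkCopy sketch for the pipe-delimited file.
// Connection string and file path are placeholders; assumes the
// target table's 29 columns line up with the split fields.
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void Main()
    {
        DataTable table = new DataTable();
        for (int year = 1984; year <= 2012; year++)
            table.Columns.Add("YR" + year, typeof(string));

        foreach (string line in File.ReadLines(@"C:\data\annual.txt"))
            table.Rows.Add(line.Split('|'));   // one DataRow per line

        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=True"))
        {
            conn.Open();
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "NEW_AnnualData";
                bulk.BatchSize = 5000;         // worth tuning for your environment
                bulk.WriteToServer(table);
            }
        }
    }
}

For a 400 MB file you'd probably want to fill and flush the DataTable in chunks rather than hold every row in memory at once.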
For a really in-depth article discussing the various options, check out Microsoft's "Data Loading Performance Guide":
http://msdn.microsoft.com/en-us/library/dd425070(v=sql.100).aspx
Also, take a look at the C# example using SqlBulkCopy from CSV Reader. It isn't free, but if you can write a fast and accurate parser in less time, then go for it. At the least, it'll give you some ideas.
I have found SSIS to be much faster than this type of method, but there are a bunch of variables that can affect performance.
If you want to experiment with SSIS, use the Import and Export Wizard in Management Studio to generate an SSIS package that imports a pipe-delimited file. You can save the package out and run it from a .NET application.
See this article: http://blogs.msdn.com/b/michen/archive/2007/03/22/running-ssis-package-programmatically.aspx for info on how to run an SSIS package programmatically. It includes options on how to run from the client, from the server, or wherever.
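As a rough sketch of the client-side option (it needs a reference to Microsoft.SqlServer.ManagedDTS, and the package path below is a placeholder):

// Sketch: load and run a saved SSIS package from a .NET app.
// Requires a reference to Microsoft.SqlServer.ManagedDTS;
// the package path is a placeholder.
using Microsoft.SqlServer.Dts.Runtime;

class SsisRunner
{
    static void Main()
    {
        Application app = new Application();
        Package package = app.LoadPackage(@"C:\packages\ImportPipeFile.dtsx", null);
        DTSExecResult result = package.Execute();
        System.Console.WriteLine("Package result: " + result);
    }
}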
Also, take a look at this article for additional ways you can improve bulk insert performance in general. http://msdn.microsoft.com/en-us/library/ms190421.aspx
I'm part of a team writing an ERP using Seam and JBoss, and on one of my pages I keep getting SQL error 8152 whenever I try to input something. SQL error 8152, for those of you who don't know, occurs when you try to insert a value that exceeds the column's maximum length ("String or binary data would be truncated").
I've double-checked my entity and the database, and their maximum lengths are the same (nvarchar(50)). In addition, I'm pretty sure that we're not using audit tables. I then put System.out.println(""); calls all over the place, and found that the error was happening between these two println(s):
System.out.println("Flushing");
entityManager.flush();
System.out.println("Flushing complete");
Which is part of a method that processes all changes to the table. But I'm pretty new to programming and not sure what's going on.
Any help would be appreciated, thanks in advance, Jeff.
P.S. Code available on request, but I didn't post it because there is a lot of it all over the place.
I would verify the SQL that is being executed when the flush() is performed. That way you can see the length of your data and verify that it is too big, as indicated by the DB error.
If you are using Hibernate, you can output the SQL to the console. You don't say what your DB is, but if it's SQL Server you can use SQL Server Profiler to see what SQL is being executed.
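In a Seam application that usually means adding the Hibernate logging properties to persistence.xml; a minimal sketch (these are standard Hibernate property names, but whether they fit your configuration is an assumption):

<!-- persistence.xml: make Hibernate print the SQL it executes -->
<property name="hibernate.show_sql" value="true"/>
<property name="hibernate.format_sql" value="true"/>

Note that show_sql prints "?" placeholders rather than bound values; to see the actual values, raise the log level on the org.hibernate.type category, or use the profiler as suggested above.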
Is there a way to run a query and then have SQL Server Management Studio, sqlcmd, or something similar simply display the datatype and size of each column as it was received?
Seems like this information must be present for the transmission of the data to occur between the server and the client. It would be very helpful to me if it could be displayed.
A little background:
The reason I ask is that I must interface with countless legacy stored procedures, anywhere from 50 to 5000+ lines of code each. I do not want to have to try to follow the cryptic logic flow in and out of temp tables, into other procedures, into string-concatenated eval statements, and so on. I wish to maintain no knowledge of the implementation, simply what to expect when they work. Unfortunately, following the logic flow seems to be the only way to figure out exactly what is being returned, short of trying to infer the actual types from the string representations in Management Studio, or from the native types in .NET, for example.
To clarify: I am not asking how to tell the types of a table or something static like that. I'm pretty sure something like sp_help will not help me. I am asking how to tell what the SQL Server types (i.e. varchar(25), int, ...) are of what I have been given. Additionally, changing the implementation of the sprocs is not possible, so please consider that in your solutions. I am really hoping there is a command I have missed somewhere. Much appreciation to all.
Update
I guess what I am really asking is how to get the schema of the result set when the result set originates from a query using a temp table. I understand this to be impossible, but I don't find much sense in that conclusion, because the data is being transmitted after all. Here is an example of a stored procedure that would cause the problem.
CREATE PROCEDURE [dbo].[IReturnATempTable]
AS
CREATE TABLE #TempTable
(
    MyMysteryColumn char(50)
)

INSERT INTO #TempTable (MyMysteryColumn)
VALUES ('Do you know me?')

SELECT TOP 50 * FROM #TempTable
What will you do about stored procedures which return different result sets based on their parameters?
In any case, you can configure a SqlDataAdapter.SelectCommand, along with the necessary parameters, then call the FillSchema method. Assuming that the schema can be determined, you'll get a DataTable configured with correct column names and types, and some constraints.
A bit of a long shot, but try messing around with SET FMTONLY ON (or OFF). According to BOL, this "Returns only metadata to the client. Can be used to test the format of the response without actually running the query." I suspect that this will include what you're looking for, as BCP uses it. (I stumbled across this setting when debugging some very oddball BCP problems.)
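For example, against the sample procedure above (this is only a sketch; as a later answer notes, FMTONLY is not fully reliable, particularly around temp tables):

SET FMTONLY ON;
EXEC dbo.IReturnATempTable;  -- returns only the result set's format, no rows
SET FMTONLY OFF;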
Could you append another select to your procedure?
If so, you might be able to do it using the sql_variant_property function.
Declare @Param Int
Set @Param = 30
Select sql_variant_property(@Param, 'BaseType')
Select sql_variant_property(@Param, 'Precision')
Select sql_variant_property(@Param, 'Scale')
I posted that on this question.
"I am asking how to tell what the SQL Server types (i.e. varchar(25), int, ...) are of what I have been given"
You could then print out the type, the precision (i.e. 25 if it's varchar(25)), and the scale of the parameter.
Hope that helps... :)
If you are not limited to T-SQL, and you obviously don't mind running the SPs (because SET FMTONLY ON isn't fully reliable), you definitely CAN call the SPs from, say, C#, using a SqlDataReader. Then inspect the SqlDataReader to get the column names and data types. If a procedure returns multiple result sets, you can also move to the next result set easily from this environment.
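As a rough sketch of that approach (the connection string is a placeholder, and the sample procedure from the question stands in for your own):

// Sketch: execute the proc and list each result set's column names and SQL types.
// The connection string is a placeholder.
using System;
using System.Data;
using System.Data.SqlClient;

class SchemaInspector
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=True"))
        using (var cmd = new SqlCommand("dbo.IReturnATempTable", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                int resultSet = 1;
                do
                {
                    Console.WriteLine("Result set {0}:", resultSet++);
                    for (int i = 0; i < reader.FieldCount; i++)
                        Console.WriteLine("  {0}: {1}", reader.GetName(i), reader.GetDataTypeName(i));
                } while (reader.NextResult());
            }
        }
    }
}

If you also need lengths, precision, and scale, reader.GetSchemaTable() exposes those per column.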
This code should fix you up. It returns a schema-only DataSet with no records; pass in the query whose schema you want. You can use this DataSet to query the columns' DataTypes and any other metadata. Later, if you wish, you can populate the DataSet with records by creating a SqlDataAdapter and calling its Fill method (IDataAdapter.Fill).
private static DataSet FillSchema(SqlConnection conn, string query)
{
    // Assumes conn is already open. 'query' is the SELECT (or EXEC)
    // whose result-set schema you want.
    DataSet ds = new DataSet();
    using (SqlCommand formatCommand = new SqlCommand("SET FMTONLY ON;", conn))
    {
        formatCommand.ExecuteNonQuery();

        formatCommand.CommandText = query;
        using (SqlDataAdapter formatAdapter = new SqlDataAdapter(formatCommand))
        {
            formatAdapter.FillSchema(ds, SchemaType.Source);
        }

        formatCommand.CommandText = "SET FMTONLY OFF;";
        formatCommand.ExecuteNonQuery();
    }
    return ds;
}
I know this is an old question, I found it through a link from SqlDataAdapter.FillSchema with stored procedure that has temporary table. Unfortunately, neither question had an accepted answer, and none of the proposed answers were able to resolve my issue.
For the sake of brevity: if you are using SQL Server 2012 or later, the following built-in functions will work in most situations:
sys.dm_exec_describe_first_result_set
sys.dm_exec_describe_first_result_set_for_object
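A quick sketch of the first one (the query text here is only an illustration):

-- SQL Server 2012+: describe the columns a T-SQL batch would return.
SELECT name, system_type_name, is_nullable
FROM sys.dm_exec_describe_first_result_set(N'SELECT 1 AS n, GETDATE() AS d', NULL, 0);

The _for_object variant does the same for a stored procedure given its OBJECT_ID.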
However, there are some cases in which these functions will not provide any useful output. In my case, the problem was more similar to the question linked above and therefore, I believe the solution is more appropriately answered under that question. My answer can be found here.
I have a column in SQL Server 2005 that stores a simple chunk of XML. At a later point, processing is performed and I need to merge some processing info into the XML.
While I can do this at an intermediate point, I would much prefer to keep this logic centralized within the stored procedure that is responsible for updating other fields post-processing.
Here's an example of the XML I'm starting with and the type of outcome I'd like to achieve. Can anyone provide me some rough SQL to achieve it?
Update: Finally got it! I'll post the full solution when I get the chance; it was enough of a hack that someone else will hopefully find it useful.
All finished! In the end I had a couple of additional requirements that required me to rework Marc's suggested solution and ditch the .modify() function entirely; however, his answer let me get past my initial hurdles and got me to where I could step back and spot the easier approach. Here's my final solution!
How about this:
UPDATE yourTable
SET YourXmlColumn.modify('insert <processingData id="guid" someAttrib="x" /> as last into /someData[1]')
WHERE .......
That should do it.
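If you want to try the XML DML on its own before touching the table, here is a self-contained sketch against an xml variable (the document shape is made up to match the example):

-- Self-contained sketch: the same insert, against an xml variable.
DECLARE @doc xml
SET @doc = N'<someData><existing>value</existing></someData>'

SET @doc.modify('insert <processingData id="guid" someAttrib="x" /> as last into /someData[1]')

SELECT @doc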
For more details on how to deal with XML in SQL Server 2005 and up, I keep going back to this article at 15 Seconds, which shows really nicely how to insert, modify, and delete XML fragments inside your SQL Server fields using XML DML statements.
Marc