SSIS Oracle connection string as project parameter is not being replaced at runtime

We move data from Oracle 11 to SQL Server 2014 using the SSIS project deployment model. We use the Attunity 3.0 connector.
The connection string for the Oracle data source is a project parameter and is also stored in a table in SQL Server.
We use a custom stored procedure that:
- gets this connection string from the SQL table,
- sets the project parameter (via [SSISDB].[catalog].[set_execution_parameter_value]), and
- executes the packages (via [SSISDB].[catalog].[start_execution]).
We query [SSISDB].[internal].[execution_parameter_values] to confirm that the parameter value is replaced at run time with the connection string stored in the backend.
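For reference, a minimal sketch of that pattern; the folder, project, package, parameter and config table names below are hypothetical placeholders, not our actual objects:

DECLARE @execution_id BIGINT;
DECLARE @oracleConnStr NVARCHAR(4000);

-- 1. Get the connection string stored in the SQL table
--    (dbo.PackageConfig and its columns are hypothetical stand-ins for our config table)
SELECT @oracleConnStr = OracleConnectionString
FROM dbo.PackageConfig
WHERE ConfigName = N'OracleSource';

-- 2. Create an execution for the deployed package
EXEC [SSISDB].[catalog].[create_execution]
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'LoadFromOracle.dtsx',
     @reference_id    = NULL,
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

-- 3. Overwrite the project parameter for this execution (object_type 20 = project parameter)
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
     @execution_id    = @execution_id,
     @object_type     = 20,
     @parameter_name  = N'OracleConnectionString',
     @parameter_value = @oracleConnStr;

-- 4. Start the package
EXEC [SSISDB].[catalog].[start_execution] @execution_id;

-- 5. Confirm what the execution actually received
SELECT parameter_name, parameter_value
FROM [SSISDB].[internal].[execution_parameter_values]
WHERE execution_id = @execution_id;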
What's interesting is that, even though the Oracle connection string is replaced at run time, the package still tries to use the connection string it was compiled with (the design-time value of the project parameter). We do not have the same issue when connecting to a SQL Server source in a similar fashion.
Do you have any suggestions? Is it a known issue?

Found the solution. It turns out that the Oracle connection string we stored in the table did not prefix the server name with "SERVER=". The connection string started straight away with the host, e.g. 'x1abc01.something.com:1234/x1abc01;ORACLEHOME=;ORACLEHOME64=;WINAUTH=0;'. We changed it to 'SERVER=x1abc01.something.com:1234/x1abc01;ORACLEHOME=;ORACLEHOME64=;WINAUTH=0;' and it started working. We tested it by deploying the SSIS solution with one connection string and then overriding it with a different connection string from the database, and the overridden value now persists.
However, it is still bizarre where the disconnect happens: when the run-time connection string has an invalid value, it is not reported as an error and SSIS quietly switches to the design-time value in the project parameter.
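For completeness, the fix on our side was just correcting the stored value so it carries the SERVER= prefix; a sketch, using the same hypothetical config table names as above:

UPDATE dbo.PackageConfig
SET OracleConnectionString = N'SERVER=x1abc01.something.com:1234/x1abc01;ORACLEHOME=;ORACLEHOME64=;WINAUTH=0;'
WHERE ConfigName = N'OracleSource';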

Related

Use of database name in connection string

What is the use of mentioning the database name in the connection string when opening a connection from a .NET application to SQL Server? Even though we mention a database name in the connection string, we still have to write the fully qualified name (DBName.SchemaName.ProcName) when calling a stored procedure if the default database is different for that particular user.
Connecting to a database from a .NET application is different from accessing a table in a different database.
"use of mentioning the database name in connection string"
The connection string tells your .NET application which server and which database to use; for instance, you can use the connection string below to connect to myDB on MyServer:
Data Source=MyServer;Initial Catalog=myDB;Integrated Security=True
If you do not specify at least this information, how can your .NET application reach a stored procedure (MyProcInMyDB) located in myDB?
Now for the part you asked about:
"though we mention a database name while calling a stored procedure if the default DB is different for that particular user"
Accessing a stored procedure of another database over the same connection string is not the normal case. If it is a very special case (not likely), you will do it for one or two stored procedure calls. But if it is required quite often within your application, then you should create a separate connection string. Using the same connection string and calling like
command.CommandText = "myDB2.dbo.getList"
can result in code that is difficult to maintain and inflexible.
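As a small illustration of the difference (myDB, myDB2, MyProcInMyDB and getList are the hypothetical names used above): with Initial Catalog=myDB, two-part names resolve in myDB, while reaching into another database requires the three-part name:

-- Resolves in the connection's current database (myDB, set by Initial Catalog)
EXEC dbo.MyProcInMyDB;

-- Reaching into another database needs the three-part name
EXEC myDB2.dbo.getList;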

"String or binary data would be truncated." error after publishing on IIS7

Could you help me with the following error, which occurs in my ASP.NET WebForms application after publishing it on an IIS 7 web server.
Error:
The following error message appears: "String or binary data would be truncated. The statement has been terminated."
Additional details:
- I have a table in an MS SQL database to keep files uploaded by users.
- The field is set as varbinary(max).
- When I run my web app from the Visual Studio development server, using the same DB from my workstation, there is no error.
- But after publishing on the IIS7 web server I get this error. (The source code, the file to upload and the DB are the same.)
I tried to set the exact length of the SqlParameter, but the result is the same.
Dim fileDataParam As New SqlParameter("FileData", SqlDbType.VarBinary, fileData.Length)
fileDataParam.Value = fileData
params.Add(fileDataParam)
Please could you give me some advice on what can cause this error?
Could you recommend which IIS or MS SQL settings I should check?
Update:
I ran SQL Server Profiler in both cases.
The SQL queries are the same; in the first case (on IIS) the error occurs, in the second case (on my PC) it does not.
Solved
I have solved the problem. It was indeed truncation of a string, but in another field, [ServerName]: it receives a short value on my PC and a long value on the server side.
I would suggest you accept #ps2goat comment as the answer.
The data you are inserting into the column is too large for the column size. Set the column to varbinary(MAX)
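Since this version of SQL Server does not say which column is being truncated, a practical first step is to compare each column's declared length with the values being sent, then widen the offending column. A sketch, where dbo.UploadedFiles is a hypothetical name for the table in the question and [ServerName] is the column that turned out to be the culprit:

-- Declared maximum length of each column in the target table (hypothetical table name)
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'UploadedFiles';

-- Widen the column that is too small for the incoming data
ALTER TABLE dbo.UploadedFiles
ALTER COLUMN ServerName nvarchar(255) NULL;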

How to prevent 'query timeout expired'? (SQLNCLI11 error '80040e31')

I have a connection to a MS SQL Server 2012 database in classic ASP (VBScript). This is my connection string:
Provider=SQL Server Native Client 11.0;Server=localhost;
Database=databank;Uid=myuser;Pwd=mypassword;
When I execute this SQL command:
UPDATE [info] SET [stamp]='2014-03-18 01:00:02',
[data]='12533 characters goes here',
[saved]='2014-03-18 01:00:00',
[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,
[updated]=1,[findable]=0
WHERE [ID]=193246;
I get the following error:
Microsoft SQL Server Native Client 11.0
error '80040e31'
Query timeout expired
/functions.asp, line 476
The SQL query is pretty long; the data field is updated with 12533 characters. The ID column is indexed, so finding the post with ID 193246 should be fast.
When I execute the exact same SQL expression (copied and pasted) in SQL Server Management Studio, it completes successfully in no time. No problem whatsoever, so there isn't a problem with the SQL itself. I've even tried using an ADODB.Recordset object and updating via that (no self-written SQL), but I still get the same timeout error.
If I go to Tools > Options > Query Execution in Management Studio, I see that the execution time-out is set to 0 (infinite). Under Tools > Options > Designers, I see that the transaction time-out is set to 30 seconds, which should be plenty since the script and database are on the same computer ("localhost" is in the connection string).
What is going on here? Why can I execute the SQL in the Management Studio but not in my ASP code?
Edit: Tried setting the 30 sec timeout in the Designers tab to 600 sec just to make sure, but I still get the same error (it happens after 30 sec of page loading, btw).
Here is the code that I use to execute the SQL on the ASP page:
Set Conn = Server.CreateObject("ADODB.Connection")
Conn.Open "Provider=SQL Server Native Client 11.0;" & _
          "Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;"
Conn.Execute "UPDATE [info] SET [stamp]='2014-03-18 01:00:02'," & _
             "[data]='12533 characters goes here',[saved]='2014-03-18 01:00:00'," & _
             "[confirmed]=0,[ip]=0,[mode]=3,[rebuild]=0,[updated]=1,[findable]=0 " & _
             "WHERE [ID]=193246;"
Edit 2: Using Conn.CommandTimeout = 0 to give the query infinite execution time does not help; it just makes the query execute forever. I waited 25 minutes and it was still executing.
I then tried to separate the SQL into two statements, the long data update in one and the other updates in the other. It still wouldn't update the long data field; I just got a timeout.
I tried this with two additional connection strings:
Driver={SQL Server};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Driver={SQL Server Native Client 11.0};Server=localhost;Database=databank;Uid=myuser;Pwd=mypassword;
Didn't work. I even tried changing the data to 12533 A's just to see if the actual data was causing the problem. Nope, same problem.
Then I found out something interesting: I tried to execute the short SQL first, before the long update of the data field. It ALSO got a query timeout exception...
But why? It has so little to update (the whole SQL statement is less than 200 characters). Will investigate further.
Edit 3: I thought it might have been something to do with the login but I didn't find anything that looked wrong. I even tried changing the connection string to use the sa-account but even that didn't work, still getting "Query timeout expired".
This is driving me mad. There is no solution, no workaround and worst of all no ideas!
Edit 4: Went to Tools > Options > Designers in the Management Studio and ticked off the "Prevent saving changes that require table re-creation". It did nothing.
Tried changing the "data" column data type from "nvarchar(MAX)" to the inferior "ntext" type (I'm getting desperate). It didn't work.
Tried executing the smallest change on the post I could think of:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=193246;
That would set a bit column to false. Didn't work. I tried executing the exact same query in the Management Studio and it worked flawlessly.
Throw me some ideas if you have got them because I'm running out for real now.
Edit 5: Have now also tried the following connection string:
Provider=SQLOLEDB.1;Password=mypassword;Persist Security Info=True;User ID=myuser;Initial Catalog=databank;Data Source=localhost
Didn't work. Only tried to set confirmed to false but still got a time out.
Edit 6: Have now attempted to update a different post in the same table:
UPDATE [info] SET [confirmed]=0 WHERE [ID]=1;
It also gave the timeout error. So now we know it isn't post specific.
I am able to update posts in other tables in the same "databank" database via ASP. I can also update tables in other databases on localhost.
Could there be something broken with the [info] table? I used the MS Access wizard to automatically move the data from Access to MS SQL Server 2012; it created columns of data type "ntext" and I manually changed those to "nvarchar(MAX)" since ntext is deprecated. Could something have broken down? It did require me to re-create the table when I changed the data type.
I have to get some sleep but I will be sure to check back tomorrow if anybody has responded to me. Please do, even if you only have something encouraging to say.
Edit 7: Quick edit before bed. Tried to define the provider as "SQLNCLI11" in the connection string as well (using the DLL name instead of the actual provider name). It makes no difference. Connection is created just as fine but the timeout still happens.
Also I'm not using MS SQL Server 2012 Express (as far as I know, "Express" wasn't mentioned anywhere during installation). It's the full thing.
If it helps, here's the "Help" > "About..." info that is given by the Management Studio:
Microsoft SQL Server Management Studio: 11.0.2100.60
Microsoft Analysis Services Client Tools: 11.0.2100.60
Microsoft Data Access Components (MDAC): 6.3.9600.16384
Microsoft MSXML: 3.0 5.0 6.0
Microsoft Internet Explorer: 9.11.9600.16521
Microsoft .NET Framework: 4.0.30319.34011
Operating System: 6.3.9600
Edit 8 (also known as the "programmers never sleep" edit):
After trying some things I eventually tried closing the database connection and reopening it right before executing the SQL statements. Suddenly it worked. What the...?
I had my code inside a subroutine, and it turns out that outside of it the post I was trying to update was already open! So the reason for the timeout was that the post, or the whole table, was locked by the very same connection that tried to update it, and the connection (or CPU thread) was left waiting for a lock that would never be released.
Hate it when it turns out to be so simple after trying so hard.
The post had been opened outside the subroutine by this simple code:
Set RecSet = Conn.Execute("SELECT etc")
I just added the following before calling the subroutine.
RecSet.Close
Set RecSet = Nothing
The reason this never crossed my mind is simply that it was allowed in MS Access, but now that I have changed to MS SQL Server it isn't so kind (or sloppy, rather). The recordset created by Conn.Execute() had never locked a post in the database before, but now all of a sudden it did. Not too strange, since both the connection string and the actual database had changed.
I hope this post saves someone else some headache if you are migrating from MS Access to MS SQL Server. Though I can't imagine there are that many Access users left in the world nowadays.
Turns out that the post (or rather the whole table) was locked by the very same connection that I tried to update the post with.
I had an open recordset on the post, created by:
Set RecSet = Conn.Execute()
This type of recordset is supposed to be read-only, and when I was using MS Access as the database it did not lock anything. But apparently this type of recordset does lock something on MS SQL Server 2012, because when I added these lines of code before executing the UPDATE statement...
RecSet.Close
Set RecSet = Nothing
...everything worked just fine.
So the bottom line is: be careful with open recordsets; even if they are read-only, they can lock your table against updates.
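For anyone hitting a similar wall: blocking like this is visible in the dynamic management views, so a quick check from Management Studio while the page hangs would have pointed straight at the waiting UPDATE and the session holding the lock. A minimal sketch for SQL Server 2012 (requires VIEW SERVER STATE permission):

-- Requests that are currently blocked, who is blocking them, and what they are running
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;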

SQL server - execute scalar function without specifying db name

I have a user-defined SQL function that I am able to call from Management Studio using the syntax dbo.Function(arg).
Now, when I call this function from C#, if I don't specify dbname.dbo.Function(arg) I get an error that SQL Server cannot find this user-defined function. How can I solve this without specifying the database name? I already connect to the server using a connection string that specifies "Initial Catalog = dbname".
It seems that I cannot reproduce the mentioned behaviour at this point :-) (with either SQL Server 2005 or 2008). I have to put this question on hold.
Your connection string needs to specify the database to use initially. It might look something like this:
var cn = new SqlConnection(
"SERVER=SomeServer;DATABASE=SomeDb;Integrated Security=SSPI;"
);
Without that, you're probably being dumped into the master database, which is why you need to fully qualify the function name.
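A quick sanity check, if it helps: run the following over the same connection the application uses to see which database the session actually landed in (dbo.MyFunction is a hypothetical stand-in for your scalar function):

-- Which database did this session land in?
SELECT DB_NAME() AS current_database;

-- With the correct database selected, the two-part name resolves fine
SELECT dbo.MyFunction(42);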

SSIS and MySQL - Table Name Delimiter Issue

I am trying to insert rows into a MySQL database from an Access database using SQL Server 2008 SSIS.
TITLE: Microsoft SQL Server Management Studio
------------------------------
ERROR [42000] [MySQL][ODBC 5.1 Driver][mysqld-5.0.51a-community-nt]You have
an error in your SQL syntax; check the manual that corresponds to your MySQL
server version for the right syntax to use near '"orders"' at line 1
The problem is with the delimiters. I am using the 5.1 ODBC driver, and I can connect to MySQL and select a table from the ADO.NET destination data source.
The MySQL tables all show up delimited with double quotes in the SSIS package editor:
"shipto addresses"
Removing the double quotes from the "Use a table or view" text box on the ADO.NET Destination Editor or replacing them with something else does not work if there is a space in the table name.
When SSIS puts the Insert query together, it retains the double quotes and adds single quotes.
The error above is shown when I click on "Preview" in the editor, and a similar error is thrown when I run the package (albeit then from the actual insert statement).
I don't seem to have control over this behavior. Any suggestions? Other package types where I can hand-code the SQL don't have this problem.
Sorry InnerJoin, I had to take the accepted answer away from you. I found a workaround here:
The solution is to reuse the same connection for all tasks, and to turn ANSI quotes on for that connection before doing any inserts, with an Execute SQL Task that runs the following:
set sql_mode='STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES'
(ANSI_QUOTES makes MySQL treat double-quoted names as identifiers rather than string literals, which is how SSIS quotes the table name.)
Try using square brackets around the table names. That may help.
EDIT: If you can, I would create views (with no spaces) based on the Access tables, and use those to export. Even if it means building another Access database with linked tables, I think this is your best bet.
I've always struggled with using SSIS with MySQL directly. Even after installing the ODBC drivers, they just don't play well in data flows. I've always ended up creating linked ODBC connections between SQL Server and MySQL and relying on linked server queries to bring the data over. Instead of an SSIS data flow task, I use an Execute SQL task, usually in the form of a stored procedure that executes an OPENQUERY.
One solution would be to load the data into a SQL Server database and use it as a staging environment before you load it into the MySQL database. I regularly move data between SQL Server 2008 and MySQL, and in the past I regularly moved data between Access and SQL Server.
Another possible solution is to transform the incoming Access data before it loads into the MySQL database. That may give you a chance to clean up the column names and the actual data that goes through to MySQL.
Let me know if either of these work for you.
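If it helps, here is a minimal sketch of the linked-server approach described above. MYSQL_LINKED is a hypothetical linked server defined over the MySQL ODBC DSN, and the table names are placeholders:

-- OPENQUERY runs the inner SELECT on the MySQL side and exposes it as an updatable rowset,
-- so the INSERT pushes the staged SQL Server rows into the MySQL table.
INSERT INTO OPENQUERY(MYSQL_LINKED, 'SELECT order_id, customer, total FROM orders')
SELECT order_id, customer, total
FROM dbo.StagingOrders;  -- hypothetical staging table loaded in an earlier step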
You can locate the configuration file my.ini at <<Drive>>:\ProgramData\MySQL\MySQL Server 5.6\my.ini and add ANSI_QUOTES to sql-mode,
e.g.: sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES". This should solve the issue when previewing in the SSIS editor.