I'm developing an app in VB6 and I'm trying to run queries against SQL Azure through SQL Server Native Client 11.0, but each time I execute a query (even a simple select * from Users;) the Native Client returns this error:
[Microsoft][SQL Server Native Client 11.0][SQL Server] Executing SQL directly; no cursor.
This problem started when I created a new database at a different service level in the Azure portal. The previous database was a Web database and the new one is a Standard database. Is there anything special that needs to be done for the three new versions of databases in Azure (Basic, Standard, Premium)?
Thank you very much for your help
The error in question is 16954 (your error reporting should first and foremost show the error number, state and severity, not only the error message). It occurs when an application attempts to use server-side cursors in a context where they are not supported; see Client-Side Cursors Versus Server-Side Cursors. This is likely coming from your cursor library choice in RDO, see Choosing an RDO Cursor Library. Switch to rdUseIfNeeded or rdUseNone.
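For example, in VB6 you can switch the default cursor driver before opening the RDO connection. A minimal sketch, assuming a DSN-less connection through the Native Client driver (the server, database and credentials in the connect string are placeholders):

    ' Use no cursor library (or a client-side one) instead of server-side cursors
    rdoEngine.rdoDefaultCursorDriver = rdUseNone   ' or rdUseIfNeeded

    Dim cn As rdoConnection
    Set cn = rdoEngine.rdoEnvironments(0).OpenConnection("", rdDriverNoPrompt, False, _
        "Driver={SQL Server Native Client 11.0};" & _
        "Server=tcp:myserver.database.windows.net;Database=MyDb;" & _
        "Uid=myuser;Pwd=mypassword;")

    Dim rs As rdoResultset
    Set rs = cn.OpenResultset("select * from Users", rdOpenForwardOnly)

The important line is the rdoDefaultCursorDriver assignment; it has to run before the connection is opened.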
I believe the underlying error is being hidden by the SQL client.
If you are getting an error for a simple select * from table query, then I would check that the credentials your app is using actually have permission on that table.
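One quick way to check from the VB6 side is to run the simple query and dump the whole rdoErrors chain, which shows the underlying ODBC/SQL errors the client may otherwise hide. A sketch, assuming cn is an already-open rdoConnection:

    Public Sub RunTestQuery(ByVal cn As rdoConnection)
        Dim rs As rdoResultset
        Dim e As rdoError

        rdoEngine.rdoErrors.Clear
        On Error Resume Next
        Set rs = cn.OpenResultset("select * from Users", rdOpenForwardOnly)
        On Error GoTo 0

        If rdoEngine.rdoErrors.Count > 0 Then
            ' Print the whole ODBC error chain: number, SQLSTATE and message
            For Each e In rdoEngine.rdoErrors
                Debug.Print e.Number; " "; e.SQLState; " "; e.Description
            Next
        Else
            Debug.Print "Query succeeded; the credentials can read the table."
        End If
    End Sub

If a permission problem is hiding behind the cursor message, it will show up here with its own error number and SQLSTATE.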
Related
I have taken code from another environment onto my local system. The code is written in VB.NET and the backend is Oracle 11g. While debugging the code in the local environment, I am unable to connect to the Oracle database; the inner exception states 'The provider is not compatible with the version of Oracle client'. What are the suggested steps to debug and correct this issue?
I'm a web developer who has been tasked with creating some sort of mechanism for moving data from an IBM AS400 to a SQL Server. Unfortunately, linked servers are out of the question in this case, as the SQL Server is just Standard Edition (DB2 providers are not available in this edition) and the AS400 is a separate server. I've researched adding some sort of trigger on the AS400 table that calls a web service to insert the data into the SQL Server, but that doesn't seem like the best method. Does anyone have suggestions on how to get the data from the AS400 to the SQL Server when it is committed on the AS400?
This solution assumes you are familiar with SQL Server Integration Services (SSIS):
Connection to AS400
Create a new ADO.Net connection Manager
Set the Provider to .Net Provider --> ODBC Data Provider
Create a DSN (Control Panel --> Administrative Tools --> Data Sources (ODBC) --> System DSN)
In the connection manager, under Data source specification, select the DSN you created and provide the login information.
Test the connection.
Data flow source:
Use the DataReader source
In the Advanced Editor, select the ADO.NET connection manager you just created.
On the Component Properties tab, under Custom Properties, specify the required query string in SQLCommand (select * from DatabaseName.TableName)
Check the column mappings for accuracy
Go to Input and Output Properties --> DataReader output --> External columns. Select the columns that were of type VARCHAR in the source table; they will now show the data type Unicode string (DT_WSTR), because the DataReader reads strings as Unicode by default. This means the corresponding columns in the destination table in SQL Server must be Unicode as well, i.e. NVARCHAR instead of VARCHAR.
Answer sourced from www.sqlservercentral.com/Forums
I synchronize my web applications with an IBM i. But I have my own database design and wrote a sync program on the Windows side.
Having the same database design, I wonder why I would need a copy on SQL Server at all. I would access the IBM i directly: install the drivers as #Kamran Farzami suggested and use them. That way there would be no lag between writes on the mainframe and your queries.
If a lag is acceptable for you and you can't access the IBM i directly, I see three main options:
Pull the data from your Windows system with the OLE DB driver. Using the .NET driver you can use the relative record number (RRN) to remember where you stopped synchronizing.
Read the journal files and make them available by creating a webservice on the IBM i.
Read the journal files in a scheduled job and push the changes from the journal to a webservice which updates the SQL server.
Option 1 only works if the files you sync are not reorganized. The RGZPFM command changes the record numbers. If that's okay, you can get the RRN in your SELECT statement: select *, RRN(MYTABLE) as RRNMYTABLE from MYTABLE
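A minimal sketch of option 1 in VB.NET, pulling only the rows past the last synchronized RRN through the IBM i Access OLE DB provider (IBMDA400); the connection string, library and table names are assumptions:

    Imports System.Data.OleDb

    Module RrnSync
        Sub Main()
            ' Assumptions: the IBM i Access OLE DB provider (IBMDA400) is installed,
            ' and MYLIB.MYTABLE is the file being synchronized.
            Dim connStr As String = _
                "Provider=IBMDA400;Data Source=MYIBMI;User Id=MYUSER;Password=MYPWD;"

            Dim lastRrn As Long = 0   ' persist this value between runs

            Using cn As New OleDbConnection(connStr)
                cn.Open()

                ' Pull only the rows added since the last synchronized relative record number
                Dim sql As String = _
                    "select *, RRN(MYTABLE) as RRNMYTABLE from MYLIB.MYTABLE " & _
                    "where RRN(MYTABLE) > ? order by RRN(MYTABLE)"

                Using cmd As New OleDbCommand(sql, cn)
                    cmd.Parameters.AddWithValue("lastRrn", lastRrn)

                    Using rdr As OleDbDataReader = cmd.ExecuteReader()
                        While rdr.Read()
                            lastRrn = Convert.ToInt64(rdr("RRNMYTABLE"))
                            ' ... write the row to the SQL Server mirror table here ...
                        End While
                    End Using
                End Using
            End Using
        End Sub
    End Module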
The web service server has been included in OS/400 since V5R4, so you should be able to use option 2.
I've done something similar, where the SQL Server was in a remote location (Honduras) with an unreliable internet connection. It was a short VB program, using the OLE DB driver, that ran on the server and connected to the AS400 whenever it was available (or "slept" when the connection was down). When the connection was up, the program would update/synchronize a uniquely keyed mirror file. Another program uploaded individual transaction records to a separate table (file).
We'd also periodically update SQL Server master tables (e.g. the item master) from the AS400. That also used a VB program (it could be any language that uses the driver) initiated on the server. It isn't exactly elegant, but I believe it's more practical than an AS/400 trigger calling a web service.
We're using SQL Server Management Studio 10.0 with Business Objects 4.1 for reporting. We're facing an issue where none of our reports run, and every time we get the error:
Database Error: [Microsoft][SQL Server Native Client 10.0][SQL Server]Statement(s) could not be prepared.. (IES 10901)
We drilled down into the issue and noticed that it happens because some of the tables in the SQL query are no longer valid. Running a select statement on the affected tables gives the error:
Invalid object name 'dbo.tablename'
I am not able to see these tables in the schema in question, and I cannot find any related logs to tell whether someone dropped them by mistake. I don't know much about administration; can someone provide useful links or any thoughts on how to find out what happened to those tables?
We are using the Enterprise Library Data Access Application Block to access a SQL Server database. In the data access layer we call the application block's API, which internally resolves the command and parameters into a SQL statement.
How can I know what SQL query goes to database?
Thanks
AJ
One way:
Run Profiler on the SQL Server, start a trace and add the event SQL:BatchStarting. In the trace data, look at the TextData column.
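If you also want to see, on the client side, what the Data Access Application Block is about to hand to the ADO.NET provider, you can inspect the DbCommand it builds before executing it. A minimal VB.NET sketch; the connection name, stored procedure and parameter are hypothetical, and the Profiler trace remains the authoritative view of what actually reaches the server:

    Imports System.Data
    Imports System.Data.Common
    Imports Microsoft.Practices.EnterpriseLibrary.Data

    Module InspectDaabCommand
        Sub Main()
            Dim db As Database = DatabaseFactory.CreateDatabase("MyConnection")
            Dim cmd As DbCommand = db.GetStoredProcCommand("usp_GetUser")
            db.AddInParameter(cmd, "@UserId", DbType.Int32, 42)

            ' Dump what the block built: command type, text and every parameter value
            Console.WriteLine("CommandType: {0}", cmd.CommandType)
            Console.WriteLine("CommandText: {0}", cmd.CommandText)
            For Each p As DbParameter In cmd.Parameters
                Console.WriteLine("  {0} = {1}", p.ParameterName, p.Value)
            Next

            Dim ds As DataSet = db.ExecuteDataSet(cmd)
        End Sub
    End Module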
Is there a tool for Windows that we can use to inspect any SQL commands that go through a particular ODBC data source?
You can make ODBC log everything it's doing:
http://support.microsoft.com/kb/274551
http://msdn.microsoft.com/en-us/library/ms711020%28VS.85%29.aspx
You can also do it programmatically:
... One can do this by calling SQLSetConnectAttr and setting the SQL_ATTR_TRACE attribute on the connection to SQL_OPT_TRACE_ON. Doing it this way, you enable or disable tracing for the duration of that connection.
http://decipherinfosys.wordpress.com/2009/01/17/odbc-tracing/
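For example, from VB6 you can flip the trace attribute on an existing connection's ODBC handle. A sketch assuming an RDO connection (its hDbc property exposes the underlying ODBC connection handle); the constant values come from sqlext.h:

    ' ODBC Driver Manager entry point (32-bit declaration)
    Private Declare Function SQLSetConnectAttr Lib "odbc32.dll" ( _
        ByVal hDbc As Long, ByVal Attribute As Long, _
        ByVal Value As Long, ByVal StringLength As Long) As Integer

    ' Constants from sqlext.h
    Private Const SQL_ATTR_TRACE As Long = 104
    Private Const SQL_OPT_TRACE_ON As Long = 1
    Private Const SQL_OPT_TRACE_OFF As Long = 0

    Public Sub SetOdbcTrace(ByVal cn As rdoConnection, ByVal enable As Boolean)
        ' Ask the ODBC Driver Manager to start (or stop) writing its trace log
        Dim rc As Integer
        rc = SQLSetConnectAttr(cn.hDbc, SQL_ATTR_TRACE, _
            IIf(enable, SQL_OPT_TRACE_ON, SQL_OPT_TRACE_OFF), 0)
        Debug.Print "SQLSetConnectAttr returned "; rc   ' 0 = SQL_SUCCESS
    End Sub

The Driver Manager then writes the trace to its configured log file (SQL.LOG by default).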
If you're using SQL Server, look at the SQL Server Profiler. Profiler allows you to monitor/trace all communications between your application and the SQL Server, including which procedures are called, parameter values, etc, without having to modify your application.
If you're using a different database server, you should be able to find a SQL proxy that will do the same thing.