MongoDB is throwing "connection timed out" error - pymongo

We are using the PyMongo driver to connect Python with MongoDB. It works fine, but it throws a "Connection Timed Out" exception for long-running scripts.

MongoDB's connectTimeoutMS defaults to 30 seconds; in PyMongo it defaults to 20 seconds.
You may want to increase the timeout (for example, by passing a larger connectTimeoutMS or socketTimeoutMS to MongoClient), or have a look at optimizing the heavy scripts.
Reference: MongoDB PyMongo documentation

EF6 Entity Framework sometimes slow query on SQL Server

We have started a new project in .NET Core. We are just starting out, and we are hitting a Web API endpoint to get some reference data of 8 records.
We noticed in our Angular screen that, periodically (about every 10 requests), the EF query takes about 6 to 15 seconds to run rather than the 30 ms it normally takes.
While debugging we can see execution reach the .ToListAsync() call, and in SQL Profiler we can see the query start and then take that long to complete.
So first impressions say it's a SQL issue, but if we run the SQL query manually in SQL Server itself it never delays.
Any ideas?
This might have to do with the connection pooling setup of EF Core. It should not request a new connection on each request to the database; enable pooling by registering a pooled DbContext factory in your dependency injection setup:
builder.Services.AddPooledDbContextFactory<DbContext>(
    o => o.UseSqlServer(builder.Configuration.GetConnectionString("AppContext")));
Reference: https://learn.microsoft.com/en-us/ef/core/performance/advanced-performance-topics?tabs=with-di%2Cwith-constant
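For context, a minimal sketch of how the pooled factory might then be consumed, assuming the generic argument is your own DbContext subclass; AppDbContext, ReferenceItem, and ReferenceDataService below are illustrative names, not from the original post:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class ReferenceDataService
{
    private readonly IDbContextFactory<AppDbContext> _contextFactory;

    public ReferenceDataService(IDbContextFactory<AppDbContext> contextFactory)
        => _contextFactory = contextFactory;

    public async Task<List<ReferenceItem>> GetReferenceDataAsync()
    {
        // Each unit of work rents a pooled context and returns it to the pool on dispose.
        await using var db = _contextFactory.CreateDbContext();
        return await db.Set<ReferenceItem>().ToListAsync();
    }
}

With AddPooledDbContextFactory, it is the factory (not the context itself) that is registered in DI, so it can be injected into singleton services as well.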

Connection::SQLGetInfoW: [Simba][ODBC] (11180) SQLGetInfo property not found: 1750

This is a setup where Microsoft's Power BI is the frontend for data presentation to end users. Behind it there is an on-premises PBI gateway which connects to BigQuery via the Magnitude Simba ODBC driver for BigQuery. Two days ago, after always working flawlessly, the PBI data refresh started failing due to timeouts.
The BigQuery ODBC driver's debug log shows the two errors below in hundreds of rows per refresh:
SimbaODBCDriverforGoogleBigQuery_connection_9.log:Aug 29 15:21:54.154 ERROR 544 Connection::SQLGetInfoW: [Simba][ODBC] (11180) SQLGetInfo property not found: 180
SimbaODBCDriverforGoogleBigQuery_connection_9.log:Aug 29 15:22:49.427 ERROR 8176 Connection::SQLGetInfoW: [Simba][ODBC] (11180) SQLGetInfo property not found: 1750
And only one occurrence per refresh of this:
SimbaODBCDriverforGoogleBigQuery_connection_6.log:Aug 29 16:56:15.102 ERROR 6704 BigQueryAPIClient::GetResponseCheckErrors: HTTP error: Error encountered during execution. Retrying may solve the problem.
After some intensive web searching, it looks like this might be related to 'wrong' coding, either wrong data types or strings that are too big, but nothing conclusive.
Other, smaller refreshes to the same destination work without issues.
Is there any knowledge base or reference for such cryptic error messages? Any advice on how to troubleshoot this?
Already tried:
Searching Google;
Updating the Magnitude Simba ODBC driver for BigQuery to the latest version;
Updating PBI Gateway to the latest version;
Rebooting the gateway server.
This issue occurs when the ODBC driver tries to pull the data in streams, which goes over port 444. You either need to open port 444 for optimal performance, or disable streams so that the data is pulled using pagination (not recommended for huge data sets).

SQL Azure: Frequent error

We are having trouble with SQL Azure. We deployed an old .NET 4.0/C# based website on Azure. It is all good, except that every now and then we start getting timeouts, or anything related to the database starts throwing errors. A couple of times inserts stopped working, and the DataSet.Fill function failed with a timeout or "Column not found".
As I understand it, this happens when SQL Azure is too busy to respond to our request and breaks the connection in between, probably restarting or switching the node. However, it looks bad to the end client when such an error appears. Our database is less than 1 GB in size and we get 10-15 at a time, so I don't think we have a heavy load [Azure is ready for 150 GB databases; ours is less than 1% of that].
Can anyone suggest what we should do to avoid such errors? Can we detect such an error before it happens?
You will need to use transient fault handling; the Transient Fault Handling Application Block is a good place to start. When refactoring legacy code for SQL Azure, I found it easier to use the SqlConnection and SqlCommand extension methods:
Example of the SqlConnection extensions:
var retryPolicy = RetryPolicyFactory.GetDefaultSqlConnectionRetryPolicy();
using (var connection = new SqlConnection(connectionString))
{
    connection.OpenWithRetry(retryPolicy);
    // Do the usual work with the connection
}
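If you would rather not rely on the configuration-driven RetryPolicyFactory, a policy can also be built in code. A minimal sketch, assuming the Enterprise Library Transient Fault Handling ("Topaz") data package is referenced (exact namespaces vary between versions; the connection string and query below are placeholders):

using System;
using System.Data.SqlClient;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;

// Retry up to 5 times, waiting 1s, 3s, 5s, ... between attempts.
var retryStrategy = new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
var retryPolicy = new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);

var connectionString = "<your SQL Azure connection string>";

using (var connection = new SqlConnection(connectionString))
{
    connection.OpenWithRetry(retryPolicy);

    using (var command = new SqlCommand("SELECT COUNT(*) FROM SomeTable", connection))
    {
        // The *WithRetry command extensions re-run the command when the
        // detection strategy classifies the failure as transient.
        var count = command.ExecuteScalarWithRetry(retryPolicy);
    }
}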

Query timeout expired - ASP Classic (VB). Timeout happens in client browser, not on server

I'm having some trouble with timeouts on my website. It has a rather large newsletter receiver list (approx. 100k receivers). We deliver a tool for sending out newsletters, and based on the query for the newsletter (segmentation, for example), we show a total number of receivers for the newsletter.
If I run the query through SQL Server Management Studio I get a result in roughly 2 seconds. But if the query is run through a client browser I get the same timeout every time: "[Microsoft][ODBC SQL Server Driver]Query timeout expired".
I have tried adjusting the Server.ScriptTimeout parameter, but with no luck. It seems as though there's a problem with the data connection, but that's where I get stuck.
I'm hoping some of you brilliant people know the answer to this one :-)
Thanks!
Have you tried setting the Connection Timeout property of the connection string?
-- EDIT --
PS: I know I've given you a VB.NET page, but the connection strings will be pretty similar.
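For illustration, a minimal ADO.NET-style sketch of setting the timeout in the connection string; the server, database, table, and 120-second values are placeholders, not taken from the question:

using System.Data.SqlClient;

// "Connection Timeout" controls how long opening the connection may take.
var connectionString =
    "Server=myServer;Database=myNewsletterDb;Integrated Security=True;" +
    "Connection Timeout=120;";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    using (var command = new SqlCommand("SELECT COUNT(*) FROM Receivers", connection))
    {
        // For a long-running query the command timeout usually matters as well;
        // in ADO.NET it is set per command and defaults to 30 seconds.
        command.CommandTimeout = 120;
        var total = command.ExecuteScalar();
    }
}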

Setting a timeout on an SSIS 2005 script task

I am using a script task in SSIS 2005 which takes some time, and I am getting problems timing out with "The operation has timed out". Whereabouts is the setting to change the timeout period? I can't seem to find it in the task properties, and I do not wish to make a global change to the timeout.
The timeout problem is probably in the database connections. You should define a new timeout in the connection managers you have in your SSIS package.