My DataTable sometimes fails with an Ajax error, but it works - datatables

I have this weird problem that I cannot solve. On the production server my DataTable sometimes fails (not always; if I refresh the page it sometimes works), giving me this error: DataTables warning: table id=test-table - Ajax error. The thing is that the server is pulling 25,000 rows from a database. When I draw the data from the same database but work locally, the same DataTable never fails (it takes 1-2 seconds, but it always works and never throws the error). I can't understand why it fails like this. I wouldn't mind if the production server took 3-4 seconds, as long as it didn't throw an error.
My DataTable options are:
$('#test-table').DataTable({
    processing: true,
    serverSide: true,
    ajax: 'showroute',
    deferRender: true,
    paging: true
});
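A sketch of how to surface the underlying failure (the error.dt event, the errMode setting, and the object form of ajax are standard DataTables 1.10+ APIs; the 60-second timeout is an assumed value, not something from the question):
// Silence the generic alert and log the real cause of the Ajax failure.
$.fn.dataTable.ext.errMode = 'none'; // 'none' still fires the error.dt event

$('#test-table')
    .on('error.dt', function (e, settings, techNote, message) {
        console.log('DataTables error:', message); // the underlying HTTP failure
    })
    .DataTable({
        processing: true,
        serverSide: true,
        deferRender: true,
        paging: true,
        ajax: {
            url: 'showroute',
            timeout: 60000 // passed through to jQuery.ajax; milliseconds
        }
    });
Intermittent production-only failures with a 25,000-row source are typically the server timing out or running out of memory on some draws; the logged message, together with the XHR status in the browser's network tab, should confirm which.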

Related

BigQuery: An internal error occurred and the request could not be completed. Error: 7367027

I am trying to run a simple delete statement, and the command simply times out and throws the error in the title.
If I change the where clause to operate on a subset of the data, it works, and afterwards I was able to run the where true version successfully. I can't wrap my head around why I'm getting this error.
I've tried running this from the Python library as well as from the console, with both returning the same error. I've also tried deleting the table entirely and then recreating it, which did nothing.
delete from `<table>`
where true
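Since the subset deletes do succeed, one workaround is to script them. A sketch only (the Node.js client is an assumption, and created_date is a hypothetical column; substitute whatever column lets you partition the table):
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

// Run the delete in month-sized chunks instead of a single `where true`.
// `created_date` is a hypothetical column, not from the question.
async function deleteInChunks(tableName) {
  for (let month = 1; month <= 12; month++) {
    await bigquery.query({
      query: `DELETE FROM \`${tableName}\` WHERE EXTRACT(MONTH FROM created_date) = ${month}`,
    });
  }
}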

Big JSON record to BigQuery is not showing up

I wanted to try uploading big JSON records to BigQuery.
I am talking about JSON records of 1.5 MB each, with a complex nested schema up to the 7th degree.
For simplicity, I started by loading a file with a single record on one line.
At first I tried to have BigQuery autodetect my schema, but that resulted in a table that is not responsive and that I cannot query, albeit it says it has at least one record.
Then, assuming that my schema could be too hard for the loader to reverse-engineer, I tried to write the schema myself, and then loaded my file with the single record.
At first I got a simple error with just "invalid":
bq load --source_format=NEWLINE_DELIMITED_JSON invq_data.test_table my_single_json_record_file
Upload complete.
Waiting on bqjob_r5a4ce64904bbba9d_0000015e14aba735_1 ... (3s) Current status: DONE
BigQuery error in load operation: Error processing job 'invq-test:bqjob_r5a4ce64904bbba9d_0000015e14aba735_1': JSON table encountered too many errors, giving up. Rows: 1; errors: 1.
Checking the job error just gave me the following:
"status": {
"errorResult": {
"location": "file-00000000",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
},
"errors": [
{
"location": "file-00000000",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
}
],
"state": "DONE"
},
Then, after a couple more attempts at creating new tables, it actually started to succeed on the command line, without reporting errors:
bq load --max_bad_records=1 --source_format=NEWLINE_DELIMITED_JSON invq_data.test_table_4 my_single_json_record_file
Upload complete.
Waiting on bqjob_r368f1dff98600a4b_0000015e14b43dd5_1 ... (16s) Current status: DONE
with no error on the status checker...
"statistics": {
"creationTime": "1503585955356",
"endTime": "1503585973623",
"load": {
"badRecords": "0",
"inputFileBytes": "1494390",
"inputFiles": "1",
"outputBytes": "0",
"outputRows": "0"
},
"startTime": "1503585955723"
},
"status": {
"state": "DONE"
},
But no actual records are added to my tables.
I tried to perform the same load from the web UI, but the result is the same: green on the completed job, but no actual record added.
Is there something else that I can do to check where the data is sinking to? Maybe some more logs?
I can imagine that maybe I am on the edge of the 2 MB JSON row size limit, but if so, shouldn't this be reported as an error?
Thanks in advance for the help!!
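One way to dig further than the CLI output is to fetch the load job's full metadata, which includes status.errors and statistics.load. A sketch, assuming the Node.js client (the question used the Python library and the console, so treat this as illustrative); the job ID is the one from the bq output above:
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

// Print everything BigQuery recorded about a job: errorResult, per-row
// errors, badRecords, outputRows, and so on.
async function inspectJob(jobId) {
  const [metadata] = await bigquery.job(jobId).getMetadata();
  console.log(JSON.stringify(metadata.status, null, 2));
  console.log(JSON.stringify(metadata.statistics.load, null, 2));
}

inspectJob('bqjob_r368f1dff98600a4b_0000015e14b43dd5_1').catch(console.error);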
EDIT:
It turned out the complexity of my schema was a bit of a devil here.
My JSON files were valid, but my complex schema had several errors.
It turned out that I had to simplify the schema anyway, because I got a new batch of data where single JSON instances were more than 30 MB each, and I had to restructure the data in a more relational way, making smaller rows to insert into the database.
Funnily enough, once the schema was scattered across multiple entities (ergo, simplified), the actual errors/inconsistencies in the schema started to show up in the errors returned, and it was easier to fix them. (Mostly it was new, nested, undocumented data which I was not aware of anyway... but still my bad.)
The lesson here is that when a table schema is too long (I didn't experiment with how long precisely is too long), BigQuery just hides itself behind reporting too many errors to show.
But that is the point where you should consider simplifying the schema (and structure) of your data.

How to handle a longer wait time for complex queries? AngularJS

I am using AngularJS, Breeze, and MSSQL 2012.
I have several queries that take about 40 seconds to 1 minute to complete. It is a search function that goes through several tables, with over 900K records and several joins.
It fails with a few errors:
500 (Internal Server Error)
[Error] Error retrieving data
The wait operation timed out
Error: The wait operation timed out
I am not sure whether the error is with Breeze or Angular, but I'd like to make the wait time longer than a minute. The query does work on the server.
I've tried using the $timeout from angular, but it doesn't seem to work.
getSearch().then(function () {
    common.$timeout(function () {
        toggleSearchSpinner();
    }, 1250);
});
Not quite sure how to use the timeout function.
I do have $timeout defined in the common module:
commonModule.factory('common',
    ['$q', '$rootScope', '$timeout', 'commonConfig', 'logger', common]);

function common($q, $rootScope, $timeout, commonConfig, logger) {
    var throttles = {};
    var service = {
        // common angular dependencies
        $broadcast: $broadcast,
        $q: $q,
        $timeout: $timeout,
    };
    return service;
}
I have multiple places in the application where queries might just take a long time, and unfortunately that is unavoidable. It would be great if there were a one-time setting that prolongs the timeout wait time... Is there?
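Two things worth noting here. First, $timeout only schedules a callback; it never extends how long an HTTP request is allowed to run, so it cannot help with this error. Second, "The wait operation timed out" usually points at the database command timeout on the server, which no client setting can influence. For the client-side leg, though, Breeze does expose a one-time setting on its ajax adapter; a sketch, assuming the 120-second value and that your Breeze version's adapter passes it through to the transport:
// One-time, app-wide: raise the HTTP timeout Breeze's ajax adapter hands to
// the underlying transport ($http for the Angular adapter).
// getAdapterInstance and defaultSettings are documented Breeze APIs; the
// 120000 ms value is an assumption.
var ajaxAdapter = breeze.config.getAdapterInstance('ajax');
ajaxAdapter.defaultSettings = {
    timeout: 120000 // milliseconds; merged into each request's config
};
If the server still aborts the query after 30 seconds, the command timeout there has to be raised as well.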

Timeout error using Entity Framework?

We have the following code:
var db = new CoreEntityDB();
var abc = new abcDB();
var connection = new DataStore(db.ConnectionStrings.First(p => p.Name == "Abc").Value, DataStore.Server.SqlServer);
var projects = new List<abc_Employees>();
projects.AddRange(abc.Database.SqlQuery<abc_Employees>("usp_ABC_EmployeeSys"));
The project is failing on the following line:
projects.AddRange(abc.Database.SqlQuery<abc_Employees>("usp_ABC_EmployeeSys"));
And the error says: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding"
Everything was working fine a few days ago, and now, nothing. Nothing has changed either, as far as the code or the SQL stored proc goes.
Has anyone else experienced this before?
Did you try running the SP independently to see if that's the bottleneck?
Is it the command that is timing out?
You can increase the command timeout using:
((IObjectContextAdapter)abc).ObjectContext.CommandTimeout = 180;
You should take a look at your stored procedure. The default timeout is 30 seconds, so it looks like the stored procedure is taking longer than that to return results. Increasing the timeout just treats the symptoms.

Selenium RC is ignoring my timeout

I have a simple Selenium test that runs against a remote Selenium Server instance.
I'm trying to test page performance; some pages can exceed the max execution time, and I'm trying to catch that.
No matter what I put in setTimeout(), it always waits for the full page to load or the server times out.
public static $browsers = array(
    array(
        'name' => 'Firefox on Ubuntu',
        'browser' => '*firefox',
        'host' => 'dev-ubuntudesktop',
        'port' => 4444,
        'timeout' => '1000',
    ),
);

public function testSlowPage() {
    $this->setTimeout(1000);
    $this->open('myslowaddress');
    $this->assertTextNotPresent('Internal Server Error');
}
Even though I'm not using openAndWait, the above example doesn't reach the assert line until after the page is loaded or the web server terminates the request.
What I'd really like is a test that confirms "Page loads in under 1 second", without waiting 30 seconds (or whatever the PHP timeout happens to be set to).
The open method implicitly invokes a wait, whether you want it to or not, and this wait defaults to 30 seconds. setTimeout is what you use when your page does not load within those 30 seconds: if the page takes longer, raise the timeout with setTimeout before opening the page, or your tests will fail. So it would be:
selenium.setTimeout(timeOut);
selenium.open(appURL);
Now, coming to your test objective: you could use an assertion to check how long the page takes to load. I have checked with 1 second here (it's in Java, but you should be able to find the PHP equivalent):
long testStartTime = System.currentTimeMillis();
selenium.open(appURL);
long testEndTime = System.currentTimeMillis();
Assert.assertTrue((testEndTime - testStartTime) < 1000, "Page took longer than 1 second to load");
Notice that I have not used the setTimeout method here, so if the page does not load within 30 seconds the test will fail anyway, and you will know that the page does not load within 30 seconds. If you still want to measure the page load time in that situation, you can use exception handling along with the same time calculation to find the page load time.