I'm trying to debug a slow Django listview page. The page currently displays all tests (133), as well as the latest result for each test. There are approximately 60,000 results, each result having a FK relationship to a single test.
I've optimized the SQL (I think) by selecting the latest result for each test and passing it in via prefetch_related on my tests query. Django Debug Toolbar shows the SQL taking ~350ms, but the page load is ~3.5s.
If I restrict the list view to a single test though, the SQL is ~7.5ms, and page load is ~100ms.
I'm not fetching anything from S3 etc, and not rendering any images or the like.
So my question is: what could be causing the slowness? It seems like it could be the SQL, since page load grows with the result set, but maybe it's the rendering of each item or something similar? Any guidance on what to look into next would be appreciated.
You can force your Django queryset to execute:
test_list = list(your_queryset)
and return plain text from your view:
return HttpResponse("return this string")
Then you can measure the time without template rendering. Note also that Django Debug Toolbar itself can slow your app down; this may be what you're seeing: https://github.com/jazzband/django-debug-toolbar/issues/910
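To separate query time from rendering time, a small timing helper like the following can wrap each step; the view and model names in the comment are illustrative, not from the original code:

```python
import time

def timed(label, fn):
    """Run fn, print the elapsed time in ms, and return fn's result."""
    start = time.perf_counter()
    result = fn()
    elapsed = (time.perf_counter() - start) * 1000
    print(f"{label}: {elapsed:.1f} ms")
    return result

# Inside the view (hypothetical names), force the queryset and skip the
# template, so any remaining slowness is in the query/ORM layer:
#   test_list = timed("queryset", lambda: list(Test.objects.prefetch_related("results")))
#   return HttpResponse(f"Loaded {len(test_list)} tests")
```

If the timed queryset is fast but the full page is still slow, the time is going into template rendering or per-row Python work, not SQL.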
Which approach performs better if I want to display data with pagination? Should I fetch all the data from the DB and then switch between pages locally, or fetch the data from the DB page by page?
At first I was leaning towards the second option, but then I found this article and now I'm not sure.
In my SQL queries I'm using OFFSET and LIMIT, and since I also need the last page of the pagination, the first option would be better, as far as I understand? It's worth noting that my database is quite small.
Or would the best option be to keep using OFFSET but drop the jump to the last page, or am I wrong (for larger databases, where performance matters more)?
In the end, I implemented it just as the article suggests. I removed the "go to last page" button so the database is never forced to count all rows. I also have sorting features (ASC/DESC by particular columns), so if users want the last elements they can simply flip the sort order and get them via an ASC/DESC query, which I expect is faster than a large OFFSET.
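The approach described above, fetching page by page and avoiding deep OFFSETs, can be sketched as keyset ("seek") pagination; sqlite3 is used here purely as a stand-in, and the table and column names are made up:

```python
# Sketch of keyset (seek) pagination vs OFFSET pagination.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)",
                 [(f"item{i}",) for i in range(100)])

PAGE_SIZE = 10

def page_by_offset(page):
    # OFFSET forces the database to scan and discard all earlier rows,
    # so later pages get progressively slower.
    return conn.execute(
        "SELECT id, name FROM items ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, page * PAGE_SIZE)).fetchall()

def page_after(last_id):
    # Keyset pagination seeks directly to the next page via the index,
    # so the cost stays flat no matter how deep the user scrolls.
    return conn.execute(
        "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, PAGE_SIZE)).fetchall()

def last_page():
    # "Last page" without OFFSET and without counting rows:
    # just reverse the sort order, then flip the page back.
    rows = conn.execute(
        "SELECT id, name FROM items ORDER BY id DESC LIMIT ?",
        (PAGE_SIZE,)).fetchall()
    return list(reversed(rows))
```

This is exactly the trick the answer describes: the DESC query replaces the expensive "jump to last page" OFFSET.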
At the moment I use Delphi XE3 and an online database.
I use
db.Active := False;
db.SQL.Text := 'select * from database';
db.Active := True;
to show and filter records, but once there are about 1000 records in the database it becomes very slow. Is there a way to make the program usable while the DB is being loaded,
kind of like how Facebook loads while you type and shows records as they're found?
Thank you.
The problem is the * in your select and, of course, the amount of data. You should draw data as it arrives from the server. I don't know which components you use, so any information about that would be helpful. DevExpress's grid component, for example, has a property called "GridMode" for exactly this.
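The same two fixes, selecting only the columns the grid actually shows and fetching rows in batches as they arrive, look roughly like this in a language-agnostic sketch (Python with sqlite3 as a stand-in; the table, columns, and batch size are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, blob_col TEXT)")
conn.executemany("INSERT INTO records (name, blob_col) VALUES (?, ?)",
                 [(f"name{i}", "x" * 1000) for i in range(1000)])

def iter_records(batch_size=100):
    # Name the columns instead of SELECT *, so large unused columns
    # (blob_col here) never cross the wire.
    cur = conn.execute("SELECT id, name FROM records ORDER BY id")
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break
        yield batch  # the UI can draw each batch as it arrives

first_batch = next(iter_records())
```

A grid in "GridMode" does essentially this internally: it pulls rows on demand instead of loading the whole result set before showing anything.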
I want to make a table of "people", where each of their attributes is inline editable. There are about 500 people, and the list will grow over time. The people#index page contains this list.
Since there are a lot of form elements, the server takes a long time to render the page. The server I'm hosting on is also fairly slow, so I thought about having JavaScript do the rendering of these tables.
Currently, I created a new action to return JSON data of all the people's attributes, and I use Rails to generate just a template for the row. I then use jQuery to clone that template, and insert each attribute from the JSON data into a new copy, and add it to the rows.
This makes people#index load a lot faster, but now I run into the problem of the Javascript freezing up the page for a second while things load.
I looked into web workers for creating a thread to do this work. I plan to generate the extra rows on a separate thread, then add them to my table (I'm using dataTables).
I feel like I might be missing something, and that this is a sloppy solution.
My question is, is there a better approach to this?
Possible options:
1. Pagination
I think in this case pagination is the best solution.
Pagination can be done on the Rails side (the https://github.com/mislav/will_paginate and https://github.com/amatsuda/kaminari gems) or on the JS side.
2. You may also use something like http://trirand.com/blog/jqgrid/jqgrid.html
This JS component offers the ability to inline-edit data too, and you can look for Rails integration gems like https://github.com/allen13/rails-asset-jqgrid
I need a simple tool to visualize the status of a series of processes (ETL processes, but that shouldn't matter). This process monitor need to be customizable with color coding for different status codes. The plan is to place the monitor on a big screen in the office making any faults instantly visible to everyone.
Today I can check the status of these processes by running an SQL statement against the underlying tables in our Oracle database. The output of these queries is the above-mentioned status code for each process. I'm imagining using these SQL statements, run periodically (say, every minute or so), as the input to this monitor.
I've considered writing a simple web interface for doing this, but I'm thinking something like this should exist out there already. Anyone have any suggestions?
If you're just displaying on one workstation, another option is SQL Developer Custom Reports. You would still have to fire up SQL Developer and start the report, but custom reports have a setting so they can be refreshed at a specified interval (5-120 seconds). Depending on the richness of the output you want, you can either:
1. Create a simple Table report (Style = Table) and paste in one of the queries you already use as a starting point, or
2. Create a PL/SQL block that outputs HTML via DBMS_OUTPUT.PUT_LINE statements (Style = plsql-dbms_output), and get as creative as you like with formatting, colors, etc. using HTML tags in the output. I have used this to create bar graphs showing the progress of v$Long_Operations. A full description and screenshots are available in "Creating a User Defined HTML Report in SQL Developer".
If you just want to get some output up quickly, you can forgo SQL Developer: schedule a process that uses your PL/SQL block to write HTML output to a file, and use a browser to display the generated output on your big screen. Alternatively, make the file available via a web server so others in your office can bring it up. Periodically regenerate the file, and add a refresh meta tag to the page so browsers will periodically reload it.
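The file-generation approach can also be scripted outside PL/SQL. Here is a minimal sketch in Python; the status codes, colors, and output path are assumptions for illustration, not from the original setup:

```python
# Write a self-refreshing HTML status page that any browser on the
# big screen can load; a scheduled job regenerates it each minute.
STATUS_COLORS = {"OK": "#2e7d32", "WARN": "#f9a825", "FAIL": "#c62828"}

def render_status_page(rows, refresh_seconds=60):
    """rows: list of (process_name, status_code) tuples from your query."""
    cells = "\n".join(
        f'<tr><td>{name}</td>'
        f'<td style="background:{STATUS_COLORS.get(status, "#9e9e9e")}">'
        f'{status}</td></tr>'
        for name, status in rows)
    # The refresh meta tag makes browsers reload without any JS.
    return (f'<html><head><meta http-equiv="refresh" '
            f'content="{refresh_seconds}"></head>'
            f'<body><table>{cells}</table></body></html>')

# A scheduled job would run the SQL, then write the result, e.g.:
# with open("/var/www/status.html", "w") as f:
#     f.write(render_status_page(rows))
```

Unknown status codes fall back to a neutral gray, so a new ETL state never renders invisibly.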
Oracle Application Express is probably the best tool for this.
I would say roll your own dashboard. It depends on your skill set, but I'd do a basic web app in Java (Spring or some MVC framework; I'm not a web developer, but I know enough to create a basic functional dashboard). Since you already know the SQL needed, it shouldn't be difficult to put together, and you can modify it as needed in the future. I'd just keep it simple (you don't need middleware, single sign-on, or fancy views/charts).
OK, first let me state that I have never used this control and this is also my first attempt at using a web service.
My dilemma is as follows. I need to query a database to get back a certain column and use that for my autocomplete. Obviously I don't want the query to run every time a user types another word in the textbox, so my best guess is to run the query once, then use that dataset, array, list, or whatever to filter for the autocomplete extender.
I am kind of lost; any suggestions?
Why not keep track of the query executed by the user in a session variable, then use that to filter any further results?
The trick to preventing the database from being overloaded, I think, is really just to limit how frequently the autocompleter is allowed to update; something like once per 2 seconds seems reasonable to me.
What I would do is this: store the current list returned by the query for word A server-side, tied to a session variable. That should basically be the entire list, I would think. Then, for each new word typed, as long as the original word A still stands, you can filter the session data and return the filtered results without having to query again. So basically, only query again when word A changes.
I'm using "session" in a PHP sense, you may be using a different language with different terminology, but the concept should be the same.
This depends on how transactional your data store is. Obviously, if you are looking up US states (a data collection that realistically will not change through the life of the application), then I would cache either a System.Collections.Generic.List<> or, if you prefer, a DataTable.
You could easily set up a cache of the data you wish to query that is dependent on an XML file or database, so that your extender always queries the data object cast from the cache, and the cache object is only updated when the data source changes.
RAM is cheap and SQL is harder to scale than IIS, so cache everything in memory:
- your entire data source, if it is not too large to load in reasonable time,
- precalculated data,
- autocomplete web service responses.
Depending on your desired autocomplete behavior and performance, you may want to precalculate data and create redundant structures optimized for reading. Make use of structures like SortedList (when you need something like select top x ... where z like @query + '%') or Hashtable.
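As a sketch of such a read-optimized structure: a sorted list plus binary search answers the "top x where z like query + '%'" lookup without touching the database (Python stand-in for SortedList; the word list is illustrative):

```python
import bisect

# Build once at startup from the cached data source.
words = sorted(["alpha", "alpine", "beta", "delta", "gamma"])

def top_matches(prefix, n):
    """Return up to n words starting with prefix, in sorted order."""
    lo = bisect.bisect_left(words, prefix)
    # The smallest string greater than every prefix match bounds the range.
    hi = bisect.bisect_left(words, prefix + "\uffff", lo)
    return words[lo:hi][:n]
```

Each lookup is O(log n) plus the slice, which is why precalculating the sorted structure pays off for read-heavy autocomplete traffic.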
While caching everything is certainly a good idea, your question about which data structure to use is an issue that wasn't fully answered here.
The best data structure for an autocomplete extender is a Trie.
You can find a good .NET article and code here.
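For illustration, here is a minimal trie sketch; it is written in Python rather than .NET, so the names and structure are not taken from the linked article:

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        """Add a word; done once at startup from the cached data."""
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def complete(self, prefix):
        """Return all stored words starting with prefix, sorted."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        out = []
        def walk(n, acc):
            if n.is_word:
                out.append(prefix + acc)
            for ch, child in sorted(n.children.items()):
                walk(child, acc + ch)
        walk(node, "")
        return out
```

Walking the prefix costs O(len(prefix)) regardless of how many words are stored, which is what makes the trie a better fit for autocomplete than scanning a flat list.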