I am trying to display a large matrix made up of 0/1 values, where each cell is a div that can be black or white. When I try to display a 1000x1000 matrix, the browser crashes...
I am using a v-for nested within another v-for to display it...
How can I improve the performance?
This is not a Vue-related problem, but rather a DOM-related one: you are putting over a million DOM elements on a page. In a test tab where I generated a static 1000x1000 matrix of empty divs with classes, memory consumption reached 2.3 GB, the initial render took quite some time, and scrolling was very slow, which suggests that it is the browser that is struggling here. Vue did nothing in my test after the page had rendered.
What you can do depends on the use case. If you just want to display the data graphically, consider using a canvas. You can draw freely on a canvas, and since you do not have to juggle a million DOM elements, performance should be much better.
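For example, here is a minimal sketch that paints the whole matrix into a single ImageData, one pixel per cell (the canvas element and the matrix variable are assumptions for illustration):

    // Paint a 0/1 matrix onto a canvas, one pixel per cell.
    // `matrix` is assumed to be an array of rows of 0/1 values.
    const canvas = document.getElementById('matrix-canvas');
    const ctx = canvas.getContext('2d');
    const size = matrix.length;
    canvas.width = size;
    canvas.height = size;

    const image = ctx.createImageData(size, size);
    for (let y = 0; y < size; y++) {
      for (let x = 0; x < size; x++) {
        const v = matrix[y][x] ? 0 : 255; // 1 = black, 0 = white
        const i = (y * size + x) * 4;
        image.data[i] = v;       // red
        image.data[i + 1] = v;   // green
        image.data[i + 2] = v;   // blue
        image.data[i + 3] = 255; // fully opaque
      }
    }
    ctx.putImageData(image, 0, 0);

If one pixel per cell is too small, you can scale the canvas up with CSS (image-rendering: pixelated keeps the cells crisp).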
Other techniques include lazy loading, where you use the scroll position to only load/show the rows that are in the viewport. This reduces the number of DOM elements, which should improve performance.
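As a rough sketch of that idea, assuming a fixed row height and a scrollable wrapper div (all names here are hypothetical), a Vue 2 component could compute the visible slice of rows from the scroll position:

    // Rough sketch: render only the rows that intersect the viewport.
    // `rowHeight` and `viewportHeight` are assumed fixed for simplicity.
    new Vue({
      el: '#matrix',
      data: {
        matrix: [],         // 1000 arrays of 1000 zeros/ones
        scrollTop: 0,
        rowHeight: 10,      // px per row; must be fixed for this to work
        viewportHeight: 600
      },
      computed: {
        firstRow() {
          return Math.floor(this.scrollTop / this.rowHeight);
        },
        visibleRows() {
          const count = Math.ceil(this.viewportHeight / this.rowHeight) + 1;
          return this.matrix.slice(this.firstRow, this.firstRow + count);
        }
      },
      methods: {
        // bound to @scroll on the wrapper div
        onScroll(event) {
          this.scrollTop = event.target.scrollTop;
        }
      }
    });

The template would then run its v-for over visibleRows only, with spacers (or absolute positioning) above and below so the scrollbar still reflects the full matrix height.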
Alternatively, you can limit the amount of data served to the user by providing a filter or something similar. If a filter matches too many items, cut the results off at a number you know will render fine in a browser, and show a message that some of the results are hidden for performance reasons.
Related
I'm working with a List in Office Fabric, specifically a DetailsList. My list contains a number of images that are pretty expensive to render, as well as a pretty big list of rows. Unfortunately, this means that when I scroll down, there's a huge lag as the page re-renders the new images (and, frustratingly, because the previous images are destroyed, scrolling back up is similarly laggy).
Is there a way to force a render of the entire list so that it doesn't have to re-render when you scroll up or down? I don't mind a long initial loading time as long as the actual scrolling doesn't have high latency and isn't slow or jerky.
You can disable virtualization by returning false from the DetailsList's onShouldVirtualize callback.
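For example, a minimal sketch (JSX; the wrapper component is hypothetical, the prop comes from the documentation linked below):

    import { DetailsList } from 'office-ui-fabric-react/lib/DetailsList';

    // Render every row up front instead of virtualizing on scroll.
    const NonVirtualizedList = (props) => (
      <DetailsList
        items={props.items}
        onShouldVirtualize={() => false}
      />
    );

Note that this trades scroll latency for a longer initial render and higher memory use, which matches the trade-off you said you are happy with.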
The team is actively working on improving List / DetailsList virtualization in the coming months.
Relevant documentation pages describing the above prop:
https://developer.microsoft.com/en-us/fabric#/components/detailslist
https://github.com/OfficeDev/office-ui-fabric-react/blob/738e270892f99957aecf567e4b107f8e4cf86176/packages/office-ui-fabric-react/src/components/DetailsList/DetailsList.types.ts#L253
So I've been giving Cytoscape a try recently. My project's goal is basically a collaborative graph that people will be able to add/remove nodes to/from, making it grow in the process. The graph will include many compound nodes.
Most of the examples I've seen use a container div that takes 100% of the screen space. This is fine for "controlled" graphs but won't work in my case, because the graph's size is intended to be dynamic.
Here's a JSFiddle using the circle layout within a fixed 3000px/3000px container:
https://jsfiddle.net/Jeto143/zj8ed82a/5/
1. Is there any way to have the container size be dynamic as opposed to stating it explicitly? Or do I have to compute the new optimal container size each time somehow, and then call cy.resize()?
Edit: actually, using 100%/100% with cy.fit() might just work no matter how large the network is going to be, so please ignore this question if this is the case.
2. Is there a recommended layout for displaying large/unknown amounts of data in a non-hierarchical way that would "smartly" place nodes (including compound ones) as efficiently as possible, all the while avoiding any overlap? (I guess that's a lot to ask...)
3. Why doesn't cy.fit() seem to be working in my example? I'm using it both at graph initialization and when CTRL+clicking nodes (to show the closed neighborhood), but it doesn't seem to like the 3000x3000px container (it behaves better with 100%x100%).
Edit: also ignore this question if you ignored 1., as again it seems fine with 100%/100%.
Any help/suggestions would be greatly appreciated.
Thank you very much in advance.
TLDR: It's (1).
Cytoscape has a pannable viewport, like a map. You can define the dimensions of the viewport (div) in CSS. What's displayed in the viewport is a function of the positions of the nodes, the zoom level, and the pan level -- just like what is visible in a map's viewport is a function of zoom, pan, and positions of points of interest.
So either you have to
(a) rethink your UI in terms of zoom and pan and use those in-built facilities in Cytoscape, or
(b) disable zoom and pan in Cytoscape (probably staying at (0, 0) at zoom 1) and let the user scroll the page as you add content to the graph, resizing its container div to accommodate the new content (see the sketch below).
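A minimal sketch of option (b), assuming cy is your initialized Cytoscape instance and container is its div (the growContainer name is made up):

    // Option (b): lock the viewport and grow the container instead.
    cy.userZoomingEnabled(false);
    cy.userPanningEnabled(false);
    cy.zoom(1);
    cy.pan({ x: 0, y: 0 });

    // Call after adding nodes: grow the div to the graph's bounding box
    // and tell Cytoscape that its viewport dimensions changed.
    function growContainer() {
      var bb = cy.elements().boundingBox();
      container.style.width = Math.ceil(bb.x2) + 'px';
      container.style.height = Math.ceil(bb.y2) + 'px';
      cy.resize();
    }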
I am trying to use PhantomJS image capture to capture the image of the browser.
Each time I run the image capture function, the dimensions of the image are slightly different. For example, once I get 1400x5185; if I open the same URL after a few hours, I get 1399x5185 or 1400x5186.
I have tried cropping from the top-left corner, but then the pixels are slightly skewed.
Note: the content of the page is always constant.
How do I ensure that I always get the same image dimensions without cropping off pixels?
Something probably changes on the page, otherwise there is no reason for PhantomJS to render different images.
You should check the differences between the images in detail. Ads are a likely culprit when they are not uniformly sized. Once you have identified the changing DOM elements, you can use casper.evaluate() to access the DOM and remove/hide those elements before capturing the screenshot.
You could also fix the viewport size, for example to 1920x1080, using casper.viewport(). If the page scrolls vertically, then only the y-dimension should change. If you want to be sure, set the viewport size to 1400x5187.
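A sketch combining both suggestions, assuming the volatile elements can be matched with a selector such as '.ad' (the selector and output filename are placeholders):

    // Fix the viewport, hide the elements that change between runs,
    // then capture the screenshot.
    casper.start(url, function () {
      this.viewport(1400, 5187);
      this.evaluate(function () {
        var ads = document.querySelectorAll('.ad');
        for (var i = 0; i < ads.length; i++) {
          ads[i].style.display = 'none';
        }
      });
      this.capture('page.png');
    });
    casper.run();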
I am making an app with a large data set; to improve performance I used a VirtualizingStackPanel inside a ListBox. I now get better performance, but loading items as they scroll into view becomes slow when I scroll fast. I am interested in:
1. What is the default number of rendered items (besides those in the viewport)?
2. Can I set the number of items rendered at a time explicitly, and if so, how?
Currently three items fit in the viewport, and I guess it renders 5-7 items at a time. How can I modify that?
I've noticed that when dynamically creating a large canvas (6400x6400), quite a lot of the time nothing is drawn on it, whereas with a small canvas it works 100% of the time. However, as I don't know any better, I have no choice but to try to get the large canvas working correctly.
    // Create the 6400x6400 canvas and prepend it to the map layer.
    thisObj.oMapCanvas = jQuery(document.createElement('canvas'))
      .attr('width', 6400)
      .attr('height', 6400)
      .css('border', '1px solid green')
      .prependTo(thisObj.oMapLayer)
      .get(0);
    // getContext and then drawing stuff here...
The purpose of the canvas is simply to draw a line between two nodes (images), which are within a div container that can be dragged around (a viewport, I think people call it).
What I "think" may be happening is that on a canvas resize it emptys the canvas, and that is interfering with the context drawing, as like I said previously it works all the time when the canvas is alot smaller.
Has anyone experienced this before and/or know any possible solutions?
That is an enormous sized canvas. 6400 x 6400 x 4 bytes per pixel is 156 MB, and your implementation may need to allocate two or more buffers of that size, for double buffering, or need to allocate video memory of that size as well. It's going to take a while to allocate and clear all that memory, and you may not be guaranteed to succeed at such an allocation. Is there a reason you need such an enormous canvas? You could instead try sizing your canvas to be only as large as necessary to draw the line between those two divs, or you could try using SVG instead of a canvas.
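As a sketch of the "only as large as necessary" idea (function and variable names are assumptions), you could size the canvas to the bounding box of the two nodes and position it absolutely inside the draggable container:

    // Draw a line between two nodes using a canvas no bigger than their
    // bounding box. Assumes the canvas and both nodes share the same
    // positioned parent container.
    function drawLink(canvas, nodeA, nodeB) {
      var x1 = nodeA.offsetLeft + nodeA.offsetWidth / 2;
      var y1 = nodeA.offsetTop + nodeA.offsetHeight / 2;
      var x2 = nodeB.offsetLeft + nodeB.offsetWidth / 2;
      var y2 = nodeB.offsetTop + nodeB.offsetHeight / 2;

      var left = Math.min(x1, x2);
      var top = Math.min(y1, y2);
      canvas.width = Math.max(Math.abs(x2 - x1), 1);  // setting width/height
      canvas.height = Math.max(Math.abs(y2 - y1), 1); // also clears the canvas
      canvas.style.position = 'absolute';
      canvas.style.left = left + 'px';
      canvas.style.top = top + 'px';

      var ctx = canvas.getContext('2d');
      ctx.beginPath();
      ctx.moveTo(x1 - left, y1 - top);
      ctx.lineTo(x2 - left, y2 - top);
      ctx.stroke();
    }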
Another possibility would be to try dividing your canvas up into large tiles, and only rendering those tiles that are actually visible on the screen. Google Maps does this with images, loading only the images for the portion of the map that is currently visible (plus some extra on each side of the screen, so that when you scroll you won't need to wait for it to render), maintaining the illusion of an enormous canvas while really only rendering something a bit bigger than the window.
Most browsers that implement HTML5 are still in early beta - so it's quite likely they are still working the bugs out.
However, the resolution of the canvas you are trying to create is very high .. much higher than what most people's monitors can even display. Is there a reason you need it quite so large? Why not restrict the draggable area to something more in line with typical display resolutions?
I had the same problem! I was trying to use a big canvas to connect some divs. Eventually I gave up and drew a line using JavaScript (I drew my line using little images as pixels; I did it with divs first, but in IE the divs came out too big).