Size on disk significantly larger than actual size

I want to install Unreal Engine 5 on my external SSD. The actual size of Unreal Engine 5 is about 20 GB, but the size on disk is 160 GB. Any clues for reducing the size on disk?
[Screenshot: properties dialog showing size and size on disk]
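A gap like this usually comes from the volume's allocation unit (cluster) size: each file occupies whole clusters, so on a drive formatted with a large allocation unit every small file wastes most of a cluster, and an engine install consists of a very large number of small files. A minimal sketch of that arithmetic, using purely hypothetical file counts and cluster size:

```c
#include <stdio.h>
#include <stdint.h>

/* Size on disk is the file size rounded up to a whole number of clusters. */
static uint64_t size_on_disk(uint64_t file_size, uint64_t cluster_size)
{
    return (file_size + cluster_size - 1) / cluster_size * cluster_size;
}

int main(void)
{
    /* Hypothetical numbers: ~100,000 small files of ~8 KB each,
       on a volume formatted with a 1 MB allocation unit. */
    const uint64_t files    = 100000;
    const uint64_t avg_size = 8 * 1024;       /* 8 KB */
    const uint64_t cluster  = 1024 * 1024;    /* 1 MB */

    uint64_t logical  = files * avg_size;
    uint64_t physical = files * size_on_disk(avg_size, cluster);

    printf("logical size : %llu bytes\n", (unsigned long long)logical);
    printf("size on disk : %llu bytes\n", (unsigned long long)physical);
    return 0;
}
```

If this is the cause, reformatting the external SSD with a smaller allocation unit (for example the 4 KB NTFS default) brings the size on disk close to the actual size.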

Related

Size limit of imported meshes in MeshLab

MeshLab crashes when importing an OBJ file larger than 10 MB.
Is there a size limit or a workaround?

FileSize Limit on Google Colab

I am working on the APTOS Blindness Detection challenge datasets from Kaggle. After uploading the files, when I try to unzip the train images folder, I get a file size limit error about the limited space available on RAM and disk. Could anyone please suggest an alternative for working with large image datasets?
If you get that error while unzipping the archive, it is a disk space problem. Colab gives you about 80 GB by default; try switching the runtime to GPU acceleration. Aside from better performance during certain tasks, such as using TensorFlow, you will get about 350 GB of available space.
From Colab go to Runtime -> Change runtime type, and in the hardware acceleration menu select GPU.
If you need more disk space, Colab now offers a Pro version of the service with double the disk space available in the free version.

How to evaluate a file's byte size versus the disk's real usage?

First, the basic fact: when you write 1 KB of data to disk, the real disk usage is not precisely 1 KB; it may be more than that.
I'm downloading files from a third-party file system to a local disk. For each download I get the file size from the HTTP response, for example 7 KB, and I add it to a disk-usage counter variable; when the counter reaches 90% of the disk's capacity, the program switches to another available disk automatically. But when I run the real job, the disk's real usage increases faster than the bytes actually received: before I have downloaded 2 TB (at 1 TB, for example), the disk's usage has already reached 2 TB (according to df -h). How can I precisely evaluate the disk usage while downloading files? Is there a ratio between a file's byte size and its real disk usage?
I considered using df -h to get the real-time disk usage, but creating an OS process for each file download is not realistic.
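One way to track real usage without spawning df for every download is to ask the filesystem directly. A minimal sketch, assuming a POSIX system, that reports a file's logical size versus the space it actually occupies (st_blocks is counted in 512-byte units) and the free space of the filesystem holding it via statvfs():

```c
#include <stdio.h>
#include <sys/stat.h>
#include <sys/statvfs.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    /* Logical size vs. blocks actually allocated for this file. */
    struct stat st;
    if (stat(argv[1], &st) == 0) {
        printf("logical size : %lld bytes\n", (long long)st.st_size);
        printf("size on disk : %lld bytes\n", (long long)st.st_blocks * 512);
    }

    /* Free space on the filesystem holding the file, no df needed. */
    struct statvfs vfs;
    if (statvfs(argv[1], &vfs) == 0) {
        unsigned long long free_bytes =
            (unsigned long long)vfs.f_bavail * vfs.f_frsize;
        printf("free space   : %llu bytes\n", free_bytes);
    }
    return 0;
}
```

Rounding each expected file size up to the filesystem block size (f_frsize) before adding it to the counter is usually a closer estimate than the raw HTTP Content-Length.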

recommended limit for memory management in Cocos2d?

Is there a recommended limit for image sizes in Cocos2d, beyond which they are too big and take too much memory? Are there some rules, in dimensions or in KB, to avoid slowing the game down (for the background image, or the graphics of my characters, even if I use a batch node)?
Thanks for your answer
First of all, memory usage has very, very, very little to do with performance. You can fill up the entire memory with textures and the game won't care. It's when you render them that there will be a difference. And then it only matters how much of the screen area you're filling with textures, how heavily they're overlaid, batched, rotated, scaled, shaded and alpha-blended. Those are the main factors in texture rendering performance. Memory usage plays a very insignificant role.
You may be interested in the cocos2d sprite-batch performance test I did and the general cocos2d performance analysis. Both come with test projects.
As for the maximum texture sizes, have a look at the table from my Learn Cocos2D book:
Note that iPhone and iPhone 3G devices have a 24 MB texture memory limit. 3rd generation (iPhone 3GS) and newer devices don't have that limit anymore. Also keep in mind that while a device may have 256 MB of memory installed, significantly less memory will be available for use by apps.
For example, on the iPad (1st gen) it is recommended not to use more than 100 MB of memory, with a maximum available memory peaking at around 125 MB and memory warning starting to show as early as around 80-90 MB memory usage.
With iOS 5.1 Apple also increased the maximum texture size of the iPad 2. The safest and most commonly usable texture size is 2048x2048 for Retina textures, and 1024x1024 for standard resolution textures.
Not in the table are iPod touch devices because they're practically identical to the iPhone models of the same generation, but not as easily identifiable. For example the iPod touch 3rd generation includes devices with 8, 16 and 32GB of flash memory, but the 8GB model is actually 2nd generation hardware.
The dimensional size of images and textures depends on the device you are supporting. Older devices supported small layers, I think 2048x2048 in size. I don't think such limitation exists on current devices.
For large images, you definitely want to use batch nodes as they have been tested to demonstrate the largest performance gain when dealing with large images. Though it is a good idea to use them for as much as possible in general.
As for how much you can load, it really depends on the device. The new iPad has 1 GB of memory and is designed to have much more available memory for large images. A first-gen iPad has 1/4 this amount of memory, and in my experience I start to see an app crash when it gets around 100 MB of memory used (as confirmed using Instruments).
The trick is to use only as much memory as you need for the current app's operation, then release it when you move to a new scene or new set of images/sprites/textures. You could for example have very large tiled textures where only the tiles nearest the viewport are loaded into memory. You could technically have an infinite sized view that stretches forever if you remove from memory those parts of the view that are not visible onscreen.
And of course when dealing with lots of resources, make sure your app delegate responds appropriately to its memory warnings.
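The "load only the tiles near the viewport" idea from the answer above boils down to a small bookkeeping problem: given the viewport rectangle and a tile size, compute the range of tile indices that intersect it, keep those loaded, and release the rest. A minimal sketch in C with hypothetical tile and world sizes (the actual loading/unloading would call into your engine's texture cache):

```c
#include <stdio.h>

#define TILE_SIZE   512   /* hypothetical tile edge length in pixels */
#define WORLD_TILES 64    /* hypothetical world width/height in tiles */

/* Compute the inclusive range of tile indices overlapped by the viewport. */
static void visible_tiles(float vx, float vy, float vw, float vh,
                          int *x0, int *y0, int *x1, int *y1)
{
    *x0 = (int)(vx / TILE_SIZE);
    *y0 = (int)(vy / TILE_SIZE);
    *x1 = (int)((vx + vw) / TILE_SIZE);
    *y1 = (int)((vy + vh) / TILE_SIZE);

    /* Clamp to the world bounds. */
    if (*x0 < 0) *x0 = 0;
    if (*y0 < 0) *y0 = 0;
    if (*x1 >= WORLD_TILES) *x1 = WORLD_TILES - 1;
    if (*y1 >= WORLD_TILES) *y1 = WORLD_TILES - 1;
}

int main(void)
{
    int x0, y0, x1, y1;
    /* A 960x640 viewport whose origin is at world position (3000, 1200). */
    visible_tiles(3000.0f, 1200.0f, 960.0f, 640.0f, &x0, &y0, &x1, &y1);
    printf("keep tiles x=%d..%d, y=%d..%d loaded; release the rest\n",
           x0, x1, y0, y1);
    return 0;
}
```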
As per my knowledge, a batch node of size 1024x1024 takes around 4 MB of texture memory, and an application has a maximum limit of 24 MB. So the game slows down as you approach that 24 MB and crashes after that. To avoid slowness I used a maximum of 4 batch nodes at one time, i.e. 16 MB; the remaining 8 MB was left for variables and other data. Before using more batch nodes I used to clean memory and remove unused batch nodes. I don't know about the memory limit on the 4S, but in the case of the iPhone 4 this is what I learned.
With this logic in mind, my game used to run smoothly.
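The 4 MB figure follows directly from the texture format arithmetic: an uncompressed RGBA8888 texture costs width x height x 4 bytes. A small sketch of that calculation (the sizes and formats below are just illustrative values):

```c
#include <stdio.h>

/* Memory cost of an uncompressed texture: width * height * bytes per pixel. */
static unsigned long texture_bytes(unsigned width, unsigned height,
                                   unsigned bytes_per_pixel)
{
    return (unsigned long)width * height * bytes_per_pixel;
}

int main(void)
{
    /* RGBA8888 = 4 bytes/pixel, RGBA4444/RGB565 = 2 bytes/pixel. */
    printf("1024x1024 RGBA8888: %lu bytes (~4 MB)\n",
           texture_bytes(1024, 1024, 4));
    printf("2048x2048 RGBA8888: %lu bytes (~16 MB)\n",
           texture_bytes(2048, 2048, 4));
    printf("1024x1024 RGBA4444: %lu bytes (~2 MB)\n",
           texture_bytes(1024, 1024, 2));
    return 0;
}
```

Switching large textures to a 16-bit pixel format halves their memory footprint, which is one of the easier ways to stay under the limits mentioned above.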

providing more heap in Keil

I am working on an MCB2300 board (with the LPC2378 processor) and using Keil uVision4. In my program I allocate dynamic memory using the malloc() function. As all dynamic content is stored on the heap, I need to ensure that the required heap size is allocated. The default value for the heap in my startup file (LPC2300.s) is 0x00000800. In my application I read an image (BMP format) and store the pixel values in a matrix, and the matrix is created dynamically according to the size of the input image. The maximum heap value I can set in my startup file is 0x000072FF. With this heap size I was able to read a 44 x 33 image successfully; beyond this size, memory is not allocated. I need to read an image with dimensions of at least 100 x 100. My available RAM is 32K.
These are the output values after I compile my code:
Program Size: Code=30664 RO-data=1220 RW-data=132 ZI-data=37628
How can I provide additional heap?
Is it possible to store heap memory on an SD/MMC card or in the external memory bank provided for the LPC2378? Please help me solve this problem.
If your board has an external RAM chip, you can use it for the heap.
But if there is no external RAM, there is no way to increase the heap size beyond the internal RAM size.
You could write some variant of a virtual memory driver to use the SD/MMC card as a memory device. But since your device has no MMU (memory management unit), such a driver would be extremely complex and extremely slow, so it is not an option.
Also, with a heap of 0x72FF (29439) bytes, you can hold a 99x99 RGB24 BMP image there: 99*99*3 = 29403 bytes.
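One way to make the most of that limited heap is to allocate the pixel matrix as a single contiguous block instead of one malloc() per row, since every separate allocation carries its own bookkeeping overhead in the runtime heap. A minimal sketch, assuming 24-bit pixels and hypothetical dimensions:

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define BYTES_PER_PIXEL 3   /* RGB24 BMP data */

int main(void)
{
    unsigned width  = 99;   /* hypothetical image dimensions */
    unsigned height = 99;

    /* One contiguous block for the whole image: width * height * 3 bytes.
       Avoids the per-allocation overhead of a separate malloc() per row. */
    size_t   bytes  = (size_t)width * height * BYTES_PER_PIXEL;
    uint8_t *pixels = malloc(bytes);
    if (pixels == NULL) {
        printf("heap too small for %ux%u image (%u bytes needed)\n",
               width, height, (unsigned)bytes);
        return 1;
    }

    /* Pixel (x, y) lives at pixels[(y * width + x) * BYTES_PER_PIXEL]. */
    unsigned x = 10, y = 20;
    uint8_t *px = &pixels[((size_t)y * width + x) * BYTES_PER_PIXEL];
    px[0] = 255;  /* B */
    px[1] = 0;    /* G */
    px[2] = 0;    /* R */

    free(pixels);
    return 0;
}
```

Note that a 100x100 RGB24 image needs 100*100*3 = 30000 bytes, slightly more than the 0x72FF heap can hold, so on top of this you would still need to free a little more internal RAM (for example by trimming other ZI data or the stack) to reach that size.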