I'm wondering how many templates I can create for product and category pages. I couldn't find this on the Platform Limits page.
There's a limit of 2500 total files in the theme bundle, but aside from that, there isn't a limit specific to custom product and category templates. A few other boundaries that might be helpful:
Max bundle size: 50MB
Max size for any single file within the bundle: 5MB
Max size of the /templates or parsed/templates directories within the bundle: 1MB
Max unzipped bundle size: 100MB
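If you want a rough pre-flight check against those numbers, a sketch along these lines works on the unzipped theme directory (the path is a placeholder, and the 50MB limit applies to the zipped bundle, so that one still has to be checked on the zip itself):

# bundle_check.py - rough sanity check of an unzipped theme directory
# against the limits listed above
import os

THEME_DIR = "./my-theme"          # placeholder path to the unzipped theme
MAX_FILES = 2500                  # total files in the bundle
MAX_SINGLE_FILE = 5 * 1024**2     # 5MB per file
MAX_TEMPLATES_DIR = 1 * 1024**2   # 1MB for the templates directory
MAX_UNZIPPED = 100 * 1024**2      # 100MB unzipped

total_files = 0
total_size = 0
templates_size = 0

for root, _, files in os.walk(THEME_DIR):
    for name in files:
        path = os.path.join(root, name)
        size = os.path.getsize(path)
        total_files += 1
        total_size += size
        if size > MAX_SINGLE_FILE:
            print(f"Over 5MB: {path} ({size} bytes)")
        if os.path.relpath(root, THEME_DIR).startswith("templates"):
            templates_size += size

print(f"{total_files} files (limit {MAX_FILES}), "
      f"{total_size} bytes unzipped (limit {MAX_UNZIPPED}), "
      f"templates dir {templates_size} bytes (limit {MAX_TEMPLATES_DIR})")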
I can't upload files over 2 MB.
I have seen various procedures in different forums, but unfortunately none of them work in TYPO3 10.4.14.
Does anyone know how I can increase the file limit of 2MB?
Many thanks in advance
This is not caused by TYPO3 or any TYPO3 setting.
The 2MB limit is PHP's default value for upload_max_filesize. So you (or your hosting provider) have to raise this limit (together with post_max_size) to a more contemporary value.
https://www.php.net/manual/en/features.file-upload.common-pitfalls.php
There used to be a setting, $GLOBALS['TYPO3_CONF_VARS']['BE']['maxFileSize'], with a default value of 10MB. It was removed in TYPO3 v7.6.0 to keep TYPO3 in line with the PHP settings (https://forge.typo3.org/issues/71110).
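If you have access to php.ini, the change is a small edit like this (100M is only an example; post_max_size must be at least as large as upload_max_filesize, and the web server or PHP-FPM usually needs a reload afterwards):

; php.ini - example upload limits, adjust to your needs
upload_max_filesize = 100M   ; per-file upload limit, PHP's default is 2M
post_max_size = 100M         ; whole request body, must be >= upload_max_filesize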
I am trying to deploy my first instant app, but keep getting this error in the Google Play Console, which keeps the 'Deploy' button disabled:
Your Instant App APKs contains an APK that is greater than the maximum size of 4364793 bytes
while the android studio APK profiler tools shows:
raw size: 3.5 MB
Download size: 3.5 MB
APK size: 3.8 MB
Furthermore, I noticed that the maximum size kept decreasing as I reduced the APK's size (it was initially 4.2 MB).
I would like to get help if someone has a clue on what is going wrong.
I was a little thrown off by the message "APK contains an APK", thinking that you might have some exotic app. But I don't see that string in the publishing code, so I think you're running into a more typical error.
The Instant App format is a ZIP file containing APKs. Each of these APKs must be under 4MB.
Additionally, if your app is split up into feature APKs and a base APK, then any combination of feature + base must also be under 4MB.
If you use configuration APKs, then those also contribute to the limit, but I don't think you are.
Off the top of my head, I'm not sure what APK Profiler is telling you for instant apps, whether it's the size of the whole ZIP, or of a single APK. I could be wrong, but I don't think it knows about the calculation of our 4MB limit. One easy way to check the sizes without using any fancy tools is to just unzip the ZIP file and look at the sizes of the APKs on disk. Or use APK Analyzer.
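As a sketch of that manual check (instantapp.zip is a placeholder name, and 4MB here is just the round figure from the FAQ rather than the exact number in your error message):

# list the APKs inside the instant app ZIP and flag anything over ~4MB
import zipfile

LIMIT = 4 * 1024 * 1024   # rough per-APK limit; the console enforces its own exact value

with zipfile.ZipFile("instantapp.zip") as bundle:
    for info in bundle.infolist():
        if info.filename.endswith(".apk"):
            status = "OK" if info.file_size <= LIMIT else "over the limit"
            print(f"{info.filename}: {info.file_size} bytes ({status})")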
One other thing: if you publish to dev track, there are no size restrictions there. Not that that solves your size problem, but it can be a useful way to play around with Instant Apps before tackling the size problem.
Source: Instant Apps FAQs: App size.
4MB is the designated size limit of Instant Apps in Android as stated in the documentation:
You must be mindful of the 4MB size limit for instant app downloads.
Also, each feature needs to have a single activity as its entry point.
What I can suggest is to check the Reduce APK Size docs. They suggest removing unused resources, minimizing resource use from libraries, and much more.
I want to know the best way to handle/manage our product images when we import products from a CSV in PrestaShop 1.6. Does PrestaShop provide a place to upload many images, or do we have to upload them to an external website (and if so, which one)?
Maybe this question is too general, but when I google it I don't get a clear answer. I appreciate your answers.
Newer PrestaShop versions use a new storage architecture for pictures. This system makes working with images much faster and keeps them in order. Images are stored in the /img/p folder, in subfolders that correspond to the image ID.
Basically, you avoid having 100,000 pictures in the same “/img/p” folder. Instead, the pictures are placed into subfolders within the “/img/p” directory, one directory level per digit of the image ID (e.g. “/img/p/1/2/” for the image with ID 12, or “/img/p/7/6/5/4/7/” for the image with ID 76547).
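If you need to locate the images yourself, for example while preparing a CSV import, the scheme is easy to reproduce. A small sketch (the function name is only for illustration, not a PrestaShop API):

def prestashop_image_dir(image_id, img_root="/img/p"):
    # One directory level per digit of the image ID, e.g. 76547 -> 7/6/5/4/7
    return img_root + "/" + "/".join(str(image_id)) + "/"

print(prestashop_image_dir(12))      # /img/p/1/2/
print(prestashop_image_dir(76547))   # /img/p/7/6/5/4/7/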
We are in the process of building a system which allows users to upload multiple images and videos to our servers.
The team I'm working with have decided to save all the assets belonging to a user in a folder named using the user's unique identifier. This folder in turn will be a sub-folder of our main assets folder on the file server.
The file structure they have proposed is as follows:
[asset_root]/userid1/assets1
[asset_root]/userid1/assets2
[asset_root]/userid2/assets1
[asset_root]/userid2/assets2
etc.
We are expecting to have thousands, or possibly a million+, users over the lifetime of this system.
I always thought that it wasn't a good idea to have many sub-folders in a single location and suggested a year/month/day approach as follows:
[asset_root]/2010/11/04/userid1/assets1
[asset_root]/2010/11/04/userid1/assets2
[asset_root]/2010/11/04/userid2/assets1
[asset_root]/2010/11/04/userid2/assets2
etc.
Does anyone know which of the above approaches would be better suited for this many assets? Is there a better method to organize images/videos on a server?
The system in question will be a Windows IIS 7.5 server with a SAN.
Many thanks in advance.
In general you are correct, in that many file systems impose a limit on the number of files and folders which may be in one folder. If you hit that limit with the number of users you have, you're in trouble.
In general, I would simply use a UUID for each image, with some dimension of partitioning, e.g. a hash of ABCDEFGH would end up at [asset_root]/ABC/DEFGH. Using a hash gives you a greater degree of assurance about the number of files which will end up in each folder, and prevents you from having to worry about, for example, not knowing which month an image you need was stored in.
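A rough sketch of that idea (the asset root and the choice of hash are placeholders, not part of the suggestion above):

import hashlib
import os
import uuid

ASSET_ROOT = r"D:\assets"   # stands in for [asset_root]

def partitioned_path(prefix_len=3):
    # Name each asset by a UUID-derived hash and split the hash into
    # a folder prefix plus a file name, e.g. ABCDEFGH -> ABC/DEFGH
    digest = hashlib.sha1(uuid.uuid4().bytes).hexdigest().upper()
    return os.path.join(ASSET_ROOT, digest[:prefix_len], digest[prefix_len:])

print(partitioned_path())   # e.g. D:\assets\9F2\0C4A81...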
I'm presuming your file system is NTFS? If so, you've got a limit of 4,294,967,295 files on the disk - the limit of files in a folder is the same. If you have on the order of millions of users you should be fine, though you might want to consider having only one folder per user instead of several, as your example indicates.
How many files can a windows server 2008 r2 directory safely hold?
I'm thinking in terms of a website that has image galleries. Say there is one directory that holds all the thumbnails and a different directory that holds the full size images. How many pairs of images can be safely stored?
Or, if there isn't a good cut-and-dry answer, should I just try it with 30,000 images?
If your server is using NTFS for its volume file system, you aren't limited to a particular number of files per directory per se; rather, you are limited to a certain number of files/directories per volume.
For NTFS, the relevant entry in the NTFS Size Limits table is:
Files per volume: 4,294,967,295 (2^32 minus 1 file)
Of course, that says nothing about performance, and there are other considerations that can come into play. With 30000, you shouldn't worry. When you get into the millions, you might have to start restructuring.
Edit, to address scaling/performance:
Technically speaking, the NTFS file system uses a global MFT that keeps track of all the files (directories are files too, mostly used as a logical representation for the end user), so every time you modify the volume, that change is reflected in the MFT.
When you start having a single directory with large numbers of files, one of the recommended procedures is to disable automatic 8.3 name generation. From the technet article I linked above:
Every time you create a file with a long file name, NTFS creates a second file entry that has a similar 8.3 short file name. A file with an 8.3 short file name has a file name containing 1 to 8 characters and a file name extension containing 1 to 3 characters. The file name and file name extension are separated by a period.
So if you are constantly adding to a single directory that already holds a large number of files, the system has to generate a short name for each new file, and that can lead to performance degradation. Since you are storing images, it is very likely that a lot of the files have similar names at the beginning, like imageblahblahblah.
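If that becomes a problem, 8.3 name generation can be disabled per volume, roughly like this (double-check the fsutil syntax for your Windows version; it only affects newly created files, and existing short names remain until stripped):

rem Show the current 8.3 name setting for the D: volume
fsutil 8dot3name query D:

rem Disable 8.3 short-name generation on D: (1 = disabled)
fsutil 8dot3name set D: 1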
For file look-up performance, even for large directories NTFS should be reasonably fast because of the underlying B-Tree implementation.
Also check out this thread: NTFS performance and large volumes of files and directories