Your Instant App APKs contains an APK that is greater than the maximum size of 4364793 bytes - android-instant-apps

I am trying to deploy my first instant app, but keep getting this error on the google play console that keeps the 'Deploy' button disabled:
Your Instant App APKs contains an APK that is greater than the maximum size of 4364793 bytes
while the Android Studio APK profiler tool shows:
Raw size: 3.5 MB
Download size: 3.5 MB
APK size: 3.8 MB
Furthermore, I noticed that the maximum size kept decreasing as I reduced the APK's size (it was initially 4.2 MB).
I would appreciate it if someone has a clue about what is going wrong.

I was a little thrown off by the message "APK contains an APK", thinking that you might have some exotic app. But I don't see that string in the publishing code, so I think you're running into a more typical error.
The Instant App format is a ZIP file containing APKs. Each of these APKs must be under 4MB.
Additionally, if your app is split up into feature APKs and a base APK, then any combination of feature + base must also be under 4MB.
If you use configuration APKs, then those also contribute to the limit, but I don't think you are.
Off the top of my head, I'm not sure what APK Profiler is telling you for instant apps, whether it's the size of the whole ZIP, or of a single APK. I could be wrong, but I don't think it knows about the calculation of our 4MB limit. One easy way to check the sizes without using any fancy tools is to just unzip the ZIP file and look at the sizes of the APKs on disk. Or use APK Analyzer.
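If it helps, here is one way to script that manual check. This is only a rough sketch in Python; the ZIP path, the assumption that the base APK has "base" in its name, and the exact byte limit are all my own guesses rather than anything the console documents:

```python
# Sketch: list the APKs inside an Instant App ZIP and flag anything over
# the ~4 MB per-APK limit described above. Paths and naming are assumptions.
import zipfile

LIMIT = 4 * 1024 * 1024  # the exact limit the console reports may differ

with zipfile.ZipFile("instantapp.zip") as bundle:   # hypothetical file name
    apks = [i for i in bundle.infolist() if i.filename.endswith(".apk")]
    for info in apks:
        flag = "OVER LIMIT" if info.file_size > LIMIT else "ok"
        print(f"{info.filename}: {info.file_size} bytes ({flag})")

    # Also check each base + feature combination against the limit,
    # assuming the base APK can be recognized by its file name.
    base = next((i for i in apks if "base" in i.filename), None)
    if base:
        for info in apks:
            if info is not base and base.file_size + info.file_size > LIMIT:
                print(f"base + {info.filename} exceeds the limit")
```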
One other thing: if you publish to the dev track, there are no size restrictions there. Not that that solves your size problem, but it can be a useful way to play around with Instant Apps before tackling it.
Source: Instant Apps FAQs: App size.

4MB is the designated size limit of Instant Apps in Android as stated in the documentation:
You must be mindful of the 4MB size limit for instant app downloads.
Also, each feature needs to have a single activity as its entry point.
What I can suggest is to check the Reduce APK Size docs. It suggests removing unused resources, minimizing resource use from libraries, and much more.

Related

Realm objective c - really huge db file size - 64GB

We recently planned to switch from SQLite to Realm in our macOS and iOS apps due to a db file corruption issue with SQLite, so we started with the macOS app first. All coding changes went smoothly and the app started working fine.
Background about the app and DB usage - the app uses the DB very heavily, performing a great many reads and writes every minute and saving big XMLs to it. Each minute it writes/updates around 10-12 records (at most) with XML and reads 25-30 records. After each read it deletes the data along with the XML from the database, and my expectation was that once data is deleted it should free up space and reduce the file size, but it looks like it grows continuously.
To test the new DB changes we kept the app running for 3-4 days; the DB file size went up to 64.42 GB and the app started getting slow. Please refer to the attached screenshot.
To debug further, I started the app with a new DB file whose size was 4 KB, but within 5 minutes it grew to 295 KB and never shrank, even though records were continuously added and deleted.
To clarify further, the app uses NSThreads to perform various operations, and those threads write and read data to the DB with proper begin/commit transactions. I also read 'Large File Size' at https://realm.io/docs/java/latest/#faq and tried to find compactRealm, but I can't find it in Objective-C.
Can anybody please advise.
Update - I Give up on Realm
After 15 days of effort, I have finally stopped using Realm and started fixing/working around the db file corruption issue with SQLite instead. The huge Realm DB file issue was fixed by making changes to the threading code, but then I started getting a Too many open files error after running the app for 7-8 hours.
I debugged for a whole week and made all possible changes, and at some point everything looked good, as Xcode was not showing any open files. But then I started getting the Too many open files crash again, debugged with Instruments, and found there were many open files for the Realm database, lock, commit and cv files.
I am sure there are no leaks in the app, and Xcode does not show those open files under Disk usage either. I decided to invoke the lsof command in code before and after Realm calls; most of the time it doesn't increase the open file count, but sometimes in between it does. In my app it went from 120 files to 550 in around 6 hours. Xcode looks all fine via Disk usage, but Instruments shows the open files.
No good support from the Realm team either; I sent them an email and got just one response. I made many changes to the code following their suggestions and it doesn't work at all, so I gave up on it. I think it's good for small apps only.

How chunk file upload works

I am working on file upload and am really wondering how chunked file upload actually works.
I understand that the client sends data to the server in small chunks instead of the complete file at once, but I have a few questions about this:
For the browser to divide the whole file into chunks and send them, will it read the complete file into memory? If yes, then again there is a chance of running out of memory and crashing the browser for big files (say > 10 GB).
How do cloud applications like Google Drive or Dropbox handle such big file uploads?
If multiple files are selected for upload and all are larger than 5-10 GB, does the browser keep all the files in memory and then send them chunk by chunk?
Not sure if you're still looking for an answer; I was in your position recently, and here's what I've come up with, hope it helps: Deal chunk uploaded files in php
During uploading, if you print out the request on the backend, you will see three parameters: _chunkNumber, _totalSize and _chunkSize. With these parameters it's easy to decide whether this chunk is the last piece; if it is, assembling all of the pieces into a whole shouldn't be hard (see the sketch below).
As for the JavaScript side, ng-file-upload has a setting named "resumeChunkSize" where you can enable chunk mode and set the chunk size.
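As a rough illustration of that backend logic (in Python rather than PHP; the storage path and function shape are my own assumptions, only the _chunkNumber/_chunkSize/_totalSize convention comes from above):

```python
# Hypothetical sketch: store each incoming chunk, and once all chunks have
# arrived, stitch them back together in order.
import math
import os

UPLOAD_DIR = "/tmp/uploads"  # assumed storage location

def handle_chunk(upload_id, chunk_number, chunk_size, total_size, data):
    """Save one chunk; when all chunks are present, assemble the final file."""
    part_dir = os.path.join(UPLOAD_DIR, upload_id)
    os.makedirs(part_dir, exist_ok=True)
    # Zero-padded names so a plain sort gives chunk order.
    with open(os.path.join(part_dir, f"{chunk_number:06d}.part"), "wb") as f:
        f.write(data)

    total_chunks = math.ceil(total_size / chunk_size)
    parts = sorted(os.listdir(part_dir))
    if len(parts) == total_chunks:  # this was the last piece
        with open(os.path.join(UPLOAD_DIR, upload_id + ".bin"), "wb") as out:
            for name in parts:      # reassemble the pieces as a whole
                with open(os.path.join(part_dir, name), "rb") as part:
                    out.write(part.read())
```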

Reduce image size of multiple images via ssh

I have an e-commerce website I made for a client.
As with any e-commerce site, there are a lot of pictures.
About a hundred of these pictures were uploaded by me, provided by my client.
The other 400 were uploaded by client.
The problem is that the first set of images, which my client provided to me, were about 100 KB each, which is not such a big deal. The second set of images, the ones my client uploaded, were about 5-9 MB each. Obviously I didn't see this until it was too late.
So my question is this: how can I reduce the size of all those load-heavy images to something around 100-200 KB through ssh/command line/PHP?
I'm also talking about re-scaling the images to something smaller (currently they are about 3700px x 5600px).
Please note: I don't need a solution to re-scale the images when they are being uploaded.
I need a solution to re-scale the images that are already on the server.
Assuming your server is a Unix machine, you can use the ImageMagick convert tool:
http://doc.ubuntu-fr.org/imagemagick
You can also use PHP+GD, see:
http://fr.php.net/manual/en/book.image.php
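If you'd rather script it once over the files already on the server, here is a rough sketch in Python with Pillow (an alternative to the GD or ImageMagick routes above, not a replacement for them); the upload path, target width and JPEG quality are guesses you would need to adjust:

```python
# Walk the upload folder and shrink every JPEG: downscale the ~3700px
# originals and recompress to get roughly 100-200 KB files.
import os
from PIL import Image

UPLOAD_DIR = "/var/www/uploads"   # hypothetical path to the product images
MAX_WIDTH = 1200                  # assumed target width
QUALITY = 80                      # assumed JPEG quality

for root, _, files in os.walk(UPLOAD_DIR):
    for name in files:
        if not name.lower().endswith((".jpg", ".jpeg")):
            continue
        path = os.path.join(root, name)
        with Image.open(path) as img:
            if img.width > MAX_WIDTH:
                ratio = MAX_WIDTH / img.width
                img = img.resize((MAX_WIDTH, int(img.height * ratio)))
            # Overwrite in place with stronger compression.
            img.save(path, "JPEG", quality=QUALITY, optimize=True)
```

Back up the originals before running anything like this, since it overwrites the files in place.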

Download large amount of images with objective-C

I'm currently developing an order entry application for my company. This means I need to download approximately 1900 product images to the iPad, and that's just the normal images. I also need to download an equal amount of thumbnails. The reason for downloading the images to the iPad instead of just displaying them from a given URL is that our reps wander into large stores which often don't have stable internet connections.
What would be the best course of action? The images are stored on our servers, but you need to be authenticated using Basic Auth before you can access those. I have thought of just downloading them one-by-one, which is tedious, or group them together on the server as a zip-file but that would be a large file.
Downloading one-by-one is a valid option. I have done projects with similar specs, so here is what I advise:
Use a 3rd party library to help you with downloading the images, MKNetworkKit for example. If you feel comfortable enough, NSURLConnection is more than enough.
Store the images in the application sandbox.
Instead of downloading the thumbnails, just create them on the fly when you need them (lazy pattern, sketched below), unless your thumbnails are somewhat different from the originals (some special effect).
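The overall flow looks roughly like this (sketched in Python for brevity rather than Objective-C; the URL layout, credentials and cache folder are all made up, and in the real app the network calls would go through NSURLConnection or MKNetworkKit as above):

```python
# Sketch of one-by-one download with Basic Auth plus lazy thumbnail creation.
import base64
import os
import urllib.request

from PIL import Image                       # assumption: used only for thumbs

BASE_URL = "https://example.com/products"   # hypothetical image URL scheme
CACHE_DIR = "image_cache"                   # stands in for the app sandbox
CREDS = base64.b64encode(b"user:password").decode()  # Basic Auth credentials

def fetch_image(product_id):
    """Download one full-size image if it is not cached yet."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    local = os.path.join(CACHE_DIR, f"{product_id}.jpg")
    if not os.path.exists(local):           # only hit the network once per image
        req = urllib.request.Request(
            f"{BASE_URL}/{product_id}.jpg",
            headers={"Authorization": f"Basic {CREDS}"},
        )
        with urllib.request.urlopen(req) as resp, open(local, "wb") as out:
            out.write(resp.read())
    return local

def thumbnail(product_id, size=(128, 128)):
    """Lazy pattern: build the thumbnail from the full image only when asked."""
    thumb = os.path.join(CACHE_DIR, f"{product_id}_thumb.jpg")
    if not os.path.exists(thumb):
        with Image.open(fetch_image(product_id)) as img:
            img.thumbnail(size)
            img.save(thumb, "JPEG")
    return thumb
```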

iPad - how should I distribute offline web content for use by a UIWebView in application?

I'm building an application that needs to download web content for offline viewing on an iPad. At present I'm loading some web content from the web for test purposes and displaying this with a UIWebView. Implementing that was simple enough. Now I need to make some modifications to support offline content. Eventually that offline content would be downloaded in user selectable bundles.
As I see it I have a number of options but I may have missed some:
Pack content in a ZIP (or other archive) file and unpack the content when it is downloaded to the iPad.
Put the content in a SQLite database. This seems to require some 3rd party libs like FMDB.
Use Core Data. From what I understand this supports a number of storage formats including SQLite.
Use the filesystem and download each required file individually. OK, not really a bundle but maybe this is the best option?
Considerations/Questions:
What are the storage limitations and performance limitations for each of these methods? And is there an overall storage limit per iPad app?
If I'm going to have the user navigate through the downloaded content, what option is easier to code up?
It would seem like spinning up a local web server would be one of the most efficient ways to handle the runtime aspects of displaying the content. Are there any open source examples of this which load from a bundle like options 1-3?
The other side of this is the content creation and it seems like zipping up the content (option 1) is the simplest from this angle. The other options would appear to require creation of tools to support the content creator.
If you have control over the content, I'd recommend a mix of the first and the third options. If the content is created by you (like levels, etc.) then simply store it on the server, download it as a zip and store it locally. Use Core Data to store an index of the things you've downloaded, like the path of the folder each item is stored in and its name/origin/etc., but not the raw data. Databases are not meant to hold massive amounts of raw content, but rather structured data. And even if they can -- I'd not do so.
For your considerations:
Disk space is the only limit I know of on the iPad. However, databases tend to get slower as they grow too large. If you rarely scan through the data, use the file system directly -- it may prove faster and cheaper.
The index in Core Data could store all relevant data. You will have very easy and very quick access. Opening a piece of content will load it from the file system, which is quick, cheap and doesn't strain the index.
Why would you do that? Redirecting your WebView to a file:// URL will have the same effect, won't it?
Should be answered by now.
If you don't have control, then use the same approach as above but download each file separately, as suggested in option four. After unzipping, both cases are basically the same.
Please get back if you have questions.
You could create an XML file for each bundle, containing the path to each file in the bundle, and place it in a folder common to all bundles. When downloading, download and parse the XML first, then download each resource one by one. This will spare you the overhead of zipping and unzipping the content. Create a folder for each bundle locally and recreate the folder structure of the bundle there. This way the content will work online and offline without changes.
With a little effort, you could even keep track of file versions by including a version number in the XML file for each resource, so if your content has been partially updated only the files with changed version numbers have to be downloaded again.
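A rough sketch of that manifest idea, in Python rather than Objective-C; the XML layout and URL scheme below are assumptions, not a format the platform prescribes:

```python
# Sync one bundle: parse its manifest and download only the resources whose
# version number changed since the last sync.
import os
import urllib.request
import xml.etree.ElementTree as ET

# Assumed manifest format, one per bundle:
# <bundle name="guide">
#   <resource path="index.html"    version="7"/>
#   <resource path="css/style.css" version="3"/>
# </bundle>

def sync_bundle(manifest_url, local_root, local_versions):
    """local_versions maps resource path -> last downloaded version number."""
    with urllib.request.urlopen(manifest_url) as resp:
        root = ET.fromstring(resp.read())

    base_url = manifest_url.rsplit("/", 1)[0]
    for res in root.findall("resource"):
        path, version = res.get("path"), int(res.get("version"))
        if local_versions.get(path, -1) >= version:
            continue                                  # unchanged, skip download
        target = os.path.join(local_root, path)       # recreate folder structure
        os.makedirs(os.path.dirname(target), exist_ok=True)
        with urllib.request.urlopen(f"{base_url}/{path}") as r, open(target, "wb") as out:
            out.write(r.read())
        local_versions[path] = version
    return local_versions
```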