Actual iOS usage in app

#9
by Serjio42 - opened

Hi everyone.
I tried to use this model in a Flutter app via MediaPipe inference. It works on Android but crashes on iOS with a memory error (`Cannot allocate memory [type.googleapis.com/mediapipe.StatusList]`).
I tested on an iPhone 16 Pro Max (8 GB RAM).
As I understand it, the problem is that the model is too large to fit within the memory limits iOS imposes on apps.
I used the gemma-3n-E2B-it-int4.task file, which is 3.14 GB, and as far as I can see, gemma-3n-E2B-it-int4.litertlm is even bigger.

Could anyone suggest how to use this model on iOS?
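For context, this is roughly the code path that fails on iOS: the model is loaded through MediaPipe's LLM Inference task. A minimal Swift sketch, assuming the MediaPipeTasksGenAI library and that the .task file was bundled with the app (the file name and maxTokens value are illustrative; check the current MediaPipe docs for the exact API):

```swift
import MediaPipeTasksGenAI

// Assumption: the model file was added to the app bundle as
// "gemma-3n-E2B-it-int4.task".
guard let modelPath = Bundle.main.path(forResource: "gemma-3n-E2B-it-int4",
                                       ofType: "task") else {
    fatalError("Model file not found in bundle")
}

// Configure the LLM Inference task. Loading a ~3 GB model here is where
// the "Cannot allocate memory" crash occurs on iOS.
let options = LlmInference.Options(modelPath: modelPath)
options.maxTokens = 512

do {
    let llmInference = try LlmInference(options: options)
    let response = try llmInference.generateResponse(inputText: "Hello!")
    print(response)
} catch {
    print("Failed to run inference: \(error)")
}
```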

Hello, did you find a solution?
2) How do you ship the model — does it go into the app through the app package, or do you install it later? Are we allowed to have users download the model from us?
3) Optional: can you share your Flutter code?

Hi @HeyGoodEnough, I use the model from here: https://huggingface.co/google/gemma-3n-E2B-it-litert-preview/tree/main (the .task format), but I'm thinking about switching to .litertlm — I'm not sure whether it works properly, though. Worth a try.
The iOS solution, as I understand it, is to use the increased-memory entitlement, which requires the $100/year Apple Developer Program membership. I am still using it on Android only.
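For reference, the entitlement in question is Apple's Increased Memory Limit capability, which is only available to paid Apple Developer Program members. A sketch of the entitlements file — the key is the real Apple entitlement name, while the surrounding file is whatever your Xcode target already uses (e.g. Runner.entitlements in a Flutter project):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Raises the app's memory ceiling on supported devices;
         relevant when loading a multi-gigabyte model. -->
    <key>com.apple.developer.kernel.increased-memory-limit</key>
    <true/>
</dict>
</plist>
```

Note that even with this entitlement, the actual limit depends on the device, so it may still be tight for a 3+ GB model plus inference buffers.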
