Running an LLM locally on device

Hello forum,

I’m wondering if someone here has tried (successfully) running a tiny LLM locally? I’m still in the research phase, but I found this website which claims they managed to build such an app with a 4GB LLM on device. Any thoughts on the topic?


Yes. Using the flutter_gemma plugin, you can run Gemma 2B models locally on mobile.
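
For context, a rough sketch of what that looks like in Dart. The API names below (`FlutterGemmaPlugin.instance`, `init`, `getResponse`) are from an early version of the plugin and may differ in current releases, so treat this as an outline and check the flutter_gemma README. It also assumes the Gemma 2B model file has already been downloaded or bundled onto the device:

```dart
import 'package:flutter_gemma/flutter_gemma.dart';

Future<void> askGemma() async {
  final gemma = FlutterGemmaPlugin.instance;

  // Initialize the on-device inference engine. Parameter names and the
  // init flow vary between plugin versions; consult the plugin docs.
  await gemma.init(
    maxTokens: 512,
    temperature: 0.8,
    topK: 40,
  );

  // Inference runs fully on-device; no network call is involved.
  final response = await gemma.getResponse(
    prompt: 'Explain LLMs in one sentence.',
  );
  print(response);
}
```

Keep in mind that a 2B model still needs a fairly recent phone with enough RAM, and first-token latency is noticeably higher than a cloud API.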
