Demo: https://reddit.com/link/1r32vf8/video/1uq52gevc4jg1/player

Every time you use ChatGPT, Gemini, or Copilot, your conversations are sent to servers you don't control. Your questions about health, finances, relationships, and work problems all end up in someone else's database, training their next model. I wanted AI without the surveillance tax.

So I built LocalLLM, an Android & iOS app that downloads an AI model once, then runs 100% on your phone. After that first download, you can turn on airplane mode and chat forever.

What it actually does:
What it doesn’t do:
The only time it touches the internet is to download models from Hugging Face. After that, it's yours; airplane mode works perfectly. It works on most phones with 6GB+ RAM, and flagships run it really well. You can start with a model as small as 80MB 🙂

It's fully open source (MIT): https://github.com/alichherawalla/offline-mobile-llm-manager

An APK is available in the repo if you want to skip building from source. For iOS, for now you'll need to build it locally and sideload it; if there's enough interest I'll publish to the App Store.

Image gen takes about 6 seconds on iOS, and ~12 seconds on Android with the NPU, including the time to enhance the prompt.

Happy to answer any questions about what's happening under the hood.
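To give a rough idea of what "under the hood" means for the offline story: the app only needs the network for that first model fetch, and everything after runs from local storage. Here's a minimal Kotlin sketch of that one-time download, assuming a single-file model pulled from Hugging Face over plain HttpURLConnection; the helper name, file name, and URL placeholders are illustrative, not the app's actual download code.

```kotlin
import java.io.File
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical one-time download: pull a model file from Hugging Face into
// app-private storage. Once the file is cached, nothing here touches the
// network again, which is what makes airplane-mode chat possible afterwards.
// Call this off the main thread (Android forbids network I/O on it).
fun ensureModelCached(modelsDir: File, fileName: String, downloadUrl: String): File {
    val target = File(modelsDir, fileName)
    if (target.exists() && target.length() > 0) return target // already offline-ready

    modelsDir.mkdirs()
    val partial = File(modelsDir, "$fileName.part")
    val connection = URL(downloadUrl).openConnection() as HttpURLConnection
    try {
        connection.inputStream.use { input ->
            partial.outputStream().use { output -> input.copyTo(output) }
        }
    } finally {
        connection.disconnect()
    }
    if (!partial.renameTo(target)) error("could not move ${partial.name} into place")
    return target
}

// Illustrative usage (placeholders, not a real repo path or model name):
// val model = ensureModelCached(
//     modelsDir = File(context.filesDir, "models"),
//     fileName = "tiny-model-q4.gguf",
//     downloadUrl = "https://huggingface.co/<org>/<repo>/resolve/main/<file>.gguf"
// )
```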
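Similarly, the 6GB+ RAM guidance maps to a simple capability check on Android: ActivityManager reports total device memory, so larger models can be gated behind it. The threshold and helper below are my own sketch, not necessarily how LocalLLM decides.

```kotlin
import android.app.ActivityManager
import android.content.Context

// Hypothetical RAM gate: devices marketed as "6 GB" usually report a bit less
// than 6 GiB (some memory is reserved), so compare against a slightly lower bar.
fun hasEnoughRamForLargerModels(context: Context): Boolean {
    val activityManager =
        context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val memoryInfo = ActivityManager.MemoryInfo()
    activityManager.getMemoryInfo(memoryInfo)

    val totalGib = memoryInfo.totalMem / (1024.0 * 1024.0 * 1024.0)
    return totalGib >= 5.5
}
```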