
Offline LLM-Based Educational App #824

@Yannlagaf

Description


Hello,

I am exploring the development of an offline educational mobile app for students in areas where mobile internet access is limited or unreliable.

The app would let students (Grade 6 to university) download the courses for a single academic year as a pack.

Each pack would include a small LLM (or adapter) that runs fully offline on mid-range Android smartphones.

Once downloaded, the app should work 100% offline (no cloud access required), with good performance and minimal latency.

I want the LLM to answer questions based on the course material and help students work through exercises, with minimal hallucination.
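To make the "grounded in the course material" requirement concrete, here is a minimal sketch of what I have in mind: retrieve the most relevant chunks of the downloaded course pack and prepend them to the prompt before it reaches the on-device model. It is pure Python with a simple keyword-overlap retriever; the file path, chunk size, and example question are illustrative assumptions, not an existing implementation.

```python
import re
from collections import Counter

def chunk_text(text: str, words_per_chunk: int = 200) -> list[str]:
    """Split the course material into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + words_per_chunk])
            for i in range(0, len(words), words_per_chunk)]

def tokenize(text: str) -> Counter:
    """Lowercase bag-of-words representation of a text."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def top_chunks(question: str, chunks: list[str], k: int = 3) -> list[str]:
    """Rank chunks by how many tokens they share with the question."""
    q = tokenize(question)
    return sorted(chunks,
                  key=lambda c: sum((q & tokenize(c)).values()),
                  reverse=True)[:k]

# Hypothetical course-pack file; in the real app this would come from the
# downloaded year pack in device storage.
course = open("course_pack/physics_grade9.txt", encoding="utf-8").read()
question = "What is Newton's second law?"
context = "\n\n".join(top_chunks(question, chunk_text(course)))
prompt = (
    "Answer using only the course excerpt below.\n\n"
    f"{context}\n\nQuestion: {question}"
)
# `prompt` would then be passed to the on-device model.
```

Restricting the model's context to retrieved course text is the main lever I see for keeping hallucinations low without needing a large model.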

My questions:

Is this technically feasible on the typical mid-range smartphones used in these countries (3-8 GB RAM, ~128-256 GB storage)?

Which model architecture strategy (quantization, LoRA adapters, small fine-tuned model, etc.) would you recommend for this use case?
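For context on the RAM question, this is the back-of-envelope arithmetic I am working from; the 20% overhead for the KV cache, activations, and runtime buffers is an assumption, not a measurement:

```python
def weights_ram_gb(params_billions: float, bits_per_weight: float,
                   overhead: float = 1.2) -> float:
    """Approximate RAM to hold the quantized weights plus ~20% runtime overhead."""
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8) * overhead
    return bytes_total / (1024 ** 3)

for label, params, bits in [
    ("~1B params, 4-bit", 1.0, 4),
    ("~3B params, 4-bit", 3.0, 4),
    ("~7B params, 4-bit", 7.0, 4),
]:
    print(f"{label}: ~{weights_ram_gb(params, bits):.1f} GB")
# ~1B params, 4-bit: ~0.6 GB
# ~3B params, 4-bit: ~1.7 GB
# ~7B params, 4-bit: ~3.9 GB
```

If those figures are roughly right, a 4-bit model in the 1-3B range should fit comfortably in 3-8 GB of RAM, while 7B looks tight on the lower end.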

Thanks.

Metadata

Labels: question (Further information is requested)
