# MediaPipe LLM Inference Android Demo

## Overview

This is a sample app that demonstrates how to use the LLM Inference API to run common text-to-text generation tasks, such as information retrieval, email drafting, and document summarization.

This application must be run on a physical Android device to take advantage of the device GPU.
## How to Build the Demo App

### 1. Download the Code

To download the demo code, clone the git repository using the following command:

```
git clone https://github.com/google-ai-edge/mediapipe-samples
```

After downloading the demo code, you can import the project into Android Studio and run the app by following the instructions below.
### 2. Prerequisites

- The Android Studio IDE. This demo has been tested on Android Studio Hedgehog.
- A physical Android device with a minimum OS version of SDK 24 (Android 7.0, Nougat) and developer mode enabled.
### 3. Build and Run

To import and build the demo app:

- Download and install Android Studio.
- In Android Studio, select File > New > Import Project.
- Navigate to the demo app `android` directory and select it, for example: `.../mediapipe-samples/examples/llm_inference/android`
- If Android Studio requests a Gradle Sync, choose OK.
- Build the project by selecting Build > Make Project.

When the build completes, Android Studio displays a `BUILD SUCCESSFUL` message in the Build Output status panel.
To run the demo app:

- Ensure that your Android device is connected to your computer and developer mode is enabled.
- From Android Studio, run the app by selecting Run > Run 'app'.
## How to Use the Demo App

### 1. Select Model

The user first selects a model (e.g. DEEPSEEK_CPU for the DeepSeek model) from the model selection screen.
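The selectable models are defined in `Model.kt` as an enum that pairs each entry with its on-device path and download URL. A minimal sketch of that pattern (the entry name follows the DEEPSEEK_CPU example above, but the path and URL shown here are illustrative, not the actual values in the app):

```kotlin
// Illustrative sketch of a model-registry enum; the real entries live in
// app/src/main/java/com/google/mediapipe/examples/llminference/Model.kt
// and use different paths and URLs.
enum class Model(val path: String, val url: String) {
    DEEPSEEK_CPU(
        path = "/data/local/tmp/llm/deepseek.task",          // assumed path
        url = "https://huggingface.co/litert-community/example/resolve/main/deepseek.task" // placeholder URL
    );

    // Derive the expected local file name from the download URL.
    val fileName: String
        get() = url.substringAfterLast('/')
}
```

Keeping the path and URL together in one enum entry is what lets the app check whether a model already exists on disk before offering to download it.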
### 2. Download Model
If the model has not been downloaded previously, the app will download it from LiteRT on Hugging Face.
If authentication and license acknowledgment are required to access the model, the user will be prompted to sign in with their Hugging Face account and acknowledge the license if necessary.
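The download step itself can be sketched as a plain HTTP fetch into app-private storage. This is a minimal sketch only: the demo app's real downloader also attaches a Hugging Face access token for gated models and reports progress, both of which are omitted here, and the function name is an assumption, not the app's actual API.

```kotlin
import java.io.File
import java.net.URL

// Minimal sketch: stream a model file from `url` into `destDir`, keeping the
// file name from the URL so it matches the path the app expects. Skips the
// download when the file already exists locally.
fun downloadModel(url: String, destDir: File): File {
    val dest = File(destDir, url.substringAfterLast('/'))
    if (!dest.exists()) {
        URL(url).openStream().use { input ->
            dest.outputStream().use { output -> input.copyTo(output) }
        }
    }
    return dest
}
```

On Android, `destDir` would typically be the app's `filesDir`, so the model survives restarts without requiring external-storage permissions.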
### 3. Chat with Model
Once the model is downloaded, the user can interact with it by entering prompts and receiving responses.
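Under the hood, prompting goes through the LLM Inference API. A sketch of the core calls, assuming the `com.google.mediapipe:tasks-genai` dependency and an Android `Context`; the model path and token limit below are assumptions, and the demo app's actual wiring (streaming responses, UI state) lives in its source:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Sketch: create an inference engine from a local model file and run a
// single synchronous prompt. The path and max-token value are assumptions.
fun chatOnce(context: Context): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.task") // assumed path
        .setMaxTokens(1024)
        .build()
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse("Draft a short email declining a meeting.")
}
```

The demo app streams tokens to the UI as they are generated rather than blocking on a single response as this sketch does.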
## Reference
For more details, see the LLM Inference guide for Android.
## Downloading the Model Manually

Find the download URL in app/src/main/java/com/google/mediapipe/examples/llminference/Model.kt, download the model file, and place it in the assets directory, keeping the file name identical to the path referenced in Model.kt.
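The manual steps above can be sketched from the command line. The URL here is a placeholder, not a real model URL; substitute the actual `url` value from Model.kt, and note that gated models additionally require a Hugging Face token:

```shell
# Placeholder URL -- replace with the real url value from Model.kt.
MODEL_URL="https://example.com/path/to/model.task"

# The asset file name must match the path referenced in Model.kt.
ASSET_NAME="$(basename "$MODEL_URL")"
mkdir -p app/src/main/assets

# Actual download (requires network; gated models also need an auth token):
# curl -L "$MODEL_URL" -o "app/src/main/assets/$ASSET_NAME"
echo "target: app/src/main/assets/$ASSET_NAME"
```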
