From 72dbd3250b3a1bb3c87a42dd202e530d94270337 Mon Sep 17 00:00:00 2001
From: Ziang Wu <97337387+ZiangWu-77@users.noreply.github.com>
Date: Thu, 28 Mar 2024 21:42:10 +0800
Subject: [PATCH] Update MobileVLM-README.md

remove gguf links
---
 examples/llava/MobileVLM-README.md | 11 -----------
 1 file changed, 11 deletions(-)

diff --git a/examples/llava/MobileVLM-README.md b/examples/llava/MobileVLM-README.md
index 063b943ff..96b048525 100644
--- a/examples/llava/MobileVLM-README.md
+++ b/examples/llava/MobileVLM-README.md
@@ -20,17 +20,6 @@ After building, run: `./llava-cli` to see the usage. For example:
 -p "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: <image>\nWho is the author of this book? Answer the question using a single word or phrase. ASSISTANT:"
 ```

-## GGUF model
-If you just want to use it, fetch the gguf format weight from here:
-for MobileVLM-1.7B
-```
-git clone https://huggingface.co/guinmoon/MobileVLM-1.7B-GGUF
-```
-for MobileVLM_V2-1.7B
-```
-git clone https://huggingface.co/ZiangWu/MobileVLM_V2-1.7B-GGUF
-```
-
 ## Model conversion
 - Clone `mobileVLM-1.7B` and `clip-vit-large-patch14-336` locally: