Commit graph

3703 commits

Author SHA1 Message Date
李为
ca7e8ef19e fix clip_n_patch() allocation size error for 81-series omni-vlm models 2024-12-03 15:00:23 +08:00
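The allocation-size fix above concerns how much memory gets reserved per image from the CLIP patch math. A minimal sketch of the usual sizing pattern, borrowing clip_n_patches()/clip_n_mmproj_embd() from the upstream llava example; the 81-series-specific change itself is not shown in this log:

```cpp
// Sketch only: buffer sized from patch count and projector embedding dim.
// clip_n_patches()/clip_n_mmproj_embd() come from the upstream llava example;
// the exact sizing used by the omni-vlm fix is an assumption here.
#include "clip.h"
#include <cstdlib>

static float * alloc_image_embd(const struct clip_ctx * ctx_clip) {
    const int n_patches = clip_n_patches(ctx_clip);     // patches per image
    const int n_embd    = clip_n_mmproj_embd(ctx_clip); // projector output dim
    // An undersized buffer here overflows once n_patches grows -- the bug class
    // this commit title describes.
    return (float *) malloc(sizeof(float) * (size_t) n_patches * (size_t) n_embd);
}
```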
liwiii
97267e60bd
bug fix in common-nexa.cpp
gguf_free(ctx_gguf) is called twice at L155; this typo does not appear in the apollo repos, so it is just a tiny but fatal typo.
2024-12-03 11:36:59 +08:00
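For context on the double-free above, a minimal sketch of the pattern, assuming the typical GGUF metadata-loading flow from llama.cpp (the actual common-nexa.cpp code is not reproduced here):

```cpp
// Sketch of the double-free described in the commit message; the real
// common-nexa.cpp flow is assumed, not copied.
#include "ggml.h"   // gguf_init_from_file() / gguf_free()

static bool read_gguf_metadata(const char * fname) {
    struct gguf_init_params params = { /*.no_alloc =*/ true, /*.ctx =*/ nullptr };
    struct gguf_context * ctx_gguf = gguf_init_from_file(fname, params);
    if (ctx_gguf == nullptr) {
        return false;
    }

    // ... read keys/tensors from ctx_gguf ...

    gguf_free(ctx_gguf);
    // gguf_free(ctx_gguf);  // the duplicated call: freeing the same context
    //                       // twice is the "tiny but fatal typo" being fixed
    return true;
}
```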
T
0b15d2d745
fix conflicts (#32) 2024-12-02 15:41:12 +08:00
Zack Li
661b3f718c
Merge pull request #31 from NexaAI/teliu/dev
Upgrade to llama.cpp 74d73dc
2024-12-01 22:40:13 -08:00
Te993
a2c53052bd merge from master 2024-12-02 14:38:20 +08:00
Te993
809db95990 upgrade to llama.cpp 74d73dc 2024-12-02 14:24:50 +08:00
Yicheng Qian
3479f516ea update prompt template in wrapper 2024-11-22 14:01:42 -08:00
Zack Li
43f41a4c00
Merge pull request #28 from NexaAI/zack/vlm
Zack/vlm
2024-11-22 01:50:10 -08:00
zack Zhiyuan Li
fe8c7b45fd revert CMakeList 2024-11-22 09:08:44 +00:00
zack Zhiyuan Li
460212ac2a change template for inference 2024-11-22 09:06:15 +00:00
zack Zhiyuan Li
bbf1aaa7ed Merge remote-tracking branch 'origin' into zack/vlm 2024-11-22 09:04:28 +00:00
Zack Li
25190fefa2
Merge pull request #25 from NexaAI/weili/master-release
fix all mem leaks of qwen2audio example
2024-11-14 17:49:34 -08:00
李为
e4ca946c48 free heap memory allocated for omni_ctx in the omni_free() API
Most memory leaks in qwen2audio are now fixed.
2024-11-15 08:31:01 +08:00
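A hedged sketch of the fix pattern described above: the omni_free() API must release the heap-allocated context struct itself, not only its members. Only omni_ctx and omni_free() come from the commit message; the member layout below is hypothetical:

```cpp
// Hypothetical member layout; only the omni_ctx / omni_free() names are
// taken from the commit message.
struct omni_context {
    struct llama_context * ctx_llama = nullptr;
    // ... other sub-contexts and buffers ...
};

void omni_free(struct omni_context * ctx_omni) {
    if (ctx_omni == nullptr) {
        return;
    }
    // release members first, e.g. llama_free(ctx_omni->ctx_llama);

    delete ctx_omni;   // previously leaked: the context struct itself was never freed
}
```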
李为
8e2e630405 fix memory leaks identified with the leaks tool (still WIP) 2024-11-14 22:04:01 +08:00
李为
aad0167bc3 free() the audio embedding (memory leaks still detected) 2024-11-14 14:50:49 +08:00
Zack Li
b9845b4f63
Merge pull request #24 from NexaAI/weili/master-release
[memory leak] fixed a leak by freeing the projector
2024-11-13 17:16:39 -08:00
李为
fc25544867 [memory leak] fixed a leak by freeing the projector 2024-11-14 08:32:55 +08:00
Zack Li
98297afbd5
Merge pull request #22 from NexaAI/david/vulkan2
fix vulkan build bug for external build
2024-11-11 23:46:51 -08:00
Yicheng Qian
4e80184c32 fix vulkan build bug for external build 2024-11-11 23:35:11 -08:00
Zack Li
82dbdbdb40
Merge pull request #20 from NexaAI/weili/master-release
[omni-vlm] fixed the segmentation fault issue in nano-vlm-instruct (WIP)
2024-11-11 22:36:57 -08:00
李为
55953d35a4 [omni-vlm] fixed the segmentation fault issue in nano-vlm-instruct (WIP, current solution is still not perfect)
2024-11-12 14:17:42 +08:00
Zack Li
362bdf3292
Merge pull request #18 from NexaAI/weili/master-release
[omni-vlm example] reset model in every inference step to avoid nonsense output.
2024-11-11 12:24:12 -08:00
李为
7cf07df5e2 reset model in every inference step to avoid nonsense output. 2024-11-11 19:41:26 +08:00
Zack Li
6f0e8c3ee6
Create CODEOWNERS 2024-11-09 18:56:57 -08:00
Zack Li
21bc833273
Merge pull request #17 from NexaAI/weili/master-release
fix OCR template error.
2024-11-09 10:03:46 -08:00
李为
d04e354f2f fix OCR template error. 2024-11-09 20:35:55 +08:00
Perry Cheng
667a6d9838
Merge pull request #16 from NexaAI/perry/android-dev
changed download models and nlen
2024-11-08 15:23:54 -08:00
zhycheng614
ecfe0b487f changed download models and nlen 2024-11-08 23:22:26 +00:00
Zack Li
d5df53658f
Merge pull request #14 from NexaAI/teliu/android/dev
Add submodule llava for android sample
2024-11-08 13:25:00 -08:00
Zack Li
8c417282d5
Merge pull request #15 from NexaAI/weili/master-release
support all omni-vlm models in one omni-vlm/ folder.
2024-11-08 13:23:46 -08:00
李为
eb6d54679e update README.md 2024-11-08 22:05:57 +08:00
李为
3d9c63a3ff remove omni-vlm-v2/ 2024-11-08 21:00:42 +08:00
李为
16c22471e8 remove redundant omni-vlm-v2/ folder; all omni-vlm examples will be added to the omni-vlm/ folder. 2024-11-08 20:59:23 +08:00
liute110
b17684efb3 add include of llava.h 2024-11-08 16:07:50 +08:00
liute110
400fc2a4b0 add one more model 2024-11-08 16:06:37 +08:00
liute110
86c2233a38 add submodule llava for android 2024-11-08 16:02:45 +08:00
Zack Li
df5841b6b8
Merge pull request #13 from NexaAI/weili/master-release
add omni-vlm-v2 implementations (C++ & Python)
2024-11-07 00:48:21 -08:00
李为
3dfac7817f add returned string type (const char*) for nexa-omni-audio 2024-11-07 16:13:53 +08:00
Zack Li
20b9f02cee
Merge pull request #12 from NexaAI/weili/master-release
add returned string type (const char*) for nexa-omni-audio
2024-11-06 19:28:46 -08:00
李为
5edadffd88 add returned string type (const char*) for nexa-omni-audio 2024-11-07 11:19:50 +08:00
Zack Li
6a4cf0b983
Merge pull request #11 from NexaAI/weili/master-release
add returned string (const char*) for qwen2 audio
2024-11-05 23:27:47 -08:00
李为
b24a409e22 add returned string (const char*) for qwen2 audio 2024-11-06 15:24:26 +08:00
Zack Li
5574bda471
Merge pull request #10 from NexaAI/weili/master-release
add returned string (pure c const char* type) for omni-vlm inference api
2024-11-05 19:41:03 -08:00
李为
22da7bc379 add returned string (pure c const char* type) for omni-vlm inference api 2024-11-06 11:20:36 +08:00
Zack Zhiyuan Li
38c6fa3b8f enable lib to be exported in nexa SDK 2024-11-05 20:56:33 +00:00
Zack Li
983b4625ef
Merge pull request #8 from NexaAI/weili/master-release
add omni-vlm examples (C++ & Python)
2024-11-04 22:39:36 -08:00
Zack Li
91b3cafbb5
Merge pull request #6 from NexaAI/master-release-audio-lm
Remove C++20 code and support Microsoft Visual Studio compilation
2024-11-04 21:59:26 -08:00
Zack Zhiyuan Li
05853eb861 remove C++20 syntax 2024-11-04 23:03:49 +00:00
Zack Zhiyuan Li
d42e0371f8 remove C++20 style 2024-11-04 22:50:33 +00:00
Zack Zhiyuan Li
1419681089 disable <cxxabi.h> for MSC_VER 2024-11-04 05:45:52 +00:00
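The <cxxabi.h> change follows the common portability pattern: the header and abi::__cxa_demangle() are GCC/Clang-only, so MSVC builds have to compile them out. An illustrative guard, showing the general pattern rather than the exact diff:

```cpp
// Illustrative guard for the <cxxabi.h> change (general pattern, not the
// repository's actual diff): abi::__cxa_demangle() does not exist on MSVC.
#include <string>
#include <typeinfo>
#if !defined(_MSC_VER)
#   include <cxxabi.h>
#   include <cstdlib>
#endif

static std::string demangle(const char * name) {
#if !defined(_MSC_VER)
    int status = 0;
    char * demangled = abi::__cxa_demangle(name, nullptr, nullptr, &status);
    std::string result = (status == 0 && demangled != nullptr) ? demangled : name;
    std::free(demangled);
    return result;
#else
    return name;   // MSVC: fall back to the raw typeid().name() string
#endif
}
```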