Colabcpp improvements (#512)

* Aria2

* Aria2 Typo fix

* Streamlined Wget

* Streamlining Fix

* Back to .so downloading

* Crash colab if no GPU is present

* Created using Colaboratory

* Restore proper link

Colab overwrote the link; manually changing it back so people don't land on my branch.

* Restore file juggle

* Fixing the colab link... again
henk717 authored on 2023-11-05 03:19:09 +01:00, committed by GitHub
parent 5e5be717c3
commit 02595f9d21

colab.ipynb

@@ -3,8 +3,8 @@
   {
    "cell_type": "markdown",
    "metadata": {
-    "colab_type": "text",
-    "id": "view-in-github"
+    "id": "view-in-github",
+    "colab_type": "text"
    },
    "source": [
     "<a href=\"https://colab.research.google.com/github/LostRuins/koboldcpp/blob/concedo/colab.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
@@ -12,22 +12,28 @@
   },
   {
    "cell_type": "markdown",
-   "metadata": {},
+   "metadata": {
+    "id": "2FCn5tmpn3UV"
+   },
    "source": [
-    "## Welcome to the Official KoboldCpp Colab Notebook\r\n",
-    "It's really easy to get started. Just press the two **Play** buttons below, and then connect to the **Cloudflare URL** shown at the end. \r\n",
-    "You can select a model from the dropdown, or enter a **custom URL** to a GGUF model (Example: `https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf`)"
+    "## Welcome to the Official KoboldCpp Colab Notebook\n",
+    "It's really easy to get started. Just press the two **Play** buttons below, and then connect to the **Cloudflare URL** shown at the end.\n",
+    "You can select a model from the dropdown, or enter a **custom URL** to a GGUF model (Example: `https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf`)\n",
+    "\n",
+    "**Keep this page open and occationally check for captcha's so that your AI is not shut down**"
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {},
+   "metadata": {
+    "id": "QNaj3u0jn3UW"
+   },
    "outputs": [],
    "source": [
-    "#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\r\n",
-    "%%html\r\n",
-    "<b>Press play on the music player to keep the tab alive, then start KoboldCpp below</b><br/>\r\n",
+    "#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\n",
+    "%%html\n",
+    "<b>Press play on the music player to keep the tab alive, then start KoboldCpp below</b><br/>\n",
     "<audio src=\"https://raw.githubusercontent.com/KoboldAI/KoboldAI-Client/main/colab/silence.m4a\" controls>"
    ]
   },
@@ -40,35 +46,40 @@
    },
    "outputs": [],
    "source": [
-    "#@title <b>v-- Enter your model below and then click this to start Koboldcpp</b>\r\n",
-    "\r\n",
-    "Model = \"https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf\" #@param [\"https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/MythoMax-L2-13B-GGUF/resolve/main/mythomax-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/ReMM-SLERP-L2-13B-GGUF/resolve/main/remm-slerp-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Xwin-LM-13B-v0.2-GGUF/resolve/main/xwin-lm-13b-v0.2.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Stheno-L2-13B-GGUF/resolve/main/stheno-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q4_K_S.gguf\"]{allow-input: true}\r\n",
-    "Layers = 43 #@param [43]{allow-input: true}\r\n",
-    "ContextSize = 4096 #@param [4096] {allow-input: true}\r\n",
-    "\r\n",
-    "%cd /content\r\n",
-    "!git clone https://github.com/LostRuins/koboldcpp\r\n",
-    "%cd /content/koboldcpp\r\n",
-    "kvers = !(cat koboldcpp.py | grep 'KcppVersion = ' | cut -d '\"' -f2)\r\n",
-    "kvers = kvers[0]\r\n",
-    "!echo Finding prebuilt binary for {kvers}\r\n",
-    "!wget -O dlfile.tmp -c https://kcppcolab.concedo.workers.dev/?{kvers} && mv dlfile.tmp koboldcpp_cublas.so\r\n",
-    "!test -f koboldcpp_cublas.so && echo Prebuilt Binary Exists || echo Prebuilt Binary Does Not Exist\r\n",
-    "!test -f koboldcpp_cublas.so && echo Build Skipped || make koboldcpp_cublas LLAMA_CUBLAS=1\r\n",
-    "!cp koboldcpp_cublas.so koboldcpp_cublas.dat\r\n",
-    "!wget $Model -O model.ggml\r\n",
-    "!python koboldcpp.py model.ggml --usecublas 0 mmq --multiuser --gpulayers $Layers --contextsize $ContextSize --hordeconfig concedo 1 1 --remotetunnel\r\n"
+    "#@title <b>v-- Enter your model below and then click this to start Koboldcpp</b>\n",
+    "\n",
+    "Model = \"https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf\" #@param [\"https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter-GGUF/resolve/main/LLaMA2-13B-Tiefighter.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/MythoMax-L2-13B-GGUF/resolve/main/mythomax-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/ReMM-SLERP-L2-13B-GGUF/resolve/main/remm-slerp-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Xwin-LM-13B-v0.2-GGUF/resolve/main/xwin-lm-13b-v0.2.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Stheno-L2-13B-GGUF/resolve/main/stheno-l2-13b.Q4_K_M.gguf\",\"https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/resolve/main/mistral-7b-instruct-v0.1.Q4_K_S.gguf\"]{allow-input: true}\n",
+    "Layers = 43 #@param [43]{allow-input: true}\n",
+    "ContextSize = 4096 #@param [4096] {allow-input: true}\n",
+    "\n",
+    "import os\n",
+    "if not os.path.isfile(\"/opt/bin/nvidia-smi\"):\n",
+    "  raise RuntimeError(\"⚠Colab did not give you a GPU due to usage limits, this can take a few hours before they let you back in. Check out https://lite.koboldai.net for a free alternative (that does not provide an API link but can load KoboldAI saves and chat cards) or subscribe to Colab Pro for immediate access.⚠️\")\n",
+    "\n",
+    "%cd /content\n",
+    "!git clone https://github.com/LostRuins/koboldcpp\n",
+    "%cd /content/koboldcpp\n",
+    "kvers = !(cat koboldcpp.py | grep 'KcppVersion = ' | cut -d '\"' -f2)\n",
+    "kvers = kvers[0]\n",
+    "!echo Finding prebuilt binary for {kvers}\n",
+    "!wget -O dlfile.tmp -c https://kcppcolab.concedo.workers.dev/?{kvers} && mv dlfile.tmp koboldcpp_cublas.so\n",
+    "!test -f koboldcpp_cublas.so && echo Prebuilt Binary Exists || echo Prebuilt Binary Does Not Exist\n",
+    "!test -f koboldcpp_cublas.so && echo Build Skipped || make koboldcpp_cublas LLAMA_CUBLAS=1\n",
+    "!cp koboldcpp_cublas.so koboldcpp_cublas.dat\n",
+    "!apt install aria2 -y\n",
+    "!aria2c -x 10 -o model.ggml --allow-overwrite=true --file-allocation=none $Model\n",
+    "!python koboldcpp.py model.ggml --usecublas 0 mmq --multiuser --gpulayers $Layers --contextsize $ContextSize --hordeconfig concedo 1 1 --remotetunnel\n"
    ]
   }
  ],
 "metadata": {
  "accelerator": "GPU",
  "colab": {
-  "authorship_tag": "",
   "gpuType": "T4",
-  "include_colab_link": true,
   "private_outputs": true,
-  "provenance": []
+  "provenance": [],
+  "cell_execution_strategy": "setup",
+  "include_colab_link": true
  },
  "kernelspec": {
   "display_name": "Python 3",