{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "ColabKobold GPU",
"private_outputs": true,
"provenance": [],
"collapsed_sections": [],
"include_colab_link": true
},
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/GPU.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "kX9y5koxa58q"
},
"source": [
"# Welcome to KoboldAI on Google Colab, GPU Edition!\n",
"KoboldAI is a powerful and easy way to use a variety of AI-based text generation experiences. You can use it to write stories, blog posts, play a text adventure game, use it like a chatbot and more! In some cases it might even help you with an assignment or programming task (but always make sure the information the AI mentions is correct, as it loves to make things up).\n",
"\n",
"For more information about KoboldAI check out our GitHub readme: https://github.com/KoboldAI/KoboldAI-Client/blob/main/readme.md\n",
"\n",
"For the larger AI models (that are typically more coherent) check out our **[TPU edition](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)**!"
]
},
{
"cell_type": "code",
"metadata": {
"id": "ewkXkyiFP2Hq"
},
"source": [
"#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\n",
"%%html\n",
"<b>Press play on the music player to keep the tab alive, then start KoboldAI below (Uses only 13MB of data)</b><br/>\n",
"<audio src=\"https://henk.tech/colabkobold/silence.m4a\" controls>"
],
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"metadata": {
"id": "lVftocpwCoYw",
"cellView": "form"
},
"source": [
"#@title <b><-- Select your model below and then click this to start KoboldAI</b>\n",
"#@markdown You can find a description of the models below along with instructions on how to start KoboldAI.\n",
"\n",
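"# The three fields below are Colab form (#@param) inputs; advanced users can also type any Hugging Face model name into Model\n",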
"Model = \"Nerys 2.7B\" #@param [\"Nerys 2.7B\", \"AID 2.7B\", \"Erebus 2.7B\", \"Janeway 2.7B\", \"Picard 2.7B\", \"Horni LN 2.7B\", \"Horni 2.7B\", \"Shinen 2.7B\", \"Neo 2.7B\"] {allow-input: true}\n",
|
|
"Version = \"Official\" #@param [\"Official\", \"United\"] {allow-input: true}\n",
|
|
"Provider = \"Localtunnel\" #@param [\"Localtunnel\", \"Cloudflare\"]\n",
|
|
"\n",
|
|
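"# Show which GPU Colab assigned to this session\n",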
"!nvidia-smi\n",
|
|
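"# Mount Google Drive (KoboldAI uses it to store your saves and settings)\n",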
"from google.colab import drive\n",
|
|
"drive.mount('/content/drive/')\n",
|
|
"\n",
|
|
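"# Map the friendly model name chosen above to its Hugging Face repository ID (path and download are left empty here)\n",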
"if Model == \"Nerys 2.7B\":\n",
|
|
" Model = \"KoboldAI/fairseq-dense-2.7B-Nerys\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Erebus 2.7B\":\n",
|
|
" Model = \"KoboldAI/OPT-2.7B-Erebus\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Janeway 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-Janeway\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Picard 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-Picard\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"AID 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-AID\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Horni LN 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-Horni-LN\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Horni 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-Horni\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Shinen 2.7B\":\n",
|
|
" Model = \"KoboldAI/GPT-Neo-2.7B-Shinen\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"elif Model == \"Neo 2.7B\":\n",
|
|
" Model = \"EleutherAI/gpt-neo-2.7B\"\n",
|
|
" path = \"\"\n",
|
|
" download = \"\"\n",
|
|
"\n",
|
|
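"# Build the tunnel argument: Localtunnel needs an explicit flag, otherwise no flag is passed\n",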
"if Provider == \"Localtunnel\":\n",
|
|
" tunnel = \"--localtunnel yes\"\n",
|
|
"else:\n",
|
|
" tunnel = \"\"\n",
|
|
"\n",
|
|
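"# Download the KoboldAI Colab deployment script and run it with the selected model, version and tunnel options\n",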
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin -m $Model -g $Version $tunnel"
|
|
],
|
|
"execution_count": null,
|
|
"outputs": []
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"source": [
|
"# GPU Edition Model Descriptions\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys 2.7B](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | 2.7B | Novel/Adventure | Nerys is a hybrid model based on Pike (a newer Janeway); on top of the Pike dataset you also get some light novels, Adventure mode support and a little bit of Shinen thrown into the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model too. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing is best done from the first or third person. |\n",
"| [Janeway 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | 2.7B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focused on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Picard 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | 2.7B | Novel | Picard is a model trained for SFW novels based on Neo 2.7B. It is focused on novel-style writing without the NSFW bias. While the name suggests a sci-fi model, this model is designed for novels of a variety of genres. It is meant to be used in KoboldAI's regular mode. |\n",
"| [AID 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | 2.7B | Adventure | Also known as Adventure 2.7B, this is a clone of the AI Dungeon Classic model and is best known for the epic wacky adventures that AI Dungeon Classic players love. |\n",
"| [Horni LN 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni-LN) by finetune | 2.7B | Novel | This model is based on Horni 2.7B and retains its NSFW knowledge, but was then further biased towards SFW novel stories. If you seek a balance between an SFW novel model and an NSFW model, this model should be a good choice. |\n",
"| [Horni 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni) by finetune | 2.7B | NSFW | This model is tuned on Literotica to produce a novel-style model biased towards NSFW content. It can still be used for SFW stories but will have a bias towards NSFW content. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Shinen 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Shinen) by Mr Seeker | 2.7B | NSFW | Shinen is an alternative to the Horni model designed to be more explicit. If Horni is too tame for you, Shinen might produce better results. While it is a novel model it is unsuitable for SFW stories due to its heavy NSFW bias. Shinen will not hold back. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Neo 2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | 2.7B | Generic | This is the base model for all the other 2.7B models. It is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |\n",
"\n",
"# [TPU Edition Model Descriptions](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)\n",
"\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | 13B | Novel/Adventure | Nerys is a hybrid model based on Pike (a newer Janeway); on top of the Pike dataset you also get some light novels, Adventure mode support and a little bit of Shinen thrown into the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model too. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing is best done from the first or third person. |\n",
"| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | 13B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focused on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Shinen](https://huggingface.co/KoboldAI/fairseq-dense-13B-Shinen) by Mr Seeker | 13B | NSFW | Shinen is an NSFW model designed to be more explicit. Trained on a variety of stories from the website Sexstories, it contains many different kinks. |\n",
"| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\\_FORBRYDERNE | 6B | Adventure | Skein is best used with Adventure mode enabled; it was trained on an adventure dataset four times larger than the Adventure model's, making it excellent for text adventure gaming. On top of that it also includes light novel training, further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write novels with it, but dedicated novel models can perform better for this task. |\n",
"| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\\_FORBRYDERNE | 6B | Adventure | Adventure is a 6B model designed to mimic the behavior of AI Dungeon. It is exclusively for Adventure mode and can take you on the epic and wacky adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |\n",
"| [Lit](https://huggingface.co/hakurei/lit-6B) by Haru | 6B | NSFW | Lit is a great NSFW model trained by Haru on both a large set of Literotica stories and high quality novels, along with tagging support, creating a high quality model for your NSFW stories. This model is exclusively a novel model and is best used in third person. |\n",
"| Neo(X) by EleutherAI | 20B | Generic | NeoX is the largest EleutherAI model currently available. Being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. 20B's performance is comparable to the 13B models, and it is worth trying both, especially if you have a task that does not involve English writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset, but with more sensitivity towards repetition penalty and with more knowledge. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | 13B | Generic | Trained by Facebook researchers, this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and is considered better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. |\n",
"| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | 6B | Generic | This model serves as the basis for most other 6B models (some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular, it is suitable for a variety of tasks such as writing, Q&A and coding. You will likely get better results with larger generic models or finetuned models. |\n",
"\n",
"\n",
"| Style | Description |\n",
"| --------- | ------------------------------------------------------------ |\n",
"| Novel | For regular story writing, not compatible with Adventure mode or other specialty modes. |\n",
"| NSFW | Indicates that the model is strongly biased towards NSFW content and is not suitable for children, work environments or livestreaming. Most NSFW models are also Novel models in nature. |\n",
"| Adventure | These models are excellent for people who want to play KoboldAI like a text adventure game and are meant to be used with Adventure mode enabled. Even if you wish to use one as a Novel-style model you should always have Adventure mode on and set it to story. These models typically have a strong bias towards the use of the word You and, without Adventure mode enabled, will break the story flow and write actions on your behalf. |\n",
"| Generic | Generic models are not trained towards anything specific and are typically used as a basis for other tasks and models. They can do everything the other models can do, but require much more handholding to work properly. Generic models are an ideal basis for tasks that we have no specific model for, or for experiencing a softprompt in its raw form. |\n",
"\n",
"---\n",
"# How to start KoboldAI in 7 simple steps\n",
"Using KoboldAI on Google Colab is easy! Simply follow these steps to get started:\n",
"1. Mobile phone? Tap the play button below next to \"<--- Tap this if you play on mobile\" to reveal an audio player, then play the silent audio to keep the tab alive so Google will not shut you down while you're using KoboldAI. If no audio player appears, your phone browser does not support Google Colab in the mobile view; go to your browser menu and enable Desktop mode before you continue.\n",
"2. Select the model that best matches what you would like to do; by default the model we most recommend for people trying out KoboldAI is selected. If you are an advanced user you can also type any GPT model name from Huggingface.co to load it (unlisted models may or may not work depending on Colab's hardware limitations).\n",
"3. Click the play button next to \"<--- Click this to start KoboldAI\".\n",
"4. Allow Google Drive access. This typically happens through a popup, but sometimes Google Drive access may be requested through the older method of asking you to click on a link and copy a code. This is normal behavior for Colab and only you will get access to your files; nothing is shared with us.\n",
"5. Now the automatic installation and download process starts. For most models in the GPU edition expect this to take around 7 minutes on average, depending on the current Colab download speeds. These downloads happen through Google's internet connection; you will not be billed by your internet provider and it will not count towards any download limits.\n",
"6. After waiting, a Trycloudflare (or Localtunnel) link appears; click the link to enjoy KoboldAI. If you get a 1033 error, Cloudflare is not done loading; in that case keep refreshing until it goes away. (If it keeps happening after 2 minutes Cloudflare has an issue, in which case you can use Runtime -> Restart and Run All to get a new link.)\n",
"7. As you play KoboldAI, keep this Colab tab open in the background and check occasionally for Captchas so they do not shut your instance down. If you do get shut down you can always download a copy of your gamesave in the Save menu inside KoboldAI. Stories are never lost as long as you keep KoboldAI open in your browser.\n",
"\n",
"Get an error message saying you do not have access to a GPU/TPU instance? Do not continue and try again later; KoboldAI will not run correctly without one."
],
"metadata": {
"id": "Lrm840I33hkC"
}
}
]
}