{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "view-in-github", "colab_type": "text" }, "source": [ "\"Open" ] }, { "cell_type": "markdown", "source": [ "# Welcome to KoboldAI on Google Colab, TPU Edition!\n", "KoboldAI used to have a very powerful TPU engine for the TPU colab allowing you to run models above 6B, we have since moved on to more viable GPU based solutions that work across all vendors rather than splitting our time maintaing a colab exclusive backend.\n", "\n", "If you were brought here by a (video) tutorial keep in mind the tutorial you are following is very out of date.\n", "\n", "We recommend that you switch to Koboldcpp, our most modern solution that runs fantastic on Google Colab's GPU's allowing a similar level of performance that you were using before on the TPU at a fraction of the loading times.\n", "\n", "# [Click here to go to the KoboldCpp Colab](https://koboldai.org/colabcpp)\n", "\n", "The model you wish to use not available in GGUF format? You can use our [GPU colab](https://koboldai.org/colab) and select the United version to load models up to 13B.\n", "\n", "Both versions are capable of using our API and will work as you expect from a KoboldAI product. If you are following a tutorial the rest of the instructions may apply to these newer versions of our products." ], "metadata": { "id": "zrLGxVCEaqZx" } } ], "metadata": { "colab": { "name": "ColabKobold TPU", "provenance": [], "private_outputs": true, "include_colab_link": true }, "kernelspec": { "display_name": "Python 3", "name": "python3" }, "language_info": { "name": "python" }, "accelerator": "TPU" }, "nbformat": 4, "nbformat_minor": 0 }