Still VERY far from ideal for multiplayer: only one person can realistically edit at a time, and whoever submits is what counts. It will need more major interface changes so things can be submitted one by one. But hey, it works, and it's good enough for a group of friends to play the game :D
Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper_index_select)
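For context, that PyTorch error typically means the model (or one of its layers) lives on the GPU while one of the input tensors is still on the CPU. A minimal sketch of the mismatch and the generic fix, with hypothetical tensor sizes (not taken from this commit):

```python
import torch

# The embedding weights sit on the GPU while the token ids are still a CPU
# tensor, so the underlying index_select sees two devices and raises the error above.
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
embedding = torch.nn.Embedding(50257, 768).to(device)

token_ids = torch.tensor([[15496, 995]])   # created on the CPU by default
# embedding(token_ids)                     # raises the device-mismatch error when device is cuda:0
hidden = embedding(token_ids.to(device))   # fix: move the ids to the model's device first
print(hidden.shape)                        # torch.Size([1, 2, 768])
```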
First step towards native Colab support: built-in Cloudflare tunnels make it easy for players to use KoboldAI from another device. This mode also removes buttons that would get you stuck if you have no local PC access.
Big overhaul of the installer, partially based on commit #53 from LexSong.
The following is new:
- Conda has been replaced with MicroMamba, which automatically downloads the best version of Python for the dependencies and prevents all the issues people had with failing to download Conda.
- The installer now has more options, so you can choose not to delete the existing files, and it has new optional virtual K: drive support to bypass the pathing issues people have been having (sorry Windows 7 users, it's still not compatible even now).
- Docker support for Linux systems has been added including ROCm support.
- Environment files are now used to more easily keep everything on track, and to allow Conda users to manually create environments across all operating systems (ROCm is an outlier here because I have to use AMD's PyTorch Docker image for now; it was too much hassle getting their Conda to use the environment file in time for this commit).
- Play.bat has been changed to allow the virtual drive support; everything should still be compatible with old installations, as I kept all the paths intact.
This PR does three things when loading a story from within the browser (a sketch follows the list):
1. Prevents an error if a story file is not valid JSON.
2. Catches an error if a file is valid JSON but lacks an actions property.
3. Replaces getcwd() with the path of the script file itself, in case someone starts the app from a different working directory.
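A rough sketch of what those three guards could look like; the function name, the `stories` subfolder, and the error handling are illustrative assumptions, not the PR's actual code:

```python
import json
import os

def load_story(filename):
    # 3. Resolve the path relative to the script file itself rather than getcwd(),
    #    so starting the app from another working directory still finds the story.
    base_dir = os.path.dirname(os.path.abspath(__file__))
    path = os.path.join(base_dir, "stories", filename)  # "stories" is an assumed subfolder

    with open(path, "r", encoding="utf-8") as handle:
        try:
            story = json.load(handle)        # 1. Reject files that are not valid JSON
        except json.JSONDecodeError:
            print(f"{filename} is not valid JSON")
            return None

    if "actions" not in story:               # 2. Reject JSON that lacks an actions property
        print(f"{filename} has no actions property")
        return None

    return story
```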
No more manually forcing CUDA to version 11; instead we use conda-forge.
This pulls down much more recent versions of pretty much everything, and fixes errors in the GPT-J models that cropped up on older versions of the dependencies.
I changed the menu order around because Finetuneanon's version is better for most users and is needed for 6B.
The GitHub branch it downloads for finetune has been updated, and it can now fix the download path length errors if run as admin.
This changes the installation script to use Miniconda3 inside the KoboldAI directory, which is MUCH more user friendly.
Any existing Python environment is bypassed, and other dependencies like CUDA are automatically installed in compatible versions.
With this approach we can better ensure that end users have the correct environment and won't run into issues caused by their existing installations. It also removes the need for them to install anything else on their system, as everything required is downloaded automatically.