Commit Graph

9 Commits

henk717 3b976c9af7 Updated defaults
The official Transformers release is now the default; Git versions are no longer used (see the sketch below).
2021-11-27 03:14:47 +01:00
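A quick way to confirm which Transformers build an environment ended up with is to inspect the installed version string; development installs pulled from Git typically carry a `.dev` suffix, while official releases do not. This is a minimal, illustrative check and not part of the repository:

```python
# Hypothetical check: report whether the installed Transformers package
# looks like an official release or a development build pulled from Git.
import transformers

version = transformers.__version__
if "dev" in version:
    print(f"Transformers {version} looks like a Git/development build.")
else:
    print(f"Transformers {version} is an official release.")
```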
henk717 409be6645a Finetune version of rocm
Separate file so people can easily go back to the legacy implementation based on finetune (recommended until Huggingface's compatibility improves). You can install and use both.
2021-11-20 03:14:18 +01:00
henk717 485034b6bb ROCm Conda
Allows anyone to easily create a ROCm-compatible conda environment. It is currently set to the newer transformers; you can edit the GitHub link if you want the finetune one (see the sketch below).
2021-11-17 22:15:01 +01:00
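One way to verify that the resulting environment actually received the ROCm build of PyTorch (rather than a CUDA or CPU-only build) is to look at `torch.version.hip`, which only ROCm builds populate. This is an illustrative sanity check, not part of the environment file:

```python
# Illustrative sanity check for a ROCm conda environment:
# ROCm builds of PyTorch expose a HIP version; CUDA/CPU builds do not.
import torch

if getattr(torch.version, "hip", None):
    print(f"ROCm build detected (HIP {torch.version.hip}).")
    print("GPU visible:", torch.cuda.is_available())
else:
    print("This PyTorch build is not a ROCm build.")
```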
henk717 0f38dbc0ed Using VE's fork for now
Switching from the official Huggingface version to VE's fork for the time being, until some of these changes land upstream.
2021-10-19 11:33:49 +02:00
henk717 7d35f825c6 Huggingface GPT-J Support
Finetune's fork has unofficial GPT-J support, which we supported, but that is not compatible with models designed for the official version. In this update we let models decide which transformers backend to use and fall back to Neo if they don't choose any (see the sketch below). We also add the 6B model to the menu and, for the time being, switch to the GitHub version of transformers rather than waiting for the next release. (Hopefully we can switch back to the conda version before merging upstream.)
2021-09-25 16:26:17 +02:00
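The commit does not show the actual selection logic, but the idea of letting a model declare its transformers backend and falling back to Neo could look roughly like the hypothetical sketch below; the `model_backend.txt` marker file and the `choose_backend` helper are assumptions made purely for illustration:

```python
# Hypothetical sketch of the fallback idea described above: a model folder
# may declare which transformers backend it wants; if it doesn't, use Neo.
import os

def choose_backend(model_dir, default="neo"):
    marker = os.path.join(model_dir, "model_backend.txt")  # assumed marker file
    if os.path.isfile(marker):
        with open(marker) as f:
            backend = f.read().strip().lower()
            if backend:
                return backend
    return default  # fall back to Neo when the model doesn't choose

print(choose_backend("./models/gpt-j-6b"))
```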
henk717 fcc210898f Revert to Python 3.8
tensorflow-base doesn't like the older cudatoolkit anymore. The build for Python 3.8 still does, so let's just stick to that for now.
2021-09-16 01:18:04 +02:00
henk717 03501a4c8c Dependency Fixes
Looks like cudatoolkit is now shipping 11.3, but pytorch has no build for it, which results in the CPU-only version being installed. That would leave people unable to get their GPU running, so for now we force the recommended 11.1 version (see the check sketched below). I also don't see any harm in allowing Python 3.9, so that's now the default as well to prevent future issues.
2021-09-16 01:07:43 +02:00
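A simple way to spot the CPU-only situation described here is to ask PyTorch which CUDA version it was built against; a CPU-only build reports none. This is an illustrative check, not part of the installer:

```python
# Illustrative check for the CPU-only PyTorch problem described above:
# a GPU-enabled build reports the CUDA version it was compiled against.
import torch

if torch.version.cuda is None:
    print("CPU-only PyTorch build installed; no CUDA support.")
elif torch.cuda.is_available():
    print(f"GPU build (CUDA {torch.version.cuda}), GPU detected.")
else:
    print(f"GPU build (CUDA {torch.version.cuda}), but no usable GPU found.")
```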
henk717 136dd71171 Added --remote Mode
First step towards native Colab support: built-in Cloudflare tunnels easily allow players to play KoboldAI from another device (see the sketch below). This mode also removes buttons that would get you stuck if you have no local PC access.
2021-08-20 00:37:59 +02:00
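The commit message does not include the tunnel code itself, but a minimal sketch of the general approach, assuming the `cloudflared` binary is on the PATH and KoboldAI is listening on port 5000 (both assumptions), could look like this:

```python
# Minimal, hypothetical sketch of starting a Cloudflare quick tunnel so a
# locally running KoboldAI instance can be reached from another device.
# Assumes the `cloudflared` binary is installed and the UI runs on port 5000.
import subprocess

proc = subprocess.Popen(
    ["cloudflared", "tunnel", "--url", "http://localhost:5000"],
    stderr=subprocess.PIPE,
    text=True,
)

# cloudflared prints the public *.trycloudflare.com URL in its log output;
# scan the log lines until it appears.
for line in proc.stderr:
    if "trycloudflare.com" in line:
        print("Remote URL:", line.strip())
        break
```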
henk717 1327bd30a5 The Mamba Installer Update
Big overhaul of the installer, partially based on the #53 commit from LexSong.

The following is new:
- Conda has been replaced with MicroMamba, which lets the dependencies automatically pull in the best version of Python and prevents all the issues with people failing to download conda.
- The installer now has more options, so you can choose not to delete the existing files, and it adds optional virtual K: drive support to bypass all the pathing issues people are having (sorry Windows 7 users, it's still not compatible even now); see the sketch at the end of this entry.
- Docker support for Linux systems has been added, including ROCm support.
- Environment files are now used to more easily keep everything on track and to allow conda users to manually create environments across all operating systems (ROCm is an outlier here because I have to use AMD's PyTorch docker for now; it was too much hassle getting their Conda to use the environment file, so it is not part of this commit).
- Play.bat has been changed to allow the virtual drive support; everything should still be compatible with old installations, as I kept all the paths intact.
2021-06-28 22:35:15 +02:00
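The virtual drive trick mentioned above relies on the Windows `subst` command, which maps a drive letter to a folder. A hedged illustration of the idea follows; the K: letter and the install path are placeholders, and this is not the installer's actual code:

```python
# Hypothetical illustration of the virtual K: drive idea on Windows:
# `subst` maps a drive letter to a folder so a short, space-free path
# can be used instead of a deeply nested install location.
import subprocess

INSTALL_DIR = r"C:\KoboldAI"   # placeholder install location

# Map K: to the install folder (ignore the error if it is already mapped).
subprocess.run(["subst", "K:", INSTALL_DIR], check=False)

# ... the application would now be launched from K:\ ...

# Remove the mapping again when done.
subprocess.run(["subst", "K:", "/D"], check=False)
```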