Compare commits

640 Commits
1.18.2 ... main

Author SHA1 Message Date
henk717 f49d763e2a Promote Colabcpp 2024-01-02 14:08:53 +01:00
henk717 fd24d95981 Crash without a GPU 2023-11-05 01:42:13 +01:00
henk717 61a0042c66 Echidna 2023-10-28 03:05:02 +02:00
henk717 8b7ab2f93b
Match colab description for Tiefighter 2023-10-27 15:58:49 +02:00
henk717 0ea758b789 Better Tiefighter description 2023-10-27 15:57:08 +02:00
henk717 2db1812ee4
Merge pull request #409 from RecoveredApparatus/main
Updated Model list and description in Read.md and GPU.ipynb markdown
2023-10-27 15:52:37 +02:00
anhad 3287328fe4 Update the model list in both Read.md and Colab markdown 2023-10-25 14:53:00 +05:30
anhad a92951f47e Updated Readme.md 2023-10-24 10:08:34 +05:30
henk717 7d39b353c0 Tiefighter on Colab 2023-10-19 20:07:01 +02:00
henk717 58b4c48fdb Disable GPTQ for now to enable higher context 2023-10-16 21:23:27 +02:00
henk717 bf61e5ef02 Emerhyst 2023-10-09 05:06:26 +02:00
Henk 386fd1f034 Werkzeug Fix 2023-10-06 13:51:52 +02:00
henk717 d86f61151b Working revision support 2023-08-23 22:07:37 +02:00
henk717 ebab774aab Add Holomax 2023-08-14 18:19:03 +02:00
henk717 ee93fe6e4a Add model cleaner 2023-08-11 22:39:49 +02:00
henk717 9cb93d6b4c Add some 13B's for easier beta testing 2023-08-10 23:56:44 +02:00
henk717 d6b1ff513d More cleanup - TPU 2023-05-11 15:24:25 +02:00
henk717 c11a269493 Model cleanup - GPU 2023-05-11 02:55:28 +02:00
henk717 148f900324 Cleaned up model list - TPU 2023-05-11 02:52:45 +02:00
henk717 b66110ea54 Created using Colaboratory 2023-05-08 18:54:41 +02:00
henk717 d2b399d7bc
Merge pull request #311 from SmolBleat/main
Add Nerybus Models
2023-05-08 16:59:24 +02:00
henk717 f2b643a639
Merge pull request #239 from waffshappen/patch-2
Allow Project File Access with Podman+Selinux
2023-05-08 16:58:51 +02:00
Henk 1499763472 Flask fix 2023-04-29 02:44:41 +02:00
SmolBleat 692fe2e5ee Add Nerybus Models 2023-04-24 21:01:29 +02:00
henk717 c3bf89a94f Missed a spot 2023-04-24 19:20:35 +02:00
henk717 1ae1d499e8 Remove banned model 2023-04-24 19:15:17 +02:00
henk717 b808f039ab
Pin TPU driver 2023-04-23 20:21:28 +02:00
henk717 d88f109073
TPU Fix Fix 2023-04-23 18:49:25 +02:00
henk717 b4cb09590f
Update requirements_mtj.txt 2023-04-23 18:23:38 +02:00
henk717 5f0e2001a7 Remove broken TPU disclaimer 2023-04-23 17:50:03 +02:00
henk717 dddde7dbc3
Merge pull request #306 from Zurnaz/tpu_fix
Fix: TPU driver error
2023-04-23 12:32:14 +02:00
Bogdan Drema 92a0bf9524 Fix: TPU driver error
to_dlpack/from_dlpack was causing issues with tensors with the new jax version
2023-04-23 00:49:42 +01:00
henk717 e4c15fe1f6
Update install_requirements.sh 2023-04-21 03:00:52 +02:00
henk717 b432d55d99
Merge pull request #291 from Relys/patch-1
Update install_requirements.sh
2023-04-20 14:03:35 +02:00
henk717 ee6e7e9b72 Colab description changes 2023-04-17 22:59:55 +02:00
Syler Clayton 860b697a70
Update install_requirements.sh
Made parameter case insensitive.
2023-04-15 09:51:45 -07:00
henk717 29c2d4b7a6 Removing Pygmalion from the TPU colab to get it unbanned 2023-04-04 19:51:18 +02:00
henk717 fd12214091 Clean the description of the GPU colab 2023-04-04 19:40:22 +02:00
henk717 bb51127bbf We no longer support Pygmalion on Colab due to Google's Pygmalion ban 2023-04-04 19:37:15 +02:00
henk717 72b4669563
Fix the chex dependency 2023-03-30 23:41:35 +02:00
henk717 ab779efe0e
Merge pull request #276 from YellowRoseCx/stable-branch
Update README and remove unavailable model from gpu.ipynb
2023-03-30 00:50:15 +02:00
YellowRoseCx 3c48a77a52
Update README.md
changed the Colab GPU models listed to their higher quality counterparts
2023-03-29 17:44:44 -05:00
YellowRoseCx f826930c02
Update GPU.ipynb
removed litv2-6B-rev3
2023-03-29 17:41:01 -05:00
henk717 66264d38c4 Add Mixes 2023-03-28 00:23:10 +02:00
henk717 94eb8ff825 TPU Message 2023-03-19 14:52:14 +01:00
Henk 219b824b9b SocketIO Requirements Pin 2023-03-17 01:28:59 +01:00
henk717 ffa5c0bc13 Empty Revision Fix 2023-03-08 20:52:03 +01:00
henk717 487739911a Restore Pygmalion 6B Dev 2023-03-08 18:44:03 +01:00
Henk 2ed6cdb411 Huggingface Hub Pin 2023-03-08 18:03:36 +01:00
henk717 142cb354f9 Nerybus 13B - TPU colab 2023-03-01 22:33:11 +01:00
Henk 93bf023bd7 Use our own horde URL 2023-03-01 17:54:39 +01:00
henk717 750cc3d2dc
Merge pull request #245 from db0/kaimergemain2
Makes prod version of KAI work with merged hordes in stablehorde.net
2023-03-01 17:52:58 +01:00
Henk 0e06fc371f Modeldir Fix 2023-02-27 17:46:33 +01:00
Divided by Zer0 6426e3ca24 changes 2023-02-23 18:34:46 +01:00
Divided by Zer0 2de9672b95 attempt1 2023-02-23 18:27:11 +01:00
henk717 c27faf56e6 Updated Silence Audio - GPU 2023-02-20 18:43:06 +01:00
henk717 5962a6cb4f Updated Audio Link - TPU 2023-02-20 18:41:06 +01:00
Henk 1378fe8beb Silence file for colab 2023-02-20 18:28:58 +01:00
waffshappen a0d4497c95
Also update CUDA container 2023-02-16 10:37:58 +00:00
waffshappen d026bd79cb
Allow Project File Access with Podman+Selinux
On SELinux-enabled distros, containers accessing KoboldAI's main directory as content, as planned here, will likely be denied (at least with Podman).
Option 1 would be to mark it with the right label - like :z - but that has other implications for the content directory.

The other fix, if uglier, is to run the container without labels being enforced, thus allowing the file access as the same user and with no further side effects to the project file labelling.
2023-02-15 23:32:41 +00:00
Henk cc01ad730a Don't install safetensors for MTJ 2023-02-11 11:20:21 +01:00
Henk b58daa1ba1 Pin Flask-cloudflared 2023-02-10 19:11:13 +01:00
henk717 661bd5c99e Hide Pygmalion 6B Dev, currently only supported on the GPU 2023-01-31 19:24:19 +01:00
Henk 257a535be5 Revision Fixes Fixes 2023-01-31 05:17:34 +01:00
Henk 739cccd8ed Revision Fixes 2023-01-31 04:48:46 +01:00
henk717 e9cf9fa6d0 Pygmalion Dev support 2023-01-27 05:20:09 +01:00
henk717 031c06347f Streamlining Revision Support 2023-01-24 13:51:08 +01:00
henk717 a185cbd015 Fix Defaults 2023-01-24 13:31:45 +01:00
henk717 a046db4ded Gemaakt met Colaboratory 2023-01-24 13:16:49 +01:00
henk717 47a27fa906 Cloudflare as default again - GPU 2023-01-23 18:15:37 +01:00
henk717 24f50d6fb7
Download Manager Support docker-rocm 2023-01-18 02:04:45 +01:00
henk717 22acde1ab7
Download Manager Support docker-cuda 2023-01-18 02:04:14 +01:00
Henk e9859cf17d DNSPython workaround
DNSPython had an update eventlet is not ready for. We now manually cap DNSPython to ensure the installations still happen correctly.
2023-01-16 16:32:17 +01:00
Henk 307fc97b9d ROCm Dependency Bump/Fix 2023-01-13 22:49:32 +01:00
henk717 4a88e41d14 Pygmalion 6B 2023-01-10 17:22:03 +01:00
henk717 1628b789d1 Add Pygmalion 2023-01-09 23:36:43 +01:00
Henk 857476ef6b ROCm torch version pin 2023-01-08 17:10:59 +01:00
Henk 7fc5c46c1d Add Safetensors
Having the dependency adds basic support for safetensor models.
2023-01-06 16:40:56 +01:00
henk717 1dbc987048 6B models now that Colab Free has beefier GPUs 2022-12-21 16:52:41 +01:00
henk717 a04f99891f
Merge pull request #194 from Gouvernathor/patch-1
Update usage instructions for git-clone use
2022-12-21 15:36:15 +01:00
henk717 75fecb86cc
Merge pull request #196 from henk717/united
Improved model support & Shortcut Fixes
2022-12-18 20:18:13 +01:00
Gouvernathor a4f49c097a
Add git clone command and Linux case 2022-12-17 12:13:51 +01:00
Gouvernathor 55cf5f2f67
Update usage instructions for git-clone use 2022-12-17 04:10:56 +01:00
henk717 23b2d3a99e
Merge pull request #236 from one-some/united
Move shortcuts to Alt
2022-12-17 00:37:18 +01:00
somebody 9efbe381cf Move shortcuts to Alt from Ctrl 2022-12-16 16:47:22 -06:00
henk717 0a926e41e4
Merge pull request #235 from VE-FORBRYDERNE/patch
Fix materialize function for galactica models
2022-12-12 20:15:54 +01:00
vfbd 33ba3e7e27 Fix materialize function for galactica models 2022-12-12 14:11:08 -05:00
Henk eeb1774d42 Cleaner implementation of zipfolder 2022-12-10 19:23:08 +01:00
Henk 9a8e8a0005 New pytorch zipfile support 2022-12-10 19:11:07 +01:00
henk717 dd7363548c
Merge pull request #191 from henk717/united
Probability Viewer Fix
2022-12-09 21:56:41 +01:00
henk717 686845cd21
Merge pull request #234 from one-some/united
Move probability visualization to after logitwarpers
2022-12-09 21:22:33 +01:00
somebody e6656d68a1 Move probability visualization to after logitwarpers 2022-12-09 13:47:38 -06:00
henk717 55ef53f39b
Typo fix 2022-12-08 15:17:10 +01:00
henk717 0b3e22ee13
Merge pull request #185 from henk717/united
Pin transformers version
2022-12-02 02:03:23 +01:00
Henk d0cb463c53 Pin transformers version
To avoid breaking changes, let's force the exact transformers version we code against. This will be automatically picked up by all the automatic updaters.
2022-12-02 01:48:12 +01:00
henk717 e8245478d6
Merge pull request #184 from henk717/united
Cap transformers version
2022-12-02 01:27:18 +01:00
henk717 f72ceeadd0
Cap transformers version
Since MTJ is low level, we force a fixed transformers version to have more controlled updates when needed
2022-12-02 01:10:59 +01:00
henk717 04d9172fcd
Merge pull request #180 from VE-FORBRYDERNE/patch
Only enable TPU transpose optimization if loading from HF model
2022-11-21 20:02:14 +01:00
vfbd 9a3f0eaab2 Only enable TPU transpose optimization if loading from HF model 2022-11-21 13:47:18 -05:00
henk717 f2077b8e58
Merge pull request #179 from henk717/united
1.19.2
2022-11-20 16:26:03 +01:00
Henk 2603f1fd5d Version bump 2022-11-20 16:22:33 +01:00
Henk 3084552c05 Sampler Order Fix for Models 2022-11-14 17:15:39 +01:00
Henk 13dff68de8 Sampler Order Loading Fix 2022-11-14 16:59:53 +01:00
Henk a66e1443fd New Models 2022-11-12 16:54:40 +01:00
Henk 440c5c333e Clear flask_session on launch
Can help with version switching bugs
2022-11-12 15:43:06 +01:00
Henk f1e4664d56 Dependency improvements
Adding psutil from conda to avoid the need for a compiler; finetuneanon should no longer be used. If people really want to use it, they are on their own.
2022-11-11 21:13:51 +01:00
Henk eb52ebd082 Merge branch 'main' into united 2022-11-03 00:22:30 +01:00
henk717 09b5ffc09d
Merge pull request #175 from VE-FORBRYDERNE/gptj-patch
Fix GPT-J model loading in TPU Colab when `vocab_size` is not divisible by 8
2022-11-03 00:13:50 +01:00
vfbd b20d80ca2a Add vocab padding to embedding bias in gptj.json 2022-11-02 19:02:09 -04:00
henk717 2e3a80b8ea
Merge branch 'KoboldAI:main' into united 2022-10-26 23:11:26 +02:00
henk717 7b5a766b4a
Merge pull request #172 from VE-FORBRYDERNE/accelerate-patch
Fix "is on the meta device" error when loading model with disk cache
2022-10-26 22:42:05 +02:00
vfbd 3233e78c56 Fix "is on the meta device" error when loading model with disk cache 2022-10-26 16:00:45 -04:00
Henk 442a9760b8 Hide V2 Saves 2022-10-23 19:03:18 +02:00
henk717 2300fb46ff
Merge branch 'KoboldAI:main' into united 2022-10-23 18:29:28 +02:00
Henk 8ee795055c Force compatible HF Hub 2022-10-23 18:28:50 +02:00
Henk ea8b50d31e Conda fix for update script 2022-10-23 16:00:18 +02:00
Henk 0da404d4f8 Conda conflict fix 2022-10-23 14:10:44 +02:00
Henk 4699ded3ce Tuner Dependencies 2022-10-22 19:00:06 +02:00
henk717 351fb3c80b
Merge pull request #232 from VE-FORBRYDERNE/mkultra
Universal mkultra-based soft prompt tuner
2022-10-22 14:13:42 +02:00
henk717 10a779d8c1
Merge pull request #231 from ebolam/united
Add parameter to Colab to use google drive
2022-10-22 14:13:32 +02:00
vfbd f7b799be56 Apply tokenizer fixes to prompt_tuner.py 2022-10-21 17:06:17 -04:00
ebolam d588dc0096 Check if dir exists before creating 2022-10-19 11:19:04 -04:00
ebolam 73865ba066 Add parameter to Colab for not using google drive (data would be ephemeral) 2022-10-19 11:05:17 -04:00
henk717 f8be854e09
Merge branch 'KoboldAI:main' into united 2022-10-17 21:06:10 +02:00
henk717 2795ced3a4
Merge pull request #168 from VE-FORBRYDERNE/api-patch
Fix regex for the prompt parameter of the POST /story/end endpoint
2022-10-17 20:38:34 +02:00
vfbd 9ff50d81fd Fix regex for the prompt parameter of the POST /story/end endpoint 2022-10-17 14:36:23 -04:00
henk717 c6ed656a76
Merge pull request #230 from pi6am/fix/lua_kobold_modeltype
Fix/lua kobold modeltype
2022-10-14 19:50:19 +02:00
Llama e5d0cc7b49 Fix exception thrown by kobold.modeltype in Lua
Fixes this exception:
  File "aiserver.py", line 3389, in lua_get_modeltype
    hidden_size = get_hidden_size_from_model(model)
NameError: name 'get_hidden_size_from_model' is not defined

The kobold.modeltype method eventually attempts to call
get_hidden_size_from_model in Python, but this method
was previously defined only within a local scope and so
is not visible from within lua_get_modeltype.  Since
get_hidden_size_from_model only accesses its model argument,
there is no reason not to make it a module-level method.

Also change the severity of several more Lua error logs to error.
2022-10-14 09:20:33 -07:00
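A minimal sketch of the scoping fix this commit describes, with simplified names; the real code lives in aiserver.py and resolves the loaded global model rather than taking it as a parameter. It assumes a Hugging Face transformers model object.

```python
# Module-level helper: it only inspects its `model` argument (assumed to be a
# Hugging Face transformers model), so there is no reason to keep it inside a
# function's local scope, which is what caused the NameError above.
def get_hidden_size_from_model(model):
    return model.get_input_embeddings().embedding_dim


# Hypothetical stand-in for the Lua bridge function; the real lua_get_modeltype
# looks up the loaded model itself instead of receiving it as an argument.
def lua_get_modeltype(model):
    return f"hidden size: {get_hidden_size_from_model(model)}"
```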
Llama 6eb3abbdb8
Merge pull request #2 from henk717/united
Merging henk717/united
2022-10-13 20:33:34 -07:00
henk717 fff7837a4a
Merge pull request #229 from pi6am/feature/anote-kwarg
Feature/anote kwarg
2022-10-13 23:04:46 +02:00
henk717 be5ffe763c
Merge pull request #228 from VE-FORBRYDERNE/transpose
Slightly decrease TPU loading times
2022-10-13 15:35:28 +02:00
Llama 8357c3e485 Merge branch 'united' into feature/anote-kwarg 2022-10-12 23:37:45 -07:00
Llama 05bcd3af11
Merge pull request #1 from henk717/united
Version bump
2022-10-12 23:32:25 -07:00
Llama 4a01f345de Add include_anote kwarg to lua_compute_context.
Add an optional keyword argument to lua_compute_context to control
whether the author's note should be included in the context.  The
default value is true, so if the include_anote kwarg is not specified
then the author's note will be included, which was the default
behavior prior to this change.

Also update the Lua API documentation to describe this kwarg.
2022-10-12 23:18:19 -07:00
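A small sketch of the kwarg behaviour this commit describes; the parameter list below is an assumption rather than the exact lua_compute_context signature, but it shows the backwards-compatible default of True.

```python
def lua_compute_context(submission, entries=None, include_anote=True):
    """Hypothetical, simplified stand-in for the real Lua bridge helper."""
    context = []
    if include_anote:
        # Default behaviour (and the only behaviour before this change):
        # the author's note is part of the computed context.
        context.append("[Author's note placeholder]")
    if entries:
        context.extend(entries)
    context.append(submission)
    return "\n".join(context)
```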
vfbd bdc73ef393 Decrease TPU loading times by eliminating a transpose operation 2022-10-12 14:31:18 -04:00
henk717 59e3a40496
Merge pull request #165 from henk717/united
1.19.1
2022-10-12 15:35:09 +02:00
Henk 64715b18d6 Version bump 2022-10-12 14:54:11 +02:00
Henk d5143eeb80 LUA Error as Error 2022-10-12 01:23:00 +02:00
henk717 739cf0aae7
Merge pull request #227 from VE-FORBRYDERNE/pickle
Custom unpickler to avoid pickle's arbitrary code execution vulnerability
2022-10-07 02:12:53 +02:00
vfbd 323f593a96 Custom unpickler to avoid pickle's arbitrary code execution vulnerability 2022-10-06 20:08:08 -04:00
henk717 b85d74f22c
Merge branch 'KoboldAI:main' into united 2022-10-05 19:51:29 +02:00
henk717 9f18811ff9
Merge pull request #226 from VE-FORBRYDERNE/api-settings
Allow changing and reading sampler seed and sampler order from API
2022-10-04 20:30:25 +02:00
henk717 6af0e842f2
Switch to official
Switch to the official branch on KoboldAI now that it is compatible
2022-10-04 17:42:18 +02:00
henk717 cf3aebbd8f
Merge pull request #161 from henk717/united
Release 1.19
2022-10-04 15:57:47 +02:00
vfbd bdfa6d86b7 Seed has to be a 64-bit unsigned int or PyTorch will throw an error
tpu_mtj_backend's seed can be an integer of arbitrary size but we will
limit it to a 64-bit unsigned integer anyways for consistency.
2022-10-02 17:50:32 -04:00
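A minimal sketch of the constraint this commit describes, assuming the simple approach of reducing an arbitrarily large seed into the unsigned 64-bit range before handing it to PyTorch; the helper name is illustrative, not the actual code.

```python
import torch


def set_seed(seed: int) -> int:
    # PyTorch rejects seeds outside its 64-bit range, so an arbitrary-size
    # integer (as tpu_mtj_backend allows) is first reduced to an unsigned
    # 64-bit value for consistency between backends.
    seed = seed % (2 ** 64)
    torch.manual_seed(seed)
    return seed
```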
vfbd dd1c25241d Allow sampler seed and full determinism to be read/written in /config 2022-10-02 17:43:54 -04:00
vfbd 1a59a4acea Allow changing sampler seed and sampler order from API 2022-10-02 16:25:51 -04:00
henk717 7bd3125f5a
Merge branch 'KoboldAI:main' into united 2022-10-01 16:59:46 +02:00
henk717 2f45b93119 GPU updates 2022-10-01 16:58:16 +02:00
henk717 e1606afc0d GPU Descriptions 2022-10-01 15:43:45 +02:00
henk717 8313df8817 Localtunnel Default 2022-10-01 15:42:54 +02:00
henk717 9abad8bee9
Merge pull request #225 from scythe000/united
Update aiserver.py - typo fix
2022-09-30 19:30:46 +02:00
scythe000 a482ec16d8
Update aiserver.py - typo fix
Changed 'beakmodel' to 'breakmodel' in the example comment.
2022-09-30 10:29:32 -07:00
henk717 276c6f8e9e
Merge pull request #224 from db0/get_cluster_models_fix
fix endpoint for get_cluster_models
2022-09-30 00:30:18 +02:00
Divided by Zer0 90022d05c8 fix endpoint for get_cluster_models 2022-09-30 00:26:55 +02:00
vfbd 6758d5b538 Merge branch 'united' into mkultra 2022-09-28 14:30:34 -04:00
henk717 3a094a049b
Merge pull request #223 from ebolam/united
Fix for GPT models downloading even when present in model folder
2022-09-28 19:15:33 +02:00
ebolam e7973e13ac Fix for GPT models downloading even when present in model folder 2022-09-28 12:47:50 -04:00
henk717 0f7ecb3257
Merge pull request #222 from ebolam/united
Fix for lazy loading on models after a non-lazy load model
2022-09-28 01:54:21 +02:00
ebolam f0690373b3 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-09-27 19:52:44 -04:00
ebolam 72fc68c6e4 Fix for lazy loading on models after a non-lazy load model 2022-09-27 19:52:35 -04:00
henk717 c935d8646a
Merge pull request #221 from ebolam/united
Fix for loading models that don't support breakmodel (GPU/CPU support…
2022-09-28 01:32:08 +02:00
ebolam 4aa842eada Merge commit 'refs/pull/180/head' of https://github.com/ebolam/KoboldAI into united 2022-09-27 19:29:05 -04:00
ebolam be719a7e5e Fix for loading models that don't support breakmodel (GPU/CPU support in UI) 2022-09-27 19:02:37 -04:00
Henk 52e120c706 Disable breakmodel if we error on the check 2022-09-28 01:00:06 +02:00
henk717 d2ff32be32
Merge pull request #220 from ebolam/united
Fix for loading models on CPU only that don't support breakmodel
2022-09-28 00:46:37 +02:00
Henk 057ddb4fb2 Better --cpu handling 2022-09-28 00:45:17 +02:00
ebolam 168ae8083c Remove debug print 2022-09-27 18:30:20 -04:00
ebolam 0311cc215e Fix for loading models on CPU only that don't support breakmodel 2022-09-27 18:29:32 -04:00
henk717 3906cc1bd0
Merge pull request #219 from ebolam/united
Fix for GPT2 breakmodel in the UI
2022-09-28 00:17:27 +02:00
ebolam edd50fc809 Fix for GPT2 breakmodel in the UI 2022-09-27 17:58:51 -04:00
Henk f1d63f61f3 Syntax fix 2022-09-27 22:43:09 +02:00
henk717 d772837ad0
Merge pull request #217 from ebolam/united
Fix for older model loading
2022-09-27 22:41:13 +02:00
ebolam 908dc8ea60 Fix for older model loading 2022-09-27 15:59:56 -04:00
Henk 62921c4896 getmodelname for configname 2022-09-27 21:11:31 +02:00
Henk 6c32bc18d7 GPT2Tokenizer for TPU 2022-09-27 18:33:31 +02:00
Henk 60d09899ea Don't use Fast tokenizers when we don't have to 2022-09-27 18:26:13 +02:00
Henk 11455697ef Tokenizer Fixes (Slow first to keep coherency) 2022-09-27 17:57:18 +02:00
Henk 07896867b2 Revert Tokenizer Change 2022-09-27 15:36:08 +02:00
henk717 7f5ba8a678
Merge pull request #216 from VE-FORBRYDERNE/merge
Merge main into united
2022-09-26 22:11:44 +02:00
vfbd 79ae0f17ec Merge branch 'main' into merge 2022-09-26 16:10:10 -04:00
Henk ce692c7ebf Pinned info toggle fix 2022-09-26 16:36:50 +02:00
henk717 c88f88f54d
Merge pull request #215 from ebolam/united
Update for horde selection to pull models automatically
2022-09-25 19:50:44 +02:00
ebolam 1f6861d55c Update for horde selection to pull models automatically (or on typing, with a 1 second delay) 2022-09-25 13:49:02 -04:00
Henk 50266ab49a Torch fixes for colab 2022-09-25 19:47:08 +02:00
Henk b0a32d3646 Requirements Cleanup 2022-09-25 19:40:59 +02:00
Henk 9f15077337 Requirements Updates 2022-09-25 19:33:18 +02:00
Henk 6009586ac8 Requirements Updates 2022-09-25 19:30:09 +02:00
Henk 1a55088a21 Merge branch 'united' of https://github.com/henk717/koboldai into united 2022-09-25 17:26:59 +02:00
Henk d7ed577bf7 Don't stream in chatmode 2022-09-25 17:26:52 +02:00
henk717 0da5031955
Merge branch 'KoboldAI:main' into united 2022-09-25 17:08:07 +02:00
Henk 465c1fd64d 1.19 version bump and polish 2022-09-25 17:00:33 +02:00
Henk ba85ae4527 WI Improvements 2022-09-25 11:13:57 +02:00
Henk d4b7705095 Merge branch 'main' into united 2022-09-24 22:14:32 +02:00
Henk c66657ef1b Flaskwebgui removal 2 2022-09-23 14:46:02 +02:00
Henk 557f7fc0fc Remove flaskwebgui (Conflicts with our threading) 2022-09-23 14:45:47 +02:00
Henk 2ec7e1c5da Continue if webbrowser fails to open 2022-09-23 14:24:57 +02:00
Henk fef946c173 Possible colab aria2 status fix 2022-09-21 19:48:17 +02:00
Henk 06f4d9addf No Aria2 spam 2022-09-21 19:36:57 +02:00
henk717 a5f2ab42d6
Merge pull request #214 from ebolam/united
Fix for Aria2 Status in UI
2022-09-21 19:20:45 +02:00
ebolam bae5184b3b Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-09-21 13:10:23 -04:00
ebolam 8915ee7eb3 Fix for aria2 download status to UI 2022-09-21 13:09:27 -04:00
ebolam 0e270e0b25 Fix for aria2 download status to UI 2022-09-21 13:07:49 -04:00
Henk f3d6beb578 Remove accidentally uploaded file 2022-09-21 18:59:39 +02:00
Henk 68747f2a17 More cleanup 2022-09-21 18:58:44 +02:00
Henk e0c5564244 Cleanup test files 2022-09-21 18:57:54 +02:00
Henk cca3ce3493 Aria2 Fixes 2022-09-21 18:57:09 +02:00
Henk f62c740f7e Revert "Aria2 Fixes"
This reverts commit 8d1c734df8.
2022-09-21 18:47:13 +02:00
Henk 8d1c734df8 Aria2 Fixes 2022-09-21 18:21:48 +02:00
Henk e68f284006 show_budget fix 2022-09-21 17:53:16 +02:00
Henk d0664207e8 Sync and save show budget 2022-09-21 17:39:56 +02:00
Henk a6298ee6df Slider fixes 2022-09-21 17:29:44 +02:00
Henk d1d23b5383 New Models 2022-09-20 18:12:54 +02:00
henk717 6ffdc4e356
Merge pull request #213 from db0/horde_settings
fix previously saved settings overwriting new API key
2022-09-17 17:07:27 +02:00
henk717 f904cd4839
Merge pull request #211 from ebolam/united
Fix for default URL in Horde Mode
2022-09-17 00:27:13 +02:00
Divided by Zer0 4362ca4b34 fix previously saved settings overwriting new API key 2022-09-16 16:23:04 +02:00
henk717 981acaef71
Merge pull request #212 from db0/loguru_deps
loguru dependency in all environments
2022-09-16 00:57:10 +02:00
ebolam 7146a063ce Fix for default URL in Horde Mode 2022-09-15 18:55:36 -04:00
Divided by Zer0 198e2920d2 loguru dependency in all environments 2022-09-16 00:53:53 +02:00
henk717 8eb4cd36ad
Merge pull request #210 from db0/model_loading_msg
fix model loading format bleeding into gui
2022-09-16 00:11:54 +02:00
Divided by Zer0 9582722c2e fix model loading format bleeding into gui 2022-09-16 00:08:59 +02:00
henk717 77763da6e2
Merge pull request #209 from VE-FORBRYDERNE/dependency-fix
Fix compatibility issues with transformers and optax/chex
2022-09-15 23:48:13 +02:00
vfbd e8e0ad85be Merge branch 'united' into dependency-fix 2022-09-15 17:41:34 -04:00
vfbd c288c39de7 Remove type hints from http_get 2022-09-15 17:39:32 -04:00
vfbd 943614b5e6 Merge branch 'main' into dependency-fix 2022-09-15 17:33:48 -04:00
henk717 8c934b4488
Merge pull request #208 from db0/loguru
Onboarded Loguru as logging infrastructure
2022-09-15 23:24:11 +02:00
Divided by Zer0 36b80a5542
Merge branch 'united' into loguru 2022-09-15 12:48:41 +02:00
Henk 7d4c690471 Merge branch 'main' into united 2022-09-15 00:43:10 +02:00
Divided by Zer0 e58df9568c
Merge branch 'united' into loguru 2022-09-14 01:46:37 +02:00
Henk be7f7cab7e Merge branch 'main' into united 2022-09-13 15:12:34 +02:00
henk717 8e07f5fd50
Merge branch 'KoboldAI:main' into united 2022-09-13 13:55:07 +02:00
Divided by Zer0 a75351668f switched model retrieval and sendtocluster to loguru 2022-09-12 17:22:27 +02:00
Divided by Zer0 9280102cb3 Moved Flask Startup into __main__
This allows command line arguments to be parsed first
2022-09-12 17:00:30 +02:00
Divided by Zer0 6bc702854f added verbosity controls 2022-09-12 16:47:10 +02:00
Divided by Zer0 c05e0864c4 added verbosity controls 2022-09-12 16:30:19 +02:00
Divided by Zer0 86256ca4e3 Made messages the highest priority 2022-09-12 12:03:38 +02:00
Divided by Zer0 c858452740 Added logger to fileops 2022-09-12 12:00:30 +02:00
Divided by Zer0 68aaef9090 escaping gen/prompt logs so that they stay on one line 2022-09-12 11:49:59 +02:00
Divided by Zer0 3ed39f9863 loguru to deps 2022-09-12 02:06:37 +02:00
Divided by Zer0 239a141d7e added loguru dependency 2022-09-12 02:04:33 +02:00
Divided by Zer0 d30bbd28a1 logger for prompt and gen 2022-09-12 01:57:41 +02:00
Divided by Zer0 66ae5c35c0 replaced all warnings with logger 2022-09-12 01:18:25 +02:00
Divided by Zer0 5692e5ce16 finished all init messages 2022-09-12 01:13:12 +02:00
Divided by Zer0 656e3995f0 model tensors info 2022-09-12 01:09:21 +02:00
Divided by Zer0 48f6b5a939 more init messages 2022-09-12 01:00:03 +02:00
Divided by Zer0 4817a27552 multicolored init 2022-09-12 00:49:51 +02:00
Divided by Zer0 11eac68676 more init messages 2022-09-12 00:19:51 +02:00
Divided by Zer0 2a8a223473 added init logs 2022-09-11 23:55:48 +02:00
Divided by Zer0 e29f6c94d3 prompt_colors 2022-09-11 23:08:17 +02:00
Divided by Zer0 ce2d1ff654 logger file 2022-09-11 22:59:25 +02:00
Divided by Zer0 432af79fa5 reqs 2022-09-11 19:56:36 +02:00
Divided by Zer0 ee357fff3d reqs 2022-09-11 19:51:23 +02:00
Divided by Zer0 88d8f815f5 init 2022-09-11 19:46:51 +02:00
henk717 8fce4c192e
Merge pull request #207 from db0/hide_prompt_option
Allows specifying that the /generate API endpoint should go quiet
2022-09-11 19:27:38 +02:00
Divided by Zer0 f97d285f9f Allows to specify to the API to go quiet 2022-09-11 19:23:28 +02:00
henk717 9405adb885
Merge pull request #206 from db0/horde_error_handling
horde error handling
2022-09-10 19:12:13 +02:00
Divided by Zer0 888e33a63e switch to new API 2022-09-10 19:02:55 +02:00
Divided by Zer0 2eefb488d5 f-string 2022-09-10 15:17:13 +02:00
Divided by Zer0 d6fc61739f no need to dmp json 2022-09-10 14:52:24 +02:00
Divided by Zer0 684399cdd6 fix the fix 2022-09-10 14:51:31 +02:00
Divided by Zer0 13ca465980 horde error handling 2022-09-10 10:03:46 +02:00
henk717 4851c1dd46
Merge pull request #205 from VE-FORBRYDERNE/hidden-size
Fix hidden size calculation for GPT-NeoX models
2022-09-07 22:57:29 +02:00
vfbd 153f6b6c92 Fix hidden size calculation for GPT-NeoX models 2022-09-07 13:21:49 -04:00
henk717 8bbb9ff761
Merge pull request #204 from db0/api_key
Adjustment to work with authentication on the Horde
2022-09-07 14:01:49 +02:00
Divided by Zer0 16fae3c6df username > api_key 2022-09-07 01:36:59 +02:00
Henk cf3f38b90d Fix merge artifacts 2022-09-06 01:54:18 +02:00
henk717 8ed731daff
Merge pull request #201 from ebolam/united
Fix for Horde mode
2022-09-06 01:50:10 +02:00
ebolam a383ef81b1 Merge commit 'refs/pull/107/head' of https://github.com/ebolam/KoboldAI into united 2022-09-05 19:49:32 -04:00
Henk 296481f4aa More config hardening 2022-09-05 22:32:20 +02:00
henk717 78d720037f
Merge pull request #202 from db0/configfile_fix
fix settings name not being correct for loaded models
2022-09-05 22:30:43 +02:00
henk717 3236068c84
Merge pull request #203 from VE-FORBRYDERNE/patch
Fix POST /story/end API endpoint
2022-09-05 22:27:54 +02:00
vfbd f66ffa09a2 Fix POST /story/end API endpoint 2022-09-05 14:37:39 -04:00
Divided by Zer0 542f30cdc4 fix settings name not being correct for loaded models 2022-09-05 18:03:21 +02:00
henk717 c7a6309fa2
Merge pull request #200 from db0/online_model_fix
Fixes Horde not saving as expected
2022-09-03 23:40:23 +02:00
ebolam 397059cf2f Fix for Horde mode 2022-09-03 13:56:53 -04:00
Divided by Zer0 c5ee5d3ea2 Fixes Horde not saving as expected
Now Horde will save different settings per model, or for All

Refactored the code so that args.configname
is not used like a global var.

Added var.online_model because we need to keep track of it
2022-09-03 19:36:06 +02:00
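A rough sketch of the idea described in this commit; the function and naming scheme are assumptions rather than the actual KoboldAI code, but they illustrate deriving a per-model settings file from the selected online model instead of treating args.configname like a global variable.

```python
def horde_settings_filename(config_name: str, online_model: str = "") -> str:
    # One settings file per selected Horde model; an empty online_model stands
    # for the "All" selection and keeps the plain config name.
    if online_model:
        return f"settings/{config_name}_{online_model.replace('/', '_')}.settings"
    return f"settings/{config_name}.settings"
```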
henk717 f38034bd2c
Merge pull request #199 from db0/horde
Renamed KoboldAI Cluster to KoboldAI Horde
2022-09-02 00:26:12 +02:00
Divided by Zer0 9463474ce4 renamed cluster to horde 2022-09-02 00:23:29 +02:00
henk717 05a4695ad2
Merge pull request #198 from db0/soft_prompts_list
Adds /config/soft_prompts_list API endpoint
2022-09-01 16:35:08 +02:00
henk717 b0aa615ef5
Merge pull request #196 from db0/mult_gen_api_cluster
fix for multiple gens breaking API/CLUSTER
2022-09-01 16:34:52 +02:00
henk717 a809170bdc
Merge pull request #197 from ebolam/united
UI changes for cluster support
2022-09-01 16:34:32 +02:00
Divided by Zer0 c1bf91f86c Adds /config/soft_prompts_list API endpoint 2022-08-31 23:45:26 +02:00
Divided by Zer0 339225e400 fix for multiple gens breaking API/CLUSTER 2022-08-31 22:58:58 +02:00
ebolam 8626debeff Fix for cluster key saving 2022-08-31 15:46:08 -04:00
ebolam b07a649e3e Fix for API key not being saved 2022-08-31 13:17:30 -04:00
ebolam 417cfe20bf Fix for saving key in cluster mode 2022-08-31 11:50:27 -04:00
ebolam bf814ad407 Add model loading on url or key change for CLUSTER mode 2022-08-31 11:48:38 -04:00
ebolam 6258963e39 Fixed all option for Cluster model selection 2022-08-31 11:10:41 -04:00
ebolam 24ac6f3db8 First working CLUSTER ui. Might need change when multiple models selected. 2022-08-31 10:46:16 -04:00
ebolam 569f4cbce4 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-08-31 09:34:24 -04:00
ebolam 1031b70731 Starts of adding cluster to UI 2022-08-31 09:34:14 -04:00
henk717 39944c4258
Merge pull request #195 from db0/kai_cluster3
Adds support for using the KAI Cluster approach directly from KAI
2022-08-30 22:45:56 +02:00
Divided by Zer0 496ef1472d updated 2022-08-30 21:35:17 +02:00
Divided by Zer0 42e04afc83 init 2022-08-30 21:29:55 +02:00
henk717 c5caa03e5b
Merge pull request #194 from ebolam/united
Fix for KoboldAI API as a model option
2022-08-30 21:11:49 +02:00
ebolam 181c93424c Fix for KoboldAI API as a model option 2022-08-30 15:10:11 -04:00
ebolam 8d3eb44d2e
Merge pull request #83 from henk717/united
Update
2022-08-30 15:03:34 -04:00
henk717 1641b7538b
Merge pull request #192 from VE-FORBRYDERNE/api-model-change
PUT /model endpoint for changing the model from the API
2022-08-30 20:41:06 +02:00
vfbd 8292f17ab0 Don't allow changing model during generation 2022-08-29 13:23:19 -04:00
vfbd 807ddf6f26 Add PUT /model endpoint 2022-08-28 15:53:15 -04:00
ebolam b5a6b44582 Revert "Bug fix for saves putting actions metadata as a dict instead of a list when not used yet"
This reverts commit 171effc29b.
2022-08-27 18:47:57 -04:00
ebolam 171effc29b Bug fix for saves putting actions metadata as a dict instead of a list when not used yet 2022-08-27 18:25:56 -04:00
henk717 7c01933743
Merge pull request #190 from VE-FORBRYDERNE/patch
Fix error that occurs when using dynamic TPU backend
2022-08-27 23:44:37 +02:00
vfbd cbacfbdfac Fix error that occurs when using dynamic TPU backend 2022-08-27 17:42:49 -04:00
vfbd cbab98cc23 Merge branch 'united' into mkultra 2022-08-24 15:06:02 -04:00
henk717 a282500da7
Merge branch 'KoboldAI:main' into united 2022-08-24 20:40:17 +02:00
henk717 6faa27ef87
Merge pull request #187 from VE-FORBRYDERNE/offline
Fix the model selection GUI when there is no internet connection
2022-08-24 20:13:02 +02:00
vfbd 51135e192b Merge branch 'united' into mkultra 2022-08-23 21:29:29 -04:00
henk717 3fdee98fcc
Merge pull request #189 from VE-FORBRYDERNE/rep-pen-order
Allow changing order of repetition penalty relative to other samplers
2022-08-24 00:55:33 +02:00
vfbd 938e1eddf3 Fix `jax.lax.cond` call 2022-08-23 18:13:46 -04:00
vfbd ff9058896e Add Repetition Penalty to Samplers menu 2022-08-23 15:42:23 -04:00
vfbd cbfe456409 Repetition penalty is now added to sampler list when loading from settings files 2022-08-23 15:30:07 -04:00
vfbd 62dd9b8c11 Merge branch 'patch' into rep-pen-order 2022-08-23 15:26:25 -04:00
vfbd aee4beb27a Fix the Show Field Budget toggle 2022-08-23 15:26:15 -04:00
vfbd 6ffaf43548 Repetition penalty is now sampler #6 in the sampler order 2022-08-23 15:10:21 -04:00
vfbd 9eecb61fea Remove unused import from warpers.py 2022-08-23 14:52:45 -04:00
vfbd 74922966bd Merge branch 'avril' into rep-pen-order 2022-08-23 14:47:29 -04:00
vfbd 624f916dc6 Fix some remaining problems in prompt_tuner.py 2022-08-22 22:57:30 -04:00
vfbd 07eb2b5c4f Disable urllib3 logger in prompt_tuner.py to disable aria2 warnings 2022-08-22 21:57:46 -04:00
vfbd a51e4f0651 aria2_hook now handles properly when vars is None 2022-08-22 21:52:40 -04:00
vfbd bae8d88651 Fix typo in get_hf_checkpoint_metadata 2022-08-22 21:50:06 -04:00
vfbd aede7ef192 Fix typo in training routine of prompt_tuner.py 2022-08-22 21:38:13 -04:00
vfbd 1e9f0e68a0 Merge branch 'united' into mkultra 2022-08-22 21:25:42 -04:00
vfbd b60d14e3bf Handle -1's in prompt_tuner.py breakmodel_gpulayers 2022-08-22 21:25:07 -04:00
vfbd 09750acfa0 prompt_tuner.py now shows layer configuration 2022-08-22 20:02:21 -04:00
vfbd b1c456ec18 prompt_tuner.py always has accelerate 2022-08-22 19:52:47 -04:00
vfbd 8da6893407 Replace MTJSP with MKUSP in prompt_tuner.py 2022-08-22 19:29:56 -04:00
vfbd 3d5c83fc23 prompt_tuner.py now uses lazy loader and accelerate 2022-08-22 19:29:20 -04:00
henk717 65a0197e64
Merge pull request #188 from one-some/budget-setting
Add show budget setting
2022-08-23 01:00:48 +02:00
somebody 95796faf41 Add show budget setting 2022-08-22 17:25:55 -05:00
somebody d7ebd2ae20 Dont broadcast token usage 2022-08-22 17:25:33 -05:00
vfbd 584056b6d5 Fix remaining problems in prompt_tuner.py 2022-08-22 17:30:49 -04:00
vfbd f79926b73d Fix some more typos in prompt_tuner.py 2022-08-22 16:51:09 -04:00
vfbd a49a633164 `self.ckpt_path` -> `self.data.ckpt_path` 2022-08-22 16:46:39 -04:00
vfbd 05cf9b1dde Upload BasicTrainer class 2022-08-22 16:43:02 -04:00
vfbd 728e19a7f0 Implement file saving in prompt_tuner.py 2022-08-22 16:29:39 -04:00
vfbd 55f45c4912 Fix the model selection GUI when there is no internet connection 2022-08-22 14:45:02 -04:00
vfbd 4e88b277d4 Merge branch 'united' into mkultra 2022-08-20 23:24:03 -04:00
henk717 0d4bffe8f8
Merge pull request #186 from ebolam/united
Fix for Aria2 download status not showing in colab console
2022-08-19 18:18:03 +02:00
ebolam 8ba68e05ec Aria2 Status Bar Download Fix 2022-08-19 12:13:46 -04:00
ebolam 10c46340f7 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-08-19 12:12:07 -04:00
ebolam 513f59791a Debug 2022-08-19 12:11:53 -04:00
ebolam 812ac8f27d Debug 2022-08-19 12:10:29 -04:00
ebolam 67fe4dc979 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-08-19 12:08:07 -04:00
ebolam 046f9d8ace Fix for Colab download status bar 2022-08-19 12:08:00 -04:00
ebolam 7eee21d674 Fix for Colab download status bar 2022-08-19 12:05:22 -04:00
ebolam cf422aa16e Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-08-19 10:52:05 -04:00
ebolam ec90b76064 Add print in console for model downloading when using Aria2 2022-08-19 10:51:56 -04:00
ebolam 081240fad1 Add print in console for model downloading when using Aria2 2022-08-19 10:19:10 -04:00
henk717 05ad6c100b
Merge pull request #185 from ebolam/united
Update for execution time timer
2022-08-19 15:29:06 +02:00
ebolam 10e3e64b0b Update for execution time timer 2022-08-18 19:10:18 -04:00
Henk b04a3a2fbb Dismiss reload warning when needed 2022-08-18 23:10:19 +02:00
henk717 a3862946aa
Merge pull request #184 from ebolam/united
Fix for vars.model getting set on AI selection in the UI rather than when actually loaded
2022-08-18 00:05:55 +02:00
ebolam 137695106d Fix for gooseai 2022-08-17 18:03:48 -04:00
ebolam a19300b3ca
Merge branch 'henk717:united' into united 2022-08-17 09:07:01 -04:00
henk717 85337ccf11
Merge branch 'KoboldAI:main' into united 2022-08-16 16:02:43 +02:00
ebolam 0032462837 Fix for vars.model getting set on AI selection in the UI rather than when actually loaded 2022-08-13 20:12:35 -04:00
Henk 6acccbf7a4 Save null seed to settings 2022-08-14 02:09:53 +02:00
henk717 c453643e2c
Merge pull request #183 from ebolam/united
Better Icon placement in AI load menu
2022-08-13 23:17:15 +02:00
ebolam 85d925aead Better Icon placement in AI load menu 2022-08-13 09:51:20 -04:00
vfbd 31ea1bafac Merge branch 'united' into mkultra 2022-08-12 17:47:27 -04:00
henk717 78a6f1cf08
Merge pull request #182 from VE-FORBRYDERNE/api
API
2022-08-12 20:46:14 +02:00
vfbd a7fb2c8414 Merge branch 'united' into api 2022-08-12 13:57:50 -04:00
Henk 09a709f0dc Merge branch 'token-usage-textsize' into united 2022-08-12 01:46:33 +02:00
somebody 6ac970b1c0 Update author's template effect token usage live 2022-08-11 18:38:29 -05:00
somebody c21c1e3dc0 Don't show token usage when max tokens is unknown 2022-08-11 18:22:06 -05:00
somebody a28faa0cb2 Fix author's note token usage 2022-08-11 18:21:49 -05:00
henk717 e0229302cd
Merge branch 'KoboldAI:main' into united 2022-08-12 01:06:47 +02:00
henk717 4ed913ad4f
Merge pull request #180 from ebolam/united
Fix for APIs and Custom Models not working in AI menu
2022-08-12 00:52:29 +02:00
vfbd e879d1c5f3 Hide the warning about `torch.distributed.reduce_op` being deprecated 2022-08-11 18:42:56 -04:00
somebody 555ca5fd05 Add token usage indicator 2022-08-11 17:31:12 -05:00
vfbd 8c7ed92fef --no_ui for disabling the main GUI and Socket.IO server 2022-08-11 18:21:35 -04:00
ebolam ca2c60d423 Fix for --nobreakmodel 2022-08-11 18:12:50 -04:00
ebolam bddcd7ab7f Deeper disable of --nobreakmodel attempt 2022-08-11 17:47:19 -04:00
vfbd 8fbca2db5a actionsubmit should not ignore vars.aibusy if vars.standalone is True 2022-08-11 15:27:07 -04:00
ebolam 45495d8792 Fix for --cpu on command line and MAYBE --nobreakmodel 2022-08-11 15:23:35 -04:00
vfbd 8b299525fd sendtoapi now automatically detects tokenizer 2022-08-11 14:57:13 -04:00
vfbd d328c2c1de Add GET /model endpoint 2022-08-11 14:55:38 -04:00
vfbd cd7ff2b141 Change behaviour of disable_input/output_formatting 2022-08-11 14:30:14 -04:00
vfbd bd703cd36a Warn about disable_input_formatting and disable_output_formatting 2022-08-11 14:12:07 -04:00
vfbd df111b944d Input formatting is now actually applied when generating with API 2022-08-11 14:11:17 -04:00
vfbd c0c9d62cd7 Latest API version is now automatically calculated 2022-08-11 14:00:43 -04:00
vfbd 8482df0d8d Remove HTTP error 422 from specifications where it is never thrown 2022-08-11 13:56:00 -04:00
vfbd 78cc5da87f `calcsubmitbudget` no longer adds `_koboldai_header` to API requests 2022-08-11 13:37:09 -04:00
vfbd 43e318bdc2 Safer method of determining request URL in sendtoapi 2022-08-11 13:29:47 -04:00
ebolam 64664dc61e Fix for the AI menu to respect the --cpu command line flag 2022-08-11 10:40:32 -04:00
ebolam 9016e29c66 Fix for APIs and Custom Models not working in AI menu 2022-08-11 10:33:47 -04:00
vfbd 1527db894e Fix specification of GET /story/nums/{num} 2022-08-10 21:09:29 -04:00
vfbd 6853625570 Allow KoboldAI to use its own API to generate text 2022-08-10 21:00:17 -04:00
vfbd 4eff7bf3ba /api now redirects to /api/latest 2022-08-10 18:22:46 -04:00
vfbd f2e2c40bc8 Merge branch 'united' into api 2022-08-10 18:11:07 -04:00
vfbd d2c06182f2 Remove annotation from api_version 2022-08-10 18:05:04 -04:00
henk717 784dea8298
Merge pull request #179 from ebolam/united
Fix for model loading not updating some attributes on the page.
2022-08-10 23:57:05 +02:00
vfbd 2af57adff3 API v1.1.0 2022-08-10 14:48:01 -04:00
vfbd becda8b842 Error 405 now sets Allow header 2022-08-09 22:32:24 -04:00
vfbd 5352c14c59 Fix typo in /config/soft_prompt documentation 2022-08-08 19:20:48 -04:00
vfbd c04e3c5666 Fix /docs/ redirects 2022-08-08 18:21:46 -04:00
vfbd 55c4acad8f Disable probability viewer and output streaming when using API 2022-08-08 18:16:08 -04:00
vfbd 82ae749396 Merge branch 'united' into api 2022-08-08 18:14:50 -04:00
vfbd aa01d1419d Add /story/end/delete and /story endpoints 2022-08-08 18:08:55 -04:00
vfbd 1f629ee254 Add more endpoints 2022-08-08 17:51:40 -04:00
vfbd a93087aecd Fix `api_format_docstring` 2022-08-08 14:21:50 -04:00
vfbd ddda981436 Improve /generate description 2022-08-08 14:19:43 -04:00
vfbd dc0fa9bff1 Add redirects to /api/v1/docs/ 2022-08-08 14:16:38 -04:00
vfbd ce064168e3 Additional validation for soft_prompt in API 2022-08-08 13:52:07 -04:00
vfbd de1e8f266a ValidationErrorSchema now has minItems 1 for its arrays 2022-08-08 13:22:18 -04:00
vfbd 596f619999 Unknown values in API input are now ignored instead of causing error 2022-08-08 13:17:53 -04:00
vfbd 3b56859c12 vars.disable_input_formatting and vars.disable_output_formatting fix 2022-08-08 13:04:46 -04:00
ebolam ad6bf95c42 Fix for model loading not updating some attributes on the page. 2022-08-08 10:16:10 -04:00
vfbd 3460e62271 Mark /static/swagger-ui as linguist-vendored 2022-08-08 02:40:39 -04:00
vfbd 34c9535667 Upload basic API with /generate POST endpoint 2022-08-08 02:27:48 -04:00
Henk 77e2a7972c Fix incorrect max tokens 2022-08-06 17:28:55 +02:00
Henk 76c7783ac8 Hide useless scrollbars on model list 2022-08-06 17:15:29 +02:00
Henk fe00581b83 Merge branch 'main' into united 2022-08-06 17:10:09 +02:00
Henk 610257b36e Output Streaming on by Default 2022-08-06 16:47:04 +02:00
Henk fccb464989 Polish 2 2022-08-06 16:43:45 +02:00
Henk 0a0bd75617 Polish 2022-08-06 16:42:15 +02:00
henk717 8bcf4187ac
Merge pull request #178 from one-some/token-prob
Add token probability visualizer
2022-08-05 14:27:46 +02:00
henk717 ed9391431b
Merge pull request #177 from ebolam/united
Fix for secondary model loads leaking settings into secondary model's…
2022-08-05 14:20:25 +02:00
ebolam b484b973d9 Fix for custom model box not showing up in model load menu if there aren't any models in the model folder 2022-08-04 19:26:55 -04:00
somebody f6d046fe1b Add token probability visualizer 2022-08-04 13:49:37 -05:00
henk717 5f1ffc0cd4
Merge branch 'KoboldAI:main' into united 2022-08-04 20:48:31 +02:00
vfbd 8823059713 Merge branch 'united' into mkultra 2022-08-03 15:58:13 -04:00
vfbd 00e8928ee6 Upload current progress 2022-08-03 15:57:23 -04:00
ebolam 71e119f0b7 Fix for secondary model loads leaking settings into secondary model's settings file. 2022-08-02 19:45:36 -04:00
ebolam 7ab39bac0f
Merge pull request #21 from henk717/united
Update to current
2022-08-02 19:38:16 -04:00
henk717 bd13a41eb7
Merge pull request #176 from one-some/shortcuts
Add editor shortcuts and actually fix streaming newline bug
2022-07-31 12:46:17 +02:00
somebody 59d55369cb Add shortcuts
Adds shortcuts to the UI. These shortcuts are:
Ctrl-Z: Undo
Ctrl-Y: Redo
Ctrl-E: Retry
2022-07-30 21:34:42 -05:00
somebody 5d135e091e Fix the streaming newline bug (for real this time I promise)
After actually investigating the cause of the bug instead of duct taping
a fix on, I have produced a better fix. The previous fix caused a bug
where a newline was *removed* where it shouldn't have been when
undoing and redoing. The reason this bug was happening in the first
place was because some "newline regulation" code was falsely detecting
the output stream as a text chunk, and attempted to remove a newline
from there instead of the actual chunk.
2022-07-30 21:28:50 -05:00
henk717 56783a1257
Merge pull request #175 from one-some/token-streaming-newline-fix
Fix arbitrary newline insertion when token streaming is enabled
2022-07-31 01:22:15 +02:00
somebody 32ad83df8e Fix arbitrary newline insertion when token streaming is enabled
In adventure mode with token streaming enabled, retrying after an action
would cause a <br> element to be inserted into the action. This prevents
the insertion of the <br> if token streaming is enabled.
2022-07-30 15:05:54 -05:00
vfbd 9cf1b071b5 Merge branch 'united' into mkultra 2022-07-30 15:48:06 -04:00
henk717 71ea8b215a
Merge branch 'KoboldAI:main' into united 2022-07-30 21:46:25 +02:00
vfbd d1925452f6 Merge branch 'united' into mkultra 2022-07-30 13:34:33 -04:00
henk717 050e195420
Merge pull request #173 from one-some/token-streaming
Add token streaming option
2022-07-30 18:32:51 +02:00
henk717 a63f7cfa5a
Merge pull request #174 from ebolam/united
Fix for blank model info box when downloading model
2022-07-29 22:15:58 +02:00
ebolam f97c10b794 Fix for blank model info box when downloading model 2022-07-28 19:40:27 -04:00
vfbd e469a64a02 Merge branch 'main' into mkultra 2022-07-28 19:12:43 -04:00
somebody a4d81292f8 Add token streaming option 2022-07-27 22:13:08 -05:00
henk717 699c2353e7
Merge branch 'KoboldAI:main' into united 2022-07-27 18:06:53 +02:00
vfbd ee492647ff Merge branch 'united' into mkultra 2022-07-27 11:35:32 -04:00
henk717 fe64e480ee
Merge pull request #171 from ebolam/united
Add Download Model Status
2022-07-26 00:52:12 +02:00
Henk 317e85dbaa New spacing language 2022-07-26 00:49:54 +02:00
henk717 c43fefed74
Merge branch 'KoboldAI:main' into united 2022-07-26 00:45:53 +02:00
henk717 7721b72184
Merge branch 'KoboldAI:main' into united 2022-07-26 00:42:35 +02:00
ebolam 3b5ab92a02 Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-07-25 18:29:27 -04:00
ebolam 12acb50ee0 Fix for getting "model download status" when downloading config to figure out layer counts 2022-07-25 18:29:14 -04:00
henk717 46231519af
Merge pull request #172 from scott-ca/united
Added functionality to load all of the CLI arguments via a single JSON file
2022-07-26 00:14:46 +02:00
vfbd 168c14fd4c Better way to copy mkultra methods 2022-07-24 00:35:58 -04:00
scott-ca ce2efa0149
Update customsettings_template.json 2022-07-23 22:06:56 -06:00
scott-ca 9dc9966433 Added functionality to add any/all args via json 2022-07-23 22:02:03 -06:00
vfbd 289248ef40 Write AutoPromptTuningLM class 2022-07-23 14:37:28 -04:00
ebolam 0ab3612e49
Merge branch 'henk717:united' into united 2022-07-22 13:58:58 -04:00
ebolam 907cf74b13 Added status bar for downloading models 2022-07-22 13:58:20 -04:00
henk717 e860eb161d
Merge branch 'KoboldAI:main' into united 2022-07-22 15:33:25 +02:00
henk717 f1fd46fca6
Merge branch 'KoboldAI:main' into united 2022-07-22 15:19:19 +02:00
henk717 5f3783a294
Merge branch 'KoboldAI:main' into united 2022-07-21 20:02:45 +02:00
ebolam 2b53598307
Fixes for file editor (#170)
Various fixes for the file editor by Ebolam
2022-07-20 00:50:03 +02:00
ebolam a0475ba049 Moved emit action on fire browser to button rather than icon for easier clicking 2022-07-19 18:16:01 -04:00
ebolam f58064e72c Revert "Fix for aidg.club website being taken read-only"
This reverts commit 23a031d852.
2022-07-19 16:54:32 -04:00
ebolam c3fdee68a8 Revert "Revert "Fix for edit files""
This reverts commit 9c1fc5af8b.
2022-07-19 16:53:45 -04:00
ebolam 9c1fc5af8b Revert "Fix for edit files"
This reverts commit aedd7e966b.
2022-07-19 14:02:27 -04:00
ebolam 23a031d852 Fix for aidg.club website being taken read-only 2022-07-19 13:40:55 -04:00
ebolam 68d143b80c Merge branch 'united' of https://github.com/ebolam/KoboldAI into united 2022-07-15 12:30:18 -04:00
ebolam d91ed3141d Fix for non ascii files in edit mode 2022-07-15 12:30:02 -04:00
ebolam 9c136985a7
Merge branch 'henk717:united' into united 2022-07-15 12:29:00 -04:00
henk717 68110c5930
Merge branch 'KoboldAI:main' into united 2022-07-12 23:03:09 +02:00
henk717 f900a17f3c
Merge pull request #168 from VE-FORBRYDERNE/bloom
BLOOM support for TPU instances
2022-07-08 01:25:28 +02:00
vfbd d9e7ca5b48 Upload map file for BLOOM 2022-07-07 17:48:00 -04:00
henk717 9e140e3ba9
Merge branch 'KoboldAI:main' into united 2022-07-05 21:35:53 +02:00
ebolam aedd7e966b Fix for edit files 2022-07-04 19:08:30 -04:00
Henk 736a39b10b gitignore update 2022-07-04 20:12:11 +02:00
Henk b76e82644a flask-session for conda 2022-07-04 20:07:11 +02:00
henk717 e8c39992a1
Merge pull request #166 from ebolam/united
Add file browser to soft prompts and user scripts
2022-07-04 19:52:05 +02:00
ebolam 8013bc2a98 Added background blur for popup file editor 2022-07-03 16:21:48 -04:00
ebolam 328c0a38d7 Removed breadcrumbs on file browser before the jail directory 2022-07-03 16:02:55 -04:00
henk717 fd44f0ded3
Merge branch 'KoboldAI:main' into united 2022-07-03 15:12:12 +02:00
henk717 a99518d0a8
Merge branch 'KoboldAI:main' into united 2022-07-02 12:59:53 +02:00
henk717 74547b31d6
Merge pull request #167 from VE-FORBRYDERNE/accelerate
Fix base fairseq dense models when using accelerate with a GPU
2022-07-02 02:19:41 +02:00
vfbd aeed9bd8f7 Fix base fairseq dense models when using accelerate with a GPU 2022-07-01 20:16:39 -04:00
henk717 5d957e33ae
Merge branch 'KoboldAI:main' into united 2022-07-01 20:33:36 +02:00
ebolam 74d8e5f71b File Manager button visual change 2022-06-30 19:52:26 -04:00
ebolam 3f8a7ab4bb Allowing edit in userscripts 2022-06-30 19:41:11 -04:00
ebolam 813540fe9b Added folder browser for softprompts and userscripts 2022-06-30 19:13:05 -04:00
ebolam 97e0df45d7 File Dialog complete 2022-06-30 15:57:27 -04:00
ebolam 58418c4aa5 Basic file browser with edit and delete functionality
Can be shown by going to /popup_test in a second tab.
2022-06-30 09:44:04 -04:00
henk717 979ea074f2
Merge pull request #165 from VE-FORBRYDERNE/seed
Add support for setting the RNG seed and full determinism
2022-06-28 19:42:40 +02:00
vfbd 048bd0ff3b Add support for setting the RNG seed and full determinism 2022-06-28 13:21:05 -04:00
henk717 72d661111d
Merge branch 'KoboldAI:main' into united 2022-06-28 18:37:31 +02:00
Henk 496f6dcf3f Add docker usage info 2022-06-28 16:28:00 +02:00
Henk 33b8cec452 Standalone Docker
KoboldAI United now has a working official docker available at henk717/koboldai:united. In the spirit of our project, this commit open sources the files used to build the docker. docker-helper.sh is in the main folder so that it is not overwritten during updates. Instead, this file is copied by the Dockerfile and should be updated through container updates.
2022-06-28 16:23:21 +02:00
Henk cba38bf3e4 Automatically update requirements 2022-06-28 15:17:32 +02:00
henk717 5f1c98af8e
Merge pull request #164 from ebolam/united
Fix for saved breakmodel settings on custom models
2022-06-27 16:18:09 +02:00
ebolam edd6dd7cd7 Fix for saved breakmodel settings on custom models
Fix for unit tests with new disk breakmodel
2022-06-27 10:12:54 -04:00
henk717 2207ac4b0a
Merge branch 'KoboldAI:main' into united 2022-06-26 20:36:13 +02:00
Henk 46678931b2 Better sentence spacing 2022-06-26 20:27:21 +02:00
henk717 8cea194809
Merge pull request #163 from VE-FORBRYDERNE/whitespace-cleanup
More story whitespace cleanup
2022-06-26 19:50:28 +02:00
vfbd ae41ad298c Make sure editor changes are applied before submitting 2022-06-26 13:40:58 -04:00
vfbd b99d1449c9 Remove trailing whitespace from submissions 2022-06-26 13:15:55 -04:00
vfbd 151407a001 Don't add sentence spacing if submission is empty
When you retry, it actually sends an empty submission, so if you have
add sentence spacing on, retrying could add an extra action with a
single space.
2022-06-26 13:02:22 -04:00
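An illustrative sketch of the rule this commit describes (the function and setting names are assumptions): with the "add sentence spacing" option on, a separating space is only added for a non-empty submission, so the empty submission sent by a retry no longer produces an action containing a single space.

```python
def apply_sentence_spacing(submission: str, add_sentence_spacing: bool) -> str:
    # An empty submission (what a retry sends) must stay empty; otherwise a
    # retry would add an extra action consisting of a single space.
    if add_sentence_spacing and submission:
        return " " + submission
    return submission
```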
Henk fa97d28cb3 Nerys V2 for United 2022-06-25 14:06:51 +02:00
henk717 10e85db89d
Merge pull request #162 from VE-FORBRYDERNE/whitespace-cleanup
Story whitespace cleanup
2022-06-25 13:36:03 +02:00
vfbd 6acfa8c33c Merge branch 'whitespace' into whitespace-cleanup 2022-06-24 12:44:30 -04:00
vfbd 6e138db1c0 Clean up whitespace in the editor as well 2022-06-24 12:44:00 -04:00
Henk d3fce44095 Merge branch 'main' into united 2022-06-24 18:31:45 +02:00
vfbd 4b16600e49 Clean up whitespace at the end of actions when loading story
Specifically, we merge blank actions into the next action and we move
whitespace at the end of non-blank actions to the beginning of the next
action.
2022-06-24 12:03:35 -04:00
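A minimal sketch of the cleanup rule this commit describes, applied to a list of action strings when a story is loaded; this is not the actual loader code, just the stated behaviour of merging blank actions into the next action and moving trailing whitespace to the start of the next action.

```python
def clean_actions(actions):
    cleaned = []
    carry = ""  # blank actions and trailing whitespace carried into the next action
    for action in actions:
        if action.strip() == "":
            carry += action  # blank action: merge into the next non-blank action
            continue
        stripped = action.rstrip()
        cleaned.append(carry + stripped)
        carry = action[len(stripped):]  # trailing whitespace moves to the next action
    if carry and cleaned:
        cleaned[-1] += carry  # nothing follows, so keep the whitespace at the end
    return cleaned
```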
henk717 fca7c15fd3
Merge branch 'KoboldAI:main' into united 2022-06-24 18:00:37 +02:00
Henk 6a89ad5b94 Merge branch 'main' into united 2022-06-23 21:07:56 +02:00
henk717 8098f4ec8f
Merge branch 'KoboldAI:main' into united 2022-06-23 17:20:48 +02:00
henk717 3de22f2b27
Merge pull request #160 from VE-FORBRYDERNE/gc
Delete all torch tensors before loading model
2022-06-22 18:38:21 +02:00
vfbd 53034ee533 Delete all torch tensors before loading model 2022-06-22 12:07:36 -04:00
henk717 f127918114
Merge pull request #159 from VE-FORBRYDERNE/fairseq
Don't blacklist </s> token in "s" newline mode
2022-06-22 17:41:20 +02:00
vfbd 922394c68f Don't blacklist </s> token in "s" newline mode 2022-06-22 11:23:03 -04:00
Henk d4e18360f0 HF NeoX Support 2022-06-22 01:46:40 +02:00
henk717 f1d0a327f8
Merge branch 'KoboldAI:main' into united 2022-06-21 23:34:32 +02:00
henk717 b5b8e5a30b
Merge branch 'KoboldAI:main' into united 2022-06-21 23:19:57 +02:00
henk717 37eb47d0d3
Merge pull request #157 from VE-FORBRYDERNE/sp-fix
Bug fixes and new soft prompt implementation
2022-06-21 22:20:36 +02:00
Gnome Ann 8593bf339b Another typo fix 2022-06-21 15:36:25 -04:00
Gnome Ann 7e0ded6b47 Typo fix 2022-06-21 15:12:55 -04:00
Gnome Ann 91643be10a Change soft prompt implementation to a more universal one 2022-06-21 15:03:43 -04:00
Gnome Ann 0ea4fa9c87 Automatically calculate badwords and pad_token_id 2022-06-21 14:35:52 -04:00
Gnome Ann ea7d278ff4 Fix 20B TPU model 2022-06-21 13:16:45 -04:00
Gnome Ann 6b172306f6 move_model_to_devices no longer crashes if you don't have accelerate 2022-06-21 13:15:46 -04:00
henk717 f2c5bb5cb7
Merge pull request #156 from VE-FORBRYDERNE/accelerate
Accelerate disk cache support
2022-06-21 00:31:50 +02:00
Gnome Ann ff69e9fbfe Put layers_module_names, module_names and named_buffers in utils.py 2022-06-20 17:17:42 -04:00
Gnome Ann 1620ac4148 Lazy loader needs to cache named buffers of layers in the disk cache 2022-06-20 17:08:52 -04:00
Gnome Ann ab5ab79003 Set primary device to CPU if in CPU-only mode 2022-06-20 16:25:01 -04:00
Gnome Ann bd7d7b41a1 Don't enable accelerate if no layers are in disk cache or GPUs 2022-06-20 16:21:44 -04:00
Gnome Ann 90fd8b1845 Disk cache support in CPU-only mode 2022-06-20 16:06:09 -04:00
Gnome Ann af07d7a15f Disk cache support for computers with at least one GPU 2022-06-20 14:49:54 -04:00
Gnome Ann 47a58a36b8 Add disk cache slider 2022-06-19 22:53:30 -04:00
henk717 efed44ac8d
Merge pull request #155 from VE-FORBRYDERNE/accelerate
Initial support for Accelerate
2022-06-20 01:08:54 +02:00
Gnome Ann 4dd59e0a9d Correct the type hint for lazy_load_callback 2022-06-19 17:17:41 -04:00
Gnome Ann 21de36c4b0 Lazy loader now moves all non-layer weights to primary device 2022-06-19 16:44:23 -04:00
Gnome Ann 26c319519e Lazy loader now attempts to pin layers if accelerate is enabled 2022-06-19 16:35:23 -04:00
Gnome Ann 042cf3e560 Automatically support soft prompts for all transformers models 2022-06-19 13:11:58 -04:00
Gnome Ann cc56718a7e Fix lazy loader putting too many layers on CPU 2022-06-19 00:29:35 -04:00
Gnome Ann 1380eb0bb0 Disable lazy loader when using GPT-2 2022-06-18 23:54:11 -04:00
Gnome Ann f9732eb143 Always enable breakmodel if accelerate is available 2022-06-18 23:46:09 -04:00
Gnome Ann 8b4efc5d0a Use `accelerate.dispatch_model()` instead of breakmodel if possible 2022-06-18 23:41:36 -04:00
Gnome Ann f7ffdd7b6b Add more model querying utilities 2022-06-18 18:16:56 -04:00
Gnome Ann e143963161 Merge branch 'united' into accelerate 2022-06-18 13:47:38 -04:00
henk717 b209cf9868
NS mode as default
Experimental change that makes NS the default. More and more models seem to require this as megatron-based models gain traction, and it does not seem to break the original models (with the exception that users can no longer use </s> in generated outputs; in the extremely rare case someone is affected by this, they can manually switch the mode by editing their settings file).

If this breaks nothing, ns will remain the default; however, the n mode should remain a choice for those who need it. In case it does get reversed, I have also added the bloom model type to the ns list since its models require this.
2022-06-18 19:46:16 +02:00
henk717 23aae24f8e
Merge pull request #154 from VE-FORBRYDERNE/united-merge
Merge main into united
2022-06-18 19:42:26 +02:00
Gnome Ann 0eedc541c8 Merge branch 'main' into united-merge 2022-06-18 13:39:23 -04:00
henk717 22091bc7e2
Merge pull request #153 from ebolam/united
Fix for flaskwebgui
2022-06-17 14:19:22 +02:00
ebolam 2964175d8b Fix for flaskwebgui 2022-06-17 08:17:22 -04:00
Henk f112fc3493 Initial flaskwebgui support 2022-06-17 13:49:03 +02:00
Gnome Ann 8bdf17f598 Lazy loader can now use accelerate's `init_empty_weights()` 2022-06-16 18:56:16 -04:00
Gnome Ann 5253cdcb36 Lazy loader no longer requires map file except when loading to TPU 2022-06-16 18:45:11 -04:00
henk717 b0a01962ab
Merge branch 'KoboldAI:main' into united 2022-06-16 20:42:24 +02:00
henk717 50d2172aaf
Merge branch 'KoboldAI:main' into united 2022-06-16 19:55:39 +02:00
henk717 83b1fac7a4
Merge pull request #152 from VE-FORBRYDERNE/oom-passthrough
Don't use fallback loading if we run out of memory during model loading
2022-06-15 21:30:24 +02:00
Gnome Ann 96d3d397ab Don't use fallback loading if we run out of memory during loading 2022-06-15 14:35:32 -04:00
Henk fb2b6f1026 Model Path Hardening 2022-06-15 13:29:10 +02:00
Henk 24d34647e0 Block navigation on all remote modes 2022-06-15 12:32:19 +02:00
Henk f39e24d87f Localtunnel fix, small polish 2022-06-15 12:22:00 +02:00
Henk f49cf919bf Merge branch 'overhaul' into united 2022-06-15 02:09:30 +02:00
henk717 de07b1749f
Merge pull request #150 from ebolam/Web-UI
Delete model fixes and model info ui cleanup
2022-06-15 01:50:39 +02:00
ebolam 095cd2a19d Prevent server-side deletion of folders other than those under models in the executing directory
Removed delete icon for model folders outside the models directory
2022-06-14 19:39:11 -04:00
ebolam f444ad851f Potential catch for if somehow a user sends a delete model with a .. in it. 2022-06-14 19:30:01 -04:00
ebolam 899f191b51 Fix for model information not being centered and having the wrong background 2022-06-14 19:26:02 -04:00
henk717 9add3b0761
Merge pull request #149 from ebolam/Web-UI
--remote jailed to model directory and delete of models from UI
2022-06-15 01:14:06 +02:00
ebolam 462206fa86 added --remote not allowing navigation outside of the model folder for custom models.
added a delete custom models option (will not delete models outside of the models directory, nor will it delete non-model directories)
2022-06-14 19:11:30 -04:00
Henk 01b3c9932a 1.18.1 version bump 2022-06-15 00:58:49 +02:00
henk717 f3eb7cba5c
Merge pull request #148 from VE-FORBRYDERNE/overhaul-merge
Merge united into overhaul
2022-06-15 00:56:14 +02:00
Gnome Ann 32a8f03f13 Merge branch 'united' into overhaul-merge 2022-06-14 18:53:04 -04:00
Gnome Ann 107966fef8 Merge branch 'united' into overhaul-merge 2022-06-14 18:47:38 -04:00
Gnome Ann a61e06f876 Merge commit '4c7d6f42d99d557130511f5d185249b34f9db5a1' into overhaul-merge 2022-06-14 18:43:25 -04:00
Gnome Ann 979640bd2f Merge commit '2d3db7b4ba388f566aaec88a0e76678fe4fade8d' into overhaul-merge 2022-06-14 18:42:14 -04:00
Gnome Ann 130d530e7c Merge commit 'a273a5ebc49935bfafdcf1aaf4b98c9bf4bc33b1' into overhaul-merge 2022-06-14 18:38:25 -04:00
Gnome Ann 18218a99bc Merge commit '8a38b258f497281af06fcb0c2559f382b419b938' into overhaul-merge 2022-06-14 18:36:37 -04:00
henk717 c4b2bcde4b
Merge pull request #147 from ebolam/Web-UI
Layer input box
2022-06-15 00:24:58 +02:00
ebolam 780548fba9 Added text input box for layer assignment 2022-06-14 11:53:47 -04:00
ebolam 11ed55f34a Added custom text box for loading models from specific path, or loading other models from hugging face. 2022-06-13 13:48:45 -04:00
ebolam 5110e956d2 Added execution time to the UI 2022-06-10 20:51:22 -04:00
ebolam cfd1147d5a Bug fix for loading model after loading a model duplicating the settings menu until the website is refreshed
Fixed escaping warnings
Added back/redo unit test
2022-06-10 14:47:52 -04:00
ebolam c432051fe3 Fix for HTML Unit Tests to include required css 2022-06-10 10:10:56 -04:00
henk717 dc45e808c7
Merge pull request #144 from ebolam/Web-UI
Web UI Enhancement, Basic Unit Tests
2022-06-10 16:07:50 +02:00
ebolam ed428f2e73 Merge branch 'Web-UI' of https://github.com/ebolam/KoboldAI into Web-UI 2022-06-10 09:12:18 -04:00
ebolam 4a920724d9 fix for folder paths on linux 2022-06-10 09:12:04 -04:00
ebolam 6200908582
Merge pull request #10 from henk717/overhaul
Overhaul
2022-06-10 08:40:15 -04:00
ebolam 13f17d3eca Changed unit tests so that they run with a simple pytest command 2022-06-10 08:39:15 -04:00
henk717 bd18cd6900
Merge pull request #143 from VE-FORBRYDERNE/overhaul-merge
Merge united into overhaul
2022-06-10 07:21:27 +02:00
Gnome Ann ce582f188f Merge branch 'united' into overhaul-merge 2022-06-09 23:48:28 -04:00
ebolam 1ea0df5295 Merge branch 'Web-UI' of https://github.com/ebolam/KoboldAI into Web-UI 2022-06-09 20:03:56 -04:00
ebolam 32b883892a Added favicon swapping mechanism on aibusy 2022-06-09 20:03:34 -04:00
ebolam f89d1f131f
Update readme.md 2022-06-09 13:34:13 -04:00
ebolam 663dee784d Unit Tests using pytest and Minor modifications to allow unit testing 2022-06-09 13:16:32 -04:00
henk717 22e8468b98
Merge pull request #142 from ebolam/Web-UI
Added GPU name to the UI when using break models.
2022-06-09 15:46:18 +02:00
ebolam 606c276f9d Potential fix for tokenizer using a fallback 2022-06-09 09:01:40 -04:00
ebolam db9a94ca2a Added GPU name to the UI when using break models.
Added total layers to the UI
Added favicon
2022-06-09 08:42:35 -04:00
henk717 ae2ee0dd57
Merge pull request #141 from ebolam/Web-UI
Functional --model/--path, fix for switching models
2022-06-09 01:48:38 +02:00
ebolam c565978fff Fix for multi-gpu not showing appropriately
Slight visual improvement for custom model load breadcrumbs
2022-06-08 19:39:04 -04:00
ebolam 4548dcf1b0 Fix for --model with custom paths 2022-06-08 18:53:56 -04:00
ebolam 001439be45
Merge pull request #9 from henk717/overhaul
Overhaul
2022-06-08 18:44:21 -04:00
ebolam 622a3fc8db Fix for model loading by moving monkey patching functions into a run-once function
Added folder navigation to custom model loading (Needs prittying)
2022-06-08 18:42:44 -04:00
Henk 1a46d97ad5 Send correct settings after load 2022-06-08 13:26:30 +02:00
Henk 461cd04932 Fix Essential Code + selectfolder fix
As part of the restructuring essential code was removed that handled the --path parameter correctly. This has now been restored. Selectfolder was also updated to use its NeoCustom counterpart instead of specifying a model so that the underlying code that corrects model names is being hit again.
2022-06-08 11:30:00 +02:00
henk717 6a324b0e75
Merge pull request #140 from ebolam/Web-UI
Fix for user selection of model folder before the web ui is loaded
2022-06-08 08:31:45 +02:00
ebolam 190869f0d3 Fix for selectfolder model to force old style folder select on startup. 2022-06-07 20:24:31 -04:00
ebolam c131eb04c7
Merge pull request #8 from henk717/overhaul
Overhaul
2022-06-07 20:17:32 -04:00
ebolam 930a98f4e0
Merge branch 'henk717:united' into Web-UI 2022-06-07 15:41:55 -04:00
Henk 88f5ed7b3c --model selectfolder 2022-06-07 21:32:58 +02:00
Henk 66ba165b4c --noaimenu as seperate parameter 2022-06-07 20:44:14 +02:00
henk717 2333c85f4e
Merge pull request #139 from ebolam/Web-UI
UI changes with AI Selection in Web
2022-06-07 20:33:46 +02:00
ebolam 6fd2496d94 Fix for green opening text showing OAI and/or OAI/GooseAI model name rather than the appropriate name. 2022-06-07 13:47:10 -04:00
ebolam 1df88e1696 TPU fix Attempt 2022-06-07 09:05:51 -04:00
ebolam bf4af94abb Hopefully a fix for InferKit 2022-06-07 08:22:10 -04:00
ebolam afb894f5a0 TPU Fix 2022-06-06 21:47:15 -04:00
ebolam 1b35b55d86 Fix TPU 2022-06-06 21:39:17 -04:00
ebolam ae1aed0916 TPU Fix 2022-06-06 21:37:35 -04:00
ebolam df76bc4b41 Fix for Colab 2022-06-06 21:29:14 -04:00
ebolam edbf36a632 Web UI functional for GooseAI (and presumably OpenAI).
Fix for Breakmodel layer info saving
2022-06-06 19:21:10 -04:00
ebolam d9480ec439 Fix for lazy loading 2022-06-06 14:27:47 -04:00
ebolam 60b70bdf8a Fix 2022-06-06 14:02:17 -04:00
ebolam dd07b10b73 Fix for model loading on web ui and removing AI menu when using remote/colab methods 2022-06-06 13:57:19 -04:00
ebolam c984f4412d Fix for web based model loading 2022-06-06 12:49:40 -04:00
ebolam 1e139594a9 Merge commit 'refs/pull/7/head' of https://github.com/ebolam/KoboldAI into HEAD 2022-06-06 09:49:46 -04:00
Gnome Ann 793d788706 Preserve whitespace in the editor 2022-05-26 14:34:40 -04:00
ebolam e65015aed4 Merge branch 'Web-UI' of https://github.com/ebolam/KoboldAI into HEAD 2022-03-14 16:43:42 -04:00
ebolam 36fef6bfbc
Delete base.yml 2022-03-14 16:43:15 -04:00
ebolam bc5f30610d Removed base.yml 2022-03-14 16:42:49 -04:00
ebolam 8ae0a4a3e7 Online Services Working now (without a way to test as I don't have accounts) 2022-03-12 14:21:11 -05:00
ebolam 772ae2eb80 Added model info to show model load progress in UI 2022-03-11 11:31:41 -05:00
ebolam 0943926f6a Fix for lazy loading 2022-03-07 19:52:44 -05:00
ebolam bfc07073e3 layer count fix 2022-03-07 19:33:24 -05:00
ebolam d8ab58892d saved layer value fix 2022-03-07 19:21:55 -05:00
ebolam da53d7edb3 Custom Path Load fix 2022-03-07 18:54:11 -05:00
ebolam d1a64e25da Custom Model Load Fix 2022-03-07 18:44:37 -05:00
ebolam 70f1c2da9c Added stub for model name feedback 2022-03-07 14:20:25 -05:00
ebolam d0553779ab Bug Fix 2022-03-07 12:33:35 -05:00
ebolam 6a08fe2f10 Added scroll bars to the model load menu 2022-03-07 12:04:41 -05:00
ebolam c50fe77a7d Load Fix 2022-03-07 11:57:33 -05:00
ebolam 49fc854e55 Added saving of breakmodel values so that it defaults to it on next load 2022-03-07 11:49:34 -05:00
ebolam 2cf6b6e650
Merge branch 'henk717:united' into united 2022-03-07 11:31:14 -05:00
ebolam 123cd45b0e Breakmodel working now with the web UI 2022-03-07 11:27:23 -05:00
ebolam 5e00f7daf0 Next evolution of web ui model selection. Custom Paths not working quite right. 2022-03-06 20:55:11 -05:00
ebolam 2ddf45141b Initial UI based model loading. Includes all parameters except breakmodel chunks, engine # for OAI, and url for ngrok url for google colab 2022-03-06 19:51:35 -05:00
Gnome Ann 2db1f2f7bb AvrilAI-style repetition penalty test 2022-01-25 15:05:21 -05:00
73 changed files with 23945 additions and 1734 deletions

1
.gitattributes vendored
View File

@@ -1,2 +1,3 @@
*.min.lua linguist-vendored
*documentation.html linguist-vendored
/static/swagger-ui/* linguist-vendored

3
.gitignore vendored
View File

@@ -15,6 +15,7 @@ bin
__pycache__
*.log
cache
accelerate-disk-cache
userscripts
!userscripts/examples
!userscripts/kaipreset_*.lua
@@ -24,6 +25,8 @@ softprompts
models
!models/models go here.txt
Uninstall
flask_session
accelerate-disk-cache
.ipynb_checkpoints
# Ignore PyCharm project files.

View File

@@ -48,33 +48,42 @@ If you would like to play KoboldAI online for free on a powerful computer you ca
Each edition features different models and requires different hardware to run; this means that if you are unable to obtain a TPU or a GPU you might still be able to use the other version. The models you can use are listed underneath the edition. To open a Colab, click the big link featuring the edition's name.
## [TPU Edition Model Descriptions](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)
## [Models the TPU can run:](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)
| Model | Size | Style | Description |
| --- | --- | --- | --- |
| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | 13B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |
| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | 13B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |
| [Shinen](https://huggingface.co/KoboldAI/fairseq-dense-13B-Shinen) by Mr Seeker | 13B | NSFW | Shinen is an NSFW model designed to be more explicit. Trained on a variety of stories from the website Sexstories it contains many different kinks. |
| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\_FORBRYDERNE | 6B | Adventure | Skein is best used with Adventure mode enabled, it consists of a 4 times larger adventure dataset than the Adventure model making it excellent for text adventure gaming. On top of that it also consists of light novel training further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write Novels with it, but dedicated Novel models can perform better for this task. |
| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\_FORBRYDERNE | 6B | Adventure | Adventure is a 6B model designed to mimick the behavior of AI Dungeon. It is exclusively for Adventure Mode and can take you on the epic and wackey adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |
| [Lit](https://huggingface.co/hakurei/lit-6B) by Haru | 6B | NSFW | Lit is a great NSFW model trained by Haru on both a large set of Literotica stories and high quality novels along with tagging support. Creating a high quality model for your NSFW stories. This model is exclusively a novel model and is best used in third person. |
| Neo(X) by EleutherAI | 20B | Generic | NeoX is the largest EleutherAI model currently available, being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. 20B's performance is closely compared to the 13B models and it is worth trying both especially if you have a task that does not involve english writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset but with more sensitivity towards repetition penalty and with more knowledge. |
| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | 13B | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. |
| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | 6B | Generic | This model serves as the basis for most other 6B models (Some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular it is suitable for a variety of tasks such as writing, Q&A and coding tasks. You will likely get better result with larger generic models or finetuned models. |
| Model | Style | Description |
| --- | --- | --- |
| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | Novel/Adventure | Nerys is a hybrid model based on Pike (a newer Janeway); on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model too. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |
| [Erebus](https://huggingface.co/KoboldAI/OPT-13B-Erebus) by Mr Seeker | NSFW | Erebus is our community's flagship NSFW model; being a combination of multiple large datasets that include Literotica, Shinen and erotic novels from Nerys, and featuring thorough tagging support, it covers the vast majority of erotic writing styles. This model is capable of replacing both the Lit and Shinen models in terms of content and style and has been well received as (one of) the best NSFW models out there. If you wish to use this model for commercial or non-research usage we recommend choosing the 20B version as that one is not subject to the restrictive OPT license. |
| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |
| [Shinen](https://huggingface.co/KoboldAI/fairseq-dense-13B-Shinen) by Mr Seeker | NSFW | Shinen is an NSFW model trained on a variety of stories from the website Sexstories; it contains many different kinks. It has been merged into the larger (and better) Erebus model. |
| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\_FORBRYDERNE | Adventure | Skein is best used with Adventure mode enabled, it consists of a 4 times larger adventure dataset than the Adventure model making it excellent for text adventure gaming. On top of that it also consists of light novel training further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write Novels with it, but dedicated Novel models can perform better for this task. |
| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\_FORBRYDERNE | Adventure | Adventure is a 6B model designed to mimic the behavior of AI Dungeon. It is exclusively for Adventure Mode and can take you on the epic and wacky adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |
| [Lit](https://huggingface.co/hakurei/lit-6B) ([V2](https://huggingface.co/hakurei/litv2-6B-rev3)) by Haru | NSFW | Lit is a great NSFW model trained by Haru on both a large set of Literotica stories and high quality novels along with tagging support. Creating a high quality model for your NSFW stories. This model is exclusively a novel model and is best used in third person. |
| [OPT](https://huggingface.co/facebook/opt-13b) by Metaseq | Generic | OPT is considered one of the best base models as far as content goes, its behavior has the strengths of both GPT-Neo and Fairseq Dense. Compared to Neo duplicate and unnecessary content has been left out, while additional literature was added in similar to the Fairseq Dense model. The Fairseq Dense model however lacks the broader data that OPT does have. The biggest downfall of OPT is its license, which prohibits any commercial usage, or usage beyond research purposes. |
| [Neo(X)](https://huggingface.co/EleutherAI/gpt-neox-20b) by EleutherAI | Generic | NeoX is the largest EleutherAI model currently available; being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. The 20B's performance compares closely to the 13B models and it is worth trying both, especially if you have a task that does not involve English writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset but with more sensitivity towards repetition penalty and with more knowledge. |
| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. Compared to other models the dataset focuses primarily on literature and contains little else. |
| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | Generic | This model serves as the basis for most other 6B models (some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular, it is suitable for a variety of tasks such as writing, Q&A and coding tasks. You will likely get better results with larger generic models or finetuned models. |
## [GPU Edition Model Descriptions](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/GPU.ipynb)
| Model | Size | Style | Description |
| --- | --- | --- | --- |
| [Nerys 2.7B](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | 2.7B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |
| [Janeway 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | 2.7B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |
| [Picard 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | 2.7B | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model this model is designed for Novels of a variety of genre's. It is meant to be used in KoboldAI's regular mode. |
| [AID 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | 2.7B | Adventure | Also know as Adventure 2.7B this is a clone of the AI Dungeon Classic model and is best known for the epic wackey adventures that AI Dungeon Classic players love. |
| [Horni LN 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni-LN) by finetune | 2.7B | Novel | This model is based on Horni 2.7B and retains its NSFW knowledge, but was then further biased towards SFW novel stories. If you seek a balance between a SFW Novel model and a NSFW model this model should be a good choice. |
| [Horni 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni) by finetune | 2.7B | NSFW | This model is tuned on Literotica to produce a Novel style model biased towards NSFW content. Can still be used for SFW stories but will have a bias towards NSFW content. It is meant to be used in KoboldAI's regular mode. |
| [Shinen 2.7B ](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Shinen) by Mr Seeker | 2.7B | NSFW | Shinen is an alternative to the Horni model designed to be more explicit. If Horni is to tame for you Shinen might produce better results. While it is a Novel model it is unsuitable for SFW stories due to its heavy NSFW bias. Shinen will not hold back. It is meant to be used in KoboldAI's regular mode. |
| [Neo 2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | 2.7B | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |
## [Models the Colab GPU can run:](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/GPU.ipynb)
| Model | Style | Description |
| --- | --- | --- |
| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | Novel/Adventure | Nerys is a hybrid model based on Pike (a newer Janeway); on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model too. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |
| [Tiefighter 13B by KoboldAI](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) | Hybrid | Tiefighter 13B is a very versatile fiction hybrid; it can write, chat and play adventure games and can also answer regular instructions (although we do not recommend this model for factual use due to its fictional nature). This is an excellent starting model; for the best results avoid using second person writing in your chats unless you want it to become a text adventure. |
| [Janeway](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |
| [Picard](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model, this model is designed for Novels of a variety of genres. It is meant to be used in KoboldAI's regular mode. |
| [AID](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | Adventure | Also known as Adventure 2.7B, this is a clone of the AI Dungeon Classic model and is best known for the epic wacky adventures that AI Dungeon Classic players love. |
| [OPT](https://huggingface.co/facebook/opt-2.7b) by Metaseq | Generic | OPT is considered one of the best base models as far as content goes, its behavior has the strengths of both GPT-Neo and Fairseq Dense. Compared to Neo duplicate and unnecessary content has been left out, while additional literature was added in similar to the Fairseq Dense model. The Fairseq Dense model however lacks the broader data that OPT does have. The biggest downfall of OPT is its license, which prohibits any commercial usage, or usage beyond research purposes. |
| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-2.7B) | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger models from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. Compared to other models the dataset focuses primarily on literature and contains little else. |
| [MythoMax 13B](https://huggingface.co/TheBloke/MythoMax-L2-13B-GPTQ) by Gryphe | Roleplay | An improved, potentially even perfected variant of MythoMix, my MythoLogic-L2 and Huginn merge using a highly experimental tensor type merge technique¹. |
| [Holomax 13B by KoboldAI](https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax) | Adventure | This is an expansion merge to the well-praised MythoMax model from Gryphe (60%) using MrSeeker's KoboldAI Holodeck model (40%). The goal of this model is to enhance story-writing capabilities while preserving the desirable traits of the MythoMax model as much as possible (It does limit chat reply length). |
| [Airoboros 13B](https://huggingface.co/jondurbin/airoboros-13b) by Jon Durbin | Generic | This is an instruction fine-tuned llama-2 model, using synthetic instructions generated by airoboros⁵. |
| [Emerhyst 13B](https://huggingface.co/Undi95/Emerhyst-13B) by Undi | Roleplay | An attempt using BlockMerge_Gradient to get better results. In addition, LimaRP v3 was used⁷. |
| [Chronos 13B](https://huggingface.co/elinas/chronos-13b) by Elinas | Generic | This model is primarily focused on chat, roleplay, and storywriting, but can accomplish other tasks such as simple reasoning and coding. Chronos generates very long outputs with coherent text, largely due to the human inputs it was trained on. |
| [Spring Dragon by Henk717](https://huggingface.co/Henk717/spring-dragon) | Adventure | This model is a recreation attempt of the AI Dungeon 2 Dragon model. To achieve this, the "text_adventures.txt" dataset was used, which was bundled with the original AI Dungeon 2 GitHub release prior to the online service. It is worth noting that the same dataset file was used to create the Dragon model, where Dragon is a GPT-3 175B Davinci model from 2020. |
| [Holodeck by KoboldAI](https://huggingface.co/KoboldAI/LLAMA2-13B-Holodeck-1) | Adventure | LLAMA2 13B-Holodeck is a finetune created using Meta's llama 2 model. The training data contains around 3000 ebooks in various genres. Most parts of the dataset have been prepended using the following text: [Genre: <genre1>, <genre2>] |
| [Neo](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |
| Various 2.7B models by various | Various | Various smaller models are also possible to load in the GPU colab. |
### Styles
| Type | Description |
@@ -100,7 +109,7 @@ KoboldAI has a large number of dependencies you will need to install on your com
### Downloading the latest version of KoboldAI
KoboldAI is a rolling release on our github, the code you see is also the game. You can the software by clicking on the green Code button at the top of the page and clicking Download ZIP.
KoboldAI is a rolling release on our GitHub; the code you see is also the game. You can download the software by clicking on the green Code button at the top of the page and clicking Download ZIP, or use the `git clone` command instead. Then, on Windows you need to run install_requirements.bat (using admin mode is recommended to avoid errors); once it's done, or if you're on Linux, run either play.bat/sh or remote-play.bat/sh to start it.
The easiest way for Windows users is to use the [offline installer](https://sourceforge.net/projects/koboldai/files/latest/download) below.
@@ -192,14 +201,21 @@ Lastly the all the features of our userscript API are documented inside the API
For our TPU versions keep in mind that scripts modifying AI behavior rely on a different way of processing that is slower than if you leave these userscripts disabled, even if your script only sporadically uses this modifier. If you want to partially use a script at its full speed then you can enable "No Gen Modifiers" to ensure that the parts that would make the TPU slow are not active.
## API
KoboldAI has a REST API that can be accessed by adding /api to the URL that Kobold provides you (For example http://127.0.0.1:5000/api).
When accessing this link in a browser you will be taken to the interactive documentation.
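As a rough illustration of the paragraph above, the snippet below sends a prompt to a locally running instance with Python's requests library. The `/api/v1/generate` route and the payload/response field names are assumptions based on the interactive documentation and may differ between KoboldAI versions; check the documentation served at /api on your own instance for the exact schema.
```python
# Minimal sketch of calling the KoboldAI REST API (field names are assumptions,
# verify them against the interactive documentation at /api on your instance).
import requests

KOBOLD_URL = "http://127.0.0.1:5000"  # the URL Kobold provides you

payload = {
    "prompt": "You enter the dark cave and",
    "max_length": 80,  # assumed name of the token-count parameter
}

response = requests.post(f"{KOBOLD_URL}/api/v1/generate", json=payload, timeout=300)
response.raise_for_status()

# The response is expected to contain a list of generated continuations.
for result in response.json().get("results", []):
    print(result.get("text", ""))
```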
## Contributors
This project contains work from the following contributors:
* The Gantian - Creator of KoboldAI, has created most features such as the interface, the different AI model / API integrations and in general the largest part of the project.
* VE FORBRYDERNE - Contributed many features such as the Editing overhaul, Adventure Mode, expansions to the world info section, breakmodel integration, scripting support, softpromtps and much more. As well as vastly improving the TPU compatibility and integrating external code into KoboldAI so we could use official versions of Transformers with virtually no downsides.
* VE FORBRYDERNE - Contributed many features such as the Editing overhaul, Adventure Mode, expansions to the world info section, breakmodel integration, scripting support, API, softprompts and much more, as well as vastly improving the TPU compatibility and integrating external code into KoboldAI so we could use official versions of Transformers with virtually no downsides.
* Henk717 - Contributed the installation scripts, this readme, the random story generator, the docker scripts, the foundation for the commandline interface and other smaller changes, as well as integrating multiple parts of the code of different forks to unite it all. He also optimized the model loading so that downloaded models get converted to efficient offline models and so that, in the future, models are more likely to work out of the box. Not all code GitHub attributes to Henk717 is by Henk717, as some of it consists of integrations of other people's work. We try to clarify this in the contributors list as much as we can.
* Ebolam - Automatic Saving
* Ebolam - Automatic Saving, back/redo, pinning, web loading of models
* one-some - Logits Viewer and Token Streaming
* db0 - KoboldAI Horde
* Frogging101 - top\_k / tfs support (Part of this support was later redone by VE to integrate what was originally inside of finetuneanon's transformers)
* UWUplus (Ralf) - Contributed storage systems for community colabs, as well as cleaning up and integrating the website dependencies/code better. He is also the maintainer of flask-cloudflared which we use to generate the cloudflare links.
* Javalar - Initial Performance increases on the story\_refresh

File diff suppressed because it is too large

View File

@@ -4,7 +4,7 @@ https://github.com/arrmansa/Basic-UI-for-GPT-J-6B-with-low-vram/blob/main/GPT-J-
The ORIGINAL version of the patch is released under the Apache License 2.0
Copyright 2021 arrmansa
Copyright 2021 finetuneanon
Copyright 2018 The Hugging Face team
Copyright 2018, 2022 The Hugging Face team
Apache License
@@ -216,11 +216,13 @@ from torch import nn
import torch.cuda.comm
import copy
import gc
import os
import sys
import itertools
import bisect
import random
from typing import Optional
import utils
from typing import Dict, List, Optional, Union
from transformers.modeling_outputs import BaseModelOutputWithPast, BaseModelOutputWithPastAndCrossAttentions
@@ -230,7 +232,100 @@ logger = logging.get_logger(__name__)
breakmodel = True
gpu_blocks = []
primary_device = 0
disk_blocks = 0
primary_device = 0 if torch.cuda.device_count() > 0 else "cpu"
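# Assumed semantics, based on how these globals are used elsewhere in KoboldAI: gpu_blocks
# is a list with the number of transformer layers assigned to each GPU, disk_blocks is the
# number of layers offloaded to the accelerate disk cache, and primary_device is the device
# that receives the non-layer weights (see the lazy loader commits above).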
if utils.HAS_ACCELERATE:
from accelerate.hooks import attach_align_device_hook_on_blocks
from accelerate.utils import OffloadedWeightsLoader, check_device_map, extract_submodules_state_dict, offload_state_dict
from accelerate import dispatch_model
def dispatch_model_ex(
model: nn.Module,
device_map: Dict[str, Union[str, int, torch.device]],
main_device: Optional[torch.device] = None,
state_dict: Optional[Dict[str, torch.Tensor]] = None,
offload_dir: Union[str, os.PathLike] = None,
offload_buffers: bool = False,
**kwargs,
):
"""
This is a modified version of
https://github.com/huggingface/accelerate/blob/eeaba598f455fbd2c48661d7e816d3ff25ab050b/src/accelerate/big_modeling.py#L130
that still works when the main device is the CPU.
Dispatches a model according to a given device map. Layers of the model might be spread across GPUs, offloaded on
the CPU or even the disk.
Args:
model (`torch.nn.Module`):
The model to dispatch.
device_map (`Dict[str, Union[str, int, torch.device]]`):
A dictionary mapping module names in the models `state_dict` to the device they should go to. Note that
`"disk"` is accepted even if it's not a proper value for `torch.device`.
main_device (`str`, `int` or `torch.device`, *optional*):
The main execution device. Will default to the first device in the `device_map` different from `"cpu"` or
`"disk"`.
state_dict (`Dict[str, torch.Tensor]`, *optional*):
The state dict of the part of the model that will be kept on CPU.
offload_dir (`str` or `os.PathLike`):
The folder in which to offload the model weights (or where the model weights are already offloaded).
offload_buffers (`bool`, *optional*, defaults to `False`):
Whether or not to offload the buffers with the model parameters.
preload_module_classes (`List[str]`, *optional*):
A list of classes whose instances should load all their weights (even in the submodules) at the beginning
of the forward. This should only be used for classes that have submodules which are registered but not
called directly during the forward, for instance if a `dense` linear layer is registered, but at forward,
`dense.weight` and `dense.bias` are used in some operations instead of calling `dense` directly.
"""
if main_device != "cpu":
return dispatch_model(model, device_map, main_device, state_dict, offload_dir=offload_dir, offload_buffers=offload_buffers, **kwargs)
# Error early if the device map is incomplete.
check_device_map(model, device_map)
offload_devices = ["cpu", "disk"] if main_device != "cpu" else ["disk"]
if main_device is None:
main_device = [d for d in device_map.values() if d not in offload_devices][0]
cpu_modules = [name for name, device in device_map.items() if device == "cpu"] if main_device != "cpu" else []
if state_dict is None and len(cpu_modules) > 0:
state_dict = extract_submodules_state_dict(model.state_dict(), cpu_modules)
disk_modules = [name for name, device in device_map.items() if device == "disk"]
if offload_dir is None and len(disk_modules) > 0:
raise ValueError(
"We need an `offload_dir` to dispatch this model according to this `device_map`, the following submodules "
f"need to be offloaded: {', '.join(disk_modules)}."
)
if len(disk_modules) > 0 and (
not os.path.isdir(offload_dir) or not os.path.isfile(os.path.join(offload_dir, "index.json"))
):
disk_state_dict = extract_submodules_state_dict(model.state_dict(), disk_modules)
offload_state_dict(offload_dir, disk_state_dict)
execution_device = {
name: main_device if device in offload_devices else device for name, device in device_map.items()
}
offload = {name: device in offload_devices for name, device in device_map.items()}
save_folder = offload_dir if len(disk_modules) > 0 else None
if state_dict is not None or save_folder is not None:
weights_map = OffloadedWeightsLoader(state_dict=state_dict, save_folder=save_folder)
else:
weights_map = None
attach_align_device_hook_on_blocks(
model,
execution_device=execution_device,
offload=offload,
offload_buffers=offload_buffers,
weights_map=weights_map,
**kwargs,
)
model.hf_device_map = device_map
return model
# Copied from transformers.models.bart.modeling_bart._expand_mask
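To make the behaviour of the `dispatch_model_ex()` helper above concrete, here is a hypothetical call with the CPU as the main device, which is exactly the case the modified helper exists to handle. Everything in it is illustrative: the `breakmodel` import path, the choice of GPT-Neo 2.7B and the half-RAM/half-disk split are assumptions, and the offload directory simply reuses the `accelerate-disk-cache` name added to .gitignore above.
```python
# Hypothetical usage sketch for dispatch_model_ex() (not taken from this diff).
# Assumptions: the patched file is importable as `breakmodel`, accelerate is
# installed (dispatch_model_ex is only defined when utils.HAS_ACCELERATE is True),
# and the device_map keys match GPT-Neo's submodule names.
from transformers import AutoModelForCausalLM

from breakmodel import dispatch_model_ex  # assumed module name

model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-2.7B")
n_layer = model.config.num_layers

# Keep the embeddings, final layer norm, LM head and the first half of the
# transformer blocks in RAM; offload the remaining blocks to disk.
device_map = {
    "transformer.wte": "cpu",
    "transformer.wpe": "cpu",
    "transformer.ln_f": "cpu",
    "lm_head": "cpu",
}
device_map.update(
    {f"transformer.h.{i}": "cpu" if i < n_layer // 2 else "disk" for i in range(n_layer)}
)

model = dispatch_model_ex(
    model,
    device_map,
    main_device="cpu",                    # the case plain dispatch_model() does not support
    offload_dir="accelerate-disk-cache",  # mirrors the directory ignored in .gitignore above
)
```
With a map like this, the hooks attached by `attach_align_device_hook_on_blocks()` stream the offloaded layers back onto the execution device during the forward pass.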

View File

@@ -1,23 +1,4 @@
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "ColabKobold GPU",
"private_outputs": true,
"provenance": [],
"collapsed_sections": [],
"include_colab_link": true
},
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
},
"accelerator": "GPU"
},
"cells": [
{
"cell_type": "markdown",
@@ -35,52 +16,99 @@
"id": "kX9y5koxa58q"
},
"source": [
"## [You can get faster generations and higher context with our Koboldcpp Notebook](https://koboldai.org/colabcpp)\n",
"\n",
"# Welcome to KoboldAI on Google Colab, GPU Edition!\n",
"KoboldAI is a powerful and easy way to use a variety of AI based text generation experiences. You can use it to write stories, blog posts, play a text adventure game, use it like a chatbot and more! In some cases it might even help you with an assignment or programming task (But always make sure the information the AI mentions is correct, it loves to make stuff up).\n",
"\n",
"For more information about KoboldAI check our our Github readme : https://github.com/KoboldAI/KoboldAI-Client/blob/main/readme.md\n",
"\n",
"For the larger AI models (That are typically more coherent) check out our **[TPU edition](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)**!"
"---\n",
"## How to load KoboldAI: Everything you need to know\n",
"1. On a phone? First put your browser in desktop mode because of a Google Colab bug. Otherwise nothing will happen when you click the play button. Then tap the play button next to \"<-- Tap This if you play on Mobile\", you will see an audio player. Keep the audio player playing so Colab does not get shut down in the background.\n",
"2. Select the desired model, you will find a description of all the available models further down the page.\n",
"3. Click the play button next to \"<-- Select your model below and then click this to start KoboldAI\".\n",
"4. Got a message saying no accelerator is available? Click cancel, and try again in a few minutes. If you do not manage to get a session when you frequently try again try at a different time of day, colab can be busy or your priority may have been lowered by frequent usage.\n",
"5. After everything is done loading you will get a link that you can use to open KoboldAI. In case of Localtunnel you will also be warned that some people are abusing Localtunnel for phishing, once you acknowledge this warning you will be taken to KoboldAI's interface. If you picked Cloudflare and get a 1033 error refresh the error page after waiting one minute.\n",
"\n",
"---\n",
"\n",
"Further down the page you can find descriptions of the models, and tips to get the most out of your Google Colab experience.\n",
"\n",
"Make sure to keep this page open while you are using KoboldAI, and check back regularly to see if you got a Captcha. Failure to complete the captcha's in time can result in termination of your session or a lower priority towards the TPUs.\n",
"\n",
"Firefox users need to disable the enhanced tracking protection or use a different browser in order to be able to use Google Colab without errors (This is not something we can do anything about, the cookie blocker breaks the Google Drive integration because it uses different domains)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "ewkXkyiFP2Hq"
},
"outputs": [],
"source": [
"#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\n",
"%%html\n",
"<b>Press play on the music player to keep the tab alive, then start KoboldAI below (Uses only 13MB of data)</b><br/>\n",
"<audio src=\"https://henk.tech/colabkobold/silence.m4a\" controls>"
],
"execution_count": null,
"outputs": []
"<audio src=\"https://raw.githubusercontent.com/KoboldAI/KoboldAI-Client/main/colab/silence.m4a\" controls>"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "lVftocpwCoYw",
"cellView": "form"
"cellView": "form",
"id": "lVftocpwCoYw"
},
"outputs": [],
"source": [
"#@title <b><-- Select your model below and then click this to start KoboldAI</b>\n",
"#@markdown You can find a description of the models below along with instructions on how to start KoboldAI.\n",
"\n",
"Model = \"Nerys 2.7B\" #@param [\"Nerys 2.7B\", \"AID 2.7B\", \"Erebus 2.7B\", \"Janeway 2.7B\", \"Picard 2.7B\", \"Horni LN 2.7B\", \"Horni 2.7B\", \"Shinen 2.7B\", \"Neo 2.7B\"] {allow-input: true}\n",
"Model = \"Nerys V2 6B\" #@param [\"Tiefighter 13B (United)\", \"Echidna 13B (United)\", \"HoloMax 13B (United)\", \"Emerhyst 13B (United)\", \"MythoMax 13B (United)\", \"Huginn 13B (United)\", \"Chronos 13B (United)\", \"Airoboros M2.0 13B (United)\", \"Holodeck 13B (United)\", \"Spring Dragon 13B (United)\", \"Nerys V2 6B\", \"Skein 6B\", \"Janeway 6B\", \"Adventure 6B\", \"Nerys 2.7B\", \"AID 2.7B\", \"Janeway 2.7B\", \"Picard 2.7B\", \"OPT 2.7B\", \"Fairseq Dense 2.7B\", \"Neo 2.7B\"] {allow-input: true}\n",
"Revision = \"\" #@param [\"\"]{allow-input: true}\n",
"Version = \"Official\" #@param [\"Official\", \"United\"] {allow-input: true}\n",
"Provider = \"Localtunnel\" #@param [\"Localtunnel\", \"Cloudflare\"]\n",
"Provider = \"Cloudflare\" #@param [\"Localtunnel\", \"Cloudflare\"]\n",
"use_google_drive = True #@param {type:\"boolean\"}\n",
"\n",
"import os\n",
"if not os.path.isfile(\"/opt/bin/nvidia-smi\"):\n",
" raise RuntimeError(\"⚠Colab did not give you a GPU due to usage limits, this can take a few hours before they let you back in. Check out https://lite.koboldai.net for a free alternative (that does not provide an API link but can load KoboldAI saves and chat cards) or subscribe to Colab Pro for immediate access.⚠️\")\n",
"\n",
"!nvidia-smi\n",
"from google.colab import drive\n",
"drive.mount('/content/drive/')\n",
"if use_google_drive:\n",
" drive.mount('/content/drive/')\n",
"else:\n",
" import os\n",
" if not os.path.exists(\"/content/drive\"):\n",
" os.mkdir(\"/content/drive\")\n",
" if not os.path.exists(\"/content/drive/MyDrive/\"):\n",
" os.mkdir(\"/content/drive/MyDrive/\")\n",
"\n",
"if Model == \"Nerys 2.7B\":\n",
" Model = \"KoboldAI/fairseq-dense-2.7B-Nerys\"\n",
"if Model == \"Nerys V2 6B\":\n",
" Model = \"KoboldAI/OPT-6B-nerys-v2\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Erebus 2.7B\":\n",
" Model = \"KoboldAI/OPT-2.7B-Erebus\"\n",
"elif Model == \"Skein 6B\":\n",
" Model = \"KoboldAI/GPT-J-6B-Skein\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Janeway 6B\":\n",
" Model = \"KoboldAI/GPT-J-6B-Janeway\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Adventure 6B\":\n",
" Model = \"KoboldAI/GPT-J-6B-Adventure\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Shinen 6B\":\n",
" Model = \"KoboldAI/GPT-J-6B-Shinen\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Nerys 2.7B\":\n",
" Model = \"KoboldAI/fairseq-dense-2.7B-Nerys\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Janeway 2.7B\":\n",
@@ -95,67 +123,107 @@
" Model = \"KoboldAI/GPT-Neo-2.7B-AID\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Horni LN 2.7B\":\n",
" Model = \"KoboldAI/GPT-Neo-2.7B-Horni-LN\"\n",
"elif Model == \"Fairseq Dense 2.7B\":\n",
" Model = \"KoboldAI/fairseq-dense-2.7B\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Horni 2.7B\":\n",
" Model = \"KoboldAI/GPT-Neo-2.7B-Horni\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Shinen 2.7B\":\n",
" Model = \"KoboldAI/GPT-Neo-2.7B-Shinen\"\n",
"elif Model == \"OPT 2.7B\":\n",
" Model = \"facebook/opt-2.7b\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Neo 2.7B\":\n",
" Model = \"EleutherAI/gpt-neo-2.7B\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Tiefighter 13B (United)\":\n",
" Model = \"KoboldAI/LLaMA2-13B-Tiefighter\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Echidna 13B (United)\":\n",
" Model = \"NeverSleep/Echidna-13b-v0.3\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Huginn 13B (United)\":\n",
" Model = \"The-Face-Of-Goonery/Huginn-13b-v1.2\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Chronos 13B (United)\":\n",
" Model = \"elinas/chronos-13b-v2\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Airoboros M2.0 13B (United)\":\n",
" Model = \"jondurbin/airoboros-l2-13b-gpt4-m2.0\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Emerhyst 13B (United)\":\n",
" Model = \"Undi95/Emerhyst-13B\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"MythoMax 13B (United)\":\n",
" Model = \"Gryphe/MythoMax-L2-13b\"\n",
" Revision = \"\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Spring Dragon 13B (United)\":\n",
" Model = \"Henk717/spring-dragon\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"Holodeck 13B (United)\":\n",
" Model = \"KoboldAI/LLAMA2-13B-Holodeck-1\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"elif Model == \"HoloMax 13B (United)\":\n",
" Model = \"KoboldAI/LLaMA2-13B-Holomax\"\n",
" path = \"\"\n",
" download = \"\"\n",
" Version = \"United\"\n",
"\n",
"if Provider == \"Localtunnel\":\n",
" tunnel = \"--localtunnel yes\"\n",
"else:\n",
" tunnel = \"\"\n",
"\n",
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin -m $Model -g $Version $tunnel"
],
"execution_count": null,
"outputs": []
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin -m $Model -g $Version $Revision $tunnel"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Lrm840I33hkC"
},
"source": [
"# GPU Edition Model Descriptions\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys 2.7B](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | 2.7B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | 2.7B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Picard 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | 2.7B | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model this model is designed for Novels of a variety of genre's. It is meant to be used in KoboldAI's regular mode. |\n",
"| [AID 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | 2.7B | Adventure | Also know as Adventure 2.7B this is a clone of the AI Dungeon Classic model and is best known for the epic wackey adventures that AI Dungeon Classic players love. |\n",
"| [Horni LN 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni-LN) by finetune | 2.7B | Novel | This model is based on Horni 2.7B and retains its NSFW knowledge, but was then further biased towards SFW novel stories. If you seek a balance between a SFW Novel model and a NSFW model this model should be a good choice. |\n",
"| [Horni 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni) by finetune | 2.7B | NSFW | This model is tuned on Literotica to produce a Novel style model biased towards NSFW content. Can still be used for SFW stories but will have a bias towards NSFW content. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Shinen 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Shinen) by Mr Seeker | 2.7B | NSFW | Shinen is an alternative to the Horni model designed to be more explicit. If Horni is to tame for you Shinen might produce better results. While it is a Novel model it is unsuitable for SFW stories due to its heavy NSFW bias. Shinen will not hold back. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Neo 2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | 2.7B | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |\n",
"\n",
"# [TPU Edition Model Descriptions](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/TPU.ipynb)\n",
"\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | 13B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | 13B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Shinen](https://huggingface.co/KoboldAI/fairseq-dense-13B-Shinen) by Mr Seeker | 13B | NSFW | Shinen is an NSFW model designed to be more explicit. Trained on a variety of stories from the website Sexstories it contains many different kinks. |\n",
"| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\\_FORBRYDERNE | 6B | Adventure | Skein is best used with Adventure mode enabled, it consists of a 4 times larger adventure dataset than the Adventure model making it excellent for text adventure gaming. On top of that it also consists of light novel training further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write Novels with it, but dedicated Novel models can perform better for this task. |\n",
"| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\\_FORBRYDERNE | 6B | Adventure | Adventure is a 6B model designed to mimick the behavior of AI Dungeon. It is exclusively for Adventure Mode and can take you on the epic and wackey adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |\n",
"| [Lit](https://huggingface.co/hakurei/lit-6B) by Haru | 6B | NSFW | Lit is a great NSFW model trained by Haru on both a large set of Literotica stories and high quality novels along with tagging support. Creating a high quality model for your NSFW stories. This model is exclusively a novel model and is best used in third person. |\n",
"| Neo(X) by EleutherAI | 20B | Generic | NeoX is the largest EleutherAI model currently available, being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. 20B's performance is closely compared to the 13B models and it is worth trying both especially if you have a task that does not involve english writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset but with more sensitivity towards repetition penalty and with more knowledge. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | 13B | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. |\n",
"| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | 6B | Generic | This model serves as the basis for most other 6B models (Some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular it is suitable for a variety of tasks such as writing, Q&A and coding tasks. You will likely get better result with larger generic models or finetuned models. |\n",
"| Model | Style | Description |\n",
"| --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Tiefighter 13B by KoboldAI](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) | Hybrid | Tiefighter 13B is a very versitile fiction Hybrid, it can write, chat and play adventure games and can also answer regular instructions (Although we do not recommend this model for factual use due to its fictional nature). This is an excellent starting model, for the best results avoid using Second person writing in your chats unless you are wanting it to become a text adventure.|\n",
"| [Janeway](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Picard](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model this model is designed for Novels of a variety of genre's. It is meant to be used in KoboldAI's regular mode. |\n",
"| [AID](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | Adventure | Also know as Adventure 2.7B this is a clone of the AI Dungeon Classic model and is best known for the epic wackey adventures that AI Dungeon Classic players love. |\n",
"| [OPT](https://huggingface.co/facebook/opt-2.7b) by Metaseq | Generic | OPT is considered one of the best base models as far as content goes, its behavior has the strengths of both GPT-Neo and Fairseq Dense. Compared to Neo duplicate and unnecessary content has been left out, while additional literature was added in similar to the Fairseq Dense model. The Fairseq Dense model however lacks the broader data that OPT does have. The biggest downfall of OPT is its license, which prohibits any commercial usage, or usage beyond research purposes. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-2.7B) | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger models from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. Compared to other models the dataset focuses primarily on literature and contains little else. |\n",
"| [MythoMax 13B](https://huggingface.co/TheBloke/MythoMax-L2-13B-GPTQ) by Gryphe | Roleplay | An improved, potentially even perfected variant of MythoMix, my MythoLogic-L2 and Huginn merge using a highly experimental tensor type merge technique¹. |\n",
"| [Holomax 13B by KoboldAI](https://huggingface.co/KoboldAI/LLaMA2-13B-Holomax) | Adventure | This is an expansion merge to the well-praised MythoMax model from Gryphe (60%) using MrSeeker's KoboldAI Holodeck model (40%). The goal of this model is to enhance story-writing capabilities while preserving the desirable traits of the MythoMax model as much as possible (It does limit chat reply length). |\n",
"| [Airoboros 13B](https://huggingface.co/jondurbin/airoboros-13b) by Jon Durbin | Generic | This is an instruction fine-tuned llama-2 model, using synthetic instructions generated by airoboros⁵. |\n",
"| [Emerhyst 13B](https://huggingface.co/Undi95/Emerhyst-13B) by Undi | Roleplay | An attempt using BlockMerge_Gradient to get better result. In addition, LimaRP v3 was used⁷. |\n",
"| [Chronos 13B](https://huggingface.co/elinas/chronos-13b) by Elinas | Generic | This model is primarily focused on chat, roleplay, and storywriting, but can accomplish other tasks such as simple reasoning and coding. Chronos generates very long outputs with coherent text, largely due to the human inputs it was trained on. |\n",
"| [Spring Dragon by Henk717](https://huggingface.co/Henk717/spring-dragon) | Adventure | This model is a recreation attempt of the AI Dungeon 2 Dragon model. To achieve this, the \"text_adventures.txt\" dataset was used, which was bundled with the original AI Dungeon 2 GitHub release prior to the online service. It is worth noting that the same dataset file was used to create the Dragon model, where Dragon is a GPT-3 175B Davinci model from 2020. |\n",
"| [Holodeck By KoboldAI](https://huggingface.co/KoboldAI/LLAMA2-13B-Holodeck-1) | Adventure |LLAMA2 13B-Holodeck is a finetune created using Meta's llama 2 model.The training data contains around 3000 ebooks in various genres. Most parts of the dataset have been prepended using the following text: [Genre: <genre1>, <genre2>|\n",
"| [Neo](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |\n",
"\n",
"\n",
"| Style | Description |\n",
"| --------- | ------------------------------------------------------------ |\n",
"| Novel | For regular story writing, not compatible with Adventure mode or other specialty modes. |\n",
"| NSFW | Indicates that the model is strongly biased towards NSFW content and is not suitable for children, work environments or livestreaming. Most NSFW models are also Novel models in nature. |\n",
"| Adventure | These models are excellent for people willing to play KoboldAI like a Text Adventure game and are meant to be used with Adventure mode enabled. Even if you wish to use it as a Novel style model you should always have Adventure mode on and set it to story. These models typically have a strong bias towards the use of the word You and without Adventure mode enabled break the story flow and write actions on your behalf. |\n",
"| Generic | Generic models are not trained towards anything specific, typically used as a basis for other tasks and models. They can do everything the other models can do, but require much more handholding to work properly. Generic models are an ideal basis for tasks that we have no specific model for, or for experiencing a softprompt in its raw form. |\n",
"\n",
@ -171,10 +239,39 @@
"7. As you play KoboldAI, keep this Colab tab open in the background and check occationally for Captcha's so they do not shut your instance down. If you do get shut down you can always download a copy of your gamesave in the Save menu inside KoboldAI. Stories are never lost as long as you keep KoboldAI open in your browser.\n",
"\n",
"Get a error message saying you do not have access to a GPU/TPU instance? Do not continue and try again later, KoboldAI will not run correctly without them."
],
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Lrm840I33hkC"
}
"cellView": "form",
"id": "5k8fK4F6UiTs"
},
"outputs": [],
"source": [
"#@title <b>Model Cleaner</b>\n",
"#@markdown Out of space? Run this to remove all cached models (Google Drive models are not effected).\n",
"!rm -rf /content/KoboldAI-Client/cache/*\n"
]
}
]
],
"metadata": {
"accelerator": "GPU",
"colab": {
"name": "ColabKobold GPU",
"private_outputs": true,
"provenance": [],
"include_colab_link": true
},
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 0
}

View File

@ -46,7 +46,7 @@
"#@title <-- Tap this if you play on Mobile { display-mode: \"form\" }\n",
"%%html\n",
"<b>Press play on the music player to keep the tab alive, then start KoboldAI below (Uses only 13MB of data)</b><br/>\n",
"<audio src=\"https://henk.tech/colabkobold/silence.m4a\" controls>"
"<audio src=\"https://raw.githubusercontent.com/KoboldAI/KoboldAI-Client/main/colab/silence.m4a\" controls>"
],
"metadata": {
"id": "ZIL7itnNaw5V"
@ -66,9 +66,10 @@
"#@title <b><-- Select your model below and then click this to start KoboldAI</b>\n",
"#@markdown You can find a description of the models below along with instructions on how to start KoboldAI.\n",
"\n",
"Model = \"Nerys 13B V2\" #@param [\"Nerys 13B V2\", \"Erebus 13B\", \"Janeway 13B\", \"Shinen 13B\", \"Skein 20B\", \"Erebus 20B\", \"Skein 6B\", \"Janeway 6B\", \"Adventure 6B\", \"Shinen 6B\", \"Lit V2 6B\", \"Lit 6B\", \"NeoX 20B\", \"OPT 13B\", \"Fairseq Dense 13B\", \"GPT-J-6B\"] {allow-input: true}\n",
"Model = \"Nerys 13B V2\" #@param [\"Nerys 13B V2\", \"Janeway 13B\", \"Skein 20B\", \"Skein 6B\", \"Janeway 6B\", \"Adventure 6B\", \"NeoX 20B\", \"OPT 13B\", \"Fairseq Dense 13B\", \"GPT-J-6B\"] {allow-input: true}\n",
"Version = \"Official\" #@param [\"Official\", \"United\"] {allow-input: true}\n",
"Provider = \"Cloudflare\" #@param [\"Localtunnel\", \"Cloudflare\"]\n",
"use_google_drive = True #@param {type:\"boolean\"}\n",
"\n",
"import os\n",
"try:\n",
@ -79,7 +80,16 @@
" raise RuntimeError(\"⚠You can not run this notebook without the TPU accelerator, go to Runtime->Sessions, terminate your session and then try again.⚠️\")\n",
"print('Now we will need your Google Drive to store settings and saves, you must login with the same account you used for Colab.')\n",
"from google.colab import drive\n",
"drive.mount('/content/drive/')\n",
"if use_google_drive:\n",
" drive.mount('/content/drive/')\n",
"else:\n",
" import os\n",
" if not os.path.exists(\"/content/drive\"):\n",
" os.mkdir(\"/content/drive\")\n",
" if not os.path.exists(\"/content/drive/MyDrive/\"):\n",
" os.mkdir(\"/content/drive/MyDrive/\")\n",
"\n",
"Revision = \"\"\n",
"\n",
"if Model == \"Janeway 13B\":\n",
" Model = \"KoboldAI/fairseq-dense-13B-Janeway\"\n",
@ -89,18 +99,6 @@
" Model = \"KoboldAI/OPT-13B-Nerys-v2\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Erebus 13B\":\n",
" Model = \"KoboldAI/OPT-13B-Erebus\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Shinen 13B\":\n",
" Model = \"KoboldAI/fairseq-dense-13B-Shinen\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Erebus 20B\":\n",
" Model = \"KoboldAI/GPT-NeoX-20B-Erebus\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Skein 20B\":\n",
" Model = \"KoboldAI/GPT-NeoX-20B-Skein\"\n",
" path = \"\"\n",
@ -121,18 +119,6 @@
" Model = \"KoboldAI/GPT-J-6B-Adventure\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Lit V2 6B\":\n",
" Model = \"hakurei/litv2-6B-rev3\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Lit 6B\":\n",
" Model = \"hakurei/lit-6B\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"Shinen 6B\":\n",
" Model = \"KoboldAI/GPT-J-6B-Shinen\"\n",
" path = \"\"\n",
" download = \"\"\n",
"elif Model == \"OPT 13B\":\n",
" Model = \"facebook/opt-13b\"\n",
" path = \"\"\n",
@ -154,7 +140,7 @@
"else:\n",
" tunnel = \"\"\n",
"\n",
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin $path$download -m $Model -g $Version $tunnel"
"!wget https://koboldai.org/ckds -O - | bash /dev/stdin $path$download -m $Model -g $Version $tunnel $Revision"
]
},
{
@ -162,36 +148,33 @@
"source": [
"# TPU Edition Model Descriptions\n",
"\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | 13B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | 13B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Shinen](https://huggingface.co/KoboldAI/fairseq-dense-13B-Shinen) by Mr Seeker | 13B | NSFW | Shinen is an NSFW model designed to be more explicit. Trained on a variety of stories from the website Sexstories it contains many different kinks. |\n",
"| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\\_FORBRYDERNE | 6B | Adventure | Skein is best used with Adventure mode enabled, it consists of a 4 times larger adventure dataset than the Adventure model making it excellent for text adventure gaming. On top of that it also consists of light novel training further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write Novels with it, but dedicated Novel models can perform better for this task. |\n",
"| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\\_FORBRYDERNE | 6B | Adventure | Adventure is a 6B model designed to mimick the behavior of AI Dungeon. It is exclusively for Adventure Mode and can take you on the epic and wackey adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |\n",
"| [Lit](https://huggingface.co/hakurei/lit-6B) by Haru | 6B | NSFW | Lit is a great NSFW model trained by Haru on both a large set of Literotica stories and high quality novels along with tagging support. Creating a high quality model for your NSFW stories. This model is exclusively a novel model and is best used in third person. |\n",
"| Neo(X) by EleutherAI | 20B | Generic | NeoX is the largest EleutherAI model currently available, being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. 20B's performance is closely compared to the 13B models and it is worth trying both especially if you have a task that does not involve english writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset but with more sensitivity towards repetition penalty and with more knowledge. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | 13B | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. |\n",
"| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | 6B | Generic | This model serves as the basis for most other 6B models (Some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular it is suitable for a variety of tasks such as writing, Q&A and coding tasks. You will likely get better result with larger generic models or finetuned models. |\n",
"\n",
"| Model | Style | Description |\n",
"| --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-13B-Nerys) by Mr Seeker | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway](https://huggingface.co/KoboldAI/fairseq-dense-13B-Janeway) by Mr Seeker | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Skein](https://huggingface.co/KoboldAI/GPT-J-6B-Skein) by VE\\_FORBRYDERNE | Adventure | Skein is best used with Adventure mode enabled, it consists of a 4 times larger adventure dataset than the Adventure model making it excellent for text adventure gaming. On top of that it also consists of light novel training further expanding its knowledge and writing capabilities. It can be used with the You filter bias if you wish to write Novels with it, but dedicated Novel models can perform better for this task. |\n",
"| [Adventure](https://huggingface.co/KoboldAI/GPT-J-6B-Adventure) by VE\\_FORBRYDERNE | Adventure | Adventure is a 6B model designed to mimick the behavior of AI Dungeon. It is exclusively for Adventure Mode and can take you on the epic and wackey adventures that AI Dungeon players love. It also features the many tropes of AI Dungeon as it has been trained on very similar data. It must be used in second person (You). |\n",
"| [OPT](https://huggingface.co/facebook/opt-13b) by Metaseq | Generic | OPT is considered one of the best base models as far as content goes, its behavior has the strengths of both GPT-Neo and Fairseq Dense. Compared to Neo duplicate and unnecessary content has been left out, while additional literature was added in similar to the Fairseq Dense model. The Fairseq Dense model however lacks the broader data that OPT does have. The biggest downfall of OPT is its license, which prohibits any commercial usage, or usage beyond research purposes. |\n",
"| [Neo(X)](https://huggingface.co/EleutherAI/gpt-neox-20b) by EleutherAI | Generic | NeoX is the largest EleutherAI model currently available, being a generic model it is not particularly trained towards anything and can do a variety of writing, Q&A and coding tasks. 20B's performance is closely compared to the 13B models and it is worth trying both especially if you have a task that does not involve english writing. Its behavior will be similar to the GPT-J-6B model since they are trained on the same dataset but with more sensitivity towards repetition penalty and with more knowledge. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-13B) | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger 20B model from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. Compared to other models the dataset focuses primarily on literature and contains little else. |\n",
"| [GPT-J-6B](https://huggingface.co/EleutherAI/gpt-j-6B) by EleutherAI | Generic | This model serves as the basis for most other 6B models (Some being based on Fairseq Dense instead). Being trained on the Pile and not biased towards anything in particular it is suitable for a variety of tasks such as writing, Q&A and coding tasks. You will likely get better result with larger generic models or finetuned models. |\n",
"\n",
"# [GPU Edition Model Descriptions](https://colab.research.google.com/github/KoboldAI/KoboldAI-Client/blob/main/colab/GPU.ipynb)\n",
"\n",
"| Model | Size | Style | Description |\n",
"| --- | --- | --- | --- |\n",
"| [Nerys 2.7B](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | 2.7B | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | 2.7B | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Picard 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | 2.7B | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model this model is designed for Novels of a variety of genre's. It is meant to be used in KoboldAI's regular mode. |\n",
"| [AID 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | 2.7B | Adventure | Also know as Adventure 2.7B this is a clone of the AI Dungeon Classic model and is best known for the epic wackey adventures that AI Dungeon Classic players love. |\n",
"| [Horni LN 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni-LN) by finetune | 2.7B | Novel | This model is based on Horni 2.7B and retains its NSFW knowledge, but was then further biased towards SFW novel stories. If you seek a balance between a SFW Novel model and a NSFW model this model should be a good choice. |\n",
"| [Horni 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni) by finetune | 2.7B | NSFW | This model is tuned on Literotica to produce a Novel style model biased towards NSFW content. Can still be used for SFW stories but will have a bias towards NSFW content. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Shinen 2.7B](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Shinen) by Mr Seeker | 2.7B | NSFW | Shinen is an alternative to the Horni model designed to be more explicit. If Horni is to tame for you Shinen might produce better results. While it is a Novel model it is unsuitable for SFW stories due to its heavy NSFW bias. Shinen will not hold back. It is meant to be used in KoboldAI's regular mode. |\n",
"| [Neo 2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | 2.7B | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |\n",
"| Model | Style | Description |\n",
"| --- | --- | --- |\n",
"| [Nerys](https://huggingface.co/KoboldAI/fairseq-dense-2.7B-Nerys) by Mr Seeker | Novel/Adventure | Nerys is a hybrid model based on Pike (A newer Janeway), on top of the Pike dataset you also get some Light Novels, Adventure mode support and a little bit of Shinen thrown in the mix. The end result is a very diverse model that is heavily biased towards SFW novel writing, but one that can go beyond its novel training and make for an excellent adventure model to. Adventure mode is best played from a second person perspective, but can be played in first or third person as well. Novel writing can be done best from the first or third person. |\n",
"| [Janeway](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Janeway) by Mr Seeker | Novel | Janeway is a model created from Picard's dataset combined with a brand new collection of ebooks. This model is trained on 20% more content than Picard and has been trained on literature from various genres. Although the model is mainly focussed on SFW, romantic scenes might involve a degree of nudity. |\n",
"| [Picard](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Picard) by Mr Seeker | Novel | Picard is a model trained for SFW Novels based on Neo 2.7B. It is focused on Novel style writing without the NSFW bias. While the name suggests a sci-fi model this model is designed for Novels of a variety of genre's. It is meant to be used in KoboldAI's regular mode. |\n",
"| [AID](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-AID) by melastacho | Adventure | Also know as Adventure 2.7B this is a clone of the AI Dungeon Classic model and is best known for the epic wackey adventures that AI Dungeon Classic players love. |\n",
"| [OPT](https://huggingface.co/facebook/opt-2.7b) by Metaseq | Generic | OPT is considered one of the best base models as far as content goes, its behavior has the strengths of both GPT-Neo and Fairseq Dense. Compared to Neo duplicate and unnecessary content has been left out, while additional literature was added in similar to the Fairseq Dense model. The Fairseq Dense model however lacks the broader data that OPT does have. The biggest downfall of OPT is its license, which prohibits any commercial usage, or usage beyond research purposes. |\n",
"| [Fairseq Dense](https://huggingface.co/KoboldAI/fairseq-dense-2.7B) | Generic | Trained by Facebook Researchers this model stems from the MOE research project within Fairseq. This particular version has been converted by us for use in KoboldAI. It is known to be on par with the larger models from EleutherAI and considered as better for pop culture and language tasks. Because the model has never seen a new line (enter) it may perform worse on formatting and paragraphing. Compared to other models the dataset focuses primarily on literature and contains little else. |\n",
"| [Neo](https://huggingface.co/EleutherAI/gpt-neo-2.7B) by EleutherAI | Generic | This is the base model for all the other 2.7B models, it is best used when you have a use case that we have no other models available for, such as writing blog articles or programming. It can also be a good basis for the experience of some of the softprompts if your softprompt is not about a subject the other models cover. |\n",
"\n",
"\n",
"| Style | Description |\n",
"| --- | --- |\n",
"| Novel | For regular story writing, not compatible with Adventure mode or other specialty modes. |\n",
"| NSFW | Indicates that the model is strongly biased towards NSFW content and is not suitable for children, work environments or livestreaming. Most NSFW models are also Novel models in nature. |\n",
"| Adventure | These models are excellent for people willing to play KoboldAI like a Text Adventure game and are meant to be used with Adventure mode enabled. Even if you wish to use it as a Novel style model you should always have Adventure mode on and set it to story. These models typically have a strong bias towards the use of the word You and without Adventure mode enabled break the story flow and write actions on your behalf. |\n",
"| Generic | Generic models are not trained towards anything specific, typically used as a basis for other tasks and models. They can do everything the other models can do, but require much more handholding to work properly. Generic models are an ideal basis for tasks that we have no specific model for, or for experiencing a softprompt in its raw form. |\n",
"\n",
@ -227,7 +210,6 @@
"name": "ColabKobold TPU",
"provenance": [],
"private_outputs": true,
"collapsed_sections": [],
"include_colab_link": true
},
"kernelspec": {

BIN
colab/silence.m4a Normal file

Binary file not shown.

View File

@ -1,5 +1,7 @@
@echo off
cd /D %~dp0
SET CONDA_SHLVL=
TITLE CMD for KoboldAI Runtime
SET /P M=<loader.settings
IF %M%==1 GOTO drivemap

View File

@ -0,0 +1 @@
{"aria2_port":null, "breakmodel":null, "breakmodel_disklayers":null, "breakmodel_gpulayers":null, "breakmodel_layers":null, "colab":null, "configname":null, "cpu":null, "host":null, "localtunnel":null, "lowmem":null, "model":null, "ngrok":null, "no_aria2":null, "noaimenu":null, "nobreakmodel":null, "override_delete":null, "override_rename":null, "path":null, "port":null, "quiet":null, "remote":null, "revision":null, "savemodel":null, "unblock":null}

View File

@ -6,4 +6,4 @@ WORKDIR /content/
COPY env.yml /home/micromamba/env.yml
RUN micromamba install -y -n base -f /home/micromamba/env.yml
USER root
RUN apt update && apt install xorg -y
RUN apt update && apt install xorg aria2 -y

View File

@ -5,6 +5,8 @@ services:
environment:
- DISPLAY=${DISPLAY}
network_mode: "host"
security_opt:
- label:disable
volumes:
- /tmp/.X11-unix:/tmp/.X11-unix
- /etc/protocols:/etc/protocols:ro

View File

@ -3,4 +3,4 @@ WORKDIR /content/
COPY env.yml /home/micromamba/env.yml
RUN micromamba install -y -n base -f /home/micromamba/env.yml
USER root
RUN apt update && apt install xorg libsqlite3-0 -y
RUN apt update && apt install xorg libsqlite3-0 aria2 -y

View File

@ -5,6 +5,8 @@ services:
environment:
- DISPLAY=${DISPLAY}
network_mode: "host"
security_opt:
- label:disable
volumes:
- /tmp/.X11-unix:/tmp/.X11-unix
- /etc/protocols:/etc/protocols:ro

View File

@ -0,0 +1,8 @@
FROM debian
RUN apt update && apt install wget aria2 git bzip2 -y
RUN git clone https://github.com/koboldai/koboldai-client /opt/koboldai
WORKDIR /opt/koboldai
RUN ./install_requirements.sh cuda
COPY docker-helper.sh /opt/koboldai/docker-helper.sh
EXPOSE 5000/tcp
CMD /opt/koboldai/docker-helper.sh

View File

@ -0,0 +1,17 @@
These are the source files for the official versions of the standalone docker and are provided for completeness.
Using these files, none of the local modifications you make will be used; instead the latest GitHub version of KoboldAI is used as the basis.
If you wish to run KoboldAI containerised with access to the local directory, you can do so using docker-cuda.sh or docker-rocm.sh instead.
We do not support ROCm in the standalone docker as it is intended for cloud deployment on CUDA systems.
If you wish to build a ROCm version instead, you can do so by modifying the Dockerfile and changing the install_requirements.sh argument from cuda to rocm.
Similarly, you need to modify the Dockerfile to specify which branch of KoboldAI the docker is being built for.
Usage:
This docker will automatically assume the persistent volume is mounted to /content and will by default not store models there.
The following environment variables exist to adjust the behavior if desired.
KOBOLDAI_DATADIR=/content , this can be used to specify a different default location for your stories, settings, userscripts, etc. in case your provider does not let you change the mounted folder path.
KOBOLDAI_MODELDIR= , this variable can be used to make model storage persistent; it can be the same location as your datadir but this is not required.
KOBOLDAI_ARGS= , this variable is built into KoboldAI and can be used to override the default launch options. Right now the docker by default will launch in remote mode, with output hidden from the logs and file management enabled.
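For reference, a minimal usage sketch; the image tag, host path and published port below are illustrative assumptions, not values defined by this repository:
    # Build the standalone image from this directory
    docker build -t koboldai-standalone .
    # Run it with a persistent volume mounted to /content, storing models there as well
    docker run -it -p 5000:5000 -v /srv/koboldai:/content -e KOBOLDAI_MODELDIR=/content koboldai-standalone
The image exposes port 5000 (see the Dockerfile), and docker-helper.sh creates the stories, settings, softprompts and userscripts folders inside KOBOLDAI_DATADIR on first start.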

View File

@ -0,0 +1,47 @@
#!/bin/bash
cd /opt/koboldai
git pull
#./install_requirements.sh cuda
if [[ ! -v KOBOLDAI_DATADIR ]];then
mkdir /content
KOBOLDAI_DATADIR=/content
fi
mkdir $KOBOLDAI_DATADIR/stories
if [[ -v KOBOLDAI_MODELDIR ]];then
mkdir $KOBOLDAI_MODELDIR/models
fi
mkdir $KOBOLDAI_DATADIR/settings
mkdir $KOBOLDAI_DATADIR/softprompts
mkdir $KOBOLDAI_DATADIR/userscripts
#mkdir $KOBOLDAI_MODELDIR/cache
cp -rn stories/* $KOBOLDAI_DATADIR/stories/
cp -rn userscripts/* $KOBOLDAI_DATADIR/userscripts/
cp -rn softprompts/* $KOBOLDAI_DATADIR/softprompts/
rm stories
rm -rf stories/
rm userscripts
rm -rf userscripts/
rm softprompts
rm -rf softprompts/
if [[ -v KOBOLDAI_MODELDIR ]];then
rm models
rm -rf models/
#rm cache
#rm -rf cache/
fi
ln -s $KOBOLDAI_DATADIR/stories/ stories
ln -s $KOBOLDAI_DATADIR/settings/ settings
ln -s $KOBOLDAI_DATADIR/softprompts/ softprompts
ln -s $KOBOLDAI_DATADIR/userscripts/ userscripts
if [[ -v KOBOLDAI_MODELDIR ]];then
ln -s $KOBOLDAI_MODELDIR/models/ models
#ln -s $KOBOLDAI_MODELDIR/cache/ cache
fi
PYTHONUNBUFFERED=1 ./play.sh --remote --quiet --override_delete --override_rename

View File

@ -1,22 +0,0 @@
name: koboldai
channels:
- pytorch
- conda-forge
- defaults
dependencies:
- colorama
- flask-socketio
- pytorch
- cudatoolkit=11.1
- tensorflow-gpu
- python=3.8.*
- eventlet
- markdown
- bleach=4.1.0
- pip
- git=2.35.1
- pip:
- git+https://github.com/finetuneanon/transformers@gpt-neo-localattention3-rp-b
- flask-cloudflared
- flask-ngrok
- lupa==1.10

View File

@ -5,20 +5,33 @@ channels:
- defaults
dependencies:
- colorama
- flask-socketio
- flask=2.2.3
- flask-socketio=5.3.2
- flask-session=0.4.0
- python-socketio=5.7.2
- pytorch=1.11.*
- python=3.8.*
- cudatoolkit=11.1
- eventlet
- eventlet=0.33.3
- dnspython=2.2.1
- markdown
- bleach=4.1.0
- pip
- git=2.35.1
- sentencepiece
- protobuf
- marshmallow>=3.13
- apispec-webframeworks
- loguru
- termcolor
- psutil
- pip:
- flask-cloudflared
- flask-cloudflared==0.0.10
- flask-ngrok
- Werkzeug==2.3.7
- lupa==1.10
- transformers>=4.20.1
- accelerate
- transformers==4.24.0
- huggingface_hub==0.12.1
- safetensors
- accelerate
- git+https://github.com/VE-FORBRYDERNE/mkultra

View File

@ -1,21 +0,0 @@
name: koboldai-ft
channels:
- conda-forge
- defaults
dependencies:
- colorama
- flask-socketio
- python=3.8.*
- eventlet
- markdown
- bleach=4.1.0
- pip
- git=2.35.1
- pip:
- --find-links https://download.pytorch.org/whl/rocm4.2/torch_stable.html
- torch
- torchvision==0.11.1
- flask-cloudflared
- git+https://github.com/finetuneanon/transformers@gpt-neo-localattention3-rp-b
- flask-ngrok
- lupa==1.10

View File

@ -4,21 +4,33 @@ channels:
- defaults
dependencies:
- colorama
- flask-socketio
- flask=2.2.3
- flask-socketio=5.3.2
- flask-session=0.4.0
- python-socketio=5.7.2
- python=3.8.*
- eventlet
- eventlet=0.33.3
- dnspython=2.2.1
- markdown
- bleach=4.1.0
- pip
- git=2.35.1
- sentencepiece
- protobuf
- marshmallow>=3.13
- apispec-webframeworks
- loguru
- termcolor
- psutil
- pip:
- --find-links https://download.pytorch.org/whl/rocm4.2/torch_stable.html
- torch==1.10.*
- torchvision
- flask-cloudflared
- --extra-index-url https://download.pytorch.org/whl/rocm5.1.1
- torch==1.12.1+rocm5.1.1
- flask-cloudflared==0.0.10
- flask-ngrok
- Werkzeug==2.3.7
- lupa==1.10
- transformers>=4.20.1
- transformers==4.24.0
- huggingface_hub==0.12.1
- safetensors
- accelerate
- git+https://github.com/VE-FORBRYDERNE/mkultra

View File

@ -3,6 +3,7 @@ from typing import Tuple, Union, Optional
import os
import json
import zipfile
from logger import logger
#==================================================================#
# Generic Method for prompting for file path
@ -85,7 +86,7 @@ def uspath(filename):
def getstoryfiles():
list = []
for file in listdir("stories"):
if file.endswith(".json"):
if file.endswith(".json") and not file.endswith(".v2.json"):
ob = {}
ob["name"] = file.replace(".json", "")
f = open("stories/"+file, "r")
@ -149,16 +150,16 @@ def getspfiles(model_dimension: int):
continue
z, version, shape, fortran_order, dtype = checksp(file, model_dimension)
if z == 1:
print(f"Browser SP loading error: {file} is malformed or not a soft prompt ZIP file.")
logger.warning(f"Softprompt {file} is malformed or not a soft prompt ZIP file.")
continue
if z == 2:
print(f"Browser SP loading error: {file} tensor.npy has unsupported dtype '{dtype.name}'.")
logger.warning(f"Softprompt {file} tensor.npy has unsupported dtype '{dtype.name}'.")
continue
if z == 3:
print(f"Browser SP loading error: {file} tensor.npy has model dimension {shape[1]} which does not match your model's model dimension of {model_dimension}. This usually means this soft prompt is not compatible with your model.")
logger.debug(f"Softprompt {file} tensor.npy has model dimension {shape[1]} which does not match your model's model dimension of {model_dimension}. This usually means this soft prompt is not compatible with your model.")
continue
if z == 4:
print(f"Browser SP loading error: {file} tensor.npy has {shape[0]} tokens but it is supposed to have less than 2048 tokens.")
logger.warning(f"Softprompt {file} tensor.npy has {shape[0]} tokens but it is supposed to have less than 2048 tokens.")
continue
assert isinstance(z, zipfile.ZipFile)
try:

View File

@ -230,6 +230,50 @@ gensettingstf = [
"default": 0,
"tooltip": "Disables userscript generation modifiers."
},
{
"uitype": "toggle",
"unit": "bool",
"label": "Full Determinism",
"id": "setfulldeterminism",
"min": 0,
"max": 1,
"step": 1,
"default": 0,
"tooltip": "Causes generation to be fully deterministic -- the model will always output the same thing as long as your story, settings and RNG seed are the same. If this is off, only the sequence of outputs that the model makes will be deterministic."
},
{
"uitype": "toggle",
"unit": "bool",
"label": "Token Streaming",
"id": "setoutputstreaming",
"min": 0,
"max": 1,
"step": 1,
"default": 0,
"tooltip": "Shows outputs to you as they are made. Does not work with more than one gens per action."
},
{
"uitype": "toggle",
"unit": "bool",
"label": "Probability Viewer",
"id": "setshowprobs",
"min": 0,
"max": 1,
"step": 1,
"default": 0,
"tooltip": "Shows token selection probabilities. Does not work with more than one gens per action."
},
{
"uitype": "toggle",
"unit": "bool",
"label": "Show Field Budget",
"id": "setshowbudget",
"min": 0,
"max": 1,
"step": 1,
"default": 0,
"tooltip": "Shows token usage when typing in relevant text boxes. <b>May lag slower devices.</b>"
},
{
"uitype": "toggle",
"unit": "bool",
@ -240,7 +284,7 @@ gensettingstf = [
"step": 1,
"default": 0,
"tooltip": "Show debug info"
}
},
]
gensettingsik =[{
@ -404,9 +448,9 @@ formatcontrols = [{
"tooltip": "Remove special characters (@,#,%,^, etc)"
},
{
"label": "Add sentence spacing",
"label": "Automatic spacing",
"id": "frmtadsnsp",
"tooltip": "If the last action ended with punctuation, add a space to the beginning of the next action."
"tooltip": "Add spaces automatically if needed"
},
{
"label": "Single Line",

View File

@ -8,6 +8,7 @@ echo.
Reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v "LongPathsEnabled" /t REG_DWORD /d "1" /f 2>nul
cd /D %~dp0
SET CONDA_SHLVL=
if exist miniconda3\ (
echo Delete existing installation?

View File

@ -1,12 +1,12 @@
#!/bin/bash
if [[ $1 = "cuda" ]]; then
if [[ $1 = "cuda" || $1 = "CUDA" ]]; then
wget -qO- https://micromamba.snakepit.net/api/micromamba/linux-64/latest | tar -xvj bin/micromamba
bin/micromamba create -f environments/huggingface.yml -r runtime -n koboldai -y
# Weird micromamba bug causes it to fail the first time, running it twice just to be safe, the second time is much faster
bin/micromamba create -f environments/huggingface.yml -r runtime -n koboldai -y
exit
fi
if [[ $1 = "rocm" ]]; then
if [[ $1 = "rocm" || $1 = "ROCM" ]]; then
wget -qO- https://micromamba.snakepit.net/api/micromamba/linux-64/latest | tar -xvj bin/micromamba
bin/micromamba create -f environments/rocm.yml -r runtime -n koboldai-rocm -y
# Weird micromamba bug causes it to fail the first time, running it twice just to be safe, the second time is much faster
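For reference, with this change the parameter is accepted in all-lowercase or all-uppercase form (mixed case such as "Cuda" is still not matched); a quick usage sketch:
    ./install_requirements.sh cuda   # lowercase, as before, creates the runtime from environments/huggingface.yml
    ./install_requirements.sh CUDA   # uppercase now works the same way
    ./install_requirements.sh rocm   # or ROCM for the AMD (ROCm) runtime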

99
logger.py Normal file
View File

@ -0,0 +1,99 @@
import sys
from functools import partialmethod
from loguru import logger
STDOUT_LEVELS = ["GENERATION", "PROMPT"]
INIT_LEVELS = ["INIT", "INIT_OK", "INIT_WARN", "INIT_ERR"]
MESSAGE_LEVELS = ["MESSAGE"]
# By default we're at error level or higher
verbosity = 20
quiet = 0
def set_logger_verbosity(count):
global verbosity
# The count comes reversed. So count = 0 means minimum verbosity
# While count 5 means maximum verbosity
# So the more count we have, the lower we drop the verbosity maximum
verbosity = 20 - (count * 10)
def quiesce_logger(count):
global quiet
# The bigger the count, the more silent we want our logger
quiet = count * 10
def is_stdout_log(record):
if record["level"].name not in STDOUT_LEVELS:
return(False)
if record["level"].no < verbosity + quiet:
return(False)
return(True)
def is_init_log(record):
if record["level"].name not in INIT_LEVELS:
return(False)
if record["level"].no < verbosity + quiet:
return(False)
return(True)
def is_msg_log(record):
if record["level"].name not in MESSAGE_LEVELS:
return(False)
if record["level"].no < verbosity + quiet:
return(False)
return(True)
def is_stderr_log(record):
if record["level"].name in STDOUT_LEVELS + INIT_LEVELS + MESSAGE_LEVELS:
return(False)
if record["level"].no < verbosity + quiet:
return(False)
return(True)
def test_logger():
logger.generation("This is a generation message\nIt is typically multiline\nThee Lines".encode("unicode_escape").decode("utf-8"))
logger.prompt("This is a prompt message")
logger.debug("Debug Message")
logger.info("Info Message")
logger.warning("Info Warning")
logger.error("Error Message")
logger.critical("Critical Message")
logger.init("This is an init message", status="Starting")
logger.init_ok("This is an init message", status="OK")
logger.init_warn("This is an init message", status="Warning")
logger.init_err("This is an init message", status="Error")
logger.message("This is user message")
sys.exit()
logfmt = "<level>{level: <10}</level> | <green>{name}</green>:<green>{function}</green>:<green>{line}</green> - <level>{message}</level>"
genfmt = "<level>{level: <10}</level> @ <green>{time:YYYY-MM-DD HH:mm:ss}</green> | <level>{message}</level>"
initfmt = "<magenta>INIT </magenta> | <level>{extra[status]: <10}</level> | <magenta>{message}</magenta>"
msgfmt = "<level>{level: <10}</level> | <level>{message}</level>"
logger.level("GENERATION", no=24, color="<cyan>")
logger.level("PROMPT", no=23, color="<yellow>")
logger.level("INIT", no=31, color="<white>")
logger.level("INIT_OK", no=31, color="<green>")
logger.level("INIT_WARN", no=31, color="<yellow>")
logger.level("INIT_ERR", no=31, color="<red>")
# Messages contain important information without which this application might not be able to be used
# As such, they have the highest priority
logger.level("MESSAGE", no=61, color="<green>")
logger.__class__.generation = partialmethod(logger.__class__.log, "GENERATION")
logger.__class__.prompt = partialmethod(logger.__class__.log, "PROMPT")
logger.__class__.init = partialmethod(logger.__class__.log, "INIT")
logger.__class__.init_ok = partialmethod(logger.__class__.log, "INIT_OK")
logger.__class__.init_warn = partialmethod(logger.__class__.log, "INIT_WARN")
logger.__class__.init_err = partialmethod(logger.__class__.log, "INIT_ERR")
logger.__class__.message = partialmethod(logger.__class__.log, "MESSAGE")
config = {
"handlers": [
{"sink": sys.stderr, "format": logfmt, "colorize":True, "filter": is_stderr_log},
{"sink": sys.stdout, "format": genfmt, "level": "PROMPT", "colorize":True, "filter": is_stdout_log},
{"sink": sys.stdout, "format": initfmt, "level": "INIT", "colorize":True, "filter": is_init_log},
{"sink": sys.stdout, "format": msgfmt, "level": "MESSAGE", "colorize":True, "filter": is_msg_log}
],
}
logger.configure(**config)

30
maps/bloom.json Normal file
View File

@ -0,0 +1,30 @@
{
"mtj_compat": "bloom",
"mtj_pe": "alibi",
"mtj_config_map": {
"d_model": "n_embed",
"n_heads": "num_attention_heads",
"layers": "n_layer"
},
"static_weights": {
"word_embeddings.weight": {"mtj": {"module": "embedding_shard/~/linear", "param": "w", "transforms": ["no_transpose", "vocab_pad"]}},
"word_embeddings_layernorm.weight": {"mtj": {"module": "embedding_shard/~/replicated_layer_norm", "param": "scale"}},
"word_embeddings_layernorm.bias": {"mtj": {"module": "embedding_shard/~/replicated_layer_norm", "param": "offset"}},
"ln_f.weight": {"mtj": {"module": "projection_shard/~/replicated_layer_norm", "param": "scale"}},
"ln_f.bias": {"mtj": {"module": "projection_shard/~/replicated_layer_norm", "param": "offset"}}
},
"layer_weights": {
"h.{layer}.self_attention.query_key_value.weight": {"mtj": {"module": "layer_{layer}/~/combined_qkv", "param": "w"}},
"h.{layer}.self_attention.query_key_value.bias": {"mtj": {"module": "layer_{layer}/~/combined_qkv", "param": "b"}},
"h.{layer}.self_attention.dense.weight": {"mtj": {"module": "layer_{layer}/~/linear_3", "param": "w"}},
"h.{layer}.self_attention.dense.bias": {"mtj": {"module": "layer_{layer}/~/linear_3", "param": "b", "transforms": ["divide_by_shards"]}},
"h.{layer}.mlp.dense_h_to_4h.weight": {"mtj": {"module": "layer_{layer}/~/linear_4", "param": "w"}},
"h.{layer}.mlp.dense_h_to_4h.bias": {"mtj": {"module": "layer_{layer}/~/linear_4", "param": "b"}},
"h.{layer}.mlp.dense_4h_to_h.weight": {"mtj": {"module": "layer_{layer}/~/linear_5", "param": "w"}},
"h.{layer}.mlp.dense_4h_to_h.bias": {"mtj": {"module": "layer_{layer}/~/linear_5", "param": "b", "transforms": ["divide_by_shards"]}},
"h.{layer}.input_layernorm.weight": {"mtj": {"module": "layer_{layer}/~/replicated_layer_norm", "param": "scale"}},
"h.{layer}.input_layernorm.bias": {"mtj": {"module": "layer_{layer}/~/replicated_layer_norm", "param": "offset"}},
"h.{layer}.post_attention_layernorm.weight": {"mtj": {"module": "layer_{layer}/~/replicated_layer_norm_1", "param": "scale"}},
"h.{layer}.post_attention_layernorm.bias": {"mtj": {"module": "layer_{layer}/~/replicated_layer_norm_1", "param": "offset"}}
}
}

View File

@ -9,11 +9,11 @@
},
"static_weights": {
"transformer.wte.weight": {"mtj": {"module": "embedding_shard/~/linear", "param": "w", "transforms": ["no_transpose", "vocab_pad"]}},
"transformer.wte.bias": {"mtj": {"module": "embedding_shard/~/linear", "param": "b"}},
"transformer.wte.bias": {"mtj": {"module": "embedding_shard/~/linear", "param": "b", "transforms": ["vocab_pad"]}},
"transformer.ln_f.weight": {"mtj": {"module": "projection_shard/~/replicated_layer_norm", "param": "scale"}},
"transformer.ln_f.bias": {"mtj": {"module": "projection_shard/~/replicated_layer_norm", "param": "offset"}},
"lm_head.weight": {"mtj": {"module": "projection_shard/~/linear", "param": "w", "transforms": ["vocab_pad"]}},
"lm_head.bias": {"mtj": {"module": "projection_shard/~/linear", "param": "b"}}
"lm_head.bias": {"mtj": {"module": "projection_shard/~/linear", "param": "b", "transforms": ["vocab_pad"]}}
},
"layer_weights": {
"transformer.h.{layer}.attn.bias": {},

View File

@ -1,5 +1,9 @@
@echo off
cd /D %~dp0
SET CONDA_SHLVL=
rmdir /S /Q flask_session
TITLE KoboldAI - Server
SET /P M=<loader.settings
IF %M%==1 GOTO drivemap

1082
prompt_tuner.py Normal file

File diff suppressed because it is too large

2
pytest.ini Normal file
View File

@ -0,0 +1,2 @@
[pytest]
addopts = --ignore=miniconda3 --ignore=runtime --html=unit_test_report.html --self-contained-html -v
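For reference, a quick way to use this configuration; it assumes pytest and the pytest-html plugin (which provides the --html and --self-contained-html options) are installed in the active environment:
    pip install pytest pytest-html
    pytest   # picks up pytest.ini from the repository root and writes unit_test_report.html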

View File

@ -1,14 +1,25 @@
transformers>=4.20.1
Flask
Flask-SocketIO
transformers==4.24.0
huggingface_hub==0.12.1
Flask==2.2.3
Flask-SocketIO==5.3.2
Werkzeug==2.3.7
python-socketio==5.7.2
requests
torch==1.11
flask-cloudflared
torch >= 1.9, < 1.13
flask-cloudflared==0.0.10
flask-ngrok
eventlet
eventlet==0.33.3
dnspython==2.2.1
lupa==1.10
markdown
bleach==4.1.0
sentencepiece
protobuf
accelerate
flask-session==0.4.0
marshmallow>=3.13
apispec-webframeworks
loguru
termcolor
safetensors
git+https://github.com/VE-FORBRYDERNE/mkultra

View File

@ -1,18 +1,27 @@
torch >= 1.9, <= 1.11
torch >= 1.9, < 1.13
numpy
tqdm
requests
dm-haiku == 0.0.5
jax == 0.2.21
jaxlib >= 0.1.69, <= 0.3.7
transformers >= 4.20.1
dm-haiku==0.0.9
jax==0.3.25
jaxlib==0.3.25
chex == 0.1.5
transformers == 4.24.0
huggingface_hub==0.12.1
progressbar2
git+https://github.com/VE-FORBRYDERNE/mesh-transformer-jax@ck
flask
Flask-SocketIO
flask-cloudflared >= 0.0.5
Flask==2.2.3
Flask-SocketIO==5.3.2
python-socketio==5.7.2
flask-cloudflared==0.0.10
flask-ngrok
eventlet
Werkzeug==2.3.7
eventlet==0.33.3
dnspython==2.2.1
lupa==1.10
markdown
bleach==4.1.0
flask-session==0.4.0
marshmallow>=3.13
apispec-webframeworks
loguru

File diff suppressed because it is too large

View File

@ -291,7 +291,7 @@ body.connected #formatmenu, #formatmenu.always-available {
align-items: center;
}
#popup {
#popup_old {
width: 75%;
min-width: 500px;
max-width: 1000px;
@ -369,14 +369,14 @@ body.connected #popupfooter, #popupfooter.always-available {
margin-top: 200px;
}
#loadpopup {
.loadpopup {
width: 500px;
background-color: #262626;
margin-top: 100px;
}
@media (max-width: 768px) {
#loadpopup {
.loadpopup {
width: 100%;
background-color: #262626;
margin-top: 100px;
@ -473,7 +473,7 @@ body.connected #popupfooter, #popupfooter.always-available {
}
#samplerslist {
height: 300px;
height: 310px;
overflow-y: scroll;
overflow-wrap: anywhere;
}
@ -1056,7 +1056,7 @@ body.connected .statusiconlabel, .statusiconlabel.always-available {
}
.loadlistitem {
padding: 5px 10px 5px 10px;
padding: 0px 0px 0px 0px;
display: flex;
flex-grow: 1;
color: #ffffff;
@ -1072,6 +1072,28 @@ body.connected .statusiconlabel, .statusiconlabel.always-available {
background-color: #688f1f;
}
.breadcrumbitem {
padding: 5px 10px 5px 10px;
color: #ffffff;
background-color: transparent;
border: none;
-moz-transition: background-color 0.25s ease-in;
-o-transition: background-color 0.25s ease-in;
-webkit-transition: background-color 0.25s ease-in;
transition: background-color 0.25s ease-in;
}
.breadcrumbitem:hover {
cursor: pointer;
background-color: #688f1f;
}
hr {
padding: 0px;
margin: 0px;
}
.loadlistpadding {
padding-right: 10px;
}
@ -1463,3 +1485,240 @@ body.connected .popupfooter, .popupfooter.always-available {
overflow: hidden;
font-size: 12pt;
}
.model_layers {
width: 3ch;
background-color: inherit;
border: none;
outline: none;
}
.model_layers:focus {
color: #cdf;
}
.menu_icon {
position: fixed;
top:10px;
left: 5px;
z-index:100;
display: inline-block;
cursor: pointer;
}
.SideMenu {
height: 100%;
width: 0;
position: fixed;
z-index: 1;
top: 0;
left: 0;
background-color: #111;
overflow-x: hidden;
transition: 0.5s;
padding-top: 60px;
}
.SideMenu.open {
width: 450px;
}
@media only screen and (max-width: 768px) {
.SideMenu.open {
width: 100%;
}
}
.menubar1, .menubar2, .menubar3 {
width: 21px;
height: 3px;
background-color: #999;
margin: 3px 0;
transition: 0.4s;
}
.change .menubar1 {
transform: translate(0px, 6px) rotate(-45deg);
}
.change .menubar2 {opacity: 0;}
.change .menubar3 {
transform: translate(0px, -6px) rotate(45deg);
}
/*---------------------------------- Popup -------------------------------------------------*/
.new_popup {
position: absolute;
top: 10vh;
left: 10%;
z-index: 999;
width: 80%;
height: 80vh;
background-color: black;
display: flex;
flex-direction: column;
background-color: #474B4F;
color: white;
}
.new_popup .title {
width: 100%;
background-color: #337AB7;
text-align: center;
font-size: 1.3em;
}
.new_popup .popup_list_area {
height: 70vh;
overflow-x: hidden;
}
.new_popup .item {
width: 100%;
background-color: #262626;
padding: 2px;
display: grid;
grid-template-areas: "folder_icon delete_icon edit_icon rename_icon file";
grid-template-columns: 20px 20px 20px 20px auto;
}
.new_popup .item .folder_icon {
grid-area: folder_icon;
}
.new_popup .item .edit_icon {
grid-area: edit_icon;
}
.new_popup .item .rename_icon {
grid-area: rename_icon;
}
.new_popup .item .delete_icon {
grid-area: delete_icon;
}
.new_popup .item .file {
grid-area: file;
}
.new_popup .item .file:hover {
background-color: #688f1f;
}
.new_popup textarea {
grid-area: textarea;
background-color: #404040;
color: white;
resize: none;
width: 100%;
}
.new_popup .popup_load_cancel {
text-align: center;
background-color: #285070;
}
.popup_load_cancel_button {
vertical-align: bottom;
display: inline;
}
.popup_load_cancel_button.btn-secondary {
color: rgb(51, 51, 51);
background-color: #686c68;
}
.breadcrumbitem {
padding: 5px 10px 5px 10px;
color: #ffffff;
background-color: transparent;
border: none;
-moz-transition: background-color 0.25s ease-in;
-o-transition: background-color 0.25s ease-in;
-webkit-transition: background-color 0.25s ease-in;
transition: background-color 0.25s ease-in;
}
.breadcrumbitem:hover {
cursor: pointer;
background-color: #688f1f;
}
#token_prob_menu {
color: white;
background-color: #262626;
}
.token-probs {
display: inline-block;
text-align: center;
margin-right: 5px;
}
.token-probs > table {
width: 100%;
}
.token-probs > table > tbody > tr > td {
border: 1px solid #262626;
border-collapse: collapse;
padding: 2px 15px;
}
.token-probs > table > tbody > tr {
background-color: #3e3e3e;
}
.token-probs > table > tbody > tr:nth-child(2n) {
background-color: #575757;
}
.token-probs-final-token {
font-weight: bold;
text-decoration: underline;
}
.token-probs-final-token > td {
background: #5c8a5a;
}
.token-probs-header {
display: block;
}
#token_prob_container {
overflow-x: auto;
white-space: nowrap;
}
.tokens-counted {
position: relative;
}
.input-token-usage {
color: white;
position: absolute;
font-size: 10px;
bottom: 2px;
right: 5px;
-webkit-user-select: none;
-moz-user-select: none;
-ms-user-select: none;
user-select: none;
}
/* Override needed here due to the 10px right padding on inputrowleft; add 10 px. */
#inputrowleft > .input-token-usage {
right: 15px;
bottom: 1px;
}
.wientry > .input-token-usage {
bottom: 8px;
}

70
static/favicon.js Normal file
View File

@ -0,0 +1,70 @@
// Global Definitions
var fav_icon2 = "data:image/x-icon;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAMAAAAoLQ9TAAAABGdBTUEAALGPC/xhBQAAACBjSFJNAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB+1BMVEUAAAAAAAAAAAAAAAAAAQAAAAAAAQAAAAAAAAASFhBBWD4iUyoFEwgFEwguUTM+VDoMFAwAAAA+elIudz8AAAAAAAA0MigyLyQAAAAbLh1LdElSbUoVMBkAAABAZ0M2fkUAAAABAQFMiGQraDkAAQANFxEGFQkLFg8EEAYAAAAsZDonZjUAAABCgVVAnFYrSjhEjFpFi1sdRScAAAAjOi8VMxx1dGOFgGYAAABOTEabmIdlYlQaGhgaGhddXFauqY5JRjoAAAAAAAABAQFGeExIl1lX0XRW0XRHi1RFe02vv5W31KFd1Hpc1Hpe1HvO1KvDvJlqZ1plYVOmoIVt1IFl1H7AuZp1cV9jX1AmSCw3Nzg7NmA1MTJuz4Bm1H5MST9HPl9BQEMgNiNXgWKiobFgXICDd5dfw3RZVnJiV3zGv9Bqf29Oj2G/v8hTTpGhl8dbxHVd0npiYoxhWJvIxtlcimZFn1lRclg9SkZNblZBeEpDbEZCa0ZBc0hLY1BAS1BdaV87j01Vx3FWynJSrGZOhlVasGtas2xatm1at21WnWJQm15WyXJQvmlavnBZrGlEYEJWe1RBWz9Um2BavXBgxn9XhllGY0RLaklXiFlTwG5OpmVSfFNMbUpGZEVLa0lShldEhVCChHiKiHvWz6/Kw6WWlZGAfmj///8kr0X+AAAARHRSTlMAASFrcAhxIjLb/vWvsPb+20b4+DFFyMkz2vf43CP9/m5y9vZysLGvsQn19mz+/tz4+NxHycr3+Ejb/vaxsPX+3TRtcBrzrrgAAAABYktHRKhQCDaSAAAAB3RJTUUH5gYJFyQy3tftxgAAAQBJREFUGNNjYGBgYGRiZmFlZWNmZ2SAAA5OLm4eXj5+AQ6ogKCQi6ubu4ensCCIxygiKubl7ePr6+cfIC4owcjAJCkVGBQc4usbGhYeIS0jy8AsFxkVHRPr6xsXn5CYJK/AoKiUnJKalg5UkZGZla2swsCqmpObl1/g61tYVFxSqsbKwKpeVl5RWVVdU1tX39CoocnAotXU3NLa1t7R2dXd06utwqCj6+vb1z9h4sRJk6f4+uopMLDrG0z1nTZ94sQZM31nGRrJMjBKGJvMnjN3wrz5CxaaCnKAvSNqtmjxkqXLlptbQP0iYmllbWNrZ+/gCBVgZHdS1GR1VpAFqQcApI0/jqlZOvEAAAAldEVYdGRhdGU6Y3JlYXRlADIwMjItMDYtMDlUMjM6MzY6NTArMDA6MDDi0xr+AAAAJXRFWHRkYXRlOm1vZGlmeQAyMDIyLTA2LTA5VDIzOjM2OjUwKzAwOjAwk46iQgAAAABJRU5ErkJggg==";
var fav_icon1 = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAMAAAAoLQ9TAAAABGdBTUEAALGPC/xhBQAAACBjSFJNAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB+FBMVEUAAAAAAAAAAAAAAAAAAAEAAAAAAQEAAAAAAAAUFRlLVGYrSWgHEBoHEBk3S19HUGMOExkAAABOcos7apIAAAAAAAA2Ly01KyoAAAAgKzdVaX9bZHIaKzwAAABKYHhDcZgAAAABAQFfgJY2XX0AAQEQFhoIEhwOFRgGDRUAAAAAAQE3W3cyWnwAAABSeJJRjLs1R1FVgaFWgJ4lPlMAAAAsOD4aLj55bm2Md3QAAABPSkmfko9pXlsbGRkbGRlfWlm1oJxMQkAAAAAAAAABAQFTb4tYibFtvPpWgKNScpC6s7nExtNzwPp1wPnZx8jMsKtuZGFoXVutmJODwfJ7wfbHr6p5a2hnW1gtQlI4ODk7N2A2LzWDvet8wPZPRkRHPl9CQUQlMTthe4+ko7RhXYGEeJhzsuJaVXRjWHzIwtNwfYddhqLCwcpTTpGimMhvsuVzv/djYpBgWJvLydxlgptVirdZbX1ASFZUaXtOb4xOZX1OZHxNa4ZRX21DSV5gaG9Je6lqsepstO1knclcfJxtoc5tpNFuptVup9ZnkbdgjrVss+xjpuBvrd9snspOW29jdI5LVmlkj7Vvrd54t+RlfptQXXJWZHtlf51oruNgmMFfdJBYZn1RXnRWZXthfZxSeZiGgYGOhYLdxb/RubWZlpWFd3T////2kwjgAAAARXRSTlMAASFrcAhxIjLb/vWvsPb+20b4+DFFyMkz2vf43CP9/m5y9vZysLGvsQlw9fZs/v7c+PjcR8nK9/hI2/72sbD1/t00bXBAFktiAAAAAWJLR0SnwLcrAwAAAAd0SU1FB+YGCRchHQhxJNoAAAD/SURBVBjTY2BgYGBkYmZhZWVjZmdkgAAOTi5uHl4+fgEOqICgkKubu7uHp7AgiMcoIirm5e3j4+Pr5y8uKMHIwCQpFRAYFOzjExIaFi4tI8vALBcRGRUd4+MTGxefkCivwKColJSckpoGVJGekZmlrMLAqpqdk5uX7+NTUFhUXKLGysCqXlpWXlFZVV1TW1ffoKHJoKXd2NTc0trW3tHZ1d2jo8Kgq+fj09vXP2HCxEmTfXz0FRjYDQyn+EydNmHC9Bk+M42MZRkYJUxMZ82e0z933vwFZoIcYO+Imi9ctHjJ0mUWllC/iFhZ29ja2Ts4OkEFGNmdFTVZXRRkQeoBhkE/Yj5NSZ4AAAAldEVYdGRhdGU6Y3JlYXRlADIwMjItMDYtMDlUMjM6MzM6MjgrMDA6MDA90JbEAAAAJXRFWHRkYXRlOm1vZGlmeQAyMDIyLTA2LTA5VDIzOjMzOjI4KzAwOjAwTI0ueAAAAABJRU5ErkJggg==";
var fav_icon = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAMAAAAoLQ9TAAAABGdBTUEAALGPC/xhBQAAACBjSFJNAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB8lBMVEUAAAAAAAAAAAAAAAABAAAAAAABAAAAAAAAAAAdEBB0Pz5rKCgaBwcZBwdkMzJxPDocDAwAAACLTU6SOzsAAAAAAAA9Mic/LyEAAAA6HByQUUaIVEY+GBgAAACAQkKaQUIAAAABAQGWXl9+NjYBAAAaEBAcCAgZDQ0WBQUAAAB3Nzd9MjIAAACTUVK7UVJRNTWhVVaeVldTJSUAAAA+LC0+GhuGcmCgf2EAAABUTESrl4NzYlEdGhcdGhdiXFbIqIhWRjcAAAAAAAABAQGUSkq1VVX6bW6oUVGXS0vmro7+uJn6c3T6dXX/yqPnu5F3aFhxYVG/oH/7gHv6enjeuJOEcFtzX01VLCs4ODk7NmA5MTH1gHr6e3hWSTxHPl9CQUQ/JCKPYGGko7RhXYGEeJjmcW9cVnFjWH3IwtOHb3CjXV3CwcpTTpGimMjlb3D4c3RmYI1gWJvLydybZWW+T0x+V1hRP0Z7U1WTSEiHRUWGRUSORkZuTlBRQVBwX2CvRkXtaGjvamrNYWKmU1PVZ2fXaGjbaWncaWnAX1+7W1vkYF/ja2zRZWV9QkGeVFN2Pz69XV3ia2zkeHmpWFd/REOJSUirWVjjaGjBYGCeUlKMSkl8QkGBRUSoVlWeUE2QgXeWiHr1zqjmw5+bl5KVe2T///8NZLRGAAAARHRSTlMAASFrcAhxIjLb/vWvsPb+20b4+DFFyMkz2vf43CP9/m5y9vZysLGvsQn19mz+/tz4+NxHycr3+Ejb/vaxsPX+3TRtcBrzrrgAAAABYktHRKUuuUovAAAAB3RJTUUH5gYJFzsfVlK/LQAAAP9JREFUGNNjYGBgYGRiZmFlZWNmZ2SAAA5OLm4eXj5+AQ6ogKCQi6ubm7uHsCCIxygiKubp5e3t7ePrJy4owcjAJCnlHxAY5O0dHBIaJi0jy8AsFx4RGRXt7R0TGxefIK/AoKiUmJSckgpUkZaekamswsCqmpWdk5vn7Z1fUFhUrMbKwKpeUlpWXlFZVV1TW1evocnAotXQ2NTc0trW3tHZ2KWtwqCj6+3d3dPb19c/YaK3t54CA7u+wSTvyVP6+qZO855uaCTLwChhbDJj5qzZc6bOnWcqyAH2jqjZ/AULFy1eYm4B9YuIpZW1ja2dvYMjVICR3UlRk9VZQRakHgAlRz6K4dvoSgAAACV0RVh0ZGF0ZTpjcmVhdGUAMjAyMi0wNi0wOVQyMzo1OTozMSswMDowMJt1iQMAAAAldEVYdGRhdGU6bW9kaWZ5ADIwMjItMDYtMDlUMjM6NTk6MzErMDA6MDDqKDG/AAAAAElFTkSuQmCC"
var submit_start;

// Swaps the page favicon while a request is running so the browser tab "flashes",
// and restores the normal icon (plus reports the run time) when it finishes.
var favicon = {
    // Change the page icon (favicon) to the given data URL.
    change: function(iconURL) {
        this.addLink(iconURL, "icon");
        this.addLink(iconURL, "shortcut icon");
    },
    addLink: function(iconURL, relValue) {
        var link = document.createElement("link");
        link.type = "image/x-icon";
        link.rel = relValue;
        link.href = iconURL;
        this.removeLink(relValue);
        this.docHead.appendChild(link);
    },
    removeLink: function(relValue) {
        var links = this.docHead.getElementsByTagName("link");
        for (var i = 0; i < links.length; i++) {
            var link = links[i];
            if (link.type == "image/x-icon" && link.rel == relValue) {
                this.docHead.removeChild(link);
                return; // Assuming only one match at most.
            }
        }
    },
    // Toggle between the two "busy" icons.
    swapLink: function() {
        if (this.run == true) {
            if (this.icon == 1) {
                this.change(fav_icon2);
                this.icon = 2;
            } else {
                this.change(fav_icon1);
                this.icon = 1;
            }
        }
    },
    // Keep toggling once per second until stop_swap() clears this.run.
    auto_swap: function() {
        if (this.run == true) {
            this.swapLink();
            setTimeout(() => { this.auto_swap(); }, 1000);
        }
    },
    start_swap: function() {
        this.run = true;
        this.auto_swap();
        submit_start = Date.now();
    },
    stop_swap: function() {
        this.run = false;
        this.change(fav_icon);
        if (typeof submit_start !== 'undefined') {
            $("#runtime")[0].innerHTML = `Execution time: ${Math.round((Date.now() - submit_start)/1000)} sec`;
            submit_start = undefined; // 'delete' cannot remove a var binding, so reset it instead.
        }
    },
    docHead: document.getElementsByTagName("head")[0]
}
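For reference, a minimal sketch of how these helpers can be wired around a long-running request. The function name, endpoint, and payload below are illustrative assumptions rather than code from the repository; jQuery and an element with id "runtime" are assumed to exist, since stop_swap above already relies on them.

// Hypothetical wiring: flash the favicon while a generation request is in flight.
function submitPrompt(promptText) {
    favicon.start_swap();                          // start flashing and record the start time
    fetch("/generate", {                           // placeholder endpoint, for illustration only
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ prompt: promptText })
    })
    .then(function (response) { return response.json(); })
    .catch(function (err) { console.error(err); })
    .finally(function () {
        favicon.stop_swap();                       // restore the icon and fill in #runtime
    });
}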


@@ -0,0 +1,952 @@
/* Bootstrap */
@font-face {
font-family: 'Icons';
src: url('../fonts/open-iconic.eot');
src: url('../fonts/open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('../fonts/open-iconic.woff') format('woff'), url('../fonts/open-iconic.ttf') format('truetype'), url('../fonts/open-iconic.otf') format('opentype'), url('../fonts/open-iconic.svg#iconic-sm') format('svg');
font-weight: normal;
font-style: normal;
}
.oi {
position: relative;
top: 1px;
display: inline-block;
speak:none;
font-family: 'Icons';
font-style: normal;
font-weight: normal;
line-height: 1;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
.oi:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
.oi.oi-align-center:before {
text-align: center;
}
.oi.oi-align-left:before {
text-align: left;
}
.oi.oi-align-right:before {
text-align: right;
}
.oi.oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
.oi.oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
.oi.oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
.oi-account-login:before {
content:'\e000';
}
.oi-account-logout:before {
content:'\e001';
}
.oi-action-redo:before {
content:'\e002';
}
.oi-action-undo:before {
content:'\e003';
}
.oi-align-center:before {
content:'\e004';
}
.oi-align-left:before {
content:'\e005';
}
.oi-align-right:before {
content:'\e006';
}
.oi-aperture:before {
content:'\e007';
}
.oi-arrow-bottom:before {
content:'\e008';
}
.oi-arrow-circle-bottom:before {
content:'\e009';
}
.oi-arrow-circle-left:before {
content:'\e00a';
}
.oi-arrow-circle-right:before {
content:'\e00b';
}
.oi-arrow-circle-top:before {
content:'\e00c';
}
.oi-arrow-left:before {
content:'\e00d';
}
.oi-arrow-right:before {
content:'\e00e';
}
.oi-arrow-thick-bottom:before {
content:'\e00f';
}
.oi-arrow-thick-left:before {
content:'\e010';
}
.oi-arrow-thick-right:before {
content:'\e011';
}
.oi-arrow-thick-top:before {
content:'\e012';
}
.oi-arrow-top:before {
content:'\e013';
}
.oi-audio-spectrum:before {
content:'\e014';
}
.oi-audio:before {
content:'\e015';
}
.oi-badge:before {
content:'\e016';
}
.oi-ban:before {
content:'\e017';
}
.oi-bar-chart:before {
content:'\e018';
}
.oi-basket:before {
content:'\e019';
}
.oi-battery-empty:before {
content:'\e01a';
}
.oi-battery-full:before {
content:'\e01b';
}
.oi-beaker:before {
content:'\e01c';
}
.oi-bell:before {
content:'\e01d';
}
.oi-bluetooth:before {
content:'\e01e';
}
.oi-bold:before {
content:'\e01f';
}
.oi-bolt:before {
content:'\e020';
}
.oi-book:before {
content:'\e021';
}
.oi-bookmark:before {
content:'\e022';
}
.oi-box:before {
content:'\e023';
}
.oi-briefcase:before {
content:'\e024';
}
.oi-british-pound:before {
content:'\e025';
}
.oi-browser:before {
content:'\e026';
}
.oi-brush:before {
content:'\e027';
}
.oi-bug:before {
content:'\e028';
}
.oi-bullhorn:before {
content:'\e029';
}
.oi-calculator:before {
content:'\e02a';
}
.oi-calendar:before {
content:'\e02b';
}
.oi-camera-slr:before {
content:'\e02c';
}
.oi-caret-bottom:before {
content:'\e02d';
}
.oi-caret-left:before {
content:'\e02e';
}
.oi-caret-right:before {
content:'\e02f';
}
.oi-caret-top:before {
content:'\e030';
}
.oi-cart:before {
content:'\e031';
}
.oi-chat:before {
content:'\e032';
}
.oi-check:before {
content:'\e033';
}
.oi-chevron-bottom:before {
content:'\e034';
}
.oi-chevron-left:before {
content:'\e035';
}
.oi-chevron-right:before {
content:'\e036';
}
.oi-chevron-top:before {
content:'\e037';
}
.oi-circle-check:before {
content:'\e038';
}
.oi-circle-x:before {
content:'\e039';
}
.oi-clipboard:before {
content:'\e03a';
}
.oi-clock:before {
content:'\e03b';
}
.oi-cloud-download:before {
content:'\e03c';
}
.oi-cloud-upload:before {
content:'\e03d';
}
.oi-cloud:before {
content:'\e03e';
}
.oi-cloudy:before {
content:'\e03f';
}
.oi-code:before {
content:'\e040';
}
.oi-cog:before {
content:'\e041';
}
.oi-collapse-down:before {
content:'\e042';
}
.oi-collapse-left:before {
content:'\e043';
}
.oi-collapse-right:before {
content:'\e044';
}
.oi-collapse-up:before {
content:'\e045';
}
.oi-command:before {
content:'\e046';
}
.oi-comment-square:before {
content:'\e047';
}
.oi-compass:before {
content:'\e048';
}
.oi-contrast:before {
content:'\e049';
}
.oi-copywriting:before {
content:'\e04a';
}
.oi-credit-card:before {
content:'\e04b';
}
.oi-crop:before {
content:'\e04c';
}
.oi-dashboard:before {
content:'\e04d';
}
.oi-data-transfer-download:before {
content:'\e04e';
}
.oi-data-transfer-upload:before {
content:'\e04f';
}
.oi-delete:before {
content:'\e050';
}
.oi-dial:before {
content:'\e051';
}
.oi-document:before {
content:'\e052';
}
.oi-dollar:before {
content:'\e053';
}
.oi-double-quote-sans-left:before {
content:'\e054';
}
.oi-double-quote-sans-right:before {
content:'\e055';
}
.oi-double-quote-serif-left:before {
content:'\e056';
}
.oi-double-quote-serif-right:before {
content:'\e057';
}
.oi-droplet:before {
content:'\e058';
}
.oi-eject:before {
content:'\e059';
}
.oi-elevator:before {
content:'\e05a';
}
.oi-ellipses:before {
content:'\e05b';
}
.oi-envelope-closed:before {
content:'\e05c';
}
.oi-envelope-open:before {
content:'\e05d';
}
.oi-euro:before {
content:'\e05e';
}
.oi-excerpt:before {
content:'\e05f';
}
.oi-expand-down:before {
content:'\e060';
}
.oi-expand-left:before {
content:'\e061';
}
.oi-expand-right:before {
content:'\e062';
}
.oi-expand-up:before {
content:'\e063';
}
.oi-external-link:before {
content:'\e064';
}
.oi-eye:before {
content:'\e065';
}
.oi-eyedropper:before {
content:'\e066';
}
.oi-file:before {
content:'\e067';
}
.oi-fire:before {
content:'\e068';
}
.oi-flag:before {
content:'\e069';
}
.oi-flash:before {
content:'\e06a';
}
.oi-folder:before {
content:'\e06b';
}
.oi-fork:before {
content:'\e06c';
}
.oi-fullscreen-enter:before {
content:'\e06d';
}
.oi-fullscreen-exit:before {
content:'\e06e';
}
.oi-globe:before {
content:'\e06f';
}
.oi-graph:before {
content:'\e070';
}
.oi-grid-four-up:before {
content:'\e071';
}
.oi-grid-three-up:before {
content:'\e072';
}
.oi-grid-two-up:before {
content:'\e073';
}
.oi-hard-drive:before {
content:'\e074';
}
.oi-header:before {
content:'\e075';
}
.oi-headphones:before {
content:'\e076';
}
.oi-heart:before {
content:'\e077';
}
.oi-home:before {
content:'\e078';
}
.oi-image:before {
content:'\e079';
}
.oi-inbox:before {
content:'\e07a';
}
.oi-infinity:before {
content:'\e07b';
}
.oi-info:before {
content:'\e07c';
}
.oi-italic:before {
content:'\e07d';
}
.oi-justify-center:before {
content:'\e07e';
}
.oi-justify-left:before {
content:'\e07f';
}
.oi-justify-right:before {
content:'\e080';
}
.oi-key:before {
content:'\e081';
}
.oi-laptop:before {
content:'\e082';
}
.oi-layers:before {
content:'\e083';
}
.oi-lightbulb:before {
content:'\e084';
}
.oi-link-broken:before {
content:'\e085';
}
.oi-link-intact:before {
content:'\e086';
}
.oi-list-rich:before {
content:'\e087';
}
.oi-list:before {
content:'\e088';
}
.oi-location:before {
content:'\e089';
}
.oi-lock-locked:before {
content:'\e08a';
}
.oi-lock-unlocked:before {
content:'\e08b';
}
.oi-loop-circular:before {
content:'\e08c';
}
.oi-loop-square:before {
content:'\e08d';
}
.oi-loop:before {
content:'\e08e';
}
.oi-magnifying-glass:before {
content:'\e08f';
}
.oi-map-marker:before {
content:'\e090';
}
.oi-map:before {
content:'\e091';
}
.oi-media-pause:before {
content:'\e092';
}
.oi-media-play:before {
content:'\e093';
}
.oi-media-record:before {
content:'\e094';
}
.oi-media-skip-backward:before {
content:'\e095';
}
.oi-media-skip-forward:before {
content:'\e096';
}
.oi-media-step-backward:before {
content:'\e097';
}
.oi-media-step-forward:before {
content:'\e098';
}
.oi-media-stop:before {
content:'\e099';
}
.oi-medical-cross:before {
content:'\e09a';
}
.oi-menu:before {
content:'\e09b';
}
.oi-microphone:before {
content:'\e09c';
}
.oi-minus:before {
content:'\e09d';
}
.oi-monitor:before {
content:'\e09e';
}
.oi-moon:before {
content:'\e09f';
}
.oi-move:before {
content:'\e0a0';
}
.oi-musical-note:before {
content:'\e0a1';
}
.oi-paperclip:before {
content:'\e0a2';
}
.oi-pencil:before {
content:'\e0a3';
}
.oi-people:before {
content:'\e0a4';
}
.oi-person:before {
content:'\e0a5';
}
.oi-phone:before {
content:'\e0a6';
}
.oi-pie-chart:before {
content:'\e0a7';
}
.oi-pin:before {
content:'\e0a8';
}
.oi-play-circle:before {
content:'\e0a9';
}
.oi-plus:before {
content:'\e0aa';
}
.oi-power-standby:before {
content:'\e0ab';
}
.oi-print:before {
content:'\e0ac';
}
.oi-project:before {
content:'\e0ad';
}
.oi-pulse:before {
content:'\e0ae';
}
.oi-puzzle-piece:before {
content:'\e0af';
}
.oi-question-mark:before {
content:'\e0b0';
}
.oi-rain:before {
content:'\e0b1';
}
.oi-random:before {
content:'\e0b2';
}
.oi-reload:before {
content:'\e0b3';
}
.oi-resize-both:before {
content:'\e0b4';
}
.oi-resize-height:before {
content:'\e0b5';
}
.oi-resize-width:before {
content:'\e0b6';
}
.oi-rss-alt:before {
content:'\e0b7';
}
.oi-rss:before {
content:'\e0b8';
}
.oi-script:before {
content:'\e0b9';
}
.oi-share-boxed:before {
content:'\e0ba';
}
.oi-share:before {
content:'\e0bb';
}
.oi-shield:before {
content:'\e0bc';
}
.oi-signal:before {
content:'\e0bd';
}
.oi-signpost:before {
content:'\e0be';
}
.oi-sort-ascending:before {
content:'\e0bf';
}
.oi-sort-descending:before {
content:'\e0c0';
}
.oi-spreadsheet:before {
content:'\e0c1';
}
.oi-star:before {
content:'\e0c2';
}
.oi-sun:before {
content:'\e0c3';
}
.oi-tablet:before {
content:'\e0c4';
}
.oi-tag:before {
content:'\e0c5';
}
.oi-tags:before {
content:'\e0c6';
}
.oi-target:before {
content:'\e0c7';
}
.oi-task:before {
content:'\e0c8';
}
.oi-terminal:before {
content:'\e0c9';
}
.oi-text:before {
content:'\e0ca';
}
.oi-thumb-down:before {
content:'\e0cb';
}
.oi-thumb-up:before {
content:'\e0cc';
}
.oi-timer:before {
content:'\e0cd';
}
.oi-transfer:before {
content:'\e0ce';
}
.oi-trash:before {
content:'\e0cf';
}
.oi-underline:before {
content:'\e0d0';
}
.oi-vertical-align-bottom:before {
content:'\e0d1';
}
.oi-vertical-align-center:before {
content:'\e0d2';
}
.oi-vertical-align-top:before {
content:'\e0d3';
}
.oi-video:before {
content:'\e0d4';
}
.oi-volume-high:before {
content:'\e0d5';
}
.oi-volume-low:before {
content:'\e0d6';
}
.oi-volume-off:before {
content:'\e0d7';
}
.oi-warning:before {
content:'\e0d8';
}
.oi-wifi:before {
content:'\e0d9';
}
.oi-wrench:before {
content:'\e0da';
}
.oi-x:before {
content:'\e0db';
}
.oi-yen:before {
content:'\e0dc';
}
.oi-zoom-in:before {
content:'\e0dd';
}
.oi-zoom-out:before {
content:'\e0de';
}
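The stylesheet above exposes each glyph as a base class plus a per-icon modifier ("oi" together with "oi-<name>"). A minimal usage sketch in script, with an arbitrary glyph chosen purely for illustration:

// Hypothetical example: render the "check" glyph via the class-based selectors above.
var checkIcon = document.createElement("span");
checkIcon.className = "oi oi-check";            // base class + glyph modifier
checkIcon.setAttribute("aria-hidden", "true");  // the glyph is decorative, hide it from screen readers
document.body.appendChild(checkIcon);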


@@ -0,0 +1,960 @@
/* Bootstrap */
/* Override Bootstrap default variable */
//@icon-font-path: "../fonts/";
@font-face {
font-family: 'Icons';
src: ~"url('@{icon-font-path}open-iconic.eot')";
src: ~"url('@{icon-font-path}open-iconic.eot?#iconic-sm') format('embedded-opentype')",
~"url('@{icon-font-path}open-iconic.woff') format('woff')",
~"url('@{icon-font-path}open-iconic.ttf') format('truetype')",
~"url('@{icon-font-path}open-iconic.svg#iconic-sm') format('svg')";
font-weight: normal;
font-style: normal;
}
// Catchall baseclass
.oi {
position: relative;
top: 1px;
display: inline-block;
font-family: 'Icons';
font-style: normal;
font-weight: normal;
line-height: 1;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
&:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
&.oi-align-center:before {
text-align: center;
}
&.oi-align-left:before {
text-align: left;
}
&.oi-align-right:before {
text-align: right;
}
&.oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
&.oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
&.oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
}
.oi-account-login:before {
content:"\e000";
}
.oi-account-logout:before {
content:"\e001";
}
.oi-action-redo:before {
content:"\e002";
}
.oi-action-undo:before {
content:"\e003";
}
.oi-align-center:before {
content:"\e004";
}
.oi-align-left:before {
content:"\e005";
}
.oi-align-right:before {
content:"\e006";
}
.oi-aperture:before {
content:"\e007";
}
.oi-arrow-bottom:before {
content:"\e008";
}
.oi-arrow-circle-bottom:before {
content:"\e009";
}
.oi-arrow-circle-left:before {
content:"\e00a";
}
.oi-arrow-circle-right:before {
content:"\e00b";
}
.oi-arrow-circle-top:before {
content:"\e00c";
}
.oi-arrow-left:before {
content:"\e00d";
}
.oi-arrow-right:before {
content:"\e00e";
}
.oi-arrow-thick-bottom:before {
content:"\e00f";
}
.oi-arrow-thick-left:before {
content:"\e010";
}
.oi-arrow-thick-right:before {
content:"\e011";
}
.oi-arrow-thick-top:before {
content:"\e012";
}
.oi-arrow-top:before {
content:"\e013";
}
.oi-audio-spectrum:before {
content:"\e014";
}
.oi-audio:before {
content:"\e015";
}
.oi-badge:before {
content:"\e016";
}
.oi-ban:before {
content:"\e017";
}
.oi-bar-chart:before {
content:"\e018";
}
.oi-basket:before {
content:"\e019";
}
.oi-battery-empty:before {
content:"\e01a";
}
.oi-battery-full:before {
content:"\e01b";
}
.oi-beaker:before {
content:"\e01c";
}
.oi-bell:before {
content:"\e01d";
}
.oi-bluetooth:before {
content:"\e01e";
}
.oi-bold:before {
content:"\e01f";
}
.oi-bolt:before {
content:"\e020";
}
.oi-book:before {
content:"\e021";
}
.oi-bookmark:before {
content:"\e022";
}
.oi-box:before {
content:"\e023";
}
.oi-briefcase:before {
content:"\e024";
}
.oi-british-pound:before {
content:"\e025";
}
.oi-browser:before {
content:"\e026";
}
.oi-brush:before {
content:"\e027";
}
.oi-bug:before {
content:"\e028";
}
.oi-bullhorn:before {
content:"\e029";
}
.oi-calculator:before {
content:"\e02a";
}
.oi-calendar:before {
content:"\e02b";
}
.oi-camera-slr:before {
content:"\e02c";
}
.oi-caret-bottom:before {
content:"\e02d";
}
.oi-caret-left:before {
content:"\e02e";
}
.oi-caret-right:before {
content:"\e02f";
}
.oi-caret-top:before {
content:"\e030";
}
.oi-cart:before {
content:"\e031";
}
.oi-chat:before {
content:"\e032";
}
.oi-check:before {
content:"\e033";
}
.oi-chevron-bottom:before {
content:"\e034";
}
.oi-chevron-left:before {
content:"\e035";
}
.oi-chevron-right:before {
content:"\e036";
}
.oi-chevron-top:before {
content:"\e037";
}
.oi-circle-check:before {
content:"\e038";
}
.oi-circle-x:before {
content:"\e039";
}
.oi-clipboard:before {
content:"\e03a";
}
.oi-clock:before {
content:"\e03b";
}
.oi-cloud-download:before {
content:"\e03c";
}
.oi-cloud-upload:before {
content:"\e03d";
}
.oi-cloud:before {
content:"\e03e";
}
.oi-cloudy:before {
content:"\e03f";
}
.oi-code:before {
content:"\e040";
}
.oi-cog:before {
content:"\e041";
}
.oi-collapse-down:before {
content:"\e042";
}
.oi-collapse-left:before {
content:"\e043";
}
.oi-collapse-right:before {
content:"\e044";
}
.oi-collapse-up:before {
content:"\e045";
}
.oi-command:before {
content:"\e046";
}
.oi-comment-square:before {
content:"\e047";
}
.oi-compass:before {
content:"\e048";
}
.oi-contrast:before {
content:"\e049";
}
.oi-copywriting:before {
content:"\e04a";
}
.oi-credit-card:before {
content:"\e04b";
}
.oi-crop:before {
content:"\e04c";
}
.oi-dashboard:before {
content:"\e04d";
}
.oi-data-transfer-download:before {
content:"\e04e";
}
.oi-data-transfer-upload:before {
content:"\e04f";
}
.oi-delete:before {
content:"\e050";
}
.oi-dial:before {
content:"\e051";
}
.oi-document:before {
content:"\e052";
}
.oi-dollar:before {
content:"\e053";
}
.oi-double-quote-sans-left:before {
content:"\e054";
}
.oi-double-quote-sans-right:before {
content:"\e055";
}
.oi-double-quote-serif-left:before {
content:"\e056";
}
.oi-double-quote-serif-right:before {
content:"\e057";
}
.oi-droplet:before {
content:"\e058";
}
.oi-eject:before {
content:"\e059";
}
.oi-elevator:before {
content:"\e05a";
}
.oi-ellipses:before {
content:"\e05b";
}
.oi-envelope-closed:before {
content:"\e05c";
}
.oi-envelope-open:before {
content:"\e05d";
}
.oi-euro:before {
content:"\e05e";
}
.oi-excerpt:before {
content:"\e05f";
}
.oi-expand-down:before {
content:"\e060";
}
.oi-expand-left:before {
content:"\e061";
}
.oi-expand-right:before {
content:"\e062";
}
.oi-expand-up:before {
content:"\e063";
}
.oi-external-link:before {
content:"\e064";
}
.oi-eye:before {
content:"\e065";
}
.oi-eyedropper:before {
content:"\e066";
}
.oi-file:before {
content:"\e067";
}
.oi-fire:before {
content:"\e068";
}
.oi-flag:before {
content:"\e069";
}
.oi-flash:before {
content:"\e06a";
}
.oi-folder:before {
content:"\e06b";
}
.oi-fork:before {
content:"\e06c";
}
.oi-fullscreen-enter:before {
content:"\e06d";
}
.oi-fullscreen-exit:before {
content:"\e06e";
}
.oi-globe:before {
content:"\e06f";
}
.oi-graph:before {
content:"\e070";
}
.oi-grid-four-up:before {
content:"\e071";
}
.oi-grid-three-up:before {
content:"\e072";
}
.oi-grid-two-up:before {
content:"\e073";
}
.oi-hard-drive:before {
content:"\e074";
}
.oi-header:before {
content:"\e075";
}
.oi-headphones:before {
content:"\e076";
}
.oi-heart:before {
content:"\e077";
}
.oi-home:before {
content:"\e078";
}
.oi-image:before {
content:"\e079";
}
.oi-inbox:before {
content:"\e07a";
}
.oi-infinity:before {
content:"\e07b";
}
.oi-info:before {
content:"\e07c";
}
.oi-italic:before {
content:"\e07d";
}
.oi-justify-center:before {
content:"\e07e";
}
.oi-justify-left:before {
content:"\e07f";
}
.oi-justify-right:before {
content:"\e080";
}
.oi-key:before {
content:"\e081";
}
.oi-laptop:before {
content:"\e082";
}
.oi-layers:before {
content:"\e083";
}
.oi-lightbulb:before {
content:"\e084";
}
.oi-link-broken:before {
content:"\e085";
}
.oi-link-intact:before {
content:"\e086";
}
.oi-list-rich:before {
content:"\e087";
}
.oi-list:before {
content:"\e088";
}
.oi-location:before {
content:"\e089";
}
.oi-lock-locked:before {
content:"\e08a";
}
.oi-lock-unlocked:before {
content:"\e08b";
}
.oi-loop-circular:before {
content:"\e08c";
}
.oi-loop-square:before {
content:"\e08d";
}
.oi-loop:before {
content:"\e08e";
}
.oi-magnifying-glass:before {
content:"\e08f";
}
.oi-map-marker:before {
content:"\e090";
}
.oi-map:before {
content:"\e091";
}
.oi-media-pause:before {
content:"\e092";
}
.oi-media-play:before {
content:"\e093";
}
.oi-media-record:before {
content:"\e094";
}
.oi-media-skip-backward:before {
content:"\e095";
}
.oi-media-skip-forward:before {
content:"\e096";
}
.oi-media-step-backward:before {
content:"\e097";
}
.oi-media-step-forward:before {
content:"\e098";
}
.oi-media-stop:before {
content:"\e099";
}
.oi-medical-cross:before {
content:"\e09a";
}
.oi-menu:before {
content:"\e09b";
}
.oi-microphone:before {
content:"\e09c";
}
.oi-minus:before {
content:"\e09d";
}
.oi-monitor:before {
content:"\e09e";
}
.oi-moon:before {
content:"\e09f";
}
.oi-move:before {
content:"\e0a0";
}
.oi-musical-note:before {
content:"\e0a1";
}
.oi-paperclip:before {
content:"\e0a2";
}
.oi-pencil:before {
content:"\e0a3";
}
.oi-people:before {
content:"\e0a4";
}
.oi-person:before {
content:"\e0a5";
}
.oi-phone:before {
content:"\e0a6";
}
.oi-pie-chart:before {
content:"\e0a7";
}
.oi-pin:before {
content:"\e0a8";
}
.oi-play-circle:before {
content:"\e0a9";
}
.oi-plus:before {
content:"\e0aa";
}
.oi-power-standby:before {
content:"\e0ab";
}
.oi-print:before {
content:"\e0ac";
}
.oi-project:before {
content:"\e0ad";
}
.oi-pulse:before {
content:"\e0ae";
}
.oi-puzzle-piece:before {
content:"\e0af";
}
.oi-question-mark:before {
content:"\e0b0";
}
.oi-rain:before {
content:"\e0b1";
}
.oi-random:before {
content:"\e0b2";
}
.oi-reload:before {
content:"\e0b3";
}
.oi-resize-both:before {
content:"\e0b4";
}
.oi-resize-height:before {
content:"\e0b5";
}
.oi-resize-width:before {
content:"\e0b6";
}
.oi-rss-alt:before {
content:"\e0b7";
}
.oi-rss:before {
content:"\e0b8";
}
.oi-script:before {
content:"\e0b9";
}
.oi-share-boxed:before {
content:"\e0ba";
}
.oi-share:before {
content:"\e0bb";
}
.oi-shield:before {
content:"\e0bc";
}
.oi-signal:before {
content:"\e0bd";
}
.oi-signpost:before {
content:"\e0be";
}
.oi-sort-ascending:before {
content:"\e0bf";
}
.oi-sort-descending:before {
content:"\e0c0";
}
.oi-spreadsheet:before {
content:"\e0c1";
}
.oi-star:before {
content:"\e0c2";
}
.oi-sun:before {
content:"\e0c3";
}
.oi-tablet:before {
content:"\e0c4";
}
.oi-tag:before {
content:"\e0c5";
}
.oi-tags:before {
content:"\e0c6";
}
.oi-target:before {
content:"\e0c7";
}
.oi-task:before {
content:"\e0c8";
}
.oi-terminal:before {
content:"\e0c9";
}
.oi-text:before {
content:"\e0ca";
}
.oi-thumb-down:before {
content:"\e0cb";
}
.oi-thumb-up:before {
content:"\e0cc";
}
.oi-timer:before {
content:"\e0cd";
}
.oi-transfer:before {
content:"\e0ce";
}
.oi-trash:before {
content:"\e0cf";
}
.oi-underline:before {
content:"\e0d0";
}
.oi-vertical-align-bottom:before {
content:"\e0d1";
}
.oi-vertical-align-center:before {
content:"\e0d2";
}
.oi-vertical-align-top:before {
content:"\e0d3";
}
.oi-video:before {
content:"\e0d4";
}
.oi-volume-high:before {
content:"\e0d5";
}
.oi-volume-low:before {
content:"\e0d6";
}
.oi-volume-off:before {
content:"\e0d7";
}
.oi-warning:before {
content:"\e0d8";
}
.oi-wifi:before {
content:"\e0d9";
}
.oi-wrench:before {
content:"\e0da";
}
.oi-x:before {
content:"\e0db";
}
.oi-yen:before {
content:"\e0dc";
}
.oi-zoom-in:before {
content:"\e0dd";
}
.oi-zoom-out:before {
content:"\e0de";
}

File diff suppressed because one or more lines are too long


@@ -0,0 +1,958 @@
/* Bootstrap */
/* Override Bootstrap default variable */
$icon-font-path: '../fonts/' !default;
@font-face {
font-family: 'Icons';
src: url('#{$icon-font-path}open-iconic.eot');
src: url('#{$icon-font-path}open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('#{$icon-font-path}open-iconic.woff') format('woff'), url('#{$icon-font-path}open-iconic.ttf') format('truetype'), url('#{$icon-font-path}open-iconic.svg#iconic-sm') format('svg');
font-weight: normal;
font-style: normal;
}
// Catchall baseclass
.oi {
position: relative;
top: 1px;
display: inline-block;
font-family: 'Icons';
font-style: normal;
font-weight: normal;
line-height: 1;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
&:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
&.oi-align-center:before {
text-align: center;
}
&.oi-align-left:before {
text-align: left;
}
&.oi-align-right:before {
text-align: right;
}
&.oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
&.oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
&.oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
}
.oi-account-login:before {
content:'\e000';
}
.oi-account-logout:before {
content:'\e001';
}
.oi-action-redo:before {
content:'\e002';
}
.oi-action-undo:before {
content:'\e003';
}
.oi-align-center:before {
content:'\e004';
}
.oi-align-left:before {
content:'\e005';
}
.oi-align-right:before {
content:'\e006';
}
.oi-aperture:before {
content:'\e007';
}
.oi-arrow-bottom:before {
content:'\e008';
}
.oi-arrow-circle-bottom:before {
content:'\e009';
}
.oi-arrow-circle-left:before {
content:'\e00a';
}
.oi-arrow-circle-right:before {
content:'\e00b';
}
.oi-arrow-circle-top:before {
content:'\e00c';
}
.oi-arrow-left:before {
content:'\e00d';
}
.oi-arrow-right:before {
content:'\e00e';
}
.oi-arrow-thick-bottom:before {
content:'\e00f';
}
.oi-arrow-thick-left:before {
content:'\e010';
}
.oi-arrow-thick-right:before {
content:'\e011';
}
.oi-arrow-thick-top:before {
content:'\e012';
}
.oi-arrow-top:before {
content:'\e013';
}
.oi-audio-spectrum:before {
content:'\e014';
}
.oi-audio:before {
content:'\e015';
}
.oi-badge:before {
content:'\e016';
}
.oi-ban:before {
content:'\e017';
}
.oi-bar-chart:before {
content:'\e018';
}
.oi-basket:before {
content:'\e019';
}
.oi-battery-empty:before {
content:'\e01a';
}
.oi-battery-full:before {
content:'\e01b';
}
.oi-beaker:before {
content:'\e01c';
}
.oi-bell:before {
content:'\e01d';
}
.oi-bluetooth:before {
content:'\e01e';
}
.oi-bold:before {
content:'\e01f';
}
.oi-bolt:before {
content:'\e020';
}
.oi-book:before {
content:'\e021';
}
.oi-bookmark:before {
content:'\e022';
}
.oi-box:before {
content:'\e023';
}
.oi-briefcase:before {
content:'\e024';
}
.oi-british-pound:before {
content:'\e025';
}
.oi-browser:before {
content:'\e026';
}
.oi-brush:before {
content:'\e027';
}
.oi-bug:before {
content:'\e028';
}
.oi-bullhorn:before {
content:'\e029';
}
.oi-calculator:before {
content:'\e02a';
}
.oi-calendar:before {
content:'\e02b';
}
.oi-camera-slr:before {
content:'\e02c';
}
.oi-caret-bottom:before {
content:'\e02d';
}
.oi-caret-left:before {
content:'\e02e';
}
.oi-caret-right:before {
content:'\e02f';
}
.oi-caret-top:before {
content:'\e030';
}
.oi-cart:before {
content:'\e031';
}
.oi-chat:before {
content:'\e032';
}
.oi-check:before {
content:'\e033';
}
.oi-chevron-bottom:before {
content:'\e034';
}
.oi-chevron-left:before {
content:'\e035';
}
.oi-chevron-right:before {
content:'\e036';
}
.oi-chevron-top:before {
content:'\e037';
}
.oi-circle-check:before {
content:'\e038';
}
.oi-circle-x:before {
content:'\e039';
}
.oi-clipboard:before {
content:'\e03a';
}
.oi-clock:before {
content:'\e03b';
}
.oi-cloud-download:before {
content:'\e03c';
}
.oi-cloud-upload:before {
content:'\e03d';
}
.oi-cloud:before {
content:'\e03e';
}
.oi-cloudy:before {
content:'\e03f';
}
.oi-code:before {
content:'\e040';
}
.oi-cog:before {
content:'\e041';
}
.oi-collapse-down:before {
content:'\e042';
}
.oi-collapse-left:before {
content:'\e043';
}
.oi-collapse-right:before {
content:'\e044';
}
.oi-collapse-up:before {
content:'\e045';
}
.oi-command:before {
content:'\e046';
}
.oi-comment-square:before {
content:'\e047';
}
.oi-compass:before {
content:'\e048';
}
.oi-contrast:before {
content:'\e049';
}
.oi-copywriting:before {
content:'\e04a';
}
.oi-credit-card:before {
content:'\e04b';
}
.oi-crop:before {
content:'\e04c';
}
.oi-dashboard:before {
content:'\e04d';
}
.oi-data-transfer-download:before {
content:'\e04e';
}
.oi-data-transfer-upload:before {
content:'\e04f';
}
.oi-delete:before {
content:'\e050';
}
.oi-dial:before {
content:'\e051';
}
.oi-document:before {
content:'\e052';
}
.oi-dollar:before {
content:'\e053';
}
.oi-double-quote-sans-left:before {
content:'\e054';
}
.oi-double-quote-sans-right:before {
content:'\e055';
}
.oi-double-quote-serif-left:before {
content:'\e056';
}
.oi-double-quote-serif-right:before {
content:'\e057';
}
.oi-droplet:before {
content:'\e058';
}
.oi-eject:before {
content:'\e059';
}
.oi-elevator:before {
content:'\e05a';
}
.oi-ellipses:before {
content:'\e05b';
}
.oi-envelope-closed:before {
content:'\e05c';
}
.oi-envelope-open:before {
content:'\e05d';
}
.oi-euro:before {
content:'\e05e';
}
.oi-excerpt:before {
content:'\e05f';
}
.oi-expand-down:before {
content:'\e060';
}
.oi-expand-left:before {
content:'\e061';
}
.oi-expand-right:before {
content:'\e062';
}
.oi-expand-up:before {
content:'\e063';
}
.oi-external-link:before {
content:'\e064';
}
.oi-eye:before {
content:'\e065';
}
.oi-eyedropper:before {
content:'\e066';
}
.oi-file:before {
content:'\e067';
}
.oi-fire:before {
content:'\e068';
}
.oi-flag:before {
content:'\e069';
}
.oi-flash:before {
content:'\e06a';
}
.oi-folder:before {
content:'\e06b';
}
.oi-fork:before {
content:'\e06c';
}
.oi-fullscreen-enter:before {
content:'\e06d';
}
.oi-fullscreen-exit:before {
content:'\e06e';
}
.oi-globe:before {
content:'\e06f';
}
.oi-graph:before {
content:'\e070';
}
.oi-grid-four-up:before {
content:'\e071';
}
.oi-grid-three-up:before {
content:'\e072';
}
.oi-grid-two-up:before {
content:'\e073';
}
.oi-hard-drive:before {
content:'\e074';
}
.oi-header:before {
content:'\e075';
}
.oi-headphones:before {
content:'\e076';
}
.oi-heart:before {
content:'\e077';
}
.oi-home:before {
content:'\e078';
}
.oi-image:before {
content:'\e079';
}
.oi-inbox:before {
content:'\e07a';
}
.oi-infinity:before {
content:'\e07b';
}
.oi-info:before {
content:'\e07c';
}
.oi-italic:before {
content:'\e07d';
}
.oi-justify-center:before {
content:'\e07e';
}
.oi-justify-left:before {
content:'\e07f';
}
.oi-justify-right:before {
content:'\e080';
}
.oi-key:before {
content:'\e081';
}
.oi-laptop:before {
content:'\e082';
}
.oi-layers:before {
content:'\e083';
}
.oi-lightbulb:before {
content:'\e084';
}
.oi-link-broken:before {
content:'\e085';
}
.oi-link-intact:before {
content:'\e086';
}
.oi-list-rich:before {
content:'\e087';
}
.oi-list:before {
content:'\e088';
}
.oi-location:before {
content:'\e089';
}
.oi-lock-locked:before {
content:'\e08a';
}
.oi-lock-unlocked:before {
content:'\e08b';
}
.oi-loop-circular:before {
content:'\e08c';
}
.oi-loop-square:before {
content:'\e08d';
}
.oi-loop:before {
content:'\e08e';
}
.oi-magnifying-glass:before {
content:'\e08f';
}
.oi-map-marker:before {
content:'\e090';
}
.oi-map:before {
content:'\e091';
}
.oi-media-pause:before {
content:'\e092';
}
.oi-media-play:before {
content:'\e093';
}
.oi-media-record:before {
content:'\e094';
}
.oi-media-skip-backward:before {
content:'\e095';
}
.oi-media-skip-forward:before {
content:'\e096';
}
.oi-media-step-backward:before {
content:'\e097';
}
.oi-media-step-forward:before {
content:'\e098';
}
.oi-media-stop:before {
content:'\e099';
}
.oi-medical-cross:before {
content:'\e09a';
}
.oi-menu:before {
content:'\e09b';
}
.oi-microphone:before {
content:'\e09c';
}
.oi-minus:before {
content:'\e09d';
}
.oi-monitor:before {
content:'\e09e';
}
.oi-moon:before {
content:'\e09f';
}
.oi-move:before {
content:'\e0a0';
}
.oi-musical-note:before {
content:'\e0a1';
}
.oi-paperclip:before {
content:'\e0a2';
}
.oi-pencil:before {
content:'\e0a3';
}
.oi-people:before {
content:'\e0a4';
}
.oi-person:before {
content:'\e0a5';
}
.oi-phone:before {
content:'\e0a6';
}
.oi-pie-chart:before {
content:'\e0a7';
}
.oi-pin:before {
content:'\e0a8';
}
.oi-play-circle:before {
content:'\e0a9';
}
.oi-plus:before {
content:'\e0aa';
}
.oi-power-standby:before {
content:'\e0ab';
}
.oi-print:before {
content:'\e0ac';
}
.oi-project:before {
content:'\e0ad';
}
.oi-pulse:before {
content:'\e0ae';
}
.oi-puzzle-piece:before {
content:'\e0af';
}
.oi-question-mark:before {
content:'\e0b0';
}
.oi-rain:before {
content:'\e0b1';
}
.oi-random:before {
content:'\e0b2';
}
.oi-reload:before {
content:'\e0b3';
}
.oi-resize-both:before {
content:'\e0b4';
}
.oi-resize-height:before {
content:'\e0b5';
}
.oi-resize-width:before {
content:'\e0b6';
}
.oi-rss-alt:before {
content:'\e0b7';
}
.oi-rss:before {
content:'\e0b8';
}
.oi-script:before {
content:'\e0b9';
}
.oi-share-boxed:before {
content:'\e0ba';
}
.oi-share:before {
content:'\e0bb';
}
.oi-shield:before {
content:'\e0bc';
}
.oi-signal:before {
content:'\e0bd';
}
.oi-signpost:before {
content:'\e0be';
}
.oi-sort-ascending:before {
content:'\e0bf';
}
.oi-sort-descending:before {
content:'\e0c0';
}
.oi-spreadsheet:before {
content:'\e0c1';
}
.oi-star:before {
content:'\e0c2';
}
.oi-sun:before {
content:'\e0c3';
}
.oi-tablet:before {
content:'\e0c4';
}
.oi-tag:before {
content:'\e0c5';
}
.oi-tags:before {
content:'\e0c6';
}
.oi-target:before {
content:'\e0c7';
}
.oi-task:before {
content:'\e0c8';
}
.oi-terminal:before {
content:'\e0c9';
}
.oi-text:before {
content:'\e0ca';
}
.oi-thumb-down:before {
content:'\e0cb';
}
.oi-thumb-up:before {
content:'\e0cc';
}
.oi-timer:before {
content:'\e0cd';
}
.oi-transfer:before {
content:'\e0ce';
}
.oi-trash:before {
content:'\e0cf';
}
.oi-underline:before {
content:'\e0d0';
}
.oi-vertical-align-bottom:before {
content:'\e0d1';
}
.oi-vertical-align-center:before {
content:'\e0d2';
}
.oi-vertical-align-top:before {
content:'\e0d3';
}
.oi-video:before {
content:'\e0d4';
}
.oi-volume-high:before {
content:'\e0d5';
}
.oi-volume-low:before {
content:'\e0d6';
}
.oi-volume-off:before {
content:'\e0d7';
}
.oi-warning:before {
content:'\e0d8';
}
.oi-wifi:before {
content:'\e0d9';
}
.oi-wrench:before {
content:'\e0da';
}
.oi-x:before {
content:'\e0db';
}
.oi-yen:before {
content:'\e0dc';
}
.oi-zoom-in:before {
content:'\e0dd';
}
.oi-zoom-out:before {
content:'\e0de';
}


@@ -0,0 +1,954 @@
/* Bootstrap */
@font-face
font-family 'Icons'
src url('../fonts/open-iconic.eot')
src url('../fonts/open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('../fonts/open-iconic.woff') format('woff'), url('../fonts/open-iconic.ttf') format('truetype'), url('../fonts/open-iconic.svg#iconic-sm') format('svg')
font-weight normal
font-style normal
// Catchall baseclass
.oi
position relative
top 1px
display inline-block
font-family 'Icons'
font-style normal
font-weight normal
line-height 1
-webkit-font-smoothing antialiased
-moz-osx-font-smoothing grayscale
&:empty:before
width 1em
text-align center
box-sizing content-box
&.oi-align-center:before
text-align center
&.oi-align-left:before
text-align left
&.oi-align-right:before
text-align right
&.oi-flip-horizontal:before
-webkit-transform scale(-1, 1)
-ms-transform scale(-1, 1)
transform scale(-1, 1)
&.oi-flip-vertical:before
-webkit-transform scale(1, -1)
-ms-transform scale(1, -1)
transform scale(1, -1)
&.oi-flip-horizontal-vertical:before
-webkit-transform scale(-1, -1)
-ms-transform scale(-1, -1)
transform scale(-1, -1)
.oi-account-login:before {
content'\e000'
}
.oi-account-logout:before {
content'\e001'
}
.oi-action-redo:before {
content'\e002'
}
.oi-action-undo:before {
content'\e003'
}
.oi-align-center:before {
content'\e004'
}
.oi-align-left:before {
content'\e005'
}
.oi-align-right:before {
content'\e006'
}
.oi-aperture:before {
content'\e007'
}
.oi-arrow-bottom:before {
content'\e008'
}
.oi-arrow-circle-bottom:before {
content'\e009'
}
.oi-arrow-circle-left:before {
content'\e00a'
}
.oi-arrow-circle-right:before {
content'\e00b'
}
.oi-arrow-circle-top:before {
content'\e00c'
}
.oi-arrow-left:before {
content'\e00d'
}
.oi-arrow-right:before {
content'\e00e'
}
.oi-arrow-thick-bottom:before {
content'\e00f'
}
.oi-arrow-thick-left:before {
content'\e010'
}
.oi-arrow-thick-right:before {
content'\e011'
}
.oi-arrow-thick-top:before {
content'\e012'
}
.oi-arrow-top:before {
content'\e013'
}
.oi-audio-spectrum:before {
content'\e014'
}
.oi-audio:before {
content'\e015'
}
.oi-badge:before {
content'\e016'
}
.oi-ban:before {
content'\e017'
}
.oi-bar-chart:before {
content'\e018'
}
.oi-basket:before {
content'\e019'
}
.oi-battery-empty:before {
content'\e01a'
}
.oi-battery-full:before {
content'\e01b'
}
.oi-beaker:before {
content'\e01c'
}
.oi-bell:before {
content'\e01d'
}
.oi-bluetooth:before {
content'\e01e'
}
.oi-bold:before {
content'\e01f'
}
.oi-bolt:before {
content'\e020'
}
.oi-book:before {
content'\e021'
}
.oi-bookmark:before {
content'\e022'
}
.oi-box:before {
content'\e023'
}
.oi-briefcase:before {
content'\e024'
}
.oi-british-pound:before {
content'\e025'
}
.oi-browser:before {
content'\e026'
}
.oi-brush:before {
content'\e027'
}
.oi-bug:before {
content'\e028'
}
.oi-bullhorn:before {
content'\e029'
}
.oi-calculator:before {
content'\e02a'
}
.oi-calendar:before {
content'\e02b'
}
.oi-camera-slr:before {
content'\e02c'
}
.oi-caret-bottom:before {
content'\e02d'
}
.oi-caret-left:before {
content'\e02e'
}
.oi-caret-right:before {
content'\e02f'
}
.oi-caret-top:before {
content'\e030'
}
.oi-cart:before {
content'\e031'
}
.oi-chat:before {
content'\e032'
}
.oi-check:before {
content'\e033'
}
.oi-chevron-bottom:before {
content'\e034'
}
.oi-chevron-left:before {
content'\e035'
}
.oi-chevron-right:before {
content'\e036'
}
.oi-chevron-top:before {
content'\e037'
}
.oi-circle-check:before {
content'\e038'
}
.oi-circle-x:before {
content'\e039'
}
.oi-clipboard:before {
content'\e03a'
}
.oi-clock:before {
content'\e03b'
}
.oi-cloud-download:before {
content'\e03c'
}
.oi-cloud-upload:before {
content'\e03d'
}
.oi-cloud:before {
content'\e03e'
}
.oi-cloudy:before {
content'\e03f'
}
.oi-code:before {
content'\e040'
}
.oi-cog:before {
content'\e041'
}
.oi-collapse-down:before {
content'\e042'
}
.oi-collapse-left:before {
content'\e043'
}
.oi-collapse-right:before {
content'\e044'
}
.oi-collapse-up:before {
content'\e045'
}
.oi-command:before {
content'\e046'
}
.oi-comment-square:before {
content'\e047'
}
.oi-compass:before {
content'\e048'
}
.oi-contrast:before {
content'\e049'
}
.oi-copywriting:before {
content'\e04a'
}
.oi-credit-card:before {
content'\e04b'
}
.oi-crop:before {
content'\e04c'
}
.oi-dashboard:before {
content'\e04d'
}
.oi-data-transfer-download:before {
content'\e04e'
}
.oi-data-transfer-upload:before {
content'\e04f'
}
.oi-delete:before {
content'\e050'
}
.oi-dial:before {
content'\e051'
}
.oi-document:before {
content'\e052'
}
.oi-dollar:before {
content'\e053'
}
.oi-double-quote-sans-left:before {
content'\e054'
}
.oi-double-quote-sans-right:before {
content'\e055'
}
.oi-double-quote-serif-left:before {
content'\e056'
}
.oi-double-quote-serif-right:before {
content'\e057'
}
.oi-droplet:before {
content'\e058'
}
.oi-eject:before {
content'\e059'
}
.oi-elevator:before {
content'\e05a'
}
.oi-ellipses:before {
content'\e05b'
}
.oi-envelope-closed:before {
content'\e05c'
}
.oi-envelope-open:before {
content'\e05d'
}
.oi-euro:before {
content'\e05e'
}
.oi-excerpt:before {
content'\e05f'
}
.oi-expand-down:before {
content'\e060'
}
.oi-expand-left:before {
content'\e061'
}
.oi-expand-right:before {
content'\e062'
}
.oi-expand-up:before {
content'\e063'
}
.oi-external-link:before {
content'\e064'
}
.oi-eye:before {
content'\e065'
}
.oi-eyedropper:before {
content'\e066'
}
.oi-file:before {
content'\e067'
}
.oi-fire:before {
content'\e068'
}
.oi-flag:before {
content'\e069'
}
.oi-flash:before {
content'\e06a'
}
.oi-folder:before {
content'\e06b'
}
.oi-fork:before {
content'\e06c'
}
.oi-fullscreen-enter:before {
content'\e06d'
}
.oi-fullscreen-exit:before {
content'\e06e'
}
.oi-globe:before {
content'\e06f'
}
.oi-graph:before {
content'\e070'
}
.oi-grid-four-up:before {
content'\e071'
}
.oi-grid-three-up:before {
content'\e072'
}
.oi-grid-two-up:before {
content'\e073'
}
.oi-hard-drive:before {
content'\e074'
}
.oi-header:before {
content'\e075'
}
.oi-headphones:before {
content'\e076'
}
.oi-heart:before {
content'\e077'
}
.oi-home:before {
content'\e078'
}
.oi-image:before {
content'\e079'
}
.oi-inbox:before {
content'\e07a'
}
.oi-infinity:before {
content'\e07b'
}
.oi-info:before {
content'\e07c'
}
.oi-italic:before {
content'\e07d'
}
.oi-justify-center:before {
content'\e07e'
}
.oi-justify-left:before {
content'\e07f'
}
.oi-justify-right:before {
content'\e080'
}
.oi-key:before {
content'\e081'
}
.oi-laptop:before {
content'\e082'
}
.oi-layers:before {
content'\e083'
}
.oi-lightbulb:before {
content'\e084'
}
.oi-link-broken:before {
content'\e085'
}
.oi-link-intact:before {
content'\e086'
}
.oi-list-rich:before {
content'\e087'
}
.oi-list:before {
content'\e088'
}
.oi-location:before {
content'\e089'
}
.oi-lock-locked:before {
content'\e08a'
}
.oi-lock-unlocked:before {
content'\e08b'
}
.oi-loop-circular:before {
content'\e08c'
}
.oi-loop-square:before {
content'\e08d'
}
.oi-loop:before {
content'\e08e'
}
.oi-magnifying-glass:before {
content'\e08f'
}
.oi-map-marker:before {
content'\e090'
}
.oi-map:before {
content'\e091'
}
.oi-media-pause:before {
content'\e092'
}
.oi-media-play:before {
content'\e093'
}
.oi-media-record:before {
content'\e094'
}
.oi-media-skip-backward:before {
content'\e095'
}
.oi-media-skip-forward:before {
content'\e096'
}
.oi-media-step-backward:before {
content'\e097'
}
.oi-media-step-forward:before {
content'\e098'
}
.oi-media-stop:before {
content'\e099'
}
.oi-medical-cross:before {
content'\e09a'
}
.oi-menu:before {
content'\e09b'
}
.oi-microphone:before {
content'\e09c'
}
.oi-minus:before {
content'\e09d'
}
.oi-monitor:before {
content'\e09e'
}
.oi-moon:before {
content'\e09f'
}
.oi-move:before {
content'\e0a0'
}
.oi-musical-note:before {
content'\e0a1'
}
.oi-paperclip:before {
content'\e0a2'
}
.oi-pencil:before {
content'\e0a3'
}
.oi-people:before {
content'\e0a4'
}
.oi-person:before {
content'\e0a5'
}
.oi-phone:before {
content'\e0a6'
}
.oi-pie-chart:before {
content'\e0a7'
}
.oi-pin:before {
content'\e0a8'
}
.oi-play-circle:before {
content'\e0a9'
}
.oi-plus:before {
content'\e0aa'
}
.oi-power-standby:before {
content'\e0ab'
}
.oi-print:before {
content'\e0ac'
}
.oi-project:before {
content'\e0ad'
}
.oi-pulse:before {
content'\e0ae'
}
.oi-puzzle-piece:before {
content'\e0af'
}
.oi-question-mark:before {
content'\e0b0'
}
.oi-rain:before {
content'\e0b1'
}
.oi-random:before {
content'\e0b2'
}
.oi-reload:before {
content'\e0b3'
}
.oi-resize-both:before {
content'\e0b4'
}
.oi-resize-height:before {
content'\e0b5'
}
.oi-resize-width:before {
content'\e0b6'
}
.oi-rss-alt:before {
content'\e0b7'
}
.oi-rss:before {
content'\e0b8'
}
.oi-script:before {
content'\e0b9'
}
.oi-share-boxed:before {
content'\e0ba'
}
.oi-share:before {
content'\e0bb'
}
.oi-shield:before {
content'\e0bc'
}
.oi-signal:before {
content'\e0bd'
}
.oi-signpost:before {
content'\e0be'
}
.oi-sort-ascending:before {
content'\e0bf'
}
.oi-sort-descending:before {
content'\e0c0'
}
.oi-spreadsheet:before {
content'\e0c1'
}
.oi-star:before {
content'\e0c2'
}
.oi-sun:before {
content'\e0c3'
}
.oi-tablet:before {
content'\e0c4'
}
.oi-tag:before {
content'\e0c5'
}
.oi-tags:before {
content'\e0c6'
}
.oi-target:before {
content'\e0c7'
}
.oi-task:before {
content'\e0c8'
}
.oi-terminal:before {
content'\e0c9'
}
.oi-text:before {
content'\e0ca'
}
.oi-thumb-down:before {
content'\e0cb'
}
.oi-thumb-up:before {
content'\e0cc'
}
.oi-timer:before {
content'\e0cd'
}
.oi-transfer:before {
content'\e0ce'
}
.oi-trash:before {
content'\e0cf'
}
.oi-underline:before {
content'\e0d0'
}
.oi-vertical-align-bottom:before {
content'\e0d1'
}
.oi-vertical-align-center:before {
content'\e0d2'
}
.oi-vertical-align-top:before {
content'\e0d3'
}
.oi-video:before {
content'\e0d4'
}
.oi-volume-high:before {
content'\e0d5'
}
.oi-volume-low:before {
content'\e0d6'
}
.oi-volume-off:before {
content'\e0d7'
}
.oi-warning:before {
content'\e0d8'
}
.oi-wifi:before {
content'\e0d9'
}
.oi-wrench:before {
content'\e0da'
}
.oi-x:before {
content'\e0db'
}
.oi-yen:before {
content'\e0dc'
}
.oi-zoom-in:before {
content'\e0dd'
}
.oi-zoom-out:before {
content'\e0de'
}

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -0,0 +1,511 @@
@font-face {
font-family: 'Icons';
src: url('../fonts/open-iconic.eot');
src: url('../fonts/open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('../fonts/open-iconic.woff') format('woff'), url('../fonts/open-iconic.ttf') format('truetype'), url('../fonts/open-iconic.otf') format('opentype'), url('../fonts/open-iconic.svg#iconic-sm') format('svg');
font-weight: normal;
font-style: normal;
}
.oi[data-glyph].oi-text-replace {
font-size: 0;
line-height: 0;
}
.oi[data-glyph].oi-text-replace:before {
width: 1em;
text-align: center;
}
.oi[data-glyph]:before {
font-family: 'Icons';
display: inline-block;
speak: none;
line-height: 1;
vertical-align: baseline;
font-weight: normal;
font-style: normal;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
.oi[data-glyph]:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
.oi[data-glyph].oi-align-left:before {
text-align: left;
}
.oi[data-glyph].oi-align-right:before {
text-align: right;
}
.oi[data-glyph].oi-align-center:before {
text-align: center;
}
.oi[data-glyph].oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
.oi[data-glyph].oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
.oi[data-glyph].oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
.oi[data-glyph=account-login]:before { content:'\e000'; }
.oi[data-glyph=account-logout]:before { content:'\e001'; }
.oi[data-glyph=action-redo]:before { content:'\e002'; }
.oi[data-glyph=action-undo]:before { content:'\e003'; }
.oi[data-glyph=align-center]:before { content:'\e004'; }
.oi[data-glyph=align-left]:before { content:'\e005'; }
.oi[data-glyph=align-right]:before { content:'\e006'; }
.oi[data-glyph=aperture]:before { content:'\e007'; }
.oi[data-glyph=arrow-bottom]:before { content:'\e008'; }
.oi[data-glyph=arrow-circle-bottom]:before { content:'\e009'; }
.oi[data-glyph=arrow-circle-left]:before { content:'\e00a'; }
.oi[data-glyph=arrow-circle-right]:before { content:'\e00b'; }
.oi[data-glyph=arrow-circle-top]:before { content:'\e00c'; }
.oi[data-glyph=arrow-left]:before { content:'\e00d'; }
.oi[data-glyph=arrow-right]:before { content:'\e00e'; }
.oi[data-glyph=arrow-thick-bottom]:before { content:'\e00f'; }
.oi[data-glyph=arrow-thick-left]:before { content:'\e010'; }
.oi[data-glyph=arrow-thick-right]:before { content:'\e011'; }
.oi[data-glyph=arrow-thick-top]:before { content:'\e012'; }
.oi[data-glyph=arrow-top]:before { content:'\e013'; }
.oi[data-glyph=audio-spectrum]:before { content:'\e014'; }
.oi[data-glyph=audio]:before { content:'\e015'; }
.oi[data-glyph=badge]:before { content:'\e016'; }
.oi[data-glyph=ban]:before { content:'\e017'; }
.oi[data-glyph=bar-chart]:before { content:'\e018'; }
.oi[data-glyph=basket]:before { content:'\e019'; }
.oi[data-glyph=battery-empty]:before { content:'\e01a'; }
.oi[data-glyph=battery-full]:before { content:'\e01b'; }
.oi[data-glyph=beaker]:before { content:'\e01c'; }
.oi[data-glyph=bell]:before { content:'\e01d'; }
.oi[data-glyph=bluetooth]:before { content:'\e01e'; }
.oi[data-glyph=bold]:before { content:'\e01f'; }
.oi[data-glyph=bolt]:before { content:'\e020'; }
.oi[data-glyph=book]:before { content:'\e021'; }
.oi[data-glyph=bookmark]:before { content:'\e022'; }
.oi[data-glyph=box]:before { content:'\e023'; }
.oi[data-glyph=briefcase]:before { content:'\e024'; }
.oi[data-glyph=british-pound]:before { content:'\e025'; }
.oi[data-glyph=browser]:before { content:'\e026'; }
.oi[data-glyph=brush]:before { content:'\e027'; }
.oi[data-glyph=bug]:before { content:'\e028'; }
.oi[data-glyph=bullhorn]:before { content:'\e029'; }
.oi[data-glyph=calculator]:before { content:'\e02a'; }
.oi[data-glyph=calendar]:before { content:'\e02b'; }
.oi[data-glyph=camera-slr]:before { content:'\e02c'; }
.oi[data-glyph=caret-bottom]:before { content:'\e02d'; }
.oi[data-glyph=caret-left]:before { content:'\e02e'; }
.oi[data-glyph=caret-right]:before { content:'\e02f'; }
.oi[data-glyph=caret-top]:before { content:'\e030'; }
.oi[data-glyph=cart]:before { content:'\e031'; }
.oi[data-glyph=chat]:before { content:'\e032'; }
.oi[data-glyph=check]:before { content:'\e033'; }
.oi[data-glyph=chevron-bottom]:before { content:'\e034'; }
.oi[data-glyph=chevron-left]:before { content:'\e035'; }
.oi[data-glyph=chevron-right]:before { content:'\e036'; }
.oi[data-glyph=chevron-top]:before { content:'\e037'; }
.oi[data-glyph=circle-check]:before { content:'\e038'; }
.oi[data-glyph=circle-x]:before { content:'\e039'; }
.oi[data-glyph=clipboard]:before { content:'\e03a'; }
.oi[data-glyph=clock]:before { content:'\e03b'; }
.oi[data-glyph=cloud-download]:before { content:'\e03c'; }
.oi[data-glyph=cloud-upload]:before { content:'\e03d'; }
.oi[data-glyph=cloud]:before { content:'\e03e'; }
.oi[data-glyph=cloudy]:before { content:'\e03f'; }
.oi[data-glyph=code]:before { content:'\e040'; }
.oi[data-glyph=cog]:before { content:'\e041'; }
.oi[data-glyph=collapse-down]:before { content:'\e042'; }
.oi[data-glyph=collapse-left]:before { content:'\e043'; }
.oi[data-glyph=collapse-right]:before { content:'\e044'; }
.oi[data-glyph=collapse-up]:before { content:'\e045'; }
.oi[data-glyph=command]:before { content:'\e046'; }
.oi[data-glyph=comment-square]:before { content:'\e047'; }
.oi[data-glyph=compass]:before { content:'\e048'; }
.oi[data-glyph=contrast]:before { content:'\e049'; }
.oi[data-glyph=copywriting]:before { content:'\e04a'; }
.oi[data-glyph=credit-card]:before { content:'\e04b'; }
.oi[data-glyph=crop]:before { content:'\e04c'; }
.oi[data-glyph=dashboard]:before { content:'\e04d'; }
.oi[data-glyph=data-transfer-download]:before { content:'\e04e'; }
.oi[data-glyph=data-transfer-upload]:before { content:'\e04f'; }
.oi[data-glyph=delete]:before { content:'\e050'; }
.oi[data-glyph=dial]:before { content:'\e051'; }
.oi[data-glyph=document]:before { content:'\e052'; }
.oi[data-glyph=dollar]:before { content:'\e053'; }
.oi[data-glyph=double-quote-sans-left]:before { content:'\e054'; }
.oi[data-glyph=double-quote-sans-right]:before { content:'\e055'; }
.oi[data-glyph=double-quote-serif-left]:before { content:'\e056'; }
.oi[data-glyph=double-quote-serif-right]:before { content:'\e057'; }
.oi[data-glyph=droplet]:before { content:'\e058'; }
.oi[data-glyph=eject]:before { content:'\e059'; }
.oi[data-glyph=elevator]:before { content:'\e05a'; }
.oi[data-glyph=ellipses]:before { content:'\e05b'; }
.oi[data-glyph=envelope-closed]:before { content:'\e05c'; }
.oi[data-glyph=envelope-open]:before { content:'\e05d'; }
.oi[data-glyph=euro]:before { content:'\e05e'; }
.oi[data-glyph=excerpt]:before { content:'\e05f'; }
.oi[data-glyph=expand-down]:before { content:'\e060'; }
.oi[data-glyph=expand-left]:before { content:'\e061'; }
.oi[data-glyph=expand-right]:before { content:'\e062'; }
.oi[data-glyph=expand-up]:before { content:'\e063'; }
.oi[data-glyph=external-link]:before { content:'\e064'; }
.oi[data-glyph=eye]:before { content:'\e065'; }
.oi[data-glyph=eyedropper]:before { content:'\e066'; }
.oi[data-glyph=file]:before { content:'\e067'; }
.oi[data-glyph=fire]:before { content:'\e068'; }
.oi[data-glyph=flag]:before { content:'\e069'; }
.oi[data-glyph=flash]:before { content:'\e06a'; }
.oi[data-glyph=folder]:before { content:'\e06b'; }
.oi[data-glyph=fork]:before { content:'\e06c'; }
.oi[data-glyph=fullscreen-enter]:before { content:'\e06d'; }
.oi[data-glyph=fullscreen-exit]:before { content:'\e06e'; }
.oi[data-glyph=globe]:before { content:'\e06f'; }
.oi[data-glyph=graph]:before { content:'\e070'; }
.oi[data-glyph=grid-four-up]:before { content:'\e071'; }
.oi[data-glyph=grid-three-up]:before { content:'\e072'; }
.oi[data-glyph=grid-two-up]:before { content:'\e073'; }
.oi[data-glyph=hard-drive]:before { content:'\e074'; }
.oi[data-glyph=header]:before { content:'\e075'; }
.oi[data-glyph=headphones]:before { content:'\e076'; }
.oi[data-glyph=heart]:before { content:'\e077'; }
.oi[data-glyph=home]:before { content:'\e078'; }
.oi[data-glyph=image]:before { content:'\e079'; }
.oi[data-glyph=inbox]:before { content:'\e07a'; }
.oi[data-glyph=infinity]:before { content:'\e07b'; }
.oi[data-glyph=info]:before { content:'\e07c'; }
.oi[data-glyph=italic]:before { content:'\e07d'; }
.oi[data-glyph=justify-center]:before { content:'\e07e'; }
.oi[data-glyph=justify-left]:before { content:'\e07f'; }
.oi[data-glyph=justify-right]:before { content:'\e080'; }
.oi[data-glyph=key]:before { content:'\e081'; }
.oi[data-glyph=laptop]:before { content:'\e082'; }
.oi[data-glyph=layers]:before { content:'\e083'; }
.oi[data-glyph=lightbulb]:before { content:'\e084'; }
.oi[data-glyph=link-broken]:before { content:'\e085'; }
.oi[data-glyph=link-intact]:before { content:'\e086'; }
.oi[data-glyph=list-rich]:before { content:'\e087'; }
.oi[data-glyph=list]:before { content:'\e088'; }
.oi[data-glyph=location]:before { content:'\e089'; }
.oi[data-glyph=lock-locked]:before { content:'\e08a'; }
.oi[data-glyph=lock-unlocked]:before { content:'\e08b'; }
.oi[data-glyph=loop-circular]:before { content:'\e08c'; }
.oi[data-glyph=loop-square]:before { content:'\e08d'; }
.oi[data-glyph=loop]:before { content:'\e08e'; }
.oi[data-glyph=magnifying-glass]:before { content:'\e08f'; }
.oi[data-glyph=map-marker]:before { content:'\e090'; }
.oi[data-glyph=map]:before { content:'\e091'; }
.oi[data-glyph=media-pause]:before { content:'\e092'; }
.oi[data-glyph=media-play]:before { content:'\e093'; }
.oi[data-glyph=media-record]:before { content:'\e094'; }
.oi[data-glyph=media-skip-backward]:before { content:'\e095'; }
.oi[data-glyph=media-skip-forward]:before { content:'\e096'; }
.oi[data-glyph=media-step-backward]:before { content:'\e097'; }
.oi[data-glyph=media-step-forward]:before { content:'\e098'; }
.oi[data-glyph=media-stop]:before { content:'\e099'; }
.oi[data-glyph=medical-cross]:before { content:'\e09a'; }
.oi[data-glyph=menu]:before { content:'\e09b'; }
.oi[data-glyph=microphone]:before { content:'\e09c'; }
.oi[data-glyph=minus]:before { content:'\e09d'; }
.oi[data-glyph=monitor]:before { content:'\e09e'; }
.oi[data-glyph=moon]:before { content:'\e09f'; }
.oi[data-glyph=move]:before { content:'\e0a0'; }
.oi[data-glyph=musical-note]:before { content:'\e0a1'; }
.oi[data-glyph=paperclip]:before { content:'\e0a2'; }
.oi[data-glyph=pencil]:before { content:'\e0a3'; }
.oi[data-glyph=people]:before { content:'\e0a4'; }
.oi[data-glyph=person]:before { content:'\e0a5'; }
.oi[data-glyph=phone]:before { content:'\e0a6'; }
.oi[data-glyph=pie-chart]:before { content:'\e0a7'; }
.oi[data-glyph=pin]:before { content:'\e0a8'; }
.oi[data-glyph=play-circle]:before { content:'\e0a9'; }
.oi[data-glyph=plus]:before { content:'\e0aa'; }
.oi[data-glyph=power-standby]:before { content:'\e0ab'; }
.oi[data-glyph=print]:before { content:'\e0ac'; }
.oi[data-glyph=project]:before { content:'\e0ad'; }
.oi[data-glyph=pulse]:before { content:'\e0ae'; }
.oi[data-glyph=puzzle-piece]:before { content:'\e0af'; }
.oi[data-glyph=question-mark]:before { content:'\e0b0'; }
.oi[data-glyph=rain]:before { content:'\e0b1'; }
.oi[data-glyph=random]:before { content:'\e0b2'; }
.oi[data-glyph=reload]:before { content:'\e0b3'; }
.oi[data-glyph=resize-both]:before { content:'\e0b4'; }
.oi[data-glyph=resize-height]:before { content:'\e0b5'; }
.oi[data-glyph=resize-width]:before { content:'\e0b6'; }
.oi[data-glyph=rss-alt]:before { content:'\e0b7'; }
.oi[data-glyph=rss]:before { content:'\e0b8'; }
.oi[data-glyph=script]:before { content:'\e0b9'; }
.oi[data-glyph=share-boxed]:before { content:'\e0ba'; }
.oi[data-glyph=share]:before { content:'\e0bb'; }
.oi[data-glyph=shield]:before { content:'\e0bc'; }
.oi[data-glyph=signal]:before { content:'\e0bd'; }
.oi[data-glyph=signpost]:before { content:'\e0be'; }
.oi[data-glyph=sort-ascending]:before { content:'\e0bf'; }
.oi[data-glyph=sort-descending]:before { content:'\e0c0'; }
.oi[data-glyph=spreadsheet]:before { content:'\e0c1'; }
.oi[data-glyph=star]:before { content:'\e0c2'; }
.oi[data-glyph=sun]:before { content:'\e0c3'; }
.oi[data-glyph=tablet]:before { content:'\e0c4'; }
.oi[data-glyph=tag]:before { content:'\e0c5'; }
.oi[data-glyph=tags]:before { content:'\e0c6'; }
.oi[data-glyph=target]:before { content:'\e0c7'; }
.oi[data-glyph=task]:before { content:'\e0c8'; }
.oi[data-glyph=terminal]:before { content:'\e0c9'; }
.oi[data-glyph=text]:before { content:'\e0ca'; }
.oi[data-glyph=thumb-down]:before { content:'\e0cb'; }
.oi[data-glyph=thumb-up]:before { content:'\e0cc'; }
.oi[data-glyph=timer]:before { content:'\e0cd'; }
.oi[data-glyph=transfer]:before { content:'\e0ce'; }
.oi[data-glyph=trash]:before { content:'\e0cf'; }
.oi[data-glyph=underline]:before { content:'\e0d0'; }
.oi[data-glyph=vertical-align-bottom]:before { content:'\e0d1'; }
.oi[data-glyph=vertical-align-center]:before { content:'\e0d2'; }
.oi[data-glyph=vertical-align-top]:before { content:'\e0d3'; }
.oi[data-glyph=video]:before { content:'\e0d4'; }
.oi[data-glyph=volume-high]:before { content:'\e0d5'; }
.oi[data-glyph=volume-low]:before { content:'\e0d6'; }
.oi[data-glyph=volume-off]:before { content:'\e0d7'; }
.oi[data-glyph=warning]:before { content:'\e0d8'; }
.oi[data-glyph=wifi]:before { content:'\e0d9'; }
.oi[data-glyph=wrench]:before { content:'\e0da'; }
.oi[data-glyph=x]:before { content:'\e0db'; }
.oi[data-glyph=yen]:before { content:'\e0dc'; }
.oi[data-glyph=zoom-in]:before { content:'\e0dd'; }
.oi[data-glyph=zoom-out]:before { content:'\e0de'; }

View File

@@ -0,0 +1,962 @@
@iconic-font-path: '../fonts/';
@font-face {
font-family: 'Icons';
src: url('@{iconic-font-path}open-iconic.eot');
src: url('@{iconic-font-path}open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('@{iconic-font-path}open-iconic.woff') format('woff'), url('@{iconic-font-path}open-iconic.ttf') format('truetype'), url('@{iconic-font-path}open-iconic.otf') format('opentype'), url('@{iconic-font-path}open-iconic.svg#iconic-sm') format('svg');
font-weight: normal;
font-style: normal;
}
.oi[data-glyph].oi-text-replace {
font-size: 0;
line-height: 0;
}
.oi[data-glyph].oi-text-replace:before {
width: 1em;
text-align: center;
}
.oi[data-glyph] {
&:before {
position: relative;
top: 1px;
font-family: 'Icons';
display: inline-block;
speak: none;
line-height: 1;
vertical-align: baseline;
font-weight: normal;
font-style: normal;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
&:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
&.oi-align-left:before {
text-align: left;
}
&.oi-align-right:before {
text-align: right;
}
&.oi-align-center:before {
text-align: center;
}
&.oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
&.oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
&.oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
}
.oi[data-glyph=account-login]:before {
content: '\e000';
}
.oi[data-glyph=account-logout]:before {
content: '\e001';
}
.oi[data-glyph=action-redo]:before {
content: '\e002';
}
.oi[data-glyph=action-undo]:before {
content: '\e003';
}
.oi[data-glyph=align-center]:before {
content: '\e004';
}
.oi[data-glyph=align-left]:before {
content: '\e005';
}
.oi[data-glyph=align-right]:before {
content: '\e006';
}
.oi[data-glyph=aperture]:before {
content: '\e007';
}
.oi[data-glyph=arrow-bottom]:before {
content: '\e008';
}
.oi[data-glyph=arrow-circle-bottom]:before {
content: '\e009';
}
.oi[data-glyph=arrow-circle-left]:before {
content: '\e00a';
}
.oi[data-glyph=arrow-circle-right]:before {
content: '\e00b';
}
.oi[data-glyph=arrow-circle-top]:before {
content: '\e00c';
}
.oi[data-glyph=arrow-left]:before {
content: '\e00d';
}
.oi[data-glyph=arrow-right]:before {
content: '\e00e';
}
.oi[data-glyph=arrow-thick-bottom]:before {
content: '\e00f';
}
.oi[data-glyph=arrow-thick-left]:before {
content: '\e010';
}
.oi[data-glyph=arrow-thick-right]:before {
content: '\e011';
}
.oi[data-glyph=arrow-thick-top]:before {
content: '\e012';
}
.oi[data-glyph=arrow-top]:before {
content: '\e013';
}
.oi[data-glyph=audio-spectrum]:before {
content: '\e014';
}
.oi[data-glyph=audio]:before {
content: '\e015';
}
.oi[data-glyph=badge]:before {
content: '\e016';
}
.oi[data-glyph=ban]:before {
content: '\e017';
}
.oi[data-glyph=bar-chart]:before {
content: '\e018';
}
.oi[data-glyph=basket]:before {
content: '\e019';
}
.oi[data-glyph=battery-empty]:before {
content: '\e01a';
}
.oi[data-glyph=battery-full]:before {
content: '\e01b';
}
.oi[data-glyph=beaker]:before {
content: '\e01c';
}
.oi[data-glyph=bell]:before {
content: '\e01d';
}
.oi[data-glyph=bluetooth]:before {
content: '\e01e';
}
.oi[data-glyph=bold]:before {
content: '\e01f';
}
.oi[data-glyph=bolt]:before {
content: '\e020';
}
.oi[data-glyph=book]:before {
content: '\e021';
}
.oi[data-glyph=bookmark]:before {
content: '\e022';
}
.oi[data-glyph=box]:before {
content: '\e023';
}
.oi[data-glyph=briefcase]:before {
content: '\e024';
}
.oi[data-glyph=british-pound]:before {
content: '\e025';
}
.oi[data-glyph=browser]:before {
content: '\e026';
}
.oi[data-glyph=brush]:before {
content: '\e027';
}
.oi[data-glyph=bug]:before {
content: '\e028';
}
.oi[data-glyph=bullhorn]:before {
content: '\e029';
}
.oi[data-glyph=calculator]:before {
content: '\e02a';
}
.oi[data-glyph=calendar]:before {
content: '\e02b';
}
.oi[data-glyph=camera-slr]:before {
content: '\e02c';
}
.oi[data-glyph=caret-bottom]:before {
content: '\e02d';
}
.oi[data-glyph=caret-left]:before {
content: '\e02e';
}
.oi[data-glyph=caret-right]:before {
content: '\e02f';
}
.oi[data-glyph=caret-top]:before {
content: '\e030';
}
.oi[data-glyph=cart]:before {
content: '\e031';
}
.oi[data-glyph=chat]:before {
content: '\e032';
}
.oi[data-glyph=check]:before {
content: '\e033';
}
.oi[data-glyph=chevron-bottom]:before {
content: '\e034';
}
.oi[data-glyph=chevron-left]:before {
content: '\e035';
}
.oi[data-glyph=chevron-right]:before {
content: '\e036';
}
.oi[data-glyph=chevron-top]:before {
content: '\e037';
}
.oi[data-glyph=circle-check]:before {
content: '\e038';
}
.oi[data-glyph=circle-x]:before {
content: '\e039';
}
.oi[data-glyph=clipboard]:before {
content: '\e03a';
}
.oi[data-glyph=clock]:before {
content: '\e03b';
}
.oi[data-glyph=cloud-download]:before {
content: '\e03c';
}
.oi[data-glyph=cloud-upload]:before {
content: '\e03d';
}
.oi[data-glyph=cloud]:before {
content: '\e03e';
}
.oi[data-glyph=cloudy]:before {
content: '\e03f';
}
.oi[data-glyph=code]:before {
content: '\e040';
}
.oi[data-glyph=cog]:before {
content: '\e041';
}
.oi[data-glyph=collapse-down]:before {
content: '\e042';
}
.oi[data-glyph=collapse-left]:before {
content: '\e043';
}
.oi[data-glyph=collapse-right]:before {
content: '\e044';
}
.oi[data-glyph=collapse-up]:before {
content: '\e045';
}
.oi[data-glyph=command]:before {
content: '\e046';
}
.oi[data-glyph=comment-square]:before {
content: '\e047';
}
.oi[data-glyph=compass]:before {
content: '\e048';
}
.oi[data-glyph=contrast]:before {
content: '\e049';
}
.oi[data-glyph=copywriting]:before {
content: '\e04a';
}
.oi[data-glyph=credit-card]:before {
content: '\e04b';
}
.oi[data-glyph=crop]:before {
content: '\e04c';
}
.oi[data-glyph=dashboard]:before {
content: '\e04d';
}
.oi[data-glyph=data-transfer-download]:before {
content: '\e04e';
}
.oi[data-glyph=data-transfer-upload]:before {
content: '\e04f';
}
.oi[data-glyph=delete]:before {
content: '\e050';
}
.oi[data-glyph=dial]:before {
content: '\e051';
}
.oi[data-glyph=document]:before {
content: '\e052';
}
.oi[data-glyph=dollar]:before {
content: '\e053';
}
.oi[data-glyph=double-quote-sans-left]:before {
content: '\e054';
}
.oi[data-glyph=double-quote-sans-right]:before {
content: '\e055';
}
.oi[data-glyph=double-quote-serif-left]:before {
content: '\e056';
}
.oi[data-glyph=double-quote-serif-right]:before {
content: '\e057';
}
.oi[data-glyph=droplet]:before {
content: '\e058';
}
.oi[data-glyph=eject]:before {
content: '\e059';
}
.oi[data-glyph=elevator]:before {
content: '\e05a';
}
.oi[data-glyph=ellipses]:before {
content: '\e05b';
}
.oi[data-glyph=envelope-closed]:before {
content: '\e05c';
}
.oi[data-glyph=envelope-open]:before {
content: '\e05d';
}
.oi[data-glyph=euro]:before {
content: '\e05e';
}
.oi[data-glyph=excerpt]:before {
content: '\e05f';
}
.oi[data-glyph=expand-down]:before {
content: '\e060';
}
.oi[data-glyph=expand-left]:before {
content: '\e061';
}
.oi[data-glyph=expand-right]:before {
content: '\e062';
}
.oi[data-glyph=expand-up]:before {
content: '\e063';
}
.oi[data-glyph=external-link]:before {
content: '\e064';
}
.oi[data-glyph=eye]:before {
content: '\e065';
}
.oi[data-glyph=eyedropper]:before {
content: '\e066';
}
.oi[data-glyph=file]:before {
content: '\e067';
}
.oi[data-glyph=fire]:before {
content: '\e068';
}
.oi[data-glyph=flag]:before {
content: '\e069';
}
.oi[data-glyph=flash]:before {
content: '\e06a';
}
.oi[data-glyph=folder]:before {
content: '\e06b';
}
.oi[data-glyph=fork]:before {
content: '\e06c';
}
.oi[data-glyph=fullscreen-enter]:before {
content: '\e06d';
}
.oi[data-glyph=fullscreen-exit]:before {
content: '\e06e';
}
.oi[data-glyph=globe]:before {
content: '\e06f';
}
.oi[data-glyph=graph]:before {
content: '\e070';
}
.oi[data-glyph=grid-four-up]:before {
content: '\e071';
}
.oi[data-glyph=grid-three-up]:before {
content: '\e072';
}
.oi[data-glyph=grid-two-up]:before {
content: '\e073';
}
.oi[data-glyph=hard-drive]:before {
content: '\e074';
}
.oi[data-glyph=header]:before {
content: '\e075';
}
.oi[data-glyph=headphones]:before {
content: '\e076';
}
.oi[data-glyph=heart]:before {
content: '\e077';
}
.oi[data-glyph=home]:before {
content: '\e078';
}
.oi[data-glyph=image]:before {
content: '\e079';
}
.oi[data-glyph=inbox]:before {
content: '\e07a';
}
.oi[data-glyph=infinity]:before {
content: '\e07b';
}
.oi[data-glyph=info]:before {
content: '\e07c';
}
.oi[data-glyph=italic]:before {
content: '\e07d';
}
.oi[data-glyph=justify-center]:before {
content: '\e07e';
}
.oi[data-glyph=justify-left]:before {
content: '\e07f';
}
.oi[data-glyph=justify-right]:before {
content: '\e080';
}
.oi[data-glyph=key]:before {
content: '\e081';
}
.oi[data-glyph=laptop]:before {
content: '\e082';
}
.oi[data-glyph=layers]:before {
content: '\e083';
}
.oi[data-glyph=lightbulb]:before {
content: '\e084';
}
.oi[data-glyph=link-broken]:before {
content: '\e085';
}
.oi[data-glyph=link-intact]:before {
content: '\e086';
}
.oi[data-glyph=list-rich]:before {
content: '\e087';
}
.oi[data-glyph=list]:before {
content: '\e088';
}
.oi[data-glyph=location]:before {
content: '\e089';
}
.oi[data-glyph=lock-locked]:before {
content: '\e08a';
}
.oi[data-glyph=lock-unlocked]:before {
content: '\e08b';
}
.oi[data-glyph=loop-circular]:before {
content: '\e08c';
}
.oi[data-glyph=loop-square]:before {
content: '\e08d';
}
.oi[data-glyph=loop]:before {
content: '\e08e';
}
.oi[data-glyph=magnifying-glass]:before {
content: '\e08f';
}
.oi[data-glyph=map-marker]:before {
content: '\e090';
}
.oi[data-glyph=map]:before {
content: '\e091';
}
.oi[data-glyph=media-pause]:before {
content: '\e092';
}
.oi[data-glyph=media-play]:before {
content: '\e093';
}
.oi[data-glyph=media-record]:before {
content: '\e094';
}
.oi[data-glyph=media-skip-backward]:before {
content: '\e095';
}
.oi[data-glyph=media-skip-forward]:before {
content: '\e096';
}
.oi[data-glyph=media-step-backward]:before {
content: '\e097';
}
.oi[data-glyph=media-step-forward]:before {
content: '\e098';
}
.oi[data-glyph=media-stop]:before {
content: '\e099';
}
.oi[data-glyph=medical-cross]:before {
content: '\e09a';
}
.oi[data-glyph=menu]:before {
content: '\e09b';
}
.oi[data-glyph=microphone]:before {
content: '\e09c';
}
.oi[data-glyph=minus]:before {
content: '\e09d';
}
.oi[data-glyph=monitor]:before {
content: '\e09e';
}
.oi[data-glyph=moon]:before {
content: '\e09f';
}
.oi[data-glyph=move]:before {
content: '\e0a0';
}
.oi[data-glyph=musical-note]:before {
content: '\e0a1';
}
.oi[data-glyph=paperclip]:before {
content: '\e0a2';
}
.oi[data-glyph=pencil]:before {
content: '\e0a3';
}
.oi[data-glyph=people]:before {
content: '\e0a4';
}
.oi[data-glyph=person]:before {
content: '\e0a5';
}
.oi[data-glyph=phone]:before {
content: '\e0a6';
}
.oi[data-glyph=pie-chart]:before {
content: '\e0a7';
}
.oi[data-glyph=pin]:before {
content: '\e0a8';
}
.oi[data-glyph=play-circle]:before {
content: '\e0a9';
}
.oi[data-glyph=plus]:before {
content: '\e0aa';
}
.oi[data-glyph=power-standby]:before {
content: '\e0ab';
}
.oi[data-glyph=print]:before {
content: '\e0ac';
}
.oi[data-glyph=project]:before {
content: '\e0ad';
}
.oi[data-glyph=pulse]:before {
content: '\e0ae';
}
.oi[data-glyph=puzzle-piece]:before {
content: '\e0af';
}
.oi[data-glyph=question-mark]:before {
content: '\e0b0';
}
.oi[data-glyph=rain]:before {
content: '\e0b1';
}
.oi[data-glyph=random]:before {
content: '\e0b2';
}
.oi[data-glyph=reload]:before {
content: '\e0b3';
}
.oi[data-glyph=resize-both]:before {
content: '\e0b4';
}
.oi[data-glyph=resize-height]:before {
content: '\e0b5';
}
.oi[data-glyph=resize-width]:before {
content: '\e0b6';
}
.oi[data-glyph=rss-alt]:before {
content: '\e0b7';
}
.oi[data-glyph=rss]:before {
content: '\e0b8';
}
.oi[data-glyph=script]:before {
content: '\e0b9';
}
.oi[data-glyph=share-boxed]:before {
content: '\e0ba';
}
.oi[data-glyph=share]:before {
content: '\e0bb';
}
.oi[data-glyph=shield]:before {
content: '\e0bc';
}
.oi[data-glyph=signal]:before {
content: '\e0bd';
}
.oi[data-glyph=signpost]:before {
content: '\e0be';
}
.oi[data-glyph=sort-ascending]:before {
content: '\e0bf';
}
.oi[data-glyph=sort-descending]:before {
content: '\e0c0';
}
.oi[data-glyph=spreadsheet]:before {
content: '\e0c1';
}
.oi[data-glyph=star]:before {
content: '\e0c2';
}
.oi[data-glyph=sun]:before {
content: '\e0c3';
}
.oi[data-glyph=tablet]:before {
content: '\e0c4';
}
.oi[data-glyph=tag]:before {
content: '\e0c5';
}
.oi[data-glyph=tags]:before {
content: '\e0c6';
}
.oi[data-glyph=target]:before {
content: '\e0c7';
}
.oi[data-glyph=task]:before {
content: '\e0c8';
}
.oi[data-glyph=terminal]:before {
content: '\e0c9';
}
.oi[data-glyph=text]:before {
content: '\e0ca';
}
.oi[data-glyph=thumb-down]:before {
content: '\e0cb';
}
.oi[data-glyph=thumb-up]:before {
content: '\e0cc';
}
.oi[data-glyph=timer]:before {
content: '\e0cd';
}
.oi[data-glyph=transfer]:before {
content: '\e0ce';
}
.oi[data-glyph=trash]:before {
content: '\e0cf';
}
.oi[data-glyph=underline]:before {
content: '\e0d0';
}
.oi[data-glyph=vertical-align-bottom]:before {
content: '\e0d1';
}
.oi[data-glyph=vertical-align-center]:before {
content: '\e0d2';
}
.oi[data-glyph=vertical-align-top]:before {
content: '\e0d3';
}
.oi[data-glyph=video]:before {
content: '\e0d4';
}
.oi[data-glyph=volume-high]:before {
content: '\e0d5';
}
.oi[data-glyph=volume-low]:before {
content: '\e0d6';
}
.oi[data-glyph=volume-off]:before {
content: '\e0d7';
}
.oi[data-glyph=warning]:before {
content: '\e0d8';
}
.oi[data-glyph=wifi]:before {
content: '\e0d9';
}
.oi[data-glyph=wrench]:before {
content: '\e0da';
}
.oi[data-glyph=x]:before {
content: '\e0db';
}
.oi[data-glyph=yen]:before {
content: '\e0dc';
}
.oi[data-glyph=zoom-in]:before {
content: '\e0dd';
}
.oi[data-glyph=zoom-out]:before {
content: '\e0de';
}
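
The LESS source above keeps the font location in a single @iconic-font-path variable, so a consuming project can repoint it without editing this vendored file. A minimal sketch of that override, assuming the file above ships as open-iconic.less (the filename is not shown in this diff); because LESS resolves variables lazily and the last definition in scope wins, the override goes after the import:

// Hypothetical consumer stylesheet, not part of this commit.
// LESS uses last-definition-wins for variables, so this redefinition is the one
// applied in every url('@{iconic-font-path}...') of the imported file.
@import 'open-iconic';                // assumed filename of the LESS file above
@iconic-font-path: '/static/fonts/';  // example path; adjust to the deployment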

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,963 @@
$iconic-font-path: '../fonts/' !default;
@font-face {
font-family: 'Icons';
src: url('#{$iconic-font-path}open-iconic.eot');
src: url('#{$iconic-font-path}open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('#{$iconic-font-path}open-iconic.woff') format('woff'), url('#{$iconic-font-path}open-iconic.ttf') format('truetype'), url('#{$iconic-font-path}open-iconic.otf') format('opentype'), url('#{$iconic-font-path}open-iconic.svg#iconic-sm') format('svg');
font-weight: normal;
font-style: normal;
}
.oi[data-glyph].oi-text-replace {
font-size: 0;
line-height: 0;
}
.oi[data-glyph].oi-text-replace:before {
width: 1em;
text-align: center;
}
.oi[data-glyph] {
&:before {
position: relative;
top: 1px;
font-family: 'Icons';
display: inline-block;
speak: none;
line-height: 1;
vertical-align: baseline;
font-weight: normal;
font-style: normal;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
&:empty:before {
width: 1em;
text-align: center;
box-sizing: content-box;
}
&.oi-align-left:before {
text-align: left;
}
&.oi-align-right:before {
text-align: right;
}
&.oi-align-center:before {
text-align: center;
}
&.oi-flip-horizontal:before {
-webkit-transform: scale(-1, 1);
-ms-transform: scale(-1, 1);
transform: scale(-1, 1);
}
&.oi-flip-vertical:before {
-webkit-transform: scale(1, -1);
-ms-transform: scale(1, -1);
transform: scale(1, -1);
}
&.oi-flip-horizontal-vertical:before {
-webkit-transform: scale(-1, -1);
-ms-transform: scale(-1, -1);
transform: scale(-1, -1);
}
}
.oi[data-glyph=account-login]:before {
content: '\e000';
}
.oi[data-glyph=account-logout]:before {
content: '\e001';
}
.oi[data-glyph=action-redo]:before {
content: '\e002';
}
.oi[data-glyph=action-undo]:before {
content: '\e003';
}
.oi[data-glyph=align-center]:before {
content: '\e004';
}
.oi[data-glyph=align-left]:before {
content: '\e005';
}
.oi[data-glyph=align-right]:before {
content: '\e006';
}
.oi[data-glyph=aperture]:before {
content: '\e007';
}
.oi[data-glyph=arrow-bottom]:before {
content: '\e008';
}
.oi[data-glyph=arrow-circle-bottom]:before {
content: '\e009';
}
.oi[data-glyph=arrow-circle-left]:before {
content: '\e00a';
}
.oi[data-glyph=arrow-circle-right]:before {
content: '\e00b';
}
.oi[data-glyph=arrow-circle-top]:before {
content: '\e00c';
}
.oi[data-glyph=arrow-left]:before {
content: '\e00d';
}
.oi[data-glyph=arrow-right]:before {
content: '\e00e';
}
.oi[data-glyph=arrow-thick-bottom]:before {
content: '\e00f';
}
.oi[data-glyph=arrow-thick-left]:before {
content: '\e010';
}
.oi[data-glyph=arrow-thick-right]:before {
content: '\e011';
}
.oi[data-glyph=arrow-thick-top]:before {
content: '\e012';
}
.oi[data-glyph=arrow-top]:before {
content: '\e013';
}
.oi[data-glyph=audio-spectrum]:before {
content: '\e014';
}
.oi[data-glyph=audio]:before {
content: '\e015';
}
.oi[data-glyph=badge]:before {
content: '\e016';
}
.oi[data-glyph=ban]:before {
content: '\e017';
}
.oi[data-glyph=bar-chart]:before {
content: '\e018';
}
.oi[data-glyph=basket]:before {
content: '\e019';
}
.oi[data-glyph=battery-empty]:before {
content: '\e01a';
}
.oi[data-glyph=battery-full]:before {
content: '\e01b';
}
.oi[data-glyph=beaker]:before {
content: '\e01c';
}
.oi[data-glyph=bell]:before {
content: '\e01d';
}
.oi[data-glyph=bluetooth]:before {
content: '\e01e';
}
.oi[data-glyph=bold]:before {
content: '\e01f';
}
.oi[data-glyph=bolt]:before {
content: '\e020';
}
.oi[data-glyph=book]:before {
content: '\e021';
}
.oi[data-glyph=bookmark]:before {
content: '\e022';
}
.oi[data-glyph=box]:before {
content: '\e023';
}
.oi[data-glyph=briefcase]:before {
content: '\e024';
}
.oi[data-glyph=british-pound]:before {
content: '\e025';
}
.oi[data-glyph=browser]:before {
content: '\e026';
}
.oi[data-glyph=brush]:before {
content: '\e027';
}
.oi[data-glyph=bug]:before {
content: '\e028';
}
.oi[data-glyph=bullhorn]:before {
content: '\e029';
}
.oi[data-glyph=calculator]:before {
content: '\e02a';
}
.oi[data-glyph=calendar]:before {
content: '\e02b';
}
.oi[data-glyph=camera-slr]:before {
content: '\e02c';
}
.oi[data-glyph=caret-bottom]:before {
content: '\e02d';
}
.oi[data-glyph=caret-left]:before {
content: '\e02e';
}
.oi[data-glyph=caret-right]:before {
content: '\e02f';
}
.oi[data-glyph=caret-top]:before {
content: '\e030';
}
.oi[data-glyph=cart]:before {
content: '\e031';
}
.oi[data-glyph=chat]:before {
content: '\e032';
}
.oi[data-glyph=check]:before {
content: '\e033';
}
.oi[data-glyph=chevron-bottom]:before {
content: '\e034';
}
.oi[data-glyph=chevron-left]:before {
content: '\e035';
}
.oi[data-glyph=chevron-right]:before {
content: '\e036';
}
.oi[data-glyph=chevron-top]:before {
content: '\e037';
}
.oi[data-glyph=circle-check]:before {
content: '\e038';
}
.oi[data-glyph=circle-x]:before {
content: '\e039';
}
.oi[data-glyph=clipboard]:before {
content: '\e03a';
}
.oi[data-glyph=clock]:before {
content: '\e03b';
}
.oi[data-glyph=cloud-download]:before {
content: '\e03c';
}
.oi[data-glyph=cloud-upload]:before {
content: '\e03d';
}
.oi[data-glyph=cloud]:before {
content: '\e03e';
}
.oi[data-glyph=cloudy]:before {
content: '\e03f';
}
.oi[data-glyph=code]:before {
content: '\e040';
}
.oi[data-glyph=cog]:before {
content: '\e041';
}
.oi[data-glyph=collapse-down]:before {
content: '\e042';
}
.oi[data-glyph=collapse-left]:before {
content: '\e043';
}
.oi[data-glyph=collapse-right]:before {
content: '\e044';
}
.oi[data-glyph=collapse-up]:before {
content: '\e045';
}
.oi[data-glyph=command]:before {
content: '\e046';
}
.oi[data-glyph=comment-square]:before {
content: '\e047';
}
.oi[data-glyph=compass]:before {
content: '\e048';
}
.oi[data-glyph=contrast]:before {
content: '\e049';
}
.oi[data-glyph=copywriting]:before {
content: '\e04a';
}
.oi[data-glyph=credit-card]:before {
content: '\e04b';
}
.oi[data-glyph=crop]:before {
content: '\e04c';
}
.oi[data-glyph=dashboard]:before {
content: '\e04d';
}
.oi[data-glyph=data-transfer-download]:before {
content: '\e04e';
}
.oi[data-glyph=data-transfer-upload]:before {
content: '\e04f';
}
.oi[data-glyph=delete]:before {
content: '\e050';
}
.oi[data-glyph=dial]:before {
content: '\e051';
}
.oi[data-glyph=document]:before {
content: '\e052';
}
.oi[data-glyph=dollar]:before {
content: '\e053';
}
.oi[data-glyph=double-quote-sans-left]:before {
content: '\e054';
}
.oi[data-glyph=double-quote-sans-right]:before {
content: '\e055';
}
.oi[data-glyph=double-quote-serif-left]:before {
content: '\e056';
}
.oi[data-glyph=double-quote-serif-right]:before {
content: '\e057';
}
.oi[data-glyph=droplet]:before {
content: '\e058';
}
.oi[data-glyph=eject]:before {
content: '\e059';
}
.oi[data-glyph=elevator]:before {
content: '\e05a';
}
.oi[data-glyph=ellipses]:before {
content: '\e05b';
}
.oi[data-glyph=envelope-closed]:before {
content: '\e05c';
}
.oi[data-glyph=envelope-open]:before {
content: '\e05d';
}
.oi[data-glyph=euro]:before {
content: '\e05e';
}
.oi[data-glyph=excerpt]:before {
content: '\e05f';
}
.oi[data-glyph=expand-down]:before {
content: '\e060';
}
.oi[data-glyph=expand-left]:before {
content: '\e061';
}
.oi[data-glyph=expand-right]:before {
content: '\e062';
}
.oi[data-glyph=expand-up]:before {
content: '\e063';
}
.oi[data-glyph=external-link]:before {
content: '\e064';
}
.oi[data-glyph=eye]:before {
content: '\e065';
}
.oi[data-glyph=eyedropper]:before {
content: '\e066';
}
.oi[data-glyph=file]:before {
content: '\e067';
}
.oi[data-glyph=fire]:before {
content: '\e068';
}
.oi[data-glyph=flag]:before {
content: '\e069';
}
.oi[data-glyph=flash]:before {
content: '\e06a';
}
.oi[data-glyph=folder]:before {
content: '\e06b';
}
.oi[data-glyph=fork]:before {
content: '\e06c';
}
.oi[data-glyph=fullscreen-enter]:before {
content: '\e06d';
}
.oi[data-glyph=fullscreen-exit]:before {
content: '\e06e';
}
.oi[data-glyph=globe]:before {
content: '\e06f';
}
.oi[data-glyph=graph]:before {
content: '\e070';
}
.oi[data-glyph=grid-four-up]:before {
content: '\e071';
}
.oi[data-glyph=grid-three-up]:before {
content: '\e072';
}
.oi[data-glyph=grid-two-up]:before {
content: '\e073';
}
.oi[data-glyph=hard-drive]:before {
content: '\e074';
}
.oi[data-glyph=header]:before {
content: '\e075';
}
.oi[data-glyph=headphones]:before {
content: '\e076';
}
.oi[data-glyph=heart]:before {
content: '\e077';
}
.oi[data-glyph=home]:before {
content: '\e078';
}
.oi[data-glyph=image]:before {
content: '\e079';
}
.oi[data-glyph=inbox]:before {
content: '\e07a';
}
.oi[data-glyph=infinity]:before {
content: '\e07b';
}
.oi[data-glyph=info]:before {
content: '\e07c';
}
.oi[data-glyph=italic]:before {
content: '\e07d';
}
.oi[data-glyph=justify-center]:before {
content: '\e07e';
}
.oi[data-glyph=justify-left]:before {
content: '\e07f';
}
.oi[data-glyph=justify-right]:before {
content: '\e080';
}
.oi[data-glyph=key]:before {
content: '\e081';
}
.oi[data-glyph=laptop]:before {
content: '\e082';
}
.oi[data-glyph=layers]:before {
content: '\e083';
}
.oi[data-glyph=lightbulb]:before {
content: '\e084';
}
.oi[data-glyph=link-broken]:before {
content: '\e085';
}
.oi[data-glyph=link-intact]:before {
content: '\e086';
}
.oi[data-glyph=list-rich]:before {
content: '\e087';
}
.oi[data-glyph=list]:before {
content: '\e088';
}
.oi[data-glyph=location]:before {
content: '\e089';
}
.oi[data-glyph=lock-locked]:before {
content: '\e08a';
}
.oi[data-glyph=lock-unlocked]:before {
content: '\e08b';
}
.oi[data-glyph=loop-circular]:before {
content: '\e08c';
}
.oi[data-glyph=loop-square]:before {
content: '\e08d';
}
.oi[data-glyph=loop]:before {
content: '\e08e';
}
.oi[data-glyph=magnifying-glass]:before {
content: '\e08f';
}
.oi[data-glyph=map-marker]:before {
content: '\e090';
}
.oi[data-glyph=map]:before {
content: '\e091';
}
.oi[data-glyph=media-pause]:before {
content: '\e092';
}
.oi[data-glyph=media-play]:before {
content: '\e093';
}
.oi[data-glyph=media-record]:before {
content: '\e094';
}
.oi[data-glyph=media-skip-backward]:before {
content: '\e095';
}
.oi[data-glyph=media-skip-forward]:before {
content: '\e096';
}
.oi[data-glyph=media-step-backward]:before {
content: '\e097';
}
.oi[data-glyph=media-step-forward]:before {
content: '\e098';
}
.oi[data-glyph=media-stop]:before {
content: '\e099';
}
.oi[data-glyph=medical-cross]:before {
content: '\e09a';
}
.oi[data-glyph=menu]:before {
content: '\e09b';
}
.oi[data-glyph=microphone]:before {
content: '\e09c';
}
.oi[data-glyph=minus]:before {
content: '\e09d';
}
.oi[data-glyph=monitor]:before {
content: '\e09e';
}
.oi[data-glyph=moon]:before {
content: '\e09f';
}
.oi[data-glyph=move]:before {
content: '\e0a0';
}
.oi[data-glyph=musical-note]:before {
content: '\e0a1';
}
.oi[data-glyph=paperclip]:before {
content: '\e0a2';
}
.oi[data-glyph=pencil]:before {
content: '\e0a3';
}
.oi[data-glyph=people]:before {
content: '\e0a4';
}
.oi[data-glyph=person]:before {
content: '\e0a5';
}
.oi[data-glyph=phone]:before {
content: '\e0a6';
}
.oi[data-glyph=pie-chart]:before {
content: '\e0a7';
}
.oi[data-glyph=pin]:before {
content: '\e0a8';
}
.oi[data-glyph=play-circle]:before {
content: '\e0a9';
}
.oi[data-glyph=plus]:before {
content: '\e0aa';
}
.oi[data-glyph=power-standby]:before {
content: '\e0ab';
}
.oi[data-glyph=print]:before {
content: '\e0ac';
}
.oi[data-glyph=project]:before {
content: '\e0ad';
}
.oi[data-glyph=pulse]:before {
content: '\e0ae';
}
.oi[data-glyph=puzzle-piece]:before {
content: '\e0af';
}
.oi[data-glyph=question-mark]:before {
content: '\e0b0';
}
.oi[data-glyph=rain]:before {
content: '\e0b1';
}
.oi[data-glyph=random]:before {
content: '\e0b2';
}
.oi[data-glyph=reload]:before {
content: '\e0b3';
}
.oi[data-glyph=resize-both]:before {
content: '\e0b4';
}
.oi[data-glyph=resize-height]:before {
content: '\e0b5';
}
.oi[data-glyph=resize-width]:before {
content: '\e0b6';
}
.oi[data-glyph=rss-alt]:before {
content: '\e0b7';
}
.oi[data-glyph=rss]:before {
content: '\e0b8';
}
.oi[data-glyph=script]:before {
content: '\e0b9';
}
.oi[data-glyph=share-boxed]:before {
content: '\e0ba';
}
.oi[data-glyph=share]:before {
content: '\e0bb';
}
.oi[data-glyph=shield]:before {
content: '\e0bc';
}
.oi[data-glyph=signal]:before {
content: '\e0bd';
}
.oi[data-glyph=signpost]:before {
content: '\e0be';
}
.oi[data-glyph=sort-ascending]:before {
content: '\e0bf';
}
.oi[data-glyph=sort-descending]:before {
content: '\e0c0';
}
.oi[data-glyph=spreadsheet]:before {
content: '\e0c1';
}
.oi[data-glyph=star]:before {
content: '\e0c2';
}
.oi[data-glyph=sun]:before {
content: '\e0c3';
}
.oi[data-glyph=tablet]:before {
content: '\e0c4';
}
.oi[data-glyph=tag]:before {
content: '\e0c5';
}
.oi[data-glyph=tags]:before {
content: '\e0c6';
}
.oi[data-glyph=target]:before {
content: '\e0c7';
}
.oi[data-glyph=task]:before {
content: '\e0c8';
}
.oi[data-glyph=terminal]:before {
content: '\e0c9';
}
.oi[data-glyph=text]:before {
content: '\e0ca';
}
.oi[data-glyph=thumb-down]:before {
content: '\e0cb';
}
.oi[data-glyph=thumb-up]:before {
content: '\e0cc';
}
.oi[data-glyph=timer]:before {
content: '\e0cd';
}
.oi[data-glyph=transfer]:before {
content: '\e0ce';
}
.oi[data-glyph=trash]:before {
content: '\e0cf';
}
.oi[data-glyph=underline]:before {
content: '\e0d0';
}
.oi[data-glyph=vertical-align-bottom]:before {
content: '\e0d1';
}
.oi[data-glyph=vertical-align-center]:before {
content: '\e0d2';
}
.oi[data-glyph=vertical-align-top]:before {
content: '\e0d3';
}
.oi[data-glyph=video]:before {
content: '\e0d4';
}
.oi[data-glyph=volume-high]:before {
content: '\e0d5';
}
.oi[data-glyph=volume-low]:before {
content: '\e0d6';
}
.oi[data-glyph=volume-off]:before {
content: '\e0d7';
}
.oi[data-glyph=warning]:before {
content: '\e0d8';
}
.oi[data-glyph=wifi]:before {
content: '\e0d9';
}
.oi[data-glyph=wrench]:before {
content: '\e0da';
}
.oi[data-glyph=x]:before {
content: '\e0db';
}
.oi[data-glyph=yen]:before {
content: '\e0dc';
}
.oi[data-glyph=zoom-in]:before {
content: '\e0dd';
}
.oi[data-glyph=zoom-out]:before {
content: '\e0de';
}
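
The Sass version declares the same path with !default, which assigns $iconic-font-path only when the consumer has not already set it. A minimal sketch of the intended override, assuming the file above ships as open-iconic.scss (the filename is not shown in this diff); with !default the consumer's assignment must come before the import:

// Hypothetical consumer stylesheet, not part of this commit.
// Because the vendored file uses !default, this earlier assignment wins and is
// used in every url('#{$iconic-font-path}...') of the imported file.
$iconic-font-path: '/static/fonts/';  // example path; adjust to the deployment
@import 'open-iconic';                // assumed filename of the SCSS file above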

View File

@@ -0,0 +1,733 @@
@font-face
font-family 'Icons'
src url('../fonts/open-iconic.eot')
src url('../fonts/open-iconic.eot?#iconic-sm') format('embedded-opentype'), url('../fonts/open-iconic.woff') format('woff'), url('../fonts/open-iconic.ttf') format('truetype'), url('../fonts/open-iconic.otf') format('opentype'), url('../fonts/open-iconic.svg#iconic-sm') format('svg')
font-weight normal
font-style normal
.oi[data-glyph].oi-text-replace
font-size 0
line-height 0
.oi[data-glyph].oi-text-replace:before
width 1em
text-align center
.oi[data-glyph]
&:before
position relative
top 1px
font-family 'Icons'
display inline-block
speak none
line-height 1
vertical-align baseline
font-weight normal
font-style normal
-webkit-font-smoothing antialiased
-moz-osx-font-smoothing grayscale
&:empty:before
width 1em
text-align center
box-sizing content-box
&.oi-align-left:before
text-align left
&.oi-align-right:before
text-align right
&.oi-align-center:before
text-align center
&.oi-flip-horizontal:before
-webkit-transform scale(-1, 1)
-ms-transform scale(-1, 1)
transform scale(-1, 1)
&.oi-flip-vertical:before
-webkit-transform scale(1, -1)
-ms-transform scale(1, -1)
transform scale(1, -1)
&.oi-flip-horizontal-vertical:before
-webkit-transform scale(-1, -1)
-ms-transform scale(-1, -1)
transform scale(-1, -1)
.oi[data-glyph=account-login]:before
content '\e000'
.oi[data-glyph=account-logout]:before
content '\e001'
.oi[data-glyph=action-redo]:before
content '\e002'
.oi[data-glyph=action-undo]:before
content '\e003'
.oi[data-glyph=align-center]:before
content '\e004'
.oi[data-glyph=align-left]:before
content '\e005'
.oi[data-glyph=align-right]:before
content '\e006'
.oi[data-glyph=aperture]:before
content '\e007'
.oi[data-glyph=arrow-bottom]:before
content '\e008'
.oi[data-glyph=arrow-circle-bottom]:before
content '\e009'
.oi[data-glyph=arrow-circle-left]:before
content '\e00a'
.oi[data-glyph=arrow-circle-right]:before
content '\e00b'
.oi[data-glyph=arrow-circle-top]:before
content '\e00c'
.oi[data-glyph=arrow-left]:before
content '\e00d'
.oi[data-glyph=arrow-right]:before
content '\e00e'
.oi[data-glyph=arrow-thick-bottom]:before
content '\e00f'
.oi[data-glyph=arrow-thick-left]:before
content '\e010'
.oi[data-glyph=arrow-thick-right]:before
content '\e011'
.oi[data-glyph=arrow-thick-top]:before
content '\e012'
.oi[data-glyph=arrow-top]:before
content '\e013'
.oi[data-glyph=audio-spectrum]:before
content '\e014'
.oi[data-glyph=audio]:before
content '\e015'
.oi[data-glyph=badge]:before
content '\e016'
.oi[data-glyph=ban]:before
content '\e017'
.oi[data-glyph=bar-chart]:before
content '\e018'
.oi[data-glyph=basket]:before
content '\e019'
.oi[data-glyph=battery-empty]:before
content '\e01a'
.oi[data-glyph=battery-full]:before
content '\e01b'
.oi[data-glyph=beaker]:before
content '\e01c'
.oi[data-glyph=bell]:before
content '\e01d'
.oi[data-glyph=bluetooth]:before
content '\e01e'
.oi[data-glyph=bold]:before
content '\e01f'
.oi[data-glyph=bolt]:before
content '\e020'
.oi[data-glyph=book]:before
content '\e021'
.oi[data-glyph=bookmark]:before
content '\e022'
.oi[data-glyph=box]:before
content '\e023'
.oi[data-glyph=briefcase]:before
content '\e024'
.oi[data-glyph=british-pound]:before
content '\e025'
.oi[data-glyph=browser]:before
content '\e026'
.oi[data-glyph=brush]:before
content '\e027'
.oi[data-glyph=bug]:before
content '\e028'
.oi[data-glyph=bullhorn]:before
content '\e029'
.oi[data-glyph=calculator]:before
content '\e02a'
.oi[data-glyph=calendar]:before
content '\e02b'
.oi[data-glyph=camera-slr]:before
content '\e02c'
.oi[data-glyph=caret-bottom]:before
content '\e02d'
.oi[data-glyph=caret-left]:before
content '\e02e'
.oi[data-glyph=caret-right]:before
content '\e02f'
.oi[data-glyph=caret-top]:before
content '\e030'
.oi[data-glyph=cart]:before
content '\e031'
.oi[data-glyph=chat]:before
content '\e032'
.oi[data-glyph=check]:before
content '\e033'
.oi[data-glyph=chevron-bottom]:before
content '\e034'
.oi[data-glyph=chevron-left]:before
content '\e035'
.oi[data-glyph=chevron-right]:before
content '\e036'
.oi[data-glyph=chevron-top]:before
content '\e037'
.oi[data-glyph=circle-check]:before
content '\e038'
.oi[data-glyph=circle-x]:before
content '\e039'
.oi[data-glyph=clipboard]:before
content '\e03a'
.oi[data-glyph=clock]:before
content '\e03b'
.oi[data-glyph=cloud-download]:before
content '\e03c'
.oi[data-glyph=cloud-upload]:before
content '\e03d'
.oi[data-glyph=cloud]:before
content '\e03e'
.oi[data-glyph=cloudy]:before
content '\e03f'
.oi[data-glyph=code]:before
content '\e040'
.oi[data-glyph=cog]:before
content '\e041'
.oi[data-glyph=collapse-down]:before
content '\e042'
.oi[data-glyph=collapse-left]:before
content '\e043'
.oi[data-glyph=collapse-right]:before
content '\e044'
.oi[data-glyph=collapse-up]:before
content '\e045'
.oi[data-glyph=command]:before
content '\e046'
.oi[data-glyph=comment-square]:before
content '\e047'
.oi[data-glyph=compass]:before
content '\e048'
.oi[data-glyph=contrast]:before
content '\e049'
.oi[data-glyph=copywriting]:before
content '\e04a'
.oi[data-glyph=credit-card]:before
content '\e04b'
.oi[data-glyph=crop]:before
content '\e04c'
.oi[data-glyph=dashboard]:before
content '\e04d'
.oi[data-glyph=data-transfer-download]:before
content '\e04e'
.oi[data-glyph=data-transfer-upload]:before
content '\e04f'
.oi[data-glyph=delete]:before
content '\e050'
.oi[data-glyph=dial]:before
content '\e051'
.oi[data-glyph=document]:before
content '\e052'
.oi[data-glyph=dollar]:before
content '\e053'
.oi[data-glyph=double-quote-sans-left]:before
content '\e054'
.oi[data-glyph=double-quote-sans-right]:before
content '\e055'
.oi[data-glyph=double-quote-serif-left]:before
content '\e056'
.oi[data-glyph=double-quote-serif-right]:before
content '\e057'
.oi[data-glyph=droplet]:before
content '\e058'
.oi[data-glyph=eject]:before
content '\e059'
.oi[data-glyph=elevator]:before
content '\e05a'
.oi[data-glyph=ellipses]:before
content '\e05b'
.oi[data-glyph=envelope-closed]:before
content '\e05c'
.oi[data-glyph=envelope-open]:before
content '\e05d'
.oi[data-glyph=euro]:before
content '\e05e'
.oi[data-glyph=excerpt]:before
content '\e05f'
.oi[data-glyph=expand-down]:before
content '\e060'
.oi[data-glyph=expand-left]:before
content '\e061'
.oi[data-glyph=expand-right]:before
content '\e062'
.oi[data-glyph=expand-up]:before
content '\e063'
.oi[data-glyph=external-link]:before
content '\e064'
.oi[data-glyph=eye]:before
content '\e065'
.oi[data-glyph=eyedropper]:before
content '\e066'
.oi[data-glyph=file]:before
content '\e067'
.oi[data-glyph=fire]:before
content '\e068'
.oi[data-glyph=flag]:before
content '\e069'
.oi[data-glyph=flash]:before
content '\e06a'
.oi[data-glyph=folder]:before
content '\e06b'
.oi[data-glyph=fork]:before
content '\e06c'
.oi[data-glyph=fullscreen-enter]:before
content '\e06d'
.oi[data-glyph=fullscreen-exit]:before
content '\e06e'
.oi[data-glyph=globe]:before
content '\e06f'
.oi[data-glyph=graph]:before
content '\e070'
.oi[data-glyph=grid-four-up]:before
content '\e071'
.oi[data-glyph=grid-three-up]:before
content '\e072'
.oi[data-glyph=grid-two-up]:before
content '\e073'
.oi[data-glyph=hard-drive]:before
content '\e074'
.oi[data-glyph=header]:before
content '\e075'
.oi[data-glyph=headphones]:before
content '\e076'
.oi[data-glyph=heart]:before
content '\e077'
.oi[data-glyph=home]:before
content '\e078'
.oi[data-glyph=image]:before
content '\e079'
.oi[data-glyph=inbox]:before
content '\e07a'
.oi[data-glyph=infinity]:before
content '\e07b'
.oi[data-glyph=info]:before
content '\e07c'
.oi[data-glyph=italic]:before
content '\e07d'
.oi[data-glyph=justify-center]:before
content '\e07e'
.oi[data-glyph=justify-left]:before
content '\e07f'
.oi[data-glyph=justify-right]:before
content '\e080'
.oi[data-glyph=key]:before
content '\e081'
.oi[data-glyph=laptop]:before
content '\e082'
.oi[data-glyph=layers]:before
content '\e083'
.oi[data-glyph=lightbulb]:before
content '\e084'
.oi[data-glyph=link-broken]:before
content '\e085'
.oi[data-glyph=link-intact]:before
content '\e086'
.oi[data-glyph=list-rich]:before
content '\e087'
.oi[data-glyph=list]:before
content '\e088'
.oi[data-glyph=location]:before
content '\e089'
.oi[data-glyph=lock-locked]:before
content '\e08a'
.oi[data-glyph=lock-unlocked]:before
content '\e08b'
.oi[data-glyph=loop-circular]:before
content '\e08c'
.oi[data-glyph=loop-square]:before
content '\e08d'
.oi[data-glyph=loop]:before
content '\e08e'
.oi[data-glyph=magnifying-glass]:before
content '\e08f'
.oi[data-glyph=map-marker]:before
content '\e090'
.oi[data-glyph=map]:before
content '\e091'
.oi[data-glyph=media-pause]:before
content '\e092'
.oi[data-glyph=media-play]:before
content '\e093'
.oi[data-glyph=media-record]:before
content '\e094'
.oi[data-glyph=media-skip-backward]:before
content '\e095'
.oi[data-glyph=media-skip-forward]:before
content '\e096'
.oi[data-glyph=media-step-backward]:before
content '\e097'
.oi[data-glyph=media-step-forward]:before
content '\e098'
.oi[data-glyph=media-stop]:before
content '\e099'
.oi[data-glyph=medical-cross]:before
content '\e09a'
.oi[data-glyph=menu]:before
content '\e09b'
.oi[data-glyph=microphone]:before
content '\e09c'
.oi[data-glyph=minus]:before
content '\e09d'
.oi[data-glyph=monitor]:before
content '\e09e'
.oi[data-glyph=moon]:before
content '\e09f'
.oi[data-glyph=move]:before
content '\e0a0'
.oi[data-glyph=musical-note]:before
content '\e0a1'
.oi[data-glyph=paperclip]:before
content '\e0a2'
.oi[data-glyph=pencil]:before
content '\e0a3'
.oi[data-glyph=people]:before
content '\e0a4'
.oi[data-glyph=person]:before
content '\e0a5'
.oi[data-glyph=phone]:before
content '\e0a6'
.oi[data-glyph=pie-chart]:before
content '\e0a7'
.oi[data-glyph=pin]:before
content '\e0a8'
.oi[data-glyph=play-circle]:before
content '\e0a9'
.oi[data-glyph=plus]:before
content '\e0aa'
.oi[data-glyph=power-standby]:before
content '\e0ab'
.oi[data-glyph=print]:before
content '\e0ac'
.oi[data-glyph=project]:before
content '\e0ad'
.oi[data-glyph=pulse]:before
content '\e0ae'
.oi[data-glyph=puzzle-piece]:before
content '\e0af'
.oi[data-glyph=question-mark]:before
content '\e0b0'
.oi[data-glyph=rain]:before
content '\e0b1'
.oi[data-glyph=random]:before
content '\e0b2'
.oi[data-glyph=reload]:before
content '\e0b3'
.oi[data-glyph=resize-both]:before
content '\e0b4'
.oi[data-glyph=resize-height]:before
content '\e0b5'
.oi[data-glyph=resize-width]:before
content '\e0b6'
.oi[data-glyph=rss-alt]:before
content '\e0b7'
.oi[data-glyph=rss]:before
content '\e0b8'
.oi[data-glyph=script]:before
content '\e0b9'
.oi[data-glyph=share-boxed]:before
content '\e0ba'
.oi[data-glyph=share]:before
content '\e0bb'
.oi[data-glyph=shield]:before
content '\e0bc'
.oi[data-glyph=signal]:before
content '\e0bd'
.oi[data-glyph=signpost]:before
content '\e0be'
.oi[data-glyph=sort-ascending]:before
content '\e0bf'
.oi[data-glyph=sort-descending]:before
content '\e0c0'
.oi[data-glyph=spreadsheet]:before
content '\e0c1'
.oi[data-glyph=star]:before
content '\e0c2'
.oi[data-glyph=sun]:before
content '\e0c3'
.oi[data-glyph=tablet]:before
content '\e0c4'
.oi[data-glyph=tag]:before
content '\e0c5'
.oi[data-glyph=tags]:before
content '\e0c6'
.oi[data-glyph=target]:before
content '\e0c7'
.oi[data-glyph=task]:before
content '\e0c8'
.oi[data-glyph=terminal]:before
content '\e0c9'
.oi[data-glyph=text]:before
content '\e0ca'
.oi[data-glyph=thumb-down]:before
content '\e0cb'
.oi[data-glyph=thumb-up]:before
content '\e0cc'
.oi[data-glyph=timer]:before
content '\e0cd'
.oi[data-glyph=transfer]:before
content '\e0ce'
.oi[data-glyph=trash]:before
content '\e0cf'
.oi[data-glyph=underline]:before
content '\e0d0'
.oi[data-glyph=vertical-align-bottom]:before
content '\e0d1'
.oi[data-glyph=vertical-align-center]:before
content '\e0d2'
.oi[data-glyph=vertical-align-top]:before
content '\e0d3'
.oi[data-glyph=video]:before
content '\e0d4'
.oi[data-glyph=volume-high]:before
content '\e0d5'
.oi[data-glyph=volume-low]:before
content '\e0d6'
.oi[data-glyph=volume-off]:before
content '\e0d7'
.oi[data-glyph=warning]:before
content '\e0d8'
.oi[data-glyph=wifi]:before
content '\e0d9'
.oi[data-glyph=wrench]:before
content '\e0da'
.oi[data-glyph=x]:before
content '\e0db'
.oi[data-glyph=yen]:before
content '\e0dc'
.oi[data-glyph=zoom-in]:before
content '\e0dd'
.oi[data-glyph=zoom-out]:before
content '\e0de'

Binary file not shown.

Binary file not shown.

View File

@@ -0,0 +1,543 @@
<?xml version="1.0" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd" >
<!--
2014-7-1: Created.
-->
<svg xmlns="http://www.w3.org/2000/svg">
<metadata>
Created by FontForge 20120731 at Tue Jul 1 20:39:22 2014
By P.J. Onori
Created by P.J. Onori with FontForge 2.0 (http://fontforge.sf.net)
</metadata>
<defs>
<font id="open-iconic" horiz-adv-x="800" >
<font-face
font-family="Icons"
font-weight="400"
font-stretch="normal"
units-per-em="800"
panose-1="2 0 5 3 0 0 0 0 0 0"
ascent="800"
descent="0"
bbox="-0.5 -101 802 800.126"
underline-thickness="50"
underline-position="-100"
unicode-range="U+E000-E0DE"
/>
<missing-glyph />
<glyph glyph-name="" unicode="&#xe000;"
d="M300 700h500v-700h-500v100h400v500h-400v100zM400 500l200 -150l-200 -150v100h-400v100h400v100z" />
<glyph glyph-name="1" unicode="&#xe001;"
d="M300 700h500v-700h-500v100h400v500h-400v100zM200 500v-100h400v-100h-400v-100l-200 150z" />
<glyph glyph-name="2" unicode="&#xe002;"
d="M350 700c193 0 350 -157 350 -350v-50h100l-200 -200l-200 200h100v50c0 138 -112 250 -250 250s-250 -112 -250 -250c0 193 157 350 350 350z" />
<glyph glyph-name="3" unicode="&#xe003;"
d="M450 700c193 0 350 -157 350 -350c0 138 -112 250 -250 250s-250 -112 -250 -250v-50h100l-200 -200l-200 200h100v50c0 193 157 350 350 350z" />
<glyph glyph-name="4" unicode="&#xe004;"
d="M0 700h800v-100h-800v100zM100 500h600v-100h-600v100zM0 300h800v-100h-800v100zM100 100h600v-100h-600v100z" />
<glyph glyph-name="5" unicode="&#xe005;"
d="M0 700h800v-100h-800v100zM0 500h600v-100h-600v100zM0 300h800v-100h-800v100zM0 100h600v-100h-600v100z" />
<glyph glyph-name="6" unicode="&#xe006;"
d="M0 700h800v-100h-800v100zM200 500h600v-100h-600v100zM0 300h800v-100h-800v100zM200 100h600v-100h-600v100z" />
<glyph glyph-name="7" unicode="&#xe007;"
d="M400 700c75 0 146 -23 206 -59l-75 -225l-322 234c57 31 122 50 191 50zM125 588l191 -138l-310 -222c-4 24 -6 47 -6 72c0 114 49 215 125 288zM688 575c69 -72 112 -168 112 -275c0 -35 -8 -68 -16 -100h-218zM216 253l112 -347c-128 23 -232 109 -287 222zM372 100
h372c-64 -109 -177 -185 -310 -197z" />
<glyph glyph-name="8" unicode="&#xe008;" horiz-adv-x="600"
d="M200 800h100v-500h200l-247 -300l-253 300h200v500z" />
<glyph glyph-name="9" unicode="&#xe009;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM300 700v-300h-200l300 -300l300 300h-200v300h-200z" />
<glyph glyph-name="a" unicode="&#xe00a;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM400 700l-300 -300l300 -300v200h300v200h-300v200z" />
<glyph glyph-name="b" unicode="&#xe00b;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM400 700v-200h-300v-200h300v-200l300 300z" />
<glyph glyph-name="c" unicode="&#xe00c;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM400 700l-300 -300h200v-300h200v300h200z" />
<glyph glyph-name="d" unicode="&#xe00d;"
d="M300 600v-200h500v-100h-500v-200l-300 247z" />
<glyph glyph-name="e" unicode="&#xe00e;"
d="M500 600l300 -247l-300 -253v200h-500v100h500v200z" />
<glyph glyph-name="f" unicode="&#xe00f;" horiz-adv-x="600"
d="M200 800h200v-500h200l-297 -300l-303 300h200v500z" />
<glyph glyph-name="10" unicode="&#xe010;"
d="M300 700v-200h500v-200h-500v-200l-300 297z" />
<glyph glyph-name="11" unicode="&#xe011;"
d="M500 700l300 -297l-300 -303v200h-500v200h500v200z" />
<glyph glyph-name="12" unicode="&#xe012;" horiz-adv-x="600"
d="M297 800l303 -300h-200v-500h-200v500h-200z" />
<glyph glyph-name="13" unicode="&#xe013;" horiz-adv-x="600"
d="M247 800l253 -300h-200v-500h-100v500h-200z" />
<glyph glyph-name="14" unicode="&#xe014;"
d="M400 800h100v-800h-100v800zM200 700h100v-600h-100v600zM600 600h100v-400h-100v400zM0 500h100v-200h-100v200z" />
<glyph glyph-name="15" unicode="&#xe015;"
d="M116 600l72 -72c-54 -54 -88 -126 -88 -209s34 -159 88 -213l-72 -72c-72 72 -116 175 -116 285s44 209 116 281zM684 600c72 -72 116 -171 116 -281s-44 -213 -116 -285l-72 72c54 54 88 130 88 213s-34 155 -88 209zM259 460l69 -72c-18 -18 -28 -41 -28 -69
s10 -54 28 -72l-69 -72c-36 36 -59 89 -59 144s23 105 59 141zM541 459c36 -36 59 -85 59 -140s-23 -108 -59 -144l-69 72c18 18 28 44 28 72s-10 51 -28 69z" />
<glyph glyph-name="16" unicode="&#xe016;" horiz-adv-x="400"
d="M200 800c110 0 200 -90 200 -200s-90 -200 -200 -200s-200 90 -200 200s90 200 200 200zM100 319c31 -11 65 -19 100 -19s68 8 100 19v-319l-100 100l-100 -100v319z" />
<glyph glyph-name="17" unicode="&#xe017;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700c-166 0 -300 -134 -300 -300c0 -66 21 -126 56 -175l419 419c-49 35 -109 56 -175 56zM644 575l-419 -419c49 -35 109 -56 175 -56c166 0 300 134 300 300
c0 66 -21 126 -56 175z" />
<glyph glyph-name="18" unicode="&#xe018;"
d="M0 700h100v-600h700v-100h-800v700zM500 700h200v-500h-200v500zM200 500h200v-300h-200v300z" />
<glyph glyph-name="19" unicode="&#xe019;"
d="M397 800c13 1 23 -4 34 -13c2 -2 214 -254 241 -287h128v-100h-100v-366c0 -18 -16 -34 -34 -34h-532c-18 0 -34 16 -34 34v366h-100v100h128l234 281c9 11 22 18 35 19zM400 672l-144 -172h288zM250 300c-28 0 -50 -22 -50 -50v-100c0 -28 22 -50 50 -50s50 22 50 50
v100c0 28 -22 50 -50 50zM550 300c-28 0 -50 -22 -50 -50v-100c0 -28 22 -50 50 -50s50 22 50 50v100c0 28 -22 50 -50 50z" />
<glyph glyph-name="1a" unicode="&#xe01a;"
d="M9 700h682c6 0 9 -4 9 -10v-190h100v-200h-100v-191c0 -6 -3 -9 -9 -9h-682c-6 0 -9 3 -9 9v582c0 6 3 9 9 9zM100 600v-400h500v400h-500z" />
<glyph glyph-name="1b" unicode="&#xe01b;"
d="M9 700h682c6 0 9 -4 9 -10v-190h100v-200h-100v-191c0 -6 -3 -9 -9 -9h-682c-6 0 -9 3 -9 9v582c0 6 3 9 9 9z" />
<glyph glyph-name="1c" unicode="&#xe01c;"
d="M92 650c0 23 19 50 45 50h3h5h5h500c28 0 50 -22 50 -50s-22 -50 -50 -50h-50v-141c9 -17 120 -231 166 -309c16 -26 34 -61 34 -106c0 -39 -15 -77 -41 -103h-3c-26 -25 -62 -41 -100 -41h-512c-39 0 -77 15 -103 41s-41 64 -41 103c0 46 18 80 34 106
c46 78 157 292 166 309v141h-50c-2 0 -6 -1 -8 -1c-28 0 -50 23 -50 51zM500 600h-200v-162l-6 -10s-63 -123 -119 -228h450c-56 105 -119 228 -119 228l-6 10v162z" />
<glyph glyph-name="1d" unicode="&#xe01d;"
d="M400 800c110 0 200 -90 200 -200c0 -104 52 -198 134 -266c41 -34 66 -82 66 -134h-800c0 52 25 100 66 134c82 68 134 162 134 266c0 110 90 200 200 200zM300 100h200c0 -55 -45 -100 -100 -100s-100 45 -100 100z" />
<glyph glyph-name="1e" unicode="&#xe01e;" horiz-adv-x="600"
d="M150 800h50l350 -250l-225 -147l225 -153l-350 -250h-50v250l-75 -75l-75 75l150 150l-150 150l75 75l75 -75v250zM250 650v-200l150 100zM250 350v-200l150 100z" />
<glyph glyph-name="1f" unicode="&#xe01f;"
d="M0 800h500c110 0 200 -90 200 -200c0 -47 -17 -91 -44 -125c85 -40 144 -125 144 -225c0 -138 -112 -250 -250 -250h-550v100c55 0 100 45 100 100v400c0 55 -45 100 -100 100v100zM300 700v-200h100c55 0 100 45 100 100s-45 100 -100 100h-100zM300 400v-300h150
c83 0 150 67 150 150s-67 150 -150 150h-150z" />
<glyph glyph-name="20" unicode="&#xe020;" horiz-adv-x="600"
d="M300 800v-300h200l-300 -500v300h-200z" />
<glyph glyph-name="21" unicode="&#xe021;"
d="M100 800h300v-300l100 100l100 -100v300h50c28 0 50 -22 50 -50v-550h-550c-28 0 -50 -22 -50 -50s22 -50 50 -50h550v-100h-550c-83 0 -150 67 -150 150v550l3 19c8 39 39 70 78 78z" />
<glyph glyph-name="22" unicode="&#xe022;" horiz-adv-x="400"
d="M0 800h400v-800l-200 200l-200 -200v800z" />
<glyph glyph-name="23" unicode="&#xe023;"
d="M0 800h800v-100h-800v100zM0 600h300v-103h203v103h297v-591c0 -6 -3 -9 -9 -9h-782c-6 0 -9 3 -9 9v591z" />
<glyph glyph-name="24" unicode="&#xe024;"
d="M300 800h200c55 0 100 -45 100 -100v-100h191c6 0 9 -3 9 -9v-241c0 -28 -22 -50 -50 -50h-700c-28 0 -50 22 -50 50v241c0 6 3 9 9 9h191v100c0 55 45 100 100 100zM300 700v-100h200v100h-200zM0 209c16 -6 32 -9 50 -9h700c18 0 34 3 50 9v-200c0 -6 -3 -9 -9 -9h-782
c-6 0 -9 3 -9 9v200z" />
<glyph glyph-name="25" unicode="&#xe025;" horiz-adv-x="600"
d="M300 800c58 0 110 -16 147 -53s53 -89 53 -147h-100c0 39 -11 61 -25 75s-36 25 -75 25c-35 0 -55 -10 -72 -31s-28 -55 -28 -94c0 -51 20 -107 28 -175h172v-100h-178c-14 -60 -49 -127 -113 -200h491v-100h-600v122l16 12c69 69 95 121 106 166h-122v100h125
c-8 50 -25 106 -25 175c0 58 16 114 50 156c34 43 88 69 150 69z" />
<glyph glyph-name="26" unicode="&#xe026;"
d="M34 700h4h3h4h5h700c28 0 50 -22 50 -50v-700c0 -28 -22 -50 -50 -50h-700c-28 0 -50 22 -50 50v700v2c0 20 15 42 34 48zM150 600c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50zM350 600c-28 0 -50 -22 -50 -50s22 -50 50 -50h300c28 0 50 22 50 50
s-22 50 -50 50h-300zM100 400v-400h600v400h-600z" />
<glyph glyph-name="27" unicode="&#xe027;"
d="M744 797l6 -3l44 -44c4 -4 3 -8 0 -12l-266 -375l-15 -13l-25 -12c-23 72 -78 127 -150 150l12 25l13 15l375 266zM266 400c74 0 134 -60 134 -134c0 -147 -119 -266 -266 -266c-48 0 -95 12 -134 34c80 46 134 133 134 232c0 74 58 134 132 134z" />
<glyph glyph-name="28" unicode="&#xe028;"
d="M9 451c0 23 19 50 46 50c8 0 19 -3 26 -7l131 -66l29 22c-79 81 -1 250 118 250s197 -167 119 -250l28 -22l131 66c6 4 12 7 21 7c28 0 50 -22 50 -50c0 -17 -12 -37 -27 -45l-115 -56c9 -16 19 -33 25 -50h68c28 0 50 -22 50 -50s-22 -50 -50 -50h-50
c0 -23 -2 -45 -6 -66l78 -40c21 -5 37 -28 37 -49c0 -28 -22 -50 -50 -50c-10 0 -23 5 -31 11l-65 35c-24 -46 -62 -86 -103 -110c-35 19 -60 45 -60 72v135v4v5v6v5v5v87c0 28 -22 50 -50 50c-24 0 -45 -17 -50 -40c1 -3 1 -8 1 -11s0 -8 -1 -11v-82v-4v-5v-144
c0 -28 -24 -53 -59 -72c-41 25 -79 64 -103 110l-66 -35c-8 -6 -21 -11 -31 -11c-28 0 -50 22 -50 50c0 21 16 44 37 49l78 40c-4 21 -6 43 -6 66h-50h-5c-28 0 -50 22 -50 50c0 26 22 50 50 50h5h69c6 17 16 34 25 50l-116 56c-16 7 -28 27 -28 45z" />
<glyph glyph-name="29" unicode="&#xe029;"
d="M600 700h91c6 0 9 -3 9 -9v-582c0 -6 -3 -9 -9 -9h-91v600zM210 503l290 147v-500l-250 125v-3c-15 0 -25 -8 -28 -22l75 -178c11 -25 0 -58 -25 -69s-58 0 -69 25l-103 272h-91c-6 0 -9 3 -9 9v182c0 6 3 9 9 9h182z" />
<glyph glyph-name="2a" unicode="&#xe02a;"
d="M9 800h682c6 0 9 -3 9 -9v-782c0 -6 -3 -9 -9 -9h-682c-6 0 -9 3 -9 9v782c0 6 3 9 9 9zM100 700v-200h500v200h-500zM100 400v-100h100v100h-100zM300 400v-100h100v100h-100zM500 400v-300h100v300h-100zM100 200v-100h100v100h-100zM300 200v-100h100v100h-100z" />
<glyph glyph-name="2b" unicode="&#xe02b;"
d="M0 800h700v-200h-700v200zM0 500h700v-491c0 -6 -3 -9 -9 -9h-682c-6 0 -9 3 -9 9v491zM100 400v-100h100v100h-100zM300 400v-100h100v100h-100zM500 400v-100h100v100h-100zM100 200v-100h100v100h-100zM300 200v-100h100v100h-100z" />
<glyph glyph-name="2c" unicode="&#xe02c;"
d="M409 800h182c6 0 10 -4 12 -9l94 -182c2 -5 6 -9 12 -9h82c6 0 9 -3 9 -9v-582c0 -6 -3 -9 -9 -9h-782c-6 0 -9 3 -9 9v441c0 83 67 150 150 150h141c6 0 10 4 12 9l94 182c2 5 6 9 12 9zM150 500c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50z
M500 500c-110 0 -200 -90 -200 -200s90 -200 200 -200s200 90 200 200s-90 200 -200 200zM500 400c55 0 100 -45 100 -100s-45 -100 -100 -100s-100 45 -100 100s45 100 100 100z" />
<glyph glyph-name="2d" unicode="&#xe02d;"
d="M0 600h800l-400 -400z" />
<glyph glyph-name="2e" unicode="&#xe02e;" horiz-adv-x="400"
d="M400 800v-800l-400 400z" />
<glyph glyph-name="2f" unicode="&#xe02f;" horiz-adv-x="400"
d="M0 800l400 -400l-400 -400v800z" />
<glyph glyph-name="30" unicode="&#xe030;"
d="M400 600l400 -400h-800z" />
<glyph glyph-name="31" unicode="&#xe031;"
d="M0 550c0 23 20 50 46 50h3h5h4h200c17 0 37 -13 44 -28l38 -72h444c14 0 19 -12 15 -25l-81 -250c-4 -13 -21 -25 -35 -25h-350c-14 0 -30 12 -34 25c-27 83 -54 167 -81 250l-10 25h-150c-2 0 -5 -1 -7 -1c-28 0 -51 23 -51 51zM358 100c28 0 50 -22 50 -50
s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM658 100c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50z" />
<glyph glyph-name="32" unicode="&#xe032;"
d="M0 700h500v-100h-300v-300h-100l-100 -100v500zM300 500h500v-500l-100 100h-400v400z" />
<glyph glyph-name="33" unicode="&#xe033;"
d="M641 700l143 -141l-493 -493c-71 76 -146 148 -219 222l-72 71l141 141c50 -51 101 -101 153 -150c116 117 234 231 347 350z" />
<glyph glyph-name="34" unicode="&#xe034;"
d="M150 600l250 -250l250 250l150 -150l-400 -400l-400 400z" />
<glyph glyph-name="35" unicode="&#xe035;" horiz-adv-x="600"
d="M400 800l150 -150l-250 -250l250 -250l-150 -150l-400 400z" />
<glyph glyph-name="36" unicode="&#xe036;" horiz-adv-x="600"
d="M150 800l400 -400l-400 -400l-150 150l250 250l-250 250z" />
<glyph glyph-name="37" unicode="&#xe037;"
d="M400 600l400 -400l-150 -150l-250 250l-250 -250l-150 150z" />
<glyph glyph-name="38" unicode="&#xe038;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM600 622l-250 -250l-100 100l-72 -72l172 -172l322 322z" />
<glyph glyph-name="39" unicode="&#xe039;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM250 622l-72 -72l150 -150l-150 -150l72 -72l150 150l150 -150l72 72l-150 150l150 150l-72 72l-150 -150z" />
<glyph glyph-name="3a" unicode="&#xe03a;"
d="M350 800c28 0 50 -22 50 -50v-50h75c14 0 25 -11 25 -25v-75h-300v75c0 14 11 25 25 25h75v50c0 28 22 50 50 50zM25 700h75v-200h500v200h75c14 0 25 -11 25 -25v-650c0 -14 -11 -25 -25 -25h-650c-14 0 -25 11 -25 25v650c0 14 11 25 25 25z" />
<glyph glyph-name="3b" unicode="&#xe03b;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700c-166 0 -300 -134 -300 -300s134 -300 300 -300s300 134 300 300s-134 300 -300 300zM350 600h100v-181c23 -24 47 -47 72 -69l-72 -72c-27 30 -55 59 -84 88l-16 12
v222z" />
<glyph glyph-name="3c" unicode="&#xe03c;"
d="M450 800c138 0 250 -112 250 -250v-50c58 -21 100 -85 100 -150c0 -18 -3 -34 -9 -50h-191v50c0 83 -67 150 -150 150s-150 -67 -150 -150v-50h-272c-17 30 -28 63 -28 100c0 110 90 200 200 200c23 114 129 200 250 200zM434 400h3h4c3 0 6 1 9 1c28 0 50 -22 50 -50v-1
v-150h150l-200 -200l-200 200h150v150v2c0 20 15 42 34 48z" />
<glyph glyph-name="3d" unicode="&#xe03d;"
d="M450 800c138 0 250 -112 250 -250v-50c58 -21 100 -85 100 -150c0 -18 -3 -34 -9 -50h-141l-200 200l-200 -200h-222c-17 30 -28 63 -28 100c0 110 90 200 200 200c23 114 129 200 250 200zM450 350l250 -250h-200v-50c0 -28 -22 -50 -50 -50s-50 22 -50 50v50h-200z" />
<glyph glyph-name="3e" unicode="&#xe03e;"
d="M450 700c138 0 250 -112 250 -250v-50c58 -21 100 -85 100 -150c0 -83 -67 -150 -150 -150h-450c-110 0 -200 90 -200 200s90 200 200 200c23 114 129 200 250 200z" />
<glyph glyph-name="3f" unicode="&#xe03f;"
d="M250 800c82 0 154 -40 200 -100c-143 0 -270 -85 -325 -209c-36 -10 -70 -25 -100 -47c-16 33 -25 67 -25 106c0 138 112 250 250 250zM450 600c138 0 250 -112 250 -250v-50c58 -21 100 -85 100 -150c0 -83 -67 -150 -150 -150h-450c-110 0 -200 90 -200 200
s90 200 200 200c23 114 129 200 250 200z" />
<glyph glyph-name="40" unicode="&#xe040;"
d="M500 700h100l-300 -600h-100zM100 600h100l-100 -200l100 -200h-100l-100 200zM600 600h100l100 -200l-100 -200h-100l100 200z" />
<glyph glyph-name="41" unicode="&#xe041;"
d="M350 800h100l50 -119l28 -12l119 50l72 -72l-50 -119l12 -28l119 -50v-100l-119 -50l-12 -28l50 -119l-72 -72l-119 50l-28 -12l-50 -119h-100l-50 119l-28 12l-119 -50l-72 72l50 119l-12 28l-119 50v100l119 50l12 28l-50 119l72 72l119 -50l28 12zM400 550
c-83 0 -150 -67 -150 -150s67 -150 150 -150s150 67 150 150s-67 150 -150 150z" />
<glyph glyph-name="42" unicode="&#xe042;"
d="M0 800h800v-200h-800v200zM200 500h400l-200 -200zM0 100h800v-100h-800v100z" />
<glyph glyph-name="43" unicode="&#xe043;"
d="M0 800h100v-800h-100v800zM600 800h200v-800h-200v800zM500 600v-400l-200 200z" />
<glyph glyph-name="44" unicode="&#xe044;"
d="M0 800h200v-800h-200v800zM700 800h100v-800h-100v800zM300 600l200 -200l-200 -200v400z" />
<glyph glyph-name="45" unicode="&#xe045;"
d="M0 800h800v-100h-800v100zM400 500l200 -200h-400zM0 200h800v-200h-800v200z" />
<glyph glyph-name="46" unicode="&#xe046;"
d="M150 700c83 0 150 -67 150 -150v-50h100v50c0 83 67 150 150 150s150 -67 150 -150s-67 -150 -150 -150h-50v-100h50c83 0 150 -67 150 -150s-67 -150 -150 -150s-150 67 -150 150v50h-100v-50c0 -83 -67 -150 -150 -150s-150 67 -150 150s67 150 150 150h50v100h-50
c-83 0 -150 67 -150 150s67 150 150 150zM150 600c-28 0 -50 -22 -50 -50s22 -50 50 -50h50v50c0 28 -22 50 -50 50zM550 600c-28 0 -50 -22 -50 -50v-50h50c28 0 50 22 50 50s-22 50 -50 50zM300 400v-100h100v100h-100zM150 200c-28 0 -50 -22 -50 -50s22 -50 50 -50
s50 22 50 50v50h-50zM500 200v-50c0 -28 22 -50 50 -50s50 22 50 50s-22 50 -50 50h-50z" />
<glyph glyph-name="47" unicode="&#xe047;"
d="M0 791c0 5 4 9 9 9h782c6 0 9 -4 9 -10v-790l-200 200h-591c-6 0 -9 3 -9 9v582z" />
<glyph glyph-name="48" unicode="&#xe048;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700c-166 0 -300 -134 -300 -300s134 -300 300 -300s300 134 300 300s-134 300 -300 300zM600 600l-100 -300l-300 -100l100 300zM400 450c-28 0 -50 -22 -50 -50
s22 -50 50 -50s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="49" unicode="&#xe049;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700v-600c166 0 300 134 300 300s-134 300 -300 300z" />
<glyph glyph-name="4a" unicode="&#xe04a;"
d="M0 800h800v-100h-800v100zM0 600h500v-100h-500v100zM0 300h800v-100h-800v100zM0 100h600v-100h-600v100zM750 100c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50z" />
<glyph glyph-name="4b" unicode="&#xe04b;"
d="M25 700h750c14 0 25 -11 25 -25v-75h-800v75c0 14 11 25 25 25zM0 500h800v-375c0 -14 -11 -25 -25 -25h-750c-14 0 -25 11 -25 25v375zM100 300v-100h100v100h-100zM300 300v-100h100v100h-100z" />
<glyph glyph-name="4c" unicode="&#xe04c;"
d="M100 800h100v-100h450l100 100l50 -50l-100 -100v-450h100v-100h-100v-100h-100v100h-500v500h-100v100h100v100zM200 600v-350l350 350h-350zM600 550l-350 -350h350v350z" />
<glyph glyph-name="4d" unicode="&#xe04d;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700c-166 0 -300 -134 -300 -300s134 -300 300 -300s300 134 300 300s-134 300 -300 300zM400 600c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50z
M200 452c0 20 15 42 34 48h3h3h8c12 0 28 -7 36 -16l91 -90l25 6c55 0 100 -45 100 -100s-45 -100 -100 -100s-100 45 -100 100l6 25l-90 91c-9 8 -16 24 -16 36zM550 500c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50z" />
<glyph glyph-name="4e" unicode="&#xe04e;"
d="M300 800h200v-300h200l-300 -300l-300 300h200v300zM0 100h800v-100h-800v100z" />
<glyph glyph-name="4f" unicode="&#xe04f;"
d="M0 800h800v-100h-800v100zM400 600l300 -300h-200v-300h-200v300h-200z" />
<glyph glyph-name="50" unicode="&#xe050;"
d="M200 700h600v-600h-600l-200 300zM350 622l-72 -72l150 -150l-150 -150l72 -72l150 150l150 -150l72 72l-150 150l150 150l-72 72l-150 -150z" />
<glyph glyph-name="51" unicode="&#xe051;"
d="M400 700c220 0 400 -180 400 -400h-100c0 166 -134 300 -300 300s-300 -134 -300 -300h-100c0 220 180 400 400 400zM341 491l59 -88l59 88c81 -25 141 -101 141 -191c0 -110 -90 -200 -200 -200s-200 90 -200 200c0 90 60 166 141 191z" />
<glyph glyph-name="52" unicode="&#xe052;"
d="M0 800h300v-400h400v-400h-700v800zM400 800l300 -300h-300v300zM100 600v-100h100v100h-100zM100 400v-100h100v100h-100zM100 200v-100h400v100h-400z" />
<glyph glyph-name="53" unicode="&#xe053;" horiz-adv-x="600"
d="M200 700h100v-100h75c30 0 58 -6 81 -22s44 -44 44 -78v-100h-100v94c-4 3 -13 6 -25 6h-250c-14 0 -25 -11 -25 -25v-50c0 -15 20 -40 34 -44l257 -65c66 -16 109 -73 109 -141v-50c0 -68 -57 -125 -125 -125h-75v-100h-100v100h-75c-30 0 -58 6 -81 22s-44 44 -44 78
v100h100v-94c4 -3 13 -6 25 -6h250c14 0 25 11 25 25v50c0 15 -20 40 -34 44l-257 65c-66 16 -109 73 -109 141v50c0 68 57 125 125 125h75v100z" />
<glyph glyph-name="54" unicode="&#xe054;"
d="M0 700h300v-300l-300 -300v600zM500 700h300v-300l-300 -300v600z" />
<glyph glyph-name="55" unicode="&#xe055;"
d="M300 700v-600h-300v300zM800 700v-600h-300v300z" />
<glyph glyph-name="56" unicode="&#xe056;"
d="M300 700v-100c-111 0 -200 -89 -200 -200h200v-300h-300v300c0 165 135 300 300 300zM800 700v-100c-111 0 -200 -89 -200 -200h200v-300h-300v300c0 165 135 300 300 300z" />
<glyph glyph-name="57" unicode="&#xe057;"
d="M0 700h300v-300c0 -165 -135 -300 -300 -300v100c111 0 200 89 200 200h-200v300zM500 700h300v-300c0 -165 -135 -300 -300 -300v100c111 0 200 89 200 200h-200v300z" />
<glyph glyph-name="58" unicode="&#xe058;" horiz-adv-x="600"
d="M300 800l34 -34c11 -11 266 -270 266 -488c0 -165 -135 -300 -300 -300s-300 135 -300 300c0 218 255 477 266 488zM150 328c-28 0 -50 -22 -50 -50c0 -110 90 -200 200 -200c28 0 50 22 50 50s-22 50 -50 50c-55 0 -100 45 -100 100c0 28 -22 50 -50 50z" />
<glyph glyph-name="59" unicode="&#xe059;"
d="M400 800l400 -500h-800zM0 200h800v-200h-800v200z" />
<glyph glyph-name="5a" unicode="&#xe05a;" horiz-adv-x="600"
d="M300 800l300 -300h-600zM0 300h600l-300 -300z" />
<glyph glyph-name="5b" unicode="&#xe05b;"
d="M0 500h200v-200h-200v200zM300 500h200v-200h-200v200zM600 500h200v-200h-200v200z" />
<glyph glyph-name="5c" unicode="&#xe05c;"
d="M0 700h800v-100l-400 -200l-400 200v100zM0 500l400 -200l400 200v-400h-800v400z" />
<glyph glyph-name="5d" unicode="&#xe05d;"
d="M400 800l400 -200v-600h-800v600zM400 688l-300 -150v-188l300 -150l300 150v188zM200 500h400v-100l-200 -100l-200 100v100z" />
<glyph glyph-name="5e" unicode="&#xe05e;"
d="M600 700c69 0 134 -19 191 -50l-16 -106c-49 35 -109 56 -175 56c-131 0 -240 -84 -281 -200h331l-16 -100h-334c0 -36 8 -68 19 -100h297l-16 -100h-222c55 -61 133 -100 222 -100c78 0 147 30 200 78v-122c-59 -35 -127 -56 -200 -56c-147 0 -274 82 -344 200h-256
l19 100h197c-8 32 -16 66 -16 100h-200l25 100h191c45 172 198 300 384 300z" />
<glyph glyph-name="5f" unicode="&#xe05f;"
d="M0 700h700v-100h-700v100zM0 500h500v-100h-500v100zM0 300h800v-100h-800v100zM0 100h100v-100h-100v100zM200 100h100v-100h-100v100zM400 100h100v-100h-100v100z" />
<glyph glyph-name="60" unicode="&#xe060;"
d="M0 800h800v-100h-800v100zM200 600h400l-200 -200zM0 200h800v-200h-800v200z" />
<glyph glyph-name="61" unicode="&#xe061;"
d="M0 800h100v-800h-100v800zM600 800h200v-800h-200v800zM200 600l200 -200l-200 -200v400z" />
<glyph glyph-name="62" unicode="&#xe062;"
d="M0 800h200v-800h-200v800zM700 800h100v-800h-100v800zM600 600v-400l-200 200z" />
<glyph glyph-name="63" unicode="&#xe063;"
d="M0 800h800v-200h-800v200zM400 400l200 -200h-400zM0 100h800v-100h-800v100z" />
<glyph glyph-name="64" unicode="&#xe064;"
d="M0 800h200v-100h-100v-600h600v100h100v-200h-800v800zM400 800h400v-400l-150 150l-250 -250l-100 100l250 250z" />
<glyph glyph-name="65" unicode="&#xe065;"
d="M403 700c247 0 397 -300 397 -300s-150 -300 -397 -300c-253 0 -403 300 -403 300s150 300 403 300zM400 600c-110 0 -200 -90 -200 -200s90 -200 200 -200s200 90 200 200s-90 200 -200 200zM400 500c10 0 19 -3 28 -6c-16 -8 -28 -24 -28 -44c0 -28 22 -50 50 -50
c20 0 36 12 44 28c3 -9 6 -18 6 -28c0 -55 -45 -100 -100 -100s-100 45 -100 100s45 100 100 100z" />
<glyph glyph-name="66" unicode="&#xe066;" horiz-adv-x="900"
d="M331 700h3h3c3 1 7 1 10 1c12 0 29 -8 37 -17l94 -93l66 65c57 57 155 57 212 0c58 -58 58 -154 0 -212l-65 -66l93 -94c10 -8 18 -25 18 -38c0 -28 -22 -50 -50 -50c-13 0 -32 9 -40 20l-62 65l-381 -381h-269v272l375 381l-63 63c-9 8 -16 24 -16 36c0 20 16 42 35 48z
M447 481l-313 -315l128 -132l316 316z" />
<glyph glyph-name="67" unicode="&#xe067;"
d="M0 800h300v-400h400v-400h-700v800zM400 800l300 -300h-300v300z" />
<glyph glyph-name="68" unicode="&#xe068;"
d="M200 800c0 0 200 -100 200 -300s-298 -302 -200 -500c0 0 -200 100 -200 300s300 300 200 500zM500 500c0 0 200 -100 200 -300c0 -150 -60 -200 -100 -200h-300c0 200 300 300 200 500z" />
<glyph glyph-name="69" unicode="&#xe069;"
d="M0 800h100v-800h-100v800zM200 800h300v-100h300l-200 -203l200 -197h-400v100h-200v400z" />
<glyph glyph-name="6a" unicode="&#xe06a;" horiz-adv-x="400"
d="M150 800h150l-100 -200h200l-150 -300h150l-300 -300l-100 300h134l66 200h-200z" />
<glyph glyph-name="6b" unicode="&#xe06b;"
d="M0 800h300v-100h500v-100h-800v200zM0 500h800v-450c0 -28 -22 -50 -50 -50h-700c-28 0 -50 22 -50 50v450z" />
<glyph glyph-name="6c" unicode="&#xe06c;"
d="M150 800c83 0 150 -67 150 -150c0 -66 -41 -121 -100 -141v-118c15 5 33 9 50 9h200c28 0 50 22 50 50v59c-59 20 -100 75 -100 141c0 83 67 150 150 150s150 -67 150 -150c0 -66 -41 -121 -100 -141v-59c0 -82 -68 -150 -150 -150h-200c-14 0 -25 -7 -34 -16
c50 -24 84 -74 84 -134c0 -83 -67 -150 -150 -150s-150 67 -150 150c0 66 41 121 100 141v218c-59 20 -100 75 -100 141c0 83 67 150 150 150z" />
<glyph glyph-name="6d" unicode="&#xe06d;"
d="M0 800h400l-150 -150l150 -150l-100 -100l-150 150l-150 -150v400zM500 400l150 -150l150 150v-400h-400l150 150l-150 150z" />
<glyph glyph-name="6e" unicode="&#xe06e;"
d="M100 800l150 -150l150 150v-400h-400l150 150l-150 150zM400 400h400l-150 -150l150 -150l-100 -100l-150 150l-150 -150v400z" />
<glyph glyph-name="6f" unicode="&#xe06f;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM400 700c-56 0 -108 -17 -153 -44l22 -19c33 -18 13 -48 -13 -59c-30 -13 -77 10 -65 -41c13 -55 -27 -3 -47 -15c-42 -26 49 -152 31 -156l-59 34c-8 0 -13 -5 -16 -10
c1 -30 10 -57 19 -84c28 -11 77 -2 100 -25c47 -28 97 -115 75 -159c34 -13 68 -22 106 -22c101 0 193 48 247 125c3 24 -8 44 -50 44c-69 0 -156 13 -153 97c2 46 101 108 66 143c-30 30 12 39 12 66c0 37 -65 32 -69 50s20 36 41 56c-30 10 -60 19 -94 19zM631 591
c-38 -11 -94 -35 -87 -53c6 -15 52 -1 65 -13c11 -10 16 -59 44 -31l22 22v3c-11 26 -26 50 -44 72z" />
<glyph glyph-name="70" unicode="&#xe070;"
d="M703 800l97 -100l-400 -400l-100 100l-200 -203l-100 100l300 303l100 -100zM0 100h800v-100h-800v100z" />
<glyph glyph-name="71" unicode="&#xe071;"
d="M0 700h100v-100h-100v100zM200 700h100v-100h-100v100zM400 700h100v-100h-100v100zM600 700h100v-100h-100v100zM0 500h100v-100h-100v100zM200 500h100v-100h-100v100zM400 500h100v-100h-100v100zM600 500h100v-100h-100v100zM0 300h100v-100h-100v100zM200 300h100
v-100h-100v100zM400 300h100v-100h-100v100zM600 300h100v-100h-100v100zM0 100h100v-100h-100v100zM200 100h100v-100h-100v100zM400 100h100v-100h-100v100zM600 100h100v-100h-100v100z" />
<glyph glyph-name="72" unicode="&#xe072;"
d="M0 800h200v-200h-200v200zM300 800h200v-200h-200v200zM600 800h200v-200h-200v200zM0 500h200v-200h-200v200zM300 500h200v-200h-200v200zM600 500h200v-200h-200v200zM0 200h200v-200h-200v200zM300 200h200v-200h-200v200zM600 200h200v-200h-200v200z" />
<glyph glyph-name="73" unicode="&#xe073;"
d="M0 800h300v-300h-300v300zM500 800h300v-300h-300v300zM0 300h300v-300h-300v300zM500 300h300v-300h-300v300z" />
<glyph glyph-name="74" unicode="&#xe074;"
d="M19 800h662c11 0 19 -8 19 -19v-331c0 -28 -22 -50 -50 -50h-600c-28 0 -50 22 -50 50v331c0 11 8 19 19 19zM0 309c16 -6 32 -9 50 -9h600c18 0 34 3 50 9v-290c0 -11 -8 -19 -19 -19h-662c-11 0 -19 8 -19 19v290zM550 200c-28 0 -50 -22 -50 -50s22 -50 50 -50
s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="75" unicode="&#xe075;"
d="M0 700h300v-100h-50c-28 0 -50 -22 -50 -50v-150h300v150c0 28 -22 50 -50 50h-50v100h300v-100h-50c-28 0 -50 -22 -50 -50v-400c0 -28 22 -50 50 -50h50v-100h-300v100h50c28 0 50 22 50 50v150h-300v-150c0 -28 22 -50 50 -50h50v-100h-300v100h50c28 0 50 22 50 50
v400c0 28 -22 50 -50 50h-50v100z" />
<glyph glyph-name="76" unicode="&#xe076;"
d="M400 700c165 0 300 -135 300 -300v-100h50c28 0 50 -22 50 -50v-200c0 -28 -22 -50 -50 -50h-100c-28 0 -50 22 -50 50v350c0 111 -89 200 -200 200s-200 -89 -200 -200v-350c0 -28 -22 -50 -50 -50h-100c-28 0 -50 22 -50 50v200c0 28 22 50 50 50h50v100
c0 165 135 300 300 300z" />
<glyph glyph-name="77" unicode="&#xe077;"
d="M0 500c0 109 91 200 200 200s200 -91 200 -200c0 109 91 200 200 200s200 -91 200 -200c0 -55 -23 -105 -59 -141l-341 -340l-341 340c-36 36 -59 86 -59 141z" />
<glyph glyph-name="78" unicode="&#xe078;"
d="M400 700l400 -300l-100 3v-403h-200v200h-200v-200h-200v400h-100z" />
<glyph glyph-name="79" unicode="&#xe079;"
d="M0 800h800v-800h-800v800zM100 700v-300l100 100l400 -400h100v100l-200 200l100 100l100 -100v300h-600z" />
<glyph glyph-name="7a" unicode="&#xe07a;"
d="M19 800h762c11 0 19 -8 19 -19v-762c0 -11 -8 -19 -19 -19h-762c-11 0 -19 8 -19 19v762c0 11 8 19 19 19zM100 600v-300h100l100 -100h200l100 100h100v300h-600z" />
<glyph glyph-name="7b" unicode="&#xe07b;"
d="M200 600c80 0 142 -56 200 -122c58 66 119 122 200 122c131 0 200 -101 200 -200s-69 -200 -200 -200c-81 0 -142 56 -200 122c-58 -66 -121 -122 -200 -122c-131 0 -200 101 -200 200s69 200 200 200zM200 500c-74 0 -100 -54 -100 -100s26 -100 100 -100
c42 0 88 47 134 100c-46 53 -92 100 -134 100zM600 500c-43 0 -88 -47 -134 -100c46 -53 91 -100 134 -100c74 0 100 54 100 100s-26 100 -100 100z" />
<glyph glyph-name="7c" unicode="&#xe07c;" horiz-adv-x="400"
d="M300 800c55 0 100 -45 100 -100s-45 -100 -100 -100s-100 45 -100 100s45 100 100 100zM150 550c83 0 150 -69 150 -150c0 -66 -100 -214 -100 -250c0 -28 22 -50 50 -50s50 22 50 50h100c0 -83 -67 -150 -150 -150s-150 64 -150 150s100 222 100 250s-22 50 -50 50
s-50 -22 -50 -50h-100c0 83 67 150 150 150z" />
<glyph glyph-name="7d" unicode="&#xe07d;"
d="M200 800h500v-100h-122c-77 -197 -156 -392 -234 -588l-6 -12h162v-100h-500v100h122c77 197 156 392 234 588l7 12h-163v100z" />
<glyph glyph-name="7e" unicode="&#xe07e;"
d="M0 700h800v-100h-800v100zM0 500h800v-100h-800v100zM0 300h800v-100h-800v100zM100 100h600v-100h-600v100z" />
<glyph glyph-name="7f" unicode="&#xe07f;"
d="M0 700h800v-100h-800v100zM0 500h800v-100h-800v100zM0 300h800v-100h-800v100zM0 100h600v-100h-600v100z" />
<glyph glyph-name="80" unicode="&#xe080;"
d="M0 700h800v-100h-800v100zM0 500h800v-100h-800v100zM0 300h800v-100h-800v100zM200 100h600v-100h-600v100z" />
<glyph glyph-name="81" unicode="&#xe081;"
d="M550 800c138 0 250 -112 250 -250s-112 -250 -250 -250c-16 0 -32 0 -47 3l-3 -3v-100h-200v-200h-300v200l303 303c-3 15 -3 31 -3 47c0 138 112 250 250 250zM600 700c-55 0 -100 -45 -100 -100s45 -100 100 -100s100 45 100 100s-45 100 -100 100z" />
<glyph glyph-name="82" unicode="&#xe082;"
d="M134 600h3h4h4h5h500c28 0 50 -22 50 -50v-350h100v-150c0 -28 -22 -50 -50 -50h-700c-28 0 -50 22 -50 50v150h100v350v2c0 20 15 42 34 48zM200 500v-300h100v-100h200v100h100v300h-400z" />
<glyph glyph-name="83" unicode="&#xe083;"
d="M0 800h400v-400h-400v400zM500 600h100v-400h-400v100h300v300zM700 400h100v-400h-400v100h300v300z" />
<glyph glyph-name="84" unicode="&#xe084;" horiz-adv-x="600"
d="M337 694c6 4 12 7 21 7c28 0 50 -22 50 -50c0 -17 -12 -37 -27 -45l-300 -150c-8 -6 -21 -11 -31 -11c-28 0 -50 22 -50 50c0 21 16 44 37 49zM437 544c6 4 12 7 21 7c28 0 50 -22 50 -50c0 -17 -12 -37 -27 -45l-400 -200c-8 -6 -21 -11 -31 -11c-28 0 -50 22 -50 50
c0 21 16 44 37 49zM437 344c6 4 12 7 21 7c28 0 50 -22 50 -50c0 -17 -12 -37 -27 -45l-106 -56c24 -4 43 -26 43 -50c0 -28 -23 -51 -51 -51c-2 0 -6 1 -8 1h-200c-26 1 -48 24 -48 50c0 16 12 36 26 44zM151 -50c0 23 20 50 46 50h3h4h5h100c28 0 50 -22 50 -50
s-22 -50 -50 -50h-100c-2 0 -6 -1 -8 -1c-28 0 -50 23 -50 51z" />
<glyph glyph-name="85" unicode="&#xe085;"
d="M199 800h100v-200h-200v100h100v100zM586 797h1c18 1 38 1 56 -3c36 -8 69 -26 97 -54c78 -78 78 -203 0 -281l-150 -150c-8 -13 -28 -24 -43 -24c-28 0 -50 22 -50 50c0 15 11 35 24 43l150 150c40 40 39 105 0 144c-41 41 -110 34 -144 0l-44 -44
c-8 -13 -27 -24 -42 -24c-28 0 -50 22 -50 50c0 15 11 35 24 43l43 44c32 33 72 53 128 56zM208 490c4 5 14 16 22 16h3c2 0 6 1 8 1c28 0 50 -22 50 -50c0 -11 -6 -27 -14 -35l-150 -150c-40 -40 -39 -105 0 -144c41 -41 110 -34 144 0l44 44c8 13 27 24 42 24
c28 0 50 -22 50 -50c0 -15 -11 -35 -24 -43l-43 -44c-22 -22 -48 -37 -75 -47c-70 -25 -151 -9 -207 47c-78 78 -78 203 0 281zM499 200h200v-100h-100v-100h-100v200z" />
<glyph glyph-name="86" unicode="&#xe086;"
d="M586 797c18 1 39 1 57 -3c36 -8 69 -26 97 -54c78 -78 78 -203 0 -281l-150 -150c-62 -62 -132 -81 -182 -78s-69 17 -84 25s-26 27 -26 44c0 28 22 51 50 51c8 0 19 -3 26 -7c0 0 15 -11 41 -13s62 3 106 47l150 150c40 40 39 105 0 144c-41 41 -110 34 -144 0
c-8 -13 -28 -24 -43 -24c-28 0 -50 22 -50 50c0 15 11 35 24 43c32 33 72 53 128 56zM386 566c50 -2 64 -17 85 -22s37 -28 37 -49c0 -28 -22 -50 -50 -50c-10 0 -23 5 -31 11c0 0 -19 9 -47 10s-63 -4 -103 -44l-150 -150c-40 -40 -39 -105 0 -144c41 -41 110 -34 144 0
c8 13 27 24 42 24c28 0 50 -22 50 -50c0 -15 -10 -35 -23 -43c-22 -22 -48 -37 -75 -47c-70 -25 -151 -9 -207 47c-78 78 -78 203 0 281l150 150c60 60 128 78 178 76z" />
<glyph glyph-name="87" unicode="&#xe087;"
d="M0 700h300v-300h-300v300zM400 700h400v-100h-400v100zM400 500h300v-100h-300v100zM0 300h300v-300h-300v300zM400 300h400v-100h-400v100zM400 100h300v-100h-300v100z" />
<glyph glyph-name="88" unicode="&#xe088;"
d="M50 700c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM200 700h600v-100h-600v100zM50 500c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM200 500h600v-100h-600v100zM50 300c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50
s22 50 50 50zM200 300h600v-100h-600v100zM50 100c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM200 100h600v-100h-600v100z" />
<glyph glyph-name="89" unicode="&#xe089;"
d="M800 800l-400 -800l-100 300l-300 100z" />
<glyph glyph-name="8a" unicode="&#xe08a;" horiz-adv-x="600"
d="M300 700c110 0 200 -90 200 -200v-100h100v-400h-600v400h100v100c0 110 90 200 200 200zM300 600c-56 0 -100 -44 -100 -100v-100h200v100c0 56 -44 100 -100 100z" />
<glyph glyph-name="8b" unicode="&#xe08b;" horiz-adv-x="600"
d="M300 800c110 0 200 -90 200 -200v-200h100v-400h-600v400h400v200c0 56 -44 100 -100 100s-100 -44 -100 -100h-100c0 110 90 200 200 200z" />
<glyph glyph-name="8c" unicode="&#xe08c;"
d="M400 700v-100c-111 0 -200 -89 -200 -200h100l-150 -200l-150 200h100c0 165 135 300 300 300zM650 600l150 -200h-100c0 -165 -135 -300 -300 -300v100c111 0 200 89 200 200h-100z" />
<glyph glyph-name="8d" unicode="&#xe08d;"
d="M100 800h600v-300h100l-150 -250l-150 250h100v200h-400v-100h-100v200zM150 550l150 -250h-100v-200h400v100h100v-200h-600v300h-100z" />
<glyph glyph-name="8e" unicode="&#xe08e;"
d="M600 700l200 -150l-200 -150v100h-500v-100h-100v100c0 55 45 100 100 100h500v100zM200 300v-100h500v100h100v-100c0 -55 -45 -100 -100 -100h-500v-100l-200 150z" />
<glyph glyph-name="8f" unicode="&#xe08f;" horiz-adv-x="900"
d="M350 800c193 0 350 -157 350 -350c0 -60 -17 -117 -44 -166c5 -3 12 -8 16 -12l100 -100c16 -16 30 -49 30 -72c0 -56 -46 -102 -102 -102c-23 0 -56 14 -72 30l-100 100c-4 3 -9 9 -12 13c-49 -26 -107 -41 -166 -41c-193 0 -350 157 -350 350s157 350 350 350zM350 200
c142 0 250 108 250 250c0 139 -111 250 -250 250s-250 -111 -250 -250s111 -250 250 -250z" />
<glyph glyph-name="90" unicode="&#xe090;" horiz-adv-x="600"
d="M300 800c166 0 300 -134 300 -300c0 -200 -300 -500 -300 -500s-300 300 -300 500c0 166 134 300 300 300zM300 700c-110 0 -200 -90 -200 -200s90 -200 200 -200s200 90 200 200s-90 200 -200 200z" />
<glyph glyph-name="91" unicode="&#xe091;" horiz-adv-x="900"
d="M0 800h800v-541c1 -3 1 -8 1 -11s0 -7 -1 -10v-238h-800v800zM495 250c0 26 22 50 50 50h5h150v400h-600v-600h600v100h-150h-5c-28 0 -50 22 -50 50zM350 600c83 0 150 -67 150 -150c0 -100 -150 -250 -150 -250s-150 150 -150 250c0 83 67 150 150 150zM350 500
c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="92" unicode="&#xe092;" horiz-adv-x="600"
d="M0 700h200v-600h-200v600zM400 700h200v-600h-200v600z" />
<glyph glyph-name="93" unicode="&#xe093;" horiz-adv-x="600"
d="M0 700l600 -300l-600 -300v600z" />
<glyph glyph-name="94" unicode="&#xe094;" horiz-adv-x="600"
d="M300 700c166 0 300 -134 300 -300s-134 -300 -300 -300s-300 134 -300 300s134 300 300 300z" />
<glyph glyph-name="95" unicode="&#xe095;"
d="M400 700v-600l-400 300zM400 400l400 300v-600z" />
<glyph glyph-name="96" unicode="&#xe096;"
d="M0 700l400 -300l-400 -300v600zM400 100v600l400 -300z" />
<glyph glyph-name="97" unicode="&#xe097;"
d="M0 700h200v-600h-200v600zM200 400l500 300v-600z" />
<glyph glyph-name="98" unicode="&#xe098;"
d="M0 700l500 -300l-500 -300v600zM500 100v600h200v-600h-200z" />
<glyph glyph-name="99" unicode="&#xe099;" horiz-adv-x="600"
d="M0 700h600v-600h-600v600z" />
<glyph glyph-name="9a" unicode="&#xe09a;"
d="M200 800h400v-200h200v-400h-200v-200h-400v200h-200v400h200v200z" />
<glyph glyph-name="9b" unicode="&#xe09b;"
d="M0 700h800v-100h-800v100zM0 403h800v-100h-800v100zM0 103h800v-100h-800v100z" />
<glyph glyph-name="9c" unicode="&#xe09c;" horiz-adv-x="600"
d="M278 700c7 2 13 4 22 4c55 0 100 -45 100 -100v-4v-200c0 -55 -45 -100 -100 -100s-100 45 -100 100v200v2c0 44 35 88 78 98zM34 500h4h3c3 0 6 1 9 1c28 0 50 -22 50 -50v-1v-50c0 -111 89 -200 200 -200s200 89 200 200v50c0 28 22 50 50 50s50 -22 50 -50v-50
c0 -148 -109 -270 -250 -294v-106h50c55 0 100 -45 100 -100h-400c0 55 45 100 100 100h50v106c-141 24 -250 146 -250 294v50v2c0 20 15 42 34 48z" />
<glyph glyph-name="9d" unicode="&#xe09d;"
d="M0 500h800v-200h-800v200z" />
<glyph glyph-name="9e" unicode="&#xe09e;"
d="M34 700h4h3h4h5h700c28 0 50 -22 50 -50v-500c0 -28 -22 -50 -50 -50h-250v-100h100c55 0 100 -45 100 -100h-600c0 55 45 100 100 100h100v100h-250c-28 0 -50 22 -50 50v500v2c0 20 15 42 34 48zM100 600v-400h600v400h-600z" />
<glyph glyph-name="9f" unicode="&#xe09f;"
d="M272 700c-14 -40 -22 -83 -22 -128c0 -221 179 -400 400 -400c45 0 88 8 128 22c-53 -158 -202 -272 -378 -272c-221 0 -400 179 -400 400c0 176 114 325 272 378z" />
<glyph glyph-name="a0" unicode="&#xe0a0;"
d="M350 700l150 -150h-100v-150h150v100l150 -150l-150 -150v100h-150v-150h100l-150 -150l-150 150h100v150h-150v-100l-150 150l150 150v-100h150v150h-100z" />
<glyph glyph-name="a1" unicode="&#xe0a1;"
d="M800 800v-550c0 -83 -67 -150 -150 -150s-150 67 -150 150s67 150 150 150c17 0 35 -4 50 -9v206c-201 -6 -327 -27 -400 -50v-397c0 -83 -67 -150 -150 -150s-150 67 -150 150s67 150 150 150c17 0 35 -4 50 -9v409s100 100 600 100z" />
<glyph glyph-name="a2" unicode="&#xe0a2;" horiz-adv-x="700"
d="M499 700c51 0 102 -20 141 -59c78 -78 78 -203 0 -281l-250 -244c-48 -48 -127 -48 -175 0s-48 127 0 175l96 97l69 -69l-90 -94l-7 -3c-10 -10 -10 -28 0 -38s28 -10 38 0l250 247c37 40 39 102 0 141s-104 40 -144 0l-278 -275c-66 -69 -68 -179 0 -247
c69 -69 181 -69 250 0l9 12l116 113l69 -69l-125 -125c-107 -107 -281 -107 -388 0s-107 281 0 388l278 272c39 39 90 59 141 59z" />
<glyph glyph-name="a3" unicode="&#xe0a3;"
d="M600 800l200 -200l-100 -100l-200 200zM400 600l200 -200l-400 -400h-200v200z" />
<glyph glyph-name="a4" unicode="&#xe0a4;"
d="M550 800c83 0 150 -90 150 -200s-67 -200 -150 -200c-22 0 -40 8 -59 19c6 26 9 52 9 81c0 84 -27 158 -72 212c27 52 71 88 122 88zM250 700c83 0 150 -90 150 -200s-67 -200 -150 -200s-150 90 -150 200s67 200 150 200zM725 384c44 -22 75 -66 75 -118v-166h-200v66
c0 50 -17 96 -44 134c66 2 126 33 169 84zM75 284c45 -53 106 -84 175 -84s130 31 175 84c44 -22 75 -66 75 -118v-166h-500v166c0 52 31 96 75 118z" />
<glyph glyph-name="a5" unicode="&#xe0a5;"
d="M400 800c110 0 200 -112 200 -250s-90 -250 -200 -250s-200 112 -200 250s90 250 200 250zM191 300c54 -61 128 -100 209 -100s155 39 209 100c106 -5 191 -92 191 -200v-100h-800v100c0 108 85 195 191 200z" />
<glyph glyph-name="a6" unicode="&#xe0a6;" horiz-adv-x="600"
d="M19 800h462c11 0 19 -8 19 -19v-762c0 -11 -8 -19 -19 -19h-462c-11 0 -19 8 -19 19v762c0 11 8 19 19 19zM100 700v-500h300v500h-300zM250 150c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="a7" unicode="&#xe0a7;"
d="M350 800c17 0 34 -1 50 -3v-397l-297 297c63 64 150 103 247 103zM500 694c169 -25 300 -168 300 -344c0 -193 -157 -350 -350 -350c-85 0 -161 31 -222 81l272 272v341zM91 562l237 -234l-212 -212c-70 55 -116 138 -116 234c0 84 35 158 91 212z" />
<glyph glyph-name="a8" unicode="&#xe0a8;"
d="M92 650c0 23 20 50 46 50h3h4h5h400c28 0 50 -22 50 -50s-22 -50 -50 -50h-50v-200h100c55 0 100 -45 100 -100h-300v-300l-56 -100l-44 100v300h-300c0 55 45 100 100 100h100v200h-50c-2 0 -6 -1 -8 -1c-28 0 -50 23 -50 51z" />
<glyph glyph-name="a9" unicode="&#xe0a9;"
d="M400 800c221 0 400 -179 400 -400s-179 -400 -400 -400s-400 179 -400 400s179 400 400 400zM300 600v-400l300 200z" />
<glyph glyph-name="aa" unicode="&#xe0aa;"
d="M300 800h200v-300h300v-200h-300v-300h-200v300h-300v200h300v300z" />
<glyph glyph-name="ab" unicode="&#xe0ab;"
d="M300 800h100v-400h-100v400zM172 656l62 -78l-40 -31c-58 -46 -94 -117 -94 -197c0 -139 111 -250 250 -250s250 111 250 250c0 80 -39 151 -97 197l-37 31l62 78l38 -31c82 -64 134 -164 134 -275c0 -193 -157 -350 -350 -350s-350 157 -350 350c0 111 53 211 134 275z
" />
<glyph glyph-name="ac" unicode="&#xe0ac;"
d="M200 800h400v-200h-400v200zM9 500h782c6 0 9 -3 9 -9v-282c0 -6 -3 -9 -9 -9h-91v200h-600v-200h-91c-6 0 -9 3 -9 9v282c0 6 3 9 9 9zM200 300h400v-300h-400v300z" />
<glyph glyph-name="ad" unicode="&#xe0ad;"
d="M0 700h100v-700h-100v700zM700 700h100v-700h-100v700zM200 600h200v-100h-200v100zM300 400h200v-100h-200v100zM400 200h200v-100h-200v100z" />
<glyph glyph-name="ae" unicode="&#xe0ae;"
d="M325 700c42 -141 87 -280 131 -419c29 74 59 148 88 222c30 -57 58 -114 87 -172h169v-100h-231l-13 28c-37 -92 -74 -184 -112 -275c-38 129 -79 257 -119 385c-42 -133 -83 -267 -125 -400c-28 88 -56 175 -84 262h-116v100h188l9 -34l3 -6c42 137 83 273 125 409z" />
<glyph glyph-name="af" unicode="&#xe0af;"
d="M200 600c0 57 43 100 100 100s100 -43 100 -100c0 -28 -18 -48 -28 -72c-3 -6 -3 -16 -3 -28h231v-231c12 0 22 0 28 3c24 10 44 28 72 28c57 0 100 -43 100 -100s-43 -100 -100 -100c-28 0 -48 18 -72 28c-6 3 -16 3 -28 3v-231h-231c0 12 0 22 3 28c10 24 28 44 28 72
c0 57 -43 100 -100 100s-100 -43 -100 -100c0 -28 18 -48 28 -72c3 -6 3 -16 3 -28h-231v600h231c0 12 0 22 -3 28c-10 24 -28 44 -28 72z" />
<glyph glyph-name="b0" unicode="&#xe0b0;" horiz-adv-x="500"
d="M247 700c84 0 148 -20 191 -59s59 -93 59 -141c0 -117 -69 -181 -119 -225s-81 -67 -81 -150v-25h-100v25c0 117 65 181 115 225s85 67 85 150c0 25 -8 48 -28 66s-56 34 -122 34s-97 -18 -116 -37s-27 -43 -31 -69l-100 12c5 38 19 88 59 128s103 66 188 66zM197 0h100
v-100h-100v100z" />
<glyph glyph-name="b1" unicode="&#xe0b1;"
d="M450 800c138 0 250 -112 250 -250v-50c58 -21 100 -85 100 -150c0 -69 -48 -127 -112 -144c-22 55 -75 94 -138 94c-20 0 -39 -5 -56 -12c-17 64 -75 112 -144 112s-127 -48 -144 -112c-17 7 -36 12 -56 12c-37 0 -71 -12 -97 -34c-33 36 -53 82 -53 134
c0 110 90 200 200 200c23 114 129 200 250 200zM334 300h4h3c3 0 6 1 9 1c28 0 50 -22 50 -50v-1v-200c0 -28 -22 -50 -50 -50s-50 22 -50 50v200v2c0 20 15 42 34 48zM134 200h4h3c3 0 6 1 9 1c28 0 50 -22 50 -50v-1v-100c0 -28 -22 -50 -50 -50s-50 22 -50 50v100v2
c0 20 15 42 34 48zM534 200h3h4c3 0 6 1 9 1c28 0 50 -22 50 -50v-1v-100c0 -28 -22 -50 -50 -50s-50 22 -50 50v100v2c0 20 15 42 34 48z" />
<glyph glyph-name="b2" unicode="&#xe0b2;"
d="M600 800l200 -150l-200 -150v100h-50l-153 -191l175 -206l6 -3h22v100l200 -150l-200 -150v100h-25c-35 0 -56 12 -78 38l-166 190l-153 -190c-22 -27 -43 -38 -78 -38h-100v100h100l166 206l-163 191l-3 3h-100v100h100c34 0 56 -12 78 -38l153 -178l141 178
c22 27 43 38 78 38h50v100z" />
<glyph glyph-name="b3" unicode="&#xe0b3;"
d="M400 800c110 0 209 -47 281 -119l119 119v-300h-300l109 109c-54 55 -126 91 -209 91c-166 0 -300 -134 -300 -300s134 -300 300 -300c83 0 158 34 212 88l72 -72c-72 -72 -174 -116 -284 -116c-220 0 -400 180 -400 400s180 400 400 400z" />
<glyph glyph-name="b4" unicode="&#xe0b4;"
d="M400 800h400v-400l-166 166l-400 -400l166 -166h-400v400l166 -166l400 400z" />
<glyph glyph-name="b5" unicode="&#xe0b5;" horiz-adv-x="600"
d="M250 800l250 -300h-200v-200h200l-250 -300l-250 300h200v200h-200z" />
<glyph glyph-name="b6" unicode="&#xe0b6;"
d="M300 600v-200h200v200l300 -250l-300 -250v200h-200v-200l-300 250z" />
<glyph glyph-name="b7" unicode="&#xe0b7;"
d="M0 800c441 0 800 -359 800 -800h-200c0 333 -267 600 -600 600v200zM0 500c275 0 500 -225 500 -500h-200c0 167 -133 300 -300 300v200zM0 200c110 0 200 -90 200 -200h-200v200z" />
<glyph glyph-name="b8" unicode="&#xe0b8;"
d="M100 800c386 0 700 -314 700 -700h-100c0 332 -268 600 -600 600v100zM100 600c276 0 500 -224 500 -500h-100c0 222 -178 400 -400 400v100zM100 400c165 0 300 -135 300 -300h-100c0 111 -89 200 -200 200v100zM100 200c55 0 100 -45 100 -100s-45 -100 -100 -100
s-100 45 -100 100s45 100 100 100z" />
<glyph glyph-name="b9" unicode="&#xe0b9;"
d="M300 800h400c55 0 100 -45 100 -100v-200h-400v150c0 28 -22 50 -50 50s-50 -22 -50 -50v-250h400v-300c0 -55 -45 -100 -100 -100h-500c-55 0 -100 45 -100 100v200h100v-150c0 -28 22 -50 50 -50s50 22 50 50v550c0 55 45 100 100 100z" />
<glyph glyph-name="ba" unicode="&#xe0ba;"
d="M75 700h225v-100h-200v-500h400v100h100v-125c0 -41 -34 -75 -75 -75h-450c-41 0 -75 34 -75 75v550c0 41 34 75 75 75zM600 700l200 -200l-200 -200v100h-200c-94 0 -173 -65 -194 -153c23 199 189 353 394 353v100z" />
<glyph glyph-name="bb" unicode="&#xe0bb;"
d="M500 700l300 -284l-300 -316v200h-100c-200 0 -348 -102 -400 -300c0 295 100 500 500 500v200z" />
<glyph glyph-name="bc" unicode="&#xe0bc;"
d="M381 791l19 9l19 -9c127 -53 253 -108 381 -160v-31c0 -166 -67 -313 -147 -419c-40 -53 -83 -97 -125 -128s-82 -53 -128 -53s-86 22 -128 53s-85 75 -125 128c-80 107 -147 253 -147 419v31c128 52 254 107 381 160zM400 100v591l-294 -122c8 -126 58 -243 122 -328
c35 -46 73 -86 106 -110s62 -31 66 -31z" />
<glyph glyph-name="bd" unicode="&#xe0bd;"
d="M600 800h100v-800h-100v800zM400 700h100v-700h-100v700zM200 500h100v-500h-100v500zM0 300h100v-300h-100v300z" />
<glyph glyph-name="be" unicode="&#xe0be;"
d="M300 800h100v-200h200l100 -100l-100 -100h-200v-400h-100v500h-200l-100 100l100 100h200v100z" />
<glyph glyph-name="bf" unicode="&#xe0bf;"
d="M200 800h100v-600h200l-250 -200l-250 200h200v600zM400 800h200v-100h-200v100zM400 600h300v-100h-300v100zM400 400h400v-100h-400v100z" />
<glyph glyph-name="c0" unicode="&#xe0c0;"
d="M200 800h100v-600h200l-250 -200l-250 200h200v600zM400 800h400v-100h-400v100zM400 600h300v-100h-300v100zM400 400h200v-100h-200v100z" />
<glyph glyph-name="c1" unicode="&#xe0c1;"
d="M75 700h650c41 0 75 -34 75 -75v-550c0 -41 -34 -75 -75 -75h-650c-41 0 -75 34 -75 75v550c0 41 34 75 75 75zM100 600v-100h100v100h-100zM300 600v-100h400v100h-400zM100 400v-100h100v100h-100zM300 400v-100h400v100h-400zM100 200v-100h100v100h-100zM300 200
v-100h400v100h-400z" />
<glyph glyph-name="c2" unicode="&#xe0c2;"
d="M400 800l100 -300h300l-250 -200l100 -300l-250 200l-250 -200l100 300l-250 200h300z" />
<glyph glyph-name="c3" unicode="&#xe0c3;"
d="M400 800c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM150 700c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM650 700c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM400 600c110 0 200 -90 200 -200
s-90 -200 -200 -200s-200 90 -200 200s90 200 200 200zM50 450c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM750 450c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM150 200c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50
s22 50 50 50zM650 200c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50zM400 100c28 0 50 -22 50 -50s-22 -50 -50 -50s-50 22 -50 50s22 50 50 50z" />
<glyph glyph-name="c4" unicode="&#xe0c4;"
d="M34 800h632c18 0 34 -16 34 -34v-732c0 -18 -16 -34 -34 -34h-632c-18 0 -34 16 -34 34v732c0 18 16 34 34 34zM100 700v-500h500v500h-500zM350 150c-38 0 -63 -42 -44 -75s69 -33 88 0s-6 75 -44 75z" />
<glyph glyph-name="c5" unicode="&#xe0c5;"
d="M0 800h300l500 -500l-300 -300l-500 500v300zM200 700c-55 0 -100 -45 -100 -100s45 -100 100 -100s100 45 100 100s-45 100 -100 100z" />
<glyph glyph-name="c6" unicode="&#xe0c6;"
d="M0 600h200l300 -300l-200 -200l-300 300v200zM340 600h160l300 -300l-200 -200l-78 78l119 122zM150 500c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="c7" unicode="&#xe0c7;"
d="M400 800c220 0 400 -180 400 -400s-180 -400 -400 -400s-400 180 -400 400s180 400 400 400zM400 700c-166 0 -300 -134 -300 -300s134 -300 300 -300s300 134 300 300s-134 300 -300 300zM400 600c110 0 200 -90 200 -200s-90 -200 -200 -200s-200 90 -200 200
s90 200 200 200zM400 500c-56 0 -100 -44 -100 -100s44 -100 100 -100s100 44 100 100s-44 100 -100 100z" />
<glyph glyph-name="c8" unicode="&#xe0c8;"
d="M0 700h559l-100 -100h-359v-500h500v159l100 100v-359h-700v700zM700 700l100 -100l-400 -400l-200 200l100 100l100 -100z" />
<glyph glyph-name="c9" unicode="&#xe0c9;"
d="M9 800h782c6 0 9 -3 9 -9v-782c0 -6 -3 -9 -9 -9h-782c-6 0 -9 3 -9 9v782c0 6 3 9 9 9zM150 722l-72 -72l100 -100l-100 -100l72 -72l172 172zM400 500v-100h300v100h-300z" />
<glyph glyph-name="ca" unicode="&#xe0ca;"
d="M0 800h800v-200h-50c0 55 -45 100 -100 100h-150v-550c0 -28 22 -50 50 -50h50v-100h-400v100h50c28 0 50 22 50 50v550h-150c-55 0 -100 -45 -100 -100h-50v200z" />
<glyph glyph-name="cb" unicode="&#xe0cb;"
d="M0 700h100v-400h-100v400zM200 700h350c21 0 39 -13 47 -31c0 0 103 -291 103 -319s-22 -50 -50 -50h-150c-28 0 -50 -25 -50 -50s39 -158 47 -184s-5 -55 -31 -63s-52 5 -66 31s-109 219 -128 238s-44 28 -72 28v400z" />
<glyph glyph-name="cc" unicode="&#xe0cc;"
d="M400 666c10 19 28 32 47 34l19 -3c26 -8 39 -37 31 -63s-47 -159 -47 -184s22 -50 50 -50h150c28 0 50 -22 50 -50s-103 -319 -103 -319c-8 -18 -26 -31 -47 -31h-350v400c28 0 53 9 72 28s114 212 128 238zM0 400h100v-400h-100v400z" />
<glyph glyph-name="cd" unicode="&#xe0cd;"
d="M200 700h300v-100h-100v-6c25 -4 50 -8 72 -16l-34 -94c-28 11 -58 16 -88 16c-139 0 -250 -111 -250 -250s111 -250 250 -250s250 111 250 250c0 31 -5 60 -16 88l91 37c14 -38 25 -81 25 -125c0 -193 -157 -350 -350 -350s-350 157 -350 350c0 176 130 323 300 347v3
h-100v100zM700 584c0 0 -296 -348 -316 -368s-48 -20 -68 0s-20 48 0 68s384 300 384 300z" />
<glyph glyph-name="ce" unicode="&#xe0ce;"
d="M600 700l200 -150l-200 -150v100h-600v100h600v100zM200 300v-100h600v-100h-600v-100l-200 150z" />
<glyph glyph-name="cf" unicode="&#xe0cf;"
d="M300 800h100c55 0 100 -45 100 -100h100c55 0 100 -45 100 -100h-700c0 55 45 100 100 100h100c0 55 45 100 100 100zM100 500h100v-350c0 -28 22 -50 50 -50s50 22 50 50v350h100v-350c0 -28 22 -50 50 -50s50 22 50 50v350h100v-481c0 -11 -8 -19 -19 -19h-462
c-11 0 -19 8 -19 19v481z" />
<glyph glyph-name="d0" unicode="&#xe0d0;"
d="M100 800h200v-400c0 -55 45 -100 100 -100s100 45 100 100v400h100v-400c0 -110 -90 -200 -200 -200h-50c-138 0 -250 90 -250 200v400zM0 100h700v-100h-700v100z" />
<glyph glyph-name="d1" unicode="&#xe0d1;"
d="M9 700h182c6 0 9 -3 9 -9v-482c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v482c0 6 3 9 9 9zM609 700h182c6 0 9 -3 9 -9v-482c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v482c0 6 3 9 9 9zM309 500h182c6 0 9 -3 9 -9v-282c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v282
c0 6 3 9 9 9zM0 100h800v-100h-800v100z" />
<glyph glyph-name="d2" unicode="&#xe0d2;"
d="M10 700h181c6 0 9 -3 9 -9v-191h-200v191c0 6 4 9 10 9zM610 700h181c6 0 9 -3 9 -9v-191h-200v191c0 6 5 9 10 9zM310 600h181c6 0 9 -3 9 -9v-91h-200v91c0 6 4 9 10 9zM0 400h800v-100h-800v100zM0 200h200v-191c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v191zM300 200
h200v-91c0 -6 -3 -9 -9 -9h-181c-6 0 -10 3 -10 9v91zM600 200h200v-191c0 -6 -3 -9 -9 -9h-181c-6 0 -10 3 -10 9v191z" />
<glyph glyph-name="d3" unicode="&#xe0d3;"
d="M0 700h800v-100h-800v100zM9 500h182c6 0 9 -3 9 -9v-482c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v482c0 6 3 9 9 9zM309 500h182c6 0 9 -3 9 -9v-282c0 -6 -3 -9 -9 -9h-182c-6 0 -9 3 -9 9v282c0 6 3 9 9 9zM609 500h182c6 0 9 -3 9 -9v-482c0 -6 -3 -9 -9 -9h-182
c-6 0 -9 3 -9 9v482c0 6 3 9 9 9z" />
<glyph glyph-name="d4" unicode="&#xe0d4;"
d="M50 600h500c28 0 50 -22 50 -50v-150l100 100h100v-300h-100l-100 100v-150c0 -28 -22 -50 -50 -50h-500c-28 0 -50 22 -50 50v400c0 28 22 50 50 50z" />
<glyph glyph-name="d5" unicode="&#xe0d5;"
d="M334 800h66v-800h-66l-134 200h-200v400h200zM500 600v100c26 0 52 -4 75 -10c130 -33 225 -150 225 -290s-95 -258 -225 -291h-3c-23 -6 -47 -9 -72 -9v100c17 0 34 2 50 6c86 22 150 100 150 194s-64 172 -150 194c-16 4 -33 6 -50 6zM500 500l25 -3
c44 -11 75 -51 75 -97s-32 -86 -75 -97l-25 -3v200z" />
<glyph glyph-name="d6" unicode="&#xe0d6;" horiz-adv-x="600"
d="M334 800h66v-800h-66l-134 200h-200v400h200zM500 500l25 -3c44 -11 75 -51 75 -97s-32 -86 -75 -97l-25 -3v200z" />
<glyph glyph-name="d7" unicode="&#xe0d7;" horiz-adv-x="400"
d="M334 800h66v-800h-66l-134 200h-200v400h200z" />
<glyph glyph-name="d8" unicode="&#xe0d8;"
d="M309 800h82c6 0 10 -4 12 -9l294 -682l3 -19v-81c0 -6 -3 -9 -9 -9h-682c-6 0 -9 3 -9 9v81l3 19l294 682c2 5 6 9 12 9zM300 500v-200h100v200h-100zM300 200v-100h100v100h-100z" />
<glyph glyph-name="d9" unicode="&#xe0d9;"
d="M375 800c138 0 269 -39 378 -109l-53 -82c-93 60 -205 91 -325 91c-119 0 -229 -32 -322 -91l-53 82c109 70 237 109 375 109zM375 500c78 0 154 -23 216 -62l-53 -85c-46 30 -104 47 -163 47c-60 0 -112 -17 -159 -47l-54 85c62 40 134 62 213 62zM375 200
c55 0 100 -45 100 -100s-45 -100 -100 -100s-100 45 -100 100s45 100 100 100z" />
<glyph glyph-name="da" unicode="&#xe0da;" horiz-adv-x="900"
d="M551 800c16 0 32 0 47 -3l-97 -97v-200h200l97 97c3 -15 3 -31 3 -47c0 -138 -112 -250 -250 -250c-32 0 -62 8 -90 19l-288 -291c-20 -20 -46 -28 -72 -28s-52 8 -72 28c-39 39 -39 105 0 144l291 287c-11 28 -19 59 -19 91c0 138 112 250 250 250zM101 150
c-28 0 -50 -22 -50 -50s22 -50 50 -50s50 22 50 50s-22 50 -50 50z" />
<glyph glyph-name="db" unicode="&#xe0db;"
d="M141 700c84 -84 169 -167 253 -250c82 83 167 165 247 250l143 -141l-253 -253c84 -82 167 -166 253 -247l-143 -143c-81 86 -165 169 -247 253l-253 -253l-141 143c85 80 167 164 250 247c-83 84 -166 169 -250 253z" />
<glyph glyph-name="dc" unicode="&#xe0dc;"
d="M0 800h100l231 -300h38l231 300h100l-225 -300h225v-100h-300v-100h300v-100h-300v-200h-100v200h-300v100h300v100h-300v100h225z" />
<glyph glyph-name="dd" unicode="&#xe0dd;" horiz-adv-x="900"
d="M350 800c193 0 350 -157 350 -350c0 -61 -17 -119 -44 -169c4 -2 10 -6 13 -9l103 -100c16 -16 30 -49 30 -72c0 -56 -46 -102 -102 -102c-23 0 -56 14 -72 30l-100 103c-3 3 -7 9 -9 13c-50 -28 -108 -44 -169 -44c-193 0 -350 157 -350 350s157 350 350 350zM350 700
c-139 0 -250 -111 -250 -250s111 -250 250 -250c62 0 119 23 163 60c7 11 19 25 31 31l3 3c34 43 53 97 53 156c0 139 -111 250 -250 250zM300 600h100v-100h100v-100h-100v-100h-100v100h-100v100h100v100z" />
<glyph glyph-name="de" unicode="&#xe0de;" horiz-adv-x="900"
d="M350 800c193 0 350 -157 350 -350c0 -61 -17 -119 -44 -169c4 -2 10 -6 13 -9l103 -100c16 -16 30 -49 30 -72c0 -56 -46 -102 -102 -102c-23 0 -56 14 -72 30l-100 103c-3 3 -7 9 -9 13c-50 -28 -108 -44 -169 -44c-193 0 -350 157 -350 350s157 350 350 350zM350 700
c-139 0 -250 -111 -250 -250s111 -250 250 -250c62 0 119 23 163 60c7 11 19 25 31 31l3 3c34 43 53 97 53 156c0 139 -111 250 -250 250zM200 500h300v-100h-300v100z" />
</font>
</defs></svg>
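The glyphs above map to Private Use Area codepoints (U+E03B onward). As a rough, hedged illustration of how an icon font like this is typically consumed — the font-family name, file path, and class names below are assumptions for the sketch, not taken from this changeset — a stylesheet registers the font and references each glyph by its codepoint:

/* Hypothetical usage sketch; family name, URL, and class names are assumed. */
@font-face {
  font-family: "ui-icons";
  src: url("ui-icons.woff") format("woff");
}
/* Shared icon reset: render pseudo-element content with the icon font. */
[class^="icon-"]::before {
  font-family: "ui-icons";
  font-style: normal;
}
/* Each class points at one Private Use Area codepoint defined above. */
.icon-e03b::before { content: "\e03b"; }
.icon-e05e::before { content: "\e05e"; }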


Binary file not shown.

Binary file not shown.

202
static/swagger-ui/LICENSE vendored Normal file
View File

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

853
static/swagger-ui/SwaggerDark.css vendored Normal file
View File

@@ -0,0 +1,853 @@
/*!
* MIT License
*
* Copyright (c) 2020 Romans Pokrovskis
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
a { color: #8c8cfa; }
::-webkit-scrollbar-track-piece { background-color: rgba(255, 255, 255, .2) !important; }
::-webkit-scrollbar-track { background-color: rgba(255, 255, 255, .3) !important; }
::-webkit-scrollbar-thumb { background-color: rgba(255, 255, 255, .5) !important; }
embed[type="application/pdf"] { filter: invert(90%); }
html {
background: #1f1f1f !important;
box-sizing: border-box;
filter: contrast(100%) brightness(100%) saturate(100%);
overflow-y: scroll;
}
body {
background: #1f1f1f;
background-color: #1f1f1f;
background-image: none !important;
}
button, input, select, textarea {
background-color: #1f1f1f;
color: #bfbfbf;
}
font, html { color: #bfbfbf; }
.swagger-ui, .swagger-ui section h3 { color: #b5bac9; }
.swagger-ui a { background-color: transparent; }
.swagger-ui mark {
background-color: #664b00;
color: #bfbfbf;
}
.swagger-ui legend { color: inherit; }
.swagger-ui .debug * { outline: #e6da99 solid 1px; }
.swagger-ui .debug-white * { outline: #fff solid 1px; }
.swagger-ui .debug-black * { outline: #bfbfbf solid 1px; }
.swagger-ui .debug-grid { background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAgAAAAICAYAAADED76LAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyhpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuNi1jMTExIDc5LjE1ODMyNSwgMjAxNS8wOS8xMC0wMToxMDoyMCAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wTU09Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9tbS8iIHhtbG5zOnN0UmVmPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvc1R5cGUvUmVzb3VyY2VSZWYjIiB4bWxuczp4bXA9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC8iIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6MTRDOTY4N0U2N0VFMTFFNjg2MzZDQjkwNkQ4MjgwMEIiIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6MTRDOTY4N0Q2N0VFMTFFNjg2MzZDQjkwNkQ4MjgwMEIiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENDIDIwMTUgKE1hY2ludG9zaCkiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo3NjcyQkQ3NjY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo3NjcyQkQ3NzY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/PsBS+GMAAAAjSURBVHjaYvz//z8DLsD4gcGXiYEAGBIKGBne//fFpwAgwAB98AaF2pjlUQAAAABJRU5ErkJggg==) 0 0; }
.swagger-ui .debug-grid-16 { background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyhpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuNi1jMTExIDc5LjE1ODMyNSwgMjAxNS8wOS8xMC0wMToxMDoyMCAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wTU09Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9tbS8iIHhtbG5zOnN0UmVmPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvc1R5cGUvUmVzb3VyY2VSZWYjIiB4bWxuczp4bXA9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC8iIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6ODYyRjhERDU2N0YyMTFFNjg2MzZDQjkwNkQ4MjgwMEIiIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6ODYyRjhERDQ2N0YyMTFFNjg2MzZDQjkwNkQ4MjgwMEIiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENDIDIwMTUgKE1hY2ludG9zaCkiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo3NjcyQkQ3QTY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo3NjcyQkQ3QjY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/PvCS01IAAABMSURBVHjaYmR4/5+BFPBfAMFm/MBgx8RAGWCn1AAmSg34Q6kBDKMGMDCwICeMIemF/5QawEipAWwUhwEjMDvbAWlWkvVBwu8vQIABAEwBCph8U6c0AAAAAElFTkSuQmCC) 0 0; }
.swagger-ui .debug-grid-8-solid { background: url(data:image/jpeg;base64,/9j/4QAYRXhpZgAASUkqAAgAAAAAAAAAAAAAAP/sABFEdWNreQABAAQAAAAAAAD/4QMxaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wLwA8P3hwYWNrZXQgYmVnaW49Iu+7vyIgaWQ9Ilc1TTBNcENlaGlIenJlU3pOVGN6a2M5ZCI/PiA8eDp4bXBtZXRhIHhtbG5zOng9ImFkb2JlOm5zOm1ldGEvIiB4OnhtcHRrPSJBZG9iZSBYTVAgQ29yZSA1LjYtYzExMSA3OS4xNTgzMjUsIDIwMTUvMDkvMTAtMDE6MTA6MjAgICAgICAgICI+IDxyZGY6UkRGIHhtbG5zOnJkZj0iaHR0cDovL3d3dy53My5vcmcvMTk5OS8wMi8yMi1yZGYtc3ludGF4LW5zIyI+IDxyZGY6RGVzY3JpcHRpb24gcmRmOmFib3V0PSIiIHhtbG5zOnhtcD0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wLyIgeG1sbnM6eG1wTU09Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9tbS8iIHhtbG5zOnN0UmVmPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvc1R5cGUvUmVzb3VyY2VSZWYjIiB4bXA6Q3JlYXRvclRvb2w9IkFkb2JlIFBob3Rvc2hvcCBDQyAyMDE1IChNYWNpbnRvc2gpIiB4bXBNTTpJbnN0YW5jZUlEPSJ4bXAuaWlkOkIxMjI0OTczNjdCMzExRTZCMkJDRTI0MDgxMDAyMTcxIiB4bXBNTTpEb2N1bWVudElEPSJ4bXAuZGlkOkIxMjI0OTc0NjdCMzExRTZCMkJDRTI0MDgxMDAyMTcxIj4gPHhtcE1NOkRlcml2ZWRGcm9tIHN0UmVmOmluc3RhbmNlSUQ9InhtcC5paWQ6QjEyMjQ5NzE2N0IzMTFFNkIyQkNFMjQwODEwMDIxNzEiIHN0UmVmOmRvY3VtZW50SUQ9InhtcC5kaWQ6QjEyMjQ5NzI2N0IzMTFFNkIyQkNFMjQwODEwMDIxNzEiLz4gPC9yZGY6RGVzY3JpcHRpb24+IDwvcmRmOlJERj4gPC94OnhtcG1ldGE+IDw/eHBhY2tldCBlbmQ9InIiPz7/7gAOQWRvYmUAZMAAAAAB/9sAhAAbGhopHSlBJiZBQi8vL0JHPz4+P0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHAR0pKTQmND8oKD9HPzU/R0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0dHR0f/wAARCAAIAAgDASIAAhEBAxEB/8QAWQABAQAAAAAAAAAAAAAAAAAAAAYBAQEAAAAAAAAAAAAAAAAAAAIEEAEBAAMBAAAAAAAAAAAAAAABADECA0ERAAEDBQAAAAAAAAAAAAAAAAARITFBUWESIv/aAAwDAQACEQMRAD8AoOnTV1QTD7JJshP3vSM3P//Z) 0 0 #1c1c21; }
.swagger-ui .debug-grid-16-solid { background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAAGXRFWHRTb2Z0d2FyZQBBZG9iZSBJbWFnZVJlYWR5ccllPAAAAyhpVFh0WE1MOmNvbS5hZG9iZS54bXAAAAAAADw/eHBhY2tldCBiZWdpbj0i77u/IiBpZD0iVzVNME1wQ2VoaUh6cmVTek5UY3prYzlkIj8+IDx4OnhtcG1ldGEgeG1sbnM6eD0iYWRvYmU6bnM6bWV0YS8iIHg6eG1wdGs9IkFkb2JlIFhNUCBDb3JlIDUuNi1jMTExIDc5LjE1ODMyNSwgMjAxNS8wOS8xMC0wMToxMDoyMCAgICAgICAgIj4gPHJkZjpSREYgeG1sbnM6cmRmPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5LzAyLzIyLXJkZi1zeW50YXgtbnMjIj4gPHJkZjpEZXNjcmlwdGlvbiByZGY6YWJvdXQ9IiIgeG1sbnM6eG1wPSJodHRwOi8vbnMuYWRvYmUuY29tL3hhcC8xLjAvIiB4bWxuczp4bXBNTT0iaHR0cDovL25zLmFkb2JlLmNvbS94YXAvMS4wL21tLyIgeG1sbnM6c3RSZWY9Imh0dHA6Ly9ucy5hZG9iZS5jb20veGFwLzEuMC9zVHlwZS9SZXNvdXJjZVJlZiMiIHhtcDpDcmVhdG9yVG9vbD0iQWRvYmUgUGhvdG9zaG9wIENDIDIwMTUgKE1hY2ludG9zaCkiIHhtcE1NOkluc3RhbmNlSUQ9InhtcC5paWQ6NzY3MkJEN0U2N0M1MTFFNkIyQkNFMjQwODEwMDIxNzEiIHhtcE1NOkRvY3VtZW50SUQ9InhtcC5kaWQ6NzY3MkJEN0Y2N0M1MTFFNkIyQkNFMjQwODEwMDIxNzEiPiA8eG1wTU06RGVyaXZlZEZyb20gc3RSZWY6aW5zdGFuY2VJRD0ieG1wLmlpZDo3NjcyQkQ3QzY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIgc3RSZWY6ZG9jdW1lbnRJRD0ieG1wLmRpZDo3NjcyQkQ3RDY3QzUxMUU2QjJCQ0UyNDA4MTAwMjE3MSIvPiA8L3JkZjpEZXNjcmlwdGlvbj4gPC9yZGY6UkRGPiA8L3g6eG1wbWV0YT4gPD94cGFja2V0IGVuZD0iciI/Pve6J3kAAAAzSURBVHjaYvz//z8D0UDsMwMjSRoYP5Gq4SPNbRjVMEQ1fCRDg+in/6+J1AJUxsgAEGAA31BAJMS0GYEAAAAASUVORK5CYII=) 0 0 #1c1c21; }
.swagger-ui .b--black { border-color: #000; }
.swagger-ui .b--near-black { border-color: #121212; }
.swagger-ui .b--dark-gray { border-color: #333; }
.swagger-ui .b--mid-gray { border-color: #545454; }
.swagger-ui .b--gray { border-color: #787878; }
.swagger-ui .b--silver { border-color: #999; }
.swagger-ui .b--light-silver { border-color: #6e6e6e; }
.swagger-ui .b--moon-gray { border-color: #4d4d4d; }
.swagger-ui .b--light-gray { border-color: #2b2b2b; }
.swagger-ui .b--near-white { border-color: #242424; }
.swagger-ui .b--white { border-color: #1c1c21; }
.swagger-ui .b--white-90 { border-color: rgba(28, 28, 33, .9); }
.swagger-ui .b--white-80 { border-color: rgba(28, 28, 33, .8); }
.swagger-ui .b--white-70 { border-color: rgba(28, 28, 33, .7); }
.swagger-ui .b--white-60 { border-color: rgba(28, 28, 33, .6); }
.swagger-ui .b--white-50 { border-color: rgba(28, 28, 33, .5); }
.swagger-ui .b--white-40 { border-color: rgba(28, 28, 33, .4); }
.swagger-ui .b--white-30 { border-color: rgba(28, 28, 33, .3); }
.swagger-ui .b--white-20 { border-color: rgba(28, 28, 33, .2); }
.swagger-ui .b--white-10 { border-color: rgba(28, 28, 33, .1); }
.swagger-ui .b--white-05 { border-color: rgba(28, 28, 33, .05); }
.swagger-ui .b--white-025 { border-color: rgba(28, 28, 33, .024); }
.swagger-ui .b--white-0125 { border-color: rgba(28, 28, 33, .01); }
.swagger-ui .b--black-90 { border-color: rgba(0, 0, 0, .9); }
.swagger-ui .b--black-80 { border-color: rgba(0, 0, 0, .8); }
.swagger-ui .b--black-70 { border-color: rgba(0, 0, 0, .7); }
.swagger-ui .b--black-60 { border-color: rgba(0, 0, 0, .6); }
.swagger-ui .b--black-50 { border-color: rgba(0, 0, 0, .5); }
.swagger-ui .b--black-40 { border-color: rgba(0, 0, 0, .4); }
.swagger-ui .b--black-30 { border-color: rgba(0, 0, 0, .3); }
.swagger-ui .b--black-20 { border-color: rgba(0, 0, 0, .2); }
.swagger-ui .b--black-10 { border-color: rgba(0, 0, 0, .1); }
.swagger-ui .b--black-05 { border-color: rgba(0, 0, 0, .05); }
.swagger-ui .b--black-025 { border-color: rgba(0, 0, 0, .024); }
.swagger-ui .b--black-0125 { border-color: rgba(0, 0, 0, .01); }
.swagger-ui .b--dark-red { border-color: #bc2f36; }
.swagger-ui .b--red { border-color: #c83932; }
.swagger-ui .b--light-red { border-color: #ab3c2b; }
.swagger-ui .b--orange { border-color: #cc6e33; }
.swagger-ui .b--purple { border-color: #5e2ca5; }
.swagger-ui .b--light-purple { border-color: #672caf; }
.swagger-ui .b--dark-pink { border-color: #ab2b81; }
.swagger-ui .b--hot-pink { border-color: #c03086; }
.swagger-ui .b--pink { border-color: #8f2464; }
.swagger-ui .b--light-pink { border-color: #721d4d; }
.swagger-ui .b--dark-green { border-color: #1c6e50; }
.swagger-ui .b--green { border-color: #279b70; }
.swagger-ui .b--light-green { border-color: #228762; }
.swagger-ui .b--navy { border-color: #0d1d35; }
.swagger-ui .b--dark-blue { border-color: #20497e; }
.swagger-ui .b--blue { border-color: #4380d0; }
.swagger-ui .b--light-blue { border-color: #20517e; }
.swagger-ui .b--lightest-blue { border-color: #143a52; }
.swagger-ui .b--washed-blue { border-color: #0c312d; }
.swagger-ui .b--washed-green { border-color: #0f3d2c; }
.swagger-ui .b--washed-red { border-color: #411010; }
.swagger-ui .b--transparent { border-color: transparent; }
.swagger-ui .b--gold, .swagger-ui .b--light-yellow, .swagger-ui .b--washed-yellow, .swagger-ui .b--yellow { border-color: #664b00; }
.swagger-ui .shadow-1 { box-shadow: rgba(0, 0, 0, .2) 0 0 4px 2px; }
.swagger-ui .shadow-2 { box-shadow: rgba(0, 0, 0, .2) 0 0 8px 2px; }
.swagger-ui .shadow-3 { box-shadow: rgba(0, 0, 0, .2) 2px 2px 4px 2px; }
.swagger-ui .shadow-4 { box-shadow: rgba(0, 0, 0, .2) 2px 2px 8px 0; }
.swagger-ui .shadow-5 { box-shadow: rgba(0, 0, 0, .2) 4px 4px 8px 0; }
@media screen and (min-width: 30em) {
.swagger-ui .shadow-1-ns { box-shadow: rgba(0, 0, 0, .2) 0 0 4px 2px; }
.swagger-ui .shadow-2-ns { box-shadow: rgba(0, 0, 0, .2) 0 0 8px 2px; }
.swagger-ui .shadow-3-ns { box-shadow: rgba(0, 0, 0, .2) 2px 2px 4px 2px; }
.swagger-ui .shadow-4-ns { box-shadow: rgba(0, 0, 0, .2) 2px 2px 8px 0; }
.swagger-ui .shadow-5-ns { box-shadow: rgba(0, 0, 0, .2) 4px 4px 8px 0; }
}
@media screen and (max-width: 60em) and (min-width: 30em) {
.swagger-ui .shadow-1-m { box-shadow: rgba(0, 0, 0, .2) 0 0 4px 2px; }
.swagger-ui .shadow-2-m { box-shadow: rgba(0, 0, 0, .2) 0 0 8px 2px; }
.swagger-ui .shadow-3-m { box-shadow: rgba(0, 0, 0, .2) 2px 2px 4px 2px; }
.swagger-ui .shadow-4-m { box-shadow: rgba(0, 0, 0, .2) 2px 2px 8px 0; }
.swagger-ui .shadow-5-m { box-shadow: rgba(0, 0, 0, .2) 4px 4px 8px 0; }
}
@media screen and (min-width: 60em) {
.swagger-ui .shadow-1-l { box-shadow: rgba(0, 0, 0, .2) 0 0 4px 2px; }
.swagger-ui .shadow-2-l { box-shadow: rgba(0, 0, 0, .2) 0 0 8px 2px; }
.swagger-ui .shadow-3-l { box-shadow: rgba(0, 0, 0, .2) 2px 2px 4px 2px; }
.swagger-ui .shadow-4-l { box-shadow: rgba(0, 0, 0, .2) 2px 2px 8px 0; }
.swagger-ui .shadow-5-l { box-shadow: rgba(0, 0, 0, .2) 4px 4px 8px 0; }
}
.swagger-ui .black-05 { color: rgba(191, 191, 191, .05); }
.swagger-ui .bg-black-05 { background-color: rgba(0, 0, 0, .05); }
.swagger-ui .black-90, .swagger-ui .hover-black-90:focus, .swagger-ui .hover-black-90:hover { color: rgba(191, 191, 191, .9); }
.swagger-ui .black-80, .swagger-ui .hover-black-80:focus, .swagger-ui .hover-black-80:hover { color: rgba(191, 191, 191, .8); }
.swagger-ui .black-70, .swagger-ui .hover-black-70:focus, .swagger-ui .hover-black-70:hover { color: rgba(191, 191, 191, .7); }
.swagger-ui .black-60, .swagger-ui .hover-black-60:focus, .swagger-ui .hover-black-60:hover { color: rgba(191, 191, 191, .6); }
.swagger-ui .black-50, .swagger-ui .hover-black-50:focus, .swagger-ui .hover-black-50:hover { color: rgba(191, 191, 191, .5); }
.swagger-ui .black-40, .swagger-ui .hover-black-40:focus, .swagger-ui .hover-black-40:hover { color: rgba(191, 191, 191, .4); }
.swagger-ui .black-30, .swagger-ui .hover-black-30:focus, .swagger-ui .hover-black-30:hover { color: rgba(191, 191, 191, .3); }
.swagger-ui .black-20, .swagger-ui .hover-black-20:focus, .swagger-ui .hover-black-20:hover { color: rgba(191, 191, 191, .2); }
.swagger-ui .black-10, .swagger-ui .hover-black-10:focus, .swagger-ui .hover-black-10:hover { color: rgba(191, 191, 191, .1); }
.swagger-ui .hover-white-90:focus, .swagger-ui .hover-white-90:hover, .swagger-ui .white-90 { color: rgba(255, 255, 255, .9); }
.swagger-ui .hover-white-80:focus, .swagger-ui .hover-white-80:hover, .swagger-ui .white-80 { color: rgba(255, 255, 255, .8); }
.swagger-ui .hover-white-70:focus, .swagger-ui .hover-white-70:hover, .swagger-ui .white-70 { color: rgba(255, 255, 255, .7); }
.swagger-ui .hover-white-60:focus, .swagger-ui .hover-white-60:hover, .swagger-ui .white-60 { color: rgba(255, 255, 255, .6); }
.swagger-ui .hover-white-50:focus, .swagger-ui .hover-white-50:hover, .swagger-ui .white-50 { color: rgba(255, 255, 255, .5); }
.swagger-ui .hover-white-40:focus, .swagger-ui .hover-white-40:hover, .swagger-ui .white-40 { color: rgba(255, 255, 255, .4); }
.swagger-ui .hover-white-30:focus, .swagger-ui .hover-white-30:hover, .swagger-ui .white-30 { color: rgba(255, 255, 255, .3); }
.swagger-ui .hover-white-20:focus, .swagger-ui .hover-white-20:hover, .swagger-ui .white-20 { color: rgba(255, 255, 255, .2); }
.swagger-ui .hover-white-10:focus, .swagger-ui .hover-white-10:hover, .swagger-ui .white-10 { color: rgba(255, 255, 255, .1); }
.swagger-ui .hover-moon-gray:focus, .swagger-ui .hover-moon-gray:hover, .swagger-ui .moon-gray { color: #ccc; }
.swagger-ui .hover-light-gray:focus, .swagger-ui .hover-light-gray:hover, .swagger-ui .light-gray { color: #ededed; }
.swagger-ui .hover-near-white:focus, .swagger-ui .hover-near-white:hover, .swagger-ui .near-white { color: #f5f5f5; }
.swagger-ui .dark-red, .swagger-ui .hover-dark-red:focus, .swagger-ui .hover-dark-red:hover { color: #e6999d; }
.swagger-ui .hover-red:focus, .swagger-ui .hover-red:hover, .swagger-ui .red { color: #e69d99; }
.swagger-ui .hover-light-red:focus, .swagger-ui .hover-light-red:hover, .swagger-ui .light-red { color: #e6a399; }
.swagger-ui .hover-orange:focus, .swagger-ui .hover-orange:hover, .swagger-ui .orange { color: #e6b699; }
.swagger-ui .gold, .swagger-ui .hover-gold:focus, .swagger-ui .hover-gold:hover { color: #e6d099; }
.swagger-ui .hover-yellow:focus, .swagger-ui .hover-yellow:hover, .swagger-ui .yellow { color: #e6da99; }
.swagger-ui .hover-light-yellow:focus, .swagger-ui .hover-light-yellow:hover, .swagger-ui .light-yellow { color: #ede6b6; }
.swagger-ui .hover-purple:focus, .swagger-ui .hover-purple:hover, .swagger-ui .purple { color: #b99ae4; }
.swagger-ui .hover-light-purple:focus, .swagger-ui .hover-light-purple:hover, .swagger-ui .light-purple { color: #bb99e6; }
.swagger-ui .dark-pink, .swagger-ui .hover-dark-pink:focus, .swagger-ui .hover-dark-pink:hover { color: #e699cc; }
.swagger-ui .hot-pink, .swagger-ui .hover-hot-pink:focus, .swagger-ui .hover-hot-pink:hover, .swagger-ui .hover-pink:focus, .swagger-ui .hover-pink:hover, .swagger-ui .pink { color: #e699c7; }
.swagger-ui .hover-light-pink:focus, .swagger-ui .hover-light-pink:hover, .swagger-ui .light-pink { color: #edb6d5; }
.swagger-ui .dark-green, .swagger-ui .green, .swagger-ui .hover-dark-green:focus, .swagger-ui .hover-dark-green:hover, .swagger-ui .hover-green:focus, .swagger-ui .hover-green:hover { color: #99e6c9; }
.swagger-ui .hover-light-green:focus, .swagger-ui .hover-light-green:hover, .swagger-ui .light-green { color: #a1e8ce; }
.swagger-ui .hover-navy:focus, .swagger-ui .hover-navy:hover, .swagger-ui .navy { color: #99b8e6; }
.swagger-ui .blue, .swagger-ui .dark-blue, .swagger-ui .hover-blue:focus, .swagger-ui .hover-blue:hover, .swagger-ui .hover-dark-blue:focus, .swagger-ui .hover-dark-blue:hover { color: #99bae6; }
.swagger-ui .hover-light-blue:focus, .swagger-ui .hover-light-blue:hover, .swagger-ui .light-blue { color: #a9cbea; }
.swagger-ui .hover-lightest-blue:focus, .swagger-ui .hover-lightest-blue:hover, .swagger-ui .lightest-blue { color: #d6e9f5; }
.swagger-ui .hover-washed-blue:focus, .swagger-ui .hover-washed-blue:hover, .swagger-ui .washed-blue { color: #f7fdfc; }
.swagger-ui .hover-washed-green:focus, .swagger-ui .hover-washed-green:hover, .swagger-ui .washed-green { color: #ebfaf4; }
.swagger-ui .hover-washed-yellow:focus, .swagger-ui .hover-washed-yellow:hover, .swagger-ui .washed-yellow { color: #fbf9ef; }
.swagger-ui .hover-washed-red:focus, .swagger-ui .hover-washed-red:hover, .swagger-ui .washed-red { color: #f9e7e7; }
.swagger-ui .color-inherit, .swagger-ui .hover-inherit:focus, .swagger-ui .hover-inherit:hover { color: inherit; }
.swagger-ui .bg-black-90, .swagger-ui .hover-bg-black-90:focus, .swagger-ui .hover-bg-black-90:hover { background-color: rgba(0, 0, 0, .9); }
.swagger-ui .bg-black-80, .swagger-ui .hover-bg-black-80:focus, .swagger-ui .hover-bg-black-80:hover { background-color: rgba(0, 0, 0, .8); }
.swagger-ui .bg-black-70, .swagger-ui .hover-bg-black-70:focus, .swagger-ui .hover-bg-black-70:hover { background-color: rgba(0, 0, 0, .7); }
.swagger-ui .bg-black-60, .swagger-ui .hover-bg-black-60:focus, .swagger-ui .hover-bg-black-60:hover { background-color: rgba(0, 0, 0, .6); }
.swagger-ui .bg-black-50, .swagger-ui .hover-bg-black-50:focus, .swagger-ui .hover-bg-black-50:hover { background-color: rgba(0, 0, 0, .5); }
.swagger-ui .bg-black-40, .swagger-ui .hover-bg-black-40:focus, .swagger-ui .hover-bg-black-40:hover { background-color: rgba(0, 0, 0, .4); }
.swagger-ui .bg-black-30, .swagger-ui .hover-bg-black-30:focus, .swagger-ui .hover-bg-black-30:hover { background-color: rgba(0, 0, 0, .3); }
.swagger-ui .bg-black-20, .swagger-ui .hover-bg-black-20:focus, .swagger-ui .hover-bg-black-20:hover { background-color: rgba(0, 0, 0, .2); }
.swagger-ui .bg-white-90, .swagger-ui .hover-bg-white-90:focus, .swagger-ui .hover-bg-white-90:hover { background-color: rgba(28, 28, 33, .9); }
.swagger-ui .bg-white-80, .swagger-ui .hover-bg-white-80:focus, .swagger-ui .hover-bg-white-80:hover { background-color: rgba(28, 28, 33, .8); }
.swagger-ui .bg-white-70, .swagger-ui .hover-bg-white-70:focus, .swagger-ui .hover-bg-white-70:hover { background-color: rgba(28, 28, 33, .7); }
.swagger-ui .bg-white-60, .swagger-ui .hover-bg-white-60:focus, .swagger-ui .hover-bg-white-60:hover { background-color: rgba(28, 28, 33, .6); }
.swagger-ui .bg-white-50, .swagger-ui .hover-bg-white-50:focus, .swagger-ui .hover-bg-white-50:hover { background-color: rgba(28, 28, 33, .5); }
.swagger-ui .bg-white-40, .swagger-ui .hover-bg-white-40:focus, .swagger-ui .hover-bg-white-40:hover { background-color: rgba(28, 28, 33, .4); }
.swagger-ui .bg-white-30, .swagger-ui .hover-bg-white-30:focus, .swagger-ui .hover-bg-white-30:hover { background-color: rgba(28, 28, 33, .3); }
.swagger-ui .bg-white-20, .swagger-ui .hover-bg-white-20:focus, .swagger-ui .hover-bg-white-20:hover { background-color: rgba(28, 28, 33, .2); }
.swagger-ui .bg-black, .swagger-ui .hover-bg-black:focus, .swagger-ui .hover-bg-black:hover { background-color: #000; }
.swagger-ui .bg-near-black, .swagger-ui .hover-bg-near-black:focus, .swagger-ui .hover-bg-near-black:hover { background-color: #121212; }
.swagger-ui .bg-dark-gray, .swagger-ui .hover-bg-dark-gray:focus, .swagger-ui .hover-bg-dark-gray:hover { background-color: #333; }
.swagger-ui .bg-mid-gray, .swagger-ui .hover-bg-mid-gray:focus, .swagger-ui .hover-bg-mid-gray:hover { background-color: #545454; }
.swagger-ui .bg-gray, .swagger-ui .hover-bg-gray:focus, .swagger-ui .hover-bg-gray:hover { background-color: #787878; }
.swagger-ui .bg-silver, .swagger-ui .hover-bg-silver:focus, .swagger-ui .hover-bg-silver:hover { background-color: #999; }
.swagger-ui .bg-white, .swagger-ui .hover-bg-white:focus, .swagger-ui .hover-bg-white:hover { background-color: #1c1c21; }
.swagger-ui .bg-transparent, .swagger-ui .hover-bg-transparent:focus, .swagger-ui .hover-bg-transparent:hover { background-color: transparent; }
.swagger-ui .bg-dark-red, .swagger-ui .hover-bg-dark-red:focus, .swagger-ui .hover-bg-dark-red:hover { background-color: #bc2f36; }
.swagger-ui .bg-red, .swagger-ui .hover-bg-red:focus, .swagger-ui .hover-bg-red:hover { background-color: #c83932; }
.swagger-ui .bg-light-red, .swagger-ui .hover-bg-light-red:focus, .swagger-ui .hover-bg-light-red:hover { background-color: #ab3c2b; }
.swagger-ui .bg-orange, .swagger-ui .hover-bg-orange:focus, .swagger-ui .hover-bg-orange:hover { background-color: #cc6e33; }
.swagger-ui .bg-gold, .swagger-ui .bg-light-yellow, .swagger-ui .bg-washed-yellow, .swagger-ui .bg-yellow, .swagger-ui .hover-bg-gold:focus, .swagger-ui .hover-bg-gold:hover, .swagger-ui .hover-bg-light-yellow:focus, .swagger-ui .hover-bg-light-yellow:hover, .swagger-ui .hover-bg-washed-yellow:focus, .swagger-ui .hover-bg-washed-yellow:hover, .swagger-ui .hover-bg-yellow:focus, .swagger-ui .hover-bg-yellow:hover { background-color: #664b00; }
.swagger-ui .bg-purple, .swagger-ui .hover-bg-purple:focus, .swagger-ui .hover-bg-purple:hover { background-color: #5e2ca5; }
.swagger-ui .bg-light-purple, .swagger-ui .hover-bg-light-purple:focus, .swagger-ui .hover-bg-light-purple:hover { background-color: #672caf; }
.swagger-ui .bg-dark-pink, .swagger-ui .hover-bg-dark-pink:focus, .swagger-ui .hover-bg-dark-pink:hover { background-color: #ab2b81; }
.swagger-ui .bg-hot-pink, .swagger-ui .hover-bg-hot-pink:focus, .swagger-ui .hover-bg-hot-pink:hover { background-color: #c03086; }
.swagger-ui .bg-pink, .swagger-ui .hover-bg-pink:focus, .swagger-ui .hover-bg-pink:hover { background-color: #8f2464; }
.swagger-ui .bg-light-pink, .swagger-ui .hover-bg-light-pink:focus, .swagger-ui .hover-bg-light-pink:hover { background-color: #721d4d; }
.swagger-ui .bg-dark-green, .swagger-ui .hover-bg-dark-green:focus, .swagger-ui .hover-bg-dark-green:hover { background-color: #1c6e50; }
.swagger-ui .bg-green, .swagger-ui .hover-bg-green:focus, .swagger-ui .hover-bg-green:hover { background-color: #279b70; }
.swagger-ui .bg-light-green, .swagger-ui .hover-bg-light-green:focus, .swagger-ui .hover-bg-light-green:hover { background-color: #228762; }
.swagger-ui .bg-navy, .swagger-ui .hover-bg-navy:focus, .swagger-ui .hover-bg-navy:hover { background-color: #0d1d35; }
.swagger-ui .bg-dark-blue, .swagger-ui .hover-bg-dark-blue:focus, .swagger-ui .hover-bg-dark-blue:hover { background-color: #20497e; }
.swagger-ui .bg-blue, .swagger-ui .hover-bg-blue:focus, .swagger-ui .hover-bg-blue:hover { background-color: #4380d0; }
.swagger-ui .bg-light-blue, .swagger-ui .hover-bg-light-blue:focus, .swagger-ui .hover-bg-light-blue:hover { background-color: #20517e; }
.swagger-ui .bg-lightest-blue, .swagger-ui .hover-bg-lightest-blue:focus, .swagger-ui .hover-bg-lightest-blue:hover { background-color: #143a52; }
.swagger-ui .bg-washed-blue, .swagger-ui .hover-bg-washed-blue:focus, .swagger-ui .hover-bg-washed-blue:hover { background-color: #0c312d; }
.swagger-ui .bg-washed-green, .swagger-ui .hover-bg-washed-green:focus, .swagger-ui .hover-bg-washed-green:hover { background-color: #0f3d2c; }
.swagger-ui .bg-washed-red, .swagger-ui .hover-bg-washed-red:focus, .swagger-ui .hover-bg-washed-red:hover { background-color: #411010; }
.swagger-ui .bg-inherit, .swagger-ui .hover-bg-inherit:focus, .swagger-ui .hover-bg-inherit:hover { background-color: inherit; }
.swagger-ui .shadow-hover { transition: all .5s cubic-bezier(.165, .84, .44, 1) 0s; }
.swagger-ui .shadow-hover::after {
border-radius: inherit;
box-shadow: rgba(0, 0, 0, .2) 0 0 16px 2px;
content: "";
height: 100%;
left: 0;
opacity: 0;
position: absolute;
top: 0;
transition: opacity .5s cubic-bezier(.165, .84, .44, 1) 0s;
width: 100%;
z-index: -1;
}
.swagger-ui .bg-animate, .swagger-ui .bg-animate:focus, .swagger-ui .bg-animate:hover { transition: background-color .15s ease-in-out 0s; }
.swagger-ui .nested-links a {
color: #99bae6;
transition: color .15s ease-in 0s;
}
.swagger-ui .nested-links a:focus, .swagger-ui .nested-links a:hover {
color: #a9cbea;
transition: color .15s ease-in 0s;
}
.swagger-ui .opblock-tag {
border-bottom: 1px solid rgba(58, 64, 80, .3);
color: #b5bac9;
transition: all .2s ease 0s;
}
.swagger-ui .opblock-tag svg, .swagger-ui section.models h4 svg { transition: all .4s ease 0s; }
.swagger-ui .opblock {
border: 1px solid #000;
border-radius: 4px;
box-shadow: rgba(0, 0, 0, .19) 0 0 3px;
margin: 0 0 15px;
}
.swagger-ui .opblock .tab-header .tab-item.active h4 span::after { background: gray; }
.swagger-ui .opblock.is-open .opblock-summary { border-bottom: 1px solid #000; }
.swagger-ui .opblock .opblock-section-header {
background: rgba(28, 28, 33, .8);
box-shadow: rgba(0, 0, 0, .1) 0 1px 2px;
}
.swagger-ui .opblock .opblock-section-header > label > span { padding: 0 10px 0 0; }
.swagger-ui .opblock .opblock-summary-method {
background: #000;
color: #fff;
text-shadow: rgba(0, 0, 0, .1) 0 1px 0;
}
.swagger-ui .opblock.opblock-post {
background: rgba(72, 203, 144, .1);
border-color: #48cb90;
}
.swagger-ui .opblock.opblock-post .opblock-summary-method, .swagger-ui .opblock.opblock-post .tab-header .tab-item.active h4 span::after { background: #48cb90; }
.swagger-ui .opblock.opblock-post .opblock-summary { border-color: #48cb90; }
.swagger-ui .opblock.opblock-put {
background: rgba(213, 157, 88, .1);
border-color: #d59d58;
}
.swagger-ui .opblock.opblock-put .opblock-summary-method, .swagger-ui .opblock.opblock-put .tab-header .tab-item.active h4 span::after { background: #d59d58; }
.swagger-ui .opblock.opblock-put .opblock-summary { border-color: #d59d58; }
.swagger-ui .opblock.opblock-delete {
background: rgba(200, 50, 50, .1);
border-color: #c83232;
}
.swagger-ui .opblock.opblock-delete .opblock-summary-method, .swagger-ui .opblock.opblock-delete .tab-header .tab-item.active h4 span::after { background: #c83232; }
.swagger-ui .opblock.opblock-delete .opblock-summary { border-color: #c83232; }
.swagger-ui .opblock.opblock-get {
background: rgba(42, 105, 167, .1);
border-color: #2a69a7;
}
.swagger-ui .opblock.opblock-get .opblock-summary-method, .swagger-ui .opblock.opblock-get .tab-header .tab-item.active h4 span::after { background: #2a69a7; }
.swagger-ui .opblock.opblock-get .opblock-summary { border-color: #2a69a7; }
.swagger-ui .opblock.opblock-patch {
background: rgba(92, 214, 188, .1);
border-color: #5cd6bc;
}
.swagger-ui .opblock.opblock-patch .opblock-summary-method, .swagger-ui .opblock.opblock-patch .tab-header .tab-item.active h4 span::after { background: #5cd6bc; }
.swagger-ui .opblock.opblock-patch .opblock-summary { border-color: #5cd6bc; }
.swagger-ui .opblock.opblock-head {
background: rgba(140, 63, 207, .1);
border-color: #8c3fcf;
}
.swagger-ui .opblock.opblock-head .opblock-summary-method, .swagger-ui .opblock.opblock-head .tab-header .tab-item.active h4 span::after { background: #8c3fcf; }
.swagger-ui .opblock.opblock-head .opblock-summary { border-color: #8c3fcf; }
.swagger-ui .opblock.opblock-options {
background: rgba(36, 89, 143, .1);
border-color: #24598f;
}
.swagger-ui .opblock.opblock-options .opblock-summary-method, .swagger-ui .opblock.opblock-options .tab-header .tab-item.active h4 span::after { background: #24598f; }
.swagger-ui .opblock.opblock-options .opblock-summary { border-color: #24598f; }
.swagger-ui .opblock.opblock-deprecated {
background: rgba(46, 46, 46, .1);
border-color: #2e2e2e;
opacity: .6;
}
.swagger-ui .opblock.opblock-deprecated .opblock-summary-method, .swagger-ui .opblock.opblock-deprecated .tab-header .tab-item.active h4 span::after { background: #2e2e2e; }
.swagger-ui .opblock.opblock-deprecated .opblock-summary { border-color: #2e2e2e; }
.swagger-ui .filter .operation-filter-input { border: 2px solid #2b3446; }
.swagger-ui .tab li:first-of-type::after { background: rgba(0, 0, 0, .2); }
.swagger-ui .download-contents {
background: #7c8192;
color: #fff;
}
.swagger-ui .scheme-container {
background: #1c1c21;
box-shadow: rgba(0, 0, 0, .15) 0 1px 2px 0;
}
.swagger-ui .loading-container .loading::before {
animation: 1s linear 0s infinite normal none running rotation, .5s ease 0s 1 normal none running opacity;
border-color: rgba(0, 0, 0, .6) rgba(84, 84, 84, .1) rgba(84, 84, 84, .1);
}
.swagger-ui .response-control-media-type--accept-controller select { border-color: #196619; }
.swagger-ui .response-control-media-type__accept-message { color: #99e699; }
.swagger-ui .version-pragma__message code { background-color: #3b3b3b; }
.swagger-ui .btn {
background: 0 0;
border: 2px solid gray;
box-shadow: rgba(0, 0, 0, .1) 0 1px 2px;
color: #b5bac9;
}
.swagger-ui .btn:hover { box-shadow: rgba(0, 0, 0, .3) 0 0 5px; }
.swagger-ui .btn.authorize, .swagger-ui .btn.cancel {
background-color: transparent;
border-color: #a72a2a;
color: #e69999;
}
.swagger-ui .btn.authorize {
border-color: #48cb90;
color: #9ce3c3;
}
.swagger-ui .btn.authorize svg { fill: #9ce3c3; }
.swagger-ui .btn.execute {
background-color: #5892d5;
border-color: #5892d5;
color: #fff;
}
.swagger-ui .copy-to-clipboard { background: #7c8192; }
.swagger-ui .copy-to-clipboard button { background: url("data:image/svg+xml;charset=utf-8,<svg xmlns=\"http://www.w3.org/2000/svg\" width=\"16\" height=\"16\" aria-hidden=\"true\"><path fill=\"%23fff\" fill-rule=\"evenodd\" d=\"M2 13h4v1H2v-1zm5-6H2v1h5V7zm2 3V8l-3 3 3 3v-2h5v-2H9zM4.5 9H2v1h2.5V9zM2 12h2.5v-1H2v1zm9 1h1v2c-.02.28-.11.52-.3.7-.19.18-.42.28-.7.3H1c-.55 0-1-.45-1-1V4c0-.55.45-1 1-1h3c0-1.11.89-2 2-2 1.11 0 2 .89 2 2h3c.55 0 1 .45 1 1v5h-1V6H1v9h10v-2zM2 5h8c0-.55-.45-1-1-1H8c-.55 0-1-.45-1-1s-.45-1-1-1-1 .45-1 1-.45 1-1 1H3c-.55 0-1 .45-1 1z\"/></svg>") 50% center no-repeat; }
.swagger-ui select {
background: url("data:image/svg+xml;charset=utf-8,<svg xmlns=\"http://www.w3.org/2000/svg\" viewBox=\"0 0 20 20\"><path d=\"M13.418 7.859a.695.695 0 01.978 0 .68.68 0 010 .969l-3.908 3.83a.697.697 0 01-.979 0l-3.908-3.83a.68.68 0 010-.969.695.695 0 01.978 0L10 11l3.418-3.141z\"/></svg>") right 10px center/20px no-repeat #212121;
background: url(data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgeG1sbnM6c29kaXBvZGk9Imh0dHA6Ly9zb2RpcG9kaS5zb3VyY2Vmb3JnZS5uZXQvRFREL3NvZGlwb2RpLTAuZHRkIgogICB4bWxuczppbmtzY2FwZT0iaHR0cDovL3d3dy5pbmtzY2FwZS5vcmcvbmFtZXNwYWNlcy9pbmtzY2FwZSIKICAgaW5rc2NhcGU6dmVyc2lvbj0iMS4wICg0MDM1YTRmYjQ5LCAyMDIwLTA1LTAxKSIKICAgc29kaXBvZGk6ZG9jbmFtZT0iZG93bmxvYWQuc3ZnIgogICBpZD0ic3ZnNCIKICAgdmVyc2lvbj0iMS4xIgogICB2aWV3Qm94PSIwIDAgMjAgMjAiPgogIDxtZXRhZGF0YQogICAgIGlkPSJtZXRhZGF0YTEwIj4KICAgIDxyZGY6UkRGPgogICAgICA8Y2M6V29yawogICAgICAgICByZGY6YWJvdXQ9IiI+CiAgICAgICAgPGRjOmZvcm1hdD5pbWFnZS9zdmcreG1sPC9kYzpmb3JtYXQ+CiAgICAgICAgPGRjOnR5cGUKICAgICAgICAgICByZGY6cmVzb3VyY2U9Imh0dHA6Ly9wdXJsLm9yZy9kYy9kY21pdHlwZS9TdGlsbEltYWdlIiAvPgogICAgICA8L2NjOldvcms+CiAgICA8L3JkZjpSREY+CiAgPC9tZXRhZGF0YT4KICA8ZGVmcwogICAgIGlkPSJkZWZzOCIgLz4KICA8c29kaXBvZGk6bmFtZWR2aWV3CiAgICAgaW5rc2NhcGU6Y3VycmVudC1sYXllcj0ic3ZnNCIKICAgICBpbmtzY2FwZTp3aW5kb3ctbWF4aW1pemVkPSIxIgogICAgIGlua3NjYXBlOndpbmRvdy15PSItOSIKICAgICBpbmtzY2FwZTp3aW5kb3cteD0iLTkiCiAgICAgaW5rc2NhcGU6Y3k9IjEwIgogICAgIGlua3NjYXBlOmN4PSIxMCIKICAgICBpbmtzY2FwZTp6b29tPSI0MS41IgogICAgIHNob3dncmlkPSJmYWxzZSIKICAgICBpZD0ibmFtZWR2aWV3NiIKICAgICBpbmtzY2FwZTp3aW5kb3ctaGVpZ2h0PSIxMDAxIgogICAgIGlua3NjYXBlOndpbmRvdy13aWR0aD0iMTkyMCIKICAgICBpbmtzY2FwZTpwYWdlc2hhZG93PSIyIgogICAgIGlua3NjYXBlOnBhZ2VvcGFjaXR5PSIwIgogICAgIGd1aWRldG9sZXJhbmNlPSIxMCIKICAgICBncmlkdG9sZXJhbmNlPSIxMCIKICAgICBvYmplY3R0b2xlcmFuY2U9IjEwIgogICAgIGJvcmRlcm9wYWNpdHk9IjEiCiAgICAgYm9yZGVyY29sb3I9IiM2NjY2NjYiCiAgICAgcGFnZWNvbG9yPSIjZmZmZmZmIiAvPgogIDxwYXRoCiAgICAgc3R5bGU9ImZpbGw6I2ZmZmZmZiIKICAgICBpZD0icGF0aDIiCiAgICAgZD0iTTEzLjQxOCA3Ljg1OWEuNjk1LjY5NSAwIDAxLjk3OCAwIC42OC42OCAwIDAxMCAuOTY5bC0zLjkwOCAzLjgzYS42OTcuNjk3IDAgMDEtLjk3OSAwbC0zLjkwOC0zLjgzYS42OC42OCAwIDAxMC0uOTY5LjY5NS42OTUgMCAwMS45NzggMEwxMCAxMWwzLjQxOC0zLjE0MXoiIC8+Cjwvc3ZnPgo=) right 10px center/20px no-repeat #1c1c21;
border: 2px solid #41444e;
}
.swagger-ui select[multiple] { background: #212121; }
.swagger-ui button.invalid, .swagger-ui input[type=email].invalid, .swagger-ui input[type=file].invalid, .swagger-ui input[type=password].invalid, .swagger-ui input[type=search].invalid, .swagger-ui input[type=text].invalid, .swagger-ui select.invalid, .swagger-ui textarea.invalid {
background: #390e0e;
border-color: #c83232;
}
.swagger-ui input[type=email], .swagger-ui input[type=file], .swagger-ui input[type=password], .swagger-ui input[type=search], .swagger-ui input[type=text], .swagger-ui textarea {
background: #1c1c21;
border: 1px solid #404040;
}
.swagger-ui textarea {
background: rgba(28, 28, 33, .8);
color: #b5bac9;
}
.swagger-ui input[disabled], .swagger-ui select[disabled] {
background-color: #1f1f1f;
color: #bfbfbf;
}
.swagger-ui textarea[disabled] {
background-color: #41444e;
color: #fff;
}
.swagger-ui select[disabled] { border-color: #878787; }
.swagger-ui textarea:focus { border: 2px solid #2a69a7; }
.swagger-ui .checkbox input[type=checkbox] + label > .item {
background: #303030;
box-shadow: #303030 0 0 0 2px;
}
.swagger-ui .checkbox input[type=checkbox]:checked + label > .item { background: url("data:image/svg+xml;charset=utf-8,<svg width=\"10\" height=\"8\" viewBox=\"3 7 10 8\" xmlns=\"http://www.w3.org/2000/svg\"><path fill=\"%2341474E\" fill-rule=\"evenodd\" d=\"M6.333 15L3 11.667l1.333-1.334 2 2L11.667 7 13 8.333z\"/></svg>") 50% center no-repeat #303030; }
.swagger-ui .dialog-ux .backdrop-ux { background: rgba(0, 0, 0, .8); }
.swagger-ui .dialog-ux .modal-ux {
background: #1c1c21;
border: 1px solid #2e2e2e;
box-shadow: rgba(0, 0, 0, .2) 0 10px 30px 0;
}
.swagger-ui .dialog-ux .modal-ux-header .close-modal { background: 0 0; }
.swagger-ui .model .deprecated span, .swagger-ui .model .deprecated td { color: #bfbfbf !important; }
.swagger-ui .model-toggle::after { background: url("data:image/svg+xml;charset=utf-8,<svg xmlns=\"http://www.w3.org/2000/svg\" width=\"24\" height=\"24\"><path d=\"M10 6L8.59 7.41 13.17 12l-4.58 4.59L10 18l6-6z\"/></svg>") 50% center/100% no-repeat; }
.swagger-ui .model-hint {
background: rgba(0, 0, 0, .7);
color: #ebebeb;
}
.swagger-ui section.models { border: 1px solid rgba(58, 64, 80, .3); }
.swagger-ui section.models.is-open h4 { border-bottom: 1px solid rgba(58, 64, 80, .3); }
.swagger-ui section.models .model-container { background: rgba(0, 0, 0, .05); }
.swagger-ui section.models .model-container:hover { background: rgba(0, 0, 0, .07); }
.swagger-ui .model-box { background: rgba(0, 0, 0, .1); }
.swagger-ui .prop-type { color: #aaaad4; }
.swagger-ui table thead tr td, .swagger-ui table thead tr th {
border-bottom: 1px solid rgba(58, 64, 80, .2);
color: #b5bac9;
}
.swagger-ui .parameter__name.required::after { color: rgba(230, 153, 153, .6); }
.swagger-ui .topbar .download-url-wrapper .select-label { color: #f0f0f0; }
.swagger-ui .topbar .download-url-wrapper .download-url-button {
background: #63a040;
color: #fff;
}
.swagger-ui .info .title small { background: #7c8492; }
.swagger-ui .info .title small.version-stamp { background-color: #7a9b27; }
.swagger-ui .auth-container .errors {
background-color: #350d0d;
color: #b5bac9;
}
.swagger-ui .errors-wrapper {
background: rgba(200, 50, 50, .1);
border: 2px solid #c83232;
}
.swagger-ui .markdown code, .swagger-ui .renderedmarkdown code {
background: rgba(0, 0, 0, .05);
color: #c299e6;
}
.swagger-ui .model-toggle:after { background: url(data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+CjxzdmcKICAgeG1sbnM6ZGM9Imh0dHA6Ly9wdXJsLm9yZy9kYy9lbGVtZW50cy8xLjEvIgogICB4bWxuczpjYz0iaHR0cDovL2NyZWF0aXZlY29tbW9ucy5vcmcvbnMjIgogICB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkvMDIvMjItcmRmLXN5bnRheC1ucyMiCiAgIHhtbG5zOnN2Zz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciCiAgIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIKICAgeG1sbnM6c29kaXBvZGk9Imh0dHA6Ly9zb2RpcG9kaS5zb3VyY2Vmb3JnZS5uZXQvRFREL3NvZGlwb2RpLTAuZHRkIgogICB4bWxuczppbmtzY2FwZT0iaHR0cDovL3d3dy5pbmtzY2FwZS5vcmcvbmFtZXNwYWNlcy9pbmtzY2FwZSIKICAgaW5rc2NhcGU6dmVyc2lvbj0iMS4wICg0MDM1YTRmYjQ5LCAyMDIwLTA1LTAxKSIKICAgc29kaXBvZGk6ZG9jbmFtZT0iZG93bmxvYWQyLnN2ZyIKICAgaWQ9InN2ZzQiCiAgIHZlcnNpb249IjEuMSIKICAgaGVpZ2h0PSIyNCIKICAgd2lkdGg9IjI0Ij4KICA8bWV0YWRhdGEKICAgICBpZD0ibWV0YWRhdGExMCI+CiAgICA8cmRmOlJERj4KICAgICAgPGNjOldvcmsKICAgICAgICAgcmRmOmFib3V0PSIiPgogICAgICAgIDxkYzpmb3JtYXQ+aW1hZ2Uvc3ZnK3htbDwvZGM6Zm9ybWF0PgogICAgICAgIDxkYzp0eXBlCiAgICAgICAgICAgcmRmOnJlc291cmNlPSJodHRwOi8vcHVybC5vcmcvZGMvZGNtaXR5cGUvU3RpbGxJbWFnZSIgLz4KICAgICAgPC9jYzpXb3JrPgogICAgPC9yZGY6UkRGPgogIDwvbWV0YWRhdGE+CiAgPGRlZnMKICAgICBpZD0iZGVmczgiIC8+CiAgPHNvZGlwb2RpOm5hbWVkdmlldwogICAgIGlua3NjYXBlOmN1cnJlbnQtbGF5ZXI9InN2ZzQiCiAgICAgaW5rc2NhcGU6d2luZG93LW1heGltaXplZD0iMSIKICAgICBpbmtzY2FwZTp3aW5kb3cteT0iLTkiCiAgICAgaW5rc2NhcGU6d2luZG93LXg9Ii05IgogICAgIGlua3NjYXBlOmN5PSIxMiIKICAgICBpbmtzY2FwZTpjeD0iMTIiCiAgICAgaW5rc2NhcGU6em9vbT0iMzQuNTgzMzMzIgogICAgIHNob3dncmlkPSJmYWxzZSIKICAgICBpZD0ibmFtZWR2aWV3NiIKICAgICBpbmtzY2FwZTp3aW5kb3ctaGVpZ2h0PSIxMDAxIgogICAgIGlua3NjYXBlOndpbmRvdy13aWR0aD0iMTkyMCIKICAgICBpbmtzY2FwZTpwYWdlc2hhZG93PSIyIgogICAgIGlua3NjYXBlOnBhZ2VvcGFjaXR5PSIwIgogICAgIGd1aWRldG9sZXJhbmNlPSIxMCIKICAgICBncmlkdG9sZXJhbmNlPSIxMCIKICAgICBvYmplY3R0b2xlcmFuY2U9IjEwIgogICAgIGJvcmRlcm9wYWNpdHk9IjEiCiAgICAgYm9yZGVyY29sb3I9IiM2NjY2NjYiCiAgICAgcGFnZWNvbG9yPSIjZmZmZmZmIiAvPgogIDxwYXRoCiAgICAgc3R5bGU9ImZpbGw6I2ZmZmZmZiIKICAgICBpZD0icGF0aDIiCiAgICAgZD0iTTEwIDZMOC41OSA3LjQxIDEzLjE3IDEybC00LjU4IDQuNTlMMTAgMThsNi02eiIgLz4KPC9zdmc+Cg==) 50% no-repeat; }
.swagger-ui .expand-operation svg, .swagger-ui section.models h4 svg { fill: #fff; }
::-webkit-scrollbar-track { background-color: #646464 !important; }
::-webkit-scrollbar-thumb {
background-color: #242424 !important;
border: 2px solid #3e4346 !important;
}
::-webkit-scrollbar-button:vertical:start:decrement {
background: linear-gradient(130deg, #696969 40%, rgba(255, 0, 0, 0) 41%), linear-gradient(230deg, #696969 40%, transparent 41%), linear-gradient(0deg, #696969 40%, transparent 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:vertical:end:increment {
background: linear-gradient(310deg, #696969 40%, transparent 41%), linear-gradient(50deg, #696969 40%, transparent 41%), linear-gradient(180deg, #696969 40%, transparent 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:horizontal:end:increment {
background: linear-gradient(210deg, #696969 40%, transparent 41%), linear-gradient(330deg, #696969 40%, transparent 41%), linear-gradient(90deg, #696969 30%, transparent 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:horizontal:start:decrement {
background: linear-gradient(30deg, #696969 40%, transparent 41%), linear-gradient(150deg, #696969 40%, transparent 41%), linear-gradient(270deg, #696969 30%, transparent 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button, ::-webkit-scrollbar-track-piece { background-color: #3e4346 !important; }
.swagger-ui .black, .swagger-ui .checkbox, .swagger-ui .dark-gray, .swagger-ui .download-url-wrapper .loading, .swagger-ui .errors-wrapper .errors small, .swagger-ui .fallback, .swagger-ui .filter .loading, .swagger-ui .gray, .swagger-ui .hover-black:focus, .swagger-ui .hover-black:hover, .swagger-ui .hover-dark-gray:focus, .swagger-ui .hover-dark-gray:hover, .swagger-ui .hover-gray:focus, .swagger-ui .hover-gray:hover, .swagger-ui .hover-light-silver:focus, .swagger-ui .hover-light-silver:hover, .swagger-ui .hover-mid-gray:focus, .swagger-ui .hover-mid-gray:hover, .swagger-ui .hover-near-black:focus, .swagger-ui .hover-near-black:hover, .swagger-ui .hover-silver:focus, .swagger-ui .hover-silver:hover, .swagger-ui .light-silver, .swagger-ui .markdown pre, .swagger-ui .mid-gray, .swagger-ui .model .property, .swagger-ui .model .property.primitive, .swagger-ui .model-title, .swagger-ui .near-black, .swagger-ui .parameter__extension, .swagger-ui .parameter__in, .swagger-ui .prop-format, .swagger-ui .renderedmarkdown pre, .swagger-ui .response-col_links .response-undocumented, .swagger-ui .response-col_status .response-undocumented, .swagger-ui .silver, .swagger-ui section.models h4, .swagger-ui section.models h5, .swagger-ui span.token-not-formatted, .swagger-ui span.token-string, .swagger-ui table.headers .header-example, .swagger-ui table.model tr.description, .swagger-ui table.model tr.extension { color: #bfbfbf; }
.swagger-ui .hover-white:focus, .swagger-ui .hover-white:hover, .swagger-ui .info .title small pre, .swagger-ui .topbar a, .swagger-ui .white { color: #fff; }
.swagger-ui .bg-black-10, .swagger-ui .hover-bg-black-10:focus, .swagger-ui .hover-bg-black-10:hover, .swagger-ui .stripe-dark:nth-child(2n + 1) { background-color: rgba(0, 0, 0, .1); }
.swagger-ui .bg-white-10, .swagger-ui .hover-bg-white-10:focus, .swagger-ui .hover-bg-white-10:hover, .swagger-ui .stripe-light:nth-child(2n + 1) { background-color: rgba(28, 28, 33, .1); }
.swagger-ui .bg-light-silver, .swagger-ui .hover-bg-light-silver:focus, .swagger-ui .hover-bg-light-silver:hover, .swagger-ui .striped--light-silver:nth-child(2n + 1) { background-color: #6e6e6e; }
.swagger-ui .bg-moon-gray, .swagger-ui .hover-bg-moon-gray:focus, .swagger-ui .hover-bg-moon-gray:hover, .swagger-ui .striped--moon-gray:nth-child(2n + 1) { background-color: #4d4d4d; }
.swagger-ui .bg-light-gray, .swagger-ui .hover-bg-light-gray:focus, .swagger-ui .hover-bg-light-gray:hover, .swagger-ui .striped--light-gray:nth-child(2n + 1) { background-color: #2b2b2b; }
.swagger-ui .bg-near-white, .swagger-ui .hover-bg-near-white:focus, .swagger-ui .hover-bg-near-white:hover, .swagger-ui .striped--near-white:nth-child(2n + 1) { background-color: #242424; }
.swagger-ui .opblock-tag:hover, .swagger-ui section.models h4:hover { background: rgba(0, 0, 0, .02); }
.swagger-ui .checkbox p, .swagger-ui .dialog-ux .modal-ux-content h4, .swagger-ui .dialog-ux .modal-ux-content p, .swagger-ui .dialog-ux .modal-ux-header h3, .swagger-ui .errors-wrapper .errors h4, .swagger-ui .errors-wrapper hgroup h4, .swagger-ui .info .base-url, .swagger-ui .info .title, .swagger-ui .info h1, .swagger-ui .info h2, .swagger-ui .info h3, .swagger-ui .info h4, .swagger-ui .info h5, .swagger-ui .info li, .swagger-ui .info p, .swagger-ui .info table, .swagger-ui .loading-container .loading::after, .swagger-ui .model, .swagger-ui .opblock .opblock-section-header h4, .swagger-ui .opblock .opblock-section-header > label, .swagger-ui .opblock .opblock-summary-description, .swagger-ui .opblock .opblock-summary-operation-id, .swagger-ui .opblock .opblock-summary-path, .swagger-ui .opblock .opblock-summary-path__deprecated, .swagger-ui .opblock-description-wrapper, .swagger-ui .opblock-description-wrapper h4, .swagger-ui .opblock-description-wrapper p, .swagger-ui .opblock-external-docs-wrapper, .swagger-ui .opblock-external-docs-wrapper h4, .swagger-ui .opblock-external-docs-wrapper p, .swagger-ui .opblock-tag small, .swagger-ui .opblock-title_normal, .swagger-ui .opblock-title_normal h4, .swagger-ui .opblock-title_normal p, .swagger-ui .parameter__name, .swagger-ui .parameter__type, .swagger-ui .response-col_links, .swagger-ui .response-col_status, .swagger-ui .responses-inner h4, .swagger-ui .responses-inner h5, .swagger-ui .scheme-container .schemes > label, .swagger-ui .scopes h2, .swagger-ui .servers > label, .swagger-ui .tab li, .swagger-ui label, .swagger-ui select, .swagger-ui table.headers td { color: #b5bac9; }
.swagger-ui .download-url-wrapper .failed, .swagger-ui .filter .failed, .swagger-ui .model-deprecated-warning, .swagger-ui .parameter__deprecated, .swagger-ui .parameter__name.required span, .swagger-ui table.model tr.property-row .star { color: #e69999; }
.swagger-ui .opblock-body pre.microlight, .swagger-ui textarea.curl {
background: #41444e;
border-radius: 4px;
color: #fff;
}
.swagger-ui .expand-methods svg, .swagger-ui .expand-methods:hover svg { fill: #bfbfbf; }
.swagger-ui .auth-container, .swagger-ui .dialog-ux .modal-ux-header { border-bottom: 1px solid #2e2e2e; }
.swagger-ui .topbar .download-url-wrapper .select-label select, .swagger-ui .topbar .download-url-wrapper input[type=text] { border: 2px solid #63a040; }
.swagger-ui .info a, .swagger-ui .info a:hover, .swagger-ui .scopes h2 a { color: #99bde6; }
/* Dark Scrollbar */
::-webkit-scrollbar {
width: 14px;
height: 14px;
}
::-webkit-scrollbar-button {
background-color: #3e4346 !important;
}
::-webkit-scrollbar-track {
background-color: #646464 !important;
}
::-webkit-scrollbar-track-piece {
background-color: #3e4346 !important;
}
::-webkit-scrollbar-thumb {
height: 50px;
background-color: #242424 !important;
border: 2px solid #3e4346 !important;
}
::-webkit-scrollbar-corner {}
::-webkit-resizer {}
::-webkit-scrollbar-button:vertical:start:decrement {
background:
linear-gradient(130deg, #696969 40%, rgba(255, 0, 0, 0) 41%),
linear-gradient(230deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(0deg, #696969 40%, rgba(0, 0, 0, 0) 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:vertical:end:increment {
background:
linear-gradient(310deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(50deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(180deg, #696969 40%, rgba(0, 0, 0, 0) 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:horizontal:end:increment {
background:
linear-gradient(210deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(330deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(90deg, #696969 30%, rgba(0, 0, 0, 0) 31%);
background-color: #b6b6b6;
}
::-webkit-scrollbar-button:horizontal:start:decrement {
background:
linear-gradient(30deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(150deg, #696969 40%, rgba(0, 0, 0, 0) 41%),
linear-gradient(270deg, #696969 30%, rgba(0, 0, 0, 0) 31%);
background-color: #b6b6b6;
}

17
static/swagger-ui/index.css vendored Normal file

@@ -0,0 +1,17 @@
/*! Swagger UI 4.13.2 | https://swagger.io/tools/swagger-ui/ | Apache License 2.0 (license file can be found at ./LICENSE) */
html {
box-sizing: border-box;
overflow: -moz-scrollbars-vertical;
overflow-y: scroll;
}
*,
*:before,
*:after {
box-sizing: inherit;
}
body {
margin: 0;
background: #fafafa;
}

79
static/swagger-ui/oauth2-redirect.html vendored Normal file

@@ -0,0 +1,79 @@
<!doctype html>
<html lang="en-US">
<head>
<title>Swagger UI: OAuth2 Redirect</title>
</head>
<body>
<script>
'use strict';
function run () {
var oauth2 = window.opener.swaggerUIRedirectOauth2;
var sentState = oauth2.state;
var redirectUrl = oauth2.redirectUrl;
var isValid, qp, arr;
if (/code|token|error/.test(window.location.hash)) {
qp = window.location.hash.substring(1);
} else {
qp = location.search.substring(1);
}
arr = qp.split("&");
arr.forEach(function (v,i,_arr) { _arr[i] = '"' + v.replace('=', '":"') + '"';});
qp = qp ? JSON.parse('{' + arr.join() + '}',
function (key, value) {
return key === "" ? value : decodeURIComponent(value);
}
) : {};
isValid = qp.state === sentState;
if ((
oauth2.auth.schema.get("flow") === "accessCode" ||
oauth2.auth.schema.get("flow") === "authorizationCode" ||
oauth2.auth.schema.get("flow") === "authorization_code"
) && !oauth2.auth.code) {
if (!isValid) {
oauth2.errCb({
authId: oauth2.auth.name,
source: "auth",
level: "warning",
message: "Authorization may be unsafe, passed state was changed in server. The passed state wasn't returned from auth server."
});
}
if (qp.code) {
delete oauth2.state;
oauth2.auth.code = qp.code;
oauth2.callback({auth: oauth2.auth, redirectUrl: redirectUrl});
} else {
let oauthErrorMsg;
if (qp.error) {
oauthErrorMsg = "["+qp.error+"]: " +
(qp.error_description ? qp.error_description+ ". " : "no accessCode received from the server. ") +
(qp.error_uri ? "More info: "+qp.error_uri : "");
}
oauth2.errCb({
authId: oauth2.auth.name,
source: "auth",
level: "error",
message: oauthErrorMsg || "[Authorization failed]: no accessCode received from the server."
});
}
} else {
oauth2.callback({auth: oauth2.auth, token: qp, isValid: isValid, redirectUrl: redirectUrl});
}
window.close();
}
if (document.readyState !== 'loading') {
run();
} else {
document.addEventListener('DOMContentLoaded', function () {
run();
});
}
</script>
</body>
</html>
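
The redirect page above does three things: it reads the OAuth2 parameters out of the URL fragment (or, failing that, the query string), checks that the state value returned by the authorization server matches the one Swagger UI originally sent, and then hands either the authorization code or the token back to the opener window. A rough Python sketch of just the parse-and-validate step, for illustration only (it is not part of the repository):

from urllib.parse import parse_qs

def parse_oauth_redirect(fragment_or_query: str, sent_state: str):
    # "code=abc&state=xyz" -> {"code": "abc", "state": "xyz"}
    params = {k: v[0] for k, v in parse_qs(fragment_or_query.lstrip("#?")).items()}
    # The state comparison guards against responses that this client did not initiate.
    return params, params.get("state") == sent_state

params, is_valid = parse_oauth_redirect("#code=abc123&state=xyz", sent_state="xyz")
assert is_valid and params["code"] == "abc123"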

File diff suppressed because one or more lines are too long

3
static/swagger-ui/swagger-ui.css vendored Normal file

File diff suppressed because one or more lines are too long

2
static/swagger-ui/swagger-ui.js vendored Normal file

File diff suppressed because one or more lines are too long

View File (templates)

@@ -9,7 +9,8 @@
<link rel="stylesheet" href="static/bootstrap.min.css">
<link rel="stylesheet" href="static/bootstrap-toggle.min.css">
<link rel="stylesheet" href="static/open-iconic-bootstrap.min.css">
<link rel="stylesheet" href="static/custom.css?ver=1.18.1a">
<link href="static/open-iconic/css/open-iconic.css" rel="stylesheet">
<link rel="stylesheet" href="static/custom.css?ver=1.18.1c">
<script src="static/jquery-3.6.0.min.js"></script>
<script src="static/jquery-ui.sortable.min.js"></script>
@@ -17,7 +18,8 @@
<script src="static/bootstrap.min.js"></script>
<script src="static/bootstrap-toggle.min.js"></script>
<script src="static/rangy-core.min.js"></script>
<script src="static/application.js?ver=1.18.1d"></script>
<script src="static/application.js?ver=1.18.1f"></script>
<script src="static/favicon.js"></script>
</head>
<body>
<input type="file" id="remote-save-select" accept="application/json" style="display:none">
@@ -33,6 +35,11 @@
</button>
<div class="collapse navbar-collapse" id="navbarNavDropdown">
<ul class="nav navbar-nav">
{% if not hide_ai_menu %}
<li class="nav-item">
<a class="nav-link" href="#" id="btn_loadmodel">AI</a>
</li>
{% endif %}
<li class="nav-item dropdown">
<a class="nav-link dropdown-toggle" href="#" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">New Game</a>
<div class="dropdown-menu">
@@ -86,6 +93,7 @@
</div>
<div id="connectstatusdiv" class="flex-row-container">
<span id="connectstatus" class="color_orange flex-row">Waiting for connection...</span>
<div class="layer-container status-container flex-push-left" style="color: #FFFFFF;" id="runtime"></div>
<div class="layer-container status-container flex-push-right">
<span class="oi oi-puzzle-piece statusicon layer-bottom" aria-hidden="true">
<div class="statustext statustext-wide">
@@ -108,6 +116,11 @@
</div>
<div class="row" id="formatmenu">
</div>
<div id="token_prob_menu" class="row hidden">
<div id="token_prob_container"></div>
</div>
<div class="layer-container">
<div class="layer-bottom row" id="gamescreen">
<span id="gametext" contenteditable="true"><p>...</p></span>
@@ -141,7 +154,7 @@
<div id="inputrowmode">
<button type="button" class="btn btn-secondary hidden" id="btnmode">Mode:<br/><b id="btnmode_label">Story</b></button>
</div>
<div id="inputrowleft">
<div id="inputrowleft" class="tokens-counted">
<textarea class="form-control" id="input_text" placeholder="Enter text here"></textarea>
</div>
<div id="inputrowright">
@@ -154,7 +167,7 @@
<div class="anotelabel no-padding">
Author's Note
</div>
<div class="anotefield">
<div class="anotefield tokens-counted">
<textarea class="form-control" placeholder="Author's Note" id="anoteinput"></textarea>
</div>
</div>
@@ -195,7 +208,7 @@
</div>
</div>
<div class="hidden" id="popupcontainer">
<div id="popup">
<div id="popup_old">
<div id="popuptitlebar">
<div id="popuptitletext">Select an Adventure to Import</div>
</div>
@@ -252,7 +265,7 @@
</div>
</div>
<div class="popupcontainer hidden" id="loadcontainer">
<div id="loadpopup">
<div class="loadpopup" id="loadpopup">
<div class="popuptitlebar">
<div class="popuptitletext">Select A Story To Load</div>
</div>
@@ -268,10 +281,64 @@
</div>
</div>
</div>
<div class="popupcontainer hidden" id="loadmodelcontainer">
<div class="loadpopup">
<div class="popuptitlebar">
<div class="popuptitletext">Select A Model To Load</div>
</div>
<div id="loadmodellistbreadcrumbs">
</div>
<div id="loadmodellistcontent" style="overflow: auto; height: 300px;">
</div>
<div class="popupfooter">
<input class="form-control hidden" type="text" placeholder="Enter the URL of the server (For example a trycloudflare link)" id="modelurl" onchange="check_enable_model_load()">
<input class="form-control hidden" type="text" placeholder="key" id="modelkey" onblur="socket.send({'cmd': 'OAI_Key_Update', 'key': $('#modelkey')[0].value});">
<input class="form-control hidden" type="text" placeholder="Model Path or Hugging Face Name" id="custommodelname" menu="" onblur="socket.send({'cmd': 'selectmodel', 'data': $(this).attr('menu'), 'path_modelname': $('#custommodelname')[0].value});">
</div>
<div class="popupfooter">
<select class="form-control hidden" id="oaimodel"><option value="">Select Model(s)</option></select>
</div>
<div class="popupfooter hidden" id=modellayers>
<div class='settingitem' style="width:100%">
<div class='settinglabel'>
<div class="justifyleft">
GPU/Disk Layers
<span class="helpicon">?
<span class="helptext">Number of layers to assign to GPUs and to disk cache. Remaining layers will be put into CPU RAM.</span>
</span>
</div>
<div class="justifyright" id="gpu_layers_current">0</div>
</div>
<div id=model_layer_bars style="color: white">
</div>
<input type=hidden id='gpu_count' value=0/>
<div class="settingminmax">
<div class="justifyleft">
0
</div>
<div class="justifyright" id="gpu_layers_max">
24
</div>
</div>
</div>
</div>
<div class="popupfooter">
<button type="button" class="btn btn-primary" id="btn_loadmodelaccept">Load</button>
<button type="button" class="btn btn-primary" id="btn_loadmodelclose">Cancel</button>
<div class="box flex-push-right hidden" id=use_gpu_div>
<input type="checkbox" data-toggle="toggle" data-onstyle="success" id="use_gpu" checked>
<div class="box-label">Use GPU</div>
</div>
</div>
</div>
</div>
<div class="popupcontainer hidden" id="spcontainer">
<div id="sppopup">
<div class="popuptitlebar">
<div class="popuptitletext">Select A Soft Prompt To Use</div>
<button class="btn btn-primary" onclick="socket.emit('show_folder_soft_prompt', {});"><span class="oi" style="color: white;" data-glyph="folder"></span></button>
</div>
<div id="splistcontent">
</div>
@@ -285,6 +352,7 @@
<div id="uspopup">
<div class="popuptitlebar">
<div class="popuptitletext">Select userscripts to load; drag-and-drop to reorder</div>
<button class="btn btn-primary" onclick="socket.emit('show_folder_usersripts', {});"><span class="oi" style="color: white;" data-glyph="folder"></span></button>
</div>
<div class="usheadergrid">
<div>[AVAILABLE]</div>
@@ -367,6 +435,19 @@
</div>
</div>
</div>
<div class="popupcontainer hidden flex" id="showmodelnamecontainer" style="center;">
<div class="loadpopup">
<div class="popuptitlebar" style="width:50% center;">
<div class="popuptitletext">Model Info</div>
</div>
<div id=showmodelnamecontent style="width:50%;">
Model Info Missing
</div>
<div class="popupfooter" style="width:50% center;">
</div>
</div>
</div>
<div class="popupcontainer hidden" id="rndgamecontainer">
<div id="rspopup">
<div class="popuptitlebar">
@@ -393,5 +474,26 @@
</div>
</div>
</div>
<!------------- Pop-Up ------------------------------->
<div class="popupcontainer hidden" id="popup">
<div class="new_popup">
<div style="height:100%;">
<div class="title" id="popup_title">
Popup Title
</div>
<div id="popup_breadcrumbs"></div>
<div class="popup_list_area" id="popup_list"></div>
<div class="popup_load_cancel hidden" id="popup_upload">
<input type=file id="popup_upload_file">
</div>
<div style="background-color: black">Drag file(s) above or click here to Upload File<input id="popup_upload_input" type=file onchange="upload_file(this)"></div>
<div class="popup_load_cancel" id="popup_load_cancel">
<button class="btn btn-secondary popup_load_cancel_button" id="popup_accept">Load</button>
<button class="btn btn-primary popup_load_cancel_button" id="popup_cancel" onclick='document.getElementById("popup").classList.add("hidden");'>Cancel</button>
</div>
</div>
</div>
</div>
</body>
</html>

35
templates/swagger-ui.html Normal file

@@ -0,0 +1,35 @@
{# This is the HTML template for Swagger UI (the GUI for the API documentation at /api/latest/docs) #}
<!DOCTYPE html>
<html lang="en">
<head>
<title>KoboldAI API</title>
<meta charset="UTF-8">
<link rel="stylesheet" type="text/css" href="/static/swagger-ui/swagger-ui.css" />
<link rel="stylesheet" type="text/css" href="/static/swagger-ui/index.css" />
<script>
if (window.matchMedia && window.matchMedia("(prefers-color-scheme: dark)").matches) document.write('<link rel="stylesheet" type="text/css" href="/static/swagger-ui/SwaggerDark.css" />');
</script>
</head>
<body>
<div id="swagger-ui"></div>
<script src="/static/swagger-ui/swagger-ui-bundle.js" charset="UTF-8"></script>
<script>
window.onload = function() {
window.ui = SwaggerUIBundle({
url: "{{ url }}",
oauth2RedirectUrl: "/static/swagger-ui/oauth2-redirect.html",
dom_id: "#swagger-ui",
deepLinking: true,
defaultModelsExpandDepth: 0, // Causes the "Schemas" section at the bottom to be collapsed by default
presets: [
SwaggerUIBundle.presets.apis
],
plugins: [
SwaggerUIBundle.plugins.DownloadUrl
],
layout: "BaseLayout"
});
};
</script>
</body>
</html>
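
As the template's own comment says, this page is the GUI for the API documentation served at /api/latest/docs, and it expects a url variable pointing at the OpenAPI specification. A minimal sketch of how such a template is typically rendered from Flask is shown below; the route decorator and the spec path passed as url are illustrative assumptions, not the exact wiring used in aiserver.py:

from flask import Flask, render_template

app = Flask(__name__)

@app.route("/api/latest/docs/")
def api_docs():
    # "url" tells SwaggerUIBundle which OpenAPI spec to load into the UI.
    return render_template("swagger-ui.html", url="/api/latest/openapi.json")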

226
test_aiserver.py Normal file

@@ -0,0 +1,226 @@
import pytest, time
import aiserver

#Test Model List:
test_models = [
    ('EleutherAI/gpt-neo-1.3B', {'key': False, 'gpu': False, 'layer_count': 24, 'breakmodel': True, 'url': False}),
    ('gpt2', {'key': False, 'gpu': False, 'layer_count': 12, 'breakmodel': True, 'url': False}),
    ('facebook/opt-350m', {'key': False, 'gpu': False, 'layer_count': 24, 'breakmodel': True, 'url': False})
]

@pytest.fixture
def client_data():
    app = aiserver.app
    #app.test_client_class = FlaskLoginClient
    client_conn = app.test_client()
    socketio_client = aiserver.socketio.test_client(app, flask_test_client=client_conn)
    #Clear out the connection message
    response = socketio_client.get_received()
    return (client_conn, app, socketio_client)

def get_model_menu(model):
    for menu in aiserver.model_menu:
        for item in aiserver.model_menu[menu]:
            if item[1] == model:
                for main_menu_line in aiserver.model_menu['mainmenu']:
                    if main_menu_line[1] == menu:
                        return (menu, main_menu_line, item)
    return None

def generate_story_data(client_data):
    (client, app, socketio_client) = client_data
    socketio_client.emit('message',{'cmd': 'submit', 'allowabort': False, 'actionmode': 0, 'chatname': None, 'data': ''})
    #wait until the game state turns back to start
    state = 'wait'
    new_text = None
    start_time = time.time()
    timeout = time.time() + 60*1
    while state == 'wait':
        if time.time() > timeout:
            break
        responses = socketio_client.get_received()
        for response in responses:
            response = response['args'][0]
            print(response)
            if response['cmd'] == 'setgamestate':
                state = response['data']
            elif response['cmd'] == 'updatechunk' or response['cmd'] == 'genseqs':
                new_text = response['data']
        time.sleep(0.1)
    assert new_text is not None

def test_basic_connection(client_data):
    (client, app, socketio_client) = client_data
    response = client.get("/")
    assert response.status_code == 200

def test_load_story_from_web_ui(client_data):
    (client, app, socketio_client) = client_data
    #List out the stories and make sure we have the sample story
    socketio_client.emit('message',{'cmd': 'loadlistrequest', 'data': ''})
    response = socketio_client.get_received()[0]['args'][0]['data']
    found_sample_story = False
    for story in response:
        if story['name'] == 'sample_story':
            found_sample_story = True
    assert found_sample_story
    #Click on the sample story, then click load
    socketio_client.emit('message',{'cmd': 'loadselect', 'data': 'sample_story'})
    socketio_client.emit('message',{'cmd': 'loadrequest', 'data': ''})
    #Wait until we get the data back from the load
    loaded_story = False
    timeout = time.time() + 60*2
    while not loaded_story:
        if time.time() > timeout:
            break
        responses = socketio_client.get_received()
        for response in responses:
            response = response['args'][0]
            if 'cmd' not in response:
                print(response)
                assert False
            if response['cmd'] == 'updatescreen':
                loaded_story = True
                story_text = response['data']
                break
    assert loaded_story
    #Verify that it's the right story data
    assert story_text == '<chunk n="0" id="n0" tabindex="-1">Niko the kobold stalked carefully down the alley, his small scaly figure obscured by a dusky cloak that fluttered lightly in the cold winter breeze. Holding up his tail to keep it from dragging in the dirty snow that covered the cobblestone, he waited patiently for the butcher to turn his attention from his stall so that he could pilfer his next meal: a tender-looking</chunk><chunk n="1" id="n1" tabindex="-1"> chicken. He crouched just slightly as he neared the stall to ensure that no one was watching, not that anyone would be dumb enough to hassle a small kobold. What else was there for a lowly kobold to</chunk><chunk n="2" id="n2" tabindex="-1"> do in a city? All that Niko needed to know was</chunk><chunk n="3" id="n3" tabindex="-1"> where to find the chicken and then how to make off with it.<br/><br/>A soft thud caused Niko to quickly lift his head. Standing behind the stall where the butcher had been cutting his chicken,</chunk>'

@pytest.mark.parametrize("model, expected_load_options", test_models)
def test_load_model_from_web_ui(client_data, model, expected_load_options):
    (client, app, socketio_client) = client_data
    #Clear out any old messages
    response = socketio_client.get_received()
    (menu, menu_line, model_line) = get_model_menu(model)
    #Send the ai load model menu option
    socketio_client.emit('message',{'cmd': 'list_model', 'data': 'mainmenu'})
    response = socketio_client.get_received()[0]['args'][0]['data']
    assert menu_line in response
    #Send the click model menu option
    socketio_client.emit('message',{'cmd': 'list_model', 'data': menu, 'pretty_name': ""})
    response = socketio_client.get_received()[0]['args'][0]['data']
    assert model_line in response
    #Click the model
    socketio_client.emit('message',{'cmd': 'selectmodel', 'data': model})
    response = socketio_client.get_received()[0]['args'][0]
    #Check that we're getting the right load options
    print(response)
    assert response['key'] == expected_load_options['key']
    assert response['gpu'] == expected_load_options['gpu']
    assert response['layer_count'] == expected_load_options['layer_count']
    assert response['breakmodel'] == expected_load_options['breakmodel']
    assert response['url'] == expected_load_options['url']
    #Now send the load
    socketio_client.emit('message',{'cmd': 'load_model', 'use_gpu': True, 'key': '', 'gpu_layers': str(expected_load_options['layer_count']), 'disk_layers': '0', 'url': '', 'online_model': ''})
    #wait until the game state turns back to start
    state = 'wait'
    start_time = time.time()
    timeout = time.time() + 60*2
    while state == 'wait':
        if time.time() > timeout:
            break
        responses = socketio_client.get_received()
        for response in responses:
            response = response['args'][0]
            if response['cmd'] == 'setgamestate':
                state = response['data']
        time.sleep(0.1)
    #Give it a second to get all of the settings, etc and clear out the messages
    responses = socketio_client.get_received()
    #check the model info to see if it's loaded
    socketio_client.emit('message',{'cmd': 'show_model', 'data': ''})
    response = socketio_client.get_received()[0]['args'][0]
    assert response == {'cmd': 'show_model_name', 'data': model}
    generate_story_data(client_data)

def test_load_GooseAI_from_web_ui(client_data):
    pytest.skip("unsupported configuration")

@pytest.mark.parametrize("model, expected_load_options", test_models)
def test_load_model_from_command_line(client_data, model, expected_load_options):
    (client, app, socketio_client) = client_data
    #Clear out any old messages
    response = socketio_client.get_received()
    (menu, menu_line, model_line) = get_model_menu(model)
    aiserver.general_startup("--model {}".format(model))
    aiserver.load_model(initial_load=True)
    #check the model info to see if it's loaded
    socketio_client.emit('message',{'cmd': 'show_model', 'data': ''})
    response = socketio_client.get_received()[0]['args'][0]
    assert response == {'cmd': 'show_model_name', 'data': model}
    generate_story_data(client_data)
def test_back_redo(client_data):
(client, app, socketio_client) = client_data
#Make sure we have known story in the ui
test_load_story_from_web_ui(client_data)
#Clear out any old messages
response = socketio_client.get_received()
#run a back action
socketio_client.emit('message',{'cmd': 'back', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'removechunk', 'data': 3}
#Run a redo action
socketio_client.emit('message',{'cmd': 'redo', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'updatechunk', 'data': {'index': 3, 'html': '<chunk n="3" id="n3" tabindex="-1"> where to find the chicken and then how to make off with it.<br/><br/>A soft thud caused Niko to quickly lift his head. Standing behind the stall where the butcher had been cutting his chicken,</chunk>'}}
#Go all the way back, then all the way forward
socketio_client.emit('message',{'cmd': 'back', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'removechunk', 'data': 3}
socketio_client.emit('message',{'cmd': 'back', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'removechunk', 'data': 2}
socketio_client.emit('message',{'cmd': 'back', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'removechunk', 'data': 1}
socketio_client.emit('message',{'cmd': 'back', 'data': ''})
response = socketio_client.get_received()[0]['args'][0]
assert response == {'cmd': 'errmsg', 'data': 'Cannot delete the prompt.'}
socketio_client.emit('message',{'cmd': 'redo', 'data': ''})
response = socketio_client.get_received()
assert response == [{'name': 'from_server', 'args': [{'cmd': 'updatescreen', 'gamestarted': True, 'data': '<chunk n="0" id="n0" tabindex="-1">Niko the kobold stalked carefully down the alley, his small scaly figure obscured by a dusky cloak that fluttered lightly in the cold winter breeze. Holding up his tail to keep it from dragging in the dirty snow that covered the cobblestone, he waited patiently for the butcher to turn his attention from his stall so that he could pilfer his next meal: a tender-looking</chunk><chunk n="1" id="n1" tabindex="-1"> chicken. He crouched just slightly as he neared the stall to ensure that no one was watching, not that anyone would be dumb enough to hassle a small kobold. What else was there for a lowly kobold to</chunk>'}], 'namespace': '/'},
{'name': 'from_server', 'args': [{'cmd': 'texteffect', 'data': 1}], 'namespace': '/'}]
socketio_client.emit('message',{'cmd': 'redo', 'data': ''})
response = socketio_client.get_received()
assert response == [{'name': 'from_server', 'args': [{'cmd': 'updatechunk', 'data': {'index': 2, 'html': '<chunk n="2" id="n2" tabindex="-1"> do in a city? All that Niko needed to know was</chunk>'}}], 'namespace': '/'},
{'name': 'from_server', 'args': [{'cmd': 'texteffect', 'data': 2}], 'namespace': '/'}]
socketio_client.emit('message',{'cmd': 'redo', 'data': ''})
response = socketio_client.get_received()
assert response == [{'name': 'from_server', 'args': [{'cmd': 'updatechunk', 'data': {'index': 3, 'html': '<chunk n="3" id="n3" tabindex="-1"> where to find the chicken and then how to make off with it.<br/><br/>A soft thud caused Niko to quickly lift his head. Standing behind the stall where the butcher had been cutting his chicken,</chunk>'}}], 'namespace': '/'},
{'name': 'from_server', 'args': [{'cmd': 'texteffect', 'data': 3}], 'namespace': '/'}]
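The model-load test above polls the socket.io test client in a loop until the server reports that the game state is back to `start`, giving up after two minutes. The same pattern could be factored into a small helper; the sketch below is illustrative only (the helper name is hypothetical and not part of the test suite), assuming the same flask_socketio test client used in these tests.

```python
import time

def wait_for_game_state(socketio_client, target="start", timeout_seconds=120, poll_interval=0.1):
    # Poll the flask_socketio test client until a 'setgamestate' message with
    # the target state arrives, or until the timeout expires.
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        for message in socketio_client.get_received():
            payload = message["args"][0]
            if payload.get("cmd") == "setgamestate" and payload.get("data") == target:
                return True
        time.sleep(poll_interval)
    return False
```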

torch_lazy_loader.py

@ -50,8 +50,13 @@ import itertools
import zipfile
import pickle
import torch
import numpy as np
import collections
import _codecs
import utils
import os
from torch.nn import Module
from typing import Any, Callable, Dict, Optional, Tuple, Union
from typing import Any, Callable, Dict, Optional, Tuple, Type, Union
_EXTRA_STATE_KEY_SUFFIX = '_extra_state'
@ -89,12 +94,16 @@ class LazyTensor:
def __repr__(self):
return self.__view(repr)
def materialize(self, checkpoint: Union[zipfile.ZipFile, zipfile.ZipExtFile], map_location=None, no_grad=True) -> torch.Tensor:
def materialize(self, checkpoint: Union[zipfile.ZipFile, zipfile.ZipExtFile], map_location=None, no_grad=True, filename="pytorch_model.bin") -> torch.Tensor:
filename = os.path.basename(os.path.normpath(filename)).split('.')[0]
size = reduce(lambda x, y: x * y, self.shape, 1)
dtype = self.dtype
nbytes = size if dtype is torch.bool else size * ((torch.finfo if dtype.is_floating_point else torch.iinfo)(dtype).bits >> 3)
if isinstance(checkpoint, zipfile.ZipFile):
f = checkpoint.open(f"archive/data/{self.key}", "r")
try:
f = checkpoint.open(f"archive/data/{self.key}", "r")
except:
f = checkpoint.open(f"{filename}/data/{self.key}", "r")
f.read(self.seek_offset)
else:
f = checkpoint
@ -110,8 +119,50 @@ class LazyTensor:
tensor._backward_hooks = self.backward_hooks
return tensor
class RestrictedUnpickler(pickle.Unpickler):
def original_persistent_load(self, saved_id):
return super().persistent_load(saved_id)
class _LazyUnpickler(pickle.Unpickler):
def forced_persistent_load(self, saved_id):
if saved_id[0] != "storage":
raise pickle.UnpicklingError("`saved_id[0]` must be 'storage'")
return self.original_persistent_load(saved_id)
def find_class(self, module, name):
if module == "collections" and name == "OrderedDict":
return collections.OrderedDict
elif module == "torch._utils" and name == "_rebuild_tensor_v2":
return torch._utils._rebuild_tensor_v2
elif module == "torch" and name in (
"DoubleStorage",
"FloatStorage",
"HalfStorage",
"LongStorage",
"IntStorage",
"ShortStorage",
"CharStorage",
"ByteStorage",
"BoolStorage",
"BFloat16Storage",
):
return getattr(torch, name)
elif module == "numpy.core.multiarray" and name == "scalar":
return np.core.multiarray.scalar
elif module == "numpy" and name == "dtype":
return np.dtype
elif module == "_codecs" and name == "encode":
return _codecs.encode
else:
# Forbid everything else.
qualified_name = name if module == "__builtin__" else f"{module}.{name}"
raise pickle.UnpicklingError(f"`{qualified_name}` is forbidden; the model you are loading probably contains malicious code")
def load(self, *args, **kwargs):
self.original_persistent_load = getattr(self, "persistent_load", pickle.Unpickler.persistent_load)
self.persistent_load = self.forced_persistent_load
return super().load(*args, **kwargs)
class _LazyUnpickler(RestrictedUnpickler):
lazy_loaded_storages: Dict[str, LazyTensor]
def __init__(self, *args, **kwargs):
@ -126,7 +177,6 @@ class _LazyUnpickler(pickle.Unpickler):
return LazyTensor(storage_type, key, location)
def load(self, *args, **kwargs):
self.persistent_load = self.forced_persistent_load
retval = super().load(*args, **kwargs)
self.lazy_loaded_storages = {}
return retval
@ -213,15 +263,32 @@ def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict, miss
@contextlib.contextmanager
def use_lazy_torch_load(enable=True, callback: Optional[Callable] = None, dematerialized_modules=False):
def use_custom_unpickler(unpickler: Type[pickle.Unpickler] = RestrictedUnpickler):
try:
old_unpickler = pickle.Unpickler
pickle.Unpickler = unpickler
old_pickle_load = pickle.load
def new_pickle_load(*args, **kwargs):
return pickle.Unpickler(*args, **kwargs).load()
pickle.load = new_pickle_load
yield
finally:
pickle.Unpickler = old_unpickler
pickle.load = old_pickle_load
@contextlib.contextmanager
def use_lazy_torch_load(enable=True, callback: Optional[Callable] = None, dematerialized_modules=False, use_accelerate_init_empty_weights=False):
if not enable:
yield False
with use_custom_unpickler(RestrictedUnpickler):
yield False
return
try:
old_unpickler = pickle.Unpickler
pickle.Unpickler = _LazyUnpickler
old_rebuild_tensor = torch._utils._rebuild_tensor
torch._utils._rebuild_tensor = _rebuild_tensor
@ -236,33 +303,41 @@ def use_lazy_torch_load(enable=True, callback: Optional[Callable] = None, demate
torch.load = torch_load
if dematerialized_modules:
old_linear_init = torch.nn.Linear.__init__
old_embedding_init = torch.nn.Embedding.__init__
old_layernorm_init = torch.nn.LayerNorm.__init__
if use_accelerate_init_empty_weights and utils.HAS_ACCELERATE:
import accelerate
init_empty_weights = accelerate.init_empty_weights()
init_empty_weights.__enter__()
else:
old_linear_init = torch.nn.Linear.__init__
old_embedding_init = torch.nn.Embedding.__init__
old_layernorm_init = torch.nn.LayerNorm.__init__
def linear_init(self, *args, device=None, **kwargs):
return old_linear_init(self, *args, device="meta", **kwargs)
def linear_init(self, *args, device=None, **kwargs):
return old_linear_init(self, *args, device="meta", **kwargs)
def embedding_init(self, *args, device=None, **kwargs):
return old_embedding_init(self, *args, device="meta", **kwargs)
def embedding_init(self, *args, device=None, **kwargs):
return old_embedding_init(self, *args, device="meta", **kwargs)
def layernorm_init(self, *args, device=None, **kwargs):
return old_layernorm_init(self, *args, device="meta", **kwargs)
def layernorm_init(self, *args, device=None, **kwargs):
return old_layernorm_init(self, *args, device="meta", **kwargs)
torch.nn.Linear.__init__ = linear_init
torch.nn.Embedding.__init__ = embedding_init
torch.nn.LayerNorm.__init__ = layernorm_init
old_load_from_state_dict = torch.nn.Module._load_from_state_dict
torch.nn.Module._load_from_state_dict = _load_from_state_dict
torch.nn.Linear.__init__ = linear_init
torch.nn.Embedding.__init__ = embedding_init
torch.nn.LayerNorm.__init__ = layernorm_init
old_load_from_state_dict = torch.nn.Module._load_from_state_dict
torch.nn.Module._load_from_state_dict = _load_from_state_dict
yield True
with use_custom_unpickler(_LazyUnpickler):
yield True
finally:
pickle.Unpickler = old_unpickler
torch._utils._rebuild_tensor = old_rebuild_tensor
torch.load = old_torch_load
if dematerialized_modules:
torch.nn.Linear.__init__ = old_linear_init
torch.nn.Embedding.__init__ = old_embedding_init
torch.nn.LayerNorm.__init__ = old_layernorm_init
torch.nn.Module._load_from_state_dict = old_load_from_state_dict
if use_accelerate_init_empty_weights and utils.HAS_ACCELERATE:
init_empty_weights.__exit__(None, None, None)
else:
torch.nn.Linear.__init__ = old_linear_init
torch.nn.Embedding.__init__ = old_embedding_init
torch.nn.LayerNorm.__init__ = old_layernorm_init
torch.nn.Module._load_from_state_dict = old_load_from_state_dict
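The hunks above add `RestrictedUnpickler`, whose `find_class` only resolves a fixed allowlist of globals and raises `UnpicklingError` for anything else, plus a `use_custom_unpickler` context manager that temporarily swaps out `pickle.Unpickler` and `pickle.load`. The standalone sketch below shows the allowlist idea in isolation; the class and variable names are illustrative, not the project's API.

```python
import io
import pickle
import collections

# Allowlist of (module, name) pairs that may be reconstructed from a pickle.
ALLOWED = {("collections", "OrderedDict"): collections.OrderedDict}

class AllowlistUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        try:
            return ALLOWED[(module, name)]
        except KeyError:
            raise pickle.UnpicklingError(f"{module}.{name} is forbidden")

safe = pickle.dumps(collections.OrderedDict(a=1))
print(AllowlistUnpickler(io.BytesIO(safe)).load())  # OrderedDict([('a', 1)])

evil = pickle.dumps(print)  # stands in for an arbitrary callable in a malicious checkpoint
try:
    AllowlistUnpickler(io.BytesIO(evil)).load()
except pickle.UnpicklingError as e:
    print("blocked:", e)
```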

tpu_mtj_backend.py

@ -46,7 +46,7 @@ from jax.experimental import maps
import jax.numpy as jnp
import numpy as np
import haiku as hk
from transformers import AutoTokenizer, GPT2TokenizerFast, AutoModelForCausalLM, GPTNeoForCausalLM
from transformers import AutoTokenizer, GPT2Tokenizer, AutoModelForCausalLM, GPTNeoForCausalLM
from tokenizers import Tokenizer
from mesh_transformer.checkpoint import read_ckpt_lowmem
from mesh_transformer.transformer_shard import CausalTransformer, CausalTransformerShard, PlaceholderTensor
@ -55,6 +55,31 @@ from mesh_transformer.util import to_bf16
params: Dict[str, Any] = {}
__seed = random.randrange(2**64)
rng = random.Random(__seed)
def get_rng_seed():
return __seed
def set_rng_seed(seed: int):
global __seed, rng
rng = random.Random(seed)
__seed = seed
return seed
def randomize_rng_seed():
return set_rng_seed(random.randrange(2**64))
def get_rng_state():
return rng
def set_rng_state(state):
global rng
rng = state
def new_rng_state(seed: int):
return random.Random(seed)
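These helpers replace uses of the process-global `random` module (see the `hk.PRNGSequence(rng.randint(...))` changes later in this diff) with a reseedable module-level `random.Random`, so generation can be made reproducible. A self-contained sketch of the same pattern, with illustrative names:

```python
import random

class SeedableRng:
    # Keep a module/object-level random.Random so callers can reseed
    # deterministically instead of relying on the process-global RNG.
    def __init__(self):
        self.set_seed(random.randrange(2 ** 64))

    def set_seed(self, seed: int) -> int:
        self.seed = seed
        self.rng = random.Random(seed)
        return seed

gen = SeedableRng()
gen.set_seed(1234)
first = gen.rng.randint(0, 2 ** 60)
gen.set_seed(1234)
assert gen.rng.randint(0, 2 ** 60) == first  # same seed, same PRNG key stream
```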
def warper_callback(logits) -> np.array:
raise NotImplementedError("`tpu_mtj_backend.warper_callback()` needs to be defined")
@ -167,7 +192,7 @@ def apply_repetition_penalty_dynamic(logits, tokens, repetition_penalty, generat
logits[tokens] = penalty_logits
return logits
def kobold_sample_dynamic(key, logits, sampler_order: Optional[np.ndarray] = None, top_p=0.9, temp=0.5, top_k=0, tfs=1.0, typical=1.0, top_a=0.0):
def kobold_sample_dynamic(key, logits, rpargs, sampler_order: Optional[np.ndarray] = None, top_p=0.9, temp=0.5, top_k=0, tfs=1.0, typical=1.0, top_a=0.0):
'''
This gets called by generate_loop_fn to apply a series of 6 filters
to the logits (top-k, then top-a, then top-p, then TFS, then typical, then temperature)
@ -303,6 +328,7 @@ def kobold_sample_dynamic(key, logits, sampler_order: Optional[np.ndarray] = Non
if k == 3 and tfs < 1.0: logits = tail_free_filter(logits)
if k == 4 and typical < 1.0: logits = typical_filter(logits)
if k == 5 and temp != 1.0: logits = temp_filter(logits)
if k == 6 and rpargs[1] != 1.0: logits = apply_repetition_penalty_dynamic(logits, *rpargs)
# Finally, pick one token using the softmax thingy again (it gives
# an array whose elements sum to 1 so it can be used nicely as a
# probability distribution)
@ -353,7 +379,7 @@ def apply_repetition_penalty_static(logits, tokens, repetition_penalty, generate
# positions in the logits array
return logits.at[tokens].set(penalty_logits)
def kobold_sample_static(key, logits, sampler_order: Optional[np.ndarray] = None, top_p=0.9, temp=0.5, top_k=0, tfs=1.0, typical=1.0, top_a=0.0):
def kobold_sample_static(key, logits, rpargs, sampler_order: Optional[np.ndarray] = None, top_p=0.9, temp=0.5, top_k=0, tfs=1.0, typical=1.0, top_a=0.0):
'''
This gets called by generate_loop_fn to apply a series of 6 filters
to the logits (top-k, then top-a, then top-p, then TFS, then typical, then temperature)
@ -488,6 +514,7 @@ def kobold_sample_static(key, logits, sampler_order: Optional[np.ndarray] = None
logits = jax.lax.cond(jnp.logical_and(k == 3, tfs < 1.0), tail_free_filter, lambda x: x, logits)
logits = jax.lax.cond(jnp.logical_and(k == 4, typical < 1.0), typical_filter, lambda x: x, logits)
logits = jax.lax.cond(jnp.logical_and(k == 5, temp != 1.0), temp_filter, lambda x: x, logits)
logits = jax.lax.cond(jnp.logical_and(k == 6, rpargs[1] != 1.0), lambda x: apply_repetition_penalty_static(*x), lambda x: x[0], (logits, *rpargs))
# Finally, pick one token using the softmax thingy again (it gives
# an array whose elements sum to 1 so it can be used nicely as a
# probability distribution)
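This hunk folds the repetition penalty into the same sampler-order dispatch as the other logit filters, identified by id 6 (the new default order in `utils.py` later in this diff is `[6, 0, 1, 2, 3, 4, 5]`). A rough sketch of such an id-dispatched filter chain follows; the filter bodies are simplified stand-ins, not the project's actual implementations.

```python
import numpy as np

def apply_filters(logits, sampler_order, top_k=0, temp=1.0, rep_penalty=1.0, generated=()):
    # Apply filters in the order given by their integer ids; unknown ids are skipped.
    logits = logits.copy()
    for k in sampler_order:
        if k == 0 and top_k > 0:
            cutoff = np.sort(logits)[-top_k]          # keep only the top_k highest logits
            logits[logits < cutoff] = -np.inf
        elif k == 5 and temp != 1.0:
            logits = logits / temp                    # temperature
        elif k == 6 and rep_penalty != 1.0:
            for tok in generated:                     # penalize already-generated tokens
                logits[tok] = logits[tok] / rep_penalty if logits[tok] > 0 else logits[tok] * rep_penalty
    return logits

order = [6, 0, 1, 2, 3, 4, 5]  # repetition penalty first, matching the new default
print(apply_filters(np.array([2.0, 1.0, -1.0, 0.5]), order, top_k=3, temp=0.8, rep_penalty=1.2, generated=[0]))
```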
@ -504,17 +531,6 @@ def sample_func(data, key, numseqs_aux, badwords, repetition_penalty, generated_
# Get the pseudo-random number generator key that will
# be used by kobold_sample_dynamic to randomly pick a token
sample_key, new_key = jax.random.split(sample_key, num=2)
# Apply repetition penalty to all tokens that are
# currently inside the "generated" array
logits = apply_repetition_penalty_dynamic(
logits,
generated,
repetition_penalty,
generated_index,
gen_length,
rpslope,
rprange,
)
# Remove any tokens in the badwords list by setting
# their logits to negative infinity which effectively
# makes their probabilities of being chosen zero
@ -526,6 +542,14 @@ def sample_func(data, key, numseqs_aux, badwords, repetition_penalty, generated_
next_token = kobold_sample_dynamic(
sample_key,
logits,
(
generated,
repetition_penalty,
generated_index,
gen_length,
rpslope,
rprange,
),
**sampler_options,
)
# Remember what token was picked
@ -597,18 +621,6 @@ class PenalizingCausalTransformer(CausalTransformer):
assert logits.shape == (1, config["n_vocab"])
# Flatten it into a 1D array to make it easier to use
logits = logits[0]
# Apply repetition penalty to all tokens that are
# currently inside the "generated" array
if repetition_penalty is not None:
logits = apply_repetition_penalty_static(
logits,
generated,
repetition_penalty,
generated_index,
gen_length,
rpslope,
rprange,
)
# Remove any tokens in the badwords list by setting
# their logits to negative infinity which effectively
# makes their probabilities of being chosen zero
@ -620,6 +632,14 @@ class PenalizingCausalTransformer(CausalTransformer):
next_token = kobold_sample_static(
sample_key,
logits,
(
generated,
repetition_penalty,
generated_index,
gen_length,
rpslope,
rprange,
),
**sampler_options,
)
# Remember what token was picked
@ -735,7 +755,7 @@ class PenalizingCausalTransformer(CausalTransformer):
assert not return_logits
assert gen_length.ndim == 1
assert soft_embeddings is not None
key = hk.PRNGSequence(random.randint(0, 2 ** 60))
key = hk.PRNGSequence(rng.randint(0, 2 ** 60))
batch_size = ctx.shape[0]
self.batch_size = batch_size
_numseqs_aux = jnp.empty((batch_size, numseqs), dtype=np.uint32)
@ -783,7 +803,7 @@ class PenalizingCausalTransformer(CausalTransformer):
return sample_data, n_generated, regeneration_required, halt
def generate_static(self, ctx, ctx_length, gen_length, numseqs, sampler_options, return_logits=False, soft_embeddings=None):
assert not return_logits
key = hk.PRNGSequence(random.randint(0, 2 ** 60))
key = hk.PRNGSequence(rng.randint(0, 2 ** 60))
batch_size = ctx.shape[0]
self.batch_size = batch_size
started_compiling_callback()
@ -854,6 +874,9 @@ def infer_static(
maps.thread_resources.env = thread_resources_env
if sampler_order is None:
sampler_order = utils.default_sampler_order.copy()
sampler_order = sampler_order[:]
if len(sampler_order) < 7: # Add repetition penalty at beginning if it's not present
sampler_order = [6] + sampler_order
sampler_order = np.uint32(sampler_order)
total_batch = 1
tokens = context
@ -932,6 +955,7 @@ def read_neox_checkpoint(state, path, config, checkpoint_shards=2):
import torch
import torch.utils.dlpack
import torch_lazy_loader
from tqdm.auto import tqdm
move_xmap = jax.experimental.maps.xmap(
@ -973,8 +997,9 @@ def read_neox_checkpoint(state, path, config, checkpoint_shards=2):
continue
layer = checkpoint_layer - 2
shards = []
for checkpoint_shard in range(checkpoint_shards):
shards.append(torch.load(path_template.format(layer=checkpoint_layer, shard=checkpoint_shard), map_location="cpu"))
with torch_lazy_loader.use_custom_unpickler(torch_lazy_loader.RestrictedUnpickler):
for checkpoint_shard in range(checkpoint_shards):
shards.append(torch.load(path_template.format(layer=checkpoint_layer, shard=checkpoint_shard), map_location="cpu"))
for key in shards[0]:
if key == "attention.rotary_emb.inv_freq":
continue
@ -1024,8 +1049,13 @@ def read_neox_checkpoint(state, path, config, checkpoint_shards=2):
raise RuntimeError(error)
def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpoint=False, **kwargs) -> None:
global thread_resources_env, seq, tokenizer, network, params
def load_model(path: str, driver_version="tpu_driver_20221109", hf_checkpoint=False, socketio_queue=None, initial_load=False, logger=None, **kwargs) -> None:
global thread_resources_env, seq, tokenizer, network, params, pad_token_id
if "pad_token_id" in kwargs:
pad_token_id = kwargs["pad_token_id"]
elif "eos_token_id" in kwargs:
pad_token_id = kwargs["eos_token_id"]
if not hasattr(vars, "sampler_order") or not vars.sampler_order:
vars.sampler_order = utils.default_sampler_order.copy()
@ -1042,7 +1072,7 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
"pe_rotary_dims": 64,
"seq": 2048,
"cores_per_replica": 8,
"tokenizer_class": "GPT2TokenizerFast",
"tokenizer_class": "GPT2Tokenizer",
"tokenizer": "gpt2",
}
params = kwargs
@ -1060,7 +1090,7 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
"pe_rotary_dims": 24,
"seq": 2048,
"cores_per_replica": 8,
"tokenizer_class": "GPT2TokenizerFast",
"tokenizer_class": "GPT2Tokenizer",
"tokenizer": "gpt2",
}
@ -1118,6 +1148,10 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
if param not in params:
params[param] = default_params[param]
# Use an optimization that will allow us to avoid one extra transpose operation
if hf_checkpoint:
params["transposed_linear"] = True
# Load tokenizer
if vars.model == "TPUMeshTransformerGPTNeoX":
tokenizer = Tokenizer.from_file(os.path.join(path, "20B_tokenizer.json"))
@ -1161,10 +1195,6 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
thread_resources_env = maps.ResourceEnv(maps.Mesh(devices, ('dp', 'mp')), ())
maps.thread_resources.env = thread_resources_env
global shard_xmap, batch_xmap
shard_xmap = __shard_xmap()
batch_xmap = __batch_xmap(shard_dim=cores_per_replica)
global badwords
# These are the tokens that we don't want the AI to ever write
badwords = jnp.array(vars.badwordsids).squeeze()
@ -1210,6 +1240,7 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
from tqdm.auto import tqdm
import functools
def callback(model_dict, f, **_):
if callback.nested:
return
@ -1217,6 +1248,7 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
with zipfile.ZipFile(f, "r") as z:
try:
last_storage_key = None
zipfolder = os.path.basename(os.path.normpath(f)).split('.')[0]
f = None
current_offset = 0
if utils.current_shard == 0:
@ -1249,7 +1281,10 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
last_storage_key = storage_key
if isinstance(f, zipfile.ZipExtFile):
f.close()
f = z.open(f"archive/data/{storage_key}")
try:
f = z.open(f"archive/data/{storage_key}")
except:
f = z.open(f"{zipfolder}/data/{storage_key}")
current_offset = 0
if current_offset != model_dict[key].seek_offset:
f.read(model_dict[key].seek_offset - current_offset)
@ -1274,23 +1309,25 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
if "divide_by_shards" in transforms:
tensor /= params["cores_per_replica"]
if "vocab_pad" in transforms:
tensor = torch.nn.functional.pad(tensor, (0, 0, 0, params["n_vocab_padding"]))
if "no_transpose" not in transforms and tensor.ndim == 2:
tensor = tensor.T
tensor = torch.nn.functional.pad(tensor, (0,) * (tensor.ndim * 2 - 1) + (params["n_vocab_padding"],))
# We don't need to transpose linear module weights anymore because MTJ will do it for us if `transposed_linear` is set to True in the config
#if "no_transpose" not in transforms and tensor.ndim == 2:
# tensor = tensor.T
tensor.unsqueeze_(0)
if tensor.dtype is torch.float16 or tensor.dtype is torch.float32:
tensor = tensor.bfloat16()
# Shard the tensor so that parts of the tensor can be used
# on different TPU cores
tensor = reshard_reverse(
tensor,
params["cores_per_replica"],
network.state["params"][spec["module"]][spec["param"]].shape,
)
tensor = jnp.array(tensor.detach())
if tensor.dtype is torch.float16 or tensor.dtype is torch.float32:
tensor = tensor.bfloat16()
network.state["params"][spec["module"]][spec["param"]] = move_xmap(
jax.dlpack.from_dlpack(torch.utils.dlpack.to_dlpack(
reshard_reverse(
tensor,
params["cores_per_replica"],
network.state["params"][spec["module"]][spec["param"]].shape,
)
)).copy(),
tensor,
np.empty(params["cores_per_replica"]),
)
@ -1331,52 +1368,52 @@ def load_model(path: str, driver_version="tpu_driver0.1_dev20210607", hf_checkpo
print("\n", flush=True)
with torch_lazy_loader.use_lazy_torch_load(callback=callback, dematerialized_modules=True):
if(os.path.isdir(vars.custmodpth)):
try:
tokenizer = AutoTokenizer.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
except Exception as e:
pass
try:
tokenizer = AutoTokenizer.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache", use_fast=False)
except Exception as e:
try:
tokenizer = GPT2TokenizerFast.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
tokenizer = AutoTokenizer.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
tokenizer = GPT2Tokenizer.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2Tokenizer.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
model = AutoModelForCausalLM.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
except Exception as e:
model = GPTNeoForCausalLM.from_pretrained(vars.custmodpth, revision=vars.revision, cache_dir="cache")
elif(os.path.isdir("models/{}".format(vars.model.replace('/', '_')))):
try:
tokenizer = AutoTokenizer.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
except Exception as e:
pass
try:
tokenizer = AutoTokenizer.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache", use_fast=False)
except Exception as e:
try:
tokenizer = GPT2TokenizerFast.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
tokenizer = AutoTokenizer.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
tokenizer = GPT2Tokenizer.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2Tokenizer.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
model = AutoModelForCausalLM.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
except Exception as e:
model = GPTNeoForCausalLM.from_pretrained("models/{}".format(vars.model.replace('/', '_')), revision=vars.revision, cache_dir="cache")
else:
try:
tokenizer = AutoTokenizer.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
except Exception as e:
pass
try:
tokenizer = AutoTokenizer.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache", use_fast=False)
except Exception as e:
try:
tokenizer = GPT2TokenizerFast.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
tokenizer = AutoTokenizer.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
tokenizer = GPT2Tokenizer.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
except Exception as e:
tokenizer = GPT2Tokenizer.from_pretrained("gpt2", revision=vars.revision, cache_dir="cache")
try:
model = AutoModelForCausalLM.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
except Exception as e:
model = GPTNeoForCausalLM.from_pretrained(vars.model, revision=vars.revision, cache_dir="cache")
#network.state = network.move_xmap(network.state, np.zeros(cores_per_replica))
global shard_xmap, batch_xmap
shard_xmap = __shard_xmap()
batch_xmap = __batch_xmap(shard_dim=cores_per_replica)
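The tokenizer-loading changes above replace `GPT2TokenizerFast` with `GPT2Tokenizer` and wrap each path in a cascade of try/except fallbacks (slow `AutoTokenizer`, then fast `AutoTokenizer`, then `GPT2Tokenizer`, then plain `"gpt2"`). The same cascade could be expressed once as a helper; this is only a sketch, the helper name is hypothetical, and the broad exception handling mirrors the original code.

```python
from transformers import AutoTokenizer, GPT2Tokenizer

def load_tokenizer_with_fallbacks(path, revision=None, cache_dir="cache"):
    attempts = [
        lambda: AutoTokenizer.from_pretrained(path, revision=revision, cache_dir=cache_dir, use_fast=False),
        lambda: AutoTokenizer.from_pretrained(path, revision=revision, cache_dir=cache_dir),
        lambda: GPT2Tokenizer.from_pretrained(path, revision=revision, cache_dir=cache_dir),
        lambda: GPT2Tokenizer.from_pretrained("gpt2", revision=revision, cache_dir=cache_dir),
    ]
    last_error = None
    for attempt in attempts:
        try:
            return attempt()
        except Exception as e:  # the original code also swallows broad exceptions here
            last_error = e
    raise last_error
```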


@ -1,5 +1,7 @@
@echo off
cd /d %~dp0
SET CONDA_SHLVL=
TITLE KoboldAI - Updater
SET /P M=<loader.settings
IF %M%==1 GOTO drivemap


@ -500,6 +500,7 @@
<li>kwargs? (<code>table&lt;string, any&gt;</code>): Table of optional keyword arguments from the following list. Defaults to <code>{}</code>.
<ul>
<li>scan_story? (<code>boolean</code>): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to <code>false</code>, only the <code>submission</code> is scanned for world info keys. Defaults to <code>true</code>.</li>
<li>include_anote? (<code>boolean</code>): Whether to include the author's note in the story. Defaults to <code>true</code>, pass <code>false</code> to suppress including the author's note.</li>
</ul>
</li>
</ul>
@ -574,6 +575,7 @@
<li>kwargs? (<code>table&lt;string, any&gt;</code>): Table of optional keyword arguments from the following list. Defaults to <code>{}</code>.
<ul>
<li>scan_story? (<code>boolean</code>): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to <code>false</code>, only the <code>submission</code> is scanned for world info keys. Defaults to <code>true</code>.</li>
<li>include_anote? (<code>boolean</code>): Whether to include the author's note in the story. Defaults to <code>true</code>, pass <code>false</code> to suppress including the author's note.</li>
</ul>
</li>
</ul>
@ -687,6 +689,7 @@
<li>kwargs? (<code>table&lt;string, any&gt;</code>): Table of optional keyword arguments from the following list. Defaults to <code>{}</code>.
<ul>
<li>scan_story? (<code>boolean</code>): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to <code>false</code>, only the <code>submission</code> is scanned for world info keys. Defaults to <code>true</code>.</li>
<li>include_anote? (<code>boolean</code>): Whether to include the author's note in the story. Defaults to <code>true</code>, pass <code>false</code> to suppress including the author's note.</li>
</ul>
</li>
</ul>


@ -538,6 +538,7 @@ Computes the context that would be sent to the generator with the user's current
* entries? (`KoboldWorldInfoEntry|table<any, KoboldWorldInfoEntry>`): A `KoboldWorldInfoEntry` or table thereof that indicates an allowed subset of world info entries to include in the context. Defaults to all world info entries.
* kwargs? (`table<string, any>`): Table of optional keyword arguments from the following list. Defaults to `{}`.
* scan_story? (`boolean`): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to `false`, only the `submission` is scanned for world info keys. Defaults to `true`.
* include_anote? (`boolean`): Whether to include the author's note in the story. Defaults to `true`, pass `false` to suppress including the author's note.
### Returns
@ -636,6 +637,7 @@ The same as calling `kobold.worldinfo:compute_context()` with this world info en
* submission (`string`): String to use as simulated user's input after being formatted by input formatting.
* kwargs? (`table<string, any>`): Table of optional keyword arguments from the following list. Defaults to `{}`.
* scan_story? (`boolean`): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to `false`, only the `submission` is scanned for world info keys. Defaults to `true`.
* include_anote? (`boolean`): Whether to include the author's note in the story. Defaults to `true`, pass `false` to suppress including the author's note.
### Returns
@ -819,6 +821,7 @@ Unlike `kobold.worldinfo:compute_context()`, this function doesn't include world
* entries? (`KoboldWorldInfoEntry|table<any, KoboldWorldInfoEntry>`): A `KoboldWorldInfoEntry` or table thereof that indicates an allowed subset of world info entries to include in the context. Entries that are not inside of the folder are still not included. Defaults to all world info entries in the folder.
* kwargs? (`table<string, any>`): Table of optional keyword arguments from the following list. Defaults to `{}`.
* scan_story? (`boolean`): Whether or not to scan the past few actions of the story for world info keys in addition to the submission like how world info normally behaves. If this is set to `false`, only the `submission` is scanned for world info keys. Defaults to `true`.
* include_anote? (`boolean`): Whether to include the author's note in the story. Defaults to `true`, pass `false` to suppress including the author's note.
### Returns

utils.py

@ -8,6 +8,9 @@ from urllib.error import HTTPError
import requests
import requests.adapters
import time
from transformers import __version__ as transformers_version
from transformers import PreTrainedModel
import packaging.version
from tqdm.auto import tqdm
import os
import itertools
@ -15,9 +18,16 @@ import hashlib
import huggingface_hub
import packaging.version
from pathlib import Path
from typing import Optional
from typing import List, Optional
HAS_ACCELERATE = packaging.version.parse(transformers_version) >= packaging.version.parse("4.20.0.dev0")
try:
import accelerate
except ImportError:
HAS_ACCELERATE = False
vars = None
args = None
num_shards: Optional[int] = None
current_shard = 0
from_pretrained_model_name = ""
@ -25,7 +35,13 @@ from_pretrained_index_filename: Optional[str] = None
from_pretrained_kwargs = {}
bar = None
default_sampler_order = [0, 1, 2, 3, 4, 5]
layers_module_names: Optional[List[str]] = None
module_names: Optional[List[str]] = None
named_buffers: Optional[List[tuple]] = None
default_sampler_order = [6, 0, 1, 2, 3, 4, 5]
emit = None
#==================================================================#
# Decorator to prevent a function's actions from being run until
@ -111,7 +127,7 @@ def addsentencespacing(txt, vars):
else:
action = vars.prompt
lastchar = action[-1] if len(action) else ""
if(lastchar == "." or lastchar == "!" or lastchar == "?" or lastchar == "," or lastchar == ";" or lastchar == ":"):
if(lastchar != " "):
txt = " " + txt
return txt
@ -159,13 +175,33 @@ def decodenewlines(txt):
# Returns number of layers given an HF model config
#==================================================================#
def num_layers(config):
return config.num_layers if hasattr(config, "num_layers") else config.n_layer if hasattr(config, "n_layer") else config.num_hidden_layers
return config["n_layer"] if isinstance(config, dict) else config.num_layers if hasattr(config, "num_layers") else config.n_layer if hasattr(config, "n_layer") else config.num_hidden_layers if hasattr(config, 'num_hidden_layers') else None
#==================================================================#
# Downloads huggingface checkpoints using aria2c if possible
#==================================================================#
from flask_socketio import emit
def _download_with_aria2(aria2_config: str, total_length: int, directory: str = ".", user_agent=None, force_download=False, use_auth_token=None):
class Send_to_socketio(object):
def write(self, bar):
bar = bar.replace("\r", "").replace("\n", "")
if bar != "":
try:
print('\r' + bar, end='')
try:
emit('from_server', {'cmd': 'model_load_status', 'data': bar.replace(" ", "&nbsp;")}, broadcast=True)
except:
pass
eventlet.sleep(seconds=0)
except:
pass
def flush(self):
pass
import transformers
aria2_port = 6799 if vars is None else vars.aria2_port
lengths = {}
s = requests.Session()
s.mount("http://", requests.adapters.HTTPAdapter(max_retries=requests.adapters.Retry(total=120, backoff_factor=1)))
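The `Send_to_socketio` class in the hunk above is passed to tqdm as its `file=` target so download progress can be mirrored to the web UI over socket.io. Any object with `write()` and `flush()` works there; below is a minimal self-contained illustration, where a plain print sink stands in for the socket.io emit and all names are illustrative.

```python
from tqdm.auto import tqdm

class PrintSink:
    # Receives the rendered progress-bar text tqdm would normally write to
    # stderr; a real sink could forward it over socket.io instead.
    def write(self, text):
        text = text.replace("\r", "").replace("\n", "")
        if text:
            print("[progress]", text)

    def flush(self):
        pass

for _ in tqdm(range(3), desc="demo", file=PrintSink()):
    pass
```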
@ -176,9 +212,9 @@ def _download_with_aria2(aria2_config: str, total_length: int, directory: str =
with tempfile.NamedTemporaryFile("w+b", delete=False) as f:
f.write(aria2_config)
f.flush()
p = subprocess.Popen(["aria2c", "-x", "10", "-s", "10", "-j", "10", "--enable-rpc=true", f"--rpc-secret={secret}", "--rpc-listen-port", str(vars.aria2_port), "--disable-ipv6", "--file-allocation=trunc", "--allow-overwrite", "--auto-file-renaming=false", "-d", directory, "-i", f.name, "-U", transformers.file_utils.http_user_agent(user_agent)] + (["-c"] if not force_download else []) + ([f"--header='Authorization: Bearer {use_auth_token}'"] if use_auth_token else []), stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
p = subprocess.Popen(["aria2c", "-x", "10", "-s", "10", "-j", "10", "--enable-rpc=true", f"--rpc-secret={secret}", "--rpc-listen-port", str(aria2_port), "--disable-ipv6", "--file-allocation=trunc", "--allow-overwrite", "--auto-file-renaming=false", "-d", directory, "-i", f.name, "-U", transformers.file_utils.http_user_agent(user_agent)] + (["-c"] if not force_download else []) + ([f"--header='Authorization: Bearer {use_auth_token}'"] if use_auth_token else []), stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
while p.poll() is None:
r = s.post(f"http://localhost:{vars.aria2_port}/jsonrpc", json={"jsonrpc": "2.0", "id": "kai", "method": "aria2.tellActive", "params": [f"token:{secret}"]}).json()["result"]
r = s.post(f"http://localhost:{aria2_port}/jsonrpc", json={"jsonrpc": "2.0", "id": "kai", "method": "aria2.tellActive", "params": [f"token:{secret}"]}).json()["result"]
if not r:
s.close()
if bar is not None:
@ -188,7 +224,7 @@ def _download_with_aria2(aria2_config: str, total_length: int, directory: str =
done = True
break
if bar is None:
bar = tqdm(total=total_length, desc=f"[aria2] Downloading model", unit="B", unit_scale=True, unit_divisor=1000)
bar = tqdm(total=total_length, desc=f"[aria2] Downloading model", unit="B", unit_scale=True, unit_divisor=1000, file=Send_to_socketio())
visited = set()
for x in r:
filename = x["files"][0]["path"]
@ -225,7 +261,7 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
if token is None:
raise EnvironmentError("You specified use_auth_token=True, but a huggingface token was not found.")
_cache_dir = str(cache_dir) if cache_dir is not None else transformers.TRANSFORMERS_CACHE
_revision = revision if revision is not None else huggingface_hub.constants.DEFAULT_REVISION
_revision = args.revision if args.revision is not None else huggingface_hub.constants.DEFAULT_REVISION
sharded = False
headers = {"user-agent": transformers.file_utils.http_user_agent(user_agent)}
if use_auth_token:
@ -236,7 +272,7 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
def is_cached(filename):
try:
huggingface_hub.hf_hub_download(pretrained_model_name_or_path, filename, cache_dir=cache_dir, local_files_only=True)
huggingface_hub.hf_hub_download(pretrained_model_name_or_path, filename, cache_dir=cache_dir, local_files_only=True, revision=_revision)
except ValueError:
return False
return True
@ -245,7 +281,7 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
filename = transformers.modeling_utils.WEIGHTS_INDEX_NAME if sharded else transformers.modeling_utils.WEIGHTS_NAME
except AttributeError:
return
url = huggingface_hub.hf_hub_url(pretrained_model_name_or_path, filename, revision=revision)
url = huggingface_hub.hf_hub_url(pretrained_model_name_or_path, filename, revision=_revision)
if is_cached(filename) or requests.head(url, allow_redirects=True, proxies=proxies, headers=headers):
break
if sharded:
@ -259,7 +295,7 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
with open(map_filename) as f:
map_data = json.load(f)
filenames = set(map_data["weight_map"].values())
urls = [huggingface_hub.hf_hub_url(pretrained_model_name_or_path, n, revision=revision) for n in filenames]
urls = [huggingface_hub.hf_hub_url(pretrained_model_name_or_path, n, revision=_revision) for n in filenames]
if not force_download:
urls = [u for u, n in zip(urls, filenames) if not is_cached(n)]
if not urls:
@ -406,7 +442,7 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
headers = [requests.head(u, headers=headers, allow_redirects=True, proxies=proxies, timeout=10).headers for u in urls]
for n in filenames:
prefix, suffix = n.rsplit("/", 1)
prefix, suffix = n.rsplit(os.sep, 1)
path = os.path.join(prefix, "kai-tempfile." + suffix + ".aria2")
if os.path.exists(path):
os.remove(path)
@ -414,16 +450,17 @@ def _transformers22_aria2_hook(pretrained_model_name_or_path: str, force_downloa
if os.path.exists(path):
os.remove(path)
total_length = sum(int(h["Content-Length"]) for h in headers)
aria2_config = "\n".join(f"{u}\n out={os.path.join(prefix, 'kai-tempfile.' + suffix)}" for u, n in zip(urls, filenames) for prefix, suffix in [n.rsplit("/", 1)]).encode()
aria2_config = "\n".join(f"{u}\n out={os.path.join(prefix, 'kai-tempfile.' + suffix)}" for u, n in zip(urls, filenames) for prefix, suffix in [n.rsplit(os.sep, 1)]).encode()
_download_with_aria2(aria2_config, total_length, use_auth_token=token if use_auth_token else None, user_agent=user_agent, force_download=force_download)
for u, n in zip(urls, filenames):
prefix, suffix = n.rsplit("/", 1)
prefix, suffix = n.rsplit(os.sep, 1)
os.rename(os.path.join(prefix, "kai-tempfile." + suffix), os.path.join(prefix, suffix))
def aria2_hook(pretrained_model_name_or_path: str, force_download=False, cache_dir=None, proxies=None, resume_download=False, local_files_only=False, use_auth_token=None, user_agent=None, revision=None, **kwargs):
import transformers
import transformers.modeling_utils
from huggingface_hub import HfFolder
_revision = args.revision if args.revision is not None else huggingface_hub.constants.DEFAULT_REVISION
if shutil.which("aria2c") is None: # Don't do anything if aria2 is not installed
return
if local_files_only: # If local_files_only is true, we obviously don't need to download anything
@ -458,7 +495,7 @@ def aria2_hook(pretrained_model_name_or_path: str, force_download=False, cache_d
filename = transformers.modeling_utils.WEIGHTS_INDEX_NAME if sharded else transformers.modeling_utils.WEIGHTS_NAME
except AttributeError:
return
url = huggingface_hub.hf_hub_url(pretrained_model_name_or_path, filename, revision=revision)
url = huggingface_hub.hf_hub_url(pretrained_model_name_or_path, filename, revision=_revision)
if is_cached(url) or requests.head(url, allow_redirects=True, proxies=proxies, headers=headers):
break
if sharded:
@ -472,7 +509,7 @@ def aria2_hook(pretrained_model_name_or_path: str, force_download=False, cache_d
with open(map_filename) as f:
map_data = json.load(f)
filenames = set(map_data["weight_map"].values())
urls = [huggingface_hub.hf_hub_url(pretrained_model_name_or_path, n, revision=revision) for n in filenames]
urls = [huggingface_hub.hf_hub_url(pretrained_model_name_or_path, n, revision=_revision) for n in filenames]
if not force_download:
urls = [u for u in urls if not is_cached(u)]
if not urls:
@ -519,5 +556,56 @@ def get_num_shards(filename):
def get_sharded_checkpoint_num_tensors(pretrained_model_name_or_path, filename, cache_dir=None, force_download=False, proxies=None, resume_download=False, local_files_only=False, use_auth_token=None, user_agent=None, revision=None, **kwargs):
import transformers.modeling_utils
import torch
shard_paths, _ = transformers.modeling_utils.get_checkpoint_shard_files(pretrained_model_name_or_path, filename, cache_dir=cache_dir, force_download=force_download, proxies=proxies, resume_download=resume_download, local_files_only=local_files_only, use_auth_token=use_auth_token, user_agent=user_agent, revision=revision)
_revision = args.revision if args.revision is not None else huggingface_hub.constants.DEFAULT_REVISION
shard_paths, _ = transformers.modeling_utils.get_checkpoint_shard_files(pretrained_model_name_or_path, filename, cache_dir=cache_dir, force_download=force_download, proxies=proxies, resume_download=resume_download, local_files_only=local_files_only, use_auth_token=use_auth_token, user_agent=user_agent, revision=_revision)
return list(itertools.chain(*(torch.load(p, map_location="cpu").keys() for p in shard_paths)))
#==================================================================#
# Given a PreTrainedModel, returns the list of module names that correspond
# to the model's hidden layers.
#==================================================================#
def get_layers_module_names(model: PreTrainedModel) -> List[str]:
names: List[str] = []
def recurse(module, head=""):
for c in module.named_children():
name = head + c[0]
if c[0].isnumeric() and any(c[1].__class__.__name__.endswith(suffix) for suffix in ("Block", "Layer")):
names.append(name)
else:
recurse(c[1], head=name + ".")
recurse(model)
return names
#==================================================================#
# Given a PreTrainedModel, returns the module name that corresponds
# to the model's input embeddings.
#==================================================================#
def get_input_embeddings_module_name(model: PreTrainedModel) -> str:
embeddings = model.get_input_embeddings()
def recurse(module, head=""):
for c in module.named_children():
name = head + c[0]
if c[1] is embeddings:
return name
else:
return recurse(c[1], head=name + ".")
return recurse(model)
#==================================================================#
# Given a PreTrainedModel and a list of module names, returns a list
# of module names such that the union of the set of modules given as input
# and the set of modules returned as output contains all modules in the model.
#==================================================================#
def get_missing_module_names(model: PreTrainedModel, names: List[str]) -> List[str]:
missing_names: List[str] = []
def recurse(module, head=""):
for c in module.named_children():
name = head + c[0]
if any(name.startswith(n) for n in names):
continue
if next(c[1].named_children(), None) is None:
missing_names.append(name)
else:
recurse(c[1], head=name + ".")
recurse(model)
return missing_names
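The three helpers added above walk `named_children()` recursively to map out a model's layer blocks, its input-embedding module, and everything left over. A usage sketch, assuming the repo's `utils` module is importable; "sshleifer/tiny-gpt2" is just a small public checkpoint used for illustration, and the names in the comments are what a GPT-2-style model typically yields.

```python
from transformers import AutoModelForCausalLM
import utils  # the module shown in the diff above

model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
layer_names = utils.get_layers_module_names(model)               # e.g. ['transformer.h.0', 'transformer.h.1']
embeddings_name = utils.get_input_embeddings_module_name(model)  # e.g. 'transformer.wte'
leftover = utils.get_missing_module_names(model, layer_names + [embeddings_name])
print(layer_names, embeddings_name, len(leftover))
```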


@ -28,10 +28,10 @@ SOFTWARE.
'''
import torch
from transformers import LogitsWarper, LogitsProcessor
from transformers import LogitsWarper
class AdvancedRepetitionPenaltyLogitsProcessor(LogitsProcessor):
class AdvancedRepetitionPenaltyLogitsProcessor(LogitsWarper):
def __init__(self, *args, **kwargs):
pass
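The base class of `AdvancedRepetitionPenaltyLogitsProcessor` is switched here from `LogitsProcessor` to `LogitsWarper`; both transformers base classes expose the same `__call__(input_ids, scores)` interface. A minimal, simplified warper is sketched below for orientation only; it is not the project's implementation, which is truncated in this view.

```python
import torch
from transformers import LogitsWarper

class SimpleRepetitionPenaltyWarper(LogitsWarper):
    # Simplified stand-in: divide positive / multiply negative scores of
    # tokens that already appear in the context.
    def __init__(self, penalty: float = 1.1):
        self.penalty = penalty

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:
        score = torch.gather(scores, 1, input_ids)
        score = torch.where(score > 0, score / self.penalty, score * self.penalty)
        return scores.scatter(1, input_ids, score)
```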