Go back down to 2048 tokens, instead of 4096, to keep it in line with the other non-Llama-2-specific presets.
These are the same settings as [simple-proxy-for-tavern's default preset](https://github.com/anon998/simple-proxy-for-tavern/blob/main/presets/default.json). I've fixed the sampler order and raised the context size to match Llama 2's 4096-token context window.
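For illustration only, here is roughly what the two settings mentioned above could look like in a KoboldAI-style preset. The key names (`max_context`, `sampler_order`) and the order shown are assumptions based on common KoboldAI conventions, not a copy of the linked file, so check the actual preset for the exact keys and values:

```jsonc
{
  // Context window size in tokens (4096 for Llama 2; 2048 for the generic presets)
  "max_context": 4096,

  // KoboldAI-style sampler order, applied left to right:
  // 6 = repetition penalty, 0 = top-k, 1 = top-a, 3 = tail-free sampling,
  // 4 = typical sampling, 2 = top-p, 5 = temperature
  "sampler_order": [6, 0, 1, 3, 4, 2, 5]
}
```

The order shown, with repetition penalty first and temperature last, is the arrangement commonly recommended in the KoboldAI community; reverting to 2048 tokens would only change the context-size field.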