This is an automated archive made by the Lemmit Bot.
The original was posted on /r/singularity by /u/FeltSteam on 2023-11-06 09:08:01.
Seems we are getting a 128k-context version of GPT-4: a turbo model that is said to be more capable than the current GPT-4 (though I believe its behaviour will be a little different, and I think people will mistake that behaviour difference for a quality difference), as well as a possible GPT-4V API and an API for Code Interpreter.
(Also seems like a few other things, like DALL·E 3 and a TTS API.)
And the 128k context explains how the new Teams plan for ChatGPT said "4x context length" even though 32k was going to be generally available (also, I find it funny how the 128k (128k!!) version of GPT-4 Turbo is cheaper than the current 8k-context model 😂).