This is an automated archive made by the Lemmit Bot.
The original was posted on /r/machinelearning by /u/Amgadoz on 2024-04-02 08:26:33.
So we are all probably aware of state-of-the-art decoder-only LLMs like GPT-4, Claude, etc. These are great for generating text.
But what I am not aware of is the SOTA BERT-like model. You know, things that can be used for tasks like NER, POS tagging, and other token classification.
Are there models that are significantly better than, say, RoBERTa?
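For context on what such a model is used for: a BERT-like encoder with a token-classification head emits one logit vector per token, which you then decode into labels. Below is a minimal, self-contained sketch of that decoding step for NER with BIO tags; the label set and logits are invented for illustration and do not come from any particular model.

```python
# Minimal sketch: turning per-token logits from a token-classification head
# (e.g. a RoBERTa-based NER model) into BIO labels and entity spans.
# LABELS and the example logits below are hypothetical.

LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG"]

def decode_bio(tokens, logits):
    """Argmax each token's logits, then group B-/I- runs into entity spans."""
    tags = [LABELS[max(range(len(row)), key=row.__getitem__)] for row in logits]
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # start of a new entity
            if current:
                spans.append(current)
            current = (tag[2:], [tok])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)        # continuation of the current entity
        else:                             # "O" or an inconsistent I- tag
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(ent, " ".join(toks)) for ent, toks in spans]

tokens = ["Barack", "Obama", "visited", "Google"]
logits = [
    [0.1, 0.8, 0.0, 0.0, 0.0],  # -> B-PER
    [0.1, 0.0, 0.9, 0.0, 0.0],  # -> I-PER
    [0.9, 0.0, 0.0, 0.0, 0.0],  # -> O
    [0.1, 0.0, 0.0, 0.8, 0.0],  # -> B-ORG
]
print(decode_bio(tokens, logits))
# [('PER', 'Barack Obama'), ('ORG', 'Google')]
```

In practice the logits would come from a real checkpoint (e.g. via Hugging Face's `AutoModelForTokenClassification`), but the decoding logic is the same regardless of which encoder produced them.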