
Natural Language Processing (NLP) Latest Developments: T0 Model Outperforms GPT-3

In case anyone is interested in natural language processing (NLP), I would like to share this here.
It's a more technical discussion.

While GPT-3 has been the reference point for text generation, the T0 model now sets a new benchmark for performance across a variety of NLP tasks.

Technical Details & Video Explanation

 Paper: https://arxiv.org/abs/2110.08207
 BigScience: https://bigscience.huggingface.co/
 Models on HF hub: https://huggingface.co/bigscience
 Prompt tool: https://github.com/bigscience-workshop/promptsource
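The prompt tool linked above (promptsource) is built around the idea of turning raw examples into natural-language task prompts via templates. As a rough illustration of that idea (the template helper below is a hypothetical sketch, not the actual promptsource API):

```python
# Minimal sketch of template-based prompting, the core idea behind promptsource.
# fill_template and sentiment_template are hypothetical, for illustration only.
def fill_template(template: str, **fields) -> str:
    """Fill the named slots of a prompt template with example data."""
    return template.format(**fields)

# One template can turn any review into a zero-shot sentiment prompt
sentiment_template = "Is this review positive or negative? Review: {review}"

prompt = fill_template(
    sentiment_template,
    review="this is the best cast iron skillet you will ever buy",
)
print(prompt)
# Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy
```

The real library ships many such templates per dataset, which is what lets T0 be trained on a large mixture of prompted tasks.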

These are the Python package requirements:
1) transformers
2) pytorch (the pip package is named torch)
3) sentencepiece

There's an easy way to try out this model with PyTorch & Hugging Face:
________________________________________________________________
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Download the T0pp checkpoint (roughly 11B parameters, so expect a large download)
tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0pp")

# Tasks are given to T0 as plain natural-language prompts
inputs = tokenizer.encode("Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy", return_tensors="pt")

outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
________________________________________________________________
