gpt-oss-120b is useless

Details
Title | gpt-oss-120b is useless
Author | What's AI by Louis-François Bouchard |
Duration | 1:51 |
File Format | MP3 / MP4 |
Original URL | https://youtube.com/watch?v=SemHbN-fM80 |
Description
🚀 gpt-oss-120b: hype or helpful?
• 117 B-param MoE (5.1 B active), fits on a single 80 GB GPU
• Near-o4-mini parity on reasoning, coding, math & health tasks
• 128 k context, 3 “effort” levels, full CoT, tool use & code exec
• Trained mainly on English STEM and code data, so multilingual performance is likely weak
• Skip it if energy costs, fine-tuning hurdles, or non-English use matter to you; the 20 B model may be a smarter first stop for efficiency, though it too was trained only on English.
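
The single-80-GB-GPU claim above checks out on the back of an envelope, assuming the released weights use MXFP4 quantization at roughly 4.25 bits per parameter (an assumption; exact storage cost varies by layer):

```python
# Back-of-envelope memory check for gpt-oss-120b.
# Assumption: MoE weights stored in MXFP4, ~4.25 bits per parameter.
# Parameter counts are the figures quoted in the video.
TOTAL_PARAMS = 117e9      # total MoE parameters
ACTIVE_PARAMS = 5.1e9     # parameters active per token
BITS_PER_PARAM = 4.25     # approximate MXFP4 storage cost

weights_gb = TOTAL_PARAMS * BITS_PER_PARAM / 8 / 1e9
active_frac = ACTIVE_PARAMS / TOTAL_PARAMS

print(f"~{weights_gb:.0f} GB of weights")                # ~62 GB
print(f"~{active_frac:.1%} of params active per token")  # ~4.4%
```

About 62 GB of weights leaves headroom on an 80 GB GPU for KV cache at long context, and only ~4.4 % of parameters fire per token, which is why inference is cheap relative to the 117 B headline count.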
—
Louis-François, PhD dropout & CTO @ Towards AI.
Follow for tomorrow’s no-BS roundup.
#OpenAI #gptoss120b #GenerativeAI