Anonymous
8/5/2025, 7:11:49 PM No.106151936
>117B parameters (gpt-oss-120b) and 21B parameters (gpt-oss-20b)
>MoE (5.1B and 3.6B active, respectively)
>large model requires an H100
>small model fits on a 16GB gpu
>text only
>chain-of-thought and adjustable reasoning effort levels
>instruction following
>tool use
>Apache 2.0
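For anyone wanting to poke at the small one, here's a minimal sketch of hitting a locally served gpt-oss-20b through an OpenAI-compatible endpoint (llama.cpp server, vLLM, etc.) and nudging the reasoning effort via the system prompt. The base_url, the model name string, and the "Reasoning: high" convention are assumptions, check whatever server you're running.

```python
# minimal sketch: query a locally served gpt-oss-20b via an
# OpenAI-compatible endpoint (llama.cpp server, vLLM, etc.)
# assumptions: base_url/port, model name string, and that the server
# honors a "Reasoning: high" line in the system prompt for effort level
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="gpt-oss-20b",
    messages=[
        # assumed convention for the adjustable reasoning effort levels
        {"role": "system", "content": "Reasoning: high"},
        {"role": "user", "content": "Explain MoE routing in two sentences."},
    ],
)
print(resp.choices[0].message.content)
```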