
Show HN: We just dropped an 8B alternative to OpenAI GPT-o1 and it's sick

We dropped State-0, an 8B chain-of-thought alternative to GPT-o1, trained with an additional 40 million CoT parameters to give better responses driven by core 'Socratic instincts'.
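
For anyone who wants to poke at it: assuming the weights end up on Hugging Face like most 8B releases, the standard transformers boilerplate should be enough to try it locally (the repo id below is a placeholder, not the real checkpoint name):

  from transformers import AutoModelForCausalLM, AutoTokenizer

  # placeholder repo id -- swap in the actual State-0 checkpoint
  model_id = "state-ai/state-0-8b"

  tokenizer = AutoTokenizer.from_pretrained(model_id)
  model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

  # ask for an explicit chain of thought and let the model reason in the open
  prompt = "Think step by step: why does ice float on water?"
  inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
  outputs = model.generate(**inputs, max_new_tokens=512)
  print(tokenizer.decode(outputs[0], skip_special_tokens=True))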

It can't be that good