Using the @openrouter.bsky.social OpenRouterAI API with my or-cli script's --eval mode, which lets a response from a first LLM model be evaluated by a specified second LLM model. Here I'm asking Llama 3.3 70B Instruct to evaluate the response from Gemini 2.0 Flash Lite 🤓
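For anyone curious what that two-pass flow looks like under the hood, here's a minimal sketch of the pattern against the OpenRouter chat completions API (OpenAI-compatible endpoint). It's not the or-cli implementation itself; the model slugs, sample prompt, and evaluation wording are my assumptions for illustration.

```python
# Minimal sketch of a "first model answers, second model evaluates" flow
# via the OpenRouter API. Model IDs and prompts below are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

FIRST_MODEL = "google/gemini-2.0-flash-lite-001"   # assumed model slug
EVAL_MODEL = "meta-llama/llama-3.3-70b-instruct"   # assumed model slug

prompt = "Explain the difference between TCP and UDP in two sentences."

# Step 1: get the initial answer from the first model.
first = client.chat.completions.create(
    model=FIRST_MODEL,
    messages=[{"role": "user", "content": prompt}],
)
answer = first.choices[0].message.content

# Step 2: ask the second model to evaluate the first model's answer.
evaluation = client.chat.completions.create(
    model=EVAL_MODEL,
    messages=[{
        "role": "user",
        "content": (
            f"Original prompt:\n{prompt}\n\n"
            f"Candidate response:\n{answer}\n\n"
            "Evaluate this response for accuracy, completeness, and clarity."
        ),
    }],
)

print("First model answer:\n", answer)
print("\nEvaluation:\n", evaluation.choices[0].message.content)
```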
Reposted from George:
Having a lot of fun playing with OpenRouterAI @openrouter.bsky.social 🤓 Offering some free models is 😎 Testing Deepseek R1