| 2026-02-28 00:54 | eval_success | Light evaluated: Neutral (0.00) | - - |
| 2026-02-28 00:54 | eval | Evaluated by llama-4-scout-wai: 0.00 (Neutral) | |
| 2026-02-28 00:47 | eval_success | Evaluated: Mild negative (-0.19) | - - |
| 2026-02-28 00:47 | eval | Evaluated by deepseek-v3.2: -0.19 (Mild negative), 14,731 tokens | |
| 2026-02-28 00:42 | eval_success | Light evaluated: Neutral (0.00) | - - |
| 2026-02-28 00:42 | eval | Evaluated by llama-3.3-70b-wai: 0.00 (Neutral) | |
| 2026-02-28 00:33 | dlq | Dead-lettered after 1 attempts: FFmpeg to Google: Fund us or stop sending bugs | - - |
| 2026-02-28 00:33 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (131000). | - - |
| 2026-02-28 00:33 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (131000). | - - |
| 2026-02-28 00:27 | dlq | Dead-lettered after 1 attempts: FFmpeg to Google: Fund us or stop sending bugs | - - |
| 2026-02-28 00:27 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (24000). | - - |
| 2026-02-28 00:27 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (24000). | - - |
| 2026-02-28 00:26 | dlq | Dead-lettered after 1 attempts: FFmpeg to Google: Fund us or stop sending bugs | - - |
| 2026-02-28 00:26 | eval_retry | OpenRouter error 400 model=deepseek-v3.2 | - - |
| 2026-02-28 00:26 | eval_failure | Evaluation failed: Error: OpenRouter API error 400: {"error":{"message":"This endpoint's maximum context length is 163840 tokens. However, you requested about 361947 tokens (353755 of text input, 8192 in the output). Pl | - - |
| 2026-02-28 00:26 | eval_failure | Evaluation failed: Error: OpenRouter API error 400: {"error":{"message":"This endpoint's maximum context length is 163840 tokens. However, you requested about 361947 tokens (353755 of text input, 8192 in the output). Pl | - - |
| 2026-02-28 00:26 | eval_retry | OpenRouter error 400 model=deepseek-v3.2 | - - |
| 2026-02-28 00:26 | eval_retry | OpenRouter error 400 model=deepseek-v3.2 | - - |
| 2026-02-28 00:26 | eval_failure | Evaluation failed: Error: OpenRouter API error 400: {"error":{"message":"This endpoint's maximum context length is 163840 tokens. However, you requested about 361947 tokens (353755 of text input, 8192 in the output). Pl | - - |
| 2026-02-28 00:15 | dlq | Dead-lettered after 1 attempts: FFmpeg to Google: Fund us or stop sending bugs | - - |
| 2026-02-28 00:14 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (131000). | - - |
| 2026-02-28 00:14 | eval_failure | Evaluation failed: AiError: 5021: The estimated number of input and maximum output tokens (321294) exceeded this model context window limit (131000). | - - |
| 2026-02-28 00:14 | dlq | Dead-lettered after 1 attempts: FFmpeg to Google: Fund us or stop sending bugs | - - |
| 2026-02-27 23:41 | eval | Evaluated by claude-haiku-4-5: +0.35 (Moderate positive) | |