However, given modern LLM post-training paradigms, it’s entirely possible that newer LLMs are specifically RLHF-trained to write better Rust code despite the language’s relative scarcity in training data. I ran more experiments with Opus 4.5, using LLMs on some fun Rust pet projects, and my results were far better than I expected. Here are four such projects: