
Search results related to "2B"

2B Tieba
One keyword is one tieba; each path is unique across the site.
Users
None found

Content containing 2B
"Getting the first one out of the way is the hardest. We finally did it." Jarrett Allen (16p, 10r, 2b) on the @cavs proving they can win on the road in the postseason! They can advance to the Eastern Conference Finals with a win on Friday at 7:00pm/et on Prime.
@garrytan Your ratchet post made me think: if code is now 3-5x cheaper to write but verification is the new bottleneck, tests should be a design input, not a design output. Short note + template repo I'm testing now:
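The "tests as a design input, not a design output" idea in the post above can be sketched as a test written before the implementation exists. This is a minimal hypothetical illustration (the `slugify` function and its contract are invented for the example, not taken from the author's template repo):

```python
# Hypothetical illustration of "tests as a design input": the test is
# written first and pins down the contract; the implementation is then
# written to satisfy it, not the other way around.

def test_slugify_contract():
    # Design decisions encoded as assertions, before slugify() exists:
    assert slugify("Hello World") == "hello-world"    # spaces become hyphens
    assert slugify("  trim me  ") == "trim-me"        # whitespace trimmed
    assert slugify("Already-Good") == "already-good"  # case folded

# Implementation written to satisfy the contract above.
def slugify(text: str) -> str:
    return "-".join(text.lower().split())

test_slugify_contract()
```

The point of the pattern is that verification cost is paid at design time: if generated code is cheap, the test file becomes the durable specification.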
If you love fine-tuning open-source models (like me), then listen.

> Start with 1B, 2B, 4B, and 8B models. (Don't start with a 27B model or bigger at first.)

> Use cloud GPU providers. I use Google Colab Pro for any model smaller than 9B. A single A100 80GB costs around $0.60/hr, which is cheap. Enough for small models.

> Don't buy GPUs until you've fine-tuned 7 to 10 models. You'll understand the nitty-gritty in the process.

> Use Codex 5.5 × DeepSeek v4 Pro to create datasets: Codex to plan, DeepSeek v4 Pro to generate rows.

> Use Unsloth's instruct models from Hugging Face as a base. Yes, there are others too, but Unsloth also provides fast fine-tuning notebooks.

> Use Unsloth's fine-tuning notebooks as a reference. Paste them into Codex, and Codex will write a custom notebook with the configs you need.

> Spend 1 day learning about:
- SFT (supervised fine-tuning)
- RL training (GRPO, DPO, PPO, etc.)
- LoRA / QLoRA training
- Quantization and its data types
- Local inference engines (llama.cpp)
- KV cache and prompt cache

> Just get started. Claude, Codex, and ChatGPT can design a step-by-step plan for fine-tuning your first AI model.

Future tech is moving toward small 5B to 15B ELMs (Expert Language Models) rather than general 1T LLMs, so fine-tuning is an important skill that anyone can acquire today.

Tune models, test them, use them. Then fine-tune for companies and make a career out of it. (Companies pay $50k+ to fine-tune models on their data so they get personalized AI models.)

Shoot your questions below. I'll be sharing in-depth raw findings on this topic in the coming days.
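As a rough sanity check on why the post recommends starting at 1B to 8B parameters, here is a back-of-the-envelope VRAM estimate for LoRA fine-tuning. The constants are assumed rules of thumb (fp16 weights at 2 bytes/parameter, a ~1% trainable adapter, fp16 gradients plus fp32 Adam state on adapter params only), not measurements:

```python
# Back-of-the-envelope VRAM estimate for LoRA fine-tuning.
# Assumptions (rules of thumb, not measurements):
#   - frozen base weights in fp16: 2 bytes per parameter
#   - trainable LoRA adapter is ~1% of base parameters
#   - adapter carries fp16 grads (2 B) + fp32 Adam moments (8 B)
#     + an fp32 master copy (4 B) per trainable parameter

def lora_vram_gb(params_billion: float, adapter_frac: float = 0.01) -> float:
    base = params_billion * 1e9 * 2                  # frozen fp16 weights
    adapter = params_billion * 1e9 * adapter_frac    # trainable LoRA params
    train_state = adapter * (2 + 8 + 4)              # grads + optimizer state
    return (base + train_state) / 1e9                # gigabytes

for size in (1, 2, 4, 8):
    print(f"{size}B model: ~{lora_vram_gb(size):.1f} GB weights + adapter state")
```

Under these assumptions even an 8B model needs well under an A100's 80 GB for weights and adapter state (activation memory, which depends on batch size and sequence length, comes on top), which is consistent with the post's advice that small models are enough to learn on.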
We've raised $2.2B in committed capital to invest in the next generation of crypto. Announcing Crypto Fund 5
How do you rate my 2B cosplay?🖤
A lil 2B to brighten your mon-day <3