LLaMA 2.0 is coming soon.
Seems like it will be:
- commercially usable
- closer to closed models like GPT-4/Claude 2 in capabilities
Excited to see the landscape shift.
This is insane! 😱
You can now train a 100-billion-parameter LLM on Google Colab.
Explanation and code below ⤵️⤵️⤵️
--- TL;DR ---
Distributed training over the Internet has become operational with the release of the new version of the PETALS distributed training package.
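To give a feel for how running a model over the swarm looks in practice, here is a minimal sketch. It assumes the current `petals` package API (`AutoDistributedModelForCausalLM`) and the publicly hosted BLOOM swarm checkpoint name; both are assumptions here, not taken from the post itself, so check the PETALS docs for the exact model names and training utilities.

```python
# Minimal sketch: inference over the PETALS swarm (assumed API).
# pip install petals transformers
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Assumed public swarm checkpoint; other hosted models may be available.
model_name = "bigscience/bloom-petals"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# The model's transformer blocks run on volunteer servers across the Internet;
# only embeddings and a small trainable head stay on your local machine/Colab GPU.
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Distributed training over the Internet", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```

Because only the small local parts of the model need to fit in your own GPU memory, parameter-efficient fine-tuning (e.g. prompt tuning or adapters on top of the frozen remote blocks) is what makes "training" a 100B-scale model feasible from a Colab notebook.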
I keep saying this, but the comparison is meaningless unless you look at GDP per working-age person. Japan's overall GDP growth has slowed because its working-age population is shrinking, but measured per working-age person, it is still growing.
"From 1998 to 2019, Japan has grown slightly faster than the U.S. in terms of per working-age adult."
🚨 A new open dataset from the Kaggle Team is out!
Meta Kaggle for Code is an open-source dataset of ML code created and publicly shared by Kaggle's community over the past decade 🤯. More on why we released it, how to use it, and licensing info 👇