Bittensor is incentivizing miners like
@TensorplexLabs
to train better foundational models through SN9, the pretraining subnet.
As a result, a 7B model trained on SN9 is outperforming Llama 7B, Llama 2 7B, and Falcon 7B on the benchmarks below.
It works, and it's just getting started.
$TAO