![harambe_musk Profile](https://pbs.twimg.com/profile_images/1763673826582560768/DNiqhr17_x96.jpg)
harambe_musk
@harambe_musk
Followers
2K
Following
4K
Statuses
3K
Where my gorillas at? Achieved AGI externally. 🦍 | Gotta get those bananas 🍌
Joined March 2024
@TheoPhysiJ @jerryjliu0 @deedydas What do you mean by interpret? You mean analyse or extract structured information?
0
0
0
@btibor91 I'll use my cousin's ID if they can give pro for $20, but I know they will never offer that lmao
0
0
0
Transformers, as great as they are, still aren't valuable enough to be used in most high-economic-value use cases due to inherent issues like hallucinations and limited context. Transformers are still an unproven tech mainly because of this, and that's the number one reason why I think Nvidia isn't investing their capital and focus in building a supply chain for transformer-specific GPUs, and is instead focusing on hardware that's more generally usable. I am sure they have prototypes that are on par with something like Groq, perhaps better, but I think they simply don't feel, from a business standpoint, that it warrants the focus and capital of building transformer-specific hardware until these issues are solved, because their current hardware does a pretty decent job anyway, however inefficient it is.
It's not just about having nice hardware; it's the ecosystem, a reliable supply chain, the ability to handle scaling needs, and most of all predictability. These hardware platforms are very new and unestablished, and it doesn't make sense to take that leap of faith when you already have something like Nvidia. Pretty sure Nvidia already has prototypes that are transformer-specific and offer the same or perhaps better performance than this hardware. I guess they're waiting for transformer tech to mature enough to warrant heavy investment and focus from Nvidia.
0
0
4
How tf is Sonnet so good that it always ranks on top without many updates? When is Sonnet 4 coming?
Introducing MultiChallenge by @scale_AI - a new multi-turn conversation benchmark. Current frontier LLMs score under 50% accuracy (top: 44.93%). 🥇 o1 🥈 Claude 3.5 Sonnet 🥉 Gemini 2.0 Pro Experimental. Paper: Leaderboard:
3
0
8