![Om Alve Profile](https://pbs.twimg.com/profile_images/1672927269831200769/nA3K-yCM_x96.jpg)
Om Alve
@alve_om
Followers: 903
Following: 5K
Statuses: 2K
AI Research Intern @superkalam_ | I believe in myself
Thane, India
Joined January 2021
@im_ashishsinha5 @fpdotmoney Nope, I trained a BPE tokenizer with a 4096 vocab size because the task is fairly simple and the complexity of the language is on the easier side; the context length for this model is 512 tokens.
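A minimal sketch of what such a setup might look like, using the Hugging Face `tokenizers` library (the library choice, corpus path, and special tokens are assumptions, not details from the tweet):

```python
# Sketch: train a byte-level BPE tokenizer with a 4096-token vocabulary,
# then cap sequences at the model's 512-token context length.
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import ByteLevel

tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = ByteLevel()

trainer = BpeTrainer(
    vocab_size=4096,  # small vocab, since the task/language is simple
    special_tokens=["[UNK]", "[PAD]", "[BOS]", "[EOS]"],
)

# "corpus.txt" is a placeholder for the training text.
tokenizer.train(files=["corpus.txt"], trainer=trainer)
tokenizer.save("bpe_4096.json")

# Match the model's 512-token context window at encoding time.
tokenizer.enable_truncation(max_length=512)
tokenizer.enable_padding(
    pad_id=tokenizer.token_to_id("[PAD]"), pad_token="[PAD]", length=512
)
```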
@Eshan1347 check this out