![Raghav{endra}| ರಾಘವೇಂದ್ರ Profile](https://pbs.twimg.com/profile_images/1742212601310793728/s0OeyJtH_x96.jpg)
Raghav{endra}| ರಾಘವೇಂದ್ರ
@raghavian
Followers: 1K · Following: 4K · Statuses: 1K
Assistant Prof. (TT) @MLSectionUCPH @Uni_Copenhagen Author of upcoming book on Sustainable AI (https://t.co/3gueoSZx7Q) https://t.co/r7f66NkC9S
Bangalore/Copenhagen
Joined December 2016
So, I have been working hard on a book for > 1 year now. Something that I am extremely excited about, as it is trying to present a holistic view of my research domain at the intersection of #ArtificialIntelligence and #Sustainability. It is simply called "Sustainable AI".
RT @RasmusPagh1: Interested in a post-doc position in theoretical computer science? Want to find out why they call it "Wonderful Copenhagen…
RT @MLSectionUCPH: Registration is also now open for the HYBRID Master-level course "#MachineLearning B" (22 Apr 2025 - 20 Jun 2025; 7.5 EC…
RT @MLSectionUCPH: Registration is now open for HYBRID Master-level course "Online and #Reinforcement Learning" (3 Feb 2025 - 4 Apr 2025; 7…
RT @dustin_wright37: Working on AI #Sustainability? Remember, ML #Efficiency is only one piece of the puzzle, and there are many opportuni…
Can I plug our topical paper with @dustin_wright37 and @christian_igel here, where we argue that "Efficiency is not Enough"? Although for a different problem 😅
BREAKING: Billionaire Elon Musk and entrepreneur Vivek Ramaswamy will lead a new Department of Government Efficiency, Trump says
RT @AmartyaSanyal: Deadline approaching for applying to ELLIS and working with me on topics in Privacy, Robustness, and Interpretability of…
I had fun at the panel discussion yesterday at #DigitalTechSummit, talking about why the material costs of technology should not be decoupled from the technology itself. In this case, the energy consumption of #ArtificialIntelligence. Looks like my joke on learning Danish landed ;)
@AIatMeta is training Llama 4 on a cluster bigger than 100k H100s. Even if "just" trained for a month, this amounts to ~50.4GWh for the GPUs alone (using 700W TDP), without counting any additional costs (CPU, memory, cooling). Llama 3 was trained for ~6 months.
Great to visit one of our data centers where we're training Llama 4 models on a cluster bigger than 100K H100’s! So proud of the incredible work we’re doing to advance our products, the AI field and the open source community. We’re hiring top researchers to work on reasoning, code generation, and more - come join us!
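The ~50.4 GWh figure above is straightforward to reproduce. A minimal back-of-the-envelope sketch, assuming a 30-day month and every GPU drawing its full 700W TDP continuously (real draw varies with utilization; the helper name is my own, not from any library):

```python
def gpu_energy_gwh(num_gpus: int, tdp_watts: float, hours: float) -> float:
    """Upper-bound GPU energy in GWh: count * power (W) * time (h) / 1e9."""
    return num_gpus * tdp_watts * hours / 1e9

# 100k H100s at 700W TDP for one 30-day month (720 hours)
one_month = gpu_energy_gwh(100_000, 700, 30 * 24)
print(f"{one_month:.1f} GWh")  # ~50.4 GWh, matching the estimate above
```

This is a lower bound on total training energy, since it ignores CPUs, memory, networking, and cooling overhead (typically captured by a datacenter's PUE factor).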
Check out the line-up of talks at the Resource Aware ML session at #D3A conference today from 13.30. @pinartozun
RT @theresia_v_r: 🏆We're honoured to receive the Women in RecSys Journal Paper of the Year Award 2024 for our paper "Evaluation Measures of…