
Taimoor Tariq
@TaimoorTariq95
153 Followers · 2K Following · 30 Media · 462 Statuses
📸 Algorithms at Apple | Human Vision, Computational Photography and Display | ex-intern (AR/VR stuff) @Meta
Joined April 2018
Foveated rendering is a key enabler for spatially realistic real-time VR, but nature hints that a key purpose of peripheral vision is motion perception. Our work on preserving motion perception in AR/VR, to appear in the ACM Transactions on Graphics (SIGGRAPH ‘24) (1/10)
Camera modules mounted on Formula One cars and used in real races for the new movie helped bring features like log encoding and ACES support to the iPhone 15 Pro.
wired.com
Mounted on Formula One cars and used at real events, the special module used an iPhone camera sensor and A-series chip to help capture the new movie's racing scenes.
Some very cool stuff in visual perception + aesthetics. If you are into art, photography, or the vision science behind aesthetics, I found these two papers to be a super fun and easy read. https://t.co/rBpd0f5JjY
https://t.co/hCzB8gC4s5
mdpi.com
This essay discusses whether computers, using Artificial Intelligence (AI), could create art. First, the history of technologies that automated aspects of art is surveyed, including photography and...
Adobe Research Principal Scientist @AaronHertzmann won the Computer Graphics Achievement Award from @SIGGRAPH, one of the highest honors in the field! Learn about his new theory of perception, and his work at the intersection of art and computer graphics. https://t.co/7Tk2Sfqe5l
We hope that we contribute to a better understanding of human vision, and to the goal of real-time AR/VR that is a truly (not just spatially 😉) realistic representation of the visual world. Let’s talk more about it @siggraph (10/10) 📝 👉: https://t.co/K5Gs2nxlKK
Importantly, no full-res reference is required, as we carefully control the spatio-temporal spectrum of our synthesis by designing an algorithm cognizant of human vision, making motion appear natural without any noticeable artifacts. (9/10)
Our technique is inspired by a fascinating visual illusion. If you look directly at the stimulus, it moves diagonally. Now, if you look at the red dot, the stimulus appears to move vertically. In peripheral vision, our brain strongly integrates the actual motion and phase drift. (8/10)
We propose the concept of “Motion Metamers”: videos that are structurally different, but equivalent in spatial AND motion perception. We take the first step and design a REAL-TIME technique that synthesizes controlled motion energy to reconstruct motion cues. (7/10)
We show that a loss of spatial resolution in the periphery (even when it is barely visible) may inhibit motion perception, making AR/VR appear slower than it physically is. Not cool; so how do we fix it? (6/10)
More specifically: does foveated rendering damage motion perception in VR? It's a critical question with many consequences, e.g. when you put on a VR-HMD and drive a car (as we saw recently on social media), you definitely want your ability to perceive/quantify motion intact. (5/10)
Interestingly, the actual purpose of peripheral vision is not spatial acuity; a key role is motion perception. So, a fundamental question: imagine two videos whose frames are all corresponding spatial metamers. Will the two videos be equivalent in perceived motion? (4/10)
However, “reducing quality” is an oversimplification. The goal is efficient “spatial metamers”: images that are structurally different in the periphery, but perceptually indistinguishable. This phenomenon aptly captures the spatially compressive nature of peripheral vision. (3/10)
Foveated rendering enables real-time high-quality rendering for AR/VR by reducing rendering cost in the periphery. As human spatial acuity decreases in the periphery, the quality loss is not noticeable. It's an integral feature of HMDs such as the Vision Pro, Quest 3, and Sony PS VR2. (2/10)
Our efforts towards making real-time AR/VR perceptually realistic and comfortable, featured on the regional news @USI_INF
It was fun working on such a fundamental and practical problem, with the support of my co-authors (@douglaslanman, @nathanmatsuda, Alex Chapiro, Eric Penner, and Ajit Ninan). Learned a lot and had loads of fun. More details in the paper https://t.co/qw8wViwvCP (9/9)
Importantly, VR-HMDs need to operate at high refresh-rates for comfortable and truly immersive experiences. One of our main contributions is that the framework is carefully designed for great efficiency; running in under 1 ms on a STANDALONE Quest-2. (8/9)
Our concept is designed and tested on the Starburst HDR VR prototype by Meta. It was amazing to play around with it, seeing the most awesome VR sunsets you’ll ever wish to see 🌅 👇 (7/9)
To get closer to the dream of real-time realistic VR, we design a framework that automatically controls tone-mapping curves to maintain appearance. A video shows the comparison with the traditional parameter estimation for the popular Reinhard tone-mapper. (6/9)
For efficiency, most practical real-time tone-mapping curves have parameters that are either manually fixed or estimated using crude heuristics. These parameters have a significant effect on visual appearance, and heuristics may not be able to reproduce the actual appearance. (5/9)
Not unlike the oil canvases of old, traditional displays have a much smaller dynamic range than required to represent HDR content. Therefore, tone-mapping curves are used to map the luminance of HDR media onto the smaller dynamic range of SDR displays. (4/9)
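A minimal global Reinhard tone mapper, with the classic log-average heuristic for its exposure parameter (the scene "key"). The constants and the synthetic HDR data are illustrative; the point of the work above is controlling such parameters automatically rather than via this kind of heuristic:

```python
import numpy as np

def reinhard_tonemap(L, key=0.18, eps=1e-6):
    """Map HDR luminance values L (any positive scale) into [0, 1)."""
    log_avg = np.exp(np.log(L + eps).mean())   # heuristic scene "key"
    Ls = key / log_avg * L                     # exposure scaling
    return Ls / (1.0 + Ls)                     # global Reinhard curve

rng = np.random.default_rng(2)
hdr = rng.lognormal(mean=0.0, sigma=3.0, size=100_000)  # synthetic HDR scene
sdr = reinhard_tonemap(hdr)

print(hdr.max() / hdr.min() > 1e6)   # huge dynamic range in
print(sdr.max() < 1.0)               # displayable [0, 1) range out
```

Note how every choice (`key=0.18`, the log-average) bakes in an assumption about scene appearance; when those assumptions fail, the mapped image no longer reproduces the HDR look, which is the gap the automatic parameter control targets.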