This is Project Starline. Years in the making, it works like a magic window, bringing you together with friends, family, and coworkers, even when you're cities (or countries) apart.
While experimenting with different AR navigation cues in Google Maps, at one point we tried using particle effects to represent paths and curves. Then one user asked why they were "following floating trash". So we moved on. :)
ARCore 1.7 rolls out today. It includes a new Augmented Faces API, which gives you a tracked, 468-point face mesh to build all sorts of neat effects with. No depth sensor required.
I doubt there's any string of search keywords that would tell you what kind of car this is.
(It's a variant of a Nash Rambler. Thanks, Google Lens.)
Earlier today we announced the ARCore Depth API. It gives developers per-pixel depth maps for occlusions, hit testing, meshing, synthetic lighting, and more… all using a single RGB camera. More here:
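The occlusion use case comes down to a simple per-pixel test: draw a virtual object only where it is closer to the camera than the real scene. A toy sketch of that idea (this is not the ARCore API; the function name and plain-list depth maps are just for illustration):

```python
def occlusion_mask(real_depth, virtual_depth):
    """Per-pixel occlusion test: the virtual object is visible
    only where it is closer to the camera than the real scene.

    real_depth, virtual_depth: 2D lists of depths in meters.
    Returns a 2D list of booleans (True = draw the virtual pixel).
    """
    return [
        [v < r for r, v in zip(real_row, virt_row)]
        for real_row, virt_row in zip(real_depth, virtual_depth)
    ]

# A 2x2 example: the virtual object sits 1.5 m from the camera;
# the real scene is 1.0 m away on the left, 2.0 m on the right.
real = [[1.0, 2.0],
        [1.0, 2.0]]
virtual = [[1.5, 1.5],
           [1.5, 1.5]]
mask = occlusion_mask(real, virtual)
# mask == [[False, True], [False, True]]
# i.e. the real wall on the left hides the object; on the right it shows.
```

In a real renderer this comparison happens in the depth buffer, but the logic is the same per pixel.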
Today we introduced a new feature in YouTube that we're calling AR Beauty Try-On. It lets you virtually try on makeup while following along with YouTube creators.
So excited to announce the first Daydream standalone headset, the Mirage Solo from @Lenovo. It's simple to use, comfortable, and, with built-in positional tracking, it's incredibly immersive.
Yesterday we open-sourced Seurat. It enables high-fidelity graphics on mobile VR hardware by processing complex 3D scenes into a representation that renders much more efficiently. More here:
Most people think of AR as an optics and displays problem. That's a big part of it. But first, it's a sensing problem: motion tracking, depth sensing, lighting, localization, scene segmentation, object recognition, and more.
It’s one thing to read that a great white shark can be 18 feet long. It’s another to see one up close in relation to the things around you. Introducing AR in Google Search.
#io19
We’re rolling out new experimental features for developers on the Lenovo Mirage Solo. Experimental 6DoF controllers, see-through mode, and any Android app in VR. Coming soon:
The Google Expeditions app just got a big update, with over 100 new AR experiences. You can visualize everything from Da Vinci's inventions to viruses to elephants, at home or in the classroom.
Check out an experiment that aims to improve the position and orientation accuracy of the little blue dot in Google Maps using global localization, a technique that combines Visual Positioning Service, Street View, machine learning, and #AugmentedReality →
Accurate lighting and shadows are key to making AR objects feel like part of the physical world. The artists behind “Who Framed Roger Rabbit” nailed many of the principles 30 years ago.
To make amends for not taking a computer graphics course as an undergraduate, over the holidays I wrote a basic ray tracer. Here’s what 130 hours of render time on a 10-core iMac looks like.
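The innermost operation of a basic ray tracer is the ray-primitive intersection. Here's a sketch of the textbook ray-sphere case (not my actual renderer, just the math every basic ray tracer starts from):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along a normalized ray to the nearest hit on a
    sphere, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2,
    a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # a == 1 for a normalized direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # nearest hit in front of the camera

# Camera at the origin looking down +z at a unit sphere at (0, 0, 5):
t = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# t == 4.0 -- the ray enters the sphere one radius before its center.
```

From there it's mostly bookkeeping: fire one ray per pixel, shade at the hit point, and recurse for reflections. The 130 hours come from doing that a few hundred million times.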
We just announced Playground, a new creative mode in the Pixel 3 camera. It’s an evolution of AR Stickers – now with selfies, scene suggestions, and a bunch of awesome new content.
If you're building cross-platform AR or VR content, the new Resonance Audio SDK is for you. High quality, mobile-optimized spatial audio for the Web, Unity, Unreal, iOS, and Android, with plugins for leading audio tools. Also, free.
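At its core, spatializing a sound source means answering two questions: how far away is it, and which ear is it closer to? A toy sketch of that geometry (emphatically not the Resonance Audio API; the function, the 2D listener model, and the rolloff constants are all made up for illustration):

```python
import math

def stereo_gains(listener, source):
    """Toy spatializer: inverse-distance attenuation plus
    constant-power stereo panning.

    listener, source: (x, y) positions in meters. The listener
    faces +y, so x > 0 is to the listener's right.
    Returns (left_gain, right_gain).
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Inverse-distance rolloff, clamped so a co-located
    # source doesn't blow up to infinite gain.
    attenuation = 1.0 / max(dist, 1.0)
    # Map the source's bearing to a pan position: 0 = hard left,
    # 0.5 = center, 1 = hard right.
    pan = math.atan2(dx, dy) / math.pi + 0.5
    left = math.cos(pan * math.pi / 2)   # constant-power pan law
    right = math.sin(pan * math.pi / 2)
    return attenuation * left, attenuation * right

# A source 2 m directly ahead lands equally in both ears:
l, r = stereo_gains((0, 0), (0, 2))
```

Real SDKs like Resonance Audio go much further (HRTFs, room reflections, ambisonics), but distance and direction are the foundation.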
Spent the morning in my back yard showing my kids how big different animals are, right from Google Search. Lions! Tigers! Bears! And also alpine goats, timberwolves, giant pandas, European hedgehogs, angler fish, and emperor penguins.
Before AR/VR, I worked on apps like Gmail to help people be more productive at work. It’s one of many reasons I’m excited Glass is joining our team now. Today, we're launching Glass Enterprise Edition 2 to help businesses work better, smarter, and faster.
We’ve been working with @YouTube and @VRScout on the VR Creator Lab, a program that provides training, mentorship, VR camera gear, and production grants to creators and filmmakers. Three days left to apply!