PROFESSIONAL NERD.

PERSONAL BLOG.

What I'm Reading - 5/9/2018

Three to five links every weekday - Google's interesting, weird, augmented near-future edition.

GOOGLE’S BRINGING AUGMENTED REALITY SUPPORT TO INSTANT APPS

It’s been two years since Google introduced Instant Apps, a Google Play feature that lets users try an app without downloading it. While the format has mostly been popular with game developers, Google announced at its annual I/O developer conference today that Instant Apps will soon support ARCore, allowing users to interact with augmented reality apps without having to download them.

The experience will also be integrated more closely with search, rather than living only within Google Play or in-app advertisements. For example, users will soon be able to search for shoppable items and see a link to an Instant App that lets them place the products in front of them in augmented reality — much like the Ikea Place app, which lets you virtually try out furniture in your home.


GOOGLE DEBUTS ARCORE 1.2 WITH SCENEFORM, AUGMENTED IMAGES, ANDROID AND IOS CLOUD ANCHORS

First up is Sceneform, a new SDK designed to help Java developers create mobile-optimized 3D scenes with ARCore without needing to learn OpenGL. The goal with Sceneform is to bring ARCore’s 3D rendering and tracking capabilities to both old and new Java apps.

Second is Augmented Images, a new ARCore feature that lets developers attach augmented content to real-world images, tracking and moving with them in 3D. The feature includes support for vertical plane detection, so AR objects can be attached to walls as well as horizontal surfaces. Martz offered an example of how Augmented Images will work: letting an ARCore user see what’s inside a box before opening it. Both Sceneform and Augmented Images will apparently be for Android devices only.


AT I/O, GOOGLE SHOWED ITS WILLINGNESS TO CHANGE AND SHAPE OUR LIVES

Every company in Silicon Valley will tell you, with operatic grandeur, that it aims to change the world and make it a better place. But set aside the pretenders trying to sell you $400 juicers: Google happens to be a company that can actually alter the way we, as a global society, interact with and understand one another. Google controls the world’s dominant search engine, web browser, video and email platforms, mapping service, and mobile operating system. The decisions this company makes have far-reaching effects, and Google I/O 2018 presented a vision of the future that makes Google even more personal, influential, and essential in our daily lives.

One of Google’s promotional videos during the event concluded with the tagline “just make Google do it.” Like the very name of the Google Assistant it was advertising, this promo positions Google’s services as your servants — the Alfred to your Batman, as it were. But this deliberately benign portrayal masks a huge number of proactive decisions that Google makes every day on our behalf, and it’s worth revisiting those in light of the company’s latest announcements.


GOOGLE LENS ACTUALLY SHOWS HOW AI CAN MAKE LIFE EASIER

Google’s AI-AR platform is going to be everywhere soon, and it works really well now. Lens is effectively Google’s engine for seeing, understanding, and augmenting the real world. It lives in the camera viewfinder of Google-powered software like Assistant and, following an announcement at I/O this year, within the native camera of top-tier Android smartphones. For Google, anything a human can recognize is fair game for Lens. That includes objects and environments, people and animals (or even photos of animals), and any scrap of text as it appears on street signs, screens, restaurant menus, and books. From there, Google uses the expansive knowledge base of Search to surface actionable info like purchase links for products and Wikipedia descriptions of famous landmarks. The goal is to give users context about their environments and any and all objects within those environments.

The platform, first announced at last year’s I/O conference, is now being integrated directly into the Android camera on Google Pixel devices, as well as flagship phones from LG, Motorola, Xiaomi, and others. In addition, Google announced that Lens now works in real time and can parse text as it appears in the real world. Google Lens can now even recognize the style of clothing and furniture, powering a recommendation engine the company calls Style Match, which is designed to help Lens users decorate their homes and build matching outfits.
