Experimental Category Entries

3M™ Intelligent Control Inhaler

Company 3M

Introduction Date April 2, 2018

Project Website http://go.3m.com/intelligencontrol

Why is this project worthy of an award?

Simply put, this device has the potential to improve the lives of people with asthma and COPD. According to one study, up to 94% of patients do not use their inhalers correctly and don’t even realize it. It is difficult to know whether you inhaled the right dose, which often leaves patients with inadequate treatment and exacerbated health complications. In the 60 years since the first metered dose inhaler (“MDI”) was invented, there has never been a way to address that problem in one device, until now. The 3M™ Intelligent Control Inhaler, currently in development and not available for commercial sale, has built-in sensing technology that can detect whether a patient inhaled long enough to receive the full dose of the drug. Designed to link to patients’ digital devices, the companion app, also in development, will be able to provide additional information, including the opportunity for patients to review their usage, set reminders, and get feedback and training on proper inhalation technique. The app is also being designed to transmit adherence data (i.e., how often a patient takes their medication as prescribed), so healthcare providers may better understand individual patient progress and better direct patient care. These kinds of technology may not only improve patient outcomes; they may also help address healthcare costs. Studies have shown that preventive care can save money. The real patient data these technologies may generate aims to put actionable analysis into the hands of the people who need it most, with the overall goal of making each patient’s treatment more effective and efficient.
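As a rough illustration of the sensing concept only (3M has not published its algorithm), a device like this might classify an inhalation as complete when inspiratory flow stays above a threshold for long enough. The sketch below is hypothetical; the sampling rate, thresholds, and function name are all assumptions:

```python
# Hypothetical sketch of dose-completeness detection from a flow sensor.
# Thresholds and sampling rate are illustrative assumptions, not 3M's
# actual device logic.

SAMPLE_RATE_HZ = 100        # flow samples per second (assumed)
FLOW_THRESHOLD_LPM = 30.0   # minimum inspiratory flow, L/min (assumed)
MIN_INHALE_SECONDS = 1.5    # sustained duration for a full dose (assumed)

def inhalation_complete(flow_samples_lpm):
    """Return True if flow stayed above the threshold for long enough in a row."""
    needed = int(MIN_INHALE_SECONDS * SAMPLE_RATE_HZ)
    run = 0
    for flow in flow_samples_lpm:
        run = run + 1 if flow >= FLOW_THRESHOLD_LPM else 0
        if run >= needed:
            return True
    return False
```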

What else would you like to share about your design? Why is it unique and innovative?

This product was designed from the start with input from patients and the physicians who care for them.

Who worked on the project?

Sam Van Alstyne, Marketing; Louise Righton, Marketing; Richard Brewer, Engineer; Steve Wick, R&D; Richard Moody, R&D; Lesley Hoe, Regulatory; Stewart Griffiths, Project Management


60th GRAMMYs - Play the City

Company Tool

Introduction Date January 25, 2018

Project Website

Why is this project worthy of an award?

‘Play the City’ was a first-of-its-kind AR installation that both promoted the 2018 Grammys and introduced unsuspecting Uber passengers to an augmented ‘symphony of the city.’ The result was an unforgettable experience that spread through word of mouth and achieved exactly what it set out to do: honor the 60th Grammy Awards and the city of New York in all its live, immediate glory. The installation was modest in scale, making full use of a single rear passenger window rather than a whole vehicle. There is a tendency to go ‘as big as possible’ with an AR concept, but this installation made the most of its window ‘canvas,’ using it as a screen that visualized musical notes via the moving, everyday elements of NYC life. Mundane aspects of sidewalk life that are usually accompanied by unpleasant sounds, such as traffic, were transformed into melodic parts of a whole symphony.

What else would you like to share about your design? Why is it unique and innovative?

The AR artistry in ‘Play the City’ is particularly notable: the installation cleverly used the motion-filled, right-to-left directional view through a car’s rear passenger window as its own music sheet. Each passing person and object ‘scrolled’ past the passenger’s eye in the direction one reads music. Not only was this a unique visual achievement; the musicality of the installation was also groundbreaking, in that each musical note was synchronized with an element outside the passenger window as it ‘scrolled by.’
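As a purely hypothetical sketch of how such a window-as-music-sheet mapping could work (this is not Tool’s implementation; the scale, coordinates, and names are invented for illustration), tracked street objects might trigger notes as they scroll right-to-left past a fixed ‘playhead’ column, with their height on the window selecting the pitch:

```python
# Hypothetical sketch: the window as a music sheet. Tracked objects scroll
# right-to-left; crossing the playhead column triggers a note whose pitch
# is chosen by the object's height. Scale and coordinates are assumptions.

C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72]  # MIDI pitches (assumed)
PLAYHEAD_X = 0.2  # normalized window x-coordinate where notes trigger

def pitch_for(y):
    """Map normalized height (0 = bottom, 1 = top) to a pitch."""
    idx = min(int(y * len(C_MAJOR_PENTATONIC)), len(C_MAJOR_PENTATONIC) - 1)
    return C_MAJOR_PENTATONIC[idx]

def notes_this_frame(tracked, prev_x):
    """Emit pitches for objects that crossed the playhead this frame.
    tracked maps object id -> (x, y); prev_x remembers last frame's x."""
    notes = []
    for obj_id, (x, y) in tracked.items():
        if prev_x.get(obj_id, 1.0) > PLAYHEAD_X >= x:  # crossed right-to-left
            notes.append(pitch_for(y))
        prev_x[obj_id] = x
    return notes
```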

Who worked on the project?

Managing Partner / Executive Producer - Oliver Fuselier - Tool
Managing Partner / Executive Producer - Dustin Callif - Tool
Director - Phillip Montgomery - Tool
Executive Producer, Integrated - Joshua Greenberg - Tool
Head of Production, Live Action - Ian Falvey - Tool
Head of Production, Innovation - Michael Bucchino - Tool
Sr. Experiential Producer - Matthew Kahn - Tool
Creative Director - Ben Priddy - Tool
Producer - Jason Manz - Tool
Lead Developer - Wim Vanhenden - Tool
Lead Developer - Gwen Vanhee - Tool
Composer - Eddie Alonso - Tool
Renato Fernandez - Chief Creative Officer - TBWA\Chiat\Day
Linda Knight - Executive Creative Director - TBWA\Chiat\Day
Jason Karley - Creative Director - TBWA\Chiat\Day
Nomi Malik - Art Director - TBWA\Chiat\Day
Marco Monteiro - Art Director - TBWA\Chiat\Day
Ryan Siepert - Copywriter - TBWA\Chiat\Day
Armando Samuels - Copywriter - TBWA\Chiat\Day
Brian O’Rourke - Executive Director of Production - TBWA\Chiat\Day
Anh-Thu Le - Executive Producer - TBWA\Chiat\Day
Erika Buder - Associate Producer - TBWA\Chiat\Day
Pamela Lloyd - Brand Director - TBWA\Chiat\Day
Caroline Hanley - Brand Manager - TBWA\Chiat\Day
Mimi Hirsch - Senior Business Affairs Manager - TBWA\Chiat\Day
Neil Barrie - Chief Strategy Officer - TBWA\Chiat\Day
Jeremy Schumann - Brand Strategist - TBWA\Chiat\Day
Aeden Keffelew - Junior Planner - TBWA\Chiat\Day
David Brodie - Editor - Rock Paper Scissors
Billy Hobson - Colorist - Shed
Miles Essmiller - VFX - Shed
Paul O’Shea - VFX - Shed
Mark Meyuhas - Audio Mixer - Lime


Adobe Project SonicScape

Company Adobe

Introduction Date October 21, 2017

Project Website

Why is this project worthy of an award?

Project SonicScape is a technology developed by researchers in Adobe’s Design organization that allows for editing 3D audio in the context of immersive media (virtual reality/360 video). Traditionally, creating fully immersive content for 360 video requires two pieces of hardware: a 360 video camera and an ambisonics microphone. This creates a challenging, time-consuming task for editors, who must manually align the audio spatially with the 360 video, relying mainly on perception alone. Ambisonics is an audio technology that captures and plays back audio spherically to enable full immersion, mimicking how sounds are perceived in real life all around you. SonicScape presents a simpler, more intuitive, and streamlined approach, and takes the guesswork out of the mix.

In short, SonicScape is a way to visualize and manipulate ambisonics recordings. The technology creates visual cues in the form of particles representing sounds, bringing audio frequencies to life in space and creating a seamless, synchronized video and audio editing experience. Users simply drag and drop sounds where they choose in their immersive video content, with full control to make the sounds align perfectly with the direction they are supposed to come from, no matter where the viewer is currently looking in the video. To put this in context, imagine a user wearing a headset to view 360 or VR content who sees a scene with a tree in front of them and hears birds chirping on their right. The user then pivots to face the birds and, in a 360 or VR setting, the sound should now come from the front of the scene where the birds are (instead of the right, where the sound was originally coming from). With SonicScape, the editor only has to drag the visual particles to face the direction the sound should come from, a fix that takes a matter of seconds. Additionally, traditional ambisonics microphones are great for capturing ambiance but are not always ideal for capturing crisp details. With SonicScape, users can also drag in new sound clips and place them in 3D space to enhance the complete scene of a 360 video.

SonicScape was created by researchers in Adobe’s Design organization to help solve real-world problems and is not part of Adobe’s current product offerings. The research project represents a great leap in video editing and provides a tangible vision for what the future of immersive video could look like for users in every industry, from Hollywood video editors to self-starting YouTubers. With the simple and easy-to-use design of SonicScape, users will be able to see and understand their 360 footage as they edit it, import 360 video and audio assets into the scene, change their orientation, and easily position text, graphics, and sound effects, all in one interface.
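Under the hood, dragging the particles into alignment corresponds to rotating the recorded sound field. For first-order ambisonics, a yaw rotation is simply a 2x2 rotation applied to the X and Y channels, as the standard math below shows. This is a generic sketch, not Adobe’s implementation; the function name and traditional B-format channel layout are assumptions:

```python
import numpy as np

def rotate_foa_yaw(w, x, y, z, theta_rad):
    """Rotate a first-order ambisonic (traditional B-format) recording
    about the vertical axis by theta radians. A source at azimuth phi
    moves to phi + theta; the W and Z channels are unaffected by yaw."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return w, c * x - s * y, s * x + c * y, z
```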

What else would you like to share about your design? Why is it unique and innovative?

Project SonicScape was released as one of Adobe’s “sneaks” at the MAX conference in October 2017. Sneaks are experimental in nature and show off some of the latest technology being developed at Adobe. We share new ideas and technology so that we can get real, immediate feedback from our customers and community about what those innovations mean to them: what’s useful, what can be better, what can be more impactful and meaningful to creative people in their daily lives. Immersive media ultimately presents a new canvas for the creative class, one in which we want to remove all barriers to exploration, and Adobe can help accelerate its adoption. We aren’t creating the content; we are creating the tools that will drive creative expression forward. In doing so, Adobe can power the excitement of what immersive media can be into the innovation of what it ultimately becomes.

For editors, it is hard to see, understand, and manipulate content in a 360 environment on desktop and in a head-mounted display (HMD). SonicScape solves this by providing an intuitive authoring environment for both desktop and HMD that is highly tailored for immersive content editing, and Adobe is positioned to provide an authoring environment that encompasses both video and audio within the same tool. By visualizing meaningful information from both audio and video content, with an emphasis on direct manipulation on the canvas, editors can move their content intuitively and have it remain in context. With traditional methods, editors have had to jump between several different applications or plugins, each disconnected from the others, to complete this workflow; here we have a system that lets you do everything you need for 360 content.

One of the key features of SonicScape is the ability to visualize ambisonic audio recordings. Spatial audio visualization converts the four-channel ambisonic recording into a video in the same format as the VR video (a spherical panorama), where each pixel of the video indicates the intensity of sound coming from that direction. This visual representation of the spatial distribution of audio makes the alignment task much simpler by allowing the user to easily click and drag on the audio visualization to rotate it into alignment with the video.
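As a minimal sketch of that visualization idea (the general technique, not Adobe’s algorithm; names and the B-format gain convention are assumptions), one can compute a per-direction intensity map by pointing virtual cardioid microphones at each pixel’s direction on the spherical panorama:

```python
import numpy as np

def directional_energy_map(w, x, y, z, n_az=64, n_el=32):
    """Beamform a first-order ambisonic clip (1-D sample arrays) onto an
    equirectangular grid; each output cell is the energy heard by a
    virtual cardioid microphone aimed in that direction."""
    azimuths = np.linspace(-np.pi, np.pi, n_az, endpoint=False)
    elevations = np.linspace(-np.pi / 2, np.pi / 2, n_el)
    grid = np.zeros((n_el, n_az))
    for i, el in enumerate(elevations):
        for j, az in enumerate(azimuths):
            dx = np.cos(el) * np.cos(az)   # unit look direction
            dy = np.cos(el) * np.sin(az)
            dz = np.sin(el)
            # Cardioid beam; sqrt(2) undoes the traditional B-format W gain.
            beam = 0.5 * (np.sqrt(2) * w + dx * x + dy * y + dz * z)
            grid[i, j] = np.mean(beam ** 2)
    return grid
```

Brightening each panorama pixel in proportion to its cell’s energy yields the kind of overlay described above, and re-running the map after a rotation shows the hotspot sliding into alignment with the video.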

Who worked on the project?

Yaniv De Ridder, Lead Senior Experience Developer
Michael Cragg, Lead Senior Experience Designer
Ben Farrell, Senior Experience Designer
Stephen DiVerdi, Principal Scientist
Tim Langlois, Research Scientist


A Future without Synthetic Fertilizer: Rethinking Crop Nutrition

Company Pivot Bio

Introduction Date February 1, 2019

Project Website http://www.pivotbio.com

Why is this project worthy of an award?

Pivot Bio is providing corn farmers with a better-performing, cleaner replacement for synthetic nitrogen fertilizer. By using Pivot Bio’s nitrogen-producing microbes to spoon-feed corn plants during the growing season, farmers optimize crop yields while improving water and air quality. While fertilizer use is critical for crop production and contributes to an abundant global food supply, it has adverse effects on our environment, including polluting our air and waterways. A portion of chemical fertilizer decomposes into nitrous oxide, a greenhouse gas 300 times more potent than CO2, making it responsible for about 5% of global warming. In addition, rains wash excess fertilizer into streams and rivers, causing algal blooms that suffocate fish and aquatic life. Traditional fertilizers are washed away through surface runoff and groundwater leaching, making the process inefficient and harmful to the environment. Worldwide, fertilizer-linked pollution is responsible for more than 500 dead zones, places so toxic that nothing lives there. One of the largest is where the Mississippi River empties into the Gulf of Mexico; it is the size of New Jersey. Pivot Bio’s nitrogen-producing microbes provide consistent crop nutrition for every plant in a field, increasing yield predictability and simplifying a farmer’s workload. Farmers drive a tractor through the field fewer times, which means less destructive soil compaction and more time to spend with family or managing other parts of the farm. Pivot Bio’s unique approach uses naturally occurring, non-transgenic microbes to help farmers achieve the same yield with fewer inputs and less global impact than synthetic fertilizer. In one square inch of soil there are billions of soil microbes, and Pivot Bio identifies the rare microbes that have the innate ability to produce nitrogen. This ability, however, has gone dormant after decades of synthetic nitrogen fertilizer use. Using a proprietary process, Pivot Bio optimizes the microbes’ natural ability to produce nitrogen. These microbes are applied in-furrow at planting or, eventually, as a treatment on the corn seed. Because they live symbiotically with the plant, the microbes spoon-feed nitrogen to the corn plant throughout the growing season with none of the adverse consequences of synthetic fertilizer.

What else would you like to share about your design? Why is it unique and innovative?

Pivot Bio’s first-generation nitrogen-producing microbe product has the potential to replace up to 25 pounds per acre of applied nitrogen in U.S. corn fields. This has significant environmental benefits: it could reduce total NOx emissions by approximately 39-57 million pounds, which is equivalent to taking one million cars off the road. Further, Pivot Bio’s solution could help reduce reactive nitrogen leaching from U.S. corn acreage by 329 million pounds, which equates to saving more than $3 billion in new U.S. wastewater treatment capacity costs. Across five seasons of in-field testing, the Pivot Bio team has cultivated strong relationships with farmers to design a product that exceeds their expectations and meets their demands for preserving yield while providing environmental benefits. The Corn Belt’s most innovative farmers are beta testing our inaugural commercial product on their farms this summer. They are part of Pivot Bio’s “Intent to Pivot” program, which builds direct relationships with the most progressive farmers. In 2019, Pivot Bio will commercially introduce the world’s first nitrogen-producing microbes for corn, a product that improves farming practices and economics while bettering the planet around us. What started as a thesis project for two UC Berkeley researchers has become the mission of a rapidly growing Bay Area startup. Pivot Bio will expand from the US to markets throughout the world in the coming years. The product portfolio will also expand beyond corn to include solutions for farmers of wheat, rice, and other cereal crops. In 10 years, the company aims to provide the sole source of nutrients needed by crops worldwide. Pivot Bio is pioneering a future without synthetic nitrogen fertilizer and leading the charge toward sustainable, eco-friendly crop nutrition. With initial funding from the Bill & Melinda Gates Foundation, the National Science Foundation, and DARPA, the team is unlocking new ways to provide nitrogen and other key nutrients required for crops to flourish and to help clean up the world’s air and waterways.

Who worked on the project?

Karsten Temme, Ph.D., Co-founder and CEO
Alvin Tamsir, Ph.D., Co-founder and Chief Scientific Officer


Agile, Autonomous, and Customizable Air Intelligence Platform

Company Airgility, Inc.

Introduction Date May 8, 2018

Project Website http://www.airgility.co

Why is this project worthy of an award?

The Company’s mission is to develop and field the most versatile vertical takeoff and landing (VTOL) systems. Our patent-pending VTOL flagship products are the HorseSHU, a breakthrough aircraft leveraging a lifting body with thrust-vectoring propulsion using swash-plate controls, and the miniSHU, a low-cost, 3D-printed, on-demand VTOL small UAS. Both products, while optimized for both vertical and forward flight regimes, offer the best combination of payload weight, payload volume, flight endurance, speed, and mission adaptability. Our designs are complete unmanned aerial system (UAS) solutions suitable for a multitude of commercial and military applications. They employ a tilt-nacelle design that combines the vertical takeoff and landing and hovering capabilities of a multi-rotor UAS with the fast and efficient forward-flight capabilities of an airplane in one practical, lightweight platform. Drones can be divided into three categories based on their drive mechanisms: multi-rotor, fixed-wing, or hybrid. Multi-rotors are simple to operate but extremely limited in range and payload capacity. Fixed-wing aircraft are efficient and have great range, but they are cumbersome to operate and transport and require runway real estate. Helicopters have good payload capacity and medium range, but they are mechanically complex, dangerous to operate, easily damaged, and maintenance-intensive. Airgility is developing hybrid vertical takeoff and landing aircraft that can hover as needed but can also transition to faster, more efficient fixed-wing-style forward flight, all without requiring infrastructure (launcher/capture system or runway) for takeoff and landing. The number of hybrid drones on the market is currently very low and lacking in innovation, which presents the opportunity to gain first-mover advantage and establish a foothold. In addition, with our 3D-printed design, maintenance, part fabrication, and hardware adaptation for new missions can occur on demand, a feature unavailable in any other hybrid VTOL drone. Finally, Airgility’s designs blend the payload, range, and speed advantages of fixed-wing drones with a small footprint and agility, which represents the future standard of drone design as supported by our customer-needs discovery.

What else would you like to share about your design? Why is it unique and innovative?

The miniSHU/duoSHU 3D-printed unmanned aerial systems (in picture 8) provide unique value propositions by way of their truly innovative design and engineering-level integration of "best of" characteristics from various platforms. The patent-pending design blends the aerodynamic efficiency of a lifting body, the controllability of thrust vectoring, the flexibility of rapid prototyping, the robustness of an exoskeleton with minimal internal structure and maximum internal-volume flexibility, and a 3D-printer-friendly shell construction that is both scalable and flexible for future mission iteration and growth. Lastly, most of the benefits and features coming from the additive manufacturing industry are directly applicable to the miniSHU and immediately adoptable. This means that material advancements for 3D printing, industry maturity driving down 3D-printing costs, and new material capabilities can further reduce unit cost and open unique market segments. For example, water-soluble 3D-printing materials currently exist. This feature could be used for covert Special Operations missions in which the vehicle is deliberately flown into a body of water so that traces of the vehicle’s presence are destroyed. The competition cannot offer this kind of capability or this trajectory for future growth. Our unique aerodynamic design also lends itself to scalability, meaning we can make the aircraft smaller or larger and still retain the favorable aerodynamic characteristics seen in the photos. Airgility’s unique designs are nothing like the common drone and unmanned-vehicle designs we see in the market. So while there is a plethora of drone-related patents, none combines an exoskeleton, a lifting wing, and vectored thrust to produce the combination of payload weight and volume, long range, high speed, and overall practicality that the market demands.

Who worked on the project?

Evandro Valente, CTO
Pramod Raheja, CEO