Experimental Category Entries

Lab100 at Mount Sinai

Company Icahn School of Medicine Mount Sinai

Introduction Date November 1, 2017

Project Website http://lab100.org

Why is this project worthy of an award?

Health isn’t merely the absence of disease. It’s a continuum, and all of us are somewhere along it. At Lab100, we believe that this simple insight has the power to transform healthcare from a system that primarily treats disease to one that excels at maximizing health and preventing illness. Equal parts clinic, innovation hub, and research center, Lab100 at Mount Sinai is a first-of-its-kind comprehensive health assessment program. We are dedicated to moving healthcare forward by rapidly prototyping a radically redesigned end-to-end patient experience while simultaneously gathering data for basic research into the science of health. As a Lab100 patient, you are test-piloting the future of healthcare. Your experience will determine which innovations become standard clinical practice, and which go back to the drawing board. Furthermore, your de-identified biomedical data will help researchers identify new biomarkers and develop better health assessment tools, thereby improving our very understanding of what it means to be healthy. You’ll also receive the most comprehensive, quantitative health assessment currently available: information that will empower you to make smarter choices about how to move along the continuum toward better health. Welcome to the future of healthcare.

What else would you like to share about your design? Why is it unique and innovative?

Lab100 at Mount Sinai is a unique clinical experience developed by a team of doctors, data scientists, designers, architects, engineers, and patient-collaborators that radically reimagines how we measure health and deliver care. Lab100 was created within the Institute for Next Generation Healthcare. It takes 17 years on average for biomedical innovation to reach clinical practice [1]. Lab100 uses an iterative product development approach to rapidly accelerate the pace at which promising ideas become clinical practice.

WHO WE SERVE

By bridging the gap between patient, provider, researcher, and innovator, Lab100 creates a virtuous cycle of innovation that benefits each key stakeholder in the equation:

• We empower patients to track their health over time by providing the most comprehensive health risk assessment and biometric screening currently available.
• We unburden providers from being data collectors, and offer a more holistic understanding of their patients' health, by integrating best-in-class measurement tools and data visualization.
• We equip researchers with longitudinal, multi-scale health data, enabling discoveries linking health metrics, interventions, and outcomes.
• We enable product developers to develop, validate, and deploy products and services that solve real patient problems.

WHAT WE MEASURE

• Vital signs
• Blood draw for clinical labs
• 3D body surface scan for anthropometrics
• Body composition analysis
• Cognition battery (vocabulary, attention, flexibility, processing speed, episodic memory)
• Strength
• Dexterity
• Balance
• Robust health surveys (medical history, mental health, sleep, nutrition, physical activity)
• Additional research-grade measurements, including DNA sequencing and computational image analysis using advanced A.I. and statistical machine learning techniques

OUR MISSION

“The tools for a more proactive healthcare system already exist, but our current system is not designed for rapid advancement or immediate application of new health technologies. By closing the feedback loop between discovery science and care delivery, Lab100 aims to provide superior care for patients and enable unparalleled insights for physicians, researchers and innovators.”
Dr. David Stark, Lab100 Director

[1] Balas EA, Boren SA. Managing clinical knowledge for health care improvement. Yearbook of Medical Informatics. 2000:65-70.

VIDEO CONTENT

Dropbox: https://www.dropbox.com/sh/gr9movdkmbfqgri/AABD1pnKHRA80Qr1uq3oCOSXa?dl=0
Vimeo (password: lab100atmountsinai):
Lab100 — Introduction: https://vimeo.com/252503396
Lab100 — Before Your Visit: Patient Introduction: https://vimeo.com/236108246
Blood Draw: https://vimeo.com/236108534
Body Composition Assessment: https://vimeo.com/236108868
Balance Assessment: https://vimeo.com/236989616

Who worked on the project?

David Stark – Lead
Joel Dudley – Executive sponsor
Sarah Pesce – Clinical operations
Savi Glowe – Operations
Jerome Scelza – Project management
Max Tomlinson – Data science
Eddye Golden – Clinical research coordination
Mark Shervey – Data architecture
Shelly (Sparshdeep) Kaur – Project coordination
Arpana (Amy) Divaraniya – Data science
Noah Zimmerman – Thought leadership
Brian Kidd – Thought leadership
Jason Bobe – Thought leadership
Ryan Viglizzo – Design assistance
Gregory Botwin – Project management
Noah Waxman – Strategy / Content direction
Lucas Werthein – Technology / Production direction
Marcelo Pontes – Architecture / Design direction
Craig Pickard – Software development
Jorge Proaño – Software development
Marc Abbey – Front-end development


Lachesis: Drawing the Fabric of Life

Company Tufts Living Materials Silklab

Introduction Date March 23, 2018

Project Website

Why is this project worthy of an award?

In myth, Lachesis, the weaver goddess of destiny, embeds emotion in every thread, softly locking fate into the fabric of our lives. We have engineered a future Lachesis by drawing the fabric of life with interactive surfaces able to sense and display mood, health, or pollution. They respond with color to environmental chemistries carried in, for instance, tears, sweat, or rain, making them the first emotional textiles of their kind. Lachesis is screen-printed with bio-active inks based on regenerated silk protein that enable distributed sensing at multiple scales, from millimeters to several meters, with unprecedented resolution, aesthetic quality, and striking visual transformations. The favorable encapsulation and stabilization properties of printed living-material inks allow a suite of sensing molecules, as well as other useful compounds, to be incorporated into wearables and flexible surfaces, rendering them conductive, thermochromic, photothermal, iridescent, or even bioluminescent. Applications range from the mass production of bio-sensing soft wearables to environmental sensing platforms. In human wearable technology, for instance, small, soft health-sensing devices such as conformal fabric patches and soft bracelets can be developed. Other examples are fashion elements, such as athletic clothing embedded with biosensors. Finally, large-scale environmental monitoring surfaces can be implemented in the built environment through sensing tapestries or wallpaper systems, up to macro-scale formats such as weather- and pollution-tracking tensile net structures, façade elements, or shading strategies, with specific alterations to the functionalized ink formulations.

What else would you like to share about your design? Why is it unique and innovative?

There is a quest for information in the world of wearables and environmental interfaces, aiming to bridge transduction and physiologically relevant information. Efficient, at-scale transfer of biologically active inks onto soft everyday surfaces offers an attractive pathway towards fast, integrated sensing of changes in health and surroundings, key to developing distributed monitoring and individualized interfaces. Ready-to-wear objects embedding flexible electronics, biosensors, and/or environmental monitoring - such as rings, tattoos, skin and fabric patches, bandages, bracelets, or smartphones - enable biometric data collection, but do not offer seamless and comfortable integration within everyday objects and wearables. With Lachesis and other manifestations of our technology, we propose to improve current environmental sensing techniques by combining biologically active inks with large-scale, highly versatile screen-printing methods. Specifically, we develop: (i) novel colorimetric pH-sensing inks, with (ii) the appropriate rheological properties to be screen-printed, (iii) bound efficiently enough to remain on surfaces like fabric, (iv) at high resolution, and (v) while preserving the mechanical properties of the substrate after printing. Demonstrating bio-sensing in new fabric formats will allow remote recognition and distributed applications that can be fabricated into large-scale geometries. Our vision is to develop interactive, environmentally connected soft surfaces expandable to multiple parallel detection modalities, including monitoring of air and water quality, athletic performance, and human health.

Image captions:

Fig. 1: Lachesis by the Silklab comprises three-meter-tall interactive surfaces able to change color when sensing their surroundings. Image by Design Does.
Fig. 2: Silklab's living material inks are sustainably developed from regenerated silk in silk cocoons. Silk protein is extracted from cocoons and can be used in a myriad of digital fabrication methods to create anything from implantable screws to edible electronics to 3D-printed fashion. It can also encapsulate and stabilize bio-active molecules to generate diverse types of sensing compounds. Images by Saoirse Loftus-Reid and Laia Mogas.
Fig. 3: Silk-based bio-active inks can be screen-printed to render soft surfaces such as textiles interactive and sensitive to their surroundings. Image by Laia Mogas.
Fig. 4: Interactive textiles can be manufactured at the Silklab at the macro scale to generate Lachesis. Image by Laia Mogas.
Fig. 5: Lachesis can sense changes in sweat, tears, or rain and change color according to mood, pollution, or human health. Images by Laia Mogas and Design Does.
Fig. 6: Lachesis's surface pattern was computationally designed to mimic the hierarchically woven fibrous structure of a silk cocoon under the microscope. Image by James Weaver and Laia Mogas.
Fig. 7: Lachesis was commissioned by the Design Does exhibit at Barcelona's Museu del Disseny, gathering a diverse public around the question of 'what should design do?'. Image by Design Does.
Fig. 8: Lachesis comprises a triad of three-meter-tall interactive surfaces able to change color when sensing human mood, skin health, and rain pollution, and a series of small objects that tell the story of silk and silk bio-inks. Images by Laia Mogas and Andres Flajszer.

Who worked on the project?

Lachesis was designed and developed by architect Laia Mogas and engineer Giusy Matzeu at the Tufts University Living Materials Silklab led by Prof. Fiorenzo Omenetto. Materials and methods are part of a current effort to bridge science, technology, and art with design and fabrication of living material products. Commissioned for the ‘Design Does’ exhibit at Museu del Disseny de Barcelona and supported in part by the Art and Science seed fund from the Office of the President at Tufts and the U.S. Office of Naval Research and performed in collaboration with the Center for Applied Brain and Cognitive Sciences. With gratitude to the Mayo Clinic, artist Michael Hecht at the Tufts SMFA, videographer Saoirse Loftus-Reid, photographer Andres Flajszer, and craftsman Fernando Suarez at EFS Designs. Heartfelt thanks to the Berklee Global Jazz Institute under the direction of Danilo Perez and Marco Pignataro, and to their musicians for the original music score, Lior Tzemach - guitar, Tyrone Clarence Allen - bass, and Sebastian Kuchczynski - drums.


LAMBCHILD SUPERSTAR: Making Music in the Menagerie of the Holy Cow

Company WITHIN

Introduction Date April 20, 2018

Project Website

Why is this project worthy of an award?

LAMBCHILD SUPERSTAR: Making Music in the Menagerie of the Holy Cow is a shared virtual-reality experience from WITHIN, Chris Milk’s VR/AR company, and OK Go’s lead singer, Damian Kulash. LAMBCHILD was developed to give participants of all musical skill levels the opportunity to experience the pure joy musicians share when they create original songs together. And of course, there’s a twist: the music is made by animals and robots, wondrously working together to create the beats, notes, and melodies. OK Go partnered with Chris and the WITHIN team to create a virtual reality music video for a new song. But the team soon realized that they could create something even bigger and more original: help fans feel the surge of glee that comes from pairing the right chords and the right melody. In LAMBCHILD SUPERSTAR, two users connect with each other using an Oculus Rift or HTC Vive — whether they are in the same room or miles apart — and enter a world filled with animals eager to help them perform music: from an electric eel making electro noises to marching lemmings acting as a drum machine. Again, no instrumental or musical experience is required – only the willingness to push buttons, pull levers, and have fun.

What else would you like to share about your design? Why is it unique and innovative?

While existing VR experiences have showcased footage of live music, or allowed you to immerse yourself aurally in a song, the ability to enter another world where you can create an original song from scratch — with a friend, no less — is unprecedented. It was crucial to the WITHIN team and Damian to create an experience centered on how it feels to create music, without any instrumental or musical experience required. To do this, the team married advanced musical theory with an intuitive, friendly interface. The musical algorithms that underlie LAMBCHILD SUPERSTAR make it an incredible tool for pop music composition; the team analyzed more than 30,000 songs, extracting the key elements of catchy pop music. It is those elements that constitute the toolkit offered to users — a creative palette that proves fun and generative for all levels of musical experience. Several core components of the project are rooted in innovative technology: for example, the avatars are standardized to look the same to both users, giving them expressions based on user mic levels, which – combined with pre-made animations – make the avatars feel that much more alive. Additionally, the locomotion within the experience makes movement feel incredibly realistic — indeed, nearly identical to how it feels to walk in real life.
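As a rough illustration of the mic-driven expressions described above, the mapping from microphone level to avatar state might look like the following sketch. The function name, thresholds, and state labels here are hypothetical, not taken from the actual LAMBCHILD SUPERSTAR code:

```python
# Hypothetical sketch: driving a coarse avatar expression state from a
# normalised microphone input level, as the entry describes. Thresholds
# and labels are illustrative only.

def expression_from_mic(level, talk_threshold=0.05, excite_threshold=0.6):
    """Map a normalised mic level (0.0-1.0) to a coarse avatar state."""
    if level >= excite_threshold:
        return "excited"   # animated mouth plus raised brows at high levels
    if level >= talk_threshold:
        return "talking"   # mouth follows the audio envelope
    return "idle"          # blend back to a pre-made resting animation
```

In practice a system like this would be combined with pre-made animations, as the entry notes, so the avatar never sits frozen between states.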

Who worked on the project?

A WITHIN Original in association with Oculus Studios
Directed by Chris Milk & Damian Kulash

CREDITS
Project Creator - Chris Milk, Damian Kulash
Key Collaborator - WITHIN, OK Go, Oculus
Developers - Horizons Studio, WITHIN
Executive Producer - Aaron Koblin, Chan Park, Yelena Rachitsky
Director of Development - Jonathan Ahdout
Art Director / Lead Visual Designer - Jona Dinges
Animation & Character Supervisor - Adam Sidwell
Animation Producer - Robert Castaneda
Lead Technical Artist - Thor Benitez
Associate Producers - Robin Cho, Spencer Burnham
Sound & Music - OK Go, Hook Theory, Ryan McGee

View the project video:


Line of Sight

Company Mettle Studio

Introduction Date February 11, 2018

Project Website https://mettle-studio.com/project/Line-of-Sight

Why is this project worthy of an award?

Zebra crossings today are effective, but they are only suitable in areas where traffic levels are low and where the line of sight is unlikely to be blocked. In many ways, smart roads of the future will operate similarly to zebra crossings: traffic will flow based upon demand, and being held at a red light for no reason will be a thing of the past. In October 2017, Direct Line developed the world’s smartest pedestrian crossing. Following extensive user testing, and given that over 7,000 incidents between pedestrians, vehicles, and cyclists are recorded each year at UK crossings, it became clear there was demand for a new crossing solution. However, there were questions around the feasibility, affordability, and usability of the original prototype. Earlier this year, we built on the original idea, winning a smart-crossing-of-the-future design challenge. We then took our idea – Line of Sight – from concept to working proof of concept. Line of Sight is a piece of smart road furniture that uses people-detection technology to track and predict pedestrian behaviour at zebra crossings. Drivers and cyclists are alerted to approaching and crossing pedestrians via an LED strip that spans the road. Cameras are positioned on either side of the road to track pedestrians who would otherwise be obscured. This is then communicated back to drivers and cyclists via light animations, extending the drivers' line of sight. We were keen to design something that would immediately improve road-crossing safety, be simple to retrofit to current crossings, and be cost-effective, to encourage take-up by councils. We built our proof of concept using cutting-edge tools such as Google’s TensorFlow, making it possible for us to rapidly develop and refine functional machine learning models and prototypes. Using the person-detection software, different patterns of light, in line with the Highway Code, would be set off depending on where a person was detected.
The finished proof of concept, a 5-metre-long prototype, was rigorously stress-tested with vehicles. Having built our prototype, we set up our smart crossing in Somerset House and tested our build in a safe environment. Testing allowed further understanding of people’s behaviour around zebra crossings, informing the development of its future design.
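As an illustration of the detection-to-light pipeline described above, the decision logic that turns detections into LED animations could be sketched as follows. All names, distances, and pattern labels here are hypothetical; in the real build, the per-frame detections would come from a TensorFlow person-detection model running on the camera feeds:

```python
# Illustrative sketch only: mapping per-frame pedestrian detections to
# LED-strip patterns. Zone distances and pattern labels are invented for
# this example and are not Line of Sight's actual parameters.

CROSSING_ZONE = (2.0, 3.0)   # metres along the kerb spanned by the crossing
APPROACH_MARGIN = 1.0        # metres either side counted as "approaching"

def led_pattern(detections):
    """Pick a light pattern from one frame's pedestrian detections.

    detections: list of (position_m, on_crossing) tuples, where
    position_m is the pedestrian's position along the kerb and
    on_crossing flags someone already on the crossing itself.
    """
    if any(on_crossing for _, on_crossing in detections):
        return "full_width_stop"   # pedestrian on the crossing: light the whole strip
    lo = CROSSING_ZONE[0] - APPROACH_MARGIN
    hi = CROSSING_ZONE[1] + APPROACH_MARGIN
    if any(lo <= pos <= hi for pos, _ in detections):
        return "approach_warning"  # pedestrian near the kerb: warn drivers early
    return "idle"                  # nobody detected near the crossing
```

Running logic like this on every camera frame is what lets the strip react to pedestrians the driver cannot yet see.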

What else would you like to share about your design? Why is it unique and innovative?

The simple, versatile nature of Line of Sight means that not only is the design and installation feasible, but so is the development roadmap. The design can adapt, potentially future-proofing itself, as the requirements for road infrastructure change to accommodate connected autonomous vehicle (CAV) networks. The prototype was designed with cost in mind, keeping the design simple, robust, and easy for city councils to test. We are now speaking to local London councils, and we envisage the Line of Sight project developing rapidly to provide smarter control of traffic at all types of road intersection, and being at the forefront of development for crossings. Machine learning and AI will help to optimise our transport networks, and Line of Sight will be a key first step to making that happen. We envisage that the next generation of city infrastructure will communicate directly with CAVs, creating a connected and sustainable driving ecosystem. Initially, Line of Sight will alert vehicles to the presence of pedestrians, helping to reduce the 7,000 annual road traffic incidents that take place at crossings; later, it could facilitate an adaptive road network or inform pedestrians of CAV intentions. Line of Sight could also have positive effects on the sustainability and efficiency of cars on the road, since cars would no longer have to stop unnecessarily at traffic lights. This design looks forward to the future of smart infrastructure. By developing models that understand the human-vehicle interface, we can start to take advantage of the real benefits of autonomous vehicles. We’re excited to be playing a role in bringing smart cities to life!

Who worked on the project?

Alex Bone - Creative Lead
Luke Forward - Design Lead
Sam Parkinson - Technical Lead
Eddie Li - Software Engineer
Giovanni Lenguito - Software Engineer

View the project video: https://vimeo.com/257129014


Listening NYC

Company IDEO

Introduction Date October 11, 2017

Project Website http://listening.nyc

Why is this project worthy of an award?

When it comes to policing, there's no such thing as a city-wide shared experience. Where a person lives and what they look like can have a major effect on their relationship with the police force. After decades of work in the policy arena, the New York Civil Liberties Union (NYCLU) decided to take a closer look at the cultural forces—both inside and outside the police department—that allow policing inequality to persist. The organization engaged IDEO to transform this research into a campaign that could shift public opinion toward more equitable policing practices in NYC. Interviews with New Yorkers, police officers, and subject matter experts affirmed the common wisdom that there is no silver-bullet solution to big, systemic issues like reforming policing culture. It wasn't realistic to expect real change to come from a single pithy campaign message. Instead, real change would need to start with open, honest dialogue between everyone involved. Listening NYC is a grassroots, multimodal, civic campaign aimed at sparking meaningful discussion and collective action around the current state of policing in NYC. Unlike traditional marketing campaigns, which often tell people what to think, Listening NYC gives people the tools they need to speak for themselves. The campaign is built around a series of online and offline interactions that encourage courageous dialogue, no matter where people are or how much time they have to engage. The centerpiece of the ongoing campaign is the Listening Room, a portable pop-up structure that travels around the city and provides space for New Yorkers to discuss and take action on the issues that are important to them. The Listening Room encourages engagement by offering various levels of participation—from sending a postcard to the mayor using pre-printed stickers of suggested policy solutions to contributing to a community "mad lib" to listening to a radio broadcasting stories about police from diverse voices across the city. 
At the heart of the experience is a space for small groups to use the Conversation Cards, a graphical card game that sparks open dialogue about policing within a framework of emotional safety. Listening NYC's physical presence is bolstered by a microsite and Instagram profile designed to extend the campaign's reach and give participants an immediately accessible resource for continuing the conversation. The campaign's cross-platform, interactive approach has given the NYCLU a blueprint for applying the same dialogue structure to other complex issues the organization may choose to tackle in future. It's a framework for encouraging discussion and empathy—the first steps toward creating transformational change.

What else would you like to share about your design? Why is it unique and innovative?

Listening NYC is a multimodal civic campaign that inspires conversations about policing practices among New Yorkers of all viewpoints, and drives collective action towards a more just police department. Through a series of public pop-up events in parks, on city streets, and at other venues across the city, Listening NYC creates interactive environments that enable deeper listening, encourage open dialogue, and amplify ongoing conversations about policing. + The Listening Room The Listening Room is a pop-up environment in which New Yorkers across the five boroughs can share their stories and views about police interactions and policies, and listen to the experiences of others. This rapidly-assembled, traveling set anchors the in-person experience of the campaign and includes dedicated spaces for engaging with the activities described below. + Conversation Cards Conversation Cards is an interactive card game that sparks honest dialogue between people with different experiences of policing and helps them imagine possible solutions, either through personal behavior change or by influencing public policy. Simple and straightforward, the Conversation Cards provide just enough structure to open a challenging conversation while remaining unobtrusive to naturally evolving, open dialogue. Conversation Cards are available for download in English and Spanish on the Listening NYC website at listening.nyc. + Listen In Listen In is an audio installation that plays the voices of individual New Yorkers, including two New York Police Department officers, telling personal stories of their interactions with police in all five boroughs. Presented as a physical radio that can dial between channels for Brooklyn, the Bronx, Manhattan, Staten Island, and Queens, Listen In highlights how policing is experienced differently depending on where a person lives and what they look like. 
+ Tell The Mayor Visitors to the Listening Room are encouraged to turn their experience into action by writing to New York City mayor Bill de Blasio, aided by a wall of stickers that explain ten suggested policy reforms. Visitors select three stickers, affix them to a pre-addressed postcard along with any additional sentiment they wish to write, and deposit them in a slot for the campaign to send on their behalf. The sticker wall then becomes a "reverse heat map" of which policies are most popular with New Yorkers in that neighborhood. + Neighborhood Madlibs The final in-person experience of The Listening Room is a blank canvas that invites New Yorkers to express pride in their particular NYC neighborhood. This celebratory activity puts into focus the many treasured qualities that New Yorkers seek to protect when demanding more equitable policing policies for their city. + Listening.nyc and @listeningnyc The Listening NYC microsite and Instagram accounts are important channels through which New Yorkers can participate in the campaign. At Listening.nyc, New Yorkers can find a calendar of upcoming Listening Room pop-ups, downloadable versions of the Conversation Cards in English and Spanish, and a digital version of Tell The Mayor that contacts city officials via email and social media. Meanwhile, the @ListeningNYC Instagram account provides an online forum for discussing policy reforms and celebrates the diversity of New Yorkers engaging with the campaign.

Who worked on the project?

IDEO