An Unconventional Look at the Future of Technology with Baratunde Thurston (Google I/O’19)


[MUSIC PLAYING] BARATUNDE THURSTON:
I/O highlights– it's been a really big, momentous, and mostly beautiful gathering. I saw one of the creators of machine learning and neural networks this morning declare with 100% certainty that machines will gain consciousness. And none of you
reacted, and that made me more alarmed, right? [LAUGHTER] What’s
going on there? I saw a space pioneer. Dr. Mae Jemison reminded us all that we live under one sky, under one roof, and reminded us to connect with each other. I saw fruit used last
night to make music. And it made it hard for me to
eat that apple during lunch today, because every time I took a bite I heard it screaming, "Apple, apple." Like fruit shouldn't talk. And if it does talk,
it shouldn’t talk in a self-aware kind of way. It’s just creepy. [LAUGHTER] There’s been a lot of amazing
news announcements out of I/O this year. We have Android Q updates, Flutter for the web, something called a Nest Hub
Max, and of course, the biggest technical innovation,
Dark Theme, which seems to really excite the
super geeks in the room. I got you to glance
away from your screens, so I should say Dark Theme. [APPLAUSE] You’re so easy. You’re so easy. Now my presentation will run
at 30% higher energy efficiency and push fewer pixels. So I was paying attention. I really was. I learned a bunch of things. I learned that NBU is Google's way of saying "next billion users." That's how this company thinks
about growth, a humble brag. The Pixel 3a is out. And we got
privacy-ish, which I’m going to come back to later. So the first
conclusion I came to after many of the sessions and
overhearing people in lines was that machine
learning is going to take all the human jobs. I saw very specific ones where
machine learning algorithms outperform or nearly
outperform humans in things like translation
and transcription, in driving, in
picking things up, and most interestingly of
all, in developing machine learning algorithms. I want to show you a slide that
I saw yesterday, which really– it looks kind of banal. But this is a terrifying image. The red curve is the performance
of a machine learning algorithm built by a
machine learning algorithm. The black line is what
puny humans are capable of. And here’s why that concerns me. I have assumed a
level of self-interest and self-preservation
on the part of the people designing
and developing this new world we live in. So taking away certain
jobs and occupations, cool, but not my job. And yet, the people
with the super power are like you know
what would be cool? To create a superpower capable
of creating superpowers greater than my superpower. Stop. You don’t have to do it. Like literally, no
one’s making you do it. Do something else. Just a thought. Just a thought. That curves should be a warning
line we’ve gone too far. But at the same time, I had
another opposite conclusion, which is that machine learning
will bring us closer together. I spent a lot of time
in the experiments dome. It’s one of several
domes here on planet I/O. And I witnessed and participated
in some really beautiful uses of technology. I saw a conversation between
two people, one of whom was hearing impaired. Yet that conversation itself
was not impaired at all. I took part in a dance lesson
from one of the greatest dancers we as a species
have ever produced, Bill T. Jones, thanks to
the PoseNet algorithm. And I’m as good as he is now
as submitted by Exhibit A here. I would literally empty a
wallet or max out a credit card if I could get personal dance
lessons from Beyonce, right? That would be really successful. I met this developer
Ezra from Egypt. And we made random beautiful
art together using something called Rotavo. Now I’m assuming this isn’t
some kind of weapon system. It’s just used to make
cute art, but we made this. And as I stared
into it, I recognized it as a metaphor for the
arc of human existence and that we peak sometime
around Candy Crush and are now in the climate
catastrophe decline, unless we intervene. I also recognized that one of the
greatest innovations of I/O ’19 was to get hundreds and I
mean hundreds of developers to go outside. [LAUGHTER] Just go outside. [APPLAUSE] And then be forced to
exercise because everything is 1.3 miles apart
from everything else. I’ve walked 30 miles
in the past two days. I have calluses now. This was also a test of in what situations and circumstances people will stare
at their screens. It turns out
everywhere, everywhere, including the
digital detox zone. [LAUGHTER] People are like,
I’m here to tox. I don’t know what
you’re here for. I’m a toxer. We’re just sitting on the
street with nothing else to do. Thank you, [INAUDIBLE],,
for the photo. And maybe, I was one of the few
to notice the sort of police overwatch situation. And as I took advantage of
digital imaging technology and zoomed in, I
realized there’s not a human in this car. It’s already here. Skynet starts with
an empty police car overlooking a bunch of super
powerful technologies creating the future. It’s too late to run. Enjoy captivity. The greatest invention,
though, other than Dark Theme, is these porta potties. Can we give it up for the Ultra Lav? Come on now. This is amazing. [APPLAUSE] Actually bigger and better than
my New York City apartment. So– [LAUGHTER] –that's my other reflection on I/O. Again, my name, my full name is
Baratunde Rafiq Thurston. And I am here by way of
this woman, Arnita Lorraine Thurston, who raised
me and my older sister Belinda on her own, who
was a multitude of people– a survivor of sexual
assault, a paralegal, a computer programmer, an
activist, and an environmentalist. And it was her working
as a systems analyst for the federal government
in the early 1980s that brought the first
computer into our house. A computer that helped introduce
me to the early internet. Clap if you remember
this internet? Yeah! [APPLAUSE] Text, baby. No GIFs. No cats. No ads. It was a simpler
and more beautiful time. And I witnessed and was a result
of the power of technology to upgrade my life. My mother’s financial
life got upgraded, which meant my education and quality of food got better, and my sense of creativity, of literally what was possible, was powered by technology
from an early age. This is an image of an article
I wrote in high school in Washington DC in 1993,
headlined Upper School Joins Internet because we
got a full-time, always-on T1 connection, which
changed everything. And among the observations
I had in this article, this one stood out to me. I wrote that students have used the computer room (we segregated them back then)
to write papers, solve math problems, conduct
science experiments, and connect to local
libraries and universities. All legal, make-your-momma
proud type activities. What I did not anticipate
was Fortnite, cyber attacks on our election
systems, selfies, or Dark Theme, of course. The path that I
have walked has been enabled by what technology
has brought to us. I registered my
domain name in 1998. Most of the jobs I’ve
ever had were directly influenced by or
working with technology, including working for America’s
finest news source, The Onion, where I was– [APPLAUSE] Yes, for the good
kind of fake news. Yes, I was director of
digital there for years. Right after that, I
helped start a business, which merged
technology with humor and tried to bring
a level of humanity to some of these cool tools. And our signature action
was a series of hackathons that we called Comedy
Hack Day and straight– [APPLAUSE] Yes! We got the– yes! That’s what I’m talking about. We got one person
who knew about it. I’m very excited
about the human– it’s a human connection. It was beautiful. [LAUGHTER] Should do more of that. I like you. So we would bring
developers and designers and comedians together
to imagine and then build working prototypes of jokes. So someone once made a digital
assistant, voice enabled, but only in the
body of a Furby that had to be attached
to your shoulder. So you’d walk around with that. There was a team that
made an app, which was for a long time in the
app stores, called EquiTable. And that would allow you to
split your bill with friends after a night out. But it wouldn’t split
the bill equally. It would split the
bill equitably, taking into account the
pay gap based on gender and race in the United States. So different people would
pay different amounts. "Reparations, one meal at a time" is what they called it. [APPLAUSE]
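For the curious, here's a rough sketch of how an EquiTable-style split could be computed. Everything in it is illustrative: the earnings ratios are placeholder figures loosely echoing commonly reported US pay-gap statistics, not the app's actual data, and equitable_split is an invented helper, not EquiTable's code.

```python
# Illustrative only: placeholder relative earnings (1.0 = white men),
# loosely echoing reported US pay-gap figures, not EquiTable's data.
EARNINGS_RATIO = {
    "white_man": 1.00,
    "white_woman": 0.79,
    "black_man": 0.73,
    "black_woman": 0.61,
}

def equitable_split(total, diners):
    """Split `total` in proportion to each diner's relative earnings,
    so higher-earning demographics cover a larger share of the bill."""
    weights = [EARNINGS_RATIO[d] for d in diners]
    scale = total / sum(weights)
    return [round(w * scale, 2) for w in weights]

# A $100 dinner for two splits roughly 62/38 instead of 50/50.
print(equitable_split(100.0, ["white_man", "black_woman"]))  # [62.11, 37.89]
```

The design choice is the whole joke and the whole point: weight each share by relative earnings instead of dividing evenly.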
And then I helped, in the first year of "The Daily Show" under Trevor Noah, to reimagine how that now
global institution deploys technology for more than
publishing video feeds online but to get way more
interactive with its community and more creative with the
possibilities for jokes. Now technology can be very
personal for all of us. And it has been for me through
a little company up the road called 23andMe. As I mentioned, I have
an older sister, Belinda. She’s nine years
ahead of me in life in more ways than just age. She is also– we have
different fathers. And after our
mother passed away, we had even more questions that
a living human couldn’t answer. So we went to the great
database of 23andMe. And I got incredible
bragging rights thanks to the comparison
capabilities within 23andMe. I got to find out that I am less Neanderthal
than my older sister. [LAUGHTER] Clap if you’re a
younger sibling. [APPLAUSE] We're the good ones. I'm just kidding. Everyone's good. But it is really powerful to
be able to say to your older sibling, no, you’re the
Neanderthal because science, right? Like you have backup. But big sis had a genetic
clap back I did not foresee. She said that’s cool, baby bro. But you’re also
way whiter than me. And I was like no,
not that there’s anything wrong with being white
in a room full of white people. But no! Like I’ve got a lot invested
in the blackness thing. And this is way late. And how do I not know this? And what does that even mean? And it’s extra awkward
because I wrote a book called "How To Be Black." Like that's not– [LAUGHTER] It kind of undermines my brand. You know what I'm saying? Like this is not good. I didn't write a book on
how to be 81.6% black. That was a totally different
book, not a bestseller. Very different author. My life is mostly great,
mostly incredible, because it's been filled with great and incredible people. And that includes my then girlfriend, who at the end of last year proposed to me. And I had the good
sense to say yes. So we’re getting married. I’m engaged. It’s very exciting. [APPLAUSE] And I’m going to take this
opportunity to share with y'all and the world. We're starting a family. It's a very exciting time to do that. It's an Amazon family, which means we merged our Amazon accounts. So she can use my Prime, but with her credit card. It's a great invasion. And it really changes– It's also– it's
just really nice to have our union recognized
by Chairman Bezos, you know, head of
one of the largest nongovernmental military
operations in the world. It’s really cool to just be
seen, truly seen like that. There was a time when to create
a child, which we are not actually doing, but
to create a child involved a lot of physical work. But you all know disruption,
machine learning. And Amazon makes it
much easier to just add a child to your family. They’ve got a slick
UX, and then you have a bunch of parameters
you can fill out to kind of optimize your child
for your particular situation. Now, I’m not a fan of the
binary gender choice, but that's a simple
update on the back end. So what I did was I
created a little girl named after my favorite Roller Derby character, Beyonslay. And I just skipped those early
years where you don’t sleep. So she’s almost 11 now. And she’s a cute little unicorn. But once you’ve got a
child, this is the beauty. Amazon gives you options, as
you can add another child. Or if you’re not happy, you
can edit your existing child. [LAUGHTER] I mean, clap if you ever
wanted to edit a child, right? Like that’s an amazing– we need that IRL, you know? So you choose to edit the child. And when you click
through, you’re given even more
dramatic options. Do you save the child? Or do you remove
the child, which I’m pretty sure is a
human rights violation? But it's cloud-based. It's Amazon. Who knows? It's beyond the jurisdiction
of any government. It brings me to Waze,
one of my favorite tools for always letting me
know where I am but denying me the knowledge
of how to get anywhere. It’s an amazing
combination where I know less the more I know. And one thing I took
a look at somewhat recently is just the
level of chaos and mayhem that happens inside of Waze. I’m not sure how many of
you all paid attention to what happens inside of Waze.
through a busy Los Angeles interchange. But when you zoom
in, you see violence. Like there is a bloodthirsty, sword-wielding nut chasing Royalty. There's someone laughing in the
background, protected, though, which is probably why they're laughing. And the police aren't
doing anything about it. [LAUGHTER] Oddly reminiscent sadly of
what happens in the real world. So maybe Waze is actually real. When I got here to I/O, I wanted
to see what I could learn, see what some of
the sessions were. So I loaded the
app onto my phone. I found out that
the app is created by some unverified developer
who wants access to view and edit all the
events on my calendar. And once I just
clicked blindly through those permissions,
I started searching. And I searched for "machine" and found so many options and things to learn about. I searched for "ethics." I found one, which
is better than zero. Progress. So y'all should definitely find
the stream on that one and like embody everything
in that session. And then I searched
for "avoid apocalypse." And I didn't find anything in the sessions. It brings me to a moment
in my life three years ago. I was invited to South
by Southwest Interactive. I've been going for almost
a decade at that point. And this particular
year, they brought me in to be inducted into
the Hall of Fame, joining people like Danah Boyd
and Ze Frank and Kara Swisher. And in my brief
acceptance speech, I made some remarks,
which I think are still relevant to the
world we’re living in today. I want to share a brief
moment of that with you. And you can read along. I said to them and I
say to you still now the algorithms are coming. And we know they aren’t
pure or objective. Like journalists, they’re
embedded with the values of their makers. They reflect the
society around them. But this renovation
is all about making the world a better place. And the algorithms and
code that claim to do so derived from this
very imperfect world, sick with racism and sexism,
and crippling poverty. Then isn’t it possible that they
might make the world a worse place? Could we end up with virtual
reality racism or machine learned sexism? And today, could poverty
be policed by drones and an internet of crap? Possibly, maybe
the answer is yes. So today we live in some version
of the world, the world that gives us headlines about social
media disrupting our democracy, about yet another data leak from
yet another organization taking advantage of access to
information about us, about our legislators selling us
out to ISPs who are selling us out to advertisers,
who are selling us out to brands who just want our
money, about self-driving cars that literally can't see black
people, that’s terrible, about machine learning applied
to resumes, which conveniently sort out all the women because
that’s also what history has done, so we’ve just scaled that
into our present and future, about an automated
response system that was goaded into mentioning
the n-word publicly on Twitter, and about the cloud-based
apology company known as Facebook, apologizing
yet again for something it deservedly should
apologize for. I have been working
over the past year to try to integrate my own
thinking around technology. I was a big booster in
the ’90s and early 2000s. I have seen the harms as well. And last year, I wrote a bit of
a manifesto. I went on a journey to try to understand
how all my data existed amongst the major platforms,
amongst app developers, amongst the very websites
or web browsers that I visit and that I use. And what came out of that
was a set of principles that I then open sourced
with a Google Doc that others have
contributed to, to help guide us more conscientiously
into the future. And so I’m going to
walk you through several of those principles
and hope you take them in the spirit of generosity
and embed them into the code and into the values
as you go out and build this world that
we all want to live in. The first is about transparency
and what I call trust scores. Now look, we all get a
score from a system. We get credit scores
from the financial system that determine if we can get
a job, if we can get a home, if we can get a car. And I think it’s time to flip
that scoring system around to create something like
a trust score, which rates the organizations that we
are in relationship with based on how they handle us, how
they handle our information, based on what’s inside
of the technology. And we have a good metaphor
from the world of food. When I want to know
what’s in my food, I don’t drag a chemistry
set to the grocery store and inspect every
item point by point. I read the nutrition label. I know the content, the
calories, the ratings. This is possible. And with that knowledge, we
can jump to another world and look at the
building trades, where we get broad ratings like LEED certification for sustainability and energy efficiency. I shouldn't have to guess about
what’s inside the product. I certainly shouldn’t have
to read 33,000-word legalese terms of service to figure out
what’s really happening inside. All of that should be as
usable as the UX and systems that got me onto the
platforms, into the apps, into the services
in the first place.
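To make the metaphor concrete, here's a minimal sketch of what a software "nutrition label" and trust score might look like. The schema and the scoring penalties are entirely made up for illustration; no such standard exists today.

```python
# A made-up "nutrition label" for an app; none of these fields are a standard.
app_label = {
    "name": "ExampleApp",                 # hypothetical app
    "data_collected": ["email", "location"],
    "data_sold_to_third_parties": False,
    "retention_days": 90,
    "breaches_last_5_years": 1,
    "terms_of_service_words": 33000,      # the figure cited above
}

def trust_score(label):
    """Toy scoring: start at 100 and subtract penalties for risky practices."""
    score = 100
    score -= 10 * len(label["data_collected"])                    # less data, higher trust
    score -= 30 if label["data_sold_to_third_parties"] else 0
    score -= 5 * label["breaches_last_5_years"]
    score -= 10 if label["terms_of_service_words"] > 5000 else 0  # unreadable ToS
    return max(score, 0)

print(trust_score(app_label))  # 65: readable at a glance, like a nutrition label
```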
Now the second principle is about defaults: changing those defaults from open to closed, from opt-out to opt-in. Defaults matter, y'all. Most of us don't
change the default. As in life, we accept mostly
the world that we walk into. We accept the family beliefs,
the religious beliefs, and the settings
and the programs that we’re using as
far as what rights they claim to have with our data. I think we need to totally
switch to a minimal data default, to something more akin
to data conservation rather than data extraction
and exploitation. There was a great series
of guidelines and a model from Mozilla, which they call
their Lean Data Practices. I encourage you all to look at
those and apply them and ask, do I really need
this information about the customer,
the user, the person, if you choose to call them
a person every now and then? And maybe we should treat
data like other things we want to limit– sugar, Netflix, Trump
tweets, carbon, fossil fuels– and see how far we
can go with as little of that information about the user as possible.
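As a sketch of what a minimal-data default could look like in code, here's a hypothetical settings object where every collection flag starts opted out; the field names are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical app settings: every flag defaults to "off", so data
# collection requires an explicit, affirmative choice by the user.
@dataclass
class PrivacySettings:
    share_location: bool = False
    share_contacts: bool = False
    personalized_ads: bool = False
    usage_analytics: bool = False

settings = PrivacySettings()    # a new user starts with nothing shared
settings.share_location = True  # opting in is a deliberate act, not a default
```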
That brings me to the third point about data ownership
couple of different layers. I think about the
data that I generate or the data that is
derived from my actions, about the data of us
and the data about us, about the content
and the metadata. So when I take a photo and
upload it, OK, that’s mine. That’s user generated content. But when I walk
from here to here and create a digital
trail, a little history, that is also data of me. And I think we need to
start living in a world where that is mine, where
that is a part of me, where I have a
level of sovereignty and self-determination. As I do with this
body, so should I do with what this body
represents in the virtual world because things are going
both ways in business. They should go both
ways in rights as well. And when we start to think about
ownership of data in a more expansive way, it
makes something like this a bit more
interesting: these CAPTCHA codes, these little
quizzes and tests to prove you’re not a robot,
which pretty soon we’re all going to fail because the robots
are going to design the tests and then they’re
going to pass them. But I've identified so many
hills and stairs and cars that I’m essentially
a co-founder of Waymo and every other self-driving
technology out there. We should all be considered
co-owners and partners in the products and services
that are being built. Machine learning
depends on data. Artificial intelligence without
data is artificial stupidity. So every user is
also a contributor. This is much more of a
cooperative economic model than a top down capitalistic
extraction and exploitation model. Let’s have that
framework in mind. And I know it’s complicated
but so is everything else we’ve ever done. And we use technology
for the express purpose of dealing with the
complexity, of trying to abstract it and find
ways to still approach it. Number four, I think
we should start shifting our terminology around
permission, not just privacy. Privacy has this vague yet
hard-edged binary feel. It's private or it's public. Privacy or not. Permission has layers. When I think about permission,
I think about the first Unix system that I ever used back at
that high school and the chmod command, C-H-M-O-D, changing
the access permissions on directories, on files, for
user, group, and others, where the user is the owner. And I think we need to apply that as well to information about us.
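For anyone who hasn't met chmod, here's a minimal sketch of the same idea using Python's standard library; the filename is a stand-in, but os.chmod and the stat flags are the real interface.

```python
import os
import stat

# A stand-in file representing "information about us."
path = "my_data.txt"
open(path, "a").close()  # ensure it exists

# Owner may read and write; group may read; others get nothing.
# Equivalent to `chmod 640 my_data.txt` at the shell.
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)

print(stat.filemode(os.stat(path).st_mode))  # -rw-r-----
```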
I'm cool entering into a relationship with a developer around this use, for this app, for this amount of time. That doesn't mean that
seven years from now, the business that
created that app got acquired by a company I
want nothing to do with, and they just automatically
get me, right? I married you, not your cousin. It doesn’t translate that way. So giving much more fine
tooth controls around what apps and technologies
have permission to do with whom for how long,
in what context, including location, will bring much more
self-determination, control, and respect to the underlying relationship.
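Here's a hedged sketch of what such a time- and context-scoped grant could look like as a data structure. The Grant class and its fields are invented for illustration; no real platform exposes exactly this API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Invented for illustration: a permission that names who may do what
# with my data, and when the permission lapses.
@dataclass
class Grant:
    grantee: str        # the developer I entered the relationship with
    scope: str          # what they may do with my data
    expires: datetime   # when the grant runs out

    def allows(self, who: str, what: str) -> bool:
        return who == self.grantee and what == self.scope and datetime.now() < self.expires

grant = Grant("example-dev", "read:photos", datetime.now() + timedelta(days=365))
print(grant.allows("example-dev", "read:photos"))     # True, for now
print(grant.allows("acquiring-corp", "read:photos"))  # False: "I married you, not your cousin"
```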
And I can't talk about respect without talking
systemic inclusion. We all live in this world
of systemic exclusion. We’re sitting atop centuries
of all kinds of exclusion. And so where we are
is a result in part of where our ancestors
were, where the people who came before us were, of
where the system that was designed by people put us. And we know too much now to say
it’s all equal for everybody. We literally have the data
to prove that’s not true. So we should be using
that information to create inclusive systems
across many dimensions. And I want to give some
kudos to this organization and many others for
the steps they’ve taken toward accessibility,
which have gotten so much better than they have been. They can still get better, though. Project Include is
one such effort that’s focused on the
technology industry. They've got literal workbooks that
I think you should be using. It’s not about charity for me. Doing the right thing
is a good thing, but it's also good for business. It's good for human rights. It's good for designing
not just technology, but the world that we’re
all going to be living in. We’re not operating
in a vertical anymore. This stuff is infecting
every area of life. And so if people from
every area of life are not participating in the
creativity and the construction of the rules and the
systems, then they’re subjects, not citizens. And that’s not the type of
world that I think any of us want to be living in. Number six, imagine harder. I want everyone
to think actively about what the worst thing that
can happen with your technology is. Don’t just think about the
hockey stick and the valuation and the user growth and
the delightful personas and testimonial stories that come to
life when something goes well. Think about the
tragedy and the horror if something can go wrong. And then use this
tool, Ethical OS, developed in
partnership with Omidyar and a bunch of
other organizations, to literally come
up with a framework to think about
worst case scenarios and how to manage against them. One of my favorite examples
from their workbook asks: could your product
or business do anything your users are unaware of? If so, why are you not
sharing that with them? Would it pose a risk
to your business if it showed up in
tomorrow’s news? If you’d rather it not be
written about publicly, maybe you’re ashamed of it. Maybe someone in the
organization is ashamed of it, and that is not a good
way to operate in, again, any relationship. Number seven, we’ve got to
break open the black box. So much of what is
happening with technology is beyond us, almost literally
beyond comprehension. So we’ve got to start
building in ways to understand what seems
to be not understandable. There’s an analogy in food
inspectors in that industry, and auditors in the financial industry, and peer review in academia. We rarely just let
people skate by. So please, start building
in even more ways to inspect,
interrogate, and measure the impact of the
tools we’re building. When someone commits
a driving-under-the-influence offense, a DUI or DWI,
we take away the car at least for a certain
amount of time. I would love a world
where someone who abuses the data of millions of users is then not allowed to
continue to have access to the data of
millions of users. That’s the most logical
approach that we take in every other area of life. Technology is just
another area of life. And “The New York Times”
launched an experience last week that did
this for advertisers. I can’t be here and not
talk about advertising. We’re here because ad money. 84% of Google’s
money is ad money. So let’s talk a bit more about
how those ads even get made. What the “Times” did
was pretty brilliant. They bought an ad campaign. And the content of the ads revealed how those ads were derived, as in what the ad buyer knew about
the user when they booked the ad in the system. Here’s one example. This ad thinks you’re
trying to lose weight but still love
bakeries, as derived probably from browsing history
and credit card history. That level of
opaqueness– if more of us understood that, not just
as the creators of the tech but the users of it, we’d want
some more brakes, some more visibility, some more
controls to prevent that. That’s probably a bit too much. We can trust, but we need
mechanisms to verify. We also need to
upgrade and enforce the rules around all of this. Again, I return to the idea
of once you identify abuse, you stop the underlying
abusive behavior. You don’t just say, please,
don’t do that again, and I trust that you will
mind your own business. So I’m glad to see the industry
calling for regulation. We need independence as
that comes into play. Finally, this is the ninth
and final-for-now principle, and it's actually the most inspiring one for me. But I think of this
as encouraging folks to think beyond
just consumption. A lot of the models
that were designed, the business models and the technical models to support them, were
about getting people to spend time with,
to log on for longer, to spend money with, to suck
life out of, to consume. And I say let me do what you do. I remember the first time I
used the ad buying capabilities of one of the platforms
where I was merely a user. And I saw my friends in a much
more empowering and interesting way. I could slice and dice them. I could understand them in a
way that I couldn’t as a friend. I could only see
them that way if I chose to see them as an object,
as the target of an advertising campaign enabled
by this platform. Let’s close that distance
and allow creativity on the part of the users, the
contributors in this new world that I am imagining. Here’s a couple of examples
of beautiful things that have been done
with technology that allow us to see
ourselves a bit more honestly and a bit better. There’s a group called the
Equal Justice Initiative, which does great work around
unearthing and revealing some of the dark history
of the United States with respect to racial
terror lynching. They partnered up with
Google, google.org, to build Lynching in America
as a virtual experience to accompany their museum
in Montgomery, Alabama. There’s the Center for
Policing Equity, which is bringing machine
learning and big data analysis to one of the most
painful and intractable problems in this country, which
is discriminatory use of police force, where we know that police use force disproportionately against black people compared to white people. Science and math are
helping us understand why, and with police, are
reducing that delta. In New York City, where
I lived for 12 years, JustFix.nyc is putting
power in the hands of tenants to argue
against their slumlords, especially tenants in low
income and public housing. So they can do things like get
repairs in their apartments, find out who owns the building,
respond to an eviction notice. This tool is using that
power with more of a sense of Spider-Man rules, right?
ability of all of this to create power, to create wealth, and then ask: for who? Who needs it more? Who needs Cloud music
banging right now? Clearly we do because
we just got it. I timed that perfectly
to have like a Cloud under beat to my super
serious moral talk at the end of the session. But yeah, make sure that
these tools are not merely accruing wealth and power to the
already wealthy and powerful. That’s not a fun world. That’s a fun world for like six
people, literally six people, and the rest of
us suffer in that. So I’m going to leave you with
some resources, some things you can follow up with. All these slides are online,
baratunde.com/googleio. They’re in a Google slideshow. And you can download
them and manipulate them to your own degree. But I want to talk
through ethicalOS.org. Do that– dotEveryone.co.uk,
the Data & Society Research Institute out of New York City, where I'm an advisor, the AI Now Institute, which asks deep questions and does
independent study of some of these systems, Project
Include, which I already mentioned, and then a couple
of books, “Winners Take All” by Anand Giridharadas,
read that, yo, “The Age Of Surveillance
Capitalism,” read that first. And an inspirational
twist on all of this that goes back
much farther than either of those two is a book
called “Decolonizing Wealth.” And it’s based on the ideas
of indigenous wisdom brought to bear on a system that
we’ve built which is, again, very extractive,
very exploitative, but could be much more
contributory, circular, and balanced. For inspiration, The Verge
has a beautiful podcast series called Better Worlds, which
I think of as an inverted “Black Mirror.” It’s thinking of
positive futures that we can use
with this technology and actually giving you
characters and narrative to be able to inhabit. The biggest book
of all, "Drawdown: The Most Comprehensive Plan Ever Proposed to Reverse Global Warming." I think if you're not working
on the climate crisis, do that. It’s like the
biggest problem that connects all the other
problems, and there’s so much being done right now to improve
energy efficiency and use machines to find these spots. We can do and should
be doing a lot more. To me, these are
questions not of tech. They’re questions of life
and reality and of power. And as we’re building
these things, we’re also building
the world we’re going to live in where we coexist. And so ask what’s the
world I want to live in? What’s the world
you want to live in? What’s the world
we want to live in? Let’s imagine harder and imagine
better and build that world. Thank you very much. I'm Baratunde Thurston. Wakanda forever. [APPLAUSE] Enjoy the rest of I/O.
Thank you, livestream. [MUSIC PLAYING]
