Innovation Insights from Our 2019 Impact Main Stage

By Kaitlin Milliken | February 3, 2020

Silicon Valley radiates so much entrepreneurial energy that large companies can’t resist. They often set up innovation outposts to try to monitor the activity and take meetings — lots of meetings. But Chris Anderson cautions against that kind of passive, observational approach. 

“By and large, [these tactics] are all failures…” said Anderson. “One thing I’ve learned about innovation is that you learn by doing.”

The do-it-yourself ethos has guided Anderson through many of his own entrepreneurial endeavors. During a mainstage session at InnoLead’s 2019 Impact event, Anderson recounted the founding story of DIY Drones, an open source community where enthusiasts can learn to make their own Unmanned Aerial Vehicles (UAVs). 

In 2007, Anderson spent a weekend with his children building a Lego Mindstorms robot that came with a radio-controlled airplane. After the end result proved disappointing and the plane flew directly into a tree, Anderson said he looked for ways to make a better flying robot.

“It turns out in 2007…if you Google ‘flying robot,’ the first result is drone… [I told my kids,] we’re going to make a LEGO drone. And so we just stuck together the sensors and did a little programming,” he said. “Today that plane is in the LEGO Museum in Billund, Denmark, as the world’s first LEGO UAV.” 

In addition to DIY Drones, Anderson founded DIY Robocars, another open source community focused on autonomous vehicles, and kickstarted the Linux Foundation’s nonprofit Dronecode Project. He is also the CEO of 3DR, a company that sells drone software.

During the conversation, Anderson discussed regulations faced in the autonomous vehicle and aircraft space, working with the FBI, and finding the right use cases for new ideas. (The 2020 Impact conference takes place in Boston, October 19-21.)

On Regulation: Ask for Forgiveness, Not Permission

For better or for worse, Silicon Valley is an… “ask forgiveness not permission” culture. You do it first and then figure out [regulations]. You innovate in the gray zone. So obviously, if it’s illegal, you don’t do it. And if it’s legal, somebody has already done it. But if it’s in the gray zone, that’s where you play around.

[When] we did [build-your-own] drones, recreational [flight] was allowed, but commercial was not. So we did recreational. … It’s regulated as a weapon, unless it’s public domain. So we open sourced it. You have to go through [the government], unless you’re selling to developers. So we sold to developers. We found all these little gray zones. The [micromobility] scooters, PayPal, Airbnb, Uber — these are all examples of innovation in the gray zone, where they asked forgiveness, not permission.

[When it comes to finding applications for drones,] the regulations have been a big barrier. The reason you don’t have drone delivery today, or you can’t fly an air taxi here, is largely FAA regulations, and for good reason. So there are two ways to solve it. One is you can sit down with the FAA before the drones exist and come up with a way to do it. And that went nowhere. The other is, you can put two million drones in the air and…get the genie out of the bottle. And then the FAA says, “Oh, gosh, we have a bit of a problem here. What are we going to do?” And then, they bring you [to the table]. 

When to Work with Regulators 

Anytime you have an open platform, it will be used for good and for evil. … But by and large, you hope the good outweighs the evil. … So we proactively reached out to the FBI, and the CIA, and the NSA… We said, “Look, we just want you to know what we’re doing.”  

[We told the open source community,] “If…you’re in the community doing something irresponsible, we will call up the FBI and tell them.” We are super transparent about that. … As things moved on, [the FBI] would show up every month [at our meetings], and they would say, “Hey, here’s some evidence about people using [drones] to drop drugs into prisons.” And then, there was a bad couple years where they would bring us the folders of ISIS using our stuff. And we were…one of the main drone suppliers to ISIS. That’s not intentional.  

[We asked,] “You know, guys, is there anything you’d like us to do?” They would [say], “[If] you… put a gun to my head… [I] couldn’t think of anything to do. Just keep us informed, help us understand what’s going on here. And maybe we’ll develop counter drone measures…” 

Find the Non-Controversial Use Case for New Technology

What we’ve learned — we’ve learned this in drones and we’re learning in [autonomous] cars — is that you can’t just deploy everywhere at the same time. So right now, there is a retirement community in Florida…where a company called Voyage…has a fully autonomous, self-driving car fleet there. … And residents, if they want to go to the gym or go to the store, get their hair done, an autonomous car shows up. … Why does that work? It’s a closed community, not a public road. There’s very few other cars on the road. They’re driving very slowly. They’re providing great service, giving independence to older people and keeping people who probably shouldn’t be driving [from being] behind the wheel. So there’s an example where there’s no controversy whatsoever. Total win-win.

In drones right now, if you’re in…[parts of Africa]…and you need blood, drones deliver it. A company called Zipline delivers. It’s a perfect application. Blood doesn’t have a long shelf life, it needs to be refrigerated…it tends to be a centralized resource. … You want blood? You wait 20 minutes, a parachute comes out of the sky, and there’s your blood. No controversy whatsoever. Take those same models to the streets of San Francisco, and it’s a shit show.

There’s a phrase in technology: “The future is already here. It’s just unevenly distributed.” The question is not when, but where? Where can technology be deployed in a way that won’t be controversial, or will seem net good? 
