Fifty years ago, NASA sent its Apollo 11 astronauts to the moon and back. The lunar mission — made possible by an estimated 400,000 engineers, scientists, and technicians — demonstrates how collaboration can accomplish seemingly herculean tasks.
Today, NASA continues to harness the power of the crowd at the space agency’s Center of Excellence for Collaborative Innovation.
“[I]n this new world of lots of technology that’s rapidly changing, the crowd is actually one of the few ways to get things done effectively,” says Steve Rader, Deputy Manager of the center. “[W]e offer services around the government and help teams to really learn how to use these [open innovation] tools, and then to actually execute [ideas].”
The initiative has its roots at the Johnson Space Center, where Dr. Jeffrey Davis, a NASA researcher, was notified of an impending budget cut in 2009. To continue his research with reduced funds, he began to explore possibilities in open innovation. After studying similar programs at Procter & Gamble and Harvard Business School, Davis began to tap the crowd for challenges and technology searches.
According to Rader, NASA’s pilot programs caught the attention of other agencies a year later, when the White House team began crafting its own open innovation program.
“NASA’s pilots stood out as one of the few places that crowdsourcing and challenges were being used,” Rader says. “And so they actually asked NASA if we would stand up a Center of Excellence to not just help NASA do this, but help all the federal agencies that were interested [in building] this…capability.”
The Center of Excellence formally began operations in 2011. Since then, Rader says, the initiative has run 350 challenges and boasts a 90 percent success rate.
During a recent call with InnoLead, Rader discussed how the center approaches the crowd, what his team measures, and tips for successful crowdsourcing programs.
The Crowdsourcing Process
Rader says members of different federal agencies come to the Center of Excellence looking for guidance on running challenges. Crowdsourcing mentors, like Rader, will sit down with representatives from the agencies. They then decide if the department should set up its own crowdsourcing capability or tap into crowds NASA already has access to.
According to Rader, the teams will then create an inter-agency agreement — a two-month process that allows the agency to transfer funding to the Center of Excellence for the project. Once the paperwork is complete, the teams start to work on the challenge using one of the center’s platforms.
“[W]e pretty much help them all the way,” Rader says. “Once they’ve done one or two challenges, we interact with them less… And the idea is that eventually — once they’ve proven they can do it internally at their own agency — then they can go off and have their own [crowdsourcing] contracts in place.”
Innovation Challenges at NASA
According to Rader, challenges at the Center of Excellence fall along a spectrum — varying from technical, science-heavy initiatives to storytelling projects. Rader highlights a few previous challenges below:
- NASA has run challenges on galactic cosmic ray protection, workshopping how to protect Earth from fast-moving, heavy particles and radiation. Rader says his team engages nuclear physicists for these types of projects.
- The center also engages a crowd of freelance designers to create graphics, including sew-on patches that represent different teams at agencies.
- Recently, NASA began a storyboard challenge to help explain what happens to the human body in space. “[T]hose storyboard challenges…will be followed by video sample challenges, where we’ll try to find people that can actually take those storyboards and turn them into videos… [Then] whoever we select…[will get] money to go produce those fully rendered videos.”
- NASA also ran a series of 17 challenges to create segments of a miniature robot arm that will be used on a free-floating robot at the International Space Station.
What to Measure
“[W]e measure everything,” Rader says. “We use Salesforce extensively, to really capture as much data as we can about each challenge.”
Data points Rader highlights include the duration of the challenge, how many prizes were offered, and the number of participants registered. According to Rader, his team groups success into three categories: completely solving a problem, significantly advancing a solution, or making progress on a solution.
His team also looks at the financial impact of challenge outcomes. “We…start asking things like operational costs, life cycle costs,” Rader says. “How much did this save you…? How much is that worth to you over 10 years?”
The Do’s and Don’ts of Crowdsourcing
Even with a high success rate, Rader still faces the challenges of innovating at large organizations.
“[W]hat we found is people within an organization are very much in favor of being more innovative, to be more open, to actually be participants. But…that opportunity has often been yanked away from them,” Rader says. “[T]he organization [says], ‘We’re interested in…your innovation.’ But then, they don’t get funding. They don’t get recognized. People tell them pretty much immediately why their idea won’t work.”
Rader recommends that organizers set expectations when crowdsourcing to avoid this disappointment. He also emphasizes creating narrow problem statements before posting a challenge for the crowd.
“Never post a challenge or a statement out there with a site that says, ‘Give us all your best ideas,’ because you’re not equipped to actually go implement any of them. We call that the suggestion box black hole,” Rader says. “[A good challenge] is very narrow; you set expectations for what’s going to happen; [and] you make sure [that] the owner of that challenge is somebody who can actually implement whatever comes out of it…”
While buy-in can help initiatives run smoothly, Rader points to adoption as a common pain point in innovation. Internally, employees may resist crowdsourcing because it is a new technique that falls outside traditional work norms. He also says some people feel as if they are pawning off their jobs by deciding to tap the crowd.
“We’ve been working to say, ‘Hey, that’s not really the way it works. [Crowdsourcing] actually helps you to do your job better,’” Rader says. “And that’s been a slow process.”
Rader says that running well-executed challenges — with the right expectations, guidelines for intellectual property, and an implementation plan — can help teams find solutions quickly.
“Within an organization, people working…on a hard problem typically create a little bit of a shell… [T]here’s this invisible barrier between [them and] what’s a really obvious solution,” Rader says. “And…there are these folks that are living in other domains…and if [external innovators] can just see the overlap, they can apply solutions…to problems you have. And that’s where that real innovation magic happens.”