Netflix this week said it’s taking its bug bounty payouts public with Bugcrowd. It joined the likes of Cisco Meraki, Fitbit, and Square, which also recently started using the crowdsourced security program to pay, ahem, “researchers” (read: hackers) to find and report security vulnerabilities in its products.
Since it first launched its private bug bounty program in September 2016, Netflix has received — and fixed — 145 valid threats, according to a Netflix security team blog post. It started with 100 researchers and, over 18 months, grew to more than 700. “We also recognize researcher contributions on our Security Researcher Hall of Fame if they are the first to report the issue and we make a code or configuration change based on the report,” the blog said. “We pay researchers for unique vulnerabilities that meet the guidelines of our scope as soon as we validate them.”
Netflix pays up to $15,000 for a reported bug. Cisco Meraki said it will pay up to $10,000 for vulnerabilities. And Google last month said it awarded $2.9 million through its Vulnerability Reward Program in 2017, bringing its total bug bounty payout after seven years to about $12 million.
It sounds like a lot of money to pay hackers to find security flaws. But the cost of not finding these vulnerabilities can be even steeper.
Intel recently announced it revamped its bug bounty program, opening up the previously invite-only program to the public and paying up to $250,000 per valid vulnerability. At the same time, the company is facing at least 32 class action lawsuits related to the Meltdown and Spectre CPU flaws, as well as three insider trading lawsuits. Intel CEO Brian Krzanich sold $24 million in stock two months before publicly disclosing the chip bugs.
Paying more for security vulnerabilities “is a good thing to do,” said Dr. Srinivas Mukkamala, co-founder and CEO of RiskSense. “These are publicly available applications. Intel, Cisco [products] — anybody can buy them off the shelf. If someone finds a bug, you can’t prevent them from releasing it. Why not pay somebody to report it? You’ve got to give them enough money where they will never refuse.”
Bug Bounty Risks
There’s a lot more to bug bounty programs than simply doling out cash, however. They can also pose significant risks and legal issues. Before implementing a security vulnerability program, enterprises need to put specific processes and policies in place, Mukkamala said.
“Uber is an example of how things can go south when bug bounties are not managed right,” he explained. In that case, Uber initially failed to disclose a 2016 data breach and instead paid the hackers $100,000 to keep quiet.
Laws vary from country to country, and even within regions companies have to disclose risk to different regulators and oversight groups. The U.S. Securities and Exchange Commission, as one example, requires public companies to disclose cybersecurity risks and breaches that are material to investors.
“If there is a material cybersecurity risk, you’ve got to disclose that, whether it’s found by you or a bug bounty or a third party,” Mukkamala said. “Organizations that want to venture into bug bounty programs first need to understand the scope and then have the proper legal framework in place.”
Companies should also make sure they have the resources to fix the bugs once they are discovered.
“The big problem with bug bounties is that once you get the findings, it’s time to rock and roll,” Mukkamala said. “You cannot say, ‘I don’t have time to fix it,’ or ‘I have $150,000 to run a bug bounty but I never thought about how much it would cost to fix it.’”
A company that spends $50,000 testing security components of a web application will find, on average, 50 to 100 vulnerabilities, Mukkamala said. It will take between two and three weeks to test the app, and fixing the issues and redeploying it will take another few weeks.
“We’re talking 3x to 4x to fix it,” Mukkamala said. “When we talk to people we make this very clear. First you must have a proper security testing program in place before you venture into a bug bounty. It has to be crawl, walk, run.”
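Mukkamala’s rule of thumb above can be sketched as a back-of-the-envelope budget calculation. This is purely illustrative, using the $50,000 testing figure and the 3x-to-4x remediation multiplier cited in the article; the function name and structure are invented for the example:

```python
def total_program_cost(testing_cost: float, fix_multiplier: float) -> float:
    """Estimate total spend: testing plus remediation at a multiple of testing cost."""
    return testing_cost + testing_cost * fix_multiplier

# A $50,000 web-app security test implies another $150,000-$200,000 to fix
# and redeploy, per the 3x-4x multiplier Mukkamala describes.
low = total_program_cost(50_000, fix_multiplier=3)   # 200,000 total
high = total_program_cost(50_000, fix_multiplier=4)  # 250,000 total
print(f"Estimated total program cost: ${low:,.0f} to ${high:,.0f}")
```

The point of the arithmetic is that the bug bounty payout itself is the smallest line item; budgeting only for the bounty leaves the larger remediation bill unfunded.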
The crawl piece involves setting the scope and determining what products and services to test. Then it’s time to walk — testing with greater frequency.
“Every new application that gets built, or whenever we change the architecture, we are going to test it,” Mukkamala suggested. “Start with periodic testing, then move on to continuous testing.”
After internal testing teams are established, then companies can expand outward to third-party testing, he said. “And then you can pay bug bounty hunters to do it,” he added. “That’s your run.”