Slides from my PSR keynote on how to secure software by bridging the gap between research and practice.
Video: https://t.co/mRr4CMrfKN
Event: https://iapp.org/conference/privacy-security-risk-2015
1. Cybersecurity:
How to Use What
We Already Know
Jean Yang
Privacy. Security. Risk.
October 1, 2015
@jeanqasaur
2. Our Future Runs on Software
Smart homes Driverless cars Automatic dating
But first we need to “solve” security!
3. State of the Art
Research: undo mechanisms, encrypted databases, program analyses, provably secure software
Industry: firewalls
The big question:
How can we take advantage
of research ideas in practice?
6. The Programming Perspective: We Still Live in the 1970s
Permission checks are required across the code.
7. Policy-Agnostic Programming
My PhD work. Programs attach policies to data. The rest of the code may be policy-agnostic.
Programming model provides mathematical guarantees.
Implementation strategy scales for real-world programs.
jeeveslang.org
8. Policy-Agnostic Programming for Our
21st Century Security Concerns
Model View Controller
Without automatic policy enforcement
With Jacqueline, a policy-agnostic web framework that extends Python’s Django
jeeveslang.org
9. Part II:
How Can We Use Research
to Build Secure Software?
10. Barriers to Industry Adoption
• Managers need to fight the status quo.
• Programmers need to manage legacy code.
What about the startup
route to tech transfer?
11. Security is no Tindog
The Hot New Silicon Valley Startup
Startup that Helps Us Build Secure Software
Fun concept. Slick design.
Toddler nephew can use it.
Integrates with your life.
Technical concept.
Verifiable by experts.
Requires infrastructure change.
12. Unique Challenges for Security
Startups
Justin Somaini, Chief Trust Officer
• Concept is highly technical.
• No flashy demos.
• Adoption requires client expertise and/or trust.
• Solving a technical problem != building a product.
14. Part III:
How To Motivate
Customers to Pay for
Security?
15. Insecurity is Expensive
“A report released this month by the Atlantic Council and Zurich Insurance Group estimated that by 2030, an insecure Internet would reduce global economic net benefit by $90 trillion. In contrast, a completely secure Internet would result in a global net gain of $190 trillion.”
-Jeff Kosseff, cybersecurity law professor
17. Creating a Culture Around Caring
Consumer Example:
Snapchat
Numerous privacy
violations, but valued at $16
billion with 100 million users.
Policy Example:
Dentists
Common to email records
in violation of HIPAA, but
HHS does not audit.
18. Summary: How to Secure
Software
1. Ask smart people to come up with technical
solutions.
2. Put solutions into practice.
3. Iterate.
Connect research with industry.
Change incentives for security.
Communicate and educate!
Editor’s notes
Hi everyone. Great to see you all.
As an academic, I’ve spent a lot of time thinking about how to get my ideas out into the world, making real software more secure.
I’m going to talk about how, in order to do this, some things are going to have to change.
TRANSITION: In case you haven’t attended any other talks at this conference, I’m going to start by saying two things.
First, our future runs on software. We have smart homes, smart cars, and even smart dating. If you haven’t read about the Tinderbox face classifier for automatically swiping right on Tinder, well, I don’t know if you want to.
But, second, we first need to “solve” security. Hackers can control our electric skateboards, our rifles, and our cars. Some hacks take years to get fixed and some hacks never get fixed.
TRANSITION: If we look at security research, it looks like we’re getting there.
Researchers have come up with all kinds of solutions to protect our data. There are databases that let you search over encrypted data, mechanisms for replaying programs to find vulnerabilities, tools that carefully analyze your program for security bugs, and techniques for building software that is provably secure.
There is, however, this somewhat curious phenomenon that the state of the art in industry is firewalls.
The question, then, is how can we take advantage of these research ideas in practice?
TRANSITION: This talk is about why ideas aren’t flowing out of research and how we can change this.
First, I’ll talk about my work as an example of the problems researchers think about.
I’ll talk about what’s preventing security research from being better connected to startups, venture capital, and companies.
Finally, I’ll talk about why policy makers and consumers are part of the solution.
The big question is how we can connect researchers to everyone else.
TRANSITION: Let’s start by talking about what researchers already know.
I can tell you what I know from the perspective of a programming languages researcher.
TRANSITION: What I know is that we’re still programming like it’s the 1970s.
I took these screen shots of code from a popular open-source conference management system. I don’t expect you to read the actual code and I’m not only trying to show that my IDE is from the 70s. What I wanted to show are the conditional checks spread throughout the code. I’ve highlighted them here. This code checks the role of the viewer, as well as other attributes of the environment, in order to grant access.
When programs get big, it becomes a lot of work to manage these checks whenever the programmer writes new code or fixes a bug. If you think about all the permissions in an application like Facebook, implementing these checks quickly becomes a programmer bottleneck.
Things are hard because the languages people use today weren’t designed to accommodate privacy and security. In fact, all of the programming paradigms that are in mainstream languages today have been around since the 1970s.
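The scattered-check style I described above can be sketched like this (hypothetical function and field names, not the actual conference-system code):

```python
# Hypothetical sketch of 1970s-style scattered permission checks:
# every read site must re-implement the access logic by hand.

def view_reviews(viewer, paper):
    # Check written out at this call site...
    if viewer["role"] == "chair" or viewer["id"] in paper["reviewer_ids"]:
        return paper["reviews"]
    return []

def export_reviews(viewer, papers):
    out = []
    for paper in papers:
        # ...and duplicated here; forgetting it in any new
        # function silently leaks the reviews.
        if viewer["role"] == "chair" or viewer["id"] in paper["reviewer_ids"]:
            out.append(paper["reviews"])
    return out
```

Every new feature that touches reviews must repeat the same conditional, which is exactly the programmer bottleneck described above.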
TRANSITION: This is why, for my PhD thesis, I worked on developing a new paradigm called policy-agnostic programming.
In this paradigm, programs can attach policies directly to data instead of expressing them as checks across the program. The rest of the program may be written without regard to the policies.
I spent my PhD developing a programming model that provides mathematical guarantees about policy compliance and also demonstrating that the implementation strategy scales for real-world programs.
TRANSITION: Using this programming model I have built the Jacqueline web framework.
On the left I show the code again for enforcing policies manually.
On the right I show screen shots of code using Jacqueline, a policy-agnostic web framework I built on top of the Django Python web framework. With Jacqueline, the policies are concentrated in a single place instead of repeated across the code. This reduces the opportunity for programmer error to cause information leaks.
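As a rough illustration of the idea (a toy sketch, not the actual Jeeves/Jacqueline API), a value can carry its own policy plus a public facet, so application code stays policy-agnostic and enforcement happens at a single output boundary:

```python
# Toy illustration of policy-agnostic programming (NOT the real
# Jeeves/Jacqueline API): the policy travels with the data.

class Protected:
    def __init__(self, secret, public, policy):
        self.secret = secret    # the sensitive value
        self.public = public    # what unauthorized viewers see
        self.policy = policy    # function: viewer -> bool

def reveal(value, viewer):
    # Single enforcement point: application code never repeats
    # permission checks, it just passes values around.
    if isinstance(value, Protected):
        return value.secret if value.policy(viewer) else value.public
    return value

# Policy attached once, where the data is defined.
score = Protected(secret=7, public="hidden",
                  policy=lambda viewer: viewer.get("role") == "chair")

# Policy-agnostic application code.
def render(viewer, value):
    return f"score: {reveal(value, viewer)}"
```

Here `render({"role": "chair"}, score)` yields `"score: 7"` while any other viewer sees `"score: hidden"`; the check lives in one place rather than at every call site.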
TRANSITION: Doing things this way makes sense, but what does it take to get people using policy-agnostic programming?
This brings us to the next part of our talk, about how to tech transfer these research ideas.
Some barriers to industry adoption include managers, who need to make economic arguments for change, and programmers, who need to manage legacy code and so can’t go around adopting every new language or tool that comes along.
These barriers are more problematic at larger companies, so the startup route to tech transfer is more appealing.
And there’s interest in funding startups. Sam Altman, the president of the Y Combinator incubator, recently tweeted that he would like to fund dozens of security startups in the next couple of years.
TRANSITION: Now let’s think about what it takes for a technical security solution to become a successful startup.
An issue is that security startups based on these research solutions aren’t going to look like the hot startups we’re used to seeing.
Let’s consider Tindog, a hot new Silicon Valley startup that matches dog owners with each other in the style of the Tinder dating application. It has a fun concept and a slick design. It’s so easy to use that your toddler nephew could use it. You don’t have to change anything about your life to use it.
Now let’s think about a startup that goes beyond patching and helps us build more secure systems. The concept is probably going to be highly technical and verifiable only by experts. A good solution will probably require us to rethink at least part of our software infrastructure.
TRANSITION: If we want security solutions that do more than help us patch our systems, we face some unique challenges.
The concept is probably highly technical.
There are no flashy demos, as the absence of vulnerabilities is harder to show off than the presence of features.
The value of these systems lies in the guarantees they provide, so adoption requires the client to have enough expertise to understand what’s going on, or enough trust that the company got it right.
Finally, the people who come up with these technical solutions often do not have the expertise to also build a product. According to Box’s Chief Trust Officer Justin Somaini, a major reason security products fail is that they’re made by security people. The stereotype is that “security people” are focused on aspects of the problem other than the user experience.
TRANSITION: You may have noticed that this Tweet—and Justin’s t-shirt—has the logo of something called Cybersecurity Factory. This is the accelerator I started with a fellow MIT student to help security startups come into existence.
This summer, in collaboration with Highland Capital Ventures, we ran the pilot program for Cybersecurity Factory, an 8-week accelerator that gives teams a $20K investment, a network of seasoned entrepreneurs, investors, and potential clients, office space, and legal support. Cybersecurity Factory is also the only accelerator to provide mentorship focused on a technical area. We have recruited a stellar team of industry and academic mentors, including Justin from the previous slide, as well as Max Krohn, who founded OkCupid and Keybase, Raj Shah of Palo Alto Networks, and David Ting of Imprivata. Our teams said the mentors were the most useful part of the summer, as the mentors helped them chart paths forward, prevented them from going down dead ends, and provided useful introductions.
Our pilot teams are planning to raise funding after this summer.
TRANSITION: Something we learned from this accelerator is that doing all this helps companies come into existence and even raise funding, but there’s still a missing piece.
The missing piece is how to get companies to care enough about their security to pay for the products these startups are offering.
TRANSITION: It’s clear that it’s in everyone’s best interest for software to be more secure.
According to cybersecurity law professor Jeff Kosseff, a report by the Atlantic Council and Zurich Insurance Group estimated that by 2030, an insecure Internet would reduce global economic net benefit by $90 trillion, while a completely secure Internet would result in a gain of $190 trillion.
TRANSITION: While we can argue all day about what these estimates really mean, what they are pointing to is a sort of prisoner’s dilemma.
For those of you who are unfamiliar, the prisoner’s dilemma is an example in game theory where prisoners receive varying sentences depending on whether they choose to cooperate with or betray each other. The best joint outcome comes from mutual cooperation, but the cost of being betrayed is so high that betraying the other person is each individual’s rational choice.
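The dilemma can be sketched with the standard payoff matrix (illustrative sentence lengths; lower is better):

```python
# Classic prisoner's dilemma payoffs (years in prison; lower is better).
# payoff[(my_move, their_move)] = (my_years, their_years)
payoff = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "betray"):    (3, 0),
    ("betray",    "cooperate"): (0, 3),
    ("betray",    "betray"):    (2, 2),
}

def best_response(their_move):
    # Pick my move minimizing my own sentence, given their move.
    return min(["cooperate", "betray"],
               key=lambda mine: payoff[(mine, their_move)][0])
```

Whatever the other player does, betraying is the best response, even though mutual cooperation (1, 1) beats mutual betrayal (2, 2); that is the shape of the incentive problem in security.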
Similarly, in security, while it’s beneficial for everyone to have more secure software, there is a lack of individual incentive for companies to secure their software. Security requires more employee training, more programmer effort, and also doesn’t currently provide the same competitive advantages that adding new features does.
TRANSITION: While there are various policies the government could make to improve the state of things, the important aspect I’ve been thinking about is how to create a culture around caring more.
Right now, companies are getting the message that they can get away with not securing their software.
First, consumers need to show that they are serious about security and privacy. Despite its egregious privacy violations, the ephemeral photo messaging application Snapchat is valued at $16 billion, with 100 million users. The price of their violations was a few weeks of bad press. The Federal Trade Commission seemed to be the only people who really cared. I was the only person I knew who uninstalled Snapchat not just because I passed age 13, but to boycott their disregard for privacy.
Second, policy enforcers need to show they are serious about catching violations. One of our Cybersecurity Factory teams spent some time interviewing dentists about their HIPAA compliance and discovered that it is common practice to send X-rays and other records via email, in violation of the privacy standards. Part of this was simply confusion about what compliance means, but people also cited the lack of audit risk as a reason not to take privacy standards more seriously. They found that hospital CISOs face similarly low audit risk.
For a good example of policy enforcement, we can look to the FTC, which automatically flags questionable trade mergers. We can imagine understanding the sources of leaks better and building tools to better enforce government privacy policies.
TRANSITION: Looking forward, you can take home and apply the following recipe for securing software.
It’s as simple as this.
First, ask smart people to come up with technical solutions.
Then put the solutions into practice.
There will be ways in which the solutions aren’t entirely suited for the application domains, so we’ll have to iterate.
We have a lot of work to do to make these steps possible. First, we need to better connect research with industry and we also need to change incentives so that companies are motivated to pay for security. Communication and education are important components.
I’m hopeful that we can secure our software. We have smart people working on it and we already have many of the pieces. I’m excited to work with you all to open up some of these communication channels and connect the pieces towards some approximation of a completely secure Internet.