Friday, December 27, 2013

Lemons for Security - Information Asymmetry

My wife handed me an article from the Annals of Internal Medicine (Vol 157, No. 2, p. 139-140) entitled "Lemons for Obesity" by Michael Lauer, MD. At first, I thought she was trying to hint that I need to lose weight, but she said there was a section in the article that might apply to cybersecurity. So, my curiosity got the better of me. Dr. Lauer's article described his thoughts about the obesity drug Qnexa and the issues with its aftereffects.

What does this have to do with cybersecurity?

Lauer mentions a Nobel Prize-winning paper by George Akerlof on the market for bad cars, aka "lemons". He summarizes Akerlof's "lemon" scenario as follows.

"Used car buyers believe 75%  of cars are good (peaches) and 25% have problems (lemons). Buyer know lemon owners want to sell because of these car problems.  Suppose a lemon costs $5K and peaches cost $20K. The buyer has trouble distinguishing lemons from peaches based on this limited information and owners have no way to effectively communicate their inside knowledge. Suppose the buyer seeking a deal offers $16,250. Peach owner will refuse such a low-ball offer but lemon owners will jump at the offer. If on the other hand, a peach owner accepts the low offer, the buyer wonders what's wrong with the car, i.e., it must be a lemon. So, the buyer offers a lower price of say, $12,500 which the peach owner is less likely to accept. So, over time, the only cars that sell are lemons. Information Asymmetry allows bad products to drive out good products."

Twisting one of Dr. Lauer's sentences, if we think about the history of application software security, we've seen plenty of lemons. 



Thursday, April 11, 2013

Identity Verification in the MOOC World. Not!



According to some, Massive Open Online Courses (MOOCs) are the latest saviors of the financially strapped EDU world. The idea of having hundreds of thousands of students taking a university course at the same time is an exciting new frontier for higher education. Just think of the financial gains an institution can achieve. Public universities have seen a dramatic decrease in financial support from their respective state governments. Virginia universities receive an average of 3-5% of their total budgets from the state. The money has to come from somewhere to support a growing student body. An income stream from hundreds of thousands of online students is enticing to cash-strapped universities. State legislators see MOOCs as a way to continue financial support without raising taxes. After all, the money would come from tuition. There would be savings in personnel, infrastructure and the other high costs associated with universities. So, what's the worry?

First of all, EDUs have been in the online class world for at least 15 years. Interactive Video Conference (IVC) methods have been around for a long time. For example, I started teaching an IVC course in 1999. It was in a special classroom equipped with TV cameras, microphones for the students and two-way communications. If a student had a question, they pressed a button, their microphone would go live, the TV camera in their classroom would zoom in on them and a two-way conversation would happen. This format is expensive, and today's generation of students doesn't feel comfortable using this medium. Social media and a generational change have made MOOCs more popular. EDU faculty have experience in online learning. Learning Technologies (LT) is a growing and exciting field, well poised to address MOOC development.

JoAnn Paul from VA Tech states “Today's students often perceive electronic forms of interaction as LESS impersonal than face to face, traditional classroom settings, regardless of class size.  And why not? Students already work in distributed environments, and increasingly need to learn how best to communicate that way -- to get their point across -- and they know it.”

We need to collect data on MOOC popularity when students have to a) pay for the courses and b) take them for college credit. I suspect enrollment numbers will then be significantly lower. For introductory-level courses, MOOCs make sense because they provide a vehicle for reaching large numbers of people. More advanced courses don't scale well. Where does an online student go to do Chemistry or Physics lab experiments? How does one replicate the lab facilities and equipment? But that's another issue.....

Apart from having curricula designed by external entities, the biggest problem with MOOCs is a very basic yet critical issue: cheating.
Laura Pappano's NY Times article, "The Year of the MOOC," states: "Cheating is a reality. "We found groups of 20 people in a course submitting identical homework," says David Patterson, a professor at the University of California, Berkeley, who teaches software engineering, in a tone of disbelief at such blatant copying; Udacity and edX now offer proctored exams." Frankly, I'm surprised he was surprised about online cheating.

There are some fundamental questions that need to be answered before even attempting to incorporate MOOC-style courses as credit toward a degree.
1. How do you verify the identity of the student who registers for the class?
2. How do you verify the identity of the person who submits assignments and takes exams?
3. How do you verify the person in #1 is the same person as the one in #2? (A minimal sketch of one partial approach appears below.)

These questions need to be addressed before MOOCs can become a vehicle for furthering one's pursuit of a degree.
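For the sake of discussion, here is a minimal sketch of one kind of mechanism a platform might lean on: bind each submission to the credential issued at registration by signing a per-student token with a server-side secret. The names (SERVER_SECRET, issue_token, verify_submission) are mine, purely for illustration, and the sketch deliberately exposes the gap: it ties a submission to a credential, not to a person.

    import hashlib
    import hmac

    # Hypothetical server-side secret; real key management is its own problem.
    SERVER_SECRET = b"replace-with-a-real-secret"

    def issue_token(student_id: str) -> str:
        # Issued once, after whatever identity proofing happens at registration (#1).
        return hmac.new(SERVER_SECRET, student_id.encode(), hashlib.sha256).hexdigest()

    def verify_submission(student_id: str, token: str) -> bool:
        # Checks that a submission (#2) carries the token issued to that student_id.
        return hmac.compare_digest(issue_token(student_id), token)

    # A shared or stolen token verifies just as well -- the "same person"
    # question (#3) is not answered by any of this.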
 


Wednesday, March 13, 2013

Why Commoners Will Always Be On The Defensive

In the past year, one of my most requested talks has been "The More Things Change, The More They Stay the Same". I show examples of cyberattacks over the past 20 years, how the root causes are the same, and how we're still fighting the same battles after 20 years with no tangible success. I ask, "What have we [security types] been doing these past 20 years?" I mention how an entire industry has been created to "combat" cyber attacks but, again, there's no economic incentive to really solve the cyber security problem.

A recent article in Forbes, "Shopping For Zero-Days: A Price List For Hackers' Secret Software Exploits" by Andy Greenberg (http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/), talks about a particular firm that sells 0-day exploits to anyone who has the money. A quote from the article caught my attention: "Who’s paying these prices? Western governments, and specifically the U.S., says the Grugq, who himself is a native of South Africa. He limits his sales to the American and European agencies and contractors not merely out of ethical concerns, but also because they pay more."

As other writers have noted, there is now an economic incentive NOT to fix a bug in software. So the new paradigm is not to fix 0-days; rather, it's to sell them, or to pressure software vendors not to fix them, in order to give the nation-state an advantage.

Yeah, I know. This is nothing new. But here's what this does to common security folk like you and me. We can't afford to pay for 0-days, so we have to live with the consequences of having 0-days present in the software we buy. And because we don't know whether there are 0-days in the software we buy, we have to implement reactive defense tactics.

While nation-states hoard 0-days for cyber warfare, "civilian" organizations are left vulnerable to effective, successful cyberattacks. In other words, "civilian" organizations have no choice but to design reactive cyber defense strategies, since we can't "prevent" an attack that exploits a software vulnerability inside our networks.

 

Wednesday, January 2, 2013

Application Security Questionnaires - The Time is Now!

Back in 2009, I posted a note about vendor software security vulnerabilities and how they undermine our security. Way back in the early 2000s, I was quoted in a USA Today article on cybersecurity saying that I was surprised there weren't many product liability lawsuits against software vendors. In my 2009 blog post, I said I feared that comment only caused software vendors to modify their EULAs instead of fixing the problem.

This problem has been around since the first program was written. The difference now is that people are actively searching for these bugs to gain access to an organization's network and data. I believe it is the fundamental vector for APT (I hate that term) attacks. Mudge told President Clinton about this problem in the late 1990s.

I still hope vendors will actually check their code for common vulnerabilities. However, here are some recent instances telling me otherwise.

1. A vendor www application failed a standard vulnerability scan from both a commercial and a freeware scanning tool. XSS flaws across multiple pages in their site hierarchy were the most common finding. (A minimal sketch of this class of flaw follows this list.)

2. A vendor-supplied password of "changeme" resulted in a compromise while they were onsite installing the software. They were surprised to find out our network was "open" to the net.

3. Vendor password requirements that undercut our own password strength requirements.
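To make instance #1 concrete, here is a minimal sketch of the kind of XSS flaw those scanners flag: user input reflected into HTML without output encoding. The function names and page are invented for illustration; html.escape() from the Python standard library is one way to encode the output.

    from html import escape

    def search_results_unsafe(query: str) -> str:
        # Vulnerable: the query is echoed verbatim, so <script>...</script> executes.
        return f"<p>Results for {query}</p>"

    def search_results_safe(query: str) -> str:
        # Output-encoded: the same payload is rendered as harmless text.
        return f"<p>Results for {escape(query)}</p>"

    payload = '<script>alert("xss")</script>'
    print(search_results_unsafe(payload))  # injects a live script tag
    print(search_results_safe(payload))    # prints &lt;script&gt;... instead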

We're in the process of modifying a Security Questionnaire for Software Vendors document that we've had in place for a number of years. It's outdated now, but it did ask www app vendors whether their software was vulnerable to any of the flaws mentioned in the OWASP Top 10 Security Risks ( https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project ). There have been a number of efforts to create an Application Security Questionnaire, but they haven't gained acceptance.

Why? These questionnaires are site-specific by nature. It's hard, nay impossible, to create a consensus document that addresses all sectors (.com, .mil, .edu, .org, etc.) of business or government. There is vendor resistance to any "requirements" clause. The recent flap caused by such a requirement in one of the recent Federal cybersecurity bills in Congress is an example of this resistance. To them, I say "if you had done it in the first place, there wouldn't be this attempt to 'regulate' you."

Some of these efforts include:

  1. http://www.sans.org/appseccontract/
  2. https://www.owasp.org/index.php/Category:OWASP_Application_Security_Requirements_Project
  3. http://searchsoftwarequality.techtarget.com/answer/Security-requirements-for-any-Web-application
Google "application security requirements" for more useful links.


Here is my wishlist for software vendors:

  1. Train your programmers in secure coding techniques. If they still leave security holes in your code, find another place for them in your organization.
  2. Run a vulnerability scanner against your products. Your customers will start doing that soon. It's much worse for your reputation if the customer runs a scanner and finds errors.
  3. Pay attention to the results from #2 and fix the problems before releasing the product.
  4. Do NOT assume the network will "protect" your application. 
  5. Follow some sort of best-practice password strength guidelines. Don't ever convert everything to upper or lower case only.
  6. Never store user passwords in the clear. That's just plain idiotic. (A minimal salted-hash sketch follows this list.)
  7. Store sensitive data in an encrypted format. It can be done properly with common database systems. See #1.
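To illustrate items 5 and 6, here is a minimal salted-hash sketch using only the Python standard library. The function names and iteration count are my own illustrative choices, not a vetted policy.

    import hashlib
    import hmac
    import os

    def hash_password(password: str) -> tuple[bytes, bytes]:
        # Store the salt and the derived key -- never the password itself.
        salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, key

    def check_password(password: str, salt: bytes, key: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return hmac.compare_digest(candidate, key)

    salt, key = hash_password("CorrectHorseBatteryStaple")
    assert check_password("CorrectHorseBatteryStaple", salt, key)
    assert not check_password("correcthorsebatterystaple", salt, key)  # case matters (item 5)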
The purpose of a site questionnaire is to provide the customer with information about the security of the vendor applications they are considering purchasing. "Failing" the questionnaire isn't an automatic no-buy action. It informs the customer that additional security controls must be in place.

The questionnaire is another component in a risk-based security management strategy. If the software is needed for business purposes and the user accepts the risk, then purchase can continue.
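Here is a minimal sketch of what that risk-based outcome could look like. The questions and wording are invented for illustration, loosely inspired by the OWASP Top 10; a "failing" answer flags compensating controls rather than triggering an automatic no-buy.

    # Illustrative questionnaire fragment; a "no" answer records a gap.
    QUESTIONS = [
        "Is user-supplied input encoded before being rendered (XSS)?",
        "Are default credentials changed or disabled before delivery?",
        "Are user passwords stored only as salted hashes?",
        "Is sensitive data encrypted at rest?",
    ]

    def assess(answers: list[bool]) -> str:
        gaps = [q for q, ok in zip(QUESTIONS, answers) if not ok]
        if not gaps:
            return "No additional controls required."
        # The business owner may still accept the risk and proceed,
        # provided compensating controls are put in place.
        return "Gaps found; compensating controls required:\n- " + "\n- ".join(gaps)

    print(assess([True, False, True, False]))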

The era of software vendors letting the customer debug their code has to come to an end immediately. OS vendors have already done this, and the number of OS issues has been reduced. It's time for application vendors to step up and deliver.

1/2/2013 RCM