Tuesday, August 5, 2014

Deja Vu All Over Again - Redux - 1999-2014

Yep, it's time to use this title again. This time we're talking about DDoS amplification attacks. One of the lists I monitor posted the following:

Christian Rossow has done some great work on DDoS. The two interesting papers are:

 "Exit from Hell? Reducing the Impact of Amplification DDoS Attacks"
   http://christian-rossow.de/publications/exitfromhell-usenix2014.pdf

The authors also look at DNS, NTP, SNMP, SSDP, CharGen, QOTD, and NetBIOS. The last sentence of the paper reads: "We measured almost 46 million amplifiers for all scanned UDP-based protocols."

 "Hell of a Handshake: Abusing TCP for Reflective Amplification
   DDoS Attacks,"

   http://christian-rossow.de/publications/tcpamplification-woot2014.pdf

This quote from the Kührer paper:

"The basic idea is to send relatively small requests with spoofed source address to public hosts (e.g., NTP servers), which reflect significantly larger responses to the victim of the attack."

is depressing to read.
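The arithmetic behind that one sentence is what makes the attack so cheap for the attacker. Here's a back-of-the-envelope sketch (my own illustration with hypothetical numbers, not figures from the papers) of how a reflector multiplies an attacker's own upstream bandwidth:

    # A back-of-the-envelope sketch (mine, not from the papers) of why
    # amplification matters: reflectors multiply the attacker's own
    # upstream bandwidth. All numbers below are hypothetical.

    def victim_traffic_mbps(attacker_upstream_mbps, amplification):
        """Traffic arriving at the victim if every spoofed request is reflected."""
        return attacker_upstream_mbps * amplification

    # Hypothetical: a 10 Mbps uplink and a protocol that amplifies 50x.
    print(victim_traffic_mbps(10, 50))  # -> 500.0 Mbps aimed at the victim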

Why is it depressing? In 2000, I was part of a Fed/SANS Institute task force that wrote a consensus roadmap document for defeating DDoS attacks (http://www.sans.org/dosstep/roadmap.php). In it, we stressed the importance of setting your (the collective your) network ingress/egress filters correctly in order to prevent spoofed packets from leaving your network. The quote above tells me that we (the collective we) have forgotten this basic defense technique. So, my question to the list is: "Have you set the ingress/egress filters on ALL of your network devices to prevent spoofed packets from leaving your nets?" If so, you've taken a giant step in reducing the impact of an amplification attack.
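For anyone who wants the one-paragraph version of what those filters do, here is a minimal sketch of the egress-side decision (BCP 38-style source address validation). It's an illustration only; in practice this check lives in your border router or firewall ACLs, and the prefixes below are hypothetical documentation ranges:

    # Minimal sketch of BCP 38-style egress filtering (illustration only;
    # real deployments do this in router/firewall ACLs, not in Python).
    from ipaddress import ip_address, ip_network

    OUR_PREFIXES = [ip_network("192.0.2.0/24"), ip_network("198.51.100.0/24")]

    def allow_outbound(src_ip):
        """Permit an outbound packet only if its source address is one of ours."""
        src = ip_address(src_ip)
        return any(src in net for net in OUR_PREFIXES)

    print(allow_outbound("192.0.2.17"))   # True: legitimate local source
    print(allow_outbound("203.0.113.9"))  # False: spoofed source, drop it

If every edge network enforced that one predicate on outbound traffic, spoofed-source amplification attacks would be far harder to launch.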

The weird sense of humor in me says that the admins who were around in 2000 and set their filters have moved on or retired, and their replacements looked at those ACLs and said, "WTF? Let's take these out."



It's been 14 years now and spoofed packets are still an issue.

I'm just saying......:-)

Friday, December 27, 2013

Lemons for Security - Information Asymmetry

My wife handed me an article from the Annals of Internal Medicine (Vol. 157, No. 2, pp. 139-140) entitled "Lemons for Obesity" by Michael Lauer, MD. At first, I thought she was trying to hint that I need to lose weight, but she said there's a section in the article that might apply to cybersecurity. So, my curiosity got the better of me. Dr. Lauer's article described his thoughts about the obesity drug Qnexa and issues with its side effects.

What does this have to do with cybersecurity?

Lauer mentions a Nobel Prize-winning paper by George Akerlof on the market for bad cars, aka "lemons". He summarizes Akerlof's "lemon" scenario as follows:

"Used car buyers believe 75%  of cars are good (peaches) and 25% have problems (lemons). Buyer know lemon owners want to sell because of these car problems.  Suppose a lemon costs $5K and peaches cost $20K. The buyer has trouble distinguishing lemons from peaches based on this limited information and owners have no way to effectively communicate their inside knowledge. Suppose the buyer seeking a deal offers $16,250. Peach owner will refuse such a low-ball offer but lemon owners will jump at the offer. If on the other hand, a peach owner accepts the low offer, the buyer wonders what's wrong with the car, i.e., it must be a lemon. So, the buyer offers a lower price of say, $12,500 which the peach owner is less likely to accept. So, over time, the only cars that sell are lemons. Information Asymmetry allows bad products to drive out good products."

Twisting one of Dr. Lauer's sentences, if we think about the history of application software security, we've seen plenty of lemons. 



Thursday, April 11, 2013

Identity Verification in the MOOC World. Not!



According to some, Massive Open Online Courses (MOOCs) are the latest saviors of the financially strapped EDU world. The idea of having hundreds of thousands of students taking a university course at the same time is an exciting new frontier for higher education. Just think of the financial gains an institution can achieve. Public universities have seen a dramatic decrease in financial support from their respective state governments; Virginia universities receive an average of 3-5% of their total budgets from the state. The money has to come from somewhere to support a growing student body, and an income stream from hundreds of thousands of online students is enticing to cash-strapped universities. State legislators see MOOCs as a way to continue financial support without raising taxes. After all, the money would come from tuition, and there would be cost savings in personnel, infrastructure, and the other high costs associated with universities. So, what's the worry?

First of all, EDUs have been in the online class world for at least 15 years. Interactive Video Conference (IVC) methods have been around for a long time. For example, I started teaching an IVC course in 1999. It was in a special classroom equipped with TV cameras, microphones for the students, and two-way communications. If a student had a question, they pressed a button, their microphone went live, the TV camera in their classroom zoomed in on them, and a two-way conversation happened. This format is expensive, and today's generation of students doesn't feel comfortable using this medium. Social media and a generational change have made MOOCs more popular. EDU faculty have experience in online learning, and Learning Technologies (LT) is a growing and exciting field, well poised to address MOOC development.

JoAnn Paul of Virginia Tech states: "Today's students often perceive electronic forms of interaction as LESS impersonal than face to face, traditional classroom settings, regardless of class size. And why not? Students already work in distributed environments, and increasingly need to learn how best to communicate that way -- to get their point across -- and they know it."

We need to collect data on MOOC popularity when students have to (a) pay for the courses and (b) take them for college credit. I suspect the enrollment numbers will then be significantly lower. For introductory-level courses, MOOCs make sense because they provide a vehicle for reaching large numbers of people. More advanced courses don't scale well. Where does an online student go to do Chemistry or Physics lab experiments? How does one replicate the lab facilities and equipment? But that's another issue.....

Apart from having curricula designed by external entities, the biggest problem with MOOCs is a very basic yet critical issue: cheating.
Laura Pappano's NY Times article, "The Year of the MOOC," states: "Cheating is a reality. "We found groups of 20 people in a course submitting identical homework," says David Patterson, a professor at the University of California, Berkeley, who teaches software engineering, in a tone of disbelief at such blatant copying; Udacity and edX now offer proctored exams." Frankly, I'm surprised he was surprised about online cheating.

There are some fundamental questions that need to be answered before even attempting to incorporate MOOC style courses as credit for a degree.  
1. How do you verify the identity of the student who registers for the class?
2. How do you verify the identity of the person who submits assignments and takes exams?
3. How do you verify the person in #1 is the same person as the one in #2?

These questions need to be addressed before MOOCs can become a vehicle for furthering one's pursuit of a degree.
 


Wednesday, March 13, 2013

Why Commoners Will Always Be On The Defensive

In the past year, one of my most requested talks has been "The More Things Change, The More They Stay the Same". I show examples of cyberattacks over the past 20 years, how the root causes are the same, and how we're still fighting the same battles after 20 years with no tangible success. I ask, "What have we [security types] been doing these past 20 years?" I mention how an entire industry has been created to "combat" cyber attacks, but again, there's no economic incentive to really solve the cybersecurity problem.

A recent article in Forbes, "Shopping For Zero-Days: A Price List For Hackers' Secret Software Exploits" by Andy Greenberg (http://www.forbes.com/sites/andygreenberg/2012/03/23/shopping-for-zero-days-an-price-list-for-hackers-secret-software-exploits/), talks about a particular broker who sells 0-day exploits to anyone who has the money. A quote from the article caught my attention: "Who's paying these prices? Western governments, and specifically the U.S., says the Grugq, who himself is a native of South Africa. He limits his sales to the American and European agencies and contractors not merely out of ethical concerns, but also because they pay more."

As other writers have noted, there is now an economic incentive to NOT fix a bug in software. So, the new paradigm is to not fix 0-days, rather, it's to sell them or pressure software vendors to not fix them in order to give the nation-state an advantage.

Yeah, I know. This is nothing new. But here's what this does to the common security folk like you and me. We can't afford to pay for 0-days, so we have to live with the consequences of having 0-days present in the software we buy. We don't know whether there are 0-days in the software we buy, so we have to implement reactive defense tactics.

While nation-states hoard 0-days for cyber warfare, "civilian" organizations are left vulnerable to effective, successful cyberattacks. In other words, "civilian" organizations have no choice but to design reactive cyber defense strategies, since we can't "prevent" an attack that exploits an unknown software vulnerability inside our nets.

 

Wednesday, January 2, 2013

Application Security Questionnaires - The Time is Now!

Back in 2009, I posted a note about vendor software security vulnerabilities and how they undermine our security. Way back in the early 2000s, I was quoted in a USA Today article on cybersecurity saying that I was surprised there weren't more product liability lawsuits against software vendors. In my 2009 blog post, I said I feared that comment only caused software vendors to modify their EULAs instead of fixing the problem.

This problem has been around since the first program was written. The difference now is that people are actively searching for these bugs to gain access to an organization's network and data. I believe it is the fundamental vector for APT (I hate that term) attacks. Mudge told President Clinton about this problem in the late 1990s.

I still hope vendors will actually check their code for common vulnerabilities. However, here are some recent instances that tell me otherwise.

1. A vendor web application failed a standard vulnerability scan from both a commercial and a freeware scanning tool. XSS flaws across multiple pages in their hierarchy were the most common error (a minimal illustration of this flaw class appears after this list).

2. A vendor-supplied password of "changeme" resulted in a compromise while the vendor was onsite installing the software. They were surprised to find out that our network was "open" to the net.

3. Vendor password requirements that undercut our password strength requirements.
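For readers who haven't chased an XSS bug before, here is a minimal sketch of the flaw class behind item 1 (hypothetical code, not the vendor's application): user input echoed into HTML without encoding becomes executable markup.

    # Minimal illustration of reflected XSS (hypothetical code, not the
    # vendor's application). Unescaped user input becomes live markup.
    import html

    user_input = '<script>alert("xss")</script>'

    # Vulnerable pattern: user input pasted straight into the page.
    vulnerable_page = "<p>Hello, " + user_input + "</p>"

    # Fixed pattern: encode user input before it touches the HTML.
    safe_page = "<p>Hello, " + html.escape(user_input) + "</p>"

    print(vulnerable_page)  # a browser would execute the script tag
    print(safe_page)        # a browser just displays harmless text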

We're in the process of modifying a Security Questionnaire for Software Vendors document that we've had in place for a number of years. It's outdated now, but it did ask web app vendors whether their software was vulnerable to any of the flaws mentioned in the OWASP Top 10 Security Risks ( https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project ). There have been a number of efforts to create an Application Security Questionnaire, but they haven't gained acceptance.

Why? These questionnaires are site-specific by nature. It's hard, nay impossible, to create a consensus document that addresses all sectors (.com, .mil, .edu, .org, etc.) of business or government. There is vendor resistance to any "requirements" clause; the recent flap caused by such a requirement in one of the Federal cybersecurity bills in Congress is an example of this resistance. To them, I say, "If you had done it in the first place, there wouldn't be this attempt to 'regulate' you."

Some of these efforts include:

  1. http://www.sans.org/appseccontract/
  2. https://www.owasp.org/index.php/Category:OWASP_Application_Security_Requirements_Project
  3. http://searchsoftwarequality.techtarget.com/answer/Security-requirements-for-any-Web-application
For more, google "application security requirements".


Here is my wishlist for software vendors:

  1. Train your programmers in secure coding techniques. If they still leave security holes in your code, find another place for them in your organization.
  2. Run a vulnerability scanner against your products. Your customers will start doing that soon, and it's much worse for your reputation if a customer runs a scanner and finds the errors first.
  3. Pay attention to the results of #2 and fix the problems before releasing the product.
  4. Do NOT assume the network will "protect" your application. 
  5. Follow some sort of best-practice password strength guidelines. Don't ever convert everything to upper or lower case only.
  6. Never store user passwords in the clear. That's just plain idiotic (a minimal hashing sketch follows this list).
  7. Store sensitive data in an encrypted format. It can be done properly with the common database systems. See #1.
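On items 6 and 7, here is a minimal sketch of doing password storage right using nothing but the Python standard library. It's an illustration, not a mandate; purpose-built schemes like bcrypt or scrypt are also fine choices:

    # Minimal sketch of salted, iterated password hashing (illustrating
    # wishlist items 6-7) using only the Python standard library.
    import hashlib, hmac, os

    def hash_password(password, salt=None):
        """Return (salt, derived_key); store these, never the password itself."""
        salt = salt or os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        return salt, key

    def verify_password(password, salt, key):
        """Recompute the derived key and compare in constant time."""
        return hmac.compare_digest(hash_password(password, salt)[1], key)

    salt, key = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, key))  # True
    print(verify_password("changeme", salt, key))                      # False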
The purpose of a site questionnaire is to provide the customer with information about the security of the vendor applications they are considering purchasing. "Failing" the questionnaire isn't an automatic no-buy action. It informs the customer that additional security controls must be in place.

The questionnaire is another component in a risk-based security management strategy. If the software is needed for business purposes and the user accepts the risk, then purchase can continue.

The era of software vendors letting the customer debug their code has to come to an end immediately. OS vendors have stepped up, and the number of OS issues has been reduced. It's time for application vendors to step up and deliver.

1/2/2013 RCM




Wednesday, October 3, 2012

Are Silicon Valley "campuses" the 21st century version of coal mining company towns?

I was reading a recent article about the new Facebook "campus" being built in Menlo Park at the old Sun Microsystems facility. It's on 57 acres, with an additional 22 adjacent acres set aside for more expansion. Later in the article, a Facebook official is quoted as saying they "envisioned a long courtyard at the heart of the cluster of buildings being turned into a play on a European street scene where workers could exchange ideas in an outdoor social scene."

I have some friends and former students who are working for other Silicon Valley companies with similar "campuses". They tell me they love it because they have housing, laundry facilities, dining halls, and some stores all on campus. They tell me they don't need a car because everything they need is right there. A couple of them said it was like being in college. Of course, I always ask about salaries, and theirs were predictably decent. A few of them said they were taking salary cuts in exchange for stock options. I started to get a funny feeling about that but couldn't quite put my finger on what was bothering me.

A couple of weeks ago, I was watching one of my favorite movies, "Matewan", which tells the tale of a struggle between WV coal miners and the local coal mining company. The struggle resulted in a shootout that became known as the Matewan Massacre. Now, my reason for liking the movie is that a bunch of my musician friends are featured in it. Anyway, while I was watching, it suddenly hit me why I felt a little uneasy when talking with my former students about their jobs on the Silicon Valley campuses.

If you look at the history of coal mining towns, you find that everything in the town was owned by the company. Miners were paid in scrip and a portion of their salary was deducted for living expenses.  You paid for items in the company stores with scrip. Basically, you paid for everything in scrip.

The Facebook article got me thinking about the parallels between the Silicon Valley campuses and the coal mining company towns of the early 20th century. Here are some parallels that occurred to me:

  1. All "living" services - housing, food, laundry, schools, transportation, entertainment, employment provided and owned by the company.
  2. Coal Company "scrip" = 21st Century stock options. Stock options can't buy me a car :-).
  3. Miners/Workers aren't encouraged to leave the town/campus. Companies want them to stay on campus and work more than the traditional 40 hour week.
You could probably find more parallels; these are just a few that came to me.
I hope this "campus" model of employment doesn't lead to abuses such as those that happened in our history.




Monday, April 9, 2012

A Cyber Security Industrial Complex?

Dwight Eisenhower is one of my heroes. Yep, I said it right here and now. His speech on the military-industrial complex is only now being appreciated. It was delivered over 50 years ago and is still relevant today. Why am I bringing this up, and what does it have to do with cybersecurity?

I did a SANS Lightning talk this past month and when I was researching material for the talk, I stumbled across some reference material from 2001 called "Top 10 Security Mistakes" (http://www.computerworld.com/s/article/61986/Top_10_Security_Mistakes). They were:

1. The not-so-subtle Post-it Note.
2. We know better than you.
3. Leaving the machine on, unattended.
4. Opening e-mail attachments (remember the Love Bug virus?) from mere acquaintances or even strangers.
5. Poor password selection.
6. Loose lips sink ships.
7. Laptops have legs.
8. Poorly enforced security policies.
9. Failing to consider the staff.
10. Being slow to update security information.

Take a look at this list and tell me which of these mistakes we have eliminated in the past 10 years. If your answer is "none", then the follow-up question would be, "What have we been doing these past 10 years?"

I found another slide from a 2002 presentation I did where I made the following statement:

"Viruses, trojans, rootkits will never be eliminated because we've created a multi-billion dollar industry to combat them. If we eliminate the root causes of cyber attacks, we eliminate a multi-billion dollar industry". I believe there's no economic incentive to eliminate these root causes. Or to put it another way, there is a strong economic incentive to NOT eliminate the root causes of cybersecurity attacks.

Now, mind you, I've been an active part of the cybersecurity "industry" for the past 20 years. I helped write the original SANS/FBI Top 10 Internet Threats document back in 2000. Part of my job is to measure the effectiveness of our defense strategies. If I use this 2001 list to examine our effectiveness industry-wide, I think that, while we've made some progress, we (the collective we) have failed miserably.

Alan Paller talked about the four quadrants of cybersecurity: academic security researchers; hunters/tool builders; operators/testers who monitor IPS, IDS, and pentest tools; and audit/policy/compliance workers. The largest of these quadrants is the audit/policy/compliance group, which seems a little backward to me. We're focusing on compliance instead of actually fixing the problem. We need to train and develop more people in the hunter/tool-builder category so that we have a chance at fixing the root causes of cyber attacks, one of which is insecure code.

And so we come back to President Eisenhower's speech. We're seeing the militarization of cyberdefense. Defense contractors who used to specialize in tanks, helicopters, jets, and advanced weaponry are retooling to become cybersecurity "experts". We're seeing a lot of money being spent to defend and monitor instead of fixing the root causes.

Are there parallels between the complex of the 60's and the "complex" of the 201x's? Take a look at a recent NPR article on Eisenhower's speech and see if you can draw the parallels. It's at http://www.npr.org/2011/01/17/132942244/ikes-warning-of-military-expansion-50-years-later.
More on this later.....