Quotes: Problems, solutions, and change

The solution to any problem is found in the proper definition of the problem

— Doug Hale

Everything in software changes. The requirements change. The design changes. The business changes. The technology changes. The team changes. The team members change. The problem isn’t change, because change is going to happen; the problem, rather, is our inability to cope with change

— Kent Beck

Security quotes

“Security systems are never value-neutral; they move power in varying degrees to one set of players from another.” — Bruce Schneier, Beyond Fear p. 35

“People exaggerate spectacular but rare risks and downplay common risks.” — Bruce Schneier, Beyond Fear p. 26

“Technology is generally an enabler, allowing people to do things. Security is the opposite: It tries to prevent something… That’s why technology doesn’t work in security the way it does elsewhere, and why an overreliance on technology often leads to bad security, or even to the opposite of security.” — Bruce Schneier, Beyond Fear p. 13

Stages of Security in the life of computer software

Is your client-server software secure? You may be surprised to find that even mature software, sporting the use of standard encryption, could be putting your mission-critical data at risk. Why is that? It has to do with economics and the lifecycle of software. Here are the stages.

Prototype. No one cares about the security of the client-server communication.

First release. Although the system (likely little more than an improved prototype) has shipped, it may not be usable, and even if it is, it lacks mission-critical features. Security is low on the priority list. If encryption and authentication were implemented at all, they are most likely minimal, brittle, and insufficient. If the security is robust, the product is probably not functional and likely never will be, because the ISV will go out of business first.

Second or third release. If the ISV has survived long enough to ship a second or third version, customers have likely demanded standard encryption. In the old days (the 1990s), that meant switching from XOR “encryption” to DES or 3DES. These days, standard encryption probably means SSL/TLS without certificate checks. Note that customers probably won’t ask about the security of the authentication mechanisms.

Unfortunately, use of a standard encryption algorithm doesn’t mean communication is secure. Software is likely to be vulnerable to man-in-the-middle attacks and have authentication bugs. The customer isn’t likely to know this, and neither is the ISV. If the ISV does know, they won’t fix the problems. This shouldn’t be surprising — the risk tolerance is different for the ISV versus their customers. The software vendor isn’t the one that is going to suffer losses due to information disclosure or breach of integrity.

At this stage, customers usually apply more pressure to implement new features than to focus on security. Of course, this is based on ignorance of the actual situation. The vendor isn’t likely to want to pay the price to improve security… unless the customer knows and applies pressure to get it fixed.

Eventually, a security-conscious customer (e.g., a financial institution or a government agency) hires someone to evaluate the software, and they start asking hard questions of the ISV. Most of the ISV’s software engineers won’t know the answers because the security mechanism is transparent to their daily work — it stays out of sight, and out of mind. Sooner or later, someone figures out that the communication is vulnerable to man-in-the-middle attacks or authentication bugs. At first, customers may be reluctant to believe that the security holes are serious, and then they will panic. They will apply pressure to the ISV to get it fixed.

Take-away points:

  • SSL without certificate checks is vulnerable to man-in-the-middle attacks (see the sketch after this list).
  • Almost no ISV gets encryption right the first time, and they won’t fix problems unless their feet are held to the fire.
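
To make the first take-away concrete, here is a minimal sketch in modern Ruby (the endpoint is hypothetical, and the client API has varied across Ruby versions). The only difference between the two requests is whether the client verifies the server’s certificate:

    require 'net/http'
    require 'openssl'

    uri = URI('https://example.com/api/orders')   # hypothetical endpoint

    # Vulnerable: certificate checks are disabled, so anyone who can intercept
    # the traffic may present their own certificate and read or modify
    # everything in transit.
    Net::HTTP.start(uri.host, uri.port,
                    use_ssl: true,
                    verify_mode: OpenSSL::SSL::VERIFY_NONE) do |http|
      http.get(uri.path)
    end

    # Safer: the server certificate is checked against trusted CAs, so an
    # impostor without a valid certificate for the host is rejected.
    Net::HTTP.start(uri.host, uri.port,
                    use_ssl: true,
                    verify_mode: OpenSSL::SSL::VERIFY_PEER) do |http|
      http.get(uri.path)
    end

Certificate verification only ensures you are talking to the real server; it does nothing about the authentication bugs mentioned above, which have to be fixed separately.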

Tim Bray tells us what’s awesome about Ruby

Tim Bray explains what there is to like about the Ruby programming language:

I’ll jump to the conclusion first. For people like me, who are proficient in Perl and Java, Ruby is remarkably, perhaps irresistibly, attractive. Over the last week I’ve got an unreasonable amount of work done in a ridiculously short period of time, with lots of interruptions, in a language I previously didn’t know. It’s intuitive enough that I’ve often found myself guessing at a syntax or a method or a usage and getting it right first time.

Maybe the single biggest advantage is readability. Once you’ve got over the hump of the block/yield idiom, I find that a chunk of Ruby code shouts its meaning out louder and clearer than any other language. Anything that increases maintainability is a pearl beyond price.

I’ve been programming in C and Java for a quarter-century and I find Ruby easier to read, only a week in. Of course, a language’s culture is often more important than all that technical crap. I’ve found the ruby-talk mailing list to be a fount of wisdom and friendly to ignorant newbies too.

http://www.tbray.org/ongoing/When/200x/2006/07/24/Ruby

Follow the link to find out “What’s Lame” about Ruby.
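
For readers who haven’t met the block/yield idiom Bray mentions, here is a tiny sketch of my own (not his): the method owns the setup and cleanup, and the caller hands it the interesting work as a block.

    # A method that wraps timing around whatever block the caller passes in.
    def with_timing(label)
      start = Time.now
      result = yield                    # run the caller's block
      puts "#{label} took #{Time.now - start} seconds"
      result
    end

    with_timing("report") do
      (1..100_000).map { |n| n * n }.reduce(:+)
    end

    # The same idiom runs through the standard library; File.open, for
    # example, closes the file for you even if the block raises.
    File.open("notes.txt", "w") { |f| f.puts "hello" }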

Best of Breed, or Best of Mediocrity?

Having worked for some time as a software engineer in the enterprise security
software world, I know that customers (enterprises) look for “best of breed”
software. For a large company customer, this usually means that a software
solution distinguishes itself in some way that makes it work well in their
environment. Often, this translates to reliability, cross-platform support,
person-to-person support and the ability to function beyond what is advertised.

As many are aware, there is “consolidation” going on in the security market.
Big fish are swallowing smaller fish, and it’s lucrative, in the short term,
for everyone except customers. Supposedly, the consolidation means that two
separate products can be “integrated”, or unified. Never mind the previous
competitive relationship that may have existed between the product teams and
their management. For some reason, people seem to think that competition
evaporates and that the two product teams will happily work together to build
the next generation “Best of Breed” software solution.

Not so.

In any big corporation or software company, there are constant power plays
being made. You could call this “decision making”, and if you have uncommonly
good leaders, you might even say good decisions are being made.
Unfortunately, it is human nature for most people to misuse and abuse positions
of power. Instead of making product decisions that are best for their merged
customer base, they make decisions that keep themselves in a position of power.

So, we have two best of breed products: Overdog and Underdog. Underdog is
easier to manage, but isn’t as complete in its offerings. Overdog is more
complete, but is more expensive to deploy and manage. Overdog has the advantage
of being used in Fortune 500 companies. Underdog, on the other hand, is trying
to break into that market space.

Enter Big Fish — a.k.a. Consolidator. Consolidator buys Overdog, and a few
years later, buys Underdog. We take two products, both “Best of Breed” in
different ways, and expect to see them merged together to make something “next
generation” — better, faster, stronger, and easier to use.

Whenever there is a consolidation, talented people get fired, and their
creative ideas and abilities are lost. Product integration never happens as
easily as anyone would like to believe (if it happens at all). And in the
end, customers end up with a product that we can best label as “Best of
Mediocrity”. Consolidation means that customers lose their “Best of Breed”
solutions.

What can you expect from Software Consolidators? Mediocre solutions. Look
elsewhere for excellence.

The purpose of security…

A coworker made these assertions about security. I think they’re worth repeating:

  • The purpose of security is to establish accountability of an individual.
  • The purpose of auditing is to verify the trust that has been placed in an individual.

Principles of Reputation

One of my past professors, Phil Windley, posted some principles of reputation:

  • Reputation is one of the factors upon which trust is based
  • Reputation is someone else’s story about me – this means that I can’t control what you say about me although I may be able to affect the factors you based your story on. Also, every person should be able to have their own story about me.
  • Reputation exists in the context of community – this is different than saying “communities have a reputation about someone.”
  • Reputation is based on identity – reputation, as someone else’s story, isn’t part of your identity, but is based on an identity or set of identities.
  • Reputation is a currency – while you can’t change it, reputation can be used as a resource. Paul Resnick has a paper showing the value of a positive eBay reputation.
  • Reputation is narrative – you have to apply metaphor to interpret it, reputation is dynamic because the factors that affect it are always changing, and reputation may require weaving plot lines together.
  • Reputation is based on claims (verified or not), transactions, ratings, and endorsements. – this brings up the issue of evidence, recourse for slander or mistakes, etc.
  • Reputation is multi-level – a reputation isn’t just based on facts, but is also based on others’ beliefs about the target of that reputation. This requires some way of signaling beliefs to others.
  • Multiple people holding the same opinion increases the weight of that opinion – repeat behavior is also another way of weighting reputation.
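
The list is conceptual, but the last point lends itself to a toy sketch. Everything below (names, scores, the weighting scheme) is my own invention, purely to illustrate the idea that an opinion held by many distinct people should weigh more than one person repeating themselves:

    # Toy illustration only; not from Windley's post.
    Rating = Struct.new(:rater, :score)   # score in 1..5

    def reputation(ratings)
      per_rater = ratings.group_by(&:rater).values.map do |rs|
        rs.sum(&:score) / rs.size.to_f    # each rater's voice counts once
      end
      per_rater.sum / per_rater.size      # distinct raters carry the weight
    end

    ratings = [
      Rating.new("alice", 5), Rating.new("alice", 5),   # one enthusiastic repeat voice
      Rating.new("bob", 3),   Rating.new("carol", 2)
    ]
    puts reputation(ratings)   # => ~3.33, not the 3.75 a raw average would give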

Is Data Mining Fool’s Gold?

Here’s a thought-provoking article about the problems of large-scale data mining by
governments. It’s written by a person living in the UK.

“Data-mining is complicated, and the more data you are mining, the more
false positives your software will throw up. If you act upon a false
positive for a motoring offence, it’s an inconvenience for the motorist,
but for an alleged case of child abuse, it can rip the family apart and
ruin the child’s life.”

“Furthermore, gathering large amounts of data is inherently dangerous.
Whatever information governments find interesting will also draw the
attention of criminals. Databases can be hard to keep secure, and it’s
not necessarily hackers that we should be worried about, but
unauthorised access by employees of the agencies that use these
databases. Equally, the more data you have, the more difficult it is to
maintain accuracy. In 2000, an audit of the Police National Computer
found that 86% of records contained errors, 85% of those errors were
serious, and some were libellous.”

“Technology can be a very powerful tool, but what it can’t do is replace
real human beings or traditional investigative work. Designed badly or
used poorly, databases are the technological equivalent of fools gold.”
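
The false-positive point is plain base-rate arithmetic. A quick sketch with purely hypothetical numbers shows how fast it gets out of hand:

    # Hypothetical numbers, for illustration only: screen 10 million records
    # for something that affects 1 in 10,000 people, using a system that is
    # 99% accurate in both directions.
    population     = 10_000_000
    base_rate      = 1.0 / 10_000
    false_positive = 0.01
    false_negative = 0.01

    true_cases = population * base_rate                       # 1,000 real cases
    hits       = true_cases * (1 - false_negative)            # ~990 correctly flagged
    false_hits = (population - true_cases) * false_positive   # ~99,990 wrongly flagged

    flagged = hits + false_hits
    puts "#{flagged.round} flagged, of which #{(100.0 * hits / flagged).round(1)}% are real cases"
    # => roughly one real case for every hundred people flagged

Even a system far more accurate than most real-world classifiers leaves investigators sifting a hundred innocents for every genuine case, which is exactly the inconvenience-versus-ruined-lives trade-off the article describes.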