Category: Security

PGP trust model doesn’t work

Published / by Jared

It’s been several years since I used GPG to PGP-sign and encrypt email messages and to verify the authenticity of PGP-signed messages I received.

So it was interesting to read why the PGP trust model doesn’t improve security:

http://arstechnica.com/security/2016/12/op-ed-im-giving-up-on-pgp/

I believe that confidentiality isn’t a binary thing: if one wants it, one must continually stay up to date on which approaches still work, which are economically feasible, and which are no longer effective.

The article recommends Signal or WhatsApp for instant messaging, Magic Wormhole or OnionShare for file sharing, and so on. It also recommends a YubiKey 4 for authentication.

HTML Subresource Integrity

Published / by Jared

LWN covers the new W3C spec for HTML subresource integrity (SRI):

SRI is designed to combat injection attacks that come through third-party content. The originating site can include cryptographic hashes of third-party script and image files, enabling the user’s browser to hash the corresponding files it receives from the third-party servers and verify that the hashes match.

Most browsers already support SRI, including Firefox, Chrome and Opera.
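
Here’s a small sketch of how an integrity value is produced, using nothing more than the Python standard library; the file name jquery.min.js and the CDN URL in the comment are just placeholders:

    # Compute a Subresource Integrity (SRI) value for a local copy of a
    # third-party file. SRI supports sha256, sha384, and sha512 digests.
    import base64
    import hashlib

    def sri_hash(path, algorithm="sha384"):
        """Return an integrity string such as 'sha384-<base64 digest>'."""
        digest = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return f"{algorithm}-{base64.b64encode(digest.digest()).decode()}"

    if __name__ == "__main__":
        # The printed value goes into the script tag, e.g.:
        # <script src="https://cdn.example/jquery.min.js"
        #         integrity="sha384-..." crossorigin="anonymous"></script>
        print(sri_hash("jquery.min.js"))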

How to store passwords: Use Argon2

Published / by Jared

If you’re designing a service that requires passwords for authentication, store them using a dedicated password hashing function such as Argon2 or bcrypt. Don’t use MD5, SHA-1, SHA-2, or SHA-3: they’re general-purpose hash functions designed to be fast, which lets an attacker who gains access to your password database try billions of guesses per second.

Reference article: How LinkedIn’s password sloppiness hurts us all by Jeremi M. Gosney

If [online services] aren’t using something like bcrypt or Argon2 for password storage, then they’re doing things very, very wrong. But slow hashing is no longer as effective of a solution as it could have once been had it only been adopted sooner.

When you suspect a password database has been compromised, even just in part, you cash in on that insurance policy [of using forced password resets] immediately by activating your incident response team and your public relations team.

What is Argon2? It’s the winning algorithm from the Password Hashing Competition. Argon2 has been added to recent versions of libsodium.
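
Here’s a minimal sketch of what that looks like in practice, assuming the third-party argon2-cffi package (one of several Argon2 bindings for Python):

    # Password storage with Argon2 via argon2-cffi (pip install argon2-cffi).
    from argon2 import PasswordHasher
    from argon2.exceptions import VerifyMismatchError

    ph = PasswordHasher()  # library defaults: random salt, tunable cost parameters

    # At registration time, store only the encoded hash, never the password.
    stored_hash = ph.hash("correct horse battery staple")

    # At login time, verify the submitted password against the stored hash.
    def check_login(submitted: str) -> bool:
        try:
            return ph.verify(stored_hash, submitted)
        except VerifyMismatchError:
            return False

    print(check_login("correct horse battery staple"))  # True
    print(check_login("hunter2"))                       # False

The encoded hash string embeds the salt and cost parameters, so old hashes remain verifiable even after you strengthen the parameters used for newly stored passwords.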

URL shorteners can compromise security

Published / by Jared

It’s useful to shorten long URLs, especially when sending them in tweets and text messages. An LWN.net article taught me that shortened URLs can also be a security risk:

URL shorteners such as bit.ly and goo.gl perform a straightforward task: they turn long URLs into short ones, consisting of a domain name followed by a 5-, 6-, or 7-character token. This simple convenience feature turns out to have an unintended consequence. The tokens are so short that the entire set of URLs can be scanned by brute force. The actual, long URLs are thus effectively public and can be discovered by anyone with a little patience and a few machines at her disposal.

Around 7% of the OneDrive folders discovered in this fashion allow writing. This means that anyone who randomly scans bit.ly URLs will find thousands of unlocked OneDrive folders and can modify existing files in them or upload arbitrary content.

— Vitaly Shmatikov
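
Some back-of-the-envelope arithmetic makes the “little patience” claim concrete; the guess rate below is my own deliberately modest assumption, not a number from the article:

    # Size of the token space for a shortener using the 62-character
    # alphabet [a-zA-Z0-9], and how long a single scanner would take at a
    # hypothetical 1,000 requests per second.
    for length in (5, 6, 7):
        tokens = 62 ** length
        machine_days = tokens / 1_000 / 86_400
        print(f"{length}-char tokens: {tokens:,} total, "
              f"~{machine_days:,.0f} machine-days to enumerate")

At that rate the five-character space falls in about ten machine-days and the six-character space in under two machine-years, which is exactly what “a few machines” and “a little patience” buy you.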

KeyCzar: Encryption made easy

Published / by Jared

Encrypting sensitive data at rest (e.g. in a database) is a good idea, but how does one manage the encryption keys, rotate them, or start using a new algorithm down the road without orphaning or migrating the old data? Use KeyCzar:

Cryptography is easy to get wrong. Developers can choose improper cipher modes, use obsolete algorithms, compose primitives in an unsafe manner, or fail to anticipate the need for key rotation. Keyczar abstracts some of these details by choosing safe defaults, automatically tagging outputs with key version information, and providing a simple programming interface.

Keyczar is designed to be open, extensible, and cross-platform compatible. It is not intended to replace existing cryptographic libraries like OpenSSL, PyCrypto, or the Java JCE, and in fact is built on these libraries.

Or learn from what Google did with KeyCzar, and implement the same ideas (key rotation and key version info) using a more modern encryption library, like libsodium.
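
Here’s a minimal sketch of that idea, assuming the PyNaCl binding to libsodium (pip install pynacl); the one-byte version prefix and the in-memory key_ring dict are my own illustrative choices, not anything Keyczar or libsodium prescribes:

    # Keyczar-style key versioning on top of libsodium secret-key encryption.
    import nacl.secret
    import nacl.utils

    key_ring = {
        1: nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE),  # retired key
        2: nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE),  # current key
    }
    CURRENT_VERSION = 2

    def encrypt(plaintext: bytes) -> bytes:
        box = nacl.secret.SecretBox(key_ring[CURRENT_VERSION])
        # Tag the ciphertext with its key version so that data encrypted
        # under an older key stays decryptable after a rotation.
        return bytes([CURRENT_VERSION]) + box.encrypt(plaintext)

    def decrypt(blob: bytes) -> bytes:
        version, ciphertext = blob[0], blob[1:]
        return nacl.secret.SecretBox(key_ring[version]).decrypt(ciphertext)

    blob = encrypt(b"sensitive data at rest")
    assert decrypt(blob) == b"sensitive data at rest"

Rotation then amounts to adding a new key to the ring, bumping CURRENT_VERSION, and re-encrypting old records whenever it’s convenient.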

LWN.net: “Changes in the TLS certificate ecosystem”

Published / by Jared

I was glad to come up to speed with what has been happening with TLS in the last couple of years, and I highly recommend reading these articles.

  • https://lwn.net/Articles/663875/
  • https://lwn.net/Articles/664385/

I learned about HTTP Public Key Pinning, Certificate Transparency, and STARTTLS stripping, among other things.

Here’s one of many good quotes:

The core problem of the TLS certificate system is that there exist hundreds of certificate authorities. And unless extra protection measures are in place, each of those can create valid certificates for any domain. Therefore the whole system is only as strong as the weakest of all certificate authorities.

And as for embedded devices that handle encryption:

We are well aware that crypto appears to be something that needs to be field replaceable, and yet we more or less have no clue how to do that in deployed embedded hardware. Indeed, we seem to have a very poor idea in general on how to maintain the software on field deployed embedded hardware. — Perry Metzger

As I’ve worked with Python, I’ve realized that it’s one thing to implement TLS, and another thing to verify server certificates. The requests library verifies certificates by default, but the standard-library smtplib does not unless you hand it an SSL context that does. It’s still another thing to check for certificate revocation: Python’s standard library doesn’t implement OCSP or CRL checking, and those mechanisms are problematic anyway. Nor does it implement HTTP Public Key Pinning. The state of affairs may not be much better in other programming toolboxes.
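
To make the difference concrete, here is a minimal sketch; mail.example.com is a placeholder host, and the default smtplib behavior described in the comments is what I’ve seen with Python 3, so verify it against your own version:

    # Certificate verification in two common Python HTTP/SMTP paths.
    import smtplib
    import ssl

    import requests  # third-party; verifies against the certifi CA bundle

    # requests verifies the server certificate by default; verify=True just
    # makes that default explicit.
    requests.get("https://example.com", verify=True)

    # smtplib will happily negotiate STARTTLS without verifying the server
    # certificate unless you hand it a context that does.
    context = ssl.create_default_context()  # enables CA and hostname checks
    with smtplib.SMTP("mail.example.com", 587) as smtp:  # placeholder host
        smtp.starttls(context=context)
        smtp.noop()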

So I’d guess that machine-to-machine internet communication is probably more vulnerable to man-in-the-middle attacks than consumer web browsing.

Data security can only be achieved by those empowered

Published / by Jared

Users of online services don’t have the ability (i.e. aren’t empowered) to secure the data stored by those services. Only the engineers and the companies that build the services can do that. So I agree with Cindy Cohn, who says:

…we need to ensure that companies to whom we entrust our data have clear, enforceable obligations to keep it safe from bad guys. This includes those who handle it directly and those who build the tools we use to store or otherwise handle it ourselves.

In my view, business leaders and software engineers have an ethical responsibility to secure their systems and services so that customers’ data and sensitive information doesn’t get misused or abused.

I’d like it if customers had a reliable and consistent way to evaluate the quality and diligence a service applies to keeping their data safe, something like CharityWatch or Consumer Reports for data security.

When your USB devices can be used against you

Published / by Jared

Interesting: “about half of your devices, including chargers, storage, cameras, CD-ROM drives, SD card adapters, keyboards, mice, phones, and so on, are all likely to be proven easily reprogrammable and trivially used to… attack software. Unfortunately, the only current solution on the horizon is to not share any USB devices between computers.” — Dragos Ruiu

Smartphones and Privacy

Published / by Jared

A cautionary note from Ars Technica:

Given how much of what is on smartphones is now automatically backed up to the cloud, anyone should take pause before disrobing before their smartphone camera—regardless of the phone operating system or how that image will be delivered to its intended audience. The security of all of these services is only as secure as the obscurity of the mother’s maiden name of the person you sent that picture to—or of the next zero-day flaw.

I don’t think smartphones belong in bedrooms or bathrooms, but since most people want the convenience of having them there, it may be a good idea to keep the phone in the drawer while changing, or covered while showering, etc.

I think it’s a good idea to assume that whatever one’s smartphone can hear or see, and whatever data it contains, could be made public someday, perhaps sooner than we think. The same is true for any data we store “in the cloud”.