Users, Security and Scams

I read Bruce Schneier’s [Crypto-Gram](http://www.schneier.com/crypto-gram.html) monthly. It’s from there that I found most of these links, with the exception of the ones on social engineering. I found the first paper, on scam victims, especially thought-provoking (although it’s long). The video clip demonstrating social proof was amusing.

*[Understanding scam victims: seven principles for systems security](http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-754.pdf)*

*Summary*: Scammers manipulate people with distraction, deception, herd mentality, greed, time pressure, and by impersonating authority. If something sounds too good to be true, it probably is.

----

*[Social Engineering](http://www.infosectoday.com/Norwich/GI532/Social_Engineering.htm)* [\[2\]](http://www.chips.navy.mil/archives/09_Jan/web_pages/social_engineering.html) [\[3\]](http://packetstormsecurity.nl/docs/social-engineering/aaatalk.html)

*Summary*: Social engineers exploit people’s tendency to trust and to be helpful. They do this with ingratiation, impersonation, diffusion of responsibility, urgency, appeal to conformity (aka “social proof” or herd mentality), intimidation, deception, and authoritative orders.

There’s an entertaining [Candid Camera clip demonstrating social proof](http://www.social-engineer.org/framework/Influence_Tactics:_Consensus_or_Social_Proof).

----

*[The Rational Rejection of Security Advice by Users](http://research.microsoft.com/en-us/um/people/cormac/papers/2009/SoLongAndNoThanks.pdf)*

*Summary*: Security practitioners often dole out advice that users perceive as too time-consuming, so users ignore or reject it. However, “Advice that has compelling cost-benefit tradeoff has real chance of user adoption…. the costs and benefits have to be those the user cares about”. *Time* is one thing users care about.