Papers in Computer Science

Discussion of computer science publications

Conditioned-safe Ceremonies and a User Study of an Application to Web Authentication

Posted by dcoetzee on June 13th, 2010

Citation: Chris Karlof, J. Doug Tygar, David Wagner. Conditioned-safe Ceremonies and a User Study of an Application to Web Authentication. Sixteenth Annual Network and Distributed System Security Symposium (NDSS), 2009. (PDF)

Abstract: We introduce the notion of a conditioned-safe ceremony. A “ceremony” is similar to the conventional notion of a protocol, except that a ceremony explicitly includes human participants. Our formulation of a conditioned-safe ceremony draws on several ideas and lessons learned from the human factors and human reliability community: forcing functions, defense in depth, and the use of human tendencies, such as rule-based decision making. We propose design principles for building conditioned-safe ceremonies and apply these principles to develop a registration ceremony for machine authentication based on email. We evaluated our email registration ceremony with a user study of 200 participants. We designed our study to be as ecologically valid as possible: we employed deception, did not use a laboratory environment, and attempted to create an experience of risk. We simulated attacks against the users and found that email registration was significantly more secure than challenge question based registration. We also found evidence that conditioning helped email registration users resist attacks, but contributed towards making challenge question users more vulnerable.

Discussion: This paper from NDSS 2009 introduces conditioned-safe ceremonies, an informal model for security protocols that explicitly models the actions of users. Rather than conservatively treating users as unpredictable agents capable of any action, it takes advantage of the fact that users are creatures of habit to steer them toward the desired, secure outcome.

Many of us are familiar with the problem of security warnings sometimes known as click fatigue: when a user is asked to dismiss a security warning frequently during normal operation, they begin to disregard it in all situations. This was, for example, the primary criticism of Windows Vista’s User Account Control (UAC) feature. There are two reasons for this: one is that security is not the primary concern of users, who are focused on completing their primary task; the other is that humans asked to perform a process repeatedly will naturally begin to streamline it, omitting optional steps and completing mandatory steps using rapid rule-based processing and simple pattern matching. If a situation called for a particular response in the past, visually similar stimuli will encourage users to perform the same action nearly automatically. In psychology, this kind of decision-making strategy, which settles on a merely adequate solution, is known as satisficing, and the habit is very difficult to reverse.

Unfortunately, this is precisely the type of user behavior exploited by phishers: a typical user presented with a log-in form resembling one they have used many times in the past will thoughtlessly enter their credentials, having long since eliminated the optional step of carefully examining the security indicators that would expose it as a fraudulent replica.

A cryptographic protocol – such as SSL – is usually described in terms of a number of nodes representing participating machines, which exchange messages over channels. A ceremony, a term coined by Intel’s Jesse Walker, extends the concept of a protocol by incorporating nodes for the human users themselves and explicitly representing communication between users and their machines via I/O devices (Carl Ellison. Ceremony Design and Analysis. Cryptology ePrint Archive, Report 2007/399, 2007). This model opens up opportunities for modelling user behavior.
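To make this concrete, here is a minimal sketch (my own, not drawn from Ellison’s paper) of how a ceremony might be represented as a data structure: human participants are nodes just like machines, and messages over human I/O channels are modelled alongside network messages. All names are illustrative.

    # Illustrative sketch: a ceremony is a protocol whose nodes include humans,
    # and whose channels include human I/O alongside network links.
    from dataclasses import dataclass
    from enum import Enum, auto

    class NodeKind(Enum):
        HUMAN = auto()
        MACHINE = auto()

    class Channel(Enum):
        NETWORK = auto()   # machine-to-machine messages (the traditional protocol view)
        HUMAN_IO = auto()  # screen, keyboard, mouse: messages to or from a human

    @dataclass
    class Node:
        name: str
        kind: NodeKind

    @dataclass
    class Message:
        sender: Node
        receiver: Node
        channel: Channel
        payload: str

    # A ceremony is then an ordered list of messages, some of which cross the
    # human/machine boundary and are therefore subject to human error.
    user = Node("user", NodeKind.HUMAN)
    browser = Node("browser", NodeKind.MACHINE)
    site = Node("site", NodeKind.MACHINE)

    ceremony = [
        Message(user, browser, Channel.HUMAN_IO, "type username and submit"),
        Message(browser, site, Channel.NETWORK, "login request"),
        Message(site, user, Channel.HUMAN_IO, "registration email with link"),
    ]

Making the human channels explicit is precisely what lets a designer ask which steps users will actually perform.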

A conditioned-safe ceremony is one designed under the assumption that users will satisfice and behave according to habit; it operates by conditioning the user to follow certain rules during the ceremony. A simple example of conditioning is the Windows log-in screen, which asks the user to press CTRL+ALT+DEL before logging in. This key combination signals to the operating system that the information entered in the log-in dialog should not be made available to applications or keyboard-interception drivers. Because users are always asked to do this, and are not permitted to skip this step, they develop a consistent habit of doing so. The inability of a user to log in without first pressing this key combination is called a forcing function: forcing functions encourage conditioning and discourage omitting steps which may seem unimportant.
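As a loose illustration (my own, not how Windows actually implements its secure attention sequence), a forcing function can be expressed in code by making the protected step impossible to invoke unless the mandatory step has already produced a proof that it happened:

    # Sketch of a forcing function: the credential prompt cannot be reached
    # without a SecureAttention token, and the only way to obtain one is to
    # complete the mandatory step first. The unsafe shortcut simply does not exist.

    class SecureAttention:
        """Opaque proof that the secure attention step was performed."""
        def __init__(self, key):
            if key is not _ISSUER_KEY:
                raise ValueError("obtain this via wait_for_secure_attention()")

    _ISSUER_KEY = object()

    def wait_for_secure_attention() -> SecureAttention:
        # A real system would block until the OS reports CTRL+ALT+DEL;
        # here we merely simulate that step.
        input("Press Enter to simulate CTRL+ALT+DEL... ")
        return SecureAttention(_ISSUER_KEY)

    def prompt_for_credentials(proof: SecureAttention) -> tuple[str, str]:
        # Callers must present the proof object, so skipping the secure step
        # is not an available option rather than merely being discouraged.
        if not isinstance(proof, SecureAttention):
            raise TypeError("secure attention is required before prompting")
        return input("username: "), input("password: ")

    if __name__ == "__main__":
        proof = wait_for_secure_attention()
        username, password = prompt_for_credentials(proof)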

A conditioned-safe ceremony should satisfy several important properties:

  • It should only condition safe rules – “rules that are harmless to apply in the presence of an adversary.”
  • It should condition at least one immunizing rule – “a rule which when applied during an attack causes the attack to fail.”
  • Conditioned rules should be safe to follow under all circumstances, without any complex decision-making.
  • It should not assume users will reliably perform any action that is not conditioned by the ceremony.

There are two different types of errors a user can make during a ceremony which may threaten its security:

  • An error of omission: The user was expected to apply a rule but took no action.
  • An error of commission: The user took an unexpected action not conditioned by the ceremony.

An attacker may attempt to induce either type of error. For example, if an application pops up its own imitation of the Windows log-in box, a user may unthinkingly enter their password without pressing the protective CTRL+ALT+DEL hotkey, because they were not prompted to do so in this instance – an error of omission. An error of commission is usually induced by the attacker giving the user specific instructions, such as “visit this URL in your web browser”. Users tend to be suspicious of unfamiliar instructions, making attacks of this type more difficult. An ideal conditioned-safe ceremony should protect against as many errors of both types as possible.

The paper presents an example conditioned-safe ceremony for machine authentication: the first time a user logs into a site from a particular machine, they must validate their identity. To do this, the site sends them an e-mail containing a link that they must click; once the link is clicked, a cookie is installed and the user has full access to the site from their current machine. The link only works once. The goal of the attacker is to trick the user into handing over the link rather than clicking on it; to accomplish this, the attacker displays a phishing web page giving specific instructions for doing so. This requires both an error of omission (not clicking on a link that they usually click on) and an error of commission (pasting the link into the phishing page, an action they do not normally take). The expectation is that, if users tend to perform the actions they are accustomed to, they will ignore or fail to complete the attacker’s instructions, and the attack will fail.
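To make the mechanics concrete, here is a minimal sketch of the single-use registration link such a ceremony relies on. This is my own illustration, not the paper’s implementation; the function names, URL, and storage are all placeholders.

    # Sketch of email-based machine registration: each token is random,
    # stored server-side, and invalidated on first use; redeeming it yields
    # a machine-authentication cookie for the browser that clicked the link.
    import secrets

    pending_tokens = {}   # token -> username, for links not yet clicked
    issued_cookies = {}   # cookie value -> username, i.e. registered machines

    def send_registration_email(username: str) -> str:
        """Issue a fresh single-use token and return the link to email to the user."""
        token = secrets.token_urlsafe(32)
        pending_tokens[token] = username
        return f"https://example.com/register?token={token}"

    def redeem(token: str):
        """Called when the link is clicked; each token works exactly once."""
        username = pending_tokens.pop(token, None)  # removal enforces single use
        if username is None:
            return None                             # unknown, expired, or already used
        cookie = secrets.token_urlsafe(32)
        issued_cookies[cookie] = username           # browser stores this as its cookie
        return cookie

    # The single-use property limits the damage of a stolen but already-used link:
    link = send_registration_email("alice")
    token = link.split("token=")[1]
    assert redeem(token) is not None   # first click succeeds, machine registered
    assert redeem(token) is None       # any later use of the same link fails

Note that single use on its own does not stop the attack described above – the attacker wants the still-unused link – which is why conditioning the click habit matters.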

Sure enough, the experiment bears this out: although as many as 40% of users fall for the attack described above, an alternative design that does not follow the principles of conditioned-safe ceremonies leads to attack success rates of over 90%. Interviews with the subjects who didn’t fall for the attack show that over half of them didn’t notice the attacker’s special instructions, or thought they were unimportant – the same inattentive attitude that makes security warnings useless now benefits users!

The most exciting thing about this work to me is that it’s one of the first to adopt and exploit a successful model of human behavior in the design of security protocols – a critical step, as humans all too often remain the weakest link in any secure system. Previous efforts have given great insight into the types of errors people make, but not into how designs can work around these limitations in human behavior.

On the other hand, the informal model presented in this work stands in stark contrast to the mathematical models used in cryptography, where cryptographic protocols are routinely subjected to formal verification techniques such as model checking and theorem proving – an important future direction is to generalize these same tools to ceremonies. Moreover, although satisficing is evidently an important component of user behavior, it is obviously not the only such component: 40% of users were persuaded to commit multiple errors in the ceremony. It should come as no surprise that humans are complex creatures who cannot be adequately modelled by a simple set of conditioned rules. What processes underlie these divergent behaviors, and how can they be modelled? Another important question involves the modelling of errors, or divergence of users from the model: can we empirically predict the likelihood of certain sets of errors occurring, and then formally validate that attacks are not possible in the most likely scenarios? The attacks in this work were relatively ad hoc and don’t seem to rule out the possibility of another attack involving only a single user error.

In short, the area of ceremonies is fertile ground for the development of new models that can effectively predict the behavior of the system as a whole, facilitating the development of protocols that will subtly push users towards making all the right security decisions, even when security is the last thing on their mind.

The author releases all rights to all content herein and dedicates this work to the public domain, with the exception of works owned by others, such as abstracts, quotations, and WordPress theme content.

http://www.cs.berkeley.edu/~daw/papers/condsafe-ndss09.pdf

One Response to “Conditioned-safe Ceremonies and a User Study of an Application to Web Authentication”

  1. John Ward Says:

I totally agree with your comment:

    “…..a critical step, as humans all too often remain the weakest link in any secure system…”

We always do the same things to remember our passwords, like birthdays, etc.

    John
