SMARTCHA

(SeMi Automated Reverse Turing test to tell Computer and Human Apart)

Tests for Human-Interaction Proof (HIP), which are supposed to be passed easily by humans but not by computers (automated programs), have become the de facto security countermeasure for many Internet applications. Although many different types of HIPs have been proposed so far, none of them seems to satisfy both security and usability requirements. For instance, many HIP tests that rely on distorted characters can be broken by automated scripts. The common response of web sites whose HIPs were broken is to introduce more complicated versions, which are even more distressing and laborious for users.

Yet another problem is accessibility: visually impaired users cannot pass these tests. Large corporations like Google and Yahoo attempt to solve this problem by introducing audio HIPs. Unfortunately, these audio HIPs have been shown to be too difficult to solve.

One approach that has the potential to solve both the usability and accessibility problems is pure-text HIPs, which contain no graphical elements and can be presented solely as text. Such HIPs can easily be solved by visually impaired users with the help of screen-reader software that reads the relevant part of the screen aloud via synthesized speech.

We generate the HIP tests semi-automatically, i.e., from a large base of diverse questions that are produced manually by human operators recruited through a crowdsourcing service (Mechanical Turk). Although human computation has previously been employed for solving HIP tests, our work is the first to use it for generating, rather than solving, HIP tests.
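
To make the idea concrete, the sketch below shows how such a crowdsourced question base might back test selection and answer checking. It is only an illustration under assumed names: the example questions, the QUESTION_BASE structure, and the normalization rule are hypothetical and not taken from the system described above.

import random
import re

# Hypothetical in-memory question base; in practice these pairs would be
# collected from crowdsourcing workers (e.g., Mechanical Turk) and stored
# in a database.
QUESTION_BASE = [
    {"question": "What color is a ripe banana?", "answers": {"yellow"}},
    {"question": "How many legs does a spider have?", "answers": {"8", "eight"}},
    {"question": "Which is heavier, a feather or a brick?", "answers": {"brick", "a brick"}},
]

def normalize(text: str) -> str:
    """Lower-case and strip punctuation so minor variations still match."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

def pick_test() -> dict:
    """Select a pure-text HIP question at random from the crowdsourced base."""
    return random.choice(QUESTION_BASE)

def check_answer(test: dict, user_answer: str) -> bool:
    """Accept the response if it matches any of the known correct answers."""
    return normalize(user_answer) in {normalize(a) for a in test["answers"]}

if __name__ == "__main__":
    test = pick_test()
    print(test["question"])
    # A correct answer from the known set passes; anything else fails.
    print(check_answer(test, next(iter(test["answers"]))))  # True
    print(check_answer(test, "no idea"))                    # False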

[Workflow diagram: prepare new question-answer pair; request HIP tests using web service API.]
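
As a sketch of the "request HIP tests using web service API" step above, a client application might interact with the service roughly as follows. The base URL, the /test and /verify endpoints, and the JSON field names are assumptions made for illustration; the actual SMARTCHA API is not specified here.

import json
import urllib.request

# Hypothetical service location; replace with the real deployment URL.
API_BASE = "https://smartcha.example.org/api"

def request_hip_test() -> dict:
    """Fetch a pure-text HIP test: a question plus an opaque challenge ID."""
    with urllib.request.urlopen(f"{API_BASE}/test") as resp:
        return json.load(resp)  # e.g. {"id": "abc123", "question": "..."}

def submit_answer(challenge_id: str, answer: str) -> bool:
    """Send the user's answer back for server-side verification."""
    payload = json.dumps({"id": challenge_id, "answer": answer}).encode()
    req = urllib.request.Request(
        f"{API_BASE}/verify",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("passed", False)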