Saturday 23rd July, 2011
I've been thinking about people being tricked into doing things by web pages.

How about a sort of distance metric from user decisions to help filter out things like the idiot Facebook video spam that's been going around?

I warn you: this is all still at an incredibly vague conceptual level.

Let's take a webapp as an example, with a number of widgets that have onClick handlers; and let's say that some lovely lovely person has worked out how to exploit an XSS hole to programmatically simulate a series of user actions - for example, to spam all of the user's friends.

So - how about this: each time an actual UI event occurs (or at least a UI event generated by partially trusted code, such as the kernel mouse driver, or something authorised to send messages between applications [1]), we set a dynamically scoped variable, which nothing can ever increment, to 1.0; and each time an event takes place which alters the user-visible context [2], it gets halved. So a piece of JavaScript that loads a new page, fills in a form and then submits it generates a distance from agency of 0.25 (two halvings), compared to 1.0 if the form submission results directly from the click, and 0.5 if there's one piece of JavaScript in the way.
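Concretely, something like the sketch below - all the names are invented, and it cheats in one important way: page script could never be allowed to hold this variable itself, or the attacker just sets it back to 1.0. The browser (or whatever trusted layer dispatches events) would have to own it.

```javascript
// Minimal sketch of the halving rule. Every name here is made up;
// in reality the browser would have to maintain this value itself,
// since nothing running in the page may ever raise it.

var agency = 0.0; // distance from agency; only trusted events may set it

// Called by the trusted event machinery when a genuine UI event arrives.
function onTrustedUiEvent(handler) {
  agency = 1.0;   // a real click, keypress, etc.: full agency
  try {
    handler();
  } finally {
    agency = 0.0; // outside a user gesture there is no agency at all
  }
}

// Called whenever something alters the user-visible context
// (navigation, DOM changes, acting on the user's behalf).
function contextAltered() {
  agency = agency / 2;
}

// Example: a click handler that loads a page and fills in a form
// before submitting, so the submission carries 1.0 / 2 / 2 = 0.25.
onTrustedUiEvent(function () {
  contextAltered();    // load the new page
  contextAltered();    // fill in the form
  console.log(agency); // 0.25 - the value the submission would carry
});
```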

This metric is then sent back with every HTTP request.
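In browser terms that might look something like this - again only a sketch: the header name X-Agency-Distance is made up, and the agency variable is the one from the sketch above, which a real browser would stamp onto requests itself.

```javascript
// Hypothetical transport: stamp the current metric onto every XHR as a
// request header. Wrapping send() works because open() has already been
// called by the time send() runs, so setRequestHeader() is legal here.
var origSend = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.send = function (body) {
  this.setRequestHeader('X-Agency-Distance', String(agency));
  return origSend.call(this, body);
};
```

A server could then refuse (or at least flag) sensitive actions whose distance from agency falls below some threshold.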

Thoughts? I know this is hopelessly naive, but it may be a starting point worth examining.

[1] I know this is ducking the problem slightly, but it's a thought experiment, dammit!

[2] Obvious examples: anything that causes a reload or relocation of the window; anything that changes the DOM; anything that performs an action on a user's behalf.

posted by Rob Mitchelmore, 14:36 (anchor)