
Every Friday I pick a paper from the ACM Digital Library that is found by the search term +connected +2005 +"mobile device" +"user interface", and write a brief discussion of it. Why? Because it makes me actually read them.

virtual journal club: "Connected Mobile Devices UI"
Sunday, August 29, 2004
Metadata creation system for mobile images 
Link

Risto Sarvas Helsinki Institute for Information Technology (HIIT), HUT, Finland
Erick Herrarte UC Berkeley, Berkeley, CA
Anita Wilhelm UC Berkeley, Berkeley, CA
Marc Davis UC Berkeley, Berkeley, CA

International Conference on Mobile Systems, Applications and Services
Proceedings of the 2nd international conference on Mobile systems, applications, and services
Boston, MA, USA
SESSION: Mobile applications
Pages: 36-48
Year of Publication: 2004
ISBN: 1-58113-793-1

Abstract:
The amount of personal digital media is increasing, and managing it has become a pressing problem. Effective management of media content is not possible without content-related metadata. In this paper we describe a content metadata creation process for images taken with a mobile phone. The design goals were to automate the creation of image content metadata by leveraging automatically available contextual metadata on the mobile phone, to use similarity processing algorithms for reusing shared metadata and images on a remote server, and to interact with the mobile phone user during image capture to confirm and augment the system supplied metadata. We built a prototype system to evaluate the designed metadata creation process. The main findings were that the creation process could be implemented with current technology and it facilitated the creation of semantic metadata at the time of image capture.

My Discussion:
A neat attempt to get a head start on a problem that will confront all of us engineers of mobile consumer devices: how will we deal with the massive amounts of data that users of cameraphones and other recording devices will generate? Here the authors built a system that lets the user attach keywords ("metadata") to a picture immediately after snapping it on a cameraphone, so that the picture can later be organized and retrieved by attributes like where it was taken, who was in it, and who was nearby.

Unfortunately, the system is hampered by very spotty connections to the data network that manages and suggests the annotations, making the experience painful and time-consuming instead of a snappy, immediate part of taking and keeping a picture. And this was in a major urban area in California (Berkeley), on the deployed GPRS data network of a major carrier, AT&T. The paper highlights how far we really are from the ubiquitous wireless cloud of mobile assisting services following us everywhere we go, and how network lag and constrained-UI problems can seriously interfere with design goals, something much of the ad and research copy about the mobile revolution prefers to ignore.
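To make the capture-time flow concrete, here is a rough sketch of the two steps the paper describes: the phone gathers contextual metadata automatically at capture, and the user then confirms or rejects keywords suggested by a remote server. All the names and fields below are my own illustration, not the authors' actual code.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class CapturedImage:
    """An image plus the metadata gathered around capture time."""
    filename: str
    auto_metadata: Dict[str, str]                            # gathered without user input
    user_metadata: List[str] = field(default_factory=list)   # confirmed/added by the user

def capture_with_context(filename: str, cell_id: str, user: str) -> CapturedImage:
    """Attach automatically available contextual metadata at capture time."""
    auto = {
        "timestamp": datetime.now().isoformat(timespec="seconds"),
        "cell_id": cell_id,        # coarse location from the phone network
        "photographer": user,
    }
    return CapturedImage(filename, auto)

def confirm_suggestions(image: CapturedImage, suggestions: List[str],
                        accepted: List[bool]) -> CapturedImage:
    """Keep only the server-suggested keywords the user accepted."""
    image.user_metadata.extend(kw for kw, ok in zip(suggestions, accepted) if ok)
    return image
```

In the paper's system this second step is where the spotty GPRS link hurts: the suggestions come from a remote server doing similarity processing over shared images, so every confirmation round-trip is exposed to network lag.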

