MediaEval is an international multimedia benchmarking initiative offering innovative new tasks to the multimedia community. MediaEval 2011 featured tasks incorporating social media search, analysis of affect and geographical placing of videos.
MediaEval is an international multimedia benchmarking initiative that offers innovative new content analysis, indexing and search tasks to the multimedia community. MediaEval focuses on the social and human aspects of multimedia and strives to emphasize the ‘multi’ in multimedia, including the use of speech, audio, tags, users, and context, as well as visual content. MediaEval seeks to encourage novel and creative approaches to tackling these new and emerging multimedia tasks. Participation in MediaEval tasks is open to any research group that signs up. MediaEval was launched as VideoCLEF, a track at CLEF 2008, and became an independent benchmarking campaign in 2010 with sponsorship by the PetaMedia Network of Excellence.
MediaEval workshop participants
MediaEval 2011 offered six tasks, coordinated in cooperation with various research groups in Europe and elsewhere:
- Placing Task: This task required participants to assign geographical coordinates (latitude and longitude) to each of a provided set of test videos. Participants could make use of metadata, audio and visual features, as well as external resources.
- Spoken Web Search Task: This task involved searching for audio content within audio content using an audio content query. It addressed the challenge of search in multiple, resource-limited languages. The application domain is the Spoken Web, which is being developed for low-literacy communities in the developing world.
- Affect Task: This task required participants to deploy multimodal features to automatically detect portions of movies containing violent material. Violence is defined as “physical violence or accident resulting in human injury or pain”. Any features automatically extracted from the video, including the subtitles, could be used by participants.
- Social Event Detection Task: This task required participants to discover events and detect media items related to either a specific social event or an event-class of interest. Social events of interest were planned by people, attended by people, and the related social media content was captured by people.
- Genre Tagging Task: This task required participants to automatically assign tags to Internet videos using features derived from speech, audio, visual content, or associated textual or social information. In 2011, the task focused on labels that reflect the genre of the video.
- Rich Speech Retrieval Task: This task went beyond conventional spoken content retrieval by requiring participants to deploy spoken content and its context in order to find jump-points in an audiovisual collection of Internet video for a given set of queries.
The 2011 campaign culminated in the MediaEval 2011 workshop, held on 1-2 September at Fossabanda Santa Croce in Pisa, Italy. Reflecting MediaEval’s engagement with different research communities, the workshop was an official satellite event of Interspeech 2011. The workshop brought together the task participants to report on their findings, discuss their approaches and learn from each other. A total of 39 submissions were made to the working notes from around 35 different research sites, and almost 60 participants attended the workshop, representing a twofold increase over the 2010 workshop. In addition to organizer and participant presentations, the workshop included a practitioners’ session in which projects, research sites and industry groups involved with MediaEval-related tasks or technology presented overviews of their work and ideas. The workshop concluded with a meeting of task organizers and other interested researchers, consisting of presentations and discussions of task proposals for MediaEval 2012. The working notes proceedings for the MediaEval 2011 workshop have been published as CEUR Workshop Proceedings.
In addition to PetaMedia, MediaEval 2011 received support from a number of EU and national projects and other organizations including: AXES, OpenSEM, Glocal, WeKnowIt, Chorus+, Quaero, IISSCoS, Technicolor, IBM Research - India and Carnegie Mellon University.
Further details of MediaEval are available from the MediaEval website. Preparations for MediaEval 2012 are now under way, beginning with a questionnaire to the community seeking views on proposed tasks and research questions. Feedback from the questionnaire is used to determine the research agenda for the campaign. If you are interested in receiving the questionnaire, participating in a task, or even coordinating a task as part of MediaEval 2012, please contact Martha Larson.
Links:
MediaEval website: http://www.multimediaeval.org
MediaEval 2011 online proceedings: http://ceur-ws.org/Vol-807/
Please contact:
Martha Larson
Delft University of Technology, The Netherlands
E-mail: