Event:CLEF 2021

From ConfIDent
 
{{Event
|Title=CLEF Conference and Labs of Evaluation Forum
|Ordinal=12
|In Event Series=Event Series:CLEF
|Single Day Event=no
|Start Date=2021/09/21
|End Date=2021/09/24
|Event Status=as scheduled
|Event Mode=online
|Academic Field=Natural Language Processing
|Official Website=http://clef2021.clef-initiative.eu/
|Submission Link=https://www.easychair.org/conferences/?conf=clef2021
|Type=Conference
|Twitter account=@clef_initiative
|City=Bucharest
|Country=Country:RO
|has general chair=Bogdan Ionescu, K. Selcuk Candan
|has program chair=Henning Müller, Lorraine Goeuriot, Birger Larsen
|pageEditor=User:Curator 53
|contributionType=1
}}
 
{{Event Deadline
|Notification Deadline=2021/06/04
|Paper Deadline=2021/05/03
|Camera-Ready Deadline=2021/06/25
|Submission Deadline=2021/05/03
}}
 +
{{Event Metric}}
{{S Event}}
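
The templates above store the event's metadata as semantic properties, so the record can also be read programmatically. The following is a minimal sketch, assuming the wiki exposes the standard Semantic MediaWiki Ask API through its <code>api.php</code> endpoint; the endpoint URL and the queried property names (mirroring the template parameters above) are assumptions, not a confirmed schema.

<syntaxhighlight lang="python">
import requests

# Assumed MediaWiki API entry point of the ConfIDent wiki.
API = "https://www.confident-conference.org/api.php"

# Semantic MediaWiki "ask" query: select this event page and print out
# a few properties; names are assumed to mirror the template parameters.
query = "[[Event:CLEF 2021]]|?Start Date|?End Date|?Event Mode|?Official Website"

resp = requests.get(API, params={"action": "ask", "query": query, "format": "json"})
resp.raise_for_status()

# Standard Ask API layout: query -> results -> {page title: {printouts: ...}}.
for page, record in resp.json()["query"]["results"].items():
    print(page)
    for prop, values in record["printouts"].items():
        print(f"  {prop}: {values}")
</syntaxhighlight>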
===Topics===
Relevant topics for the CLEF 2021 Conference include but are not limited to:

*Information Access in any language or modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
*Analytics for Information Retrieval: theoretical and practical results in the analytics field that are specifically targeted at information access, data analysis, data enrichment, etc.
*User studies, based either on lab studies or on crowdsourcing.
*In-depth analysis of past results/runs, both statistical and fine-grained.
*Evaluation initiatives: conclusions, lessons learned, impact, and projection of any evaluation initiative after completing its cycle.
*Evaluation: methodologies, metrics, statistical and analytical tools, component-based evaluation, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
*Technology transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
*Interactive Information Retrieval evaluation: interactive evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive evaluation methods, simulation of interaction, etc.
*Specific application domains: information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, expert search, health information, legal documents, patents, news, books, plants, etc.
*New data collections: presentation of new data collections with potential high impact on future research, specific collections from companies or labs, multilingual collections.
*Work on data from rare languages, collaborative data, and social data.
