Event:ICMR 2020

From ConfIDent
 
|Academic Field=Machine Learning; Multimedia Retrieval

|Official Website=http://icmr2020.org/

|DOI=10.25798/mat1-3h68

|Type=Conference

|has general chair=Cathal Gurrin, Björn Þór Jónsson, Noriko Kando

Latest revision as of 12:36, 10 October 2023

Deadlines
Submission: 27 Jan 2020
Abstract: 27 Jan 2020
Paper: 27 Jan 2020
Notification: 8 Mar 2020
Camera-Ready: 31 Mar 2020
Metrics
Submitted Papers: 149
Accepted Papers: 19
Accepted Short Papers: 26
Venue

Updated 21 August 2020. ICMR'20 will now be fully virtual.

Topics

  • Content and context based indexing, search and retrieval of multimedia data
  • Multimodal content analysis and retrieval
  • Deep learning for multimedia retrieval
  • Advanced descriptors and similarity metrics for multimedia data
  • Complex multimedia event detection and recounting
  • Learning and relevance feedback in multimedia retrieval
  • Query models, paradigms, and languages for multimedia retrieval
  • Hashing and indexing algorithms for large-scale retrieval
  • Cross-media retrieval
  • Fine-grained visual search
  • Human perception based multimedia retrieval
  • Studies of information-seeking behavior among image/video users
  • Affective/emotional interaction or interfaces for image/video retrieval
  • HCI issues in multimedia retrieval
  • Crowdsourcing in multimedia retrieval
  • Community-based multimedia content management
  • Internet of multimedia things
  • Image/video summarization and visualization
  • Mobile visual search
  • Applications of multimedia retrieval (Healthcare, Fintech, etc.)
  • Evaluation of multimedia retrieval systems