Lars Kotthoff and I have applied to organize the 1st Interdisciplinary Workshop on Algorithm Selection and Meta-Learning in Information Retrieval (AMIR) at the 41st European Conference on Information Retrieval (ECIR). Let’s cross our fingers and hope it gets accepted.

In the following, you will find the proposal (also available as a PDF on ResearchGate).

@InProceedings{BeelKotthoff2018,
author = {Beel, Joeran and Kotthoff, Lars},
title = {Proposal for the 1st Interdisciplinary Workshop on Algorithm Selection and Meta-Learning in Information Retrieval (AMIR)},
booktitle = {ResearchGate Repository},
year = {2018},
pages = {1–6},
doi = {10.13140/RG.2.2.14548.65922},
url = {https://www.researchgate.net/publication/328965675_Proposal_for_the_1st_Interdisciplinary_Workshop_on_Algorithm_Selection_and_Meta-Learning_in_Information_Retrieval_AMIR},
abstract = {The algorithm selection problem describes the challenge of identifying the best algorithm for a given problem space. In many domains, particularly artificial intelligence, the algorithm selection problem is well studied, and various approaches and tools exist to tackle it in practice. Meta-learning in particular has enabled impressive performance improvements. The information retrieval (IR) community, however, has paid little attention to the algorithm selection problem, although the problem is highly relevant to information retrieval. This workshop will bring together researchers from the fields of algorithm selection and meta-learning as well as information retrieval. We aim to raise awareness in the IR community of the algorithm selection problem; identify the potential for automatic algorithm selection in information retrieval; and explore possible solutions for this context. In particular, we will explore to what extent existing solutions to the algorithm selection problem from other domains can be applied in information retrieval, and also how techniques from IR can be used for automated algorithm selection and meta-learning.},
}

Abstract

The algorithm selection problem describes the challenge of identifying the best algorithm for a given problem space. In many domains, particularly artificial intelligence, the algorithm selection problem is well studied, and various approaches and tools exist to tackle it in practice. Meta-learning in particular has enabled impressive performance improvements. The information retrieval (IR) community, however, has paid little attention to the algorithm selection problem, although the problem is highly relevant to information retrieval. This workshop will bring together researchers from the fields of algorithm selection and meta-learning as well as information retrieval. We aim to raise awareness in the IR community of the algorithm selection problem; identify the potential for automatic algorithm selection in information retrieval; and explore possible solutions for this context. In particular, we will explore to what extent existing solutions to the algorithm selection problem from other domains can be applied in information retrieval, and also how techniques from IR can be used for automated algorithm selection and meta-learning.

1. Introduction / Motivation

There is a plethora of algorithms for information retrieval applications such as search engines and recommender systems. For recommending research papers alone, there are about 100 approaches [1]. The question that researchers and practitioners alike face is which of these approaches to choose for their particular problem. This choice is difficult even for experts, and it is compounded by ongoing research that develops ever more approaches.

The challenge of identifying the best algorithm for a given application is not new. The so-called “algorithm selection problem” was first mentioned in the 1970s [2] and has attracted significant attention in various disciplines since then, especially in the last decade. Particularly in artificial intelligence, impressive performance achievements have been enabled by algorithm selection systems. A prominent example is the award-winning SATzilla system [3]. More generally, algorithm selection is an example of meta-learning, where the experience gained from solving problems informs how to solve future problems.
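To make the meta-learning view of algorithm selection concrete, here is a minimal sketch in Python with scikit-learn. It does not reproduce SATzilla; the meta-features, candidate algorithms, and training data are all synthetic and hypothetical. The point it illustrates is that a per-instance selector is simply a classifier trained on meta-level data: instance meta-features as input, the best-performing algorithm as the label.

# Illustrative sketch (not SATzilla itself): per-instance algorithm selection
# cast as a supervised meta-learning problem. All data below is synthetic and
# the meta-features are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: 500 past problem instances, 6 meta-features each
# (e.g. size, sparsity, feature statistics), and the index of the best of
# 3 candidate algorithms as determined by earlier benchmark runs.
meta_features = rng.normal(size=(500, 6))
best_algorithm = rng.integers(0, 3, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    meta_features, best_algorithm, test_size=0.2, random_state=0)

# The "selector" is an ordinary classifier trained on meta-level data.
selector = RandomForestClassifier(n_estimators=200, random_state=0)
selector.fit(X_train, y_train)

# For a new, unseen instance, compute its meta-features and let the selector
# predict which algorithm to run.
new_instance = rng.normal(size=(1, 6))
chosen = selector.predict(new_instance)[0]
print(f"Selected algorithm index for the new instance: {chosen}")

In a real system, the quality of such a selector hinges on informative meta-features and on reliable performance data from past runs, which is exactly the kind of experience that meta-learning seeks to exploit.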

Meta-learning and the automation of modelling processes have gained significant traction in the machine learning community, in particular through so-called AutoML approaches that aim to automate the entire machine learning and data mining process, from ingesting the data to making predictions. An example of such a system is Auto-WEKA [4]. There have been multiple competitions [5,6] as well as workshops, symposia, and tutorials [7–11], including a Dagstuhl seminar [8]. The OpenML platform was developed to facilitate the exchange of data and machine learning models and thereby enable research into meta-learning [12].
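The core idea behind such AutoML systems is a joint search over algorithms and their hyperparameters. The sketch below illustrates this with plain random search over a tiny scikit-learn search space on a toy dataset; it is only an assumption-laden approximation of the idea, not how Auto-WEKA (which applies Bayesian optimization to WEKA learners) is actually implemented.

# Minimal AutoML-style sketch: random search over a combined space of
# algorithms and hyperparameters, evaluated by cross-validation.
import random
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
random.seed(0)

# Combined search space: (algorithm name, hyperparameter sampler) pairs.
search_space = [
    ("svm", lambda: SVC(C=10 ** random.uniform(-2, 2), gamma="scale")),
    ("knn", lambda: KNeighborsClassifier(n_neighbors=random.randint(1, 15))),
    ("logreg", lambda: LogisticRegression(C=10 ** random.uniform(-2, 2), max_iter=2000)),
]

best_score, best_config = -1.0, None
for _ in range(20):  # small random-search budget for illustration
    name, sample = random.choice(search_space)
    model = sample()
    score = cross_val_score(model, X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, (name, model.get_params())

print(f"Best configuration: {best_config[0]} with CV accuracy {best_score:.3f}")

Real AutoML systems replace the random sampling with smarter search strategies (e.g. Bayesian optimization) and much larger search spaces, but the objective is the same: return the best algorithm and configuration found within a given budget.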

Despite the significance of the algorithm selection problem and notable advances in solving it in many domains, the information retrieval community has paid little attention to it. A few papers investigate the algorithm selection problem in the context of information retrieval, for example in the field of recommender systems [13–22]. However, the number of researchers interested in this topic is limited, and the results so far have not been as impressive as in other domains.

There is potential for applying IR techniques in meta-learning as well. The algorithm selection problem can be seen as a traditional information retrieval task, i.e. the task of identifying the most relevant item (an algorithm) from a large corpus (thousands of potential algorithms and parameter settings) for a given information need (e.g. classifying photos or making recommendations). We see great potential for the information retrieval community to contribute to solving the algorithm selection problem.
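To illustrate this framing, the following hedged sketch treats previously seen datasets as the corpus and a new problem as the query: each past dataset is described by a meta-feature vector and annotated with its best-known algorithm, and the recommendation is a majority vote over the retrieved nearest neighbours. All names and data below are synthetic assumptions, not an existing system.

# IR-flavoured algorithm selection sketch: retrieve similar past datasets and
# recommend the algorithm that worked best for them. Synthetic data throughout.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)

# Hypothetical corpus: 200 previously seen datasets, each described by
# 8 meta-features and labelled with its best-performing algorithm.
corpus_meta_features = rng.normal(size=(200, 8))
best_algorithm = rng.choice(
    ["bm25", "collaborative_filtering", "content_based"], size=200)

index = NearestNeighbors(n_neighbors=5).fit(corpus_meta_features)

# "Query": meta-features of the new IR problem at hand.
query = rng.normal(size=(1, 8))
_, neighbour_ids = index.kneighbors(query)

# Majority vote over the retrieved neighbours' best algorithms.
votes = best_algorithm[neighbour_ids[0]]
recommended = max(set(votes), key=list(votes).count)
print(f"Recommended algorithm for the new problem: {recommended}")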

2. Objectives, Outcomes & Vision for the Workshop

We propose the 1st Interdisciplinary Workshop on Algorithm Selection and Meta-Learning in Information Retrieval (AMIR). With this workshop, we aim to achieve the following goals:

  1. Raise awareness in the information retrieval community of the algorithm selection problem.
  2. Identify the potential for automated algorithm selection and meta-learning in IR applications.
  3. Familiarize the IR community with algorithm selection and meta-learning tools and research published in related disciplines such as machine learning.
  4. Find solutions to address and solve the algorithm selection problem in IR.

The expected outcome is a volume of workshop proceedings, which we will publish at http://www.ceur-ws.org/. Our vision is to establish a regular workshop at ECIR or related venues (e.g. SIGIR, UMAP, RecSys) and, in the long run, to solve the algorithm selection problem in information retrieval. We hope that the presentations and discussions at the workshop will stimulate collaborations between researchers in IR and meta-learning, ultimately leading to joint publications and research proposals.

Screenshot of the homepage of the 1st Interdisciplinary Workshop on Algorithm Selection and Meta-Learning in Information Retrieval (AMIR)

3. Topical Outline

We will explore a) how existing solutions for algorithm selection and meta-learning can be applied to identify the best algorithm for a given information retrieval problem and b) how information retrieval techniques may be applied to solve the algorithm selection problem in IR and other domains.

More precisely, topics relevant to the workshop are:

  • Automated Machine Learning
  • Algorithm Selection
  • Algorithm Configuration
  • Meta-Learning
  • Hyper-Parameter Optimization
  • Evaluation Methods and Metrics
  • Benchmarking
  • Meta-Heuristics
  • Learning to Learn
  • Recommender Systems for Algorithms
  • Algorithm Selection as User Modeling Task
  • Search Engines for Algorithms
  • Neural Network Search

4. Planned Format & Structure

We envision a half-day workshop (3 hours + breaks) with the following submission types.

  • Research Papers, Position Papers, Case Studies (6 or 12 pages, LNCS format) with 5-15 minute presentations at the workshop
  • Posters, Demonstrations, Nectar, Datasets (4 pages, LNCS format) with a 1-2 minute teaser presentation and a poster session.

The tentative schedule is as follows.

  • 8:30 Poster setup
  • 9:00 Welcome and Introduction by the Organizers
  • 9:10 Keynote Talk
  • 9:45 Poster Pitches (1-2 minutes per poster)
  • 10:00 Coffee Break and Poster Session
  • 10:30 Paper Presentations (ideally 1-2 full papers and 3-4 short papers)
  • 12:00 Lunch, Outlook, and Discussion

Depending on the number of submissions, part of the paper session may be replaced by a panel discussion with experts from IR and machine learning to discuss the algorithm selection problem in the context of information retrieval in depth.

5. Expected Audience and Attendees

Workshops on the algorithm selection problem in the field of machine learning have attracted significant attention. For instance, the Meta-Learning and Algorithm Selection Workshop at ECMLPKDD in 2015 resulted in 15 publications [9]. The Workshop on Meta-Learning (MetaLearn 2017) at NIPS resulted in 29 publications, including posters [10]. Given that the algorithm selection problem is less well known in the IR community, we expect around 5 publications (short and full papers) plus a few posters.

We are confident that we will receive a significant number of manuscripts, and we also expect a high number of attendees, as the algorithm selection problem is relevant to everyone in information retrieval, particularly to everyone who wants to deploy a real-world system. It is an easy-to-understand problem that every researcher has faced first-hand, and it has already attracted a lot of attention in other communities.

6. Organizers

6.1 Joeran Beel

Joeran Beel is Assistant Professor in Intelligent Systems at the School of Computer Science and Statistics at Trinity College Dublin. He is also affiliated with the ADAPT Centre, an interdisciplinary research centre that closely cooperates with industry partners including Google, Deutsche Bank, Huawei, and Novartis. Joeran is further a Visiting Professor at the National Institute of Informatics in Tokyo. His research focuses on information retrieval, recommender systems, user modeling, and machine learning. He has developed novel algorithms in these fields and conducted research on how to evaluate information retrieval systems. Joeran also has industry experience: as a product manager and as the founder of three business start-ups, he experienced the algorithm selection problem first-hand. He is serving as general co-chair of the 26th Irish Conference on Artificial Intelligence and Cognitive Science and has served on program committees for major information retrieval venues and journals, including SIGIR, ECIR, UMAP, RecSys, and ACM TOIS.

6.2 Lars Kotthoff

Lars Kotthoff is Assistant Professor at the University of Wyoming. He leads the Meta-Algorithmics, Learning and Large-scale Empirical Testing (MALLET) lab and has acquired more than $400K in external funding to date. Lars is also the PI for the Artificially Intelligent Manufacturing center (AIM) at the University of Wyoming. He co-organized multiple workshops on meta-learning and automatic machine learning (e.g. [9]) and the Algorithm Selection Competition Series [5]. He was workshop and masterclass chair at the CPAIOR 2014 conference and organized the ACP summer school on constraint programming in 2018. His research combines artificial intelligence and machine learning to build robust systems with state-of-the-art performance. Lars’ more than 60 publications have garnered ~800 citations and his research has been supported by funding agencies and industry in various countries.

References

[1]      J. Beel, B. Gipp, S. Langer, and C. Breitinger, “Research Paper Recommender Systems: A Literature Survey,” International Journal on Digital Libraries, vol. 17, 2016, pp. 305–338.

[2]      J.R. Rice, “The Algorithm Selection Problem,” Advances in Computers, vol. 15, 1976, pp. 65–118.

[3]      L. Xu, F. Hutter, H.H. Hoos, and K. Leyton-Brown, “SATzilla: portfolio-based algorithm selection for SAT,” Journal of artificial intelligence research, vol. 32, 2008, pp. 565–606.

[4]      L. Kotthoff, C. Thornton, H.H. Hoos, F. Hutter, and K. Leyton-Brown, “Auto-WEKA 2.0: Automatic model selection and hyperparameter optimization in WEKA,” The Journal of Machine Learning Research, vol. 18, 2017, pp. 826–830.

[5]      M. Lindauer, J.N. van Rijn, and L. Kotthoff, “The Algorithm Selection Competition Series 2015-17,” arXiv preprint arXiv:1805.01214, 2018.

[6]      W.-W. Tu, “The 3rd AutoML Challenge: AutoML for Lifelong Machine Learning,” NIPS 2018 Challenge, 2018.

[7]      P. Brazdil, “Metalearning & Algorithm Selection,” 21st European Conference on Artificial Intelligence (ECAI), 2014.

[8]      H.H. Hoos, F. Neumann, and H. Trautmann, “Automated Algorithm Selection and Configuration,” Report from Dagstuhl Seminar 16412, vol. 6, 2016.

[9]      J. Vanschoren, P. Brazdil, C. Giraud-Carrier, and L. Kotthoff, “Meta-Learning and Algorithm Selection Workshop at ECMLPKDD,” CEUR Workshop Proceedings, 2015.

[10]    R. Calandra, F. Hutter, H. Larochelle, and S. Levine, “Workshop on Meta-Learning (MetaLearn 2017) @NIPS,” http://metalearning.ml, 2017.

[11]    R. Miikkulainen, Q. Le, K. Stanley, and C. Fernando, “Metalearning Symposium @NIPS,” http://metalearning-symposium.ml, 2017.

[12]    J. Vanschoren, J.N. Van Rijn, B. Bischl, and L. Torgo, “OpenML: networked science in machine learning,” ACM SIGKDD Explorations Newsletter, vol. 15, 2014, pp. 49–60.

[13]    M. Ahsan and L. Ngo-Ye, “A Conceptual Model of Recommender System for Algorithm Selection,” AMCIS 2005 Proceedings, 2005, p. 122.

[14]    J. Beel, “A Macro/Micro Recommender System for Recommendation Algorithms [Proposal],” ResearchGate https://www.researchgate.net/publication/322138236_A_MacroMicro_Recommender_System_for_Recommendation_Algorithms_Proposal, 2017.

[15]    A. Collins, D. Tkaczyk, and J. Beel, “One-at-a-time: A Meta-Learning Recommender-System for Recommendation-Algorithm Selection on Micro Level,” 26th Irish Conference on Artificial Intelligence and Cognitive Science, 2018.

[16]    T. Cunha, C. Soares, and A.C. de Carvalho, “Metalearning and Recommender Systems: A literature review and empirical study on the algorithm selection problem for Collaborative Filtering,” Information Sciences, vol. 423, 2018, pp. 128–144.

[17]    T. Cunha, C. Soares, and A.C. de Carvalho, “CF4CF: Recommending Collaborative Filtering algorithms using Collaborative Filtering,” arXiv preprint arXiv:1803.02250, 2018.

[18]    T. Cunha, C. Soares, and A.C. de Carvalho, “Selecting Collaborative Filtering algorithms using Metalearning,” Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Springer, 2016, pp. 393–409.

[19]    P. Matuszyk and M. Spiliopoulou, “Predicting the performance of collaborative filtering algorithms,” Proceedings of the 4th International Conference on Web Intelligence, Mining and Semantics (WIMS14), ACM, 2014, p. 38.

[20]    M. Mısır and M. Sebag, “ALORS: An algorithm recommender system,” Artificial Intelligence, vol. 244, 2017, pp. 291–314.

[21]    C. Romero, J.L. Olmo, and S. Ventura, “A meta-learning approach for recommending a subset of white-box classification algorithms for Moodle datasets,” Educational Data Mining 2013, 2013.

[22]    M. Vartak, A. Thiagarajan, C. Miranda, J. Bratman, and H. Larochelle, “A Meta-Learning Perspective on Cold-Start Recommendations for Items,” Advances in Neural Information Processing Systems, 2017, pp. 6907–6917.


Joeran Beel

Please visit https://isg.beel.org/people/joeran-beel/ for more details about me.
