The organizing committee of COPA 2020 is strongly committed to holding the conference on the planned dates. It will be prepared for full on-site presence in Verona, with the possibility of online presentations for authors who are unable to travel. If restrictions do not allow for an on-site conference, it will also be prepared to run the conference in fully online mode, with virtual session rooms and facilities for online presentation of talks, so that attendees have an experience very close to that of the real conference. Every effort will also be made to publish the conference proceedings in the Proceedings of Machine Learning Research (PMLR) at the earliest possible time.
- Abstract submission: April 27th, 2020
- Paper deadline: May 4th, 2020
- Author notification: June 20th, 2020
- Camera-ready: July 10th, 2020
Prof. Vladimir Vapnik, Facebook AI Research.
Prof. Marco Campi, University of Brescia.
Professor Vladimir Vapnik
Facebook AI Research
Complete Statistical Theory of Learning: Learning Using Statistical Invariants
(co-authored with Dr Rauf Izmailov)
The statistical theory of learning is devoted to methods for which the obtained approximations converge to the desired function as the number of observations increases.
The theory uses instruments that provide convergence in the L2 norm of the space of functions, i.e., the so-called strong mode of convergence. However, in Hilbert space, along with convergence in the space of functions, there also exists the so-called weak mode of convergence, i.e., convergence in the space of functionals.
The weak mode of convergence (under some conditions) also leads to the convergence of approximations to the desired function in the L2 metric, although it uses different mechanisms to achieve that convergence.
This talk discusses new learning methods which use both mechanisms (weak and strong) of convergence simultaneously. Such methods allow one
(1) to find the admissible set of functions (i.e., the collection of appropriate approximating functions) and
(2) to find the desired approximation in this set. Since only two modes of convergence exist in Hilbert space, we call the theory that uses both of them the complete statistical theory of learning.
Along with general reasoning, the talk presents algorithms which constitute a new learning paradigm called Learning Using Statistical Invariants (LUSI). LUSI algorithms were developed for sets of functions belonging to a Reproducing Kernel Hilbert Space (RKHS), and include a modified SVM method (LUSI-SVM). The talk also presents a LUSI modification of neural network methods (LUSI-NN). LUSI methods require fewer training examples for learning.
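As a loose illustration of the invariant idea only (not the LUSI-SVM algorithm itself, which operates in an RKHS), one can constrain a simple ridge-regression predictor so that its predictions exactly preserve one statistical invariant of the data, here the empirical mean of the labels. All function and variable names below are hypothetical:

```python
import numpy as np

def ridge_with_invariant(X, y, lam=1.0):
    """Ridge regression whose predictions preserve one statistical
    invariant: the empirical mean of the predictions equals the
    empirical mean of the labels (toy predicate v(x) = 1).

    Solved via the KKT system of the equality-constrained
    least-squares problem. Illustrative sketch only."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # add intercept column
    A = Xb.T @ Xb + lam * np.eye(d + 1)    # ridge normal matrix
    b = Xb.T @ y
    c = Xb.mean(axis=0)                    # constraint: c @ w = mean(y)
    # KKT system: [[A, c], [c^T, 0]] [w; mu] = [b; mean(y)]
    K = np.zeros((d + 2, d + 2))
    K[:d + 1, :d + 1] = A
    K[:d + 1, d + 1] = c
    K[d + 1, :d + 1] = c
    sol = np.linalg.solve(K, np.append(b, y.mean()))
    return sol[:d + 1]                     # weights incl. intercept

# Synthetic usage: the invariant holds exactly after fitting.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 3.0 + rng.normal(scale=0.1, size=200)
w = ridge_with_invariant(X, y)
preds = np.hstack([X, np.ones((200, 1))]) @ w
```

The constraint is enforced exactly (a "weak-mode" condition on a functional of the predictor), while the ridge objective drives the usual strong-mode fit.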
In conclusion, the talk discusses the general (philosophical) framework of the new learning paradigm, including the concept of intelligence.
Prof. Vapnik is one of the main developers of the Vapnik–Chervonenkis theory of statistical learning, the co-inventor of the support-vector machine method, and the co-inventor of the support-vector clustering algorithm.
Vapnik was inducted into the U.S. National Academy of Engineering in 2006. He received the 2005 Gabor Award, the 2008 Paris Kanellakis Award, the 2010 Neural Networks Pioneer Award, the 2012 IEEE Frank Rosenblatt Award, the 2012 Benjamin Franklin Medal in Computer and Cognitive Science from the Franklin Institute, the 2013 C&C Prize from the NEC C&C Foundation, the 2014 Kampé de Fériet Award, and the 2017 IEEE John von Neumann Medal. In 2018, he received the Kolmogorov Medal from the University of London and delivered the Kolmogorov Lecture.
Professor Marco Campi
Department of Information Engineering, University of Brescia, Italy
Risk and Complexity in Prediction Problems
(joint work with Simone Garatti)
We show that risk (the probability with which a shortfall occurs) and complexity (the number of relevant data points needed to determine a solution) are strictly connected concepts in a general data-driven decision-making framework.
When these concepts are applied to prediction, they lead to extremely tight evaluations of the distribution of the probability of prediction error.
These results are relevant to all approaches in prediction theory that contain a compression scheme, including SVM (Support Vector Machine), SVR (Support Vector Regression), SVDD (Support Vector Data Description), and others.
Marco C. Campi has taught topics related to data-driven and inductive methods for many years. He is a Distinguished Lecturer of the IEEE Control Systems Society and has chaired the IFAC Technical Committee on Modeling, Identification and Signal Processing for the past six years.
Professor Campi has held visiting and teaching appointments at several institutions and has served in various capacities on the editorial boards of Automatica, Systems and Control Letters, and the European Journal of Control. He is a Fellow of IEEE, a Fellow of IFAC, and a recipient of the IEEE CSS George S. Axelby Outstanding Paper Award. He has delivered plenary addresses at major conferences, including Optimization, CDC, MTNS, and SYSID.
The main purpose of conformal prediction is to complement predictions delivered by various machine-learning algorithms with provably valid measures of their accuracy and reliability, under the assumption that the observations are independent and identically distributed. It was originally developed in the late 1990s and early 2000s, but it has gained popularity and been further developed in important directions in recent years.
Conformal prediction is a universal tool in several senses; in particular, it can be used in combination with any known machine-learning algorithm, such as SVM, Neural Networks, Ridge Regression, etc. It has been applied to a variety of problems from diagnostics of depression to drug discovery to the behaviour of bots.
A sister method, Venn prediction, was developed at the same time as conformal prediction and is used for probabilistic prediction in the context of classification. Among recent developments are adaptations of conformal and Venn predictors to probabilistic prediction in the context of regression.
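To make the validity guarantee concrete, the following is a minimal sketch of split (inductive) conformal prediction for regression, assuming an ordinary least-squares base predictor and absolute residuals as nonconformity scores; the function and dataset names are hypothetical:

```python
import numpy as np

def split_conformal_interval(x_train, y_train, x_cal, y_cal, x_new, alpha=0.1):
    """Split (inductive) conformal prediction for 1-D regression.

    Under the i.i.d. (exchangeability) assumption, the returned
    interval contains the true label of x_new with probability
    at least 1 - alpha."""
    # Fit the underlying model on the proper training set.
    A = np.vstack([x_train, np.ones_like(x_train)]).T
    slope, intercept = np.linalg.lstsq(A, y_train, rcond=None)[0]
    predict = lambda x: slope * x + intercept
    # Nonconformity scores on the held-out calibration set.
    scores = np.abs(y_cal - predict(x_cal))
    n = len(scores)
    # Finite-sample-corrected quantile of the calibration scores.
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    return predict(x_new) - q, predict(x_new) + q

# Synthetic usage: empirical coverage should be close to 1 - alpha.
rng = np.random.default_rng(1)
def make(n):
    x = rng.uniform(-1, 1, n)
    return x, 2.0 * x + rng.normal(scale=0.3, size=n)
x_tr, y_tr = make(500)
x_cal, y_cal = make(500)
x_te, y_te = make(1000)
lo, hi = split_conformal_interval(x_tr, y_tr, x_cal, y_cal, x_te, alpha=0.1)
coverage = np.mean((lo <= y_te) & (y_te <= hi))
```

The base predictor here is interchangeable: any of the algorithms mentioned above (SVM, neural networks, ridge regression) could replace the least-squares fit without affecting the validity guarantee.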
The COPA series of workshops is a home for work in both conformal and Venn prediction, as reflected in its full name “Conformal and Probabilistic Prediction with Applications”. The aim of this symposium is to serve as a forum for the presentation of new and ongoing work and the exchange of ideas between researchers on any aspect of Conformal and Probabilistic Prediction and their applications to interesting problems in any field.
Topics of the symposium include, but are not limited to:
Authors are invited to submit original, English-language research contributions or experience reports. Papers should be no longer than 20 pages formatted according to the well-known JMLR (Journal of Machine Learning Research) style. The LaTeX package for the style is available here. All aspects of the submission and notification process will be handled online via the EasyChair Conference System at:
There will be two Alexey Chervonenkis awards for the Best Paper and Best Student Paper, presented at the conference.
Researchers interested in Conformal Prediction may be interested in joining our online discussion group. Future announcements and related materials will be published regularly.