12.03.2020, 17:00, Pere Pardo, Towards a Tractable Epistemic Logic. University of Milan
22.04.2020, 17:00, Ofer Arieli, Sequent-based deductive argumentation. Academic College of Tel-Aviv
Sequent-based argumentation is a general approach to reasoning with deductive (logical) argumentation, motivated by proof-theoretic considerations. In this talk we review some of the principles of this approach and describe several results concerning its characteristics. In particular, we present completeness results in terms of dynamic proofs, show their relations to reasoning with maximal consistency, and study the satisfaction of common rationality postulates in several sequent-based settings. (This is joint work with Christian Strasser and, in some parts, also with Anne Marie Borg.)
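The abstract's core notion, an argument as a sequent whose premises are drawn from a (possibly inconsistent) base and classically entail its claim, can be illustrated in miniature. The following sketch is not from the talk: the base, the claim and the encoding of formulas as Python predicates are illustrative assumptions, and real sequent-based systems additionally define attack rules between arguments, which this toy omits.

```python
from itertools import combinations, product

ATOMS = ("p", "q")

def entails(premises, conclusion):
    # Classical entailment by exhaustive truth-table check.
    for values in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, values))
        if all(f(v) for f in premises) and not conclusion(v):
            return False
    return True

def arguments(base, claim):
    """All supports (subsets of the base) whose formulas entail the claim."""
    found = []
    names = sorted(base)
    for r in range(1, len(names) + 1):
        for support in combinations(names, r):
            if entails([base[n] for n in support], claim):
                found.append(support)
    return found

# An inconsistent base: p, p->q, and ~q.
base = {
    "p": lambda v: v["p"],
    "p->q": lambda v: (not v["p"]) or v["q"],
    "~q": lambda v: not v["q"],
}

# Supports for the claim q; the consistent support ('p', 'p->q') appears,
# as does the full (inconsistent, hence trivially entailing) base.
print(arguments(base, lambda v: v["q"]))
```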
07.05.2020, 17:00, Jonathan Lawry, Vagueness and Probability: Borderline Cases and Blurred Boundaries. University of Bristol
This talk will consider the relationship between various aspects of vagueness and probability. Along the way we will consider the best ways of modelling borderline cases of vague propositions, and what types of measures naturally arise when probabilities are defined over such models. Adopting this approach, we will outline a probabilistic semantics for fuzzy logic and explore relationships with Dempster-Shafer theory. Finally, we discuss the role of vagueness in consensus formation and show some results from multi-agent systems and swarm robotics.
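A minimal sketch, not from the talk, of one way probabilities over models with borderline cases induce interval-valued measures: give a vague proposition three statuses at each precise interpretation (true, borderline, false), put a probability on those statuses, and read off a lower measure ("definitely true") and an upper measure ("true or borderline"). The proposition and the numbers below are illustrative assumptions.

```python
# Probability mass over the three statuses of a vague proposition.
tall = {"true": 0.5, "borderline": 0.3, "false": 0.2}

lower = tall["true"]                        # belief-like lower measure
upper = tall["true"] + tall["borderline"]   # plausibility-like upper measure

# Duality echoing Dempster-Shafer belief/plausibility pairs:
# upper("not tall") = 1 - lower("tall").
upper_not = tall["false"] + tall["borderline"]
assert abs(upper_not - (1 - lower)) < 1e-9

print(lower, upper)
```

The gap between `lower` and `upper` is exactly the probability of the borderline status, which is why such models naturally yield interval-valued rather than point-valued measures.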
14.05.2020, 17:00, Marcelo Finger, Logic and Numbers — Combining logical reasoning with probabilities and counting. University of São Paulo
We present a research program which investigates the intersection of deductive reasoning with explicit quantitative capabilities. These capabilities encompass probabilistic reasoning, counting and counting quantifiers, and similar systems. The need for a combined reasoning system that enables a unified way of reasoning with quantities has always been recognized in modern logic: proposals for probabilistic logical reasoning have been present since the work of Boole [1854]. Equally ubiquitous is the need to deal with cardinality restrictions on finite sets. More recently, a well-founded probabilistic theory has been developed for non-classical settings as well, such as probabilistic reasoning over Łukasiewicz infinitely-valued logic.
We show that there is a common way to deal with these several deductive quantitative capabilities, involving a framework based on linear algebra and linear programming. The distinction between classical and non-classical reasoning on the one hand, and between probabilistic and cardinality reasoning on the other, comes from the different families of algebras employed. The quantitative logic systems also allow for the introduction of inconsistency measures, which quantify the degree of inconsistency of a given quantitative logic theory, following some basic principles of inconsistency measurement.
On the computational level, we aim to explore quantitative logic systems in which the complexity of reasoning is “only NP-complete”. We provide open-source implementations of solvers operating over those systems and study some notable empirical properties, such as the presence of a phase transition. After a general introduction to the research program, we will detail the classical probabilistic reasoning engine.
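The reduction of probabilistic reasoning to linear algebra that the abstract describes is visible already in Boole's original problem. The following toy computation, with illustrative numbers not taken from the talk: given P(a) and P(b), the coherence of a candidate value for P(a ∧ b) reduces to nonnegativity of the induced distribution over the four possible worlds (ab, a¬b, ¬ab, ¬a¬b), since the linear constraints determine that distribution uniquely.

```python
def coherent_conjunction_bounds(p_a, p_b):
    """Interval of coherent values for P(a & b), given P(a) and P(b)."""
    lo = max(0.0, p_a + p_b - 1.0)   # Frechet lower bound
    hi = min(p_a, p_b)               # Frechet upper bound
    return lo, hi

def is_coherent(p_a, p_b, x, eps=1e-9):
    """Feasibility check: solve the linear constraints for the four
    world probabilities and test nonnegativity."""
    worlds = [x,                       # P(a and b)
              p_a - x,                 # P(a and not b)
              p_b - x,                 # P(not a and b)
              1.0 - p_a - p_b + x]     # P(not a and not b)
    return all(w >= -eps for w in worlds)

lo, hi = coherent_conjunction_bounds(0.6, 0.7)
print(lo, hi)                          # approximately 0.3 and 0.6
assert is_coherent(0.6, 0.7, 0.4)
assert not is_coherent(0.6, 0.7, 0.2)
```

With more propositions the worlds can no longer be solved for by hand, and feasibility becomes a genuine linear program over exponentially many columns, which is where the NP-complete solvers mentioned in the abstract come in.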
21.05.2020, 17:00, Nina Gierasimczuk, From grammar inference to learning action models. A case for Explainable AI. Technical University of Denmark
The endeavour of finding patterns in data spans many different fields, from computational linguistics to self-driving cars. Machine learning is undeniably a great source of progress across the board. In my talk I will resurrect an older way of thinking about learning, along the lines of symbolic artificial intelligence. I will report on recent work on inferring action models from descriptions of action executions. Such a framework constitutes an informed, reliable and verifiable way of learning, in which the learner can not only classify objects correctly, but can also report on the symbolic representation underlying her conjectures. The ability of an AI system to “explain itself” in this way is an increasingly prominent demand in the community. My action models are those of Dynamic Epistemic Logic, and the methodology is automata-theoretic in spirit.
27.05.2020, 17:00, Anthia Solaki, Bridging Epistemic Logic and Resource-Bounded Human Reasoning. University of Amsterdam
Standard epistemic logic suffers from the problem of logical omniscience: its agents are perfect reasoners with unlimited inferential power. We argue that, even from a normative point of view, this modelling is inadequate, for it sets requirements that cannot actually be attained, according to solid empirical evidence on the performance of humans in reasoning tasks. We therefore design a non-standard framework in line with experimental findings about the cognitive effort involved in deductive reasoning. Inspired by Dynamic Epistemic Logic, we work with dynamic operators denoting explicit applications of inference rules in our logical language. Our models are supplemented by (a) impossible worlds (not closed under logical consequence), and (b) quantitative components capturing the agent’s cognitive capacity and the cognitive costs of rules with respect to certain resources (e.g. memory, time). These ingredients allow us to avoid problematic logical closure principles and at the same time account for (limited) reasoning steps. We further show that our models can be reduced to awareness-like structures; this allows for useful technical results such as a sound and complete axiomatization. We finally discuss how this approach can be combined with actions supplying external information, resulting in a more cognitively plausible modelling of group reasoning as well.
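The interaction between rule costs and cognitive capacity that the abstract describes can be sketched schematically: an agent holds some formulas, each inference rule has a cost, and rules apply only while the remaining capacity allows. This is only an illustration of the idea; the actual models in the talk are impossible-worlds Kripke structures, and the rules, costs and formulas below are assumptions made up for the example.

```python
def bounded_closure(beliefs, rules, capacity):
    """Apply (premises, conclusion, cost) rules while capacity lasts."""
    beliefs = set(beliefs)
    changed = True
    while changed:
        changed = False
        for premises, conclusion, cost in rules:
            if (set(premises) <= beliefs and conclusion not in beliefs
                    and cost <= capacity):
                beliefs.add(conclusion)
                capacity -= cost
                changed = True
    return beliefs

# A modus ponens chain: from p, p->q, q->r the agent may derive q,
# then r; each rule application costs one unit of capacity.
rules = [(("p", "p->q"), "q", 1),
         (("q", "q->r"), "r", 1)]

print(sorted(bounded_closure({"p", "p->q", "q->r"}, rules, capacity=1)))
```

With capacity 1 the agent derives q but runs out of resources before reaching r; with capacity 2 it derives both. Unlike an omniscient agent, its beliefs are thus not closed under logical consequence, which is exactly the closure failure the impossible-worlds semantics is designed to model.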
11.06.2020, 17:00, Rasmus Rendsvig, University of Copenhagen