
Neuro-symbolic approaches for Abstract and Relational Visual Reasoning Tasks


Zhiliang Xiang

16/09/2022

Supervised by Víctor Gutiérrez Basulto; Moderated by Beryl Noe

Recently, there has been a lot of progress on basic vision tasks using neural approaches. However, developing approaches for tasks that combine vision with abstract and structural reasoning remains a major challenge. A central question in this research area is whether pure deep learning is the right way to do abstract visual reasoning [1].

In another direction, there has been increasing interest in the development of AI systems that combine the strengths of neural systems (e.g. handling noisy data) with those of symbolic systems (e.g. structured reasoning). Two prominent examples of neuro-symbolic systems are NeurASP and DeepProbLog.

The main aim of this project is to explore to what extent these systems can be used to solve abstract visual reasoning tasks such as Raven's Progressive Matrices. We would like to understand whether these systems can be directly applied to this type of problem. In particular, one of the objectives is to develop a first prototype (even for a restricted version of the problem) and identify the challenges in developing neuro-symbolic systems that address this type of task.
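To make the intended pipeline concrete, here is a minimal sketch of the neuro-symbolic pattern described above applied to a toy Raven's Progressive Matrix: a perception module (here hand-stubbed, where a real prototype would use a trained neural classifier, e.g. via NeurASP or DeepProbLog) maps each panel to a symbolic attribute, and a symbolic rule layer reasons over those attributes to select the missing panel. The function names (`perceive`, `completes_progression`, `solve_rpm`) and the "constant-step progression" rule are illustrative assumptions, not part of either system's API.

```python
def perceive(panel):
    """Stand-in for a neural classifier: maps a raw panel to a symbolic
    attribute value (e.g. the number of shapes it contains). Panels are
    already symbolic dicts here for brevity."""
    return panel["count"]

def completes_progression(row):
    """Symbolic rule: the attribute increases by a constant step."""
    a, b, c = row
    return (b - a) == (c - b)

def solve_rpm(matrix_rows, candidates):
    """Pick the candidate panel that makes the incomplete last row
    satisfy the same rule the first two rows satisfy."""
    full_rows = [[perceive(p) for p in row] for row in matrix_rows[:2]]
    assert all(completes_progression(r) for r in full_rows)
    partial = [perceive(p) for p in matrix_rows[2]]
    for cand in candidates:
        if completes_progression(partial + [perceive(cand)]):
            return cand
    return None

rows = [
    [{"count": 1}, {"count": 2}, {"count": 3}],
    [{"count": 2}, {"count": 3}, {"count": 4}],
    [{"count": 3}, {"count": 4}],  # last panel missing
]
answer = solve_rpm(rows, [{"count": 2}, {"count": 5}, {"count": 7}])
print(answer)  # → {'count': 5}
```

In a neuro-symbolic system the rule layer would be written as a logic program and the classifier trained jointly with it; this sketch only separates the two roles the project proposes to combine.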

*Requirements: This is a research-oriented project, which requires general knowledge of both symbolic and sub-symbolic AI. The student should be able to self-learn the techniques needed to tackle this problem, e.g. deep learning techniques for vision.

(I) The code must be released under the MIT Licence (https://opensource.org/licenses/MIT). The documentation must be released under CC-BY (https://creativecommons.org/licenses/by/4.0/).

(II) Mathematical maturity and knowledge of knowledge representation and/or logic programming, as well as the foundations of machine learning. The student will need to do an in-depth literature review, which involves notions from logic programming and from different neural networks and their application to vision and reasoning tasks.

(III) Learn how to use ARCCA computing facilities.

