Gary Smith
To interact fluidly, appropriately, and flexibly with other agents, both artificial and human, an autonomous system needs to infer and reason about the internal states of those other agents. This includes reasoning about the goals and plans of the other agents in the system, tracking their beliefs, and inferring their perceptual states.
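To make this concrete, the short ProbLog sketch below illustrates one way such perspective tracking could be expressed; the predicates (saw/2, believes/2, in/2) and the probability are illustrative assumptions rather than part of an existing system, and serve only to show how a possibly false belief can be inferred when another agent misses an observation.

```prolog
% Illustrative sketch only: an observer infers what bob believes about an
% object's location, conditioned on whether bob perceived it being moved.

0.9::saw(bob, move(ball, box)).          % bob probably witnessed the move
in(ball, box).                           % ground truth after the move

% bob's belief tracks the true location only if he perceived the move;
% otherwise he retains his outdated belief about the old location.
believes(bob, in(ball, box))    :- saw(bob, move(ball, box)).
believes(bob, in(ball, basket)) :- \+ saw(bob, move(ball, box)).

query(believes(bob, in(ball, basket))).  % probability of a false belief (0.1)
```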
We have identified statistical relational artificial intelligence (StarAI), and specifically the probabilistic logic programming language ProbLog, as a powerful framework that could capitalise on the strengths of different approaches. We propose to use ProbLog to develop a system that can reason about the internal states of other agents. More specifically, we intend to build agent architectures that can perform intent recognition, maintain a knowledge base that represents the beliefs of multiple agents, recognise possibly inaccurate beliefs, and update their knowledge bases given trust parameters and information gained from intent recognition.
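As a sketch of what such a model might look like, the ProbLog program below combines a prior over another agent's goals (an annotated disjunction), evidence about an observed action for intent recognition, and a trust parameter that weights whether reported information is adopted into the belief base. All predicate names and probabilities (intends/2, moves_towards/2, trusts/2, tells/3, believes/2) are illustrative assumptions rather than components of the proposed architecture.

```prolog
% Illustrative sketch only; predicates and numbers are assumed for exposition.

% Prior over another agent's possible goals (annotated disjunction).
0.6::intends(alice, fetch(cup)); 0.4::intends(alice, fetch(book)).

% Observed actions are more likely under the goal they serve.
0.9::moves_towards(alice, kitchen) :- intends(alice, fetch(cup)).
0.2::moves_towards(alice, kitchen) :- intends(alice, fetch(book)).

% Trust-weighted belief adoption: a reported fact is believed with a
% probability given by the trust placed in the reporting agent.
0.8::trusts(robot, alice).
tells(alice, robot, door_open).
believes(robot, door_open) :- tells(alice, robot, door_open),
                              trusts(robot, alice).

evidence(moves_towards(alice, kitchen), true).
query(intends(alice, fetch(cup))).   % posterior over alice's goal
query(believes(robot, door_open)).   % trust-mediated belief
```

Under these illustrative numbers, the program yields a posterior of roughly 0.87 for intends(alice, fetch(cup)) and a probability of 0.8 for believes(robot, door_open), showing how intent recognition and trust-mediated belief adoption can be expressed within a single probabilistic logic program.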