# Research

My research involves developing efficient algorithms and effective visualizations for uncertainty reasoning techniques. In Artificial Intelligence (AI), there is an increasing need for new methods to address uncertainty, which is unavoidable in many real-world domains. To accommodate uncertainty and data imperfections intelligently, we need effective models to capture them, and we must then propagate and reason with these uncertainties.

The Dempster-Shafer (DS) evidence theory, also referred to as DS belief theory, is a powerful and convenient framework that can handle a wide variety of uncertainty and data imperfections. However, computing the DS theoretic (DST) conditionals, a key operation in evidence updating and fusion, and even computing DST belief functions, are nondeterministic polynomial-time hard (NP-hard) problems. This heavy computational burden has been the major challenge facing the DS formalism to date. Below are some of my contributions, which significantly reduce the computational complexity associated with current DST reasoning and fusion strategies, along with a set of visualization tools that offer deeper insight into DST operations. This constitutes an important step toward the use of DST methods in real AI systems.

I completed my Ph.D. in August 2019.
Dissertation: "A Framework for Efficient Implementation and Effective Visualization of Dempster-Shafer Belief Theoretic Computations for Reasoning Under Uncertainty"

### DS-Conditional-One and DS-Conditional-All models:

For both Fagin-Halpern and Dempster's conditionals, these two models provide algorithms to compute an arbitrary conditional belief in $\mathcal{O}(2^{|\overline{A}|} \times 2^{|A\cap B|})$ ($\leq \mathcal{O}(2^{|\Theta|})$) and all conditional beliefs in $\mathcal{O}(max(2^{|A|} \times |A|, 2^{|\Theta|}))$ ($\leq \mathcal{O}(2^{|\Theta|} \times |\Theta|)$). They also provide algorithms to compute an arbitrary Dempster's conditional mass in $\mathcal{O}(max(2^{|\overline{A}|},|A\cap B|))$ ($\leq \mathcal{O}(2^{|\Theta|})$), and all Dempster's conditional masses in $\mathcal{O}(2^{|\Theta|})$. The novel contributions in this work therefore constitute a major advance over the well-known specialization matrix approach for Dempster's conditional and the Conditional Core Theorem approach for the Fagin-Halpern conditional, both of which incur $\mathcal{O}(2^{|\Theta|} \times 2^{|\Theta|})$ computational complexity. As a concrete example, for a frame of discernment (FoD) of size 30 ($\sim$1 billion focal elements), the proposed framework computes all conditional masses or beliefs in under a minute (52 seconds in recently conducted experiments), while the specialization matrix approach would take more than 1,800 CPU years. The space complexity of these algorithms is $\mathcal{O}(2^{|\Theta|})$, a significant improvement over the prohibitive $\mathcal{O}(2^{|\Theta|} \times 2^{|\Theta|})$ space complexity of the specialization matrix approach.
Paper presenting the DS-Conditional-One model:
A recent poster summarizing both models:
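For concreteness, the two conditioning rules these models accelerate can be stated in closed form over belief functions. Below is a minimal brute-force Python sketch of both, using the standard formulas $Bel(B\|A) = (Bel(B\cup\overline{A}) - Bel(\overline{A}))/(1 - Bel(\overline{A}))$ for Dempster's conditional and $Bel(B|A) = Bel(A\cap B)/(Bel(A\cap B) + Pl(A\setminus B))$ for the Fagin-Halpern conditional. This is an illustrative sketch of the definitions only, not the efficient algorithms from the papers; the example frame and masses are my own.

```python
def bel(m, a):
    """Belief: total mass committed to subsets of a (brute force)."""
    return sum(v for focal, v in m.items() if focal <= a)

def pl(m, theta, a):
    """Plausibility: Pl(A) = 1 - Bel(complement of A)."""
    return 1.0 - bel(m, theta - a)

def dempster_cond_bel(m, theta, b, a):
    """Dempster's conditional: Bel(B||A) = (Bel(B u ~A) - Bel(~A)) / (1 - Bel(~A))."""
    a_c = theta - a
    return (bel(m, b | a_c) - bel(m, a_c)) / (1.0 - bel(m, a_c))

def fh_cond_bel(m, theta, b, a):
    """Fagin-Halpern conditional: Bel(B|A) = Bel(A & B) / (Bel(A & B) + Pl(A - B))."""
    num = bel(m, a & b)
    return num / (num + pl(m, theta, a - b))

# Illustrative body of evidence on the frame {a, b, c}.
theta = frozenset({"a", "b", "c"})
m = {frozenset({"b", "c"}): 0.5, theta: 0.5}
A, B = frozenset({"a", "b"}), frozenset({"b"})
print(dempster_cond_bel(m, theta, B, A))  # 0.5
print(fh_cond_bel(m, theta, B, A))        # 0.0
```

The example also shows that the two rules can disagree on the same evidence, which is why both are supported by the DS-Conditional models.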

### Consonant belief functions:

A linear time and space algorithm for computing all the Fagin-Halpern conditional beliefs generated from consonant belief functions. It significantly outperforms the exponential time and space requirements of the brute-force approach and of the currently available conditional computation strategies: a complexity reduction from $\boldsymbol{\mathcal{O}(2^{n})}$ to $\boldsymbol{\mathcal{O}(n)}$.
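The structural fact that makes linear time possible is that a consonant body of evidence has nested focal elements $S_1 \subset S_2 \subset \cdots \subset S_k$, so the beliefs along the chain are simply cumulative sums of the masses. The sketch below illustrates only that nesting property, not the conditional-belief algorithm from the paper itself:

```python
def consonant_beliefs(chain_masses):
    """Given masses m(S1), ..., m(Sk) on nested focal elements
    S1 c S2 c ... c Sk, Bel(Sj) = m(S1) + ... + m(Sj), so a single
    cumulative-sum pass computes all k beliefs in O(k) time and space."""
    beliefs, total = [], 0.0
    for mass in chain_masses:
        total += mass
        beliefs.append(total)
    return beliefs

# Illustrative consonant mass assignment over a 3-element chain.
print(consonant_beliefs([0.25, 0.25, 0.5]))  # [0.25, 0.5, 1.0]
```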

### Visualization:

A sequence of scalable graphical illustrations for visualizing DST computations, which can be used both to gain a deeper understanding of DST operations and to guide the development of efficient algorithms.

### REGAP and data structures:

A novel generalized computational framework with three representations — DS-Vector, DS-Matrix, and DS-Tree — which allow DST computations to be performed in significantly less time. These three representations can also be utilized as tools for visualizing DST models. A new strategy, referred to as REGAP (REcursive Generation of and Access to Propositions), is introduced and harnessed in the development of this framework and its computational algorithms.
Data structures for DST implementations:
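One common way to realize this style of recursive generation and access is to index propositions by bitmasks, so the mass values of a size-$n$ FoD occupy a flat array of $2^n$ slots and the subsets of any proposition can be enumerated directly from its index. The layout below is my illustrative sketch of that idea, not necessarily the exact DS-Vector convention used in the papers:

```python
def bel_bitmask(m_vec, a):
    """Belief of proposition `a` from a flat mass array m_vec of length 2^n,
    where bit i of an index marks inclusion of the i-th singleton.
    Enumerates exactly the subsets of `a` via the (s - 1) & a trick,
    so the cost is O(2^|A|) rather than O(2^n)."""
    total, s = 0.0, a
    while True:
        total += m_vec[s]
        if s == 0:
            break
        s = (s - 1) & a   # next subset of `a` in decreasing order
    return total

# FoD {x0, x1, x2}: index 0b101 encodes {x0, x2}, etc. (illustrative layout).
n = 3
m_vec = [0.0] * (1 << n)
m_vec[0b001] = 0.25   # m({x0})
m_vec[0b011] = 0.5    # m({x0, x1})
m_vec[0b111] = 0.25   # m(Theta)
print(bel_bitmask(m_vec, 0b011))  # 0.75 (= m({x0}) + m({x0, x1}))
```

The same indexing supports constant-time access to any proposition's mass, which is the kind of property the DS-Vector, DS-Matrix, and DS-Tree structures exploit.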

## Publications

Linear Time and Space Algorithm for Computing All the Fagin-Halpern Conditional Beliefs Generated from Consonant Belief Functions
L. G. Polpitiya, K. Premaratne and M. N. Murthi
The 32nd International FLAIRS Conference, Sarasota, Florida, May 2019


Efficient Computation of Belief Theoretic Conditionals
L. G. Polpitiya, K. Premaratne, M. N. Murthi and D. Sarkar
10th International Symposium on Imprecise Probability: Theories and Applications (ISIPTA), Lugano, Switzerland, July 2017

A Framework for Efficient Computation of Belief Theoretic Operations
L. G. Polpitiya, K. Premaratne, M. N. Murthi and D. Sarkar
19th International Conference on Information Fusion (FUSION), Heidelberg, Germany, July 2016

## Recent Posters

Efficient Computation of Belief Theoretic Conditionals for Time Sensitive Uncertainty Reasoning Applications
L. G. Polpitiya, K. Premaratne, M. N. Murthi and D. Sarkar
First Annual Research Day, College of Engineering, University of Miami, September 2018

## Work in Progress

Real-Time Computation of Conditionals in the Dempster-Shafer Belief Theoretic Framework
L. G. Polpitiya, K. Premaratne, M. N. Murthi, S. J. Murrell and D. Sarkar
For submission to IEEE Transactions on Cybernetics

Data Structures for Real-Time Computation of Belief Theoretic Operations on Dynamic Frames
For submission to IEEE Transactions on Knowledge and Data Engineering

This work is based on research supported by ONR, NSF, RSMAS, NOAA and ECE-UM.