Probabilistic Logical Models for Large-Scale Hybrid Domains
Abstract
Statistical relational learning formalisms combine first-order logic with probability theory in order to obtain expressive models that capture both complex
relational structure and uncertainty. Despite the significant progress made in
this field, several important challenges remain open. First, the expressivity
of statistical relational learning comes at the cost of inefficient learning and
inference in large-scale problems that contain many objects. Second, while many
real-world relational domains are hybrid in that they contain objects that are
described by both continuous and discrete properties, little attention has been
paid to learning from such data. Third, most formalisms ignore the dynamic nature of real-world problems by considering only the static aspects captured in a single snapshot of a dynamic process.
This thesis tackles these shortcomings and makes the following four contributions. First, we propose a graph-sampling-based approach that approximately counts the number of pattern occurrences in the data, which enables scaling up parameter learning of statistical relational models. Second, we propose a novel statistical relational learning formalism that models hybrid relational domains. Third, we design the first structure learning algorithm able to learn hybrid relational models. Fourth, we adapt this algorithm to learn temporal dependencies present in the data. We demonstrate the utility of our approaches on several challenging applications, such as planning in a real-world robotics setup and learning from financial and citation data.
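The first contribution rests on graph sampling to approximate pattern counts. As a minimal, hypothetical sketch of that idea only (not the thesis's actual algorithm), the following Python estimates how often one simple pattern, a triangle, occurs in a graph by sampling node triples uniformly at random and scaling the observed hit rate by the total number of triples:

```python
import itertools
import random

def exact_triangles(adj):
    """Count triangles exactly by checking every node triple."""
    nodes = list(adj)
    return sum(
        1 for a, b, c in itertools.combinations(nodes, 3)
        if b in adj[a] and c in adj[a] and c in adj[b]
    )

def sampled_triangles(adj, num_samples=20000, seed=0):
    """Estimate the triangle count: sample node triples uniformly,
    measure the fraction that form a triangle, and scale that
    hit rate by the total number of triples C(n, 3)."""
    rng = random.Random(seed)
    nodes = list(adj)
    n = len(nodes)
    total_triples = n * (n - 1) * (n - 2) // 6
    hits = 0
    for _ in range(num_samples):
        a, b, c = rng.sample(nodes, 3)
        if b in adj[a] and c in adj[a] and c in adj[b]:
            hits += 1
    return hits / num_samples * total_triples

# Toy undirected graph (adjacency sets) containing three triangles.
adj = {i: set() for i in range(12)}
for u, v in [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 2),
             (5, 6), (6, 7), (5, 7), (8, 9)]:
    adj[u].add(v)
    adj[v].add(u)
```

The sample size trades accuracy for speed: the estimator is unbiased, and its variance shrinks as the number of sampled triples grows, which is what makes such sampling attractive when exact counting over all object tuples is infeasible.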