
Ontology, ontologies and ontological reasoning, part 3: ontological reasoning
Joost Breuker

Leibniz Center for Law University of Amsterdam

Overview

- Semantic Web and OWL
- Use of ontologies
- Reasoning with ontologies
  - TRACS: testing the Dutch traffic regulation
  - HARNESS: DL-based legal assessment
- Frameworks and the limits of DL-based reasoning
- Problem solving and reasoning

What the Semantic Web is intended for


Dream part 2: "In communicating between people using the Web, computers and networks have as their job to enable the information space, and otherwise get out of their way. But doesn't it make sense to also bring computers more into action, to put their analytic power to work? In part two of the dream, that is just what they do. The first step is putting data on the web in a form that machines can naturally understand, or converting it to that form. This creates what I call a Semantic Web -- a web of data that can be processed directly or indirectly by machines." (Tim Berners-Lee, Weaving the Web, p. 191)

A decade later (W3C): infrastructural standards for the SW

- Semantics are represented by ontologies
- Ontologies are represented by a KR formalism
  - Note: ontologies were specifications in KE (using Ontolingua; CML, cf. UML in SE)
  - On top of a layer cake of data-handling formalisms
  - The KR formalism is intended for reasoning
    - Even suitable for blind trust (OWL-DL is decidable)

Legal ontologies (from Nuria Casellas, 2008/9)


HOWEVER, in practice

- Not one of these ontologies is used for reasoning
- Use:
  - Information management (documents)
    - That is also what the current Semantic Web efforts are about (not only in legal domains)
  - Core ontologies (reuse?)

Why use OWL?

OWL: DL-based knowledge representation

- OWL-DL is a unique result of 40 years of research in AI on KR
  - Semantic networks, KL-ONE, CLASSIC, LOOM, ...
- Concept-oriented representation
  - Very suitable for ontologies
  - vs. rule-based KR

End of the 1980s: logical foundations

- A KR formalism defines what can be correctly and completely inferred
  - On the basis of the semantics (model theory) of the formalism
- Problem: finding a subset of predicate logic that is decidable (and computationally tractable)

OWL's Semantic Web context

OWL

OWL's further requirements/problems

Besides the expressivity/decidability trade-off:

- The RDF layer (OO-based) was a serious obstacle
  - Its expressiveness was incompatible with DL research
  - Ian Horrocks, Peter F. Patel-Schneider, and Frank van Harmelen. From SHIQ and RDF to OWL: the making of a web ontology language. Journal of Web Semantics, 1(1):7-26, 2003.
- No unique naming assumption
- USA/EU team: KR & KA community
- NB: improved expressivity in OWL 2!
- Still: OWL is not self-evident for novices

Reasoning with OWL

Main inference: classification on the basis of properties of concepts

- Reasoner (inference engine) = 'classifier'
  - Complete for DL-based KR
  - Rule-based reasoners are not complete
    - E.g. Prolog, unless a 'closed world assumption' is made
    - For the Web this assumption cannot hold!

T(erminology)-Box: ontology (knowledge)
- Classes (concepts, universals) and properties (relations, attributes, features, ...)

A(ssertions)-Box: some situation (information)
- Individuals (instances) with properties
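
To make the T-Box/A-Box split and the classification inference concrete, here is a minimal runnable sketch in Python with the owlready2 library (a hedged illustration: the library choice, the IRI and the toy Driver vocabulary are mine, not part of the slides):

    # T-Box (classes, properties) vs A-Box (individuals), classified by a DL reasoner.
    # Assumes owlready2 and a Java runtime for the bundled reasoners.
    from owlready2 import *

    onto = get_ontology("http://example.org/toy-traffic.owl")

    with onto:
        # T-Box: the terminology
        class Vehicle(Thing): pass
        class Bicycle(Vehicle): pass
        class Person(Thing): pass
        class drives(Person >> Vehicle): pass
        # A defined class: whoever drives some vehicle is a Driver
        class Driver(Person):
            equivalent_to = [Person & drives.some(Vehicle)]

    # A-Box: some situation
    joost = onto.Person("joost")
    bike1 = onto.Bicycle("bike1")
    joost.drives = [bike1]

    # Classification: the reasoner infers that joost is a Driver
    sync_reasoner()          # HermiT by default; sync_reasoner_pellet() is the Pellet variant
    print(joost.is_a)        # expected to now include toy-traffic.Driver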

When is REASONING with ontologies indicated?

(... except for consistency checking etc.)

- In understanding/modeling situations
  - Situation = events and states of entities in space (and over time)
- Two main modes:
  - Text understanding (stories, cases)
    - NB: exception: expository discourse
  - Scene understanding (robotics)

When is REASONING with ontologies required?

When all possible situations have to be modeled

- Typical examples:
  - Model-based & qualitative reasoning systems
  - Testing system models
  - ... legal case assessment
- All possible combinations: completeness & consistency, e.g. OWL-DL
- NB: in knowledge systems, situations are usually modeled implicitly in user-system dialogues:
  - Asking the user for (values of / presence of) parameters
  - Heuristics; human limitations in handling combinatorics

For instance: TRACS (1990-1994)

Testing a new Dutch traffic code (RVV-90)

- art. 3: Vehicles should keep to the right
- art. 6: Two bicycles may ride next to each other
- art. 33: A trailer should have lights at the back

Questions:
- Consistent?
- Complete?
- In what respect different from RVV-66 (the old one)?

These can only be answered when we can model all possible situations distinguished by this code

Traffic participants: a part of the ontology ('world knowledge')

[figure: taxonomy of traffic participants]
- traffic-participant
  - pedestrian
  - driver
    - bicyclist
    - autocyclist
    - driver of motor vehicle
      - bus driver
      - lorry driver
      - car driver
      - motorcycle driver

Simple example of ontological reasoning

Ontology (T-Box):
- Subsumes(Physical_object, Car)
- Right_of(Physical_object, Physical_object)
- Inv(Right_of, Left_of)

Case description (A-Box):
- Is-a(car1, Car)
- Is-a(car2, Car)
- Right_of(car1, car2)

Classifier (e.g. Pellet) infers:
- Left_of(car2, car1) (added to the A-Box)

As simple as that, but necessary
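
The same toy example as a runnable sketch, again with owlready2 and Pellet (a hedged illustration; the IRI is made up and the property names follow the slide):

    # Inverse object properties: asserting Right_of(car1, car2) lets the classifier
    # add Left_of(car2, car1) to the A-Box.
    from owlready2 import *

    onto = get_ontology("http://example.org/traffic-example.owl")

    with onto:
        # T-Box
        class Physical_object(Thing): pass
        class Car(Physical_object): pass                         # Subsumes(Physical_object, Car)
        class right_of(Physical_object >> Physical_object): pass
        class left_of(Physical_object >> Physical_object):
            inverse_property = right_of                          # Inv(Right_of, Left_of)

    # A-Box
    car1 = onto.Car("car1")
    car2 = onto.Car("car2")
    car1.right_of = [car2]                                       # Right_of(car1, car2)

    # Classifier (Pellet), asked to materialise inferred property values
    sync_reasoner_pellet(infer_property_values=True)
    print(car2.left_of)     # [traffic-example.car1], i.e. Left_of(car2, car1)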

Architecture of TRACS (Breuker & den Haan, 1994)

[architecture diagram] Components: a SITUATION GENERATOR uses the WORLD KNOWLEDGE BASE to produce a SITUATION; the REGULATION APPLIER matches the REGULATION KNOWLEDGE BASE against it, yielding APPLICABLE RULES; a VALIDATOR selects the CONSISTENTLY APPLICABLE RULES; the CONFLICT RESOLVER uses the META-LEGAL KNOWLEDGE BASE to output TRESPASSED / NON-TRESPASSED RULES.

Btw: some surprising results

- Tram on tramway
- Car on bicycle lane

Just a prototype

About 10^5 possible combinations

- Analysis of redundancy (symmetry)
- Still: too many for humans to inspect!
- But:
  - Differences with the old regulation
  - Differences with foreign regulations (is the ontology the same?)
    - ... political decisions
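
To give a feel for where a number of the order of 10^5 comes from: a few orthogonal situation dimensions already multiply into tens of thousands of generable situations. A back-of-the-envelope sketch (the dimensions and their values are invented for illustration; the real TRACS situation generator worked from the RVV-90 world knowledge):

    # Illustrative combinatorics only, not the actual TRACS situation generator.
    from itertools import product

    dimensions = {
        "participant_1": ["pedestrian", "bicyclist", "car driver", "lorry driver", "tram driver"],
        "participant_2": ["pedestrian", "bicyclist", "car driver", "lorry driver", "tram driver"],
        "location_1":    ["roadway", "bicycle lane", "footpath", "tramway"],
        "location_2":    ["roadway", "bicycle lane", "footpath", "tramway"],
        "relative_pos":  ["left of", "right of", "behind", "facing", "crossing"],
        "signal":        ["none", "traffic light", "priority sign", "officer"],
        "manoeuvre":     ["going straight", "turning", "overtaking", "standing still"],
    }

    situations = list(product(*dimensions.values()))
    print(len(situations))   # 5*5*4*4*5*4*4 = 32,000 -- already ~10^4-10^5 before adding detail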

HARNESS: OWL 2 DL also for normative reasoning

- Normative reasoning simultaneously with ontological reasoning, using OWL-DL
- Estrella, 6th Framework, 2006-2008
  - http://www.estrellaproject.org/

Saskia van de Ven, Joost Breuker, Rinke Hoekstra, Lars Wortel, and Abdallah El-Ali. Automated legal assessment in OWL 2. In Legal Knowledge and Information Systems. JURIX 2008: The 21st Annual Conference, Frontiers in Artificial Intelligence and Applications. IOS Press, December 2008.
András Förhécz and György Strausz. Legal assessment using conjunctive queries. In Proceedings of LOAIT 2009.

Representing norms in OWL 2 DL

Norm:
- A generic case description is a conjunction of generic situation (σ) descriptions
- A generic case description is a class (γ)
- A deontic qualification (P, O, F) is associated with γ

Case description is an individual (C):
- The description is itself composed of classes/individuals as defined in the ontology!

Watch this.

What did you see?

- Event 1 (Saskia entering, shows ID)
- Event 2 (Joost entering, shows ID)
- Event 3 (Radboud entering)
  - (NB: Radboud is president of JURIX)

JURIX 2009 Regulation

1. For entering a U-building, identification is required
2. The President does not need an identification to enter a U-building

Generic situations and generic case

1. For (entering a U-building), (an identification) is required [for each person]
2. The (President) does not need (an identification) to (enter a U-building)

Step 1: Modeled as generic case descriptions, i.e. conjunctions of generic situations:
(1)  γ1 ≡ σ1 ⊓ σ2 ⊓ (¬ σ3)
(2)  γ2 ≡ σ4 ⊓ σ2 ⊓ (¬ σ3)
(σ1 = person, σ2 = entering a U-building, σ3 = showing an identification, σ4 = President)

Step 2: Adding the deontic qualification (Permitted | Obliged | Forbidden) to the norms

(1) Forbidden(γ1), with γ1 ≡ σ1 ⊓ σ2 ⊓ (¬ σ3)
    (this is a 'design pattern' which separates the conditions (person, entering and identification) from a forbidden generic case)
(2) Permitted(γ2), with γ2 ≡ σ4 ⊓ σ2 ⊓ (¬ σ3), where σ4 = President

Step 3: Normative assessment: classifying the C(ase)

- Case: President Radboud enters a U-building
  - C: {σ4, σ2}
- Classifying C:
  - γ1 subsumes γ2: the exception
  - C is Disallowed-by norm 1
  - C is Allowed-by norm 2
  - Etc., etc.
  - This is not viewed as a logical conflict by Pellet: the individual is simply classified by two different norms (classes)

HARNESS selects the subsumed norm (γ2)
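
A hedged sketch of the whole pattern in Python with owlready2 and Pellet (this is my own simplification, not the actual HARNESS encoding: a case is an individual with a hasPart property to its situation parts, the part list is closed explicitly because of OWL's open-world semantics, and the two norm classes play the roles of γ1 and γ2):

    from owlready2 import *

    onto = get_ontology("http://example.org/jurix-demo.owl")

    with onto:
        # Domain ontology (T-Box)
        class Person(Thing): pass
        class President(Person): pass
        class EnteringUBuilding(Thing): pass
        class ShowingIdentification(Thing): pass
        AllDisjoint([Person, EnteringUBuilding, ShowingIdentification])

        class Case(Thing): pass
        class hasPart(Case >> Thing): pass

        # Norms as classes over cases (T-Box)
        class DisallowedByNorm1(Case):     # gamma1: a person entering without identification
            equivalent_to = [Case
                             & hasPart.some(Person)
                             & hasPart.some(EnteringUBuilding)
                             & Not(hasPart.some(ShowingIdentification))]

        class AllowedByNorm2(Case):        # gamma2: the President entering without identification
            equivalent_to = [Case
                             & hasPart.some(President)
                             & hasPart.some(EnteringUBuilding)
                             & Not(hasPart.some(ShowingIdentification))]

    # Case description (A-Box): event 3, Radboud (the president) enters, no ID shown
    radboud = onto.President("radboud")
    entering = onto.EnteringUBuilding("entering3")
    c = onto.Case("case3")
    c.hasPart = [radboud, entering]
    # Closure axiom: OWL is open-world, so state explicitly that these are ALL the parts
    c.is_a.append(onto.hasPart.only(OneOf([radboud, entering])))

    sync_reasoner_pellet()
    print(c.is_a)
    # Expected to include both DisallowedByNorm1 and AllowedByNorm2; since Pellet also
    # derives AllowedByNorm2 as a subclass of DisallowedByNorm1, a HARNESS-style
    # assessor can select the more specific, subsumed norm (the exception).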

Experimental user interface (Protégé plug-in)

[screenshot: cases assessed as Compliance / Violation]

An important advantage

Three knowledge bases:
- domain ontology (T-Box)
- norms (T-Box)
- case description (individuals & properties; A-Box)

The OWL-DL reasoner (Pellet) 'classifies' the case in terms of concepts and of norms simultaneously, in an intertwined fashion. Hybrid or rule-only solutions cannot preserve all (inferred) information of the ontology the way Pellet/OWL 2 does.

Knowledge, ontology and meaning

There is more to knowledge than ontology

- Ontology (terminology) provides the basic units for understanding
- Regular combinations: patterns of concepts
  - Scripts & frames: experience, heuristics, associations
  - Meaningful experience can only be based upon understanding!
  - Synthetic learning vs. (further) abstraction

What's further new

Monotonic and deductive:
- Unique & against accepted wisdom
- Exceptions do not lead to conflict

Advantage:
- Reasoning is sound and complete (trust)
  - No rule formalism allows this with the same expressiveness
- Full use of OWL 2 DL's expressiveness
  - No loss in translation

Disadvantage:
- Modeling in DL is found to be more intellectually demanding than modeling in rules anyway
- The obligation design pattern is not very intuitive

A serious problem in the use of (OWL-) DL

 

- DL representations are 'variable free'
  - (Most) rule formalisms have variables
- Moreover: in OWL, names of individuals are not taken as identifiers of individuals (no unique naming assumption)
- It is not possible to track changes of a particular individual
  - A-Box: colour(block1, red); colour(block1, blue)
  - OWL: there are (now) two block1s!
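
A hedged sketch of what the missing unique naming assumption does in practice, again with owlready2 and Pellet (the block/colour vocabulary follows the slide; the at-most-one-colour restriction is my own addition to make the effect visible):

    from owlready2 import *

    onto = get_ontology("http://example.org/blocks.owl")

    with onto:
        class Block(Thing): pass
        class Colour(Thing): pass
        class hasColour(Block >> Colour): pass
        Block.is_a.append(hasColour.max(1, Colour))   # a block has at most one colour

    block1 = onto.Block("block1")
    red = onto.Colour("red")
    blue = onto.Colour("blue")
    block1.hasColour = [red, blue]                    # colour(block1, red); colour(block1, blue)

    # No unique naming assumption: this is NOT a clash, because the reasoner is free
    # to conclude that the names 'red' and 'blue' denote one and the same individual.
    sync_reasoner_pellet()                            # consistent

    # Only an explicit statement that the names denote different things turns the
    # intended 'state change' of block1 into a contradiction:
    with onto:
        AllDifferent([red, blue])
    try:
        sync_reasoner_pellet()
    except OwlReadyInconsistentOntologyError:
        print("inconsistent: block1 cannot be red and blue at the same time")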

A serious problem in the use of (OWL-) DL (2)

 

- Also: it is (almost) impossible to 'enforce' identity of individuals in OWL
  - Example: a transaction

OWL's restriction on the form of graphs

[figure: a 'diamond' of individuals vs. what OWL allows: trees]

An approximate solution: a special design pattern

Constraining the identity:

Rinke Hoekstra and Joost Breuker. Polishing diamonds in OWL 2. In Aldo Gangemi and Jérôme Euzenat, editors, Proceedings of the 16th International Conference on Knowledge Engineering and Knowledge Management (EKAW 2008), LNAI/LNCS. Springer Verlag, October 2008.
Rinke Hoekstra. Ontology Representation: Design Patterns and Ontologies that Make Sense, volume 197 of Frontiers in Artificial Intelligence and Applications. IOS Press, Amsterdam, June 2009.

The DL view has a limited scope

 

- Excellent for axiomatic grounding of the terms that form the lowest level of granularity of a knowledge base
- More complex knowledge structures (frameworks) will also require rules
- 'Hybrid' solution:

  "In the hybrid approach there is a strict separation between the ordinary predicates, which are basic rule predicates, and ontology predicates, which are only used as constraints in rule antecedents. Reasoning is done by interfacing an existing rule reasoner with an existing ontology reasoner."

- Problem: the rule formalism has to be 'DL-safe'
  - The OWL/rule combination is still a (W3C) research issue
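
One concrete incarnation of such a coupling is SWRL-style rules on top of the ontology, executed by Pellet; the rules are DL-safe because their variables only bind to named individuals. A hedged sketch with owlready2 (the vocabulary is invented for illustration):

    from owlready2 import *

    onto = get_ontology("http://example.org/hybrid-demo.owl")

    with onto:
        class Person(Thing): pass
        class President(Person): pass
        class Case(Thing): pass
        class ExemptCase(Case): pass
        class hasPart(Case >> Thing): pass

        # Ontology predicates constrain the rule's antecedent; the head asserts a class
        rule = Imp()
        rule.set_as_rule("Case(?c), hasPart(?c, ?p), President(?p) -> ExemptCase(?c)")

    radboud = onto.President("radboud")
    c = onto.Case("case3")
    c.hasPart = [radboud]

    sync_reasoner_pellet(infer_property_values=True)
    print(onto.ExemptCase in c.is_a)   # expected: True -- the rule fired over named individuals only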

Frameworks
