Inductive learning with corroboration
P. Watson
Technical Report 6-99, Computing Laboratory, UKC, May 1999.

Abstract
The basis of inductive learning is the process of generating and refuting hypotheses. Natural approaches to this form of learning assume that a data item that causes refutation of one hypothesis opens the way for the introduction of a new (for now unrefuted) hypothesis, and so such data items have attracted the most attention. Data items that do not cause refutation of the current hypothesis have until now been largely ignored in these processes, but in practical learning situations they play the key role of *corroborating* those hypotheses that they do not refute.
We formalise a version of K.R. Popper's concept of *degree of corroboration* for inductive inference and utilise it in an inductive learning procedure which has the natural behaviour of outputting the most strongly corroborated (non-refuted) hypothesis at each stage. We demonstrate its utility by providing characterisations of several of the commonest identification types. In many cases we believe that these characterisations make the relationships between these types clearer than the standard characterisations. The idea of learning with corroboration therefore provides a unifying approach for the field.
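The learning behaviour described in the abstract can be illustrated with a minimal sketch. This is not the paper's formal procedure; it assumes a toy setting where each hypothesis is a boolean predicate, a data item refutes any hypothesis it falsifies, corroboration is counted as the number of non-refuting items seen so far, and the learner outputs the most strongly corroborated unrefuted hypothesis after each item. All names below (`learn_with_corroboration`, `all_even`, `all_positive`) are hypothetical.

```python
# Sketch of a corroboration-driven learning loop (illustrative only, not the
# paper's formal definition): each surviving hypothesis accumulates a
# corroboration count from the data items it does not refute; a refuting item
# eliminates the hypothesis; after each item the learner outputs the most
# strongly corroborated unrefuted hypothesis.

def learn_with_corroboration(hypotheses, data_stream):
    """hypotheses: dict mapping a name to a predicate over data items.
    Yields, after each data item, the name of the currently most
    corroborated unrefuted hypothesis (if any survive)."""
    corroboration = {name: 0 for name in hypotheses}
    for item in data_stream:
        for name in list(corroboration):
            if hypotheses[name](item):
                corroboration[name] += 1   # item corroborates this hypothesis
            else:
                del corroboration[name]    # item refutes it: discard
        if corroboration:
            yield max(corroboration, key=corroboration.get)

# Toy usage: competing hypotheses about a stream of integers.
hyps = {
    "all_even": lambda n: n % 2 == 0,
    "all_positive": lambda n: n > 0,
}
guesses = list(learn_with_corroboration(hyps, [2, 4, 6, 3]))
# The item 3 refutes "all_even", so the final output is "all_positive".
```

The point of the sketch is the abstract's key observation: the non-refuting items (2, 4, 6) are not wasted; they are what makes the surviving hypothesis the one the learner commits to.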
Bibtex Record
@techreport{782,
  author      = {P. Watson},
  title       = {Inductive learning with corroboration},
  institution = {Computing Laboratory, UKC},
  number      = {6-99},
  month       = {May},
  year        = {1999},
  pages       = {182-196},
  keywords    = {determinacy analysis, Craig interpolants},
  url         = {http://www.cs.kent.ac.uk/pubs/1999/782},
}