On Data Mining and Classification Using a Bayesian Confidence Propagation Neural Network

(only summary pdf), (thesis cover pdf)

Author: Roland Orre

Abstract

The aim of this thesis is to describe how a statistically based neural network technology, here named BCPNN (Bayesian Confidence Propagation Neural Network), whose structure may be identified by rewriting Bayes' rule, can be used in a few applications: data mining and classification with credibility intervals, as well as unsupervised pattern recognition.

BCPNN is a neural network model somewhat reminiscent of the Bayesian decision trees often used within artificial intelligence systems. It has previously been successfully applied to classification tasks such as fault diagnosis, supervised pattern recognition and hierarchical clustering, and has also been used as a model for cortical memory. The learning paradigm used in BCPNN is rather different from that of many other neural network architectures. Learning in, e.g., the popular backpropagation (BP) network is a gradient method on an error surface, whereas learning in BCPNN is based upon calculations of marginal and joint probabilities between attributes. This is quite a time-efficient process compared to, for instance, gradient learning. The weight values in BCPNN are also easy to interpret compared to those of many other network architectures. These weight values and their uncertainty are also what we focus on in our data mining application. The most important results and findings in this thesis can be summarised in the following points:
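As a rough illustration of the learning paradigm described above, the following sketch estimates marginal and joint probabilities from co-occurrence counts and forms a log-ratio weight between attribute values. The toy data and variable names are assumptions for illustration only, not the notation or implementation used in the thesis:

```python
import math
from collections import Counter

# Hypothetical toy data: each sample pairs an input attribute value
# with a class attribute value.
samples = [("a", "x"), ("a", "x"), ("a", "y"),
           ("b", "y"), ("b", "y"), ("b", "x")]

n = len(samples)
count_in = Counter(i for i, _ in samples)   # marginal counts of input values
count_out = Counter(j for _, j in samples)  # marginal counts of class values
count_joint = Counter(samples)              # joint counts of (input, class) pairs

# A BCPNN-style weight: the log of the joint probability over the product
# of the marginals, i.e. positive when two values co-occur more often
# than independence would predict, negative when less often.
weights = {
    (i, j): math.log((count_joint[(i, j)] / n) /
                     ((count_in[i] / n) * (count_out[j] / n)))
    for (i, j) in count_joint
}
```

Note that learning here is a single counting pass over the data rather than an iterative gradient descent, which is the time-efficiency contrast drawn above.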


Roland Orre

Addendums

(Python code to Monty Hall problem page 4)
Special thanks to:
Dean S Horak, who wrote a simulation in Java, and
Curt Welch, who wrote a simulator in Python, which inspired me to try one of my own using the set type in Python.
Thanks also to Dean S Horak for adding the code to the pythonfiddle.
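For readers without access to the addendum, a minimal sketch of a Monty Hall simulation built around Python's set type might look like the following. This is not the author's original code; the function name and structure are assumptions for illustration:

```python
import random

def monty_hall(trials=100_000, switch=True):
    """Simulate the Monty Hall game; return the fraction of wins."""
    doors = {1, 2, 3}
    wins = 0
    for _ in range(trials):
        car = random.choice(tuple(doors))   # door hiding the car
        pick = random.choice(tuple(doors))  # player's initial choice
        # The host opens a door that is neither the car nor the pick;
        # set difference makes this constraint a one-liner.
        host = random.choice(tuple(doors - {car, pick}))
        if switch:
            # Switching means taking the single remaining closed door.
            pick = (doors - {pick, host}).pop()
        wins += (pick == car)
    return wins / trials
```

Running it shows the well-known result: switching wins roughly two thirds of the time, while staying wins roughly one third.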

Last modified: Thu Sep 22 16:10:00 CEST 2016