Boltzmann machine


A Boltzmann machine is a type of stochastic recurrent neural network and Markov random field invented by Geoffrey Hinton and Terry Sejnowski in 1985. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield nets. They were one of the first examples of a neural network capable of learning internal representations, and are able to represent and (given sufficient time) solve difficult combinatoric problems. They are theoretically intriguing because of the locality and Hebbian nature of their training algorithm, and because of their parallelism and the resemblance of their dynamics to simple physical processes. Due to a number of issues discussed below, Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference, but if the connectivity is properly constrained, the learning can be made efficient enough to be useful for practical problems.

They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function.
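For reference only (not part of the original text), a minimal statement of that distribution: the probability of a global network state s falls off exponentially with its energy E(s), where T is a temperature parameter and the sum in the denominator runs over all possible states.

```latex
P(\mathbf{s}) = \frac{e^{-E(\mathbf{s})/T}}{\sum_{\mathbf{s}'} e^{-E(\mathbf{s}')/T}}
```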

A Boltzmann machine, like a Hopfield network, is a network of units with an "energy" defined for the network. It also has binary units, but unlike Hopfield nets, Boltzmann machine units are stochastic. The global energy, E, in a Boltzmann machine is identical in form to that of a Hopfield network:

E = -\left( \sum_{i<j} w_{ij} \, s_i \, s_j + \sum_i \theta_i \, s_i \right)

Where:

- w_{ij} is the connection strength between unit j and unit i.
- s_i is the state, s_i \in \{0, 1\}, of unit i.
- \theta_i is the bias of unit i in the global energy function. (-\theta_i is the activation threshold for the unit.)

Often the weights w_{ij} are represented in matrix form with a symmetric matrix W = [w_{ij}], with zeros along the diagonal.
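As a concrete illustration (not from the original article), the following NumPy sketch computes this global energy, assuming 0/1 unit states and a symmetric weight matrix with a zero diagonal; the function name and example values are hypothetical.

```python
import numpy as np

def boltzmann_energy(W, theta, s):
    """Global energy E of a Boltzmann machine (same form as a Hopfield network).

    W     : symmetric weight matrix with zeros on the diagonal, W[i, j] = w_ij
    theta : vector of unit biases, theta[i] = theta_i
    s     : binary state vector, s[i] in {0, 1}
    """
    # Sum over i < j of w_ij * s_i * s_j; with a symmetric, zero-diagonal W
    # this equals half of the full quadratic form s^T W s.
    pairwise = 0.5 * s @ W @ s
    bias = theta @ s
    return -(pairwise + bias)

# Hypothetical example: a small network with three units.
W = np.array([[ 0.0, 1.0, -2.0],
              [ 1.0, 0.0,  0.5],
              [-2.0, 0.5,  0.0]])
theta = np.array([0.1, -0.3, 0.2])
s = np.array([1, 0, 1])

print(boltzmann_energy(W, theta, s))  # E = -((-2.0) + 0.3) = 1.7
```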

