# Hebbian Learning | Cognitive Science And AI | BSc.CSIT (TU) | Fourth and Fifth Semester


## Hebbian Learning | Hebb’s Algorithm

The oldest and most famous of all learning rules is Hebb’s postulate of learning:

“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”

From the point of view of artificial neurons and artificial neural networks, Hebb’s principle can be described as a method of determining how to alter the weights between model neurons. The weight between two neurons increases if the two neurons activate simultaneously, and decreases if they activate separately. Nodes that tend to be either both positive or both negative at the same time develop strong positive weights, while those that tend to take opposite values develop strong negative weights.
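This sign behaviour can be seen directly from the update term x·y. A minimal sketch in Python, assuming bipolar (+1/−1) activations, which is a common convention in Hebb-rule examples but not stated in these notes:

```python
# Sign of the Hebbian update dw = x * y for bipolar (+1/-1) activations.
# When both units agree in sign, dw is +1 (weight grows);
# when they disagree, dw is -1 (weight shrinks).
for x, y in [(1, 1), (-1, -1), (1, -1), (-1, 1)]:
    dw = x * y
    print(f"x={x:+d}, y={y:+d} -> dw={dw:+d}")
```

Note that with binary (0/1) coding the product x·y is never negative, so weights can only grow; this is one reason bipolar coding is often preferred for Hebbian examples.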

Hebb’s Algorithm:
Step 0: Initialize all weights (and the bias) to 0.
Step 1: For each training input s with its target output t, set the activations of the input units: x_i = s_i
Step 2: Set the activation of the output unit to the target value: y = t
Step 3: Adjust the weights: w_i(new) = w_i(old) + x_i·y
Step 4: Adjust the bias (treated just like a weight whose input is always 1): b(new) = b(old) + y
Steps 1–4 are repeated for every training pair.
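The steps above can be sketched in Python. The bipolar AND function used below is an illustrative choice (an assumption, not part of these notes), and the function name `hebb_train` is likewise hypothetical:

```python
def hebb_train(samples, targets):
    """Hebb's algorithm: one pass over the training pairs (Steps 0-4)."""
    n = len(samples[0])
    w = [0] * n              # Step 0: all weights start at 0
    b = 0                    # ... and so does the bias
    for s, t in zip(samples, targets):
        x, y = s, t          # Steps 1-2: input activations and target output
        for i in range(n):
            w[i] += x[i] * y  # Step 3: w_i(new) = w_i(old) + x_i * y
        b += y                # Step 4: b(new) = b(old) + y
    return w, b

# Bipolar AND: output is +1 only when both inputs are +1.
samples = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
targets = [1, -1, -1, -1]
w, b = hebb_train(samples, targets)
print(w, b)  # -> [2, 2] -2
```

With the learned values, the net input w·x + b is positive only for the input (1, 1), so the trained unit reproduces the AND function.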
