I am trying to implement a Hopfield net to learn patterns.

This is what I currently have...

Currently I have it create a random number of patterns based on the number of neurons.

Next I am trying to make a weight matrix.

Should I have a separate weight matrix for every pattern, or is one enough to apply Hebb's learning rule to? One should be enough, since I know from the number of nodes how many patterns it can recognize, based on the following formula: N/4 * log N, where N is the number of nodes. Is that log base 2?
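
(For N = 5, as in the code below, that works out to (5/4) * log2(5) ≈ 1.25 * 2.32 ≈ 2.9, which truncates to 2 stored patterns.)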

Thanks.

#include <iostream>
#include <stdio.h>  // printf
#include <stdlib.h> // rand, srand
#include <time.h>   // time, to seed the generator
#include <math.h>   // log10

using namespace std;

void randomlyCreatePatterns(int numOfNeurons, int numOfPatterns, int* pattern);
void createRandomWeightMatrix(int numOfNeurons, int numOfPatterns, int* weightmatrix, int* pattern);


int main (int argc, char *argv[])
{
    cout<<"Hopfield Artificial Neural Network (ANN):"<<endl<<endl;
    srand(time(NULL)); // use current time to seed random number generator
    int numOfNeurons = 5; // size of each pattern = number of neurons
    int* pattern = NULL;
    int numOfPatterns = 0;
    int* weightmatrix = NULL;

    // Create random pattern matrix to learn. Each row is a separate pattern to learn (n bits each).
    cout<<"Training patterns:"<<endl<<endl;
    // max capacity (number of patterns it can learn) of a Hopfield network: N/4 * log2(N)
    // note: divide by 4.0, not 4, so integer division does not truncate N/4
    numOfPatterns = static_cast<int>((numOfNeurons/4.0)*(log10((double)numOfNeurons)/log10(2.0))); // number of patterns (rows)
    cout<<"Number of patterns being stored is "<<numOfPatterns<<endl;
    pattern = new int[numOfPatterns * numOfNeurons];
    randomlyCreatePatterns(numOfNeurons, numOfPatterns, pattern);

    // Create and print out the weight matrix
    weightmatrix = new int[numOfNeurons * numOfNeurons];
    createRandomWeightMatrix(numOfNeurons, numOfPatterns, weightmatrix, pattern);

    delete[] pattern;      // release the buffers allocated above
    delete[] weightmatrix;
    return 0;
}

void randomlyCreatePatterns(int numOfNeurons, int numOfPatterns, int* pattern){
    int rows, columns;
    for(rows = 0; rows < numOfPatterns; rows++){             // rows: one per pattern
        for(columns = 0; columns < numOfNeurons; columns++){ // columns: one per neuron
            pattern[rows * numOfNeurons + columns] = rand()%2; // random 0/1 bit
            cout<<pattern[rows * numOfNeurons + columns];
        }
        cout<<endl;
    }
    cout<<endl;
}

void createRandomWeightMatrix(int numOfNeurons, int numOfPatterns, int* weightmatrix, int* pattern){
    // calculate the weight matrix (symmetric and square)
    // w[i,j] = w[j,i] and w[i,i] = 0
    // (numOfPatterns and pattern are unused here; the weights are random
    // placeholders until the Hebbian rule is applied)
    int rows, columns;
    for(rows = 0; rows < numOfNeurons; rows++)
        for(columns = rows; columns < numOfNeurons; columns++)
            if(rows == columns)
                weightmatrix[rows*numOfNeurons+columns] = 0; // zero diagonal
            else
            {
                // pick one random weight and mirror it so the matrix stays symmetric
                int ran = rand()%3;
                weightmatrix[rows*numOfNeurons+columns] = ran;
                weightmatrix[columns*numOfNeurons+rows] = ran;
            }

    // print the weight matrix
    cout<<"The weight matrix:"<<endl<<endl;
    for(rows = 0; rows < numOfNeurons; rows++)
    {
        for(columns = 0; columns < numOfNeurons; columns++)
            printf("%2d ", weightmatrix[rows*numOfNeurons+columns]);
        cout<<endl;
    }
    cout<<endl;
}


For the record, I'm not an expert on Hopfield nets (but have dealt with other types of ANN).

You are right, you only need one matrix of weights if you have fewer than the maximum number of patterns for a given net. The Hebbian rule is quite trivial to implement if you look at this link.

Remember also that, basically, you need the patterns to be reasonably "uncorrelated" in order to represent them uniquely with the net. This means that generating them at random might not be the greatest idea. Maybe you should test each pattern to see that it is not too correlated with the other patterns before you add it to the set (if the Hamming distance between two patterns is too low, you might want to generate a new one); see the sketch below. Another way to look at this is through the entropy of the set of patterns (with base 2, the entropy is roughly the number of truly distinct patterns in the set).
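
A minimal sketch of that check, assuming patterns are stored as flattened 0/1 int arrays as in the code above (hammingDistance, farEnough, and minDistance are illustrative names, not from any library):

// Count positions where two patterns of length n differ.
int hammingDistance(const int* a, const int* b, int n)
{
    int d = 0;
    for(int i = 0; i < n; i++)
        if(a[i] != b[i]) d++;
    return d;
}

// Accept a candidate only if it is at least minDistance away
// from every pattern already stored.
bool farEnough(const int* candidate, const int* patterns, int stored,
               int numOfNeurons, int minDistance)
{
    for(int p = 0; p < stored; p++)
        if(hammingDistance(candidate, patterns + p*numOfNeurons, numOfNeurons) < minDistance)
            return false; // too correlated with an existing pattern
    return true;
}

In the generator loop you would keep drawing random candidates until farEnough returns true before copying one into the pattern matrix.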

I am trying to implement the learning algorithm but am running into problems. How do I skip the stored value, since I do not want to include it in the calculations?

void Learn(int* weightmatrix, int* pattern, int numOfPatterns, int numOfNeurons){

	int skip[5]; // true means skip; fixed size assumes numOfNeurons == 5
	for(int patternNum = 1; patternNum <= numOfPatterns; patternNum++)
	{
		for(int neuronNum = 1; neuronNum <= numOfNeurons; neuronNum++)
		{
			for(int s = 0; s < numOfNeurons; s++) // reset element by element
				skip[s] = false;                  // (skip[5] = {...} does not compile)
			skip[neuronNum-1] = true; // skip the neuron's own (zero) self-weight
			int sum = 0;
			for(int skipCounter = 0; skipCounter < numOfNeurons; skipCounter++)
			{
				if(skip[skipCounter] == false)
				{
					// accumulate bit * weight over every other neuron
					sum += pattern[(patternNum-1)*numOfNeurons + skipCounter]
					     * weightmatrix[(neuronNum-1)*numOfNeurons + skipCounter];
				}
			}
			// sum is computed here but never stored or used yet
		}
	}
}

Wait a minute... what learning algorithm are you actually implementing? Your code looks like nothing I can recognize. Are you trying to implement this formula:

T_{ij}^{\rm new} = T_{ij}^{\rm old} + M_i^{\rm new}M_j^{\rm new}

This formula is a simple sum. You need to first initialize all weights to zero. Then you have three nested loops (M is the number of patterns and N is the number of neurons):

for each pattern m of M
  for each neuron i of N
    for each weight j of N
       w[i*N+j] += p[m*N+j]*p[m*N+i];

And that's it... hence my earlier comment about this implementation being trivial. A C++ version of the same loop is sketched below.
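
Here it is as compilable C++, using the flattened arrays from the code above. Mapping the 0/1 bits to -1/+1 with (2*b - 1) is my assumption (the standard Hebbian rule is stated for bipolar patterns), and hebbianLearn is just an illustrative name:

void hebbianLearn(int* w, const int* p, int M, int N)
{
    for(int i = 0; i < N*N; i++)
        w[i] = 0;                        // initialize all weights to zero
    for(int m = 0; m < M; m++)           // for each pattern m of M
        for(int i = 0; i < N; i++)       // for each neuron i of N
            for(int j = 0; j < N; j++)   // for each weight j of N
                if(i != j)               // keep the diagonal at zero
                    w[i*N+j] += (2*p[m*N+i] - 1) * (2*p[m*N+j] - 1);
}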

If it is another algorithm you are implementing, please post the details or at least a link to a page describing it.

Thanks, I got it to work.

I am only filling in the top half of my weight matrix... So if the sum is greater than 0, do I change the neuron to the opposite value, and otherwise leave it the same?
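
For what it's worth, the usual Hopfield update sets a neuron from the sign of its input sum rather than flipping it. A minimal sketch under that assumption, reading the symmetric weight out of the stored top half (updateNeuron is an illustrative name):

void updateNeuron(int i, int* state, const int* w, int N)
{
    int sum = 0;
    for(int j = 0; j < N; j++)
    {
        if(j == i) continue;                     // skip the zero self-weight
        int wij = (i < j) ? w[i*N+j] : w[j*N+i]; // only the top half is stored
        sum += wij * (2*state[j] - 1);           // 0/1 bits mapped to -1/+1
    }
    state[i] = (sum > 0) ? 1 : 0;                // set from the sign, don't just flip
}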
