Showing posts from March, 2020

Dynamic Weight Matrix Neural Network

I have a neural network that takes in a random noise vector and maps it explicitly to an image. If I train it on a single example, it memorizes that example. If I increase the number of examples, it starts to unlearn the previous examples in order to learn the new ones.

Think of the equation of a straight line, y = Wx + c (the familiar y = mx + c), where y is the image, x is the random noise vector, W is the weight matrix, and c is the bias term. With the first training pair we change W, through training, until it maps x to y. If we take another training pair, we have to change W again, and so we unlearn the first example.

So we will have to do it another way. This will involve having two weight matrices: W and a soft copy of W called W2. W will be the template weight matrix, and W2 will be made by swapping around W's indices without affecting their memory allocation, since it is a soft copy of W. What this means is that on the graph we have a family of straight lines, whose gradients…
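A minimal sketch of the two ideas above in NumPy: first, that a single training pair (x, y) can be fitted exactly by y = Wx + c; second, that a "soft copy" W2 can be made by swapping W's indices (here a transpose view) so that it shares W's memory and is updated implicitly whenever W is trained. The dimension and the zero bias are assumptions for illustration, not part of the original post.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # hypothetical size of the noise vector and flattened image

# One training pair: with a single example, y = W x + c fits exactly.
x = rng.normal(size=d)
y = rng.normal(size=d)

# For fixed c = 0, the minimum-norm least-squares solution for W on
# one pair is the outer product y x^T / (x . x).
W = np.outer(y, x) / (x @ x)
c = np.zeros(d)
assert np.allclose(W @ x + c, y)  # the single example is memorized

# A "soft copy": W2 is W with its indices swapped (a transpose view).
# It shares W's memory allocation, so training W changes W2 too,
# giving a second straight line y = W2 x + c from the same storage.
W2 = W.T
assert np.shares_memory(W, W2)

W[0, 1] += 1.0               # an in-place update to the template W...
assert W2[1, 0] == W[0, 1]   # ...shows up in the swapped copy W2
```

In NumPy a transpose is a view, not a copy, which is what makes W2 a soft copy in the post's sense; an explicit fancy-indexing permutation would instead allocate new memory.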