UW-Madison ECE 539 - Associative Memories: A Morphological Approach
Associative Memories: A Morphological Approach
by Jason M. Carey
Semester Project: CS 539
19 December 2003

Table of Contents
Introduction
  Motivation
  Morphological Associative Memories
Work Performed
  Experiment
  Data Collection
  Data Setup for Associative Memory
  Training the Associative Memory Models
  Testing the Associative Memory Models
  User Application
Results
  Effects of Distortion and Memory Size on Average Recall Rate
  Effects of Memory Size and Letter Font on Average Recall Rate
Discussion of Results
Conclusion
References
Appendix: Tables of Results
  Results of Memory Performance with 5 Images
  Results of Memory Performance with 10 Images
  Results of Memory Performance with 26 Images
  Results of Memory Performance with 52 Images

Introduction

Motivation

Throughout the course of evolution, humans have acquired the ability to retrieve information from associated stimuli. Remarkably, this ability is not hindered by perturbation of the original information-stimulus pair (e.g., recalling one's relationship with another person after not seeing them for several years, despite physical changes such as aging or growing a beard). How the human brain efficiently organizes and stores such vast amounts of information, and how it recalls that information given partial or incomplete stimuli, has attracted much interest. In particular, researchers have developed the theoretical neural network model of the associative memory. One of the earliest variants of this model was the linear associative memory. Owing to its simplistic formulation, this memory was severely limited in flexibility, robustness, and capacity. Its primary defects were the requirement of orthogonality between stored patterns, perfect recall only under negligible pattern distortion, and a memory capacity constrained to no more than the length of the memory [1].

These limitations were later mitigated by the development of the fully connected, recurrent Hopfield network. Using this model, the capacity of the associative memory has been shown to be $n/\log_2 n$, where $n$ is the length of the memory [3]. Although the memory capacity of this model is less than that of the linear associative model, no conditions are placed on the input for perfect recall, and recall remains possible even under substantial input distortion. Today, the Hopfield network is one of the most popular models for binary associative memories [2]. Unfortunately, even though the Hopfield model drastically improved on the performance of the linear associative memory, its limited capacity still makes associative memories impractical.

This paper attempts to push the limits of associative memories further by using the radically different model of the morphological neural network (MNN). Historically, neural networks have assumed the biological phenomenon that the strength of the electric potential of a signal traveling along an axon is the result of a multiplicative process, in which the postsynaptic membrane of a neuron adds the various potentials of electrical impulses to determine activation [4]. MNNs, in contrast, are formulated on the belief that the strength of the electric potential is an additive process, in which the postsynaptic membrane only accepts signals of a certain maximum strength [4].
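To make this contrast concrete, the following minimal sketch (Python with NumPy; the weights and input are arbitrary illustrative values, not taken from the paper) compares a classical neuron, which multiplies inputs by weights and sums, with a morphological neuron, which adds inputs to weights and takes the maximum:

```python
import numpy as np

# Illustrative values only (not from the paper).
w = np.array([0.2, -0.5, 1.0])   # synaptic weights
x = np.array([1.0, 0.0, 1.0])    # input signal

# Classical neuron: multiply each input by its weight, then sum.
classical = np.dot(w, x)          # sum_k w_k * x_k

# Morphological neuron: add each input to its weight, then take the maximum.
morphological = np.max(w + x)     # max_k (w_k + x_k)

print(classical, morphological)
```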
Mathematically, this causes a change from the historically used ring structure with the operators addition and multiplication to a structure using the operators addition and minimum/maximum. Ultimately, the power of MNNs as a model for associative memories lies in their ability to store as many patterns as can be represented ($2^n$ in the binary case), with the ability to recall partial or incomplete patterns in one-step convergence [5].

Morphological Associative Memories

An associative memory is a system that, when given input x, produces output y; that is, the memory associates x with y. An auto-associative memory is an associative memory such that $y = x$. Furthermore, a binary associative memory is an associative memory containing strictly binary values. Using neural networks, associative memories are able to recall the desired information given partial or incomplete inputs [5].

The following is a brief formulation of the morphological associative memory, taken from [5]. Morphological associative memories require the basic morphological operations of the max product and the min product. The max product, $C = A \vee B$, of an $n \times p$ matrix $A$ and a $p \times n$ matrix $B$ is defined as

$c_{ij} = \bigvee_{k=1}^{p} (a_{ik} + b_{kj})$,

where $\vee$ is the maximum operator. Similarly, the min product, $C = A \wedge B$, is defined as

$c_{ij} = \bigwedge_{k=1}^{p} (a_{ik} + b_{kj})$,

where $\wedge$ is the minimum operator. Using these operations, a morphological associative memory can be constructed from two separate memories, $M$ and $W$. Here, memory $M$ is used for input patterns that are dilated versions of a trained pattern. Informally, dilation means expanding the image; for a black-and-white image, dilation causes the image to contain more black pixels. Memory $M$, associating input pattern matrix $X$ with output pattern matrix $Y$, is defined as

$M_{XY} = \bigvee_{i=1}^{k} \left[ y_i \times (-x_i)^{T} \right]$, i.e., $m_{rc} = \bigvee_{i=1}^{k} \left( y_{ri} - x_{ci} \right)$,

where $x_i$ and $y_i$ are the $i$th patterns of the pattern matrices $X$ and $Y$.
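As a concrete illustration, the sketch below (Python with NumPy) implements the max and min products and builds the dilation-robust memory $M$ described above. The function names, the tiny binary patterns, and the dilated test input are invented for illustration; the recall step shown here, applying $M$ to the input with the min product, follows the standard formulation in the cited literature rather than text visible in this preview, which is cut off before the recall rule is stated.

```python
import numpy as np

def max_product(A, B):
    """Morphological max product: C[i, j] = max_k (A[i, k] + B[k, j])."""
    return np.max(A[:, :, None] + B[None, :, :], axis=1)

def min_product(A, B):
    """Morphological min product: C[i, j] = min_k (A[i, k] + B[k, j])."""
    return np.min(A[:, :, None] + B[None, :, :], axis=1)

def build_M(X, Y):
    """Memory M for dilated inputs, with one pattern per column of X and Y.

    Entry M[r, c] = max_i (Y[r, i] - X[c, i]), i.e. the elementwise maximum
    over all stored patterns of the outer sums y_i + (-x_i)^T."""
    return np.max(Y[:, None, :] - X[None, :, :], axis=2)

# Tiny autoassociative demo with two binary patterns (columns of X).
X = np.array([[0, 1],
              [1, 0],
              [0, 1]])                 # x1 = (0,1,0), x2 = (1,0,1)
M = build_M(X, X)                      # autoassociative: Y = X

x_dilated = np.array([[1], [1], [0]])  # x1 with one extra 1 (dilative noise)
recalled = min_product(M, x_dilated)
print(recalled.ravel())                # -> [0 1 0], i.e. x1 is recovered
```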