Johns Hopkins EN 600 446 - Computer-Integrated Surgery II Critical Paper Review


Sam Clanton
Computer-Integrated Surgery II
Critical Paper Review

This short report will discuss the works Genetic Learning for Adaptive Image Segmentation [i] and Genetic Algorithms in Search, Optimization, and Machine Learning [ii]. The former book deals with genetic search in the optimization of parameters for computer vision algorithms. The second explains some of the mathematical foundations of genetic algorithms in greater detail and presents some of their other applications in industry and research. Genetic algorithms are essentially used in optimization problems, usually when the function to be optimized depends nonlinearly and unpredictably on a number of different “tuning” parameters as well as on the problem input. They are useful when optimizing functions with a large number n of parameters, when optimizing those parameters would otherwise require an impractical exhaustive search of n-dimensional space. They are especially useful when building adaptive optimization systems, that is, when the algorithm not only attempts to optimize on inputs given to it, but also modifies its own optimization procedure based on trends in its input over time.

The trade-off at the heart of such systems is captured by the k-armed bandit problem [iii], a rather bizarre name for a problem involving a slot machine with k arms. The machine pays off a certain amount a when an arm is pulled, with a variance v. The problem is to maximize the total payoff over some number of pulls significantly greater than k. Each pull involves a gamble: because of the inherent variance in each arm’s payoff, the operator never knows for certain whether he has found the highest-payoff arm. At what point should the operator decide that he has the best arm and keep pulling it? It is this kind of trade-off between exploration of knowledge and exploitation of that knowledge for maximization of some gain to which genetic algorithms are well adapted.
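The exploration-versus-exploitation tension in the k-armed bandit can be sketched with a simple epsilon-greedy strategy on a simulated machine. This is an illustrative toy, not taken from the reviewed books; the Gaussian payoff model and the choice of epsilon are assumptions for the sake of the example.

```python
import random

def pull(arm_means, k):
    """Simulate pulling arm k: payoff is the arm's mean plus Gaussian noise."""
    return random.gauss(arm_means[k], 1.0)

def epsilon_greedy(arm_means, pulls=1000, epsilon=0.1, seed=0):
    """With probability epsilon explore a random arm; otherwise exploit
    the arm whose estimated payoff is currently highest."""
    random.seed(seed)
    k = len(arm_means)
    counts = [0] * k
    estimates = [0.0] * k
    total = 0.0
    for _ in range(pulls):
        if random.random() < epsilon:
            arm = random.randrange(k)                            # explore
        else:
            arm = max(range(k), key=lambda a: estimates[a])      # exploit
        reward = pull(arm_means, arm)
        counts[arm] += 1
        # Incremental running mean of this arm's observed payoffs.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return total / pulls

# With arms paying 0, 1, and 2 on average, the per-pull average payoff
# should approach the best arm's mean as the operator settles on it.
avg = epsilon_greedy([0.0, 1.0, 2.0])
```

The epsilon parameter is exactly the "when do I stop testing other arms" decision described above: lowering it commits earlier to the apparent best arm, at the risk of never discovering a better one.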
Genetic algorithms operate on the theory of natural selection, with “nature” in this case replaced by fitness in terms of the desired output of some function. Selection is carried out through a number of syntactic operations upon string representations of algorithm parameter sets [iv], perhaps somewhat analogous to the operation of the cellular reproductive apparatus upon DNA (nature’s encoding mechanism). These algorithms operate on successive generations of parameter sets within a “population” of such sets, in an effort to obtain a population optimally tuned for input similar to that presented to the system so far. For a given input, the system matches the currently known optimal parameter sets within the population to that input, then “reproduces” and “mutates” the parameter sets in an attempt to produce even better output. Reproduction joins together some of the more optimal parameter sets, letting some of the parameters “cross over” from each to produce unique “offspring” that are added to the population, much as genes cross over in biological life. During some reproduction cycles, some members of the population are also allowed to “mutate,” randomly changing their parameter sets in the hope of improving that individual’s fitness. Given sufficient input, genetic optimization techniques tend to produce adapted systems much more quickly than other search techniques such as simulated annealing and random search. With a large population of parameter sets, it is possible to create systems adapted to a large variety of inputs.

In problems involving image segmentation, Bhanu and Lee have shown that genetic techniques can prove tremendously useful in creating adaptive segmentation systems. For these sorts of problems, statistical data on certain qualities of images (e.g., representations of histograms of image hue, saturation, etc.) form the input to the genetic optimization system.
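The selection, crossover, and mutation cycle described above can be sketched as a minimal generational genetic algorithm over bitstrings. The OneMax fitness function (count of 1-bits), tournament selection, and all parameter values here are illustrative assumptions, not details from the reviewed texts.

```python
import random

def evolve(fitness, length=20, pop_size=30, generations=60,
           crossover_rate=0.9, mutation_rate=0.02, seed=1):
    """Minimal generational GA over bitstrings (lists of 0/1)."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection: the fitter of two random individuals
            # gets to reproduce (fitness acting as "nature").
            p1 = max(random.sample(pop, 2), key=fitness)
            p2 = max(random.sample(pop, 2), key=fitness)
            child = p1[:]
            if random.random() < crossover_rate:
                cut = random.randrange(1, length)    # single-point crossover
                child = p1[:cut] + p2[cut:]
            for i in range(length):                  # bit-flip mutation
                if random.random() < mutation_rate:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: a toy fitness counting 1-bits; the GA should converge toward
# the all-ones string.
best = evolve(sum)
```

Note how little machinery is involved, which anticipates the observation below that the algorithms themselves are surprisingly simple: selection, a string splice, and a random bit flip.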
The system optimizes a group of parameters that control the operation of a segmentation algorithm, including such things as threshold values and smoothing constants, so as to produce the highest-quality output possible with a given segmentation system. With a large number of parameter sets resident in a population, the system can “pick out” the sets closest to those matching a given input, test the segmentation algorithm with those sets, and then apply genetic operators to those population members to produce offspring that may be even better adapted to segmenting the image. These members are tested, and the best overall segmented image is supplied as output. If the new parameter sets have worked better than those used before, they are added to the population. In this way, there is a balance between learning and “paying off” (producing good output) which leads to highly adaptive systems.

It is interesting to note that there is very little going on within the genetic algorithms themselves; there is nothing very complex about the methods employed in their operation. Although two longer books were used for this report, the meat of how genetic algorithms are employed took up perhaps a single chapter within each book, with the remainder of the material dedicated to proving the algorithms’ effectiveness and showing their different applications. The operation of these algorithms is surprisingly simple and easy to learn.

My own impression is that the use of “genetic” algorithms, supposed to mimic the way in which biological systems work, is a bit misleading. There is a big difference between the genetic operations on actual genes and those that operate on parameters.
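As a rough sketch of the parameter-tuning idea (not Bhanu and Lee's actual system), the following evolves a single threshold parameter against a synthetic two-cluster pixel distribution. The pixel model, the accuracy-based fitness standing in for segmentation quality, and the averaging-style crossover are all assumptions made for illustration.

```python
import random

# Toy stand-in for "segmentation quality": how well a single threshold
# separates dark foreground pixels (label 0) from bright background
# pixels (label 1). A real system would score actual segmented images.
random.seed(0)
pixels = ([(random.gauss(60, 10), 0) for _ in range(200)] +
          [(random.gauss(180, 10), 1) for _ in range(200)])

def fitness(threshold):
    """Fraction of pixels the threshold classifies correctly."""
    correct = sum((value >= threshold) == bool(label)
                  for value, label in pixels)
    return correct / len(pixels)

def tune_threshold(pop_size=20, generations=40, mutation_sigma=8.0):
    """Evolve a population of candidate thresholds toward the best
    separator: keep the fitter half, breed children by averaging two
    survivors (crossover) plus Gaussian noise (mutation)."""
    pop = [random.uniform(0, 255) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = [(random.choice(survivors) + random.choice(survivors)) / 2
                    + random.gauss(0, mutation_sigma)
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = tune_threshold()   # should land between the two pixel clusters
```

The same loop generalizes to vectors of parameters (thresholds plus smoothing constants, etc.); only the encoding of an individual and the fitness evaluation change.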
In a biological system, genetic operations substitute different amino acids in the construction of proteins, the position of a mutation or crossover corresponding to a change of protein structure at the position of a particular amino acid. In an optimization problem, crossover and mutation operate on (usually binary) representations of the numbers themselves. Mutation and crossing-over at random locations within the representation of a number have very different effects depending on the position of the change, because the value of each bit increases as one moves toward the most significant end of a binary number. The farther a changed bit lies toward the high-order end of the representation, the larger the resulting jump in the encoded parameter value.
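The positional effect described above is easy to see numerically: flipping bit i of an unsigned integer changes its value by exactly 2**i, so a high-order mutation produces a far larger jump than a low-order one. (Gray coding is one common remedy, since it makes adjacent values differ by a single bit; that choice is mentioned here only as general background.)

```python
def flip_bit(x, i):
    """Return x with bit i toggled; the value changes by exactly 2**i."""
    return x ^ (1 << i)

x = 0b10110100          # 180
low  = flip_bit(x, 0)   # low-order flip:  181, a change of 1
high = flip_bit(x, 7)   # high-order flip:  52, a change of 128
```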

