
Essay: Global Optimization

Published: 22 September 2015
Last Modified: 23 July 2024

Global optimization uses strategies that allow the designer to distinguish the global optimum from the many local optima within the region under investigation. A global optimization problem is generally posed as an unconstrained problem, i.e., without bounds on the variables.
Conventional global optimization methods can generally be classified into two types:
Deterministic methods
Stochastic methods
The design with the highest efficiency is not necessarily the best one, because the motor design process involves many other considerations, such as ease of manufacture, reliability, and durability. It is generally very difficult to fold all of these factors into a single objective function, so a powerful multi-objective optimization algorithm is needed.
Developing a multi-objective optimal configuration for a particular machine raises several issues. First, not all geometrical constraints, such as lower and upper bounds, and behavioral considerations can be included. Second, time-consuming analyses such as transient temperature rise cannot be performed within the optimization loop because of their computational cost.
We therefore use only the most relevant parameters in building the objective function and apply strategies [29] that distinguish the different variables in the objective function by locating local as well as global optima. In other words, the designer can apply multiple criteria and his own experience to select the best design among the generated solutions.
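One common way to build a single objective from several design criteria is a weighted sum. The sketch below is only an illustration of that idea: the criteria names (efficiency, cost, mass) and the weights are assumptions, not the thesis's actual objective function.

```python
# Hypothetical weighted-sum objective combining several motor-design
# criteria into one scalar. Criteria and weights are illustrative only.

def objective(design, weights=(0.6, 0.3, 0.1)):
    """Lower is better: negate efficiency (to be maximized), add cost and mass."""
    w_eff, w_cost, w_mass = weights
    return (-w_eff * design["efficiency"]
            + w_cost * design["cost"]
            + w_mass * design["mass"])

candidate = {"efficiency": 0.92, "cost": 0.40, "mass": 0.55}
score = objective(candidate)
```

Changing the weights shifts which trade-off the optimizer favors, which is one reason the designer's own judgment is still needed when picking among generated solutions.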
The two algorithms employed in this thesis are:
Genetic Algorithm (GA)
Simulated Annealing
While general optimization algorithms converge to a single optimum and discard the others, genetic algorithms and simulated annealing can identify multiple optimal profiles.
Genetic Algorithm (GA)
The genetic algorithm is an artificial-intelligence technique that uses a computer to simulate nature's process of evolution and selection. Genetic algorithms are used to solve problems that are ill-behaved, discontinuous, or non-differentiable, and therefore unsuitable for a variety of other techniques. Since genetic algorithms emerged in the late 1970s, global optimization has been one of their main targets, and a great deal of effort has been devoted to developing effective algorithmic models for global optimization problems; one such development is simulated annealing.
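Simulated annealing accepts occasional uphill (worse) moves with a probability that shrinks as a "temperature" parameter cools, which lets it escape local minima. The following minimal sketch is an assumption for illustration, not the thesis's implementation; the test function and cooling schedule are arbitrary choices.

```python
import math
import random

# Minimal simulated-annealing sketch (illustrative, not the thesis's code):
# minimize a multimodal 1-D function.

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_fx = x, fx
    for _ in range(steps):
        x_new = x + rng.gauss(0, 0.5)          # propose a random neighbor
        fx_new = f(x_new)
        # Always accept improvements; accept worse moves with prob e^(-delta/T)
        if fx_new < fx or rng.random() < math.exp(-(fx_new - fx) / t):
            x, fx = x_new, fx_new
            if fx < best_fx:
                best_x, best_fx = x, fx
        t *= cooling                            # geometric cooling schedule
    return best_x, best_fx

# A simple multimodal test function: the sine term creates local minima
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
x_best, f_best = simulated_annealing(f, x0=4.0)
```

Because worse moves are sometimes accepted early on, the search can cross the barriers between local minima before the cooling temperature freezes it into a basin.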
How Genetic Algorithms Work
Genetic algorithms are based on the process of natural selection: they take the essential mechanisms of natural selection and apply them to whatever problem we are trying to solve. The basic process for a genetic algorithm is:
Initialization – Create a random initial population (an array of individuals) [30]. The population is usually generated at random and can be of any size, from a few individuals to thousands. 'PopulationSize' specifies how many individuals there are in each generation. With a large 'PopulationSize', the genetic algorithm searches the solution space more thoroughly, reducing the risk that the algorithm will return a local minimum that is not the global minimum.
Evaluation – Each individual in the population is evaluated and assigned a "fitness value" for that generation. The fitness value measures how well the individual meets the desired requirements.
Selection – We want to continually improve the population's fitness. Selection achieves this by discarding the bad designs and keeping only the best individuals across generations, using one of several selection methods. Certain individuals in the current generation with the best fitness values, called elite individuals [30], are carried over to the next generation automatically.
Crossover – During crossover we create new individuals by combining properties of our chosen individuals.
Mutation – We need to add a little randomness to the population's genetics; otherwise every combination of solutions we could produce would already be present in the initial population. Mutation typically works by introducing small changes to the parents.
And repeat! – With the next generation of individuals in hand, we start again from step 2 until a stopping condition is met.
The genetic algorithm repeatedly modifies a population of individuals. At each step, it selects parents from the current generation and uses them to produce the children of the next generation through crossover and mutation, until an optimal solution is found. The algorithm stops when the stopping criterion (stall generations) is met, that is, when the average relative change in the fitness value over the stall generations falls below the function tolerance, whose default value is 1e-6.
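The steps above can be sketched as a small loop. This is an illustrative implementation under stated assumptions, not the thesis's actual code: parameter names mirror MATLAB-style options ('PopulationSize', stall generations, function tolerance), while the crossover (gene averaging), mutation (Gaussian perturbation), and selection (top half plus elites) schemes are simple choices made here for clarity.

```python
import random

# Illustrative GA sketch following the steps in the text: initialization,
# evaluation, selection with elitism, crossover, mutation, and a
# stall-generations stopping test. Assumed parameters, not the thesis's.

def ga_minimize(fitness, n_vars, bounds=(-5.0, 5.0), pop_size=40,
                elite_count=2, mutation_rate=0.1, max_gens=200,
                stall_gens=50, tol=1e-6, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    # 1. Initialization: random population of real-valued individuals
    pop = [[rng.uniform(lo, hi) for _ in range(n_vars)]
           for _ in range(pop_size)]
    best_hist = []
    for _ in range(max_gens):
        # 2. Evaluation: rank every individual by fitness (lower is better)
        scored = sorted(pop, key=fitness)
        best = fitness(scored[0])
        best_hist.append(best)
        # Stop when the best fitness has stalled over stall_gens generations
        if len(best_hist) > stall_gens and \
                abs(best_hist[-stall_gens - 1] - best) < tol:
            break
        # 3. Selection: elites pass through unchanged; parents from top half
        next_pop = [ind[:] for ind in scored[:elite_count]]
        parents = scored[:pop_size // 2]
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            # 4. Crossover: blend the genes of the two parents
            child = [(a + b) / 2 for a, b in zip(p1, p2)]
            # 5. Mutation: occasional small random perturbations
            child = [g + rng.gauss(0, 0.3) if rng.random() < mutation_rate
                     else g for g in child]
            next_pop.append(child)
        pop = next_pop  # 6. Repeat with the new generation
    return min(pop, key=fitness)

# Usage: minimize the sphere function, whose optimum is at the origin
sphere = lambda x: sum(g * g for g in x)
best = ga_minimize(sphere, n_vars=3)
```

Elitism guarantees that the best fitness value never worsens from one generation to the next, which is what makes the stall-based stopping test meaningful.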
