
Introduction to Numerical Solutions

24 Feb 2012 · CPOL · 11 min read
An introduction to numerical solver algorithms with general purpose demonstration code.
///////////////////////////////////////////////////////////////////////////////
//
//  Solver.cs
//
//  By Philip R. Braica (HoshiKata@aol.com, VeryMadSci@gmail.com)
//
//  Distributed under the The Code Project Open License (CPOL)
//  http://www.codeproject.com/info/cpol10.aspx
///////////////////////////////////////////////////////////////////////////////

// Using.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

// Namespace
namespace SolverDemo
{
    /// <summary>
    /// Variate solver, supporting:
    ///    Paris Genetic, 
    ///    Direct Search, 
    ///    Slope Variate.
    /// 
    /// There are many other ways to solve these problems: steepest descent, 
    /// linear and non-linear least squares, and weighted linear and non-linear
    /// least squares. Those are good approaches, but other people have written
    /// about and explored them extensively; the methods here are generally very
    /// useful, fast, and reliable alternatives I like.
    /// 
    /// The Slope Variate method is new; it works to refine a solution and was
    /// developed by Phil Braica in 2011.
    /// </summary>
    public class Solver
    {
        /// <summary>
        /// Error computation delegate.
        /// For a given set of parameters, compare against known data and compute 
        /// an error score, where 0 means no error and larger values mean a worse fit.
        /// </summary>
        /// <param name="parameters">Parameters to try.</param>
        /// <returns>Error</returns>
        public delegate double ComputeError_Delegate(double[] parameters);
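
        /// <summary>
        /// Illustrative sketch of a possible ComputeError_Delegate: it fits
        /// y = a*x + b (parameters[0] = a, parameters[1] = b) to a small
        /// hard-coded sample data set by summing squared residuals, so 0 would
        /// mean a perfect fit. The data values are arbitrary assumptions used
        /// only for demonstration.
        /// </summary>
        private static double ExampleLineFitError(double[] parameters)
        {
            double[] xs = { 0, 1, 2, 3, 4 };
            double[] ys = { 1.1, 2.9, 5.2, 7.1, 8.8 };
            double sum = 0;
            for (int i = 0; i < xs.Length; i++)
            {
                // Squared residual of the model against the i-th sample.
                double diff = ((parameters[0] * xs[i]) + parameters[1]) - ys[i];
                sum += diff * diff;
            }
            return sum;
        }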

        /// <summary>
        /// List of parameters to vary.
        /// </summary>
        public List<Parameter> Parameters = new List<Parameter>();

        /// <summary>
        /// Step list.
        /// </summary>
        public List<Step> Steps = null;

        /// <summary>
        /// Best step to use.
        /// </summary>
        public Step BestStep { get; protected set; }

        /// <summary>
        /// Variate solver, Paris-style genetic algorithm.
        /// 
        /// An N-dimensional sampling of the error over a region of the N 
        /// parameters is made, and the lowest error is considered best. 
        /// The best then seeds the next generation.
        /// 
        /// It is up to the user to balance processing time against the likely 
        /// smoothness of the error surface. For this reason the user selects the 
        /// range for each parameter, and not every variation is tested: only the 
        /// maxStepsPerGeneration candidates closest to the center are evaluated, 
        /// chosen by sorting the possible steps.
        /// 
        /// The approach is more immune to settling on a local minimum instead of 
        /// the best answer, but each successive generation improves the answer 
        /// more slowly than other approaches. Also, depending on how the user has 
        /// set up the parameters, it may not converge to the ideal solution.
        /// 
        /// Some solutions converge best if the software focuses on certain parameters 
        /// first, then refines the others later. That "order of operations" quality 
        /// is missing in this approach, which brings more robustness but less 
        /// accuracy in some cases.
        /// </summary>
        /// <param name="computeErrorFunction">Delegate to use.</param>
        /// <param name="maxGenerations">Number of generations</param>
        /// <param name="maxStepsPerGeneration">Maximum number of test cases per generation.</param>
        /// <param name="retestFactor">How much to expand search each generation, 1 = last step size, 1.5 = 1.5 * last step size.</param>
        /// <param name="minStopError">After this error amount, solution is good enough, return.</param>
        /// <returns>The error from ComputeError for this solution.</returns>
        public double Solve_ParisGenetic(
            ComputeError_Delegate computeErrorFunction, 
            int maxGenerations, 
            int maxStepsPerGeneration, 
            double retestFactor,
            double minStopError)
        {
            // No delegate? just return.
            if (computeErrorFunction == null) return -1;

            // Make a working list so we can adjust the parameters.
            List<Parameter> working = new List<Parameter>();
            for (int i = 0; i < Parameters.Count; i++)
            {
                working.Add(new Parameter(Parameters[i]));
            }

            // Keep track of best score and best step.
            double bestScore = 0;
            Step bestStep = null;

            // For each generation.
            for (int i = 0; i < maxGenerations; i++)
            {
                // Make an initial step we will sweep through and variate 
                // recursively. 
                List<Step> steps = new List<Step>();
                Step s = new Step(Parameters.Count);                
    
                // Recursively compute the steps.
                // This is a similar algorithm to a multi-stride tree walk.
                computeSteps(steps, working, - 1, s);
                if (bestStep != null)
                {
                    steps.Add(bestStep); // retry the best so new derivatives.
                }

                // Sort the steps so that, if we need to limit the number of steps
                // we process, we only compute error terms for those closest to the
                // center (the likely solution).
                List<Step> sorted = new List<Step>();
                sorted.AddRange(steps);
                sorted = sorted.OrderBy(x => x.CenterDistance).ToList();

                // Compute errors for those we're interested, and keep track of what is best.
                bestStep = null;
                for (int j = 0; j < maxStepsPerGeneration && j < sorted.Count; j++)
                {
                    sorted[j].Score = computeErrorFunction(sorted[j].ParamValues);
                    sorted[j].Scored = true;
                    if ((bestStep == null) || (sorted[j].Score < bestScore))
                    {
                        bestStep = sorted[j];
                        bestScore = sorted[j].Score;
                    }
                }
                Steps = steps;

                // If a best exists and it is good enough break.
                if ((bestStep != null) && (bestScore < minStopError))
                {
                    break;
                }
                
                // Adjust the step size for the next generation and run it.
                bool stepsChanged = false;
                for (int j = 0; j < working.Count; j++)
                {
                    // Set new center.
                    working[j].CenterValue = bestStep.ParamValues[j];

                    // Make sure initial is > 0 and > final, and if it was previously final,
                    // no variation.
                    double oldStep = working[j].InitialStepSize;
                    double desiredStep = 0;
                    if (oldStep != 0)
                    {
                        desiredStep = (working[j].MaxValue - working[j].MinValue) / (oldStep * oldStep);
                    }
                    if (working[j].InitialStepSize != working[j].FinalStepSize)
                    {
                        working[j].InitialStepSize = desiredStep;
                        working[j].InitialStepSize = 
                            working[j].InitialStepSize < 0 ? 0 : working[j].InitialStepSize;
                        working[j].InitialStepSize = 
                            working[j].InitialStepSize < working[j].FinalStepSize ? 
                            working[j].FinalStepSize : working[j].InitialStepSize;
                    }
                    
                    stepsChanged |= (oldStep != working[j].InitialStepSize);
                    
                    // Adjust the bounds.
                    double delta = working[j].InitialStepSize * retestFactor;
                    delta = (oldStep != working[j].InitialStepSize) ? delta : 0;
                    working[j].MinValue = working[j].CenterValue - delta;
                    working[j].MaxValue = working[j].CenterValue + delta;
                }
                // If none of the steps changed then we're just retesting so break.
                if (stepsChanged == false)
                {
                    break;
                }
            }

            // If we have a best set the solution.
            if (bestStep != null)
            {
                for (int i = 0; i < bestStep.ParamValues.Length; i++)
                {
                    Parameters[i].Solution = bestStep.ParamValues[i];
                }
            }
            BestStep = bestStep;

            // Return the score.
            return bestScore;
        }
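
        /// <summary>
        /// Illustrative usage sketch: set up two parameters and run the Paris-style
        /// genetic search against the example error delegate above. The ranges,
        /// step sizes, and argument values are arbitrary assumptions chosen only
        /// for demonstration.
        /// </summary>
        private static void Example_SolveParisGenetic()
        {
            Solver solver = new Solver();

            // Parameter(name, center, max, min, initialStep, finalStep).
            solver.Parameters.Add(new Parameter("a", 0, 10, -10, 1, 0.01));
            solver.Parameters.Add(new Parameter("b", 0, 10, -10, 1, 0.01));

            double err = solver.Solve_ParisGenetic(
                ExampleLineFitError, // error delegate
                20,                  // maxGenerations
                500,                 // maxStepsPerGeneration
                1.5,                 // retestFactor
                1e-6);               // minStopError

            Console.WriteLine("a = {0}, b = {1}, error = {2}",
                solver.Parameters[0].Solution,
                solver.Parameters[1].Solution,
                err);
        }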

        /// <summary>
        /// Variate solver, Paris-style genetic algorithm.
        /// 
        /// The approach is the same as ParisGenetic, BUT multiple likely best cases 
        /// are kept in each generation.
        /// </summary>
        /// <param name="computeErrorFunction">Delegate to use.</param>
        /// <param name="maxGenerations">Number of generations</param>
        /// <param name="maxStepsPerGeneration">Maximum number of test cases per generation.</param>
        /// <param name="retestFactor">How much to expand search each generation, 1 = last step size, 1.5 = 1.5 * last step size.</param>
        /// <param name="minStopError">After this error amount, solution is good enough, return.</param>
        /// <returns>The error from ComputeError for this solution.</returns>
        public double Solve_ParisGeneticMultiple(
            ComputeError_Delegate computeErrorFunction,
            int maxGenerations,
            int maxStepsPerGeneration,
            double retestFactor,
            double minStopError)
        {
            // No delegate? just return.
            if (computeErrorFunction == null) return -1;

            // Make a working list so we can adjust the parameters.
            List<Parameter> working = new List<Parameter>();
            for (int i = 0; i < Parameters.Count; i++)
            {
                working.Add(new Parameter(Parameters[i]));
            }

            // Seed the initial generation with the parameters' center values.
            Step s = new Step(Parameters.Count);
            for (int i = 0; i < Parameters.Count; i++)
            {
                s.ParamValues[i] = Parameters[i].CenterValue;
            }
            List<Step> generation = new List<Step>();
            generation.Add(s);
            int maxGenerationSize = 10;

            // For each generation.
            for (int i = 0; i < maxGenerations; i++)
            {
                // Expand each candidate of the current generation into a set of 
                // variations, keeping the best from each.
                List<Step> nextGeneration = new List<Step>();
                for (int j = 0; j < generation.Count; j++)
                {
                    nextGeneration.AddRange(findBestGenetics(
                        computeErrorFunction, working, generation[j], maxGenerationSize,
                        maxStepsPerGeneration, retestFactor, minStopError));
                }

                nextGeneration = nextGeneration.OrderBy(x => x.Score).ToList();
                if (nextGeneration[0].Score < minStopError)
                {
                    break;
                }
                if (nextGeneration.Count > maxGenerationSize)
                {
                    nextGeneration.RemoveRange(maxGenerationSize, nextGeneration.Count - maxGenerationSize);
                }
                generation = nextGeneration;

            }

            for (int i = 0; i < generation[0].ParamValues.Length; i++)
            {
                Parameters[i].Solution = generation[0].ParamValues[i];
            }
            
            return generation[0].Score;
        }
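
        /// <summary>
        /// Illustrative usage sketch: the multiple-candidate variant shares the
        /// signature of Solve_ParisGenetic, so only the method name changes. The
        /// argument values are arbitrary assumptions used only for demonstration.
        /// </summary>
        private static double Example_SolveParisGeneticMultiple(Solver solver)
        {
            return solver.Solve_ParisGeneticMultiple(
                ExampleLineFitError, 20, 500, 1.5, 1e-6);
        }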

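        /// <summary>
        /// Expand one candidate step into a generation of variations centered on it,
        /// score the maxStepsPerGeneration variations closest to the center, and
        /// return the best maxGoodStepsPerGeneration of them sorted by score.
        /// </summary>
        /// <param name="computeErrorFunction">Delegate to use.</param>
        /// <param name="working">Working copy of the parameters.</param>
        /// <param name="s">Candidate step to expand.</param>
        /// <param name="maxGoodStepsPerGeneration">How many of the best scored steps to return.</param>
        /// <param name="maxStepsPerGeneration">Maximum number of test cases to score.</param>
        /// <param name="retestFactor">Not used by this helper.</param>
        /// <param name="minStopError">Not used by this helper.</param>
        /// <returns>The best steps of this generation, sorted by score.</returns>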
        protected List<Step> findBestGenetics(
            ComputeError_Delegate computeErrorFunction,
            List<Parameter> working, 
            Step s, 
            int maxGoodStepsPerGeneration,
            int maxStepsPerGeneration,
            double retestFactor,
            double minStopError)
        {
            List<Step> steps = new List<Step>();
            computeStepsGenetic(steps, working, -1, s);
                
            // Sort the steps so that, if we need to limit the number of steps
            // we process, we only compute error terms for those closest to the
            // center (the likely solution).
            List<Step> sorted = new List<Step>();
            sorted.AddRange(steps);
            sorted = sorted.OrderBy(x => x.CenterDistance).ToList();
            if (sorted.Count > maxStepsPerGeneration)
            {
                sorted.RemoveRange(maxStepsPerGeneration, sorted.Count - maxStepsPerGeneration);
            }
            for (int j = 0; j < sorted.Count; j++)
            {
                sorted[j].Score = computeErrorFunction(sorted[j].ParamValues);
                sorted[j].Scored = true;
            }
            sorted = sorted.OrderBy(x => x.Score).ToList();
            if (sorted.Count > maxGoodStepsPerGeneration)
            {
                sorted.RemoveRange(maxGoodStepsPerGeneration, sorted.Count - maxGoodStepsPerGeneration);
            }
            return sorted;
        }

        /// <summary>
        /// Variate solver, direct search, a.k.a. the Hooke-Jeeves algorithm.
        /// 
        /// The algorithm "walks" a parameter up and down a certain number of times, attempting
        /// to reduce the error. Typically each step divides the range of possibilities by two.
        /// When this is done aggressively, any ripples in the error vs. parameter value might
        /// get skipped (either as a benefit or not), so scaling the subdivision and over-testing
        /// the region is often a good idea.
        /// 
        /// Numerically this is faster than the genetic approaches, but it walks a single parameter
        /// at a time and thus does not always converge, or converges to a local minimum instead.
        /// </summary>
        /// <param name="computeErrorFunction">Delegate to use.</param>
        /// <param name="maxGenerations">Number of generations</param>
        /// <param name="maxStepsPerGeneration">Maximum number of test cases per generation.</param>
        /// <param name="retestFactor">How much to expand search each generation, 1 = cut step by half, generally between 0.75 and 1.5, above 2 and no effect.</param>
        /// <param name="minStopError">After this error amount, solution is good enough, return.</param>
        /// <returns>The error from ComputeError for this solution.</returns>
        public double Solve_DirectSearch(
            ComputeError_Delegate computeErrorFunction,
            int maxGenerations,
            int maxStepsPerGeneration,
            double retestFactor,
            double minStopError)
        {
            // No delegate? just return.
            if (computeErrorFunction == null) return -1;
            retestFactor = retestFactor < 0.5 ? 0.5 : retestFactor > 1.999 ? 1.999 : retestFactor;

            // Make a working list we can adjust of the parameters.
            List<Parameter> working = new List<Parameter>();
            for (int i = 0; i < Parameters.Count; i++)
            {
                working.Add(new Parameter(Parameters[i]));
            }

            // Keep track of best score and best step.
            
            Step bestStep = new Step(Parameters.Count);
            
            for (int i = 0; i < Parameters.Count; i++)
            {
                bestStep.ParamValues[i] = Parameters[i].CenterValue;
            }
            double bestScore = computeErrorFunction(bestStep.ParamValues);
            Step test = new Step(bestStep);
            retestFactor = retestFactor / 2;
            // For each generation.
            for (int i = 0; i < maxGenerations; i++)
            {
                // Select next parameter.
                for (int j = 0; j < Parameters.Count; j++)
                {
                    for (int k = 0; k < maxStepsPerGeneration; k++)
                    {
                        Step lower = new Step(test);
                        Step higher = new Step(test);
                        lower.ParamValues[j] -= working[j].InitialStepSize;
                        higher.ParamValues[j] += working[j].InitialStepSize;

                        double lowerScore = computeErrorFunction(lower.ParamValues);
                        double centerScore = computeErrorFunction(test.ParamValues);
                        double higherScore = computeErrorFunction(higher.ParamValues);

                        // If between lower and center, go to halfway between
                        if ((lowerScore < centerScore) && (lowerScore < higherScore))
                        {
                            test.ParamValues[j] -= working[j].InitialStepSize / 2;
                            working[j].InitialStepSize *= retestFactor;
                        }

                        // Between higher and center go half between.
                        if ((higherScore < centerScore) && (higherScore < lowerScore))
                        {
                            test.ParamValues[j] += working[j].InitialStepSize / 2;
                            working[j].InitialStepSize *= retestFactor;
                        }

                        // Between lower and upper, reduce step.
                        if ((centerScore < lowerScore) && (centerScore < higherScore))
                        {
                            working[j].InitialStepSize *= retestFactor;
                        }

                        if ((centerScore > lowerScore) && (centerScore > higherScore))
                        {
                            if (lowerScore < higherScore)
                            {
                                test.ParamValues[j] -= working[j].InitialStepSize / 2;
                            }
                            else
                            {
                                test.ParamValues[j] += working[j].InitialStepSize / 2;
                            }
                            working[j].InitialStepSize *= retestFactor;
                        }
                        double err = computeErrorFunction(test.ParamValues);
                        if (err < bestScore)
                        {
                            bestScore = err;
                            bestStep = new Step(test); // Copy so later changes to test don't overwrite the best.
                        }
                        if (bestScore < minStopError)
                        {
                            break;
                        }
                    }
                    if (bestScore < minStopError)
                    {
                        break;
                    }
                }
                if (bestScore < minStopError)
                {
                    break;
                }
            }

            // If we have a best set the solution.
            if (bestStep != null)
            {
                for (int i = 0; i < bestStep.ParamValues.Length; i++)
                {
                    Parameters[i].Solution = bestStep.ParamValues[i];
                }
            }
            BestStep = bestStep;

            // Return the score.
            return bestScore;
        }
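
        /// <summary>
        /// Illustrative usage sketch: the direct search takes the same arguments as
        /// the other solvers; here maxStepsPerGeneration is the number of
        /// refinements per parameter, and a retestFactor of 1 halves the step on
        /// each refinement. The argument values are arbitrary assumptions used
        /// only for demonstration.
        /// </summary>
        private static double Example_SolveDirectSearch(Solver solver)
        {
            return solver.Solve_DirectSearch(
                ExampleLineFitError, // error delegate
                20,                  // maxGenerations
                10,                  // maxStepsPerGeneration
                1.0,                 // retestFactor
                1e-6);               // minStopError
        }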


        /// <summary>
        /// Variate solver, slope relationship.
        /// 
        /// This is similar to a direct search, but it makes certain assumptions. It is far less "robust"
        /// (the initial guess needs to be relatively accurate), but it converges to an ideal solution faster
        /// than the other approaches. 
        /// 
        /// At each stage: 
        ///   1) Compute the curve that fits error vs. parameter k: err = F(k).
        ///   2) Compute the best value of k.
        ///   3) Vary parameter j (using steps 1 and 2 so that at each j value, the ideal k is used).
        ///   4) Compute the curve fit of the jth parameter (with the ideal k at each j) vs. error.
        ///   5) Compute the ideal jth parameter value.
        /// 
        /// The output is an attempt to improve the jth parameter by analyzing the way k interacts with j.
        /// The fitted curve is always quadratic, so when the solution is close to ideal, a lack of 
        /// convergence is usually due to an error surface shaped like a cone or a saddle. Other solvers then 
        /// spiral around the ideal solution but can't find it because they either under- or overshoot 
        /// parameter j, then have the same problem when refining parameter k, and so never converge. 
        /// This method varies slices through the surface to refine one of the parameters, actively 
        /// refining the jth in terms of the kth parameter, looking for a saddle point and driving toward 
        /// the minimum.
        /// 
        /// </summary>
        /// <param name="computeErrorFunction">Delegate to use.</param>
        /// <param name="maxGenerations">Number of generations</param>
        /// <param name="maxStepsPerGeneration">Maximum number of test cases per generation.</param>
        /// <param name="retestFactor">How much to expand search each generation, 1 = cut step by half, generally between 0.75 and 1.5, above 2 and no effect.</param>
        /// <param name="minStopError">After this error amount, solution is good enough, return.</param>
        /// <returns>The error from ComputeError for this solution.</returns>
        public double Solve_SlopeVariate(
            ComputeError_Delegate computeErrorFunction,
            int maxGenerations,
            int maxStepsPerGeneration,
            double retestFactor,
            double minStopError)
        {
            // No delegate? just return.
            if (computeErrorFunction == null) return -1;
            retestFactor = retestFactor < 0.5 ? 0.5 : retestFactor > 1.999 ? 1.999 : retestFactor;

            // Make a working list we can adjust of the parameters.
            List<Parameter> working = new List<Parameter>();
            for (int i = 0; i < Parameters.Count; i++)
            {
                working.Add(new Parameter(Parameters[i]));
            }

            // Keep track of best step.
            Step bestStep = new Step(Parameters.Count);
            for (int i = 0; i < Parameters.Count; i++)
            {
                bestStep.ParamValues[i] = Parameters[i].CenterValue;
            }
            bestStep.Score = computeErrorFunction(bestStep.ParamValues);
            bestStep.Scored = true;

            // Iterate through every pair of parameters maxGenerations number of times.
            for (int i = 0; i < maxGenerations; i++)
            {
                // Compute a nextBest.
                Step nextBest = new Step(bestStep);
                for (int j = 0; j < Parameters.Count; j++)
                {
                    for (int k = 0; k < Parameters.Count; k++)
                    {
                        if (j == k) continue;

                        // Test for j, k
                        Step t = new Step(nextBest);

                        // Vary j with k, then k with j.
                        // By doing it twice, and stopping whenever a curve doesn't fit,
                        // two things occur:
                        //   1) Instead of just refining j, and perhaps making the next
                        //      fit hard or impossible, both j and k are addressed.
                        //   2) We only consider it worth continuing if there is stability 
                        //      of j on k and k on j. This ensures we didn't "improve" j
                        //      when really the curve fit was bad and the relationship 
                        //      between j and k wasn't correct.
                        for (int m = 0; m < maxStepsPerGeneration; m++)
                        {
                            // Improve jth of t, if no fit, don't improve t.
                            t = vary_jk(computeErrorFunction, t, working, j, k);
                            if (t == null) break;

                            // Improve kth of t, 
                            t = vary_jk(computeErrorFunction, t, working, k, j);
                            if (t == null) break;
                        }
                        if (t == null) continue;
                        nextBest = t;
                        nextBest.Score = computeErrorFunction(nextBest.ParamValues);
                        nextBest.Scored = true;
                        
                        // If this variation is good enough, stop.
                        if (nextBest.Score < minStopError)
                        {
                            break;
                        }
                    }
                    // If this variation is good enough, stop.
                    if (nextBest.Score < minStopError)
                    {
                        break;
                    }
                }
                bestStep = nextBest;
                // If this variation is good enough, stop.
                if (bestStep.Score < minStopError)
                {
                    break;
                }
            }

            BestStep = bestStep;
            BestStep.Scored = true;
            if (bestStep != null)
            {
                for (int i = 0; i < bestStep.ParamValues.Length; i++)
                {
                    Parameters[i].Solution = bestStep.ParamValues[i];
                }
            }
            // Return the score.
            return BestStep.Score;
        }
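
        /// <summary>
        /// Illustrative usage sketch: the slope-variate refinement assumes the
        /// parameter ranges already bracket a reasonably good guess, so it is
        /// typically run after one of the coarser solvers. The argument values
        /// are arbitrary assumptions used only for demonstration.
        /// </summary>
        private static double Example_SolveSlopeVariate(Solver solver)
        {
            return solver.Solve_SlopeVariate(
                ExampleLineFitError, // error delegate
                5,                   // maxGenerations
                3,                   // maxStepsPerGeneration (j/k alternations)
                1.0,                 // retestFactor
                1e-6);               // minStopError
        }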



        /// <summary>
        /// Vary j and k, finding the ideal kth parameter for each jth value tried, and then the ideal jth
        /// by a minimal curve fit. This could be done recursively to compute an ideal solution.
        /// The problem is that, with measurement noise, the likely stability of a general solution 
        /// drops as the number of axes is increased. What is presented here is an iterative
        /// approach based on experiments.
        /// </summary>
        /// <param name="computeErrorFunction"></param>
        /// <param name="s"></param>
        /// <param name="working"></param>
        /// <param name="j"></param>
        /// <param name="k"></param>
        /// <returns></returns>
        protected Step vary_jk(ComputeError_Delegate computeErrorFunction, Step s, List<Parameter> working, int j, int k)
        {
            int fp = 7;  // fit points.
            double fp1 = fp - 1;
            double[] jx = new double[fp];
            double[] jy = new double[fp];
            double[] kx = new double[fp];
            double[] ky = new double[fp];
            Step vary = new Step(s);
            LinearLeastSquares lls = new LinearLeastSquares();
            for (int ji = 0; ji < fp; ji++)
            {
                double fj = (fp1 - ((double)ji)) / fp1;
                double xfj = (working[j].MinValue * fj) + ((1 - fj) * working[j].MaxValue);
                jx[ji] = xfj;
                vary.ParamValues[j] = xfj;

                // Walk through kth at this j.
                for (int ki = 0; ki < fp; ki++)
                {
                    double fk = (fp1 - ((double)ki)) / fp1;
                    double xfk = (working[k].MinValue * fk) + ((1 - fk) * working[k].MaxValue);
                    vary.ParamValues[k] = xfk;
                    kx[ki] = xfk;
                    ky[ki] = computeErrorFunction(vary.ParamValues);
                }

                // Now solve directly for what the zero point should be, given this j, varying k.
                lls.Solve_SecondOrder(kx, ky);

                // zy = a0 + a1*zx + a2*zx*zx 
                // Slope is zero, is either a max or min, 2nd derivative 
                // tells us if it's max or min.
                // zy' = a1 + 2*a2*zx = 0 -> a1 = -2*a2 *zx -> zx = -a1 / 2*a2
                if (lls.Solution[2] <= 0) return null;
                double idealk = -0.5 * lls.Solution[1] / lls.Solution[2];
                // idealk is the ideal value for the kth coefficient at this jth coefficient value.
                vary.ParamValues[k] = idealk;

                // This is now the smallest theoretical error when kth coefficient is perfectly 
                // adjusted for jth coefficient.
                jy[ji] = computeErrorFunction(vary.ParamValues);
            }
            // Now the error equation of params j, k is jx, jy.
            // jx is walked through producing minimum errors for ideal k.
            lls.Solve_SecondOrder(jx, jy);

            // The minimum is at, slope = 0 if 2nd param is > 0.
            if (lls.Solution[2] <= 0) return null;
            Step t = new Step(s);
            double idealj = -0.5 * lls.Solution[1] / lls.Solution[2];
            if (idealj < working[j].MinValue) return null;
            if (idealj > working[j].MaxValue) return null;
            t.ParamValues[j] = idealj;
            return t;
        }
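
        /// <summary>
        /// Illustrative helper sketch: the minimum used in vary_jk comes from the
        /// fitted parabola y = a0 + a1*x + a2*x*x, whose slope is zero at
        /// x = -a1 / (2*a2); that point is a minimum only when a2 > 0.
        /// </summary>
        private static double ExampleParabolaMinimum(double a1, double a2)
        {
            if (a2 <= 0) throw new ArgumentException("Not a minimum unless a2 > 0.");
            return -0.5 * a1 / a2;
        }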


        /// <summary>
        /// Compute the steps recursively starting at parameter p (-1 to init) based off
        /// of an initial step s (generated by this function when p = -1, then used recursively)
        /// based on the working parameters.
        /// </summary>
        /// <param name="steps"></param>
        /// <param name="working"></param>
        /// <param name="p"></param>
        /// <param name="s"></param>
        protected void computeSteps(List<Step> steps, List<Parameter> working, int p, Step s)
        {
            // Done. Also acts as a "depth charge" for the recursion.
            if (p >= Parameters.Count) return;

            if (p == -1)
            {
                for (int i = 0; i < working.Count; i++)
                {
                    s.ParamValues[i] = working[i].MinValue;
                }
                double sum = 0;
                for (int i = 0; i < working.Count; i++)
                {
                    double px = s.ParamValues[i];
                    double dx = px - working[i].CenterValue;
                    sum += dx * dx;
                }
                // This is an initial center distance, since we are sorting min to max 
                // not computing or relying on a real "distance" from the center of the 
                // solution space there is no need to compute the square root.
                s.CenterDistance = sum;
                steps.Add(s);
                p++;
            }
        
            // This call variates index p.
            double cx = working[p].CenterValue;
            Step s2 = null;
            for (double x = working[p].MinValue; x <= working[p].MaxValue; x += working[p].InitialStepSize)
            {
                s2 = new Step(s);
                s2.ParamValues[p] = x;
                double ox = s.ParamValues[p];

                // Because no square root was done, we can subtract the old parameter 
                // value's contribution from the sum and add in the new value's contribution.
                s2.CenterDistance = s.CenterDistance + ((cx - x) * (cx - x)) - ((ox - cx) * (ox - cx));
                steps.Add(s2);

                // Recurse.
                computeSteps(steps, working, p + 1, s2);
            }
        }
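
        /// <summary>
        /// Illustrative sketch of the incremental CenterDistance update used in
        /// computeSteps: for a parent step with squared distance parentDistance
        /// whose p-th value moves from ox to x around the center cx, the child's
        /// squared distance is parentDistance - (ox - cx)^2 + (x - cx)^2, which
        /// equals recomputing the full sum of squared offsets from scratch.
        /// </summary>
        private static double ExampleIncrementalCenterDistance(
            double parentDistance, double ox, double x, double cx)
        {
            return parentDistance - ((ox - cx) * (ox - cx)) + ((x - cx) * (x - cx));
        }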


        /// <summary>
        /// Compute the steps recursively starting at parameter p (-1 to init) based off
        /// of an initial step s. Unlike computeSteps, each parameter is swept over its full
        /// range width centered on the step's current value rather than between the
        /// absolute min/max bounds of the working parameters.
        /// </summary>
        /// <param name="steps"></param>
        /// <param name="working"></param>
        /// <param name="p"></param>
        /// <param name="s"></param>
        protected void computeStepsGenetic(List<Step> steps, List<Parameter> working, int p, Step s)
        {
            // s is the center.
            // Done. Also acts as a "depth charge" for the recursion.
            if (p >= Parameters.Count) return;

            if (p == -1)
            {
                // This is an initial center distance, since we are sorting min to max 
                // not computing or relying on a real "distance" from the center of the 
                // solution space there is no need to compute the square root.
                s.CenterDistance = 0;
                steps.Add(s);
                p++;
            }

            // This call variates index p.
            double cx = s.ParamValues[p];
            Step s2 = null;
            double delta2 = 0.5 * (working[p].MaxValue - working[p].MinValue);
            for (double x = s.ParamValues[p] - delta2; x <= s.ParamValues[p] + delta2; x += working[p].InitialStepSize)
            {
                s2 = new Step(s);
                s2.ParamValues[p] = x;

                // Because no square root was done, we can simply add this parameter's 
                // squared offset from its current value to the running sum.
                s2.CenterDistance = s.CenterDistance + ((cx - x) * (cx - x));
                steps.Add(s2);

                // Recurse.
                computeStepsGenetic(steps, working, p + 1, s2);
            }
        }

        /// <summary>
        /// Parameter class.
        /// </summary>
        public class Parameter
        {
            /// <summary>
            /// Copy constructor.
            /// </summary>
            /// <param name="p"></param>
            public Parameter(Parameter p)
            {
                Name = p.Name;
                CenterValue = p.CenterValue;
                MaxValue = p.MaxValue;
                MinValue = p.MinValue;
                InitialStepSize = p.InitialStepSize;
                FinalStepSize = p.FinalStepSize;
                Solution = p.Solution;
            }

            /// <summary>
            /// Default constructor.
            /// </summary>
            public Parameter()
            {
                Name = "";
                CenterValue = 0;
                MaxValue = 0;
                MinValue = 0;
                InitialStepSize = 0;
                FinalStepSize = 0;
                Solution = 0;
            }

            /// <summary>
            /// Constructor.
            /// </summary>
            /// <param name="name"></param>
            /// <param name="centerValue"></param>
            /// <param name="maxValue"></param>
            /// <param name="minValue"></param>
            /// <param name="initialStepSize"></param>
            /// <param name="finalStepSize"></param>
            /// <param name="stepStepSize"></param>
            public Parameter(
                string name,
                double centerValue, 
                double maxValue, 
                double minValue, 
                double initialStepSize, 
                double finalStepSize)
            {
                Name = name;
                CenterValue = centerValue;
                MaxValue = maxValue;
                MinValue = minValue;
                InitialStepSize = initialStepSize;
                FinalStepSize = finalStepSize;
                Solution = 0;
            }

            /// <summary>
            /// Name.
            /// </summary>
            public string Name { get; set; }

            /// <summary>
            /// Center value (likely initial guess).
            /// </summary>
            public double CenterValue { get; set; }

            /// <summary>
            /// Max value.
            /// </summary>
            public double MaxValue { get; set; }

            /// <summary>
            /// Minimum value.
            /// </summary>
            public double MinValue { get; set; }

            /// <summary>
            /// Initial step size.
            /// </summary>
            public double InitialStepSize { get; set; }

            /// <summary>
            /// Final step size.
            /// </summary>
            public double FinalStepSize { get; set; }

            /// <summary>
            /// Final solution.
            /// </summary>
            public double Solution { get; set; }
        }

        /// <summary>
        /// A step.
        /// </summary>
        public class Step
        {
            /// <summary>
            /// Constructor.
            /// </summary>
            public Step()
            {
            }

            /// <summary>
            /// Constructor with size.
            /// </summary>
            /// <param name="size"></param>
            public Step(int size)
            {
                ParamValues = new double[size];
            }

            /// <summary>
            /// Copy constructor.
            /// </summary>
            /// <param name="s"></param>
            public Step(Step s)
            {
                ParamValues = new double[s.ParamValues.Length];
                for (int i = 0; i < s.ParamValues.Length; i++)
                {
                    ParamValues[i] = s.ParamValues[i];
                }
                Score = s.Score;
            }

            /// <summary>
            /// Scored flag.
            /// </summary>
            public bool Scored = false;

            /// <summary>
            /// Center distance.
            /// </summary>
            public double CenterDistance = 0;

            /// <summary>
            /// Parameter values.
            /// </summary>
            public double[] ParamValues;

            /// <summary>
            /// Score.
            /// </summary>
            public double Score;
        }
    }
}


License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

