Aarav Joshi

Unlock Nature's Genius: Evolutionary Algorithms in JavaScript for Powerful Problem-Solving

Evolutionary algorithms are a fascinating way to solve complex optimization problems in JavaScript. I've been exploring these nature-inspired techniques and want to share what I've learned about implementing them effectively.

At their core, evolutionary algorithms mimic the process of natural selection to iteratively improve potential solutions. We start with a population of candidate solutions, evaluate their fitness, select the best ones, and create new solutions through processes like crossover and mutation. Over many generations, the population evolves towards better and better solutions.

Let's look at how to implement a basic genetic algorithm in JavaScript:

class GeneticAlgorithm {
  constructor(populationSize, geneLength, fitnessFunction) {
    this.populationSize = populationSize;
    this.geneLength = geneLength;
    this.fitnessFunction = fitnessFunction;
    this.population = this.initializePopulation();
  }

  initializePopulation() {
    return Array.from({ length: this.populationSize }, () => 
      Array.from({ length: this.geneLength }, () => Math.random() < 0.5 ? 0 : 1)
    );
  }

  evolve(generations) {
    for (let i = 0; i < generations; i++) {
      const fitnesses = this.population.map(this.fitnessFunction);
      const newPopulation = [];

      while (newPopulation.length < this.populationSize) {
        const parent1 = this.selectParent(fitnesses);
        const parent2 = this.selectParent(fitnesses);
        const [child1, child2] = this.crossover(parent1, parent2);
        newPopulation.push(this.mutate(child1), this.mutate(child2));
      }

      // Trim in case an odd population size produced one extra child
      this.population = newPopulation.slice(0, this.populationSize);
    }

    return this.getBestSolution();
  }

  selectParent(fitnesses) {
    // Roulette-wheel selection. Shift fitnesses so the minimum is zero,
    // which keeps the selection valid when the fitness function returns negative values.
    const minFitness = Math.min(...fitnesses);
    const shifted = fitnesses.map(f => f - minFitness + 1e-9); // small offset avoids a zero total
    const totalFitness = shifted.reduce((sum, fitness) => sum + fitness, 0);
    let random = Math.random() * totalFitness;
    for (let i = 0; i < this.populationSize; i++) {
      random -= shifted[i];
      if (random <= 0) return this.population[i];
    }
    // Fallback guards against floating-point drift leaving random slightly above zero
    return this.population[this.populationSize - 1];
  }

  crossover(parent1, parent2) {
    const crossoverPoint = Math.floor(Math.random() * this.geneLength);
    const child1 = [...parent1.slice(0, crossoverPoint), ...parent2.slice(crossoverPoint)];
    const child2 = [...parent2.slice(0, crossoverPoint), ...parent1.slice(crossoverPoint)];
    return [child1, child2];
  }

  mutate(individual) {
    // Flip each bit with a 1% probability (the mutation rate)
    return individual.map(gene => Math.random() < 0.01 ? 1 - gene : gene);
  }

  getBestSolution() {
    const fitnesses = this.population.map(this.fitnessFunction);
    const bestIndex = fitnesses.indexOf(Math.max(...fitnesses));
    return this.population[bestIndex];
  }
}

This implementation covers the key components of a genetic algorithm: population initialization, parent selection, crossover, mutation, and generational evolution. We can use it to solve various optimization problems by defining an appropriate fitness function.

For example, let's use this genetic algorithm to find the maximum of a simple function:

const fitnessFunction = individual => {
  const x = parseInt(individual.join(''), 2);
  return -(x * x) + 5 * x;
};

const ga = new GeneticAlgorithm(100, 10, fitnessFunction);
const bestSolution = ga.evolve(100);

console.log('Best solution:', parseInt(bestSolution.join(''), 2));

Because each individual is a 10-bit binary string, it encodes an integer in the range [0, 1023]. The algorithm searches for the maximum of -x^2 + 5x over that range, which for integers is x = 2 or x = 3 (both scoring 6).

While genetic algorithms are versatile, other evolutionary techniques can be a better fit for certain problems. Evolution strategies, for instance, work well for real-valued optimization problems.

Here's a basic implementation of an evolutionary strategy:

class EvolutionStrategy {
  constructor(populationSize, dimensions, fitnessFunction) {
    this.populationSize = populationSize;
    this.dimensions = dimensions;
    this.fitnessFunction = fitnessFunction;
    this.population = this.initializePopulation();
  }

  initializePopulation() {
    return Array.from({ length: this.populationSize }, () => 
      Array.from({ length: this.dimensions }, () => Math.random())
    );
  }

  evolve(generations) {
    for (let i = 0; i < generations; i++) {
      // Each parent produces one mutated offspring (λ = μ)
      const offspring = this.population.map(parent => this.mutate(parent));
      const allIndividuals = [...this.population, ...offspring];
      const fitnesses = allIndividuals.map(this.fitnessFunction);
      this.population = this.select(allIndividuals, fitnesses);
    }
    return this.getBestSolution();
  }

  mutate(individual) {
    const sigma = 0.1;
    return individual.map(gene => gene + sigma * this.randomGaussian());
  }

  randomGaussian() {
    // Box-Muller transform: turns two uniform samples into one standard normal sample
    let u = 0, v = 0;
    while (u === 0) u = Math.random();
    while (v === 0) v = Math.random();
    return Math.sqrt(-2.0 * Math.log(u)) * Math.cos(2.0 * Math.PI * v);
  }

  select(individuals, fitnesses) {
    const sorted = individuals.map((ind, i) => [ind, fitnesses[i]])
      .sort((a, b) => b[1] - a[1]);
    return sorted.slice(0, this.populationSize).map(([ind]) => ind);
  }

  getBestSolution() {
    const fitnesses = this.population.map(this.fitnessFunction);
    const bestIndex = fitnesses.indexOf(Math.max(...fitnesses));
    return this.population[bestIndex];
  }
}

This implementation uses a (μ + λ) selection strategy: the μ parents each produce one mutated offspring (so λ = μ here), and the best μ individuals from the combined pool of parents and offspring form the next generation. A variation with more offspring per parent is sketched below.
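If you want stronger selection pressure, each parent can produce several offspring. A minimal sketch of a λ = 2μ variation of the offspring step inside evolve (the rest of the class is unchanged):

      // λ = 2μ: every parent contributes two independently mutated offspring
      const offspring = this.population.flatMap(parent => [
        this.mutate(parent),
        this.mutate(parent)
      ]);

The (μ + λ) survivor selection then still keeps only the best populationSize individuals.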

We can use this to optimize real-valued functions:

const fitnessFunction = individual => {
  const [x, y] = individual;
  return -(x * x + y * y) + 5 * (x + y);
};

const es = new EvolutionStrategy(50, 2, fitnessFunction);
const bestSolution = es.evolve(100);

console.log('Best solution:', bestSolution);

This finds the maximum of the function -(x^2 + y^2) + 5(x + y), which lies at (2.5, 2.5) with a value of 12.5.

Another popular evolutionary technique is Particle Swarm Optimization (PSO). It's inspired by the social behavior of bird flocking or fish schooling. Here's a basic implementation:

class ParticleSwarmOptimization {
  constructor(populationSize, dimensions, fitnessFunction) {
    this.populationSize = populationSize;
    this.dimensions = dimensions;
    this.fitnessFunction = fitnessFunction;
    this.particles = this.initializeParticles();
    this.globalBest = null;
  }

  initializeParticles() {
    return Array.from({ length: this.populationSize }, () => ({
      position: Array.from({ length: this.dimensions }, () => Math.random()),
      velocity: Array.from({ length: this.dimensions }, () => Math.random() - 0.5),
      personalBest: null,
      personalBestFitness: -Infinity
    }));
  }

  evolve(iterations) {
    for (let i = 0; i < iterations; i++) {
      for (const particle of this.particles) {
        const fitness = this.fitnessFunction(particle.position);

        if (fitness > particle.personalBestFitness) {
          particle.personalBest = [...particle.position];
          particle.personalBestFitness = fitness;
        }

        if (fitness > (this.globalBest?.fitness ?? -Infinity)) {
          this.globalBest = { position: [...particle.position], fitness };
        }

        this.updateParticle(particle);
      }
    }

    return this.globalBest.position;
  }

  updateParticle(particle) {
    const w = 0.7;  // inertia weight
    const c1 = 1.5; // cognitive parameter
    const c2 = 1.5; // social parameter

    for (let i = 0; i < this.dimensions; i++) {
      const r1 = Math.random();
      const r2 = Math.random();

      particle.velocity[i] = w * particle.velocity[i] +
        c1 * r1 * (particle.personalBest[i] - particle.position[i]) +
        c2 * r2 * (this.globalBest.position[i] - particle.position[i]);

      particle.position[i] += particle.velocity[i];
    }
  }
}

PSO is particularly effective for continuous optimization problems and can often converge faster than genetic algorithms on them.

Here's how we can use it:

const fitnessFunction = position => {
  const [x, y] = position;
  return -(x * x + y * y) + 5 * (x + y);
};

const pso = new ParticleSwarmOptimization(50, 2, fitnessFunction);
const bestSolution = pso.evolve(100);

console.log('Best solution:', bestSolution);

When implementing these algorithms, it's crucial to design effective fitness functions. The fitness function guides the evolution process, so it needs to accurately represent the problem you're trying to solve. For complex problems, you might need to combine multiple objectives or add penalties for constraint violations.
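As a rough sketch of that idea (objectiveA, objectiveB, and constraintViolation are hypothetical placeholders for your own functions), a combined fitness might look like this:

const combinedFitness = individual => {
  const wA = 0.7, wB = 0.3;   // relative importance of the two objectives (placeholders)
  const penaltyWeight = 100;  // large enough to dominate when a constraint is broken
  return wA * objectiveA(individual) +
         wB * objectiveB(individual) -
         penaltyWeight * constraintViolation(individual);
};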

Let's look at a more practical example: optimizing a trading strategy. We'll use a genetic algorithm to find the best parameters for a simple moving average crossover strategy:

class TradingStrategy {
  constructor(shortPeriod, longPeriod, data) {
    this.shortPeriod = shortPeriod;
    this.longPeriod = longPeriod;
    this.data = data;
  }

  backtest() {
    let position = 0;
    let balance = 1000;
    const shortMA = this.movingAverage(this.shortPeriod);
    const longMA = this.movingAverage(this.longPeriod);

    for (let i = this.longPeriod; i < this.data.length; i++) {
      if (shortMA[i] > longMA[i] && position <= 0) {
        position = balance / this.data[i];
        balance = 0;
      } else if (shortMA[i] < longMA[i] && position > 0) {
        balance = position * this.data[i];
        position = 0;
      }
    }

    return balance + position * this.data[this.data.length - 1];
  }

  movingAverage(period) {
    const ma = [];
    for (let i = 0; i < this.data.length; i++) {
      if (i < period) {
        ma.push(null);
      } else {
        const sum = this.data.slice(i - period, i).reduce((a, b) => a + b, 0);
        ma.push(sum / period);
      }
    }
    return ma;
  }
}

const historicalData = [/* ... */]; // Your historical price data here

const decodePeriod = bits => parseInt(bits.join(''), 2) + 1; // 7 bits give a period in [1, 128]

const fitnessFunction = individual => {
  // Decode a 14-bit chromosome: the first 7 bits are the short period, the last 7 the long period
  const shortPeriod = decodePeriod(individual.slice(0, 7));
  const longPeriod = decodePeriod(individual.slice(7));
  const strategy = new TradingStrategy(shortPeriod, longPeriod, historicalData);
  return strategy.backtest();
};

const ga = new GeneticAlgorithm(100, 14, fitnessFunction);
const bestSolution = ga.evolve(100);

console.log('Best MA periods:', decodePeriod(bestSolution.slice(0, 7)), decodePeriod(bestSolution.slice(7)));

This example optimizes the short and long periods for a moving average crossover strategy to maximize the final portfolio value.
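Nothing in this fitness function forces the short period to stay below the long one. One way to handle that, in line with the penalty idea above, is a sketch like this:

const constrainedFitness = individual => {
  const shortPeriod = decodePeriod(individual.slice(0, 7));
  const longPeriod = decodePeriod(individual.slice(7));
  // Heavily penalize chromosomes where the "short" period is not actually shorter
  if (shortPeriod >= longPeriod) return -1e6;
  return new TradingStrategy(shortPeriod, longPeriod, historicalData).backtest();
};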

When working with computationally intensive fitness functions like this, performance can become an issue. One way to improve performance is to parallelize the fitness evaluations using Web Workers. Here's how we can modify our genetic algorithm to use Web Workers:

class ParallelGeneticAlgorithm {
  constructor(populationSize, geneLength, workerScript) {
    this.populationSize = populationSize;
    this.geneLength = geneLength;
    this.workerScript = workerScript;
    this.population = this.initializePopulation();
  }

  initializePopulation() {
    // Same binary encoding as the sequential version, so the inherited operators still apply
    return Array.from({ length: this.populationSize }, () => 
      Array.from({ length: this.geneLength }, () => Math.random() < 0.5 ? 0 : 1)
    );
  }

  async evolve(generations) {
    // One Web Worker per available core (fall back to 4 if hardwareConcurrency is unavailable)
    const workers = Array.from({ length: navigator.hardwareConcurrency || 4 }, () => 
      new Worker(this.workerScript)
    );

    for (let i = 0; i < generations; i++) {
      const fitnesses = await this.evaluatePopulation(workers);
      const newPopulation = [];

      while (newPopulation.length < this.populationSize) {
        const parent1 = this.selectParent(fitnesses);
        const parent2 = this.selectParent(fitnesses);
        const [child1, child2] = this.crossover(parent1, parent2);
        newPopulation.push(this.mutate(child1), this.mutate(child2));
      }

      this.population = newPopulation.slice(0, this.populationSize);
    }

    // The final population has not been scored yet, so evaluate it one last time before picking a winner
    const finalFitnesses = await this.evaluatePopulation(workers);
    workers.forEach(worker => worker.terminate());
    const bestIndex = finalFitnesses.indexOf(Math.max(...finalFitnesses));
    return this.population[bestIndex];
  }

  async evaluatePopulation(workers) {
    const chunkSize = Math.ceil(this.populationSize / workers.length);
    const promises = workers.map((worker, i) => {
      const start = i * chunkSize;
      const end = Math.min((i + 1) * chunkSize, this.populationSize);
      const chunk = this.population.slice(start, end);
      return new Promise(resolve => {
        worker.onmessage = e => resolve(e.data);
        worker.postMessage(chunk);
      });
    });
    const results = await Promise.all(promises);
    return results.flat();
  }

  // selectParent, crossover and mutate are identical to GeneticAlgorithm;
  // getBestSolution is not needed because evolve scores the final population itself
}

The worker script (fitnessWorker.js) would look like this:

self.onmessage = function(e) {
  const individuals = e.data;
  const fitnesses = individuals.map(fitnessFunction);
  self.postMessage(fitnesses);
};

function fitnessFunction(individual) {
  // Your fitness function implementation here
}

This parallel implementation can significantly speed up the evolution process for computationally intensive fitness functions.
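Usage mirrors the sequential version, except that evolve is asynchronous because fitness evaluation happens in the workers. A minimal sketch, assuming fitnessWorker.js implements the 14-bit trading fitness from the previous example:

const pga = new ParallelGeneticAlgorithm(100, 14, 'fitnessWorker.js');
pga.evolve(100).then(bestSolution => {
  console.log('Best solution:', bestSolution);
});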

Evolutionary algorithms are powerful tools for solving complex optimization problems, but they're not always the best choice. They work well for problems with large, complex search spaces where traditional optimization methods struggle. However, for simpler problems or those with well-defined mathematical properties, other optimization techniques might be more efficient.

When using evolutionary algorithms, it's important to carefully tune parameters like population size, mutation rate, and the number of generations. These can have a big impact on the algorithm's performance and the quality of the solutions it finds. It's often worth experimenting with different settings to find what works best for your specific problem.

Remember that evolutionary algorithms are stochastic – they use randomness, so you'll get different results each time you run them. It's a good idea to run your algorithm multiple times and look at the distribution of results, rather than relying on a single run.
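A minimal sketch of that, reusing the GeneticAlgorithm class and the fitnessFunction from the first example:

const runs = 10;
const results = [];
for (let i = 0; i < runs; i++) {
  const ga = new GeneticAlgorithm(100, 10, fitnessFunction);
  const best = ga.evolve(100);
  results.push(fitnessFunction(best));
}
// Compare the spread across runs instead of trusting any single one
console.log('Best fitness per run:', results);
console.log('Mean best fitness:', results.reduce((a, b) => a + b, 0) / runs);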

Evolutionary algorithms can be applied to a wide range of problems beyond what we've covered here. They're used in fields as diverse as engineering design, financial modeling, and artificial intelligence. As you explore further, you'll discover many interesting variations and applications of these nature-inspired optimization techniques.

In conclusion, evolutionary algorithms offer a powerful and flexible approach to solving complex optimization problems in JavaScript. By understanding the principles behind these algorithms and how to implement them effectively, you can tackle challenging tasks that might be difficult or impossible to solve with traditional methods. Whether you're optimizing parameters, designing systems, or solving complex scheduling problems, evolutionary algorithms provide a valuable tool in your development toolkit.


Our Creations

Be sure to check out our creations:

Investor Central | Smart Living | Epochs & Echoes | Puzzling Mysteries | Hindutva | Elite Dev | JS Schools


We are on Medium

Tech Koala Insights | Epochs & Echoes World | Investor Central Medium | Puzzling Mysteries Medium | Science & Epochs Medium | Modern Hindutva
