James Kennedy is a social psychologist with the Dept. of Labor's Bureau of Labor Statistics. He's speaking at today's BYU CS colloquium on "The Essential Particle Swarm." He was introduced as the inventor of particle swarming algorithms. He muses whether he's the inventor or the discoverer of the algorithm, given that this is a process inherent in many places in nature. The term discoverer might be more apt.

James started his work doing computer simulations of how individuals interact in a social context. Social dynamics are adaptive. Societies adapt to their environment, not just the physical environment, but their social environment as well.

The particle swarm is a kind of program that runs simulations of simple individuals interacting with one another according to a simple set of rules. Despite its simplicity, the particle swarm algorithm can be used in many different applications.

Particle swarms are used to find the global optimum of multi-variable functions. Each variable is called a dimension. An element of randomness causes particles to explore the function space in search of the optimal value.
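
As a concrete illustration (my own, not an example James gave), the function being optimized could be something as simple as the sphere function, whose minimum is 0 at the origin; each coordinate of a particle's position is one dimension:

import random

# Hypothetical objective to minimize: the d-dimensional sphere function.
def sphere(position):
    return sum(x * x for x in position)

DIMENSIONS = 3

# A random starting point somewhere in the search space.
position = [random.uniform(-10, 10) for _ in range(DIMENSIONS)]
print(sphere(position))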

A particle has a position, xid, a previous best position, pid, a velocity, vid, and a previous best function result, pbesti. A particle's neighbors have these same properties. James uses 20 particles in his simulations. There are two popular population topologies: Gbest, a fully connected topology, and Lbest, where each particle is connected only to its immediate neighbors (two connections each). Since particles learn from each other, the topology can make a big difference.
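
A rough sketch of what a particle and the two topologies might look like in code (the field names and the ring layout for Lbest are my own illustration, not something James showed):

import random

class Particle:
    def __init__(self, dims):
        self.x = [random.uniform(-10, 10) for _ in range(dims)]  # position (xid)
        self.v = [0.0] * dims                                    # velocity (vid)
        self.p = list(self.x)                                    # previous best position (pid)
        self.pbest = float("inf")                                # previous best function result

swarm = [Particle(3) for _ in range(20)]   # James uses 20 particles

n = len(swarm)

# Gbest: every particle is connected to every other particle.
gbest_neighbors = {i: [j for j in range(n) if j != i] for i in range(n)}

# Lbest: a ring, each particle connected only to its two immediate neighbors.
lbest_neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}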

The algorithm for any one particle in a particular dimension can be summarized as follows:

 
New position = current position + 
               persistence + 
               social influence 

Persistence is a factor that keeps particles moving in the same direction (inertia might be a better term). Social influence lets particles learn from their neighbors.
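
One common way to write this update in code is the sketch below, which uses an inertia weight for persistence plus pulls toward the particle's own best and the best neighbor's best; the constants and the extra cognitive term are standard choices I'm assuming, not values James quoted:

import random

W = 0.7    # "persistence" / inertia weight (assumed value)
C1 = 1.5   # pull toward the particle's own previous best (cognitive term, assumed)
C2 = 1.5   # pull toward the best neighbor's best (social influence, assumed)

def update_velocity(particle, neighborhood_best):
    for d in range(len(particle.x)):
        persistence = W * particle.v[d]
        cognitive = C1 * random.random() * (particle.p[d] - particle.x[d])
        social = C2 * random.random() * (neighborhood_best[d] - particle.x[d])
        particle.v[d] = persistence + cognitive + social

def update_position(particle):
    for d in range(len(particle.x)):
        particle.x[d] += particle.v[d]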

In the algorithm, each particle explores along each dimension with a velocity calculated from previous values and an element of randomness. The particles also use information from their neighbors, allowing them to converge on a solution. When a neighbor finds a better point, particles may start exploring a new space. This lets particle swarms shift automatically from exploiting a promising area to exploring a new one.
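
Putting the earlier sketches together, a full (simplified) swarm run might look like this, assuming the sphere function, Particle class, Lbest topology, and update functions from the snippets above:

def best_neighbor_position(i, swarm, neighbors):
    # Best previous position found by particle i or any of its neighbors.
    candidates = [i] + neighbors[i]
    best = min(candidates, key=lambda j: swarm[j].pbest)
    return swarm[best].p

for step in range(1000):
    # Evaluate each particle and remember its own best point so far.
    for particle in swarm:
        value = sphere(particle.x)
        if value < particle.pbest:
            particle.pbest = value
            particle.p = list(particle.x)
    # Move each particle using persistence plus social influence.
    for i, particle in enumerate(swarm):
        target = best_neighbor_position(i, swarm, lbest_neighbors)
        update_velocity(particle, target)
        update_position(particle)

print(min(p.pbest for p in swarm))   # best value found by the swarm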



Last modified: Thu Oct 10 12:47:18 2019.