Particle Filtering

Background and Terminology

Particle filtering is a technique for estimating the position of a system in its state space. A common usage example is tracking the position of a mobile robot as it moves in the world. The process works by sampling the potential state space with a number of "particles", each of which represents a hypothesis regarding the current position.

One of the strengths of this technique is its ability to track a multi-modal distribution, in which there may be multiple symmetric hypotheses. The particle filter can maintain these distinct populations until some non-symmetrical information differentiates the true position.

However, keep in mind that this is a statistical technique. Its accuracy is limited by the number of particles in use, and it depends on correctly estimating the accuracy/noise of the models. Also note that each particle represents not only a hypothesis regarding the current position, but, by its very existence, some history regarding the path taken. This is why we do not use a static sampling over the entire space -- for a given sensor measurement there may be many locations in the state space for which that measurement is feasible, but we only want to consider those locations which have a history of being feasible.
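
To make the cycle concrete, here is a self-contained sketch of one predict/weight/resample iteration. None of the names below are Tekkotsu API; the noise levels, the range-to-landmark sensor model, and the helper function itself are illustrative assumptions.

#include <cmath>
#include <random>
#include <vector>

// Illustrative sketch only -- not Tekkotsu API.
struct Particle { float x, y, weight; };

void particleFilterStep(std::vector<Particle>& ps, std::mt19937& rng,
                        float dx, float dy,   // odometry since the last cycle (mm)
                        float measured,       // sensed distance to a landmark (mm)
                        float lmx, float lmy) // the landmark's known map position
{
    std::normal_distribution<float> noise(0.f, 5.f); // assumed motion-model noise
    for (Particle& p : ps) {
        // 1. Motion model: move each hypothesis, adding noise so the cloud spreads
        p.x += dx + noise(rng);
        p.y += dy + noise(rng);
        // 2. Sensor model: score how well this hypothesis explains the measurement
        const float expected = std::hypot(lmx - p.x, lmy - p.y);
        const float err = measured - expected;
        p.weight = std::exp(-err*err / (2*100.f*100.f)); // assumes ~100mm sensor noise
    }
    // 3. Resample: draw the next population in proportion to weight, so that
    //    infeasible histories die off and feasible ones are duplicated
    std::vector<float> w;
    for (const Particle& p : ps) w.push_back(p.weight);
    std::discrete_distribution<size_t> pick(w.begin(), w.end());
    std::vector<Particle> next;
    next.reserve(ps.size());
    for (size_t i = 0; i < ps.size(); ++i) next.push_back(ps[pick(rng)]);
    ps.swap(next);
}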

Implementation

Tekkotsu provides a generic particle filter implementation in Shared/ParticleFilter.h. You specify what defines a 'particle' via the template argument. The ParticleBase class shows the minimum requirements expected by the ParticleFilter class. You can either inherit from ParticleBase or re-implement the requirements in your own class. (Since the particle type is specified by the template, inheritance isn't required, only content.)
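
As a sketch of what this can look like, the fragment below inherits from ParticleBase and adds a single application-specific field. The class name, its field, and the assumption that ParticleBase is a template taking the particle type and supplying the required members (such as the weight) are illustrative; consult ParticleBase in Shared/ParticleFilter.h for the authoritative requirements.

#include "Shared/ParticleFilter.h"

// Hypothetical particle for a one-dimensional localization problem; the
// required members (e.g. the weight) are assumed to come from ParticleBase.
class HallwayParticle : public ParticleBase<HallwayParticle> {
public:
    HallwayParticle() : ParticleBase<HallwayParticle>(), dist(0) {}
    float dist; // hypothesized distance along a hallway (mm)
};

typedef ParticleFilter<HallwayParticle> HallwayPF;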

The particle filter also specifies some base classes which you do need to inherit from in order to define your application-specific handling of the particles: a motion model, which moves the particles to follow the robot's reported movements; a sensor model, which weights each particle by how well it explains the current sensor data; and a resampling policy (with an embedded distribution policy), which controls how and when the population is redrawn.

To facilitate the most common usage, robot localization, we provide a LocalizationParticle, which can be paired with a DeadReckoningBehavior as the motion model to track a robot moving in a plane with x, y, and angular velocities. The sensor model assumes a certain type of landmark, which is covered in the next section.
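
For reference, each LocalizationParticle carries a planar pose plus the weight maintained by the filter; these are the same p.x, p.y, p.theta, and p.weight fields used by the reporting code later on this page. A minimal sketch of reading them (the header path is an assumption; check your source tree):

#include <iostream>
#include "Shared/LocalizationParticle.h" // assumed header path

// Sketch: print one particle's hypothesized pose and current score.
void reportParticle(const LocalizationParticle& p) {
    std::cout << "pose (" << p.x << ", " << p.y << ") mm, heading "
              << p.theta << " rad, weight " << p.weight << std::endl;
}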

Usage: Localization example

We provide all the pieces you need to do localization using visual landmarks on the ground. This solution is based on the DualCoding package, which handles shape extraction and ground plane projection. The generic ParticleFilter is subclassed as DualCoding::ShapeLocalizationPF to supply the application-specific pieces for you, so it comes preconfigured with all the policies and models you need to get moving -- all you have to do is configure it and call update()!

Initialization and Configuration

First you'll need to declare an instance of ShapeLocalizationPF:

#include "DualCoding/DualCoding.h"

using namespace DualCoding;

ShapeLocalizationPF filter;

Now you can begin customizing settings. Below are two important parameters of the default resampling policy; see the class documentation for more.

typedef ShapeLocalizationPF::LowVarianceResamplingPolicy ResamplingPolicy;
if(ResamplingPolicy* resample = 
        dynamic_cast<ResamplingPolicy*>(filter.getResamplingPolicy())) {
    // by default, resampling occurs on each update (delay=0)
    // Since groundplane projection is quite noisy while walking, we'll average over
    // many samples so we get a decent estimate of particle accuracy
    resample->resampleDelay=30;
    
    // by default the minimum is -FLT_MAX (essentially no minimum)
    // we'll require particles to have at least some feasibility, or else
    // randomize to try to re-localize
    // (the std::log reflects that weights are compared on a log scale)
    resample->minAcceptableWeight=std::log(2e-9f);
}

Changing the DistributionPolicy's variance controls how "tight" the cluster can get after we resample the particles. As mentioned in the ResamplingPolicy class notes, increasing this variance reduces accuracy and is generally best avoided, but it may be necessary to counteract an overconfident sensor model.

// Regarding the typedef: every resampling policy embeds a distribution policy;
// the default is specified by the particle's own DistributionPolicy typedef
typedef ShapeLocalizationPF::particle_type::DistributionPolicy DistributionPolicy;
if(DistributionPolicy* dp = dynamic_cast<DistributionPolicy*>
        (&filter.getResamplingPolicy()->getDistributionPolicy())) {
    dp->positionVariance *= 1; // not actually changing value, just demonstration
    dp->orientationVariance *= 1;
}

We can also specify the motion model's variance (how much the particles spread apart as they travel). Note that we could instead add velocity parameters to the state space and have the particles track velocity as well as position. Something for you to experiment with... (a sketch of that idea follows the next code block).

typedef HolonomicMotionModel<ShapeLocalizationPF::particle_type> MotionModel;
if(MotionModel * motion = dynamic_cast<MotionModel*>(filter.getMotionModel())) {
    // these are currently the default parameters, but explicitly reset for demonstration:
    motion->setVariance(50,50,.15);
}
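
As a starting point for that velocity experiment, a hypothetical particle subclass could carry per-particle velocity estimates; a custom motion model would then integrate (and perturb) them on each update. This class is not part of Tekkotsu, just a sketch of the idea:

// Hypothetical sketch only -- not an existing Tekkotsu class.
class VelocityParticle : public ShapeLocalizationPF::particle_type {
public:
    VelocityParticle() : ShapeLocalizationPF::particle_type(), xvel(0), yvel(0), avel(0) {}
    float xvel, yvel, avel; // per-particle velocity estimates; a custom motion
                            // model would integrate these (plus noise) each update
};

You would then instantiate ParticleFilter with this particle type and supply a motion model that uses the velocity fields.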

The next step is to place the landmarks on the world map. Here, we'll define the world to consist of just a pair of circles (represented as ellipses with equal radii), one yellow, one pink, 130mm apart. (All distances are in millimeters.)

//! this function defines the expected layout of the world
void setupLandmarksOnMap() {
    // a pair of ellipses (pink and yellow)
    float r=27.5; // radius; currently used only for display purposes
    
    NEW_SHAPE(pinkm,EllipseData,new EllipseData(worldShS,Point(65,0),r,r));
    pinkm->setColor("pink");
    
    NEW_SHAPE(yellowm,EllipseData,new EllipseData(worldShS,Point(-65,0),r,r));
    yellowm->setColor("yellow");
}

The DualCoding module's MapBuilder will handle determining the placement of objects by making a flat-world/groundplane assumption. However, we need to tell it what to look for with a MapBuilderRequest. You can set up one of these during initialization and then reuse it any time you want to update the "local map" corresponding to what is seen by the camera.

MapBuilderRequest mapreq(MapBuilderRequest::localMap);
const int pink_index = ProjectInterface::getColorIndex("pink");
const int yellow_index = ProjectInterface::getColorIndex("yellow");
mapreq.objectColors[ellipseDataType].insert(pink_index);
mapreq.objectColors[ellipseDataType].insert(yellow_index);

mapreq.maxDist=2000; // can ignore things beyond a certain distance (here, 2m)

Usage

Now that we've got the settings we want, it's time to put it all together. Assuming we're running this from a subclass of VisualRoutinesBehavior (so that we have access to the DualCoding MapBuilder), you would do something like this:

virtual void DoStart() {
    VisualRoutinesBehavior::DoStart(); // do this first (required)
    
    // only doing processing on camera frames where there's something to see
    erouter->addListener(this, EventBase::visObjEGID,
        ProjectInterface::visPinkBallSID, EventBase::statusETID);
    erouter->addListener(this, EventBase::visObjEGID,
        ProjectInterface::visYellowBallSID, EventBase::statusETID);
}

virtual void processEvent(const EventBase& event) {
    mapBuilder.executeRequest(mapreq); // project camera space to "local" space
    filter.update(); // have our particle filter match local space against world space

    // Report current position:
    const ShapeLocalizationPF::particle_type& p = filter.getBestParticle();
    cout << "Best index: " << filter.getBestIndex() << " with score: " << p.weight << endl;
    mapBuilder.setAgent(Point(p.x,p.y), p.theta); // updates graphical display
    cout << "Current positions: " << p.x << " " << p.y << " " << p.theta << endl;
}

You can view a souped-up, ready-to-run version of the code shown here in Behaviors/Demos/LocalizationBehavior.h.