A New Benchmark for the Energy-Constrained Fastest Path with Waypoints (EFP+W) Problem

This public benchmark addresses a new type of problem that combines two well-known problems from the literature that have never been considered together: energy-constrained path planning and fastest-path planning through mandatory waypoints.

The goal of this benchmark is to compare the efficiency of various solving approaches on instances of different complexities. In the medium term, the objective is to determine the best category of algorithms for solving this kind of problem.

The problem addressed and described below is an Energy-Constrained Fastest Path with Waypoints. For any questions about this page, please contact François LUCAS.

The terrain is represented by a graph made of vertices and directed edges. The goal is to determine the best path for a single vehicle from a start position S to a goal position G in the graph. The constraints are: a subset of mandatory vertices to pass through (in no predetermined order), a maximum speed, and an initial energy resource for the mission. A simple energy consumption law (a function of the vehicle speed) is given; each edge of the graph may be traversed at a specific but constant speed. The best path minimizes the overall vehicle travel duration while satisfying the aforementioned constraints.

A mission describes a path request over a given graph with specified constraints. It is defined by an XML file (Mission_XXX.xml) furnished in conformance with a single XSD file (Mission.xsd).

A planning describes a path solution for a single mission. It is defined by an XML file (Planning_XXX.xml) to be furnished in conformance with a single XSD file (Planning.xsd).

Please see XML and XSD (XML schema definition) for more information about these formats (cf. W3C).

A test set contains a set of mission data XML files. The first one has been available since May 2011. Please see the latest test set below.

The complete benchmark will eventually consist of several test sets.

Each test set may be split into several test subsets. Each test subset gathers missions over the same graph. Subset data consists of a single zipped file containing a list of XML files (each corresponding to a single mission) and a PNG file showing the graph representation.

You may download the benchmark data below. If you are programming in Java, some code examples are also given to easily extract mission data and write planning results in the correct format. Results may be posted with any additional comments; they will be evaluated, compared, and published alongside the results of the other candidates.

To evaluate the performance of solving algorithms, a ranking method has been established (please refer to the FAQ section).

Test set 1

Shipping date: 2011/05/30

Cost function and consumption constraints

The parameters are:

  • Smax: maximum speed (kilometers per hour); e.g. 60 km/h
  • R0: initial energy resource (equivalent liters); e.g. 30 L
  • F0: minimal resource flow consumed at zero speed (equivalent liters per hour); e.g. 1 L/h (0 L/h for an electric vehicle)
  • Q: utility constant; e.g. 400
  • m(e): consumption coefficient (%), varying over each edge; e.g. 110 %

The variable for the consumption constraint is:

  • s(e): (constant) vehicle speed (kilometers per hour) over edge e

The consumption flow law is a function of the vehicle speed: F = F0 + (m/100) · s² / Q (in liters per hour), where m = m(e) is the consumption coefficient of the edge and s = s(e) the speed over it.
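As a purely illustrative sketch (the class and method names are hypothetical and not part of the furnished tools), the law can be computed as follows, using the example parameter values F0 = 1 L/h, M = 100 % and Q = 400:

```java
// Sketch of the consumption flow law F = F0 + (m/100) * s^2 / Q.
// Hypothetical helper class; the benchmark's furnished Java files only cover XML I/O.
public class ConsumptionLaw {

    // Flow in liters per hour at constant speed s (km/h).
    static double flow(double f0, double m, double q, double s) {
        return f0 + (m / 100.0) * s * s / q;
    }

    // Energy (liters) consumed over an edge of length d (km) run at constant speed s (km/h).
    static double edgeConsumption(double f0, double m, double q, double s, double d) {
        return flow(f0, m, q, s) * (d / s);
    }

    public static void main(String[] args) {
        double f0 = 1.0, m = 100.0, q = 400.0;   // example parameter values
        double f = flow(f0, m, q, 60.0);         // 1 + 60*60/400 = 10 L/h
        System.out.println("Flow at 60 km/h: " + f + " L/h");
        System.out.println("Per 100 km: " + 100.0 * f / 60.0 + " L/100km");
        // The benchmark expresses consumption in centiliters and time in seconds,
        // so a solver would convert: liters * 100 = cL, hours * 3600 = s.
    }
}
```

Note how slower speeds are not always cheaper: the static flow F0 accumulates over the longer travel time, which is what makes the speed choice on each edge a genuine trade-off.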

Two MATLAB output screenshots illustrate this consumption law for an example set of parameter values:

  • Consumption as a function of speed: consumption (liters per 100 kilometers, from 5 to 30, vertically upwards) versus speed (kilometers per hour, from 5 to 100, horizontally rightwards), with parameters F0 (static flow) = 1 L/h, M = 100 % and Q = 400.
  • Consumption as a function of speed and static flow: consumption (liters per 100 kilometers, from 0 to 60, vertically upwards) versus speed (kilometers per hour, from 5 to 100) and the static flow parameter F0 (liters per hour, from 0 to 60), with parameters M = 100 % and Q = 400.

© MATLAB output screenshots

To preserve the precision of the final solution, consumption variables are expressed in centiliters and time variables in seconds. Candidates are free to implement sharper precision by using finer units.

Mission data

The data are defined in a set of mission data XML files (… Mission_XXX.xml…) patterned on the furnished commented single mission data XSD file (Mission.xsd). Each mission XML file also contains the optimal planning for that mission.

Please see XML and XSD (XML schema definition) for more information about these formats.

This test set is split into 3 mission data subsets (1, 2 and 3) corresponding to 3 graphs (1, 2 and 3). These graphs are all flat (altitude = 0) and undirected (each edge has its reverse edge). The different mission data of each subset correspond to various values of the parameters (local and global constraints).

A table describing the test set mission data is available on the Test set 1 page.

A mission data XML file contains (please see the commented XML schema definition file Mission.xsd for details):

  • missionId
  • graph
    • vertices (list)
      • id
      • X
      • Y
    • edges (list)
      • srcVertexId
      • dstVertexId
      • overConsumption
  • localConstraints
    • startVertexId
    • goalVertexId
    • mandWaypointVertexId (list)
  • globalConstraints
    • maxSpeed (Smax)
    • initResource (R0)
    • minResourceFlow (F0)
    • utilitaryConstant (Q)
  • optimSolution (list)
    • pathVertexId
    • pathTime
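As a purely illustrative sketch, a mission file following the element list above might look as below. All values are invented, and the exact element layout is an assumption; the authoritative structure is the commented Mission.xsd.

```xml
<!-- Hypothetical mission file sketch; see Mission.xsd for the authoritative structure. -->
<mission>
  <missionId>1-001</missionId>
  <graph>
    <vertices>
      <vertex><id>0</id><X>0.0</X><Y>0.0</Y></vertex>
      <vertex><id>1</id><X>1000.0</X><Y>0.0</Y></vertex>
      <vertex><id>2</id><X>1000.0</X><Y>1000.0</Y></vertex>
    </vertices>
    <edges>
      <edge><srcVertexId>0</srcVertexId><dstVertexId>1</dstVertexId><overConsumption>110</overConsumption></edge>
      <edge><srcVertexId>1</srcVertexId><dstVertexId>2</dstVertexId><overConsumption>100</overConsumption></edge>
    </edges>
  </graph>
  <localConstraints>
    <startVertexId>0</startVertexId>
    <goalVertexId>2</goalVertexId>
    <mandWaypointVertexId>1</mandWaypointVertexId>
  </localConstraints>
  <globalConstraints>
    <maxSpeed>60</maxSpeed>
    <initResource>30</initResource>
    <minResourceFlow>1</minResourceFlow>
    <utilitaryConstant>400</utilitaryConstant>
  </globalConstraints>
  <optimSolution>
    <pathVertexId>0</pathVertexId><pathTime>0</pathTime>
    <pathVertexId>1</pathVertexId><pathTime>60</pathTime>
    <pathVertexId>2</pathVertexId><pathTime>120</pathTime>
  </optimSolution>
</mission>
```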

Tools

To facilitate your code writing, two Java files are furnished:

  • readMission.java: a reader (extractor) for XML mission data files (from XML file to in-memory structure)
  • writePlanning.java: a writer for XML planning result files (from in-memory structure to XML file)

Planning results

Your planning result XML files, one for each mission (… Planning_XXX.xml corresponding to Mission_XXX.xml…), should be formatted in conformance with the furnished commented single planning result XSD file (Planning.xsd). To facilitate this work, an example XML planning result file (Planning_Ex.xml) is provided. It is also possible to use our tools (cf. Tools section) to generate the files easily.

Please see XML and XSD (XML schema definition) for more information about these formats (cf. W3C).

A planning results XML file contains (please see the commented XML schema definition file Planning.xsd for details) :

  • identifier
    • entity
    • deliveryDate
    • missionId
  • computer
    • cpuType
    • processorNb
    • clockFreq
    • ramSize
    • cacheMemorySize
  • algorithm
  • foundSolution
    • computationTime
    • pathVertices (list)
      • pathVertexId
      • pathTime
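As a purely illustrative sketch, a planning result file following the element list above might look as below. All values are invented, and the exact element layout is an assumption; the authoritative structure is the commented Planning.xsd, and a real example is given in Planning_Ex.xml.

```xml
<!-- Hypothetical planning file sketch; see Planning.xsd and Planning_Ex.xml. -->
<planning>
  <identifier>
    <entity>ExampleLab</entity>
    <deliveryDate>2011/06/15</deliveryDate>
    <missionId>1-001</missionId>
  </identifier>
  <computer>
    <cpuType>Intel Core 2 Duo</cpuType>
    <processorNb>2</processorNb>
    <clockFreq>2400</clockFreq>              <!-- MHz -->
    <ramSize>4096</ramSize>                  <!-- MB -->
    <cacheMemorySize>4</cacheMemorySize>     <!-- MB (L2) -->
  </computer>
  <algorithm>Description of the algorithm used (free text)</algorithm>
  <foundSolution>
    <computationTime>12</computationTime>    <!-- seconds -->
    <pathVertices>
      <pathVertexId>0</pathVertexId><pathTime>0</pathTime>
      <pathVertexId>1</pathVertexId><pathTime>60</pathTime>
      <pathVertexId>2</pathVertexId><pathTime>120</pathTime>
    </pathVertices>
  </foundSolution>
</planning>
```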

The clock frequency shall be given in MHz; RAM size and cache memory size (L2) shall be given in MB. It is up to the candidate to give details about the algorithm used (these will also be published on this webpage). A link to a description website may be added. Any additional comments are also welcome.

In order to publish your results, please refer to the section Publishing your results.

Dataset description

This test set contains :

  • 2 XSD files :
    • Mission.xsd (please click Mission.xsd to see this file)
    • Planning.xsd (please click Planning.xsd to see this file)
  • 1 XML example planning file :
    • Planning_Ex.xml
  • 2 JAVA IO files :
    • readMission.java
    • writePlanning.java
  • 3 subsets, each of them attached to a specific graph and containing XML mission files (… Mission_XXX.xml…) :
    • Subset 1 (graph 1), containing 60 XML mission files (… Mission_1-YYY.xml…) and 1 PNG graph file (Graph1.png)
    • Subset 2 (graph 2), containing 60 XML mission files (… Mission_2-YYY.xml…) and 1 PNG graph file (Graph2.png)
    • Subset 3 (graph 3), containing 60 XML mission files (… Mission_3-YYY.xml…) and 1 PNG graph file (Graph3.png)

Downloads

© Free to use for any evaluation purpose, but please read the copyright information first.

Global download (single compact file) :

  • Download the whole Test set (Coming soon)

Elementary download (split into several compact files):

  • Download Subset 1: XML mission files and PNG graph file (ZIP, 6+1 files, 63 kB)
  • Download Subset 2: XML mission files and PNG graph file (ZIP, 6+1 files, 50 kB)
  • Download Subset 3: XML mission files and PNG graph file (ZIP, 6+1 files, 59 kB)

Benchmark results

We will be pleased to publish the results of your own mission planning algorithm on our website as long as you use the same data sets.

A table describing the test set mission data is available on the Test set 1 page. It contains, for each mission:

  • Id
  • Optimal solution path (unseparated character string: a list of pathVertexId(pathTime) items)
  • Optimal solution value (mission duration)

Global results for the centers or companies already publishing results in this benchmark:

| Id | Author | Shipping date | Rate: solution | Rate: optim. proof | Mean rel. optim. proximity | Mean rel. speed: solution | Mean rel. speed: optim. proof | Note: solution | Note: optim. proof | Note: global | Mean rel. environ. perf. indicator |
|----|--------|---------------|----------------|--------------------|----------------------------|---------------------------|-------------------------------|----------------|--------------------|--------------|------------------------------------|
| 0 (reference) | Safran Sagem Défense Sécurité | 2010/09/14 | 180 / 180 | 180 / 180 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 | 1.000 |
| 1 | Candidat1... | 2010/10/15 | 150 / 180 | 120 / 150 | 0.900 | 0.800 | 0.700 | 0.600 | 0.560 | 0.336 | 0.750 |


In this table:

  • for details on the meaning of each column, please see the FAQ section,
  • for details on the mission data test set, please see Test set 1 and access that test set page,
  • for details on each planning result shipping, choose a line and click the Id item in that line to access the shipping page,
  • for details on each author, choose a line and click the Author item in that line to access the author's site.

FAQ

How can I add the performance of my algorithm on this page?

Please refer to the section Publishing your results.

Which method is used to evaluate the performance of the algorithms?

The evaluation method is based on an open scale referenced to the first candidate of each benchmark test set: value = 1 (1.000 in the tables).

It synthesizes: the rate of mission planning solutions found, the rate of solution optimality proved, the mean relative optimum proximity, and the mean relative computing delays to find the solution and to prove its optimality.

In detail, the analysis values correspond to:

  1. Rate of solution: number of satisfying solutions / number of missions
  2. Rate of solution optimality proof: number of proved-optimal solutions / number of satisfying solutions
  3. Mean relative solution optimum proximity: mean, over all satisfying solutions, of the quotients optimal mission duration / found mission duration
  4. Mean relative computing speed for solution: mean, over all satisfying solutions, of the quotients reference computing delay to solution / computing delay to solution
  5. Mean relative computing speed for solution optimality proof: mean, over all proved-optimal solutions, of the quotients reference computing delay to prove solution optimality / computing delay to prove solution optimality

The final synthesis notes are computed from these 5 rates and means:

  1. Note solution = (1) × (3) × (4)
  2. Note solution optimality proof = (2) × (5)
  3. Note global = product of the two previous notes = (1) × (2) × (3) × (4) × (5)
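As a minimal sketch (hypothetical helper names), these synthesis notes can be reproduced from the analysis values; the sample figures are taken from the Candidat1 line of the results table, assuming its five analysis values are, in order, 150/180, 120/150, 0.900, 0.800 and 0.700:

```java
// Sketch of the synthesis note computation described above (hypothetical helper names).
public class BenchmarkNotes {

    // Note solution = (1) rate of solution x (3) optimum proximity x (4) speed for solution
    static double noteSolution(double rateSolution, double optimProximity, double speedSolution) {
        return rateSolution * optimProximity * speedSolution;
    }

    // Note solution optimality proof = (2) rate of proof x (5) speed for proof
    static double noteProof(double rateProof, double speedProof) {
        return rateProof * speedProof;
    }

    public static void main(String[] args) {
        // Analysis values of the Candidat1 line of the results table
        double nSol = noteSolution(150.0 / 180.0, 0.900, 0.800); // 0.600
        double nPrf = noteProof(120.0 / 150.0, 0.700);           // 0.560
        System.out.println("Note solution: " + nSol);
        System.out.println("Note proof:    " + nPrf);
        System.out.println("Note global:   " + nSol * nPrf);     // 0.336
    }
}
```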

A mean relative test environment performance indicator is added at the end of the table. It is the mean, over all satisfying solutions, of the capacity indicators manually derived from the textual information furnished by the candidate for each mission.

Publishing your results

To publish your performance on this webpage, please send us the results of your algorithm and any additional information. Your post should be sent to:

Note that the performances are computed according to the rules described in the FAQ section and are (of course) exactly the same for all algorithms.

Copyrights

All data are free, publicly available, and can be used for any research purpose.
However, if you publish results (or make your tests public in any other way), please acknowledge that the data come from the Robotics Centre of Mines ParisTech and are publicly available at: http://www.lara.prd.fr/benchmarks/missionplanning
Commercial use is NOT ALLOWED without our official agreement.

Contact

For any information or questions about commercial use, please contact:


©20110530 Christian JOUBERT, François LUCAS Mines-Paristech CAOR, Safran Sagem Défense Sécurité

 
benchmarks/missionplanning.txt · Last modified: 2012/05/14 23:50 by Francois Lucas