Write the docs #17

Draft
wants to merge 14 commits into base: main
2 changes: 1 addition & 1 deletion .github/workflows/Test.yml
@@ -32,7 +32,7 @@ jobs:
fail-fast: false
matrix:
version:
-      - "1.6"
+      - "1.7"
- "1"
os:
- ubuntu-latest
8 changes: 6 additions & 2 deletions Project.toml
@@ -4,12 +4,16 @@ authors = ["Matthijs Arnoldus <[email protected]> and contributors"]
version = "0.1.0"

[deps]
DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
Distances = "b4f34e82-e78d-54a5-968a-f98e89d6e8f7"
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
Metaheuristics = "bcdb8e00-2c21-11e9-3065-2b553b22f898"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"

 [compat]
+Distances = "0.10"
 JuMP = "1"
 MathOptInterface = "1"
-Distances = "0.10"
-julia = "1.6"
+Metaheuristics = "3.3"
+julia = "1.7"
3 changes: 3 additions & 0 deletions docs/make.jl
@@ -22,6 +22,9 @@ makedocs(;
),
pages = [
"Home" => "index.md",
"How to Use" => "how-to-use.md",
"Tutorials" => "tutorials.md",
"Concepts" => "concepts.md",
"Contributing" => "contributing.md",
"Dev setup" => "developer.md",
"Reference" => "reference.md",
50 changes: 50 additions & 0 deletions docs/src/concepts.md
@@ -0,0 +1,50 @@
```@contents
Pages = ["concepts.md"]
Depth = 5
```

# Concepts

Here we explain the theoretical concepts underlying NearOptimalAlternatives.jl in more detail. We first discuss the traditional approach, modelling-to-generate-alternatives, and then the evolutionary approach.

## Modelling-to-Generate-Alternatives (MGA)

Modelling-to-generate-alternatives (MGA) is a technique, introduced by Brill Jr et al. [^brill], for finding alternative solutions to an optimisation problem that are as different as possible from the optimal solution. Their Hop-Skip-Jump (HSJ) MGA method works as follows. First, an initial solution is found using any optimisation method. Next, an amount of slack specified by the user is added to the objective function. Then, this relaxed objective is encoded as a constraint, and a new objective is introduced that minimises the weighted sum of the decision variables that appeared in previous solutions. This process is iterated for as long as the solutions keep changing.

[^brill]: E. D. Brill Jr, S.-Y. Chang, and L. D. Hopkins, “Modeling to generate alternatives: The hsj approach and an illustration using a problem in land use planning,” Management Science, vol. 28, no. 3, pp. 221–235, 1982.

For problems with non-binary variables, the corresponding MGA problem can be formulated as follows. Given the optimal solution $x^*$ to an optimisation problem with constraints $Ax \leq b, x \geq 0$ and objective $c^{\top}x$, we solve the following problem:

$$
\begin{aligned}
\max \quad & \lVert x - x^* \rVert_d \\
\text{s.t.} \quad & c^{\top}x \geq (1-\epsilon)\, c^{\top}x^* \\
& Ax \leq b \\
& x \geq 0,
\end{aligned}
$$
where $\epsilon$ is the objective gap, which specifies the maximum relative difference between the objective value of an alternative solution and the optimal objective value, and $\lVert \cdot \rVert_d$ is any distance metric.
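As an illustration, this formulation can be written down directly in JuMP. The sketch below is ours, not part of the package's API: the data (`A`, `b`, `c`), the precomputed optimum `x_star`, the choice of the squared Euclidean metric, and the use of Ipopt (a nonlinear solver, since maximising a distance is a non-convex problem) are all assumptions for the example.

```julia
using JuMP, Ipopt

# Illustrative data and a precomputed optimum of the original problem.
A = [1.0 1.0]
b = [10.0]
c = [1.0, 2.0]
x_star = [0.0, 10.0]  # assumed optimum of max c'x s.t. Ax <= b, x >= 0
ε = 0.1               # objective gap

mga = Model(Ipopt.Optimizer)
@variable(mga, x[1:2] >= 0)
@constraint(mga, A * x .<= b)
# The original objective, encoded as a constraint with slack (1 - ε):
@constraint(mga, c' * x >= (1 - ε) * (c' * x_star))
# Maximise the (squared Euclidean) distance to the optimal solution:
@objective(mga, Max, sum((x[i] - x_star[i])^2 for i in 1:2))
optimize!(mga)
```

Note that a local solver such as Ipopt only guarantees a local optimum of this non-convex maximisation, which is one reason the choice of metric and solver matters in practice.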

## Evolutionary Algorithms for Generating Alternatives (EAGA)

Evolutionary algorithms have been proposed as an alternative method to mathematical programming for generating alternative solutions by Zechman and Ranjithan [^zechman].

[^zechman]: E. M. Zechman and S. R. Ranjithan, “An evolutionary algorithm to generate alternatives (eaga) for engineering optimization problems,” Engineering Optimization, vol. 36, no. 5, pp. 539–553, 2004.

Their method works as follows. Instead of initialising a single population as a regular evolutionary algorithm would, they divide the population into $P$ subpopulations, where $P$ is the number of alternative solutions to be found. Each subpopulation is dedicated to searching for one alternative solution; the first subpopulation can also be used to find the global optimum. After initialisation, the following steps are repeated. First, all individuals are evaluated with respect to the objective and feasibility. The distance between an individual and the other subpopulations, or their centroids, is also taken into account, so the best individual is a feasible solution that is furthest away from the other subpopulations. Elitism is used to preserve the best solution in each subpopulation. Then, after checking the stopping criteria, binary tournament selection based on fitness is applied to select the remaining individuals.
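As a small sketch of the initialisation step, assuming a box-constrained search space (the function name and bounds below are ours, for illustration only):

```julia
# Split a randomly initialised population into P equal-size subpopulations.
# Each individual is a point in the dim-dimensional box [lo, hi]^dim.
function init_subpopulations(P, pop_size, dim; lo = 0.0, hi = 1.0)
    per_sub = pop_size ÷ P
    [[lo .+ (hi - lo) .* rand(dim) for _ in 1:per_sub] for _ in 1:P]
end

subpops = init_subpopulations(4, 40, 2)  # 4 subpopulations of 10 individuals each
```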

### Particle Swarm Optimisation for Generating Alternatives (PSOGA)

In this package we developed PSOGA, a modification of EAGA that uses Particle Swarm Optimisation (PSO). It works as follows.

When initialising the algorithm, the population of individuals is divided into $n$ equal-size subpopulations, where $n$ is the number of alternative solutions sought. As with regular PSO, each individual has a position $x$ and a velocity $v$.

The update step of the algorithm is very similar to that of regular PSO. In every iteration, each individual is updated as follows. First, its velocity is updated to
$$v = \omega \cdot v + \textit{rand}(0,1) \cdot c_1 \cdot (p_{\textit{best}} - x) + \textit{rand}(0,1) \cdot c_2 \cdot (s_{\textit{best}} - x).$$
In the above equation, $\omega$ is the inertia weight, $c_1$ the cognitive parameter, and $c_2$ the social parameter. These ensure, respectively, that the old velocity, the individual's own previous information, and information from other individuals in the subpopulation are taken into account. The variables $p_\textit{best}$, the personal best position of this individual, and $s_\textit{best}$, the all-time best of the subpopulation this individual belongs to, are therefore required. Note that $s_\textit{best}$ replaces $g_\textit{best}$ from regular PSO, which represents the global best solution of the full population.

After updating the velocity of each individual, all positions are updated using $x = x + v$. Subsequently, all personal bests and subpopulation bests are updated based on the objective value. For PSOGA the objective is to generate alternatives that are as different as possible from the optimal solution, but also from each other. The aim here is to make sure each subpopulation finds one alternative, and these are spread out over the search space.
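The two update steps can be sketched in plain Julia as follows; the parameter values are illustrative, not the package's defaults, and each $\textit{rand}(0,1)$ in the equation is a fresh draw.

```julia
# One PSOGA update for a single individual, following the velocity equation above.
function update_velocity(v, x, p_best, s_best; ω = 0.7, c1 = 1.5, c2 = 1.5)
    ω .* v .+ rand() * c1 .* (p_best .- x) .+ rand() * c2 .* (s_best .- x)
end

v = [0.5, -0.5]
x = [1.0, 1.0]
p_best = [1.2, 0.8]   # personal best position of this individual
s_best = [2.0, 0.0]   # all-time best of this individual's subpopulation
v = update_velocity(v, x, p_best, s_best)
x = x .+ v            # position update
```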

When comparing two solutions to decide which is better, we therefore take the following approach. If either solution is infeasible, we take the one with the smallest constraint violation. If both are feasible, we pick the one with the largest distance, where distance can be defined in two ways: either the sum of the distances to the other subpopulations and to the original optimal solution, or the minimum of those distances. To compute the distance to another subpopulation, we calculate the centroid (average) of all points in that subpopulation and take the distance to this centroid.
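These comparison rules can be sketched as follows. The function names, the Euclidean distance, and the `violation` callback are our illustrative assumptions; the package's internals may differ.

```julia
using Statistics: mean
using LinearAlgebra: norm

# Centroid (average) of the points in a subpopulation.
centroid(pop) = mean(pop)

# Distance of x to the optimum and to every other subpopulation's centroid,
# aggregated either as the minimum (default here) or as the sum.
function distance_score(x, x_star, other_pops; aggregate = minimum)
    ds = [norm(x - centroid(p)) for p in other_pops]
    push!(ds, norm(x - x_star))
    aggregate(ds)
end

# Pick the better of two candidates: least constraint violation first,
# then largest distance. `violation(x)` returns 0 for feasible solutions.
function better(x1, x2, violation, x_star, other_pops)
    v1, v2 = violation(x1), violation(x2)
    if v1 > 0 || v2 > 0
        return v1 <= v2 ? x1 : x2
    end
    d1 = distance_score(x1, x_star, other_pops)
    d2 = distance_score(x2, x_star, other_pops)
    return d1 >= d2 ? x1 : x2
end
```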

The algorithm terminates when the subpopulations have converged or the maximum number of iterations is reached. By then, the subpopulations should be spread out over the feasible space, as far as possible from the initial optimal solution.
45 changes: 45 additions & 0 deletions docs/src/how-to-use.md
@@ -0,0 +1,45 @@
# How to Use

```@contents
Pages = ["how-to-use.md"]
Depth = 5
```

## Install

In Julia:

- Enter package mode (press "]")

```pkg
pkg> add NearOptimalAlternatives
```

- Return to Julia mode (backspace)

```julia
julia> using NearOptimalAlternatives
```

## Generate alternatives

To generate alternative solutions to a solved JuMP model, use either of the following functions:

- [`generate_alternatives!(model, optimality_gap, n_alternatives)`](@ref)
- [`generate_alternatives(model, optimality_gap, n_alternatives, metaheuristic_algorithm)`](@ref)

The `model` should be a solved JuMP model. The `optimality_gap` is the maximum factor of deviation from the optimal solution. `n_alternatives` specifies the desired number of alternative solutions. If you want to generate alternatives with a metaheuristic instead of mathematical optimisation, specify the `metaheuristic_algorithm` to use. Other optional input parameters are specified in the Input section below.

## Input

The following parameters can be supplied to either of the alternative-generating functions (unless otherwise specified). The ones already mentioned in the previous section are required; the rest are optional.

- `model`: The solved JuMP model for which we want to find alternative solutions. When using optimisation to find alternatives, the solver specified to solve this model will also perform the optimisation for finding alternatives.
- `optimality_gap`: The maximum objective value deviation each of the alternative solutions may have from the original solution. An optimality gap of `0.5` means that the objective value of an alternative solution must be at least `50%` of the optimal objective value found by solving `model` (in case of a maximisation problem).
- `n_alternatives`: The number of alternative solutions to be found by this package.
- `metaheuristic_algorithm` (only for metaheuristic `generate_alternatives`): The algorithm used to find alternative solutions. Can be an algorithm from [Metaheuristics.jl](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/) or the algorithm we developed: `PSOGA`. The former are repeated iteratively to find multiple alternatives, the latter generates multiple alternatives concurrently.
- `metric`: The distance metric used to compute the difference between solutions (between different alternatives and between alternatives and the optimal solution). This metric should be a `SemiMetric` from the [Distances.jl](https://github.com/JuliaStats/Distances.jl) package. Note that, depending on the solver used, several metrics might not be usable when finding alternatives using optimisation. When using a metaheuristic, any metric is usable.
- `fixed_variables`: A vector of variables that should remain fixed when finding alternative solutions. One can use this to find near optimal alternative solutions that only modify a subset of all variables and leave the rest unchanged.

## Output

Both methods for generating alternative solutions return results in the same form: a structure `AlternativeSolutions` containing a vector `solutions` and a vector `objective_values`. `solutions` holds, per alternative solution, a dictionary mapping every JuMP variable (by its `VariableRef`) to its solution value. `objective_values` is a vector of floats representing the objective value of each of the alternative solutions.
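For example, assuming `result` holds the returned `AlternativeSolutions`, the alternatives can be inspected as follows (a sketch based on the field names described above; `model` is assumed to be a solved JuMP model):

```julia
result = NearOptimalAlternatives.generate_alternatives!(model, 0.5, 2)

for (i, solution) in enumerate(result.solutions)
    println("Alternative $i (objective value $(result.objective_values[i])):")
    for (variable, value) in solution  # keys are JuMP `VariableRef`s
        println("  $variable = $value")
    end
end
```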
14 changes: 12 additions & 2 deletions docs/src/index.md
@@ -2,6 +2,16 @@
CurrentModule = NearOptimalAlternatives
```

-# NearOptimalAlternatives
+# NearOptimalAlternatives.jl Documentation

-Documentation for [NearOptimalAlternatives](https://github.com/TulipaEnergy/NearOptimalAlternatives.jl).
+[NearOptimalAlternatives.jl](https://github.com/TulipaEnergy/NearOptimalAlternatives.jl) is a package for generating near-optimal alternative solutions to a solved [JuMP.jl](https://github.com/jump-dev/JuMP.jl) optimisation problem. The alternative solutions are within a maximum specified percentage of the optimum and are as different from the optimal solution (and from each other) as possible. Alternatives can be generated either using mathematical optimisation or using a metaheuristic algorithm; for the latter, this package depends on [Metaheuristics.jl](https://github.com/jmejia8/Metaheuristics.jl).

## License

This content is released under the [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0) license.

## Contents

```@contents
Pages = ["index.md", "how-to-use.md", "tutorials.md", "concepts.md", "contributing.md", "developer.md", "reference.md"]
```
62 changes: 62 additions & 0 deletions docs/src/tutorials.md
@@ -0,0 +1,62 @@
```@contents
Pages = ["tutorials.md"]
Depth = 5
```

# Tutorials

Here are three tutorials on how to use NearOptimalAlternatives.jl. They show how to generate alternatives using optimisation, using a metaheuristic algorithm, and using our own metaheuristic PSOGA, respectively.

## Alternatives using optimisation

Given a solved JuMP model called `model`, first define the number of alternatives to generate and the maximum deviation in objective value compared to the optimal solution. For instance,

```julia
optimality_gap = 0.5 # Objective value may deviate at most 50% from optimal solution.
n_alternatives = 2
```

Then, call the following function to generate the alternatives:

```julia
alternatives = NearOptimalAlternatives.generate_alternatives!(model, optimality_gap, n_alternatives)
```

By default, this method uses the squared Euclidean metric from the [Distances.jl](https://github.com/JuliaStats/Distances.jl) package. To use a different distance metric, simply define the metric and supply it as a keyword argument (weighted metrics are also supported):

```julia
metric = Distances.Euclidean() # Use Euclidean instead of SqEuclidean
alternatives = NearOptimalAlternatives.generate_alternatives(model, optimality_gap, n_alternatives, metric=metric)
```

If you only want to vary specific variables of a problem when generating alternatives, you can fix the remaining variables as follows. Suppose the model contains three variables ($x_1$, $x_2$, $x_3$) and you want to fix $x_2$. Simply create a vector of fixed variables and supply it as a parameter to the function.

```julia
fixed_variables = [x_2] # x_2 should be the VariableRef in the JuMP model.
alternatives = NearOptimalAlternatives.generate_alternatives(model, optimality_gap, n_alternatives, fixed_variables=fixed_variables)
```

## Alternatives using a metaheuristic algorithm

Generating alternatives using a metaheuristic algorithm from [Metaheuristics.jl](https://github.com/jmejia8/Metaheuristics.jl) works similarly. We still need a solved `model`, an `optimality_gap` and the number of alternatives `n_alternatives`. In addition, we now need to specify the algorithm we want to use. For instance:

```julia
metaheuristic_algorithm = Metaheuristics.PSO()
```

Then we call the following function using all parameters to obtain the results.

```julia
alternatives = generate_alternatives(model, optimality_gap, n_alternatives, metaheuristic_algorithm)
```

Again, `metric` and `fixed_variables` can be supplied as optional parameters. The parameters of the `metaheuristic_algorithm` can be defined when initialising it. For more details on this, take a look at the [Metaheuristics.jl documentation](https://jmejia8.github.io/Metaheuristics.jl/stable/).

### Alternatives using PSOGA

To use our concurrent Particle Swarm Optimisation metaheuristic PSOGA, take the same steps as for any other metaheuristic. The only difference is that the number of alternatives must also be supplied to the algorithm itself, so that it knows how many subpopulations to maintain. The following code shows how to do this and obtain the alternatives.

```julia
metaheuristic_algorithm = NearOptimalAlternatives.PSOGA(N_solutions=n_alternatives)
alternatives = generate_alternatives(model, optimality_gap, n_alternatives, metaheuristic_algorithm)
```
5 changes: 5 additions & 0 deletions src/NearOptimalAlternatives.jl
@@ -5,9 +5,14 @@ module NearOptimalAlternatives
using JuMP
using Distances
using MathOptInterface
using Metaheuristics
using DataStructures
using Statistics

include("results.jl")
include("alternative-optimisation.jl")
include("generate-alternatives.jl")
include("alternative-metaheuristics.jl")
include("algorithms/PSOGA/PSOGA.jl")

end