
Commit 9005c13

author: Documenter.jl
commit message: delete history

File tree: 21,113 files changed, +6,096,493 −0 lines


.gitignore

Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
+.DS_Store

.nojekyll

Whitespace-only changes.

0.12/_sources/callbacks.txt

Lines changed: 411 additions & 0 deletions
Large diffs are not rendered by default.

0.12/_sources/example.txt

Lines changed: 91 additions & 0 deletions
@@ -0,0 +1,91 @@

.. _simple-example:

Simple Example
^^^^^^^^^^^^^^

In this section we will construct a simple model and explain every step along the way.
There are more complex examples in the ``JuMP/examples/`` `folder <https://github.com/JuliaOpt/JuMP.jl/tree/master/examples>`_. Here is the code we will walk through::

    using JuMP

    m = Model()
    @defVar(m, 0 <= x <= 2 )
    @defVar(m, 0 <= y <= 30 )

    @setObjective(m, Max, 5x + 3*y )
    @addConstraint(m, 1x + 5y <= 3.0 )

    print(m)

    status = solve(m)

    println("Objective value: ", getObjectiveValue(m))
    println("x = ", getValue(x))
    println("y = ", getValue(y))

Once JuMP is :ref:`installed <jump-installation>`, to use JuMP in your
programs, you just need to say::

    using JuMP

Models are created with the ``Model()`` function::

    m = Model()

.. note::
   Your model doesn't have to be called ``m`` - it's just a name.

There are a few options for defining a variable, depending on whether you want
to have lower bounds, upper bounds, both bounds, or even no bounds. The following
commands will create two variables, ``x`` and ``y``, with both lower and upper bounds.
Note the first argument is our model variable ``m``. These variables are associated
with this model and cannot be used in another model::

    @defVar(m, 0 <= x <= 2 )
    @defVar(m, 0 <= y <= 30 )

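The paragraph above notes that bounds are optional. Under the same macro syntax, the other forms would look like this (an illustrative sketch; these variables are not part of the walkthrough model)::

    @defVar(m, z >= 0 )    # lower bound only
    @defVar(m, w <= 100 )  # upper bound only
    @defVar(m, u )         # free variable, no bounds
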

Next we'll set our objective. Note again the ``m``, so we know which model's
objective we are setting! The objective sense, ``Max`` or ``Min``, should
be provided as the second argument. Note also that we don't have a multiplication ``*``
symbol between 5 and our variable ``x`` - Julia is smart enough to not need it!
Feel free to stick with ``*`` if it makes you feel more comfortable, as we have
done with ``3*y``::

    @setObjective(m, Max, 5x + 3*y )

Adding constraints is a lot like setting the objective. Here we create a
less-than-or-equal-to constraint using ``<=``, but we can also create equality
constraints using ``==`` and greater-than-or-equal-to constraints with ``>=``::

    @addConstraint(m, 1x + 5y <= 3.0 )

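For completeness, the two other comparison operators mentioned above are used the same way (an illustrative sketch; these constraints are not part of the walkthrough model)::

    @addConstraint(m, x + y == 1.0 )   # equality constraint
    @addConstraint(m, 2x + y >= 0.5 )  # greater-than-or-equal-to constraint
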
If you want to see what your model looks like in a human-readable format,
the ``print`` function is defined for models::

    print(m)

Models are solved with the ``solve()`` function. This function will not raise
an error if your model is infeasible - instead it will return a flag. In this
case, the model is feasible so the value of ``status`` will be ``:Optimal``,
where ``:`` again denotes a symbol. The possible values of ``status``
are described :ref:`here <solvestatus>`::

    status = solve(m)

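Because ``solve`` returns a status flag rather than raising an error, a common pattern (our sketch, not part of the original walkthrough) is to branch on the returned symbol::

    status = solve(m)
    if status == :Optimal
        println("Solved to optimality")
    else
        println("Solver returned status ", status)
    end
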
Finally, we can access the results of our optimization. Getting the objective
value is simple::

    println("Objective value: ", getObjectiveValue(m))

To get the value from a variable, we call the ``getValue()`` function. If ``x``
is not a single variable, but instead a range of variables, ``getValue()`` will
return a list. In this case, however, it will just return a single value::

    println("x = ", getValue(x))
    println("y = ", getValue(y))
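
The "range of variables" case mentioned above would look like this (an illustrative sketch in the same macro syntax; the variable ``v`` is hypothetical and not part of the walkthrough model)::

    @defVar(m, v[1:3] >= 0 )
    # ... build and solve the model ...
    println(getValue(v))  # one value per index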

0.12/_sources/index.txt

Lines changed: 124 additions & 0 deletions
@@ -0,0 +1,124 @@

===========================================
JuMP --- Julia for Mathematical Programming
===========================================

.. module:: JuMP
   :synopsis: Julia for Mathematical Programming

`JuMP <https://github.com/JuliaOpt/JuMP.jl>`_ is a domain-specific modeling language for
`mathematical programming <http://en.wikipedia.org/wiki/Mathematical_optimization>`_
embedded in `Julia <http://julialang.org/>`_.
It currently supports a number of open-source and commercial solvers (see below)
for a variety of problem classes, including **linear programming**, **mixed-integer programming**, **second-order conic programming**, **semidefinite programming**, and **nonlinear programming**.
JuMP's features include:

* User friendliness

  * Syntax that mimics natural mathematical expressions.
  * Complete documentation.

* Speed

  * Benchmarking has shown that JuMP can create problems at similar speeds to
    special-purpose modeling languages such as `AMPL <http://www.ampl.com/>`_.
  * JuMP communicates with solvers in memory, avoiding the need to write
    intermediary files.

* Solver independence

  * JuMP uses a generic solver-independent interface provided by the
    `MathProgBase <https://github.com/mlubin/MathProgBase.jl>`_ package, making it easy
    to change between a number of open-source and commercial optimization software packages ("solvers").
  * Currently supported solvers include
    `Bonmin <https://projects.coin-or.org/Bonmin>`_,
    `Cbc <https://projects.coin-or.org/Cbc>`_,
    `Clp <https://projects.coin-or.org/Clp>`_,
    `Couenne <https://projects.coin-or.org/Couenne>`_,
    `CPLEX <http://www-01.ibm.com/software/commerce/optimization/cplex-optimizer/>`_,
    `ECOS <https://github.com/ifa-ethz/ecos>`_,
    `GLPK <http://www.gnu.org/software/glpk/>`_,
    `Gurobi <http://www.gurobi.com>`_,
    `Ipopt <https://projects.coin-or.org/Ipopt>`_,
    `KNITRO <http://www.ziena.com/knitro.htm>`_,
    `MOSEK <http://www.mosek.com/>`_,
    `NLopt <http://ab-initio.mit.edu/wiki/index.php/NLopt>`_,
    and `SCS <https://github.com/cvxgrp/scs>`_.

* Access to advanced algorithmic techniques

  * Including :ref:`efficient LP re-solves <probmod>` and :ref:`callbacks for mixed-integer programming <callbacks>`, which previously required using solver-specific and/or low-level C++ libraries.

* Ease of embedding

  * JuMP itself is written purely in Julia. Solvers are the only binary dependencies.
  * Being embedded in a general-purpose programming language makes it easy to solve optimization problems as part of a larger workflow (e.g., inside a simulation, behind a web server, or as a subproblem in a decomposition algorithm).

    * As a trade-off, JuMP's syntax is constrained by the syntax available in Julia.

  * JuMP is `MPL <https://www.mozilla.org/MPL/2.0/>`_ licensed, meaning that it can be embedded in commercial software that complies with the terms of the license.

While neither Julia nor JuMP has reached version 1.0 yet, the releases are stable enough for everyday use and are being used in a number of research projects and neat applications by a growing community of early-adopter users. JuMP remains under active development, and we welcome your feedback, suggestions, and bug reports.

Installing JuMP
---------------

If you are familiar with Julia you can get started quickly by using the
package manager to install JuMP::

    julia> Pkg.add("JuMP")

And a solver, e.g.::

    julia> Pkg.add("Clp")  # Will install Cbc as well

Then read the :ref:`quick-start` and/or see a :ref:`simple-example`.
The subsequent sections detail the complete functionality of JuMP.

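With a solver package installed, you can also tell ``Model()`` explicitly which solver to use via its ``solver`` keyword (a sketch of the 0.12-era pattern; by default JuMP selects an installed solver automatically)::

    using JuMP
    using Clp

    m = Model(solver=ClpSolver())
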
77+
Contents
--------

.. toctree::
   :maxdepth: 2

   installation.rst
   quickstart.rst
   refmodel.rst
   refvariable.rst
   refexpr.rst
   probmod.rst
   callbacks.rst
   nlp.rst

-----------
Citing JuMP
-----------

If you find JuMP useful in your work, we kindly request that you cite the following `paper <http://dx.doi.org/10.1287/ijoc.2014.0623>`_:

.. code-block:: none

    @article{LubinDunningIJOC,
        author = {Miles Lubin and Iain Dunning},
        title = {Computing in Operations Research Using Julia},
        journal = {INFORMS Journal on Computing},
        volume = {27},
        number = {2},
        pages = {238-248},
        year = {2015},
        doi = {10.1287/ijoc.2014.0623},
        URL = {http://dx.doi.org/10.1287/ijoc.2014.0623}
    }

A preprint of this paper is freely available on `arXiv <http://arxiv.org/abs/1312.1431>`_.

If you use the nonlinear or conic optimization functionality of JuMP, please cite the following `preprint <http://arxiv.org/abs/1508.01982>`_ which describes the methods implemented in JuMP. You may cite it as:

.. code-block:: none

    @article{DunningHuchetteLubin2015,
        title = {{JuMP}: {A} modeling language for mathematical optimization},
        author = {Iain Dunning and Joey Huchette and Miles Lubin},
        journal = {arXiv:1508.01982 [math.OC]},
        year = {2015},
        url = {http://arxiv.org/abs/1508.01982}
    }
