# PyMC for Linear Programming

In :
```
import pymc as pm, numpy as np, matplotlib.pyplot as plt, seaborn as sns
%matplotlib inline
```

While I understand that there are far superior methods for solving LP problems (e.g. interior point algorithms), can MCMC be used to solve a stochastic LP problem?

PyMC2 can be combined with the LP solver of your choice to solve stochastic LP problems like this one. Here is code to do it for this very simple case. I’ve left a note on how I would change this for a more complex LP.

In :
```
c1 = pm.Normal('c1', mu=2, tau=.5**-2)  # tau is precision, so sd = 0.5
c2 = -3                                 # fixed, not stochastic
b1 = pm.Normal('b1', mu=0, tau=3.**-2)  # sd = 3
```
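For reference, the stochastic LP these priors encode (my reading of the code; the post doesn't write it out explicitly) is

$$\begin{aligned}
\min_{x_1, x_2} \quad & c_1 x_1 + c_2 x_2 \\
\text{s.t.} \quad & -x_1 + x_2 \le b_1, \qquad x_1, x_2 \ge 0,
\end{aligned}$$

with $c_1 \sim \mathrm{Normal}(2, 0.5^2)$, $c_2 = -3$ fixed, and $b_1 \sim \mathrm{Normal}(0, 3^2)$.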
In :
```
@pm.deterministic
def x(c1=c1, c2=c2, b1=b1):
    # use an LP solver here for a complex problem
    arg_min = np.empty(2)
    min_val = np.inf
    for x1, x2 in [[0, 0], [0, b1], [-b1, 0]]:  # there are only three possible extreme points,
        if -x1 + x2 <= b1 and x1 >= 0 and x2 >= 0:  # so check obj value at each valid one
            val = c1*x1 + c2*x2
            if val < min_val:
                min_val = val
                arg_min = [x1, x2]

    return np.array(arg_min, dtype=float)
```
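The comment in the cell above marks where a real LP solver would go for a larger problem. A hedged sketch of that swap using `scipy.optimize.linprog` (my addition, not part of the original post; the function name and the `highs` method choice are mine). One caveat worth flagging: for draws with $c_1 < 3$ this particular LP is actually unbounded along the ray $(1, 1)$, which the extreme-point enumeration sidesteps by only comparing the three listed points — a general solver reports the failure instead.

```python
import numpy as np
from scipy.optimize import linprog

def solve_lp(c1, c2, b1):
    # minimize c1*x1 + c2*x2  subject to  -x1 + x2 <= b1,  x1, x2 >= 0
    res = linprog(c=[c1, c2], A_ub=[[-1.0, 1.0]], b_ub=[b1],
                  bounds=[(0, None), (0, None)], method="highs")
    if not res.success:          # e.g. unbounded or infeasible
        return np.full(2, np.nan)
    return res.x
```

Inside the `@pm.deterministic`, `return solve_lp(c1, c2, b1)` would replace the enumeration loop, though you would then need to decide how the model should handle the unbounded draws.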
In :
```
m = pm.MCMC(dict(c1=c1, c2=c2, b1=b1, x=x))
m.sample(20000, 10000, 10)  # 20,000 iterations, 10,000 burn-in, thin by 10
```
` [-----------------100%-----------------] 20000 of 20000 complete in 0.6 sec`
In :
```
pm.Matplot.plot(m)
```
```Plotting b1
Plotting c1
Plotting x_0
Plotting x_1
```

Look at the weird joint distribution for $(x_1, x_2)$:

In :
```
sns.jointplot(m.x.trace()[:,0], m.x.trace()[:,1], stat_func=None)
```
Out:
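The shape of that plot has a simple explanation: since `x` is a deterministic function of the priors and nothing is observed, the joint distribution can be forward-simulated without MCMC at all. A minimal sketch (the `arg_min_lp` helper and the seed are mine, mirroring the deterministic above):

```python
import numpy as np

def arg_min_lp(c1, c2, b1):
    # same extreme-point enumeration as in the @pm.deterministic above
    best, best_val = np.zeros(2), np.inf
    for x1, x2 in [(0.0, 0.0), (0.0, b1), (-b1, 0.0)]:
        if -x1 + x2 <= b1 and x1 >= 0 and x2 >= 0:
            val = c1 * x1 + c2 * x2
            if val < best_val:
                best_val, best = val, np.array([x1, x2])
    return best

rng = np.random.default_rng(12345)
c1s = rng.normal(2.0, 0.5, size=1000)   # tau = .5**-2  ->  sd = 0.5
b1s = rng.normal(0.0, 3.0, size=1000)   # tau = 3.**-2  ->  sd = 3
xs = np.array([arg_min_lp(c1, -3.0, b1) for c1, b1 in zip(c1s, b1s)])
# every draw lands on one of the two positive half-axes:
# [0, b1] when b1 > 0, and [-b1, 0] when b1 < 0
```

That is why the joint distribution looks weird: all the mass sits on the two half-axes, split according to the sign of `b1`.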