
Linear Programming Topics

Linear programming is a quantitative analysis technique for optimizing an objective function given a set of constraints. As the name implies, the functions must be linear in order for linear programming techniques to be used.

 

Problem Formulation Checklist

The objective function and constraints are formulated from information extracted from the problem statement. The following checklist is useful for minimizing the risk of errors in problem formulation; a small worked sketch follows the list.

  1. Every number in the problem statement should either be implemented in the formulation or rejected as irrelevant, e.g. sunk costs.
  2. Don’t forget any initial conditions, e.g. the initial staff on hand at the beginning of the first staffing period.
  3. Ensure every variable in the objective function is listed somewhere in the constraints.
  4. Ensure that any non-negativity constraints are listed.
  5. Ensure that binary integer variables are restricted to 0,1.
    For example, Y ∈ {0,1}
  6. For good form, move all variables to the left-hand side of each equation or inequality, writing them in the order of their subscripts.
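
As a concrete illustration of the checklist, the following is a minimal sketch using SciPy's linprog on a hypothetical two-variable product-mix problem; the profit coefficients, constraint rows, and right-hand sides are invented for illustration only.

    from scipy.optimize import linprog

    # Hypothetical product-mix problem: maximize 40*X1 + 30*X2
    # subject to 2*X1 + 1*X2 <= 100 (labor) and 1*X1 + 3*X2 <= 90 (material).
    # linprog minimizes, so the objective coefficients are negated.
    c = [-40, -30]

    # Checklist item 6: variables on the left-hand side, in subscript order.
    A_ub = [[2, 1],     # labor:    2*X1 + 1*X2 <= 100
            [1, 3]]     # material: 1*X1 + 3*X2 <= 90
    b_ub = [100, 90]

    # Checklist item 4: explicit non-negativity bounds for every variable.
    bounds = [(0, None), (0, None)]

    result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print(result.x, -result.fun)   # optimal X1, X2 and the maximized profit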

 

Sensitivity Analysis

While problems may be modeled using deterministic objective functions, in the real world there is variation. A sensitivity analysis can be performed to determine the sensitivity of the solution to changes in parameters.

Microsoft Excel can generate a sensitivity report in two parts: a changing cells report and a constraints report.

Changing Cells (Adjustable Cells)

 

Columns: Cell | Name | Final Value | Reduced Cost | Objective Coefficient | Allowable Increase | Allowable Decrease

For the Changing Cells report, the allowable increase and decrease refer to how much a decision variable's objective function coefficient can change without changing the values of any of the decision variables. The objective function value itself will still change if a coefficient changes while the corresponding decision variable keeps a nonzero value. Note, though, that multiplying every term in the objective function by the same positive constant does not change the values of the decision variables.

The 100% rule can be used to determine whether changing multiple objective function coefficients at once will change the values of the decision variables. Under this rule, any combination of changes can occur without changing the solution as long as the sum of each change, expressed as a percentage of its allowable increase (for increases) or allowable decrease (for decreases), does not exceed 100%. The objective value would still change, however, since the objective coefficients are changing.
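
A small illustrative helper for checking the rule; the input format is an assumption made for this sketch, not something Excel produces.

    def within_100_percent_rule(changes):
        """changes: list of (proposed_change, allowable_increase, allowable_decrease)
        for each objective coefficient being changed. Returns True if the summed
        percentage of the allowable ranges used is at most 100%."""
        used = 0.0
        for delta, allow_inc, allow_dec in changes:
            limit = allow_inc if delta >= 0 else allow_dec
            used += abs(delta) / limit
        return used <= 1.0

    # Hypothetical example: one coefficient uses 40% of its allowable increase and
    # another uses 50% of its allowable decrease, 90% in total, so the solution holds.
    print(within_100_percent_rule([(4, 10, 8), (-5, 6, 10)]))   # True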

For the purpose of this analysis, the decision variable coefficient is the effective number that is multiplied by the decision variable when the objective function is simplified so that each decision variable appears once.

The reduced cost is how much more attractive a variable's objective function coefficient must become before the variable is worth using, i.e., before it takes a nonzero value in the optimal solution. Ignore the sign reported by Excel.

 

Constraints

 

Columns: Cell | Name | Final Value | Shadow Price | Constraint R.H. Side | Allowable Increase | Allowable Decrease

The shadow price is the amount by which the objective function value would change if the right-hand side of the named constraint changed by one unit. The shadow price is valid up to the allowable increase or decrease in the constraint's right-hand side. The shadow price after the constraint has changed by the entire allowable amount is unknown, but it is never more favorable than the reported value, due to the law of diminishing returns.

To determine if a constraint is binding, compare the Final Value with the Constraint R.H. Side. If a constraint is non-binding, its shadow price is zero.
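
A brief sketch of how these report columns are typically read, using hypothetical numbers for the shadow price, right-hand-side change, and allowable range:

    def predicted_objective_change(shadow_price, rhs_change, allow_inc, allow_dec):
        """Predicted change in the objective value for a proposed change to a
        constraint's right-hand side, or None if the change falls outside the
        range over which the shadow price is valid."""
        if -allow_dec <= rhs_change <= allow_inc:
            return shadow_price * rhs_change
        return None

    final_value, rhs = 100, 100            # equal, so the constraint is binding
    print(final_value == rhs)              # True
    print(predicted_objective_change(shadow_price=5, rhs_change=3,
                                     allow_inc=20, allow_dec=10))   # 5 * 3 = 15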

 

Linearity from Non-Linear Problems

Many problems that appear non-linear at first can be made linear by careful formulation. For example, one can often avoid the inequality ≠ altogether: for binary integer variables, X + Y ≠ 1 forces X + Y to be 0 or 2, which is the same as saying X = Y and can be written as the linear constraint X − Y = 0.

 

Binary Variables

When relating continuous decision variables to binary switch variables, the following form often is useful:

continuous variable expression ≤ M × (binary variable)

where M is a constant safely larger than any value the continuous expression can take (often called a “big-M” constraint).
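
A hedged sketch of this linking constraint using the PuLP modeling package; the profit, fixed cost, capacity, and the value of M below are hypothetical.

    from pulp import LpProblem, LpMaximize, LpVariable, LpBinary

    M = 10_000                         # constant safely above the largest possible x
    prob = LpProblem("switch_example", LpMaximize)

    x = LpVariable("x", lowBound=0)    # continuous production quantity
    y = LpVariable("y", cat=LpBinary)  # 1 if the line is opened, 0 otherwise

    prob += 8 * x - 5000 * y           # hypothetical profit less a fixed opening cost
    prob += x <= M * y                 # linking constraint: x forced to 0 when y = 0
    prob += x <= 1200                  # hypothetical capacity limit

    prob.solve()
    print(x.value(), y.value())        # expect x = 1200, y = 1 for these numbers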

 

 

Simulation

Linear programming techniques assume certainty and by themselves do not deal well with significant randomness. The following is one possible procedure for maximizing or minimizing some objective function that contains random variables.

  1. Express the objective function in terms of the decision variable.
  2. Define a search range and incremental search value for the decision variable, possibly using problem information to reduce the search range.
  3. Run a simulation for each incremental value of the decision variable using a Monte Carlo simulator such as Crystal Ball.
  4. Compare the mean expected values of the objective function and their confidence intervals, possibly using statistical hypothesis testing, to identify the best solution (a sketch of this procedure follows the list).
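
A minimal NumPy sketch of this procedure, standing in for a spreadsheet Monte Carlo add-in such as Crystal Ball; the newsvendor-style profit function, the demand distribution, and the search range are assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    trials = 10_000

    # Step 1 (assumed objective): profit for order quantity q with random demand d,
    # unit cost 6 and selling price 10, so profit = 10 * MIN(d, q) - 6 * q.
    def simulate_profit(q):
        d = rng.normal(loc=500, scale=100, size=trials)   # hypothetical demand
        return 10 * np.minimum(d, q) - 6 * q

    # Steps 2 and 3: search order quantities from 300 to 700 in increments of 25.
    results = {}
    for q in range(300, 701, 25):
        profits = simulate_profit(q)
        mean = profits.mean()
        half_width = 1.96 * profits.std(ddof=1) / np.sqrt(trials)  # 95% CI on the mean
        results[q] = (mean, half_width)

    # Step 4: report the candidate with the highest mean expected profit.
    best_q = max(results, key=lambda q: results[q][0])
    print(best_q, results[best_q])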

Confidence Intervals

95% confidence intervals:   +/- 1.96 sigma

90% confidence intervals:   +/- 1.645 sigma

 

Useful Functions

MIN(x,y) or MAX(x,y)

x,y can be any variable (possibly a random variable), expression, or number.

 

Applications include revenue calculations that might be limited by either demand or production quantity.

 

~N(μ = x, σ = y)

normal distribution with mean x and standard deviation y; written directly to the right of the random variable

~U[x,y]

uniform distribution with minimum x and maximum y; written directly to the right of the random variable
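
A hedged NumPy sketch combining these pieces; the unit price, production capacity, and distribution parameters are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    demand = rng.normal(loc=800, scale=120, size=10_000)   # ~N(mu = 800, sigma = 120)
    rebate = rng.uniform(low=0.0, high=0.10, size=10_000)  # ~U[0, 0.10]

    capacity = 750     # hypothetical production limit
    price = 20.0       # hypothetical unit price

    # Revenue limited by either demand or production: price * MIN(demand, capacity).
    units_sold = np.minimum(demand, capacity)
    revenue = price * (1 - rebate) * units_sold

    print(revenue.mean())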

 
