Methods: Linear programming

Definition

Linear programming (LP) covers all optimization problems whose objective function and constraints are linear. It is the simplest class of optimization problems that requires numerical solution approaches. All such problems can be written in the form:

$$ \begin{align} \min_x ~~~& \langle c,x\rangle \\ \text{s.t.} ~~~&Ax=b \\ ~~~& Gx\ge h \end{align}$$

in the optimization variable \(x\), where \(x, c, b, h\) are vectors and \(A, G\) are matrices. In detailed notation, the above problem reads as follows.

  • Find the variables \(x_1, …, x_n\) such that \(\langle c,x\rangle = c_1x_1 + \dots + c_nx_n\) is minimized.
  • The following conditions need to be satisfied:
    1. \(a_{k1} x_1 + \dots + a_{kn}x_n = b_k ~~~~ k=1, …, m_1\)
    2. \(g_{k1} x_1 + \dots + g_{kn}x_n \ge h_k ~~~~ k=1, …, m_2\)

The problem consists of a sum \(\langle c,x\rangle\) to be minimized, \(m_1\) equalities, and \(m_2\) inequalities. All terms are linear in the optimization variable \(x\), hence the name “linear programming”.
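The detailed notation above translates directly into code. The following pure-Python helpers (with made-up example data, not taken from the text) evaluate the objective \(\langle c,x\rangle\) and check whether a candidate point satisfies \(Ax = b\) and \(Gx \ge h\):

```python
def objective(c, x):
    # <c, x> = c_1*x_1 + ... + c_n*x_n
    return sum(ck * xk for ck, xk in zip(c, x))

def is_feasible(A, b, G, h, x, tol=1e-9):
    # Check the equalities A x = b and the inequalities G x >= h.
    eq_ok = all(abs(sum(ak * xk for ak, xk in zip(row, x)) - bk) <= tol
                for row, bk in zip(A, b))
    ineq_ok = all(sum(gk * xk for gk, xk in zip(row, x)) >= hk - tol
                  for row, hk in zip(G, h))
    return eq_ok and ineq_ok

# Illustrative data: minimize x1 + 2*x2  s.t.  x1 + x2 = 2,  x1, x2 >= 0
c = [1.0, 2.0]
A, b = [[1.0, 1.0]], [2.0]
G, h = [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]

x = [2.0, 0.0]                     # a candidate point
print(is_feasible(A, b, G, h, x))  # True
print(objective(c, x))             # 2.0
```

This is only a feasibility and cost check, not a solver; it makes explicit how the matrices \(A, G\) and vectors \(b, h\) encode the \(m_1 + m_2\) conditions.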

Example

LP includes archetypal economic production problems such as production planning [1, pp. 14–16]. For instance, if \(x_1, x_2\) are the quantities of two products with production costs \(c_1, c_2\), then the optimization problem

$$ \begin{align} \min_x ~~~& c_1 x_1 + c_2 x_2 \\ \text{s.t.} ~~~&x_1 + x_2 =2 \\ ~~~& x_1 \ge 0 ~~~ x_2 \ge 0 \end{align}$$

can be interpreted as an instruction to minimize production costs while covering a total demand of 2. Negative production quantities are not allowed. Extensions of this example to multiple products, minimum demands, and resource-consumption constraints are straightforward.
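This example can be handed to an off-the-shelf solver. The sketch below uses SciPy's `linprog` (assuming SciPy is available; the costs \(c_1 = 1, c_2 = 2\) are made up for illustration). Note that `linprog` expects inequalities in the form \(A_{ub}\, x \le b_{ub}\), so a constraint \(Gx \ge h\) would be passed as \(-Gx \le -h\); here the sign constraints are expressed via `bounds`:

```python
from scipy.optimize import linprog

c = [1.0, 2.0]                      # assumed production costs c1, c2
A_eq, b_eq = [[1.0, 1.0]], [2.0]    # x1 + x2 = 2 (total demand)
bounds = [(0, None), (0, None)]     # x1 >= 0, x2 >= 0

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print(res.x, res.fun)  # optimum at the vertex x = [2, 0] with cost 2
```

Since product 1 is cheaper, the entire demand is covered by it, which matches the geometric picture in Figure 1.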

Figure 1: Geometric illustration of the exemplary production planning problem. Each equation and inequality restricts the feasible set to a line or half-space.

Geometrical aspects

The above example has a clear geometric interpretation as determining a two-dimensional point \(x = [x_1, x_2]\) whose position is constrained by lines.

The optimum is attained at the vertex of the feasible set through which the iso-cost line of minimal value passes. In high-dimensional problems, two-dimensional vectors become \(n\)-dimensional vectors, line equations become hyperplane equations, and the inequalities define half-spaces. The fundamental geometric properties and interpretations remain the same.

Applications

The applications of linear programming extend far beyond production planning. They include problems from logistics, transportation and routing planning, machine scheduling, inventory control, diet design, material flow, factory layout, stochastic inequalities, image quality optimization, data compression, robust approximation, game theory, and logic. Depending on the application, \(\langle c,x\rangle\) represents monetary costs, negative profit, durations, maximum deviations, probabilities, or auxiliary quantities that are difficult to interpret. The representations of these problems as LPs are often not obvious.

Solution procedure

Linear programs are historically the first high-dimensional optimization problems relevant to real-world applications that could be reliably solved. In the context of planning problems for the US Air Force after World War II, Dantzig developed the so-called simplex method, whose further development and use for resource allocation problems helped Koopmans and Kantorovich win a Nobel Prize in Economics [2, p. 10].

As suggested by Figure 1, and as can be proven in general, the optimal point \(x\) of an LP is located at one of the vertices of the feasible set defined by the intersections of hyperplanes. Enumerating and sequentially testing each vertex therefore reliably identifies the optimum [3, p. 72]. Since the number of vertices can grow disproportionately with the number of dimensions, this complete enumeration is often costly, and better methods now exist: interior-point methods. These replace the inequalities with barrier functions whose weight is incrementally reduced, thereby generating a sequence of points (the central path) that converges to the optimum.
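Complete enumeration can be sketched in a few lines for the two-variable example above (a toy illustration, not a production solver; the costs are again assumed to be \(c_1 = 1, c_2 = 2\)): intersect each pair of constraint boundary lines, keep the feasible intersection points, and pick the cheapest.

```python
from itertools import combinations

def solve_2x2(a, b):
    # Solve a 2x2 linear system by Cramer's rule; None if singular.
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    if abs(det) < 1e-12:
        return None
    x1 = (b[0] * a[1][1] - a[0][1] * b[1]) / det
    x2 = (a[0][0] * b[1] - b[0] * a[1][0]) / det
    return (x1, x2)

# Boundary lines of the example's constraints:
# x1 + x2 = 2,  x1 = 0,  x2 = 0
lines = [([1.0, 1.0], 2.0), ([1.0, 0.0], 0.0), ([0.0, 1.0], 0.0)]
c = [1.0, 2.0]  # assumed costs

def feasible(v, tol=1e-9):
    x1, x2 = v
    return abs(x1 + x2 - 2) <= tol and x1 >= -tol and x2 >= -tol

# Vertices are intersections of pairs of boundary lines that remain feasible.
vertices = []
for (a1, b1), (a2, b2) in combinations(lines, 2):
    v = solve_2x2([a1, a2], [b1, b2])
    if v is not None and feasible(v):
        vertices.append(v)

best = min(vertices, key=lambda v: c[0] * v[0] + c[1] * v[1])
print(best)  # (2.0, 0.0): the optimal vertex
```

With \(n\) variables and \(m\) constraints there are up to \(\binom{m}{n}\) candidate intersections, which is why this brute-force approach does not scale and more refined methods are needed.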

Figure 2: Illustration of the simplex method, where all vertices of the feasible set are enumerated and tested (a). In the case of interior-point methods, an incrementally reduced barrier function takes the place of the inequalities (b).

Practical aspects

LPs with hundreds of thousands of optimization variables are now readily solvable on commercial home computers. Open-source algorithms such as those implemented in the GNU Linear Programming Kit (GLPK) or Google's GLOP solver are available for this purpose. The fact that even Excel is now capable of solving LPs can be seen as further evidence that LP has reached technical maturity. The main challenge lies in the formulation of the optimization problem itself.

Sources

[1] Hu, T. C., & Kahng, A. B. (2016). Linear and Integer Programming Made Easy. Berlin, Heidelberg: Springer.

[2] Vanderbei, R. J. (2013). Linear Programming: Foundations and Extensions. New York: Springer US.

[3] Gass, S. I. (2003). Linear Programming: Methods and Applications. New York: Courier Corporation.