
Wednesday, May 9, 2012

Fractional Programming

First, the general form of a fractional program is:
$$\text{Minimize } \lambda = f(x) = \frac{a(x)}{b(x)} \quad \text{s.t. } x \in S,\; b(x) > 0$$
      So how do we solve this problem? An effective approach is parametric search: we guess a value of $\lambda$ as the answer and check whether the guess is too large or too small, similar to binary search.
      First, we must show why parametric search works.
      Suppose $\lambda^* = f(x^*)$ is the optimal value. We have
$$\lambda = f(x) = \frac{a(x)}{b(x)} \;\Rightarrow\; \lambda\, b(x) = a(x) \;\Rightarrow\; a(x) - \lambda\, b(x) = 0$$
     Then we can define a new function: $g(\lambda) = \min_{x \in S}\big(a(x) - \lambda\, b(x)\big)$.
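     To make the definition concrete, here is a minimal sketch of $g$ for a small finite $S$ that we can enumerate directly; the names `S`, `a`, `b` are just placeholders (an assumption for illustration) for the feasible set and the two functions.

```python
# A minimal sketch: g(lambda) = min over x in S of a(x) - lambda * b(x),
# assuming S is a small finite iterable and a, b are plain functions with b(x) > 0.
def g(lam, S, a, b):
    return min(a(x) - lam * b(x) for x in S)
```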
      Let's discuss some properties of $g(\lambda)$.
      1. Monotonicity: $g(\lambda)$ is a strictly decreasing function: if $\lambda_1 < \lambda_2$, then $g(\lambda_1) > g(\lambda_2)$.
      Proof: suppose that when $\lambda = \lambda_1$ the minimum is attained at $x_1$, i.e. $g(\lambda_1) = \min_{x \in S}\big(a(x) - \lambda_1 b(x)\big) = a(x_1) - \lambda_1 b(x_1)$. Then
$$g(\lambda_1) = a(x_1) - \lambda_1 b(x_1) > a(x_1) - \lambda_2 b(x_1) \ge \min_{x \in S}\big(a(x) - \lambda_2 b(x)\big) = g(\lambda_2),$$
     where the strict inequality uses $b(x_1) > 0$ and $\lambda_1 < \lambda_2$. The last step holds because $x_1$ minimizes $a(x) - \lambda_1 b(x)$ but does not necessarily minimize $a(x) - \lambda_2 b(x)$.
     Monotonicity means that we can use binary search, but what is the termination condition?
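     Here is a minimal sketch of that binary search, reusing the `g()` sketched above. The bracket `[lo, hi]` is an assumption that must contain $\lambda^*$, and we stop after a fixed number of halvings; the theorem below justifies that the optimum is exactly the zero of $g$.

```python
# A minimal sketch of the parametric (binary) search on lambda, reusing g() above.
# It relies on g being strictly decreasing; lambda* is the zero of g.
def solve_by_binary_search(S, a, b, lo, hi, iters=100):
    """Approximate lambda* = min_{x in S} a(x)/b(x); [lo, hi] must contain it."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(mid, S, a, b) >= 0:
            lo = mid        # g(mid) >= 0  =>  mid <= lambda*, raise the lower bound
        else:
            hi = mid        # g(mid) < 0   =>  mid >  lambda*, lower the upper bound
    return (lo + hi) / 2
```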
      2. Dinkelbach's theorem: suppose $\lambda^*$ is the optimal value; then $g(\lambda) = 0$ if and only if $\lambda = \lambda^*$.
     First we prove the necessary condition: if $\lambda = \lambda^*$ (the optimal value), then $g(\lambda^*) = 0$. For all $x \in S$ we have
$$\lambda^* = \frac{a(x^*)}{b(x^*)} \le \frac{a(x)}{b(x)} \;\Rightarrow\; a(x) - \lambda^*\, b(x) \ge 0,$$
but for $x = x^*$: $\lambda^* = \frac{a(x^*)}{b(x^*)} \;\Rightarrow\; a(x^*) - \lambda^*\, b(x^*) = 0$.
    So the minimum value of $a(x) - \lambda^* b(x)$ is 0, attained at $x = x^*$, i.e. $g(\lambda^*) = 0$.
    Now we move to the sufficient condition: if $g(\lambda) = 0$, then $\lambda = \lambda^*$. Suppose there exists $x_1$ such that $g(\lambda_1) = \min_{x \in S}\big(a(x) - \lambda_1 b(x)\big) = a(x_1) - \lambda_1 b(x_1) = 0$; we will show that $\lambda_1 = f(x_1)$ is the optimal value.
    We prove this by contradiction: suppose there exists another $\lambda_2 = f(x_2)$ with $\lambda_2 < \lambda_1$. Then
$$\lambda_2 = \frac{a(x_2)}{b(x_2)} < \lambda_1 \;\Rightarrow\; a(x_2) - \lambda_1\, b(x_2) < 0,$$
    which contradicts $g(\lambda_1) = 0$ being the minimum of $a(x) - \lambda_1 b(x)$ over $S$. So we get a contradiction.
    We can now draw the conclusion: $g(\lambda) = 0$ exactly at $\lambda = \lambda^*$, which gives the binary search its termination test.
    The core idea of this technique is to transform a fractional program into a non-fractional one. 0-1 fractional programming is a special case of fractional programming.
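    The theorem also suggests an iterative scheme, commonly known as Dinkelbach's algorithm: repeatedly minimize $a(x) - \lambda b(x)$, then update $\lambda = a(x)/b(x)$, stopping when $g(\lambda)$ hits 0. This is a minimal sketch for a finite $S$, reusing the `S`, `a`, `b` names assumed above; it is not part of the original post's binary search.

```python
# A minimal sketch of a Dinkelbach-style iteration, assuming a finite S
# and functions a, b with b(x) > 0, as above.
def solve_by_dinkelbach(S, a, b, eps=1e-12):
    S = list(S)
    lam = a(S[0]) / b(S[0])                      # start from any feasible x, lambda = f(x)
    while True:
        x_best = min(S, key=lambda x: a(x) - lam * b(x))
        if a(x_best) - lam * b(x_best) > -eps:   # g(lambda) == 0 up to eps: optimal
            return lam, x_best
        lam = a(x_best) / b(x_best)              # strictly decreases lambda each round
```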
    Two examples of 0-1 fractional programming:
    the maximum density subgraph and the optimum ratio spanning tree; a sketch of the spanning-tree case follows.
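    As an illustration, here is a minimal sketch of the optimum ratio spanning tree: among all spanning trees, minimize $\sum \text{cost}_e / \sum \text{weight}_e$. For a fixed $\lambda$, minimizing $\sum(\text{cost}_e - \lambda\,\text{weight}_e)$ is just a minimum spanning tree under modified edge weights, so the parametric search applies directly. The edge representation `(u, v, cost, weight)` and the bracket `[lo, hi]` are assumptions for illustration, and the graph is assumed connected.

```python
# A minimal sketch: optimum ratio spanning tree via parametric search.
def mst_value(n, edges, lam):
    """Minimum of sum(c - lam*w) over spanning trees, via Kruskal's algorithm."""
    parent = list(range(n))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]        # path halving
            u = parent[u]
        return u
    total = 0.0
    for u, v, c, w in sorted(edges, key=lambda e: e[2] - lam * e[3]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            total += c - lam * w
    return total

def optimum_ratio_spanning_tree(n, edges, lo=0.0, hi=1e6, iters=60):
    """Binary search on lambda using the sign of g(lambda) = mst_value."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mst_value(n, edges, mid) >= 0:
            lo = mid        # lambda not above the optimum
        else:
            hi = mid        # lambda above the optimum
    return (lo + hi) / 2
```

    The maximum density subgraph follows the same reduction, except that for a fixed $\lambda$ the inner optimization is solved with a minimum cut (maximum flow) instead of an MST.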