$$\text{Minimize } \lambda = f(x) = \frac{a(x)}{b(x)} \quad (x \in S)$$
$$\text{s.t. } \forall x \in S,\ b(x) > 0$$
So how do we solve this problem? An effective approach is parametric search: we assume some value $\lambda$ is the answer and check whether the true optimum is smaller or larger, just like binary search.
First, we must prove why we can use parametric search:
Suppose $\lambda^* = f(x^*)$ is the optimal solution. We have
$$\lambda^* = f(x^*) = \frac{a(x^*)}{b(x^*)} \;\Rightarrow\; \lambda^* \cdot b(x^*) = a(x^*) \;\Rightarrow\; a(x^*) - \lambda^* \cdot b(x^*) = 0$$
Then we can define a new function:
$$g(\lambda) = \min_{x \in S}\bigl(a(x) - \lambda \cdot b(x)\bigr)$$
Let's look at some properties of $g(\lambda)$.
1. Monotonicity: $g(\lambda)$ is strictly decreasing, i.e. if $\lambda_1 < \lambda_2$, then $g(\lambda_1) > g(\lambda_2)$.
Proof: suppose that for $\lambda = \lambda_1$ the minimum is attained at $x_1$, i.e. $g(\lambda_1) = \min_{x \in S}(a(x) - \lambda_1 \cdot b(x)) = a(x_1) - \lambda_1 \cdot b(x_1)$. Then, using $b(x_1) > 0$,
$$g(\lambda_1) = a(x_1) - \lambda_1 \cdot b(x_1) > a(x_1) - \lambda_2 \cdot b(x_1) \ge \min_{x \in S}(a(x) - \lambda_2 \cdot b(x)) = g(\lambda_2)$$
The last step holds because $x_1$ minimizes $a(x) - \lambda_1 \cdot b(x)$; it does not necessarily minimize $a(x) - \lambda_2 \cdot b(x)$ as well, hence only $\ge$.
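As a quick illustration, take a made-up two-element set $S$ with $(a, b)$ pairs $(3, 2)$ and $(5, 4)$ (these numbers are only for illustration):
$$g(1) = \min(3 - 2,\ 5 - 4) = 1, \qquad g(1.5) = \min(3 - 3,\ 5 - 6) = -1, \qquad g(2) = \min(3 - 4,\ 5 - 8) = -3,$$
so $g$ indeed decreases. The optimal ratio here is $\min(3/2,\ 5/4) = 1.25$, and $g(1.25) = \min(0.5,\ 0) = 0$, which matches the theorem stated next.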
Monotonicity means that we can use binary search, but what is the termination condition?
2. Dinkelbach's theorem: suppose $\lambda^*$ is the optimal value; then $g(\lambda) = 0$ if and only if $\lambda = \lambda^*$.
First we prove the necessary condition: if $\lambda = \lambda^*$, then $g(\lambda) = 0$. For every $x \in S$ we have
$$\frac{a(x)}{b(x)} \ge \lambda^* \;\Rightarrow\; a(x) - \lambda^* \cdot b(x) \ge 0$$
since $\lambda^*$ is the minimum ratio and $b(x) > 0$,
but for $x^*$:
$$\lambda^* = \frac{a(x^*)}{b(x^*)} \;\Rightarrow\; a(x^*) - \lambda^* \cdot b(x^*) = 0$$
So the minimum value $g(\lambda^*)$ is $0$, attained at $x = x^*$.
Now we move to the sufficient condition: $g(\lambda) = 0 \Rightarrow \lambda = \lambda^*$. Suppose for some $\lambda_1$ we have $g(\lambda_1) = \min_{x \in S}(a(x) - \lambda_1 \cdot b(x)) = 0$, attained at $x_1$, so that $\lambda_1 = f(x_1)$; we show that $\lambda_1$ is the optimal value.
We prove it by contradiction: suppose there existed another $\lambda_2 = f(x_2)$ with $\lambda_2 < \lambda_1$. Then
$$\lambda_2 = \frac{a(x_2)}{b(x_2)} < \lambda_1 \;\Rightarrow\; a(x_2) - \lambda_1 \cdot b(x_2) < 0$$
which would give $g(\lambda_1) \le a(x_2) - \lambda_1 \cdot b(x_2) < 0$, contradicting $g(\lambda_1) = 0$. So we get a contradiction, and $\lambda_1 = \lambda^*$.
We can now state the conclusion: binary search on $\lambda$, evaluating $g(\lambda)$ at each step by solving the non-fractional subproblem. If $g(\lambda) > 0$ then $\lambda < \lambda^*$, so raise the lower bound; if $g(\lambda) < 0$ then $\lambda > \lambda^*$, so lower the upper bound; stop when $g(\lambda) = 0$ (or, with real-valued answers, when the interval is small enough).
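Here is a minimal, runnable sketch of this binary search. The candidate set `S` below is just a hard-coded toy list of $(a(x), b(x))$ values made up for illustration; in a real problem `eval_g` would instead solve the concrete non-fractional optimization (MST, max-flow, greedy selection, ...):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Toy candidate set: each pair is (a(x), b(x)) for one x in S, all b > 0.
vector<pair<double, double>> S = {{3, 2}, {5, 4}, {7, 3}};

// g(lambda) = min over x in S of (a(x) - lambda * b(x)).
double eval_g(double lambda) {
    double best = numeric_limits<double>::infinity();
    for (auto [a, b] : S) best = min(best, a - lambda * b);
    return best;
}

int main() {
    double lo = 0, hi = 1e9;              // assume lambda* lies in [lo, hi]
    for (int it = 0; it < 100; ++it) {    // fixed iteration count instead of an eps check
        double mid = (lo + hi) / 2;
        if (eval_g(mid) > 0) lo = mid;    // g(mid) > 0  =>  mid < lambda*
        else                 hi = mid;    // g(mid) <= 0 =>  mid >= lambda*
    }
    cout << fixed << setprecision(6) << lo << "\n";
}
```

With these toy values the three candidate ratios are $1.5$, $1.25$ and $7/3$, so the search converges to $1.25$.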
The core idea of this technique is to transform fractional programming into non-fractional programming. 0-1 fractional programming is a special case of fractional programming.
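Written out for the 0-1 case (a sketch of the standard formulation; "feasible" stands for whatever constraint the concrete problem imposes, e.g. "the chosen edges form a spanning tree"):
$$\text{minimize } \lambda = \frac{\sum_{i} a_i x_i}{\sum_{i} b_i x_i},\quad x_i \in \{0, 1\}, \qquad g(\lambda) = \min_{x \text{ feasible}} \sum_{i} (a_i - \lambda b_i)\, x_i$$
Evaluating $g(\lambda)$ is then an ordinary (non-fractional) optimization over the same feasible set.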
Two examples of 0-1 fractional programming:
the maximum density subgraph and the optimum ratio spanning tree; a sketch of the latter is given below.
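For the optimum ratio spanning tree, here is a minimal sketch of the framework, assuming we minimize $\sum a_e / \sum b_e$ over spanning trees and all $b_e > 0$; the small graph and its edge values $(u, v, a, b)$ are made up for illustration. Evaluating $g(\lambda)$ is just a minimum spanning tree computation under edge weights $a_e - \lambda \cdot b_e$ (Kruskal below):

```cpp
#include <bits/stdc++.h>
using namespace std;

struct Edge { int u, v; double a, b; };

// Small made-up graph: 4 vertices, 5 edges with (a, b) values, all b > 0.
int n = 4;
vector<Edge> edges = {
    {0, 1, 3, 1}, {1, 2, 4, 2}, {2, 3, 5, 1}, {3, 0, 2, 2}, {0, 2, 6, 3}
};

struct DSU {
    vector<int> p;
    DSU(int n) : p(n) { iota(p.begin(), p.end(), 0); }
    int find(int x) { return p[x] == x ? x : p[x] = find(p[x]); }
    bool unite(int x, int y) {
        x = find(x); y = find(y);
        if (x == y) return false;
        p[x] = y;
        return true;
    }
};

// g(lambda) = weight of a minimum spanning tree under edge weights a_e - lambda * b_e.
double eval_g(double lambda) {
    vector<int> id(edges.size());
    iota(id.begin(), id.end(), 0);
    sort(id.begin(), id.end(), [&](int i, int j) {
        return edges[i].a - lambda * edges[i].b < edges[j].a - lambda * edges[j].b;
    });
    DSU dsu(n);
    double total = 0;
    for (int i : id)
        if (dsu.unite(edges[i].u, edges[i].v))
            total += edges[i].a - lambda * edges[i].b;
    return total;
}

int main() {
    double lo = 0, hi = 1e9;               // assume the optimal ratio lies in [0, 1e9]
    for (int it = 0; it < 100; ++it) {
        double mid = (lo + hi) / 2;
        if (eval_g(mid) > 0) lo = mid;      // g decreasing: mid < lambda*
        else                 hi = mid;
    }
    cout << fixed << setprecision(6) << lo << "\n";  // minimum ratio sum(a) / sum(b)
}
```

The maximum density subgraph follows the same pattern, except that the ratio is maximized and evaluating $g(\lambda)$ requires a minimum-cut / max-flow computation rather than an MST.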