XJTU System Optimization and Scheduling Course Assignments
Assignments for the graduate course System Optimization and Scheduling (系统优化与调度) at Xi'an Jiaotong University.
Assignment 1
(1) 1.1.2
(a)
so there are three stationary points: (2, 0), (-2, 0), and (0, 0)
the Hessian matrix is positive semi-definite only when
so the points (2, 0) and (-2, 0) are local minimum points, while (0, 0) is not a local minimum.
let
(b)
if
if
so the stationary points of
for points
function
(c)
(1)-(2):
if
using the double-angle formula:
solution:
if
solution:
The stationary points are
for point
for point
for point
(d)
Thus the stationary point is
(e)
By the conclusion of (d), the stationary point is neither a local maximum nor a local minimum, so the minimum can only occur on the boundary.
Consider
In this case, the minimum point is
Consider
In this case, the minimum points are
Summarizing, with the constraint
(2) 1.1.6
suppose
and
(b)
there is only one stationary point in this problem, so the solution is unique
(c)
let the plane that the points lie on be the reference level
suppose the length of string in weight
(3) 1.2.1
(a)
the start point is
at first,
then
so the first stepsize is 0.0625; after the iteration, the point changes to
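The specific function and iterates are not reproduced above, so the following is only a rough sketch of how a stepsize such as 0.0625 (which is what four halvings of an initial trial step of 1 would give, since (1/2)^4 = 0.0625) can arise from an Armijo-type successive-halving rule. The quadratic objective below is hypothetical, not the assignment's function.

```python
import numpy as np

def armijo_gradient_step(f, grad, x, s=1.0, beta=0.5, sigma=1e-1):
    """One steepest-descent step with an Armijo (successive-halving) stepsize.

    Hypothetical helper, not the textbook's exact rule: start from a trial
    step s and halve it until the sufficient-decrease condition holds.
    """
    g = grad(x)
    alpha = s
    while f(x - alpha * g) > f(x) - sigma * alpha * g.dot(g):
        alpha *= beta
    return alpha, x - alpha * g

# Example with a hypothetical ill-conditioned quadratic (not the assignment's function).
f = lambda x: 0.5 * (100 * x[0] ** 2 + x[1] ** 2)
grad = lambda x: np.array([100 * x[0], x[1]])
alpha, x_new = armijo_gradient_step(f, grad, np.array([1.0, 1.0]))
print(alpha, x_new)
```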
(b)
when
then
(c)
at point
at first,
Compared with (a), the computational cost of Newton's method is lower
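As a hedged illustration of this comparison (again on a hypothetical quadratic, not the assignment's function): a single Newton step x - H^{-1} grad f(x) lands exactly on the stationary point of a quadratic, whereas a halving-stepsize gradient method typically needs many iterations.

```python
import numpy as np

# Hypothetical quadratic f(x) = 0.5 x^T Q x - b^T x (not the assignment's function).
Q = np.array([[100.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])

grad = lambda x: Q @ x - b
hess = lambda x: Q

x = np.array([1.0, 1.0])
# One Newton step: x_new = x - H^{-1} g; for a quadratic this is the exact minimizer.
x_new = x - np.linalg.solve(hess(x), grad(x))
print(x_new, np.allclose(grad(x_new), 0.0))  # the gradient vanishes after one step
```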
(4) 1.2.6
(a)
PROOF:
PROOF:
then:
(5) 1.3.1
Let
to m and M, respectively, and let
rule (i.e., a stepsize
(6) 1.3.4
then:
(7) 1.4.1
In Newton's Method, we have:
(8)
PROOF:
Since
(9)
(a)
the solution of the equation set above is
it is the point satisfying the first-order necessary conditions
(b)
it is easy to see that the Hessian matrix is always positive definite, so the point
(c)
for the steepest descent method,
(10)
(a)
the solution of the equation set above is
so point
(b)
Because the Hessian matrix is always positive definite, the function
(c)
the minimum points must lie on the boundary of the constraint.
consider
consider
therefore, the minimum point of f subject to
(d)
for the steepest descent method,
Assignment 2
(1)
Q: Consider the quadratic program
Prove that a local minimum point is also a global minimum point, and conversely. (No convexity is assumed.)
Proof:
Since no convexity is assumed, we consider two cases:
If Q is a positive definite matrix, it is obvious that a local minimum point is also a global minimum point (and conversely). Now suppose Q is not a positive definite matrix.
Assume
so
and because
so
so we have
then according to the optimality condition, we have
then
and because
(2)
Suppose that the projected negative gradient d is calculated satisfying
and that some component of the associated multiplier vector, corresponding to an inequality, is negative. Show that if that inequality is dropped, the projection of the negative gradient onto the remaining constraints is a feasible direction of descent.
Proof:
Let the new direction vector
Let
We know that
Multiplying the equation
So the vector
(3) 3.1.9
(a)
consider the Lagrangian Relaxation:
(b)
consider all lines parallel to the line that passes through
intersect the circle.
the maximal area is attained only if the point
and we can see that the line through
so this line must pass through the center of the circle.
(c)
Fix two vertices and the corresponding side of the triangle, and optimize over the remaining vertex. By part (b), the line connecting the optimal vertex and the center of the circle should be orthogonal to the fixed side. Hence, the optimum is attained when the two remaining sides are equal. Repeating this argument, fixing each side of the triangle in turn, we see that all sides should be equal.
(d)
A similar line of argument applies as in (c).
(4) 3.3.5
(a)
Let
And we have:
Because
So,
(b)
If x* is a constrained local minimum, from (a) we can get
we have the desired result.
(c)
We want to show that
By the Mean Value Theorem, for each
Because
From each of 1-4, we can also prove that
So we can get that
(d)
It is easy to see that the local minimum is
So condition 2 is violated. Similarly, condition 3 is violated. The vectors
Assume
(e)
From the description we know that
The modified problem has a single constraint
Since
Because
(5) 5.1.7
(a)
Then the total weight and value of the objects included are $\sum_{i=1}^n w_i x_i$ and $\sum_{i=1}^n v_i x_i$, respectively, and the problem is
$$\max_{x}\ \sum_{i=1}^n v_i x_i \quad \text{s.t.}\quad \sum_{i=1}^n w_i x_i \le A,\quad x_i \in \{0,1\},\ i = 1,\dots,n. $$
(b)
Let
And
If
(c)
Consider a relaxed version of the problem of minimizing
$$-\sum_{i=1}^n v_i x_i \quad \text{s.t.}\quad \sum_{i=1}^n w_i x_i \le A,\quad x_i \in [0,1],\ i = 1,\dots,n. $$ Let
and be the optimal values of the relaxed problem and its dual, respectively.
In the relaxed problem, the cost function is convex over
Thus, according to Prop. 5.2.1, there is no duality gap, and
Again,
Following Example 5.1.2, it can be seen that the optimal primal and
dual solution pair
$$\mu^* w_i \ge v_i, \quad \text{if } x_i^* = 0,$$
$$\mu^* w_i \le v_i, \quad \text{if } x_i^* = 1. $$
In fact, it is straightforward to show that there is an optimal
solution of the relaxed problem such that at most one
(d)
Since the object weights and values remain the same, we have from (c) that
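To make the structure of the relaxed problem concrete, here is a small sketch with made-up weights, values, and capacity (the exercise itself is stated for general data): the relaxed knapsack is solved by filling objects in decreasing value-to-weight ratio, which also illustrates the claim above that an optimal relaxed solution has at most one fractional component.

```python
# Hedged sketch with hypothetical data: solve the relaxed (continuous) knapsack
#   max sum(v_i x_i)  s.t.  sum(w_i x_i) <= A,  0 <= x_i <= 1,
# by filling objects in decreasing v_i / w_i order; at most one x_i ends up fractional.
v = [6.0, 10.0, 12.0]        # hypothetical values
w = [1.0, 2.0, 3.0]          # hypothetical weights
A = 4.0                      # hypothetical capacity

order = sorted(range(len(v)), key=lambda i: v[i] / w[i], reverse=True)
x = [0.0] * len(v)
cap = A
for i in order:
    take = min(1.0, cap / w[i])      # take the object fully if it fits, else fractionally
    x[i] = take
    cap -= take * w[i]
    if cap <= 0:
        break

print("relaxed solution:", x)
print("relaxed optimal value:", sum(vi * xi for vi, xi in zip(v, x)))
print("fractional components:", sum(0 < xi < 1 for xi in x))   # at most 1
```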
Assignment 3
5.5.2
Using Lagrangian relaxation:
the Lagrangian function is
Using constraint relaxation:
the relaxed problem is:
- when , the problem is: when , the value of the objective function is minimized, so the solution is , which is also an integer solution; hence this point is the optimal point of the primal problem, and the minimum value is 10.
so the lower bound from the Lagrangian relaxation is less than that from the constraint relaxation.
5.5.10
so we have , and we have by the weak duality theorem, so
if
so we can get
and we can get
6.3.1
Let
Let
Since
Let
6.3.3
Suppose that
As seen in 6.3.1, we have for all dual optimal solutions
Assignment 4
2.1
(a)

(b)
We could use dynamic programming to solve this question; the computational process is as follows:

and the makespan of this project is 39 weeks
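The project network itself is not reproduced here, so the following is only a sketch with a made-up precedence graph: the dynamic-programming pass computes each job's earliest finish time in topological order, and the makespan is the largest of these (i.e., the length of the longest path through the precedence graph).

```python
# Hedged sketch with a made-up job network (durations in weeks), not the assignment's data:
# the makespan is the longest path through the precedence DAG, computed by a forward
# dynamic-programming pass in topological order.
durations = {"A": 3, "B": 5, "C": 2, "D": 4}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}   # D waits for both B and C

earliest_finish = {}
for job in ("A", "B", "C", "D"):                 # already a topological order
    start = max((earliest_finish[p] for p in preds[job]), default=0)
    earliest_finish[job] = start + durations[job]

print("makespan:", max(earliest_finish.values()))   # 3 + 5 + 4 = 12 for this toy data
```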
(c)

This is the critical (most time-consuming) path of the project; shortening each of these jobs could save a week of project time.
Job 7 (Walls) should be shortened the most, since it takes 10 weeks and all the following work must wait for it to finish.
2.4
(a)
Processor sharing is a service discipline in which a number of jobs are processed simultaneously, so processing can be interrupted at any time. For a CPU, the processing already done on a preempted job is not lost, so processor sharing on a CPU is modeled in the preemptive-resume form.
(b)
Assume that switching jobs takes no time, and there are
We would like the CPU to process the jobs with high weights as quickly as possible, so we get the following optimization problem:
2.9
| jobs | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| machine 1 | 8 | 6 | 4 | 12 |
| machine 2 | 4 | 9 | 10 | 6 |
Consider the instance of O2 || Cmax with 4 jobs. The processing times of the four jobs on the two machines are above. Find all optimal schedules and compute the makespan under an optimal schedule.
We can easily find that the sum of the processing times of jobs 1 and 4 on machine 1 (8 + 12 = 20) is close to the sum of the processing times of jobs 2 and 3 on machine 2 (9 + 10 = 19), and the sum of the processing times of jobs 1 and 4 on machine 2 (4 + 6 = 10) is the same as the sum of the processing times of jobs 2 and 3 on machine 1 (6 + 4 = 10). So all the optimal schedules are as follows:
machine 1 | machine 2 |
---|---|
1->4->2->3 | 2->3->1->4 |
1->4->2->3 | 3->2->1->4 |
4->1->2->3 | 2->3->4->1 |
4->1->2->3 | 3->2->4->1 |
(2->3 or 3->2)->(1->4 or 4->1) | (1->4 or 4->1)->(2->3 or 3->2) |
and the minimum makespan is 30, which equals the total processing time on machine 1 (8 + 6 + 4 + 12 = 30); this machine load is a lower bound on the makespan, and the schedules above attain it.
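The claimed makespan can be checked by simulating one of the listed schedules with the table data: each operation starts as soon as both its machine and its job are free. A minimal sketch (job numbering and sequences as listed above):

```python
# Minimal sketch (data from the table above): verify the makespan of one claimed
# optimal O2 || Cmax schedule by simulating it without inserted idle time.
p = {1: (8, 4), 2: (6, 9), 3: (4, 10), 4: (12, 6)}   # job: (time on M1, time on M2)
seq = {0: [1, 4, 2, 3], 1: [2, 3, 1, 4]}             # processing order on M1, M2

machine_free = [0.0, 0.0]                # time each machine becomes idle
job_free = {j: 0.0 for j in p}           # time each job finishes its current operation
pos = {0: 0, 1: 0}                       # next position in each machine's sequence

done = 0
while done < sum(len(s) for s in seq.values()):
    # pick the machine/job pair that can start earliest
    best = None
    for m in (0, 1):
        if pos[m] < len(seq[m]):
            j = seq[m][pos[m]]
            start = max(machine_free[m], job_free[j])
            if best is None or start < best[0]:
                best = (start, m, j)
    start, m, j = best
    finish = start + p[j][m]
    machine_free[m] = job_free[j] = finish
    pos[m] += 1
    done += 1

print("makespan:", max(machine_free))    # expected: 30
```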
(a)
Find the optimal sequence for the following
| Jobs | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|
| p_i | 6 | 18 | 12 | 10 | 10 | 17 | 16 |
| w_i | 1 | 5 | 2 | 4 | 1 | 4 | 2 |
| d_i | 8 | 42 | 44 | 24 | 90 | 85 | 68 |
For this problem,
so we first choose job 1 and job 4; the sequence is now 1->4, and the problem changes to:
| jobs | 2 | 3 | 5 | 6 | 7 |
|---|---|---|---|---|---|
| p_i | 18 | 12 | 10 | 17 | 16 |
| w_i | 5 | 2 | 1 | 4 | 2 |
| d_i | 26 | 28 | 74 | 69 | 52 |
then we choose job 2 and job 3, and we can see that
| jobs | 5 | 6 | 7 |
|---|---|---|---|
| p_i | 10 | 17 | 16 |
| w_i | 1 | 4 | 2 |
| d_i | 44 | 39 | 22 |
then we can get the final sequence: 1->4->2->3->7->6->5.
and the minimum of the objective function
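The objective value is not written out above; assuming the objective is total weighted tardiness $\sum_j w_j T_j$ (the data give $p_i$, $w_i$, $d_i$), a quick computation for the sequence 1->4->2->3->7->6->5 gives:

```python
# Hedged check, assuming the objective is total weighted tardiness sum(w_j * T_j).
p = {1: 6, 2: 18, 3: 12, 4: 10, 5: 10, 6: 17, 7: 16}
w = {1: 1, 2: 5, 3: 2, 4: 4, 5: 1, 6: 4, 7: 2}
d = {1: 8, 2: 42, 3: 44, 4: 24, 5: 90, 6: 85, 7: 68}

seq = [1, 4, 2, 3, 7, 6, 5]
t, total = 0, 0
for j in seq:
    t += p[j]                         # completion time of job j
    total += w[j] * max(0, t - d[j])  # weighted tardiness contribution
print(total)   # only job 3 is tardy (C = 46 > d = 44), giving 2 * 2 = 4
```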
(b)
For the above problem, what is the optimal schedule if there are two identical machines?
the schedule changes to:
machine 1: 1->2->7->5
machine 2: 4->3->6
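Under the same assumed objective (total weighted tardiness), a quick check that the two-machine assignment above leaves no job tardy:

```python
# Hedged check of the two-machine schedule, again assuming total weighted tardiness.
p = {1: 6, 2: 18, 3: 12, 4: 10, 5: 10, 6: 17, 7: 16}
d = {1: 8, 2: 42, 3: 44, 4: 24, 5: 90, 6: 85, 7: 68}

for seq in ([1, 2, 7, 5], [4, 3, 6]):    # machine 1, machine 2
    t = 0
    for j in seq:
        t += p[j]
        assert t <= d[j], f"job {j} is tardy"
print("no tardy jobs, objective value 0 under the assumed criterion")
```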
(c)
In applying Lagrangian relaxation to parallel machine scheduling problems, what will happen when ALL the parts are identical, i.e., have the same processing time and the same due date? Examine the problem from both the primal aspect and the dual aspect. What can be done to improve algorithm performance?
When all the parts are identical, the problem can be separated into identical sub-problems, in both the primal and the dual aspects. We can also treat all the parts as a whole (and likewise the machines) and simplify the parallel-machine problem to a single-machine problem, which improves algorithm performance.