Basic algorithms
Single instance tasks, synchronous arrivals:
J = {Ji(Ci, ai = 0, Di)}, i = 1..n
Executing the tasks in order of non-decreasing deadline (EDD) minimizes the maximum lateness:
Lmax(J) = maxi(fi − di)
Complexity:
O(n · log n)
J = {J1(1, 5), J2(2, 4), J3(1, 3), J4(2, 7)}
Determine the maximum lateness.
Lmax,EDD(J) = -1
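A possible EDD sketch in Python (the function name and the (C, D) task encoding are illustrative assumptions, not from the course material):

```python
# EDD (Earliest Due Date): synchronous single-instance tasks, given as (C, D) pairs.
def edd_max_lateness(tasks):
    # Execute in order of non-decreasing deadline, back to back from t = 0.
    t, lmax = 0, float("-inf")
    for c, d in sorted(tasks, key=lambda task: task[1]):
        t += c                    # finishing time f_i
        lmax = max(lmax, t - d)   # lateness L_i = f_i - d_i
    return lmax

# Task set from the example above.
print(edd_max_lateness([(1, 5), (2, 4), (1, 3), (2, 7)]))  # -> -1
```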
Single instance or periodic, asynchronous arrivals, preemptive:
J = {Ji(Ci, ai, Di)}, i = 1..n
Always executing the ready task with the earliest absolute deadline (EDF) minimizes the maximum lateness:
Lmax(J) = maxi(fi − di)
Complexity:
O(n · log n)
Optimal among all scheduling algorithms of this class.
J = {J1(1, 0, 5), J2(2, 1, 5), J3(1, 2, 3), J4(2, 1, 8)}
Determine the maximum lateness.
Lmax,EDF(J) = -2
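A possible preemptive EDF simulation sketch in Python (integer time steps; tasks are (C, a, D) triples with absolute deadline d = a + D, as in the example above; names are illustrative):

```python
# Preemptive EDF over integer time steps; tasks are (C, a, D) triples.
def edf_max_lateness(tasks):
    remaining = [c for c, a, d in tasks]
    finish = [None] * len(tasks)
    t = 0
    while any(r > 0 for r in remaining):
        # Ready tasks: already arrived and not yet finished.
        ready = [i for i, (c, a, d) in enumerate(tasks)
                 if a <= t and remaining[i] > 0]
        if not ready:
            t += 1                                                   # idle slot
            continue
        i = min(ready, key=lambda j: tasks[j][1] + tasks[j][2])      # earliest d = a + D
        remaining[i] -= 1
        t += 1
        if remaining[i] == 0:
            finish[i] = t
    # Lateness L_i = f_i - d_i, with d_i = a_i + D_i.
    return max(f - (a + d) for f, (c, a, d) in zip(finish, tasks))

# Task set from the example above.
print(edf_max_lateness([(1, 0, 5), (2, 1, 5), (1, 2, 3), (2, 1, 8)]))  # -> -2
```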
Single instance or periodic tasks, asynchronous arrivals, non-preemptive:
J = {Ji(Ci, ai, Di)}, i = 1..n
Based on an exhaustive search of the permutation tree, enumerating all possible execution sequences:
Complexity:
O(n!)
J = {J1(1, 0, 5), J2(2, 1, 3), J3(1, 2, 4), J4(2, 1, 7)}
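A possible brute-force sketch in Python, enumerating all n! non-preemptive execution orders and returning the best achievable maximum lateness (names are illustrative assumptions):

```python
# O(n!) exhaustive search over non-preemptive execution orders; tasks are (C, a, D).
from itertools import permutations

def exhaustive_min_lmax(tasks):
    best = float("inf")
    for order in permutations(range(len(tasks))):
        t, lmax = 0, float("-inf")
        for i in order:
            c, a, d = tasks[i]
            t = max(t, a) + c               # wait for the arrival, then run to completion
            lmax = max(lmax, t - (a + d))   # lateness w.r.t. absolute deadline a + D
        best = min(best, lmax)
    return best

# Task set from the example above.
print(exhaustive_min_lmax([(1, 0, 5), (2, 1, 3), (1, 2, 4), (2, 1, 7)]))
```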
Periodic tasks
The release/activation instants are known a priori:
Γ = {τi(Ci, φi, Ti, Di)}, i = 1..n
ai,k = φi + (k − 1) · Ti, k = 1, 2, ...
Thus, in this case the schedule can be built in one of two ways:
The tasks to execute are selected as they are released and complete, during normal system operation (online scheduling).
The task execution order is computed before the system enters normal operation and stored in a table, which is used at runtime to execute the tasks (static cyclic scheduling).
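A possible sketch in Python of building such a table offline: job releases ai,k = φi + (k − 1) · Ti are expanded over one hyperperiod and each slot is filled with the pending job having the earliest absolute deadline (the EDF-based ordering within slots and the two-task set are illustrative assumptions):

```python
# Offline construction of a static cyclic table for periodic tasks (C, phi, T, D).
from math import lcm

def build_static_table(tasks):
    H = lcm(*(T for _, _, T, _ in tasks))               # hyperperiod
    # Expand each task into its jobs: (task index, release, absolute deadline, C).
    jobs = [(i, phi + k * T, phi + k * T + D, C)
            for i, (C, phi, T, D) in enumerate(tasks)
            for k in range(H // T)]
    remaining = [c for _, _, _, c in jobs]
    table = []                                           # one entry per time slot
    for t in range(H):
        ready = [j for j, (i, a, d, c) in enumerate(jobs)
                 if a <= t and remaining[j] > 0]
        if not ready:
            table.append(None)                           # idle slot
            continue
        j = min(ready, key=lambda j: jobs[j][2])         # earliest absolute deadline
        remaining[j] -= 1
        table.append(jobs[j][0])
    return table                                         # slot -> task index (None = idle)

# Hypothetical task set: tau1(C=1, phi=0, T=4, D=4), tau2(C=2, phi=0, T=8, D=8).
print(build_static_table([(1, 0, 4, 4), (2, 0, 8, 8)]))
```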