Suppose you are given a set $S = \{a_1, a_2, \ldots, a_n\}$ of tasks, where task $a_i$ requires $p_i$ units of processing time to complete, once it has started. You have one computer on which to run these tasks, and the computer can run only one task at a time. Let $c_i$ be the completion time of task $a_i$, that is, the time at which task $a_i$ completes processing. Your goal is to minimize the average completion time, that is, to minimize $(1/n) \sum_{i=1}^n c_i$. For example, suppose there are two tasks, $a_1$ and $a_2$, with $p_1 = 3$ and $p_2 = 5$, and consider the schedule in which $a_2$ runs first, followed by $a_1$. Then $c_2 = 5$, $c_1 = 8$, and the average completion time is $(5 + 8)/2 = 6.5$. If task $a_1$ runs first, however, then $c_1 = 3$, $c_2 = 8$, and the average completion time is $(3 + 8)/2 = 5.5$.

a. Give an algorithm that schedules the tasks so as to minimize the average completion time. Each task must run non-preemptively, that is, once task $a_i$ starts, it must run continuously for $p_i$ units of time. Prove that your algorithm minimizes the average completion time, and state the running time of your algorithm.

b. Suppose now that the tasks are not all available at once. That is, each task cannot start until its release time $r_i$. Suppose also that we allow preemption, so that a task can be suspended and restarted at a later time. For example, a task $a_i$ with processing time $p_i = 6$ and release time $r_i = 1$ might start running at time $1$ and be preempted at time $4$. It might then resume at time $10$ but be preempted at time $11$, and it might finally resume at time $13$ and complete at time $15$. Task $a_i$ has run for a total of $6$ time units, but its running time has been divided into three pieces. In this scenario, $a_i$'s completion time is $15$. Give an algorithm that schedules the tasks so as to minimize the average completion time in this new scenario. Prove that your algorithm minimizes the average completion time, and state the running time of your algorithm.

a. Order the tasks by processing time from smallest to largest and run them in that order. To see that this greedy solution is optimal, first observe that the problem exhibits optimal substructure: if we run the first task of an optimal schedule, then we obtain an optimal schedule by running the remaining tasks in a way that minimizes their average completion time. Let $O$ be an optimal schedule, let $a$ be a task with the smallest processing time, and let $b$ be the first task run in $O$. Let $G$ be the schedule obtained from $O$ by swapping the positions of $a$ and $b$. The swap decreases the completion time of every task scheduled between $b$ and $a$ by $p_b - p_a \ge 0$, decreases the combined completion time of $a$ and $b$ by the same amount, and leaves every other completion time unchanged. Hence the average completion time of $G$ is at most that of $O$, so some optimal schedule begins with a task of smallest processing time; combining this with optimal substructure and induction shows that the greedy schedule is optimal. The running time is $O(n \lg n)$, because we must first sort the tasks.
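
As an illustration only (not part of the original solution), here is a minimal Python sketch of the shortest-job-first rule, checked against the two-task example from the problem statement; the function name and input format are my own.

```python
def average_completion_time(processing_times):
    """Run tasks in order of increasing processing time (shortest job
    first) and return the resulting average completion time."""
    time = 0    # current time, i.e., prefix sum of processing times so far
    total = 0   # running sum of completion times
    for p in sorted(processing_times):   # the O(n lg n) sort dominates
        time += p       # this task completes at the current prefix sum
        total += time
    return total / len(processing_times)

# Two-task example from the problem statement: p_1 = 3, p_2 = 5.
# Running the shorter task first gives average (3 + 8) / 2 = 5.5.
print(average_completion_time([3, 5]))   # 5.5
```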

b. Without loss of generality we may assume that every task is a unit-time task: since preemption is allowed, we can break a task with processing time $p_i$ into $p_i$ unit-time pieces, of which only the last piece's completion time matters. Apply the same strategy as in part (a), except that at each unit of time we run the task with the least remaining processing time among those already released; if the task we would most like to run has not been released yet, we must skip over it. Optimality follows from the same exchange argument as in part (a), applied at each time step to the pieces of the released tasks. Since there could be many tasks of short processing time with late release times, the running time becomes $O(n^2)$: we might have to spend $O(n)$ time deciding which task to run next at each step.
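
The following Python sketch simulates this preemptive strategy one unit of time at a time, assuming integer release and processing times. The heap-based task selection and the example task set are my own additions; the $O(n^2)$ bound above refers to the simpler linear scan for the next task.

```python
import heapq

def preemptive_average_completion_time(tasks):
    """Simulate unit time steps, always running the released, unfinished
    task with the least remaining processing time.

    `tasks` is a list of (release_time, processing_time) pairs with
    integer entries (a placeholder input format for this sketch)."""
    events = sorted(tasks)            # tasks ordered by release time
    ready = []                        # min-heap of (remaining time, task index)
    n = len(tasks)
    i = time = total = done = 0
    while done < n:
        # Move every task whose release time has arrived onto the heap.
        while i < n and events[i][0] <= time:
            heapq.heappush(ready, (events[i][1], i))
            i += 1
        if not ready:                 # nothing released yet: idle until the next release
            time = events[i][0]
            continue
        remaining, j = heapq.heappop(ready)
        time += 1                     # run the chosen task for one unit of time
        if remaining == 1:
            total += time             # task j finishes at the current time
            done += 1
        else:
            heapq.heappush(ready, (remaining - 1, j))
    return total / n

# The task from the problem statement (r = 1, p = 6) plus two hypothetical
# short tasks that arrive later and preempt it.
print(preemptive_average_completion_time([(1, 6), (3, 1), (4, 1)]))   # 6.0
```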