
1) The UNIX operating system uses round-robin time slicing with multilevel feedback.

Assume there are 10 processes waiting in a queue, which is implemented as a linked list of PCBs (process control blocks). Assume each PCB holds information about the process ID, the CPU burst time required, and the amount of memory being used.

Assume the time slice is 2 units. Simulate round-robin time slicing until all the jobs complete and find the average waiting time. Then modify your program to include random arrival of jobs, each with a fixed required burst time, and find the average waiting time of the jobs completed over a simulation time of 100 units.
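Just to make the question concrete, this is the kind of thing I mean for the first part. It is only a rough sketch: the burst times and memory figures are numbers I made up, and the struct and helper names (pcb, enqueue, dequeue, QUANTUM) are just my own placeholders, not anything given in the assignment.

    /* rough sketch of the first part: 10 jobs, round robin, time slice of 2 */
    #include <stdio.h>
    #include <stdlib.h>

    #define NPROC   10
    #define QUANTUM 2

    /* process control block with the fields from the assignment, plus a link */
    struct pcb {
        int pid;            /* process ID */
        int burst;          /* CPU burst time still required */
        int memory;         /* amount of memory being used (not needed by the scheduler) */
        int waited;         /* accumulated waiting time */
        struct pcb *next;   /* next PCB in the ready queue */
    };

    /* append a PCB at the tail of the ready queue */
    static void enqueue(struct pcb **head, struct pcb **tail, struct pcb *p)
    {
        p->next = NULL;
        if (*tail)
            (*tail)->next = p;
        else
            *head = p;
        *tail = p;
    }

    /* remove and return the PCB at the head of the ready queue */
    static struct pcb *dequeue(struct pcb **head, struct pcb **tail)
    {
        struct pcb *p = *head;
        if (p != NULL) {
            *head = p->next;
            if (*head == NULL)
                *tail = NULL;
        }
        return p;
    }

    int main(void)
    {
        struct pcb *head = NULL, *tail = NULL;
        int clock = 0, finished = 0;
        long total_wait = 0;

        /* build the ready queue; the burst and memory values are made up */
        for (int i = 0; i < NPROC; i++) {
            struct pcb *p = malloc(sizeof *p);
            p->pid = i + 1;
            p->burst = 3 + i % 5;       /* placeholder bursts of 3..7 units */
            p->memory = 64 * (i + 1);   /* placeholder memory figure */
            p->waited = 0;
            enqueue(&head, &tail, p);
        }

        /* round robin: run the head job for at most QUANTUM units, then rotate */
        while (head != NULL) {
            struct pcb *p = dequeue(&head, &tail);
            int slice = (p->burst < QUANTUM) ? p->burst : QUANTUM;

            /* every job still in the queue waits while this slice runs */
            for (struct pcb *q = head; q != NULL; q = q->next)
                q->waited += slice;

            clock += slice;
            p->burst -= slice;

            if (p->burst > 0) {
                enqueue(&head, &tail, p);   /* not done: back of the queue */
            } else {
                printf("pid %d finished at time %d after waiting %d units\n",
                       p->pid, clock, p->waited);
                total_wait += p->waited;
                finished++;
                free(p);
            }
        }

        printf("average waiting time = %.2f units\n",
               (double)total_wait / finished);
        return 0;
    }

For the second part, I guess the same loop would instead advance the clock one unit at a time, randomly decide at each tick whether a new PCB with the fixed burst arrives and gets enqueued, stop generating arrivals after 100 time units, and then average the waiting times of only the jobs that actually finished, but that is the part I am stuck on.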

I really need your help to do this in C or C++ for my assignment. The due date is on Monday. Is there any example for me to refer to? I desperately need the solution. Anyone, just help me please. I don't want to fail the subject.

;)

Reply by Dave Sinkula:

It seems like your highlighted text would make for some good Googling. STFW first. Then follow up here with questions about the specific problems you run into, in whichever language you choose.
