1) The UNIX operating system uses round-robin time slicing with multilevel feedback.

Assume that there are 10 processes waiting in a queue, which is implemented as a linked list of PCBs (process control blocks). Assume each PCB holds information about the process ID, the CPU burst time required, and the amount of memory being used.

Assume the time slice is 2 units. Simulate round-robin time slicing until all the jobs complete and find the average waiting time. Then modify your program to include random arrival of jobs with a fixed required burst time, and find the average waiting time of the jobs completed over a simulation time of 100 units.
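For reference, here is a minimal C++ sketch of the first part, assuming all ten processes are ready at time 0; the burst times and memory sizes are made up, and waiting time is taken as completion time minus burst time. It is a starting point under those assumptions, not a complete solution to the assignment.

    #include <cstdio>

    struct PCB {
        int  pid;        // process ID
        int  burstLeft;  // CPU burst time still required
        int  memory;     // memory in use (not used by the scheduler itself)
        PCB* next;       // next PCB in the ready queue (linked list)
    };

    int main() {
        const int NUM_PROCS  = 10;
        const int TIME_SLICE = 2;

        PCB nodes[NUM_PROCS];
        int originalBurst[NUM_PROCS + 1];   // original burst per pid, for the waiting-time formula
        for (int i = 0; i < NUM_PROCS; ++i) {
            nodes[i].pid       = i + 1;
            nodes[i].burstLeft = 4 + (i % 3) * 2;   // assumed bursts: 4, 6 or 8 units
            nodes[i].memory    = 64 * (i + 1);      // assumed memory footprint (KB)
            nodes[i].next      = (i + 1 < NUM_PROCS) ? &nodes[i + 1] : nullptr;
            originalBurst[nodes[i].pid] = nodes[i].burstLeft;
        }
        PCB* head = &nodes[0];
        PCB* tail = &nodes[NUM_PROCS - 1];

        int  now = 0;
        long totalWaiting = 0;

        // Round robin: run the head of the queue for up to TIME_SLICE units,
        // then either retire it or move it to the back of the queue.
        while (head != nullptr) {
            PCB* p = head;
            head = head->next;
            if (head == nullptr) tail = nullptr;

            int run = (p->burstLeft < TIME_SLICE) ? p->burstLeft : TIME_SLICE;
            now          += run;
            p->burstLeft -= run;

            if (p->burstLeft == 0) {
                // All processes arrive at t = 0, so waiting time = completion - burst.
                totalWaiting += now - originalBurst[p->pid];
                std::printf("Process %d finished at t=%d\n", p->pid, now);
            } else {
                p->next = nullptr;                         // re-append at the tail
                if (tail != nullptr) tail->next = p; else head = p;
                tail = p;
            }
        }

        std::printf("Average waiting time: %.2f units\n",
                    (double)totalWaiting / NUM_PROCS);
        return 0;
    }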

I really need your help to do this in C or C++ for my assignment. The due date is Monday. Is there any example I can refer to? I desperately need the solution. Can anyone please help me? I don't want to fail the subject.

;)

It seems like your highlighted text would make for some good Googling. STFW first, then follow up here with specific questions about whatever problems you run into in whichever language you choose.
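As a rough sketch of the second part (random arrivals over a 100-unit simulation), the fragment below might help once you have the basic version working. The 30% per-step arrival chance and the 4-unit fixed burst are assumptions, not part of the assignment, and a std::queue stands in for the linked list of PCBs.

    #include <cstdio>
    #include <cstdlib>
    #include <ctime>
    #include <queue>

    struct Job {
        int pid;
        int arrival;    // time unit at which the job joined the ready queue
        int burstLeft;  // CPU time still required
    };

    int main() {
        const int SIM_TIME    = 100;  // simulation window from the assignment
        const int TIME_SLICE  = 2;    // quantum from the assignment
        const int FIXED_BURST = 4;    // assumed fixed burst per job
        std::srand(static_cast<unsigned>(std::time(nullptr)));

        std::queue<Job> ready;        // stands in for the linked list of PCBs
        int  nextPid      = 1;
        int  completed    = 0;
        long totalWaiting = 0;

        for (int now = 0; now < SIM_TIME; ) {
            // Assumed arrival model: at each scheduling step there is a 30% chance
            // that one new job joins the ready queue.
            if (std::rand() % 100 < 30)
                ready.push({nextPid++, now, FIXED_BURST});

            if (ready.empty()) {      // CPU idle: let one time unit pass
                ++now;
                continue;
            }

            Job j = ready.front();
            ready.pop();
            int run = (j.burstLeft < TIME_SLICE) ? j.burstLeft : TIME_SLICE;
            now         += run;
            j.burstLeft -= run;

            if (j.burstLeft == 0) {
                // Waiting time = turnaround time - burst time, measured from arrival.
                totalWaiting += (now - j.arrival) - FIXED_BURST;
                ++completed;
            } else {
                ready.push(j);        // not finished yet: back of the queue
            }
        }

        if (completed > 0)
            std::printf("Jobs completed: %d, average waiting time: %.2f units\n",
                        completed, (double)totalWaiting / completed);
        else
            std::printf("No jobs completed within %d time units\n", SIM_TIME);
        return 0;
    }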
