1) The UNIX operating system uses round-robin time slicing with multilevel feedback.

Assume that there are 10 processes waiting in a queue, which is implemented as a linked list of PCBs (process control blocks). Assume each PCB holds the process ID, the CPU burst time required, and the amount of memory being used.

Assume the time slice is 2 units. Simulate round-robin time slicing until all the jobs complete and find the average waiting time. Then modify your program to include random arrival of jobs, each with a fixed required burst time, and find the average waiting time of the jobs completed over a simulation time of 100 units.

Can any expert help explain what this question means? I need to do this in C or C++ for my assignment. Is there any example I can refer to?

I know round robin, but implementing a linked list of PCBs carrying the other information ("Process ID, CPU burst time required, amount of memory being used") with a time slice of 2 units confuses me. Please help me, I am really desperate to get a solution. I am scared I will fail this subject. Thank you very much.

Round robin (RR) - Are you taking an OS class? This is not hard, just think about it a little. You've got 10 processes, and you want to slice the time into 2-unit quanta and switch between the processes so that all 10 appear to execute concurrently. Run Proc1 for 2 units, then move it to the end of the queue or list; then run Proc2 for 2 units and move it to the back of the queue, and so on...
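Here's a minimal C++ sketch of that idea, not a full solution: a circular linked list of PCBs and a round-robin loop with a 2-unit slice, with all 10 jobs arriving at time 0. The struct layout, field names, and the burst times in `bursts[]` are all made up for illustration; your assignment may require different fields.

#include <cstdio>

// PCB node in a singly linked circular queue; field names are assumptions.
struct PCB {
    int pid;       // process ID
    int burst;     // original CPU burst time required
    int remaining; // CPU time still needed
    int memory;    // memory in use (carried along, not used by the scheduler)
    PCB *next;
};

int main() {
    const int SLICE = 2;                              // 2-unit time slice
    const int N = 10;
    int bursts[N] = {5, 3, 8, 2, 7, 4, 6, 1, 9, 5};   // made-up burst times

    // Build a circular list of 10 PCBs, all arriving at time 0.
    PCB *head = nullptr, *tail = nullptr;
    for (int i = 0; i < N; ++i) {
        PCB *p = new PCB{i + 1, bursts[i], bursts[i], 64, nullptr};
        if (!head) head = p; else tail->next = p;
        tail = p;
    }
    tail->next = head;                                // close the ring

    int clock = 0, alive = N;
    long totalWait = 0;
    PCB *prev = tail, *cur = head;

    while (alive > 0) {
        // Run the current process for a slice, or less if it needs less.
        int run = cur->remaining < SLICE ? cur->remaining : SLICE;
        clock += run;
        cur->remaining -= run;

        if (cur->remaining == 0) {
            // With arrival time 0: waiting = finish time - burst time.
            totalWait += clock - cur->burst;
            prev->next = cur->next;                   // unlink finished PCB
            PCB *done = cur;
            cur = cur->next;
            delete done;
            --alive;
        } else {
            prev = cur;                               // rotate to the next PCB
            cur = cur->next;
        }
    }

    printf("average waiting time = %.2f units\n", (double)totalWait / N);
    return 0;
}

For the second part of the question, instead of building all 10 PCBs up front, run the loop until the clock reaches 100 and, at each tick or each slice boundary, append a new PCB to the tail with some probability (rand() would do), all with the same fixed burst. You'd also store an arrival time in each PCB so that waiting = finish - arrival - burst, and average only over the jobs that actually complete within the 100 units.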
