Process Synchronization in Operating System

Introduction

An operating system executes processes, which are programs under execution; running these processes to completion is what completes a program. Each process requires certain system resources, such as the CPU, memory, and the video card. Process synchronization refers to coordinating how two or more processes share these resources so that conflicts during execution are prevented and data consistency is maintained. The need for process synchronization mainly arises when more than one process is under execution at the same time.

Critical Section and Race Condition

Understanding the main motivation and need for process synchronization starts with the critical section:

CRITICAL SECTION

It is the part of the code that requires shared access to resources, i.e. the part that touches variables which can be accessed by multiple processes simultaneously. Uncoordinated access to such shared data may lead to a RACE CONDITION, a situation in which two or more processes or threads concurrently read, write, or modify the same memory, and the final result depends on the order in which they run.
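As a small illustration (a hypothetical sketch, not taken from the text above), the C program below lets two threads increment a shared counter with no synchronization. The increment of the shared variable is the critical section, and the lost updates it typically produces are the race condition:

```c
#include <pthread.h>
#include <stdio.h>

/* Shared variable accessed by both threads -- this is the critical section. */
static long counter = 0;

static void *increment(void *arg)
{
    (void)arg;
    for (int i = 0; i < 1000000; i++) {
        counter++;   /* read-modify-write is not atomic: a race condition */
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, increment, NULL);
    pthread_create(&t2, NULL, increment, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    /* Expected 2000000, but unsynchronized updates usually lose increments. */
    printf("counter = %ld\n", counter);
    return 0;
}
```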

Conditions that must be satisfied to solve the critical section problem:

Mutual Exclusion

Out of a group of processes, only one process is allowed to be in the critical section at a given point of time.

Progress

Each process is independent of the others, which means that if a particular process does not want to enter its critical section, it must not hinder other processes from entering theirs.

Bounded Waiting

After a process makes a request to enter its critical section, there is a limit on how many other processes may enter their critical sections before that request is granted. Once the limit is reached, the system must grant the waiting process permission to enter its critical section.

The following are the different solutions or algorithms used to synchronize the processes of an operating system:

1. Peterson’s solution

Peterson’s solution is one of the classic solutions to the critical section problem. It states that while one process is executing in its critical section, the other process executes the rest of its code, and vice versa. This ensures that only one process is in the critical section at a particular instant of time. When a process wants to execute its critical section, it sets its flag to true and sets turn to the index of the other process. This means that the process wants to execute, but it will allow the other process to run first. The process then performs busy waiting until the other process has finished its own critical section.
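A minimal two-process sketch of this idea in C might look like the following (the function names enter_critical_section and exit_critical_section are illustrative, not part of the algorithm itself):

```c
#include <stdatomic.h>
#include <stdbool.h>

/* Peterson's solution for two processes, numbered 0 and 1.
 * flag[i] == true  means process i wants to enter its critical section.
 * turn             says which process yields if both want to enter.
 */
static atomic_bool flag[2];
static atomic_int  turn;

void enter_critical_section(int i)     /* i is 0 or 1 */
{
    int other = 1 - i;
    atomic_store(&flag[i], true);      /* announce intent to enter */
    atomic_store(&turn, other);        /* give priority to the other process */

    /* Busy-wait while the other process wants in and it is its turn. */
    while (atomic_load(&flag[other]) && atomic_load(&turn) == other)
        ;                              /* spin */
}

void exit_critical_section(int i)
{
    atomic_store(&flag[i], false);     /* no longer interested */
}
```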

2. Locking solution

Locking is another solution to the critical section problem, in which a process acquires a lock before entering its critical section. When the process finishes executing its critical section, it releases the lock. The lock then becomes available to any other process that wants to execute its critical section. The locking mechanism likewise ensures that only one process is in the critical section at a particular instant of time.
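One way to sketch such a lock in C is a spinlock built on an atomic test-and-set flag (the names acquire and release below are illustrative, and the usage pattern is shown in the trailing comment):

```c
#include <stdatomic.h>

/* A simple spinlock: acquire() busy-waits until the lock is free,
 * release() makes it available again.
 */
static atomic_flag lock = ATOMIC_FLAG_INIT;

void acquire(void)
{
    /* test_and_set returns the previous value; loop until we saw "clear". */
    while (atomic_flag_test_and_set(&lock))
        ;   /* spin until the current holder releases the lock */
}

void release(void)
{
    atomic_flag_clear(&lock);
}

/* Usage in each process/thread:
 *     acquire();
 *     ... critical section ...
 *     release();
 */
```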

3. Semaphore solution

Semaphores are another solution to the critical section problem. A semaphore is a synchronization tool: an integer variable whose value is retrieved and updated using the wait and signal operations. Based on the value of the semaphore variable, a process is either allowed to enter its critical section or made to wait.
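A sketch using POSIX semaphores, where sem_wait plays the role of wait and sem_post plays the role of signal (the variable and function names here are illustrative):

```c
#include <semaphore.h>
#include <pthread.h>
#include <stdio.h>

/* A binary semaphore initialized to 1 guards the critical section. */
static sem_t mutex;
static int shared_counter = 0;

static void *worker(void *arg)
{
    (void)arg;
    sem_wait(&mutex);          /* wait: decrement; block if the value is 0 */
    shared_counter++;          /* critical section */
    sem_post(&mutex);          /* signal: increment, letting a waiter proceed */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    sem_init(&mutex, 0, 1);    /* 0 = shared between threads, initial value 1 */

    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("shared_counter = %d\n", shared_counter);
    sem_destroy(&mutex);
    return 0;
}
```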

FAQs

What is a Critical Section?

A Critical Section is a code segment that accesses shared variables and has to be executed as an atomic action.

What are Mutex Locks?

In this approach, a LOCK is acquired in the entry section of the code over the critical resources that are used and modified inside the critical section, and that LOCK is released in the exit section.
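With POSIX threads, this entry/exit pattern might look like the following sketch (the function and variable names are illustrative):

```c
#include <pthread.h>

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static int shared_resource = 0;

void update_shared_resource(void)
{
    pthread_mutex_lock(&lock);     /* entry section: acquire the LOCK */
    shared_resource++;             /* critical section */
    pthread_mutex_unlock(&lock);   /* exit section: release the LOCK */
}
```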

What is the Producer-Consumer problem?

A finite buffer pool is used to exchange messages between producer and consumer processes: the producer must wait when the buffer is full, and the consumer must wait when it is empty, so the two processes have to be synchronized.
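A common bounded-buffer sketch uses two counting semaphores plus a mutex, as below (buffer size, names, and the initialization shown in the trailing comment are all illustrative assumptions):

```c
#include <semaphore.h>
#include <pthread.h>

#define BUFFER_SIZE 8

/* 'empty_slots' counts free slots, 'full_slots' counts filled slots,
 * and 'mutex' protects the buffer indices themselves.
 */
static int buffer[BUFFER_SIZE];
static int in = 0, out = 0;

static sem_t empty_slots;   /* initialized to BUFFER_SIZE */
static sem_t full_slots;    /* initialized to 0 */
static pthread_mutex_t mutex = PTHREAD_MUTEX_INITIALIZER;

void produce(int item)
{
    sem_wait(&empty_slots);            /* wait for a free slot */
    pthread_mutex_lock(&mutex);
    buffer[in] = item;                 /* critical section: insert item */
    in = (in + 1) % BUFFER_SIZE;
    pthread_mutex_unlock(&mutex);
    sem_post(&full_slots);             /* signal that a slot is now filled */
}

int consume(void)
{
    sem_wait(&full_slots);             /* wait for a filled slot */
    pthread_mutex_lock(&mutex);
    int item = buffer[out];            /* critical section: remove item */
    out = (out + 1) % BUFFER_SIZE;
    pthread_mutex_unlock(&mutex);
    sem_post(&empty_slots);            /* signal that a slot is now free */
    return item;
}

/* Initialization (e.g. in main):
 *     sem_init(&empty_slots, 0, BUFFER_SIZE);
 *     sem_init(&full_slots, 0, 0);
 */
```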