
Chan


📓A Glossary for Concurrency: Doing several things at the same time

🐾Introduction

When I dove into the world of concurrency, I found a whole bunch of different terms describing different concepts. The thing is, they overlap in some ways but still have clear differences, which gave me a lot of headaches. If that's a pain for you right now, this might be the right remedy! Leave a comment if anything isn't clear. And if you think I should add other terms to this post, please let me know!

It's simpler than you think

🚣‍♂️Process

A running program on a computer. Each process has its own code, data, stack, and heap segments.
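
To make that concrete, here's a minimal Python sketch (standard library `multiprocessing`; the `bump` function and `counter` variable are made up for illustration). The child process gets its own copy of the data, so changing `counter` in the child doesn't touch the parent's copy.

```python
# Each process owns its own memory: the child bumps its copy of `counter`,
# while the parent's copy stays at 0.
from multiprocessing import Process

counter = 0

def bump():
    global counter
    counter += 1
    print("child sees counter =", counter)   # prints 1

if __name__ == "__main__":
    p = Process(target=bump)
    p.start()
    p.join()
    print("parent sees counter =", counter)  # still 0
```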

➗Thread

Imagine a process has a lot of I/O tasks and some tasks that demand a lot of computing power, while other tasks could finish in a short time. The I/O and heavy tasks would prevent the quick ones from running. Here's the idea: you divide the process into multiple threads, and each thread is responsible for a different kind of task. If the OS runs a thread on the CPU for a certain amount of time and then switches to another one when the time is up, each task gets a fairly even share of the CPU instead of one task hogging the computing resources.
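
Here's a minimal sketch with Python's `threading` module, where the I/O is just simulated with `time.sleep` (the task names are made up). Because the two I/O-bound tasks run on separate threads, neither one blocks the other.

```python
# Two I/O-bound tasks run on their own threads, so the waits overlap.
import threading
import time

def io_task(name, seconds):
    print(f"{name}: waiting on I/O...")
    time.sleep(seconds)          # stands in for a real read/write
    print(f"{name}: done")

t1 = threading.Thread(target=io_task, args=("download", 2))
t2 = threading.Thread(target=io_task, args=("log-write", 1))
t1.start(); t2.start()
t1.join(); t2.join()             # total wait is about 2 seconds, not 3
```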

Multiprogramming

Suppose there's a process that has been using the CPU and then starts dealing with I/O. While the process is waiting on the I/O, the CPU isn't assigned any work, so you could basically say it's being wasted. Instead of letting the CPU sit idle until the process finishes the I/O, the operating system can run another process on the CPU, getting the most out of the computing resources. We call this multiprogramming.
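
The OS does this across whole processes, but here's a rough application-level analogy in Python (the fake I/O and the numbers are made up): while one thread waits on "I/O", the main thread keeps doing CPU work, so the CPU isn't wasted during the wait.

```python
# While the fake I/O wait is in progress, the CPU keeps crunching numbers.
import threading
import time

def fake_io():
    time.sleep(1)                # pretend we're waiting on a disk or the network
    print("I/O finished")

io = threading.Thread(target=fake_io)
io.start()

total = sum(i * i for i in range(2_000_000))   # CPU work happens during the wait
print("CPU work done, total =", total)
io.join()
```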

Time-sharing

Time-sharing is the behavior in which an operating system runs a process for a fixed time period (a time slice), saves that process's execution context, switches to another process, and runs that one instead.
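
You can get a rough feel for this from CPython itself, which applies the same idea at the interpreter level: it lets each thread run for a few milliseconds (see `sys.getswitchinterval()`) before switching, so the output of two CPU-bound threads interleaves. This is only an analogy for the OS scheduler, not the scheduler itself.

```python
# Two CPU-bound threads take turns, so their rounds interleave in the output.
import threading

def count(name):
    for i in range(3):
        sum(range(1_000_000))    # burn some CPU so a switch can happen
        print(name, "finished round", i)

a = threading.Thread(target=count, args=("A",))
b = threading.Thread(target=count, args=("B",))
a.start(); b.start()
a.join(); b.join()
```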

Multiprocessing

Most modern computers come with several processors (cores) to compute faster; together these cores make up a single CPU. If multiple processes need to run, the operating system maps each process to a processor, executing multiple processes at literally the same time. We call this a multiprocessing system.
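
A minimal sketch with Python's `multiprocessing.Pool` (the `heavy` function is made up): the CPU-bound work is mapped onto a pool of worker processes, one per core by default, so the four calls can run at literally the same time.

```python
# Four CPU-bound jobs are spread across worker processes (one per core by default).
from multiprocessing import Pool

def heavy(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool() as pool:                      # default size is os.cpu_count()
        results = pool.map(heavy, [10**6] * 4)
    print(results)
```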

Multithreading

Remember what we said about threads? You can run threads on the processor(s) instead of whole processes. We call this multithreading.
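
For example, here's a minimal sketch using `concurrent.futures.ThreadPoolExecutor`, with a made-up `fetch` function whose I/O is simulated by `time.sleep`: the pool works through several I/O-bound jobs on threads at once instead of one after another.

```python
# A thread pool runs three simulated network requests concurrently.
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(job_id):
    time.sleep(1)                 # stands in for a network request
    return f"job {job_id} done"

with ThreadPoolExecutor(max_workers=3) as pool:
    for result in pool.map(fetch, range(3)):
        print(result)             # all three finish in about 1 second, not 3
```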

Parallel Computing

Parallel computing means using multiprocessing (or multithreading) to solve a problem that requires a lot of computation. This field is closely related to GPU programming, and GPUs play an important role these days because machines need to compute fast to run AI workloads.
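
Real GPU code needs extra libraries, but the core data-parallel idea can be sketched on the CPU: split one big computation into chunks, run the chunks in parallel, and combine the partial results. The chunking and the `partial_sum` helper below are made up for illustration.

```python
# Split a big sum of squares into chunks and compute the chunks in parallel.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 8_000_000, 4
    step = n // workers
    chunks = [(k * step, (k + 1) * step) for k in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)
```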

Concurrent Computing

Concurrent computing means multiple tasks are in progress at once. 'In progress' here means that multiple processes (or threads) each get to run on the CPU for some amount of time, whether the computation actually happens simultaneously or in a time-sharing manner.

Non-Blocking

Non-blocking means a task doesn't hold on to the CPU while it deals with (I/O) operations; instead, another task gets to run. Unlike multiprogramming, this happens at the application level.
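
A minimal sketch with Python's standard `socket` module: on a non-blocking listening socket, `accept()` returns immediately with `BlockingIOError` when no client is waiting, instead of parking the thread there until someone connects.

```python
# With setblocking(False), accept() refuses to wait, so the thread stays free.
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))     # bind to any free local port
server.listen()
server.setblocking(False)

try:
    conn, addr = server.accept()  # a blocking socket would sit here and wait
except BlockingIOError:
    print("no client yet; free to do other work and check back later")
server.close()
```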

Asynchronous

An asynchronous system is one in which the execution of the next line of code isn't delayed by the running time of the current line.
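
A minimal sketch with `asyncio`, where `asyncio.sleep` stands in for an async I/O call: `await` hands control back to the event loop, so task B starts and even finishes without waiting for task A's wait to end.

```python
# Both tasks are awaited together; the total wait is about 2 seconds, not 3.
import asyncio

async def task(name, seconds):
    await asyncio.sleep(seconds)      # stands in for an async I/O call
    print(name, "done")

async def main():
    await asyncio.gather(task("A", 2), task("B", 1))

asyncio.run(main())
```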

Conclusion

We looked into the meaning of all the terms: process, thread, multiprogramming, time-sharing, multiprocessing, multithreading, parallel computing, concurrent computing, non-blocking, and asynchronous. Now we won't get confused by them! :)
