I think the book gives a sufficient illustration of the difference between the two on pages 6 and 7.
Concurrency is a very broad category. For example, as I am typing this reply, you are probably reading other threads. These are concurrent activities. Concurrency can arise from the mere existence of the universe: everything is running concurrently at the same time. But that cannot be construed as parallelism (i.e., done in parallel with a purpose).
Similarly, even in computers, a lot of things can happen concurrently -- e.g. a file being downloaded from the internet, the user filling in an e-form, a music player playing music. This is due to multitasking. However, one cannot construe these as parallel programs. They are just programs running at the same time (without even knowing that they are running concurrently).
However, when you write a multi-threaded application, the software explicitly knows that it is running concurrently. It is concurrent with a purpose -- concurrent by its DNA. These applications are called parallel programs. They are a subset of concurrency.
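To make the distinction concrete, here is a minimal sketch (my own illustration, not from the book) of a program that is explicitly concurrent: it creates two threads of its own, so it "knows" it is running concurrently.

```python
import threading

# Minimal sketch: this program is concurrent by design -- it explicitly
# creates two threads that may run at the same time.
results = []
lock = threading.Lock()

def worker(name):
    # The lock keeps the shared list safe regardless of thread order.
    with lock:
        results.append(name)

a = threading.Thread(target=worker, args=("a",))
b = threading.Thread(target=worker, args=("b",))
a.start()
b.start()
a.join()
b.join()
print(sorted(results))  # ['a', 'b'] -- order of arrival may vary
```

Whether those two threads actually execute in parallel depends on the hardware; the program's concurrency is the same either way.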
The book is talking at a very high conceptual level, probably just about a very subtle difference...
Lee, you might want to correct/comment on this... Thanks,
Ah yes. This was an interesting topic of debate at the time, even among the authors. It is an important distinction, though.
In a concurrent program, multiple tasks are live at the same time when you look at their start-to-end ranges. However, there is no requirement that those tasks are actually running together at any given time step. So on a single-core CPU, in any modern operating system, you have multiple threads. Those threads are concurrent, but you have no task-level parallelism on the CPU: the threads are interleaved via time slicing. You may still have instruction-level parallelism or fine-grained data parallelism going on, whereby multiple instructions execute on each clock cycle on multiple different execution units or across vector units.

When you add multiple cores to the system and can run more than one thread at once, you also get parallelism from the point of view of the set of concurrent tasks in the system. So in that case you might have 100-way concurrency, and yet only 2-way parallelism.
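The "100-way concurrency, 2-way parallelism" scenario can be modelled in a short sketch (an illustration of mine, not from the book or the posts above): 100 tasks are all live at once, while a semaphore stands in for a 2-core machine and caps how many can execute at the same instant.

```python
import threading

# Hypothetical model: 100 tasks are live together (100-way concurrency),
# but a semaphore of width 2 plays the role of a 2-core CPU, so at most
# 2 tasks are ever "executing" at the same instant (2-way parallelism).
PARALLELISM = 2
gate = threading.Semaphore(PARALLELISM)
lock = threading.Lock()
running = 0
peak = 0

def task():
    global running, peak
    with gate:  # at most PARALLELISM tasks get past this point at once
        with lock:
            running += 1
            peak = max(peak, running)
        # ... the task's actual work would happen here ...
        with lock:
            running -= 1

threads = [threading.Thread(target=task) for _ in range(100)]
for t in threads:
    t.start()  # all 100 tasks are now live: that is the concurrency
for t in threads:
    t.join()

print(peak)  # never exceeds PARALLELISM
```

The 100 threads all overlap in their start-to-end ranges, yet the semaphore guarantees the observed "running at once" count never exceeds 2.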
A quick google on this subject led me to this post over at stackoverflow: http://stackoverflow.com/questions/1050222/concurrency-vs-parallelism-what-is-the-difference
Or here is a fascinating insight in terms of gophers: http://concur.rspace.googlecode.com/hg/talk/concur.html#slide-29
I feel the confusion must have arisen from the statement:
"Parallel programs must be concurrent, but concurrent programs need not be parallel"
Parallelism and concurrency -- by your definition here in this forum -- are related to the run-time that executes the program.
So a "concurrent program" and a "parallel program" are the same thing -- at least to a developer.
Only the execution platform determines whether the program realizes parallelism or not.
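One way to see this point (a sketch of my own, not from the book): with Python's concurrent.futures, the task code stays identical while the choice of executor -- the execution platform -- decides whether the calls can truly run in parallel (processes) or are merely concurrent (threads, which in CPython serialize CPU-bound work on the interpreter lock).

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

# The task and the driver are identical either way; only the executor
# class passed in changes. Swap in ProcessPoolExecutor (from a module
# with a __main__ guard) and the same program gains true parallelism
# for CPU-bound work; with ThreadPoolExecutor it is concurrent but,
# in CPython, serialized by the interpreter lock.
def square(n):
    return n * n

def run(executor_cls):
    with executor_cls(max_workers=4) as ex:
        return list(ex.map(square, range(8)))

print(run(ThreadPoolExecutor))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The program expresses its concurrency once; whether that concurrency is realized as parallelism is the platform's decision.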
Yes, that's largely true. SIMD parallelism, for example, is fairly explicit. Concurrency can also refer to patterns that require tasks to work together even without automatic context switching, and without parallelism that may mean you have to be careful about how the code is written to ensure correctness.
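That last point, about needing care when there is no automatic context switching, can be sketched with cooperative tasks (my own illustration, not from the book): each task must hand control over explicitly, and a task that forgets to yield starves the others, even though nothing runs in parallel.

```python
# Sketch: cooperative concurrency with no preemption and no parallelism.
# Each task is a generator that must yield explicitly; correctness of
# the interleaving depends entirely on how the code is written.
def task(name, log):
    for i in range(3):
        log.append((name, i))
        yield  # explicit handoff point -- remove it and "a" starves "b"

log = []
tasks = [task("a", log), task("b", log)]

# A trivial round-robin scheduler: run each task to its next yield.
while tasks:
    t = tasks.pop(0)
    try:
        next(t)
        tasks.append(t)  # not finished: back of the queue
    except StopIteration:
        pass  # task completed

print(log)  # [('a', 0), ('b', 0), ('a', 1), ('b', 1), ('a', 2), ('b', 2)]
```

With only one thread of control, the interleaving is fully deterministic, but every cooperation point has to be placed by hand.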