Parallel Computing


Traditionally, computer software has been written for serial computation. To solve a problem, an algorithm is constructed which produces a serial stream of instructions. These instructions are executed on a central processing unit on one computer. Only one instruction may execute at a given time; after that instruction is finished, the next is executed.

Parallel computing, on the other hand, uses multiple processing elements simultaneously to solve a problem. The problem is broken into independent parts so that each processing element can execute its part of the algorithm simultaneously with the others. The processing elements can be diverse and include resources such as a single computer with multiple processors, a number of networked computers, specialized hardware, or any combination of the above.

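As a toy illustration of this decomposition (a minimal sketch using Python's standard multiprocessing module; the worker count and chunking scheme are arbitrary choices for this example, not prescribed by any particular system), a large sum can be split into independent parts that are computed simultaneously:

```python
# Toy example: sum a large list by splitting it into independent
# chunks, each summed by a separate worker process in parallel.
from multiprocessing import Pool

def chunk_sum(chunk):
    # Each worker handles its part of the problem independently.
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 4  # arbitrary choice for illustration
    size = len(data) // n_workers
    chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]
    chunks[-1].extend(data[n_workers * size:])  # any leftover elements

    with Pool(n_workers) as pool:
        partial_sums = pool.map(chunk_sum, chunks)  # parts run in parallel

    print(sum(partial_sums))  # combine the independent results
```
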
Frequency scaling was the dominant reason for computer performance increases from the mid-1980s until 2004. The total runtime of a program is proportional to the total number of instructions multiplied by the average time per instruction. Holding everything else constant, increasing the clock frequency decreases the average time it takes to execute an instruction. An increase in frequency thus decreases runtime for all compute-bound programs. However, power consumption in a chip is given by the equation P = C × V² × F, where P is power, C is the capacitance being switched per clock cycle (proportional to the number of transistors whose inputs change), V is voltage, and F is the processor frequency (cycles per second). Increases in frequency thus increase the amount of power used in a processor. Increasing processor power consumption led ultimately to Intel’s May 2004 cancellation of its Tejas and Jayhawk processors, which is generally cited as the end of frequency scaling as the dominant computer architecture paradigm. Moore’s Law is the empirical observation that transistor density in a microprocessor doubles every 18 to 24 months. Despite power issues, and repeated predictions of its end, Moore’s Law is still in effect. With the end of frequency scaling, these additional transistors (no longer needed to facilitate frequency scaling) can instead be used to add extra hardware that facilitates parallel computing.

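To make the formula concrete, here is a small worked example (the capacitance and voltage figures are made-up illustrative values, not measurements from any real processor):

```python
# Dynamic power: P = C * V^2 * F
# Illustrative (made-up) values: C in farads, V in volts, F in hertz.
C = 1e-9     # switched capacitance per cycle
V = 1.2      # supply voltage
F = 2.0e9    # clock frequency: 2 GHz

P = C * V ** 2 * F
print(f"P = {P:.2f} W")                     # 2.88 W

# Doubling the frequency doubles the dynamic power ...
print(f"2F -> {C * V**2 * (2 * F):.2f} W")  # 5.76 W

# ... and in practice higher frequencies also demand higher voltage,
# which enters the equation squared, making things worse still.
V_hi = 1.4
print(f"2F at higher V -> {C * V_hi**2 * (2 * F):.2f} W")  # 7.84 W
```
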
Parallel computing is a form of computing in which many instructions are carried out simultaneously. It operates on the principle that large problems can almost always be divided into smaller ones, which may be carried out concurrently (“in parallel”). Parallel computing exists in several different forms: bit-level parallelism, instruction-level parallelism, data parallelism, and task parallelism. It has been used for many years, mainly in high-performance computing, but interest in it has grown in recent years due to the physical constraints preventing frequency scaling. Parallel computing has recently become the dominant paradigm in computer architecture, mainly in the form of multicore processors. Parallel programs are harder to write than sequential ones, because concurrency introduces several new classes of potential software bugs, of which race conditions are the most common. Communication and synchronization between the different subtasks are typically among the greatest barriers to good parallel program performance. In recent years, power consumption in parallel computers has also become a great concern. The speed-up of a program as a result of parallelization is given by Amdahl’s law, illustrated below.

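Amdahl’s law says that if a fraction p of a program can be parallelized across N processors, the overall speed-up is S = 1 / ((1 − p) + p/N), so the serial fraction (1 − p) caps the achievable gain no matter how many processors are added. A minimal sketch (the 95% parallel fraction is just an example value):

```python
# Amdahl's law: speed-up is limited by the serial fraction of a program.
def amdahl_speedup(p, n):
    """Speed-up for parallel fraction p on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

p = 0.95  # example: 95% of the program parallelizes
for n in (2, 4, 8, 64, 1024):
    print(f"{n:>5} processors -> speed-up {amdahl_speedup(p, n):.2f}")
# Even with 1024 processors the speed-up stays below 1 / (1 - p) = 20.
```
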
For more details, go through the following links, have fun, and dig deep into this wonderful world 🙂

https://computing.llnl.gov/tutorials/parallel_comp/#Whatis

http://www.cs.rit.edu/~ncs/parallel.html

http://www.eecs.umich.edu/~qstout/parlinks.html
