This topic, parallel computing, was presented to give a general idea of how computers (CPUs) organize their work. Abdala Sasya presented the topic in collaboration with Mary Mwakisisile. The presentation was good so far, and it introduced several concepts to the class. Traditionally, software was written in a serial way, so it could only run on a single computer with a single CPU.
The CPU is the integrated electronic device that interprets instructions for the computer, performs the logical operations, and causes the input/output operations to occur. Under serial computation, a problem is broken into a discrete series of instructions that are executed one after another, one at a time.
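The serial model described above can be sketched in a few lines of Python (a minimal illustration of the idea, not code from the presentation): each step runs only after the previous one has finished.

```python
# Serial computation: one series of instructions, executed in order.
def square(x):
    return x * x

data = [1, 2, 3, 4]

# Each item is processed in sequence; the next computation starts
# only after the previous one has completed on the single CPU.
results = [square(x) for x in data]
print(results)  # [1, 4, 9, 16]
```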
The mathematician Neumann raised the idea of parallel computing when he asked himself: if a single CPU can solve a problem in ten seconds, could ten CPUs solve the same problem in one second? Answering that question took a long time, but it led to the explanation that parallel computing is a form of computation in which many tasks are carried out simultaneously. Software was then programmed for parallel computation so that it could run on multiple CPUs. Under parallel computing, a problem is broken into discrete parts that are solved at the same time.
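Breaking a problem into discrete parts and handing them to a pool of workers can be sketched as follows (a hedged example using Python's standard `concurrent.futures` module; for CPU-bound work in CPython, a `ProcessPoolExecutor` would give true multi-CPU parallelism, but threads are enough to show the decomposition):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

data = [1, 2, 3, 4, 5, 6, 7, 8]

# The problem is broken into discrete parts, and the pool's workers
# handle the parts at the same time instead of one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, data))

print(results)  # [1, 4, 9, 16, 25, 36, 49, 64]
```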
These are the forms of parallel computing: Bit-level parallelism concerns the amount of information the processor can execute at once. Instruction-level parallelism controls how many operations in one computer program can be performed concurrently. Data parallelism distributes the data across different computing nodes. Task parallelism performs different calculations on either the same or different sets of data. These forms of parallel computing can be used in electrical engineering (e.g. circuit design), computer science (e.g. mathematical manipulation), and applied physics (e.g. the study of nuclear atoms).
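The difference between data parallelism and task parallelism can be illustrated with a small sketch (my own example, again using the standard `concurrent.futures` module): data parallelism applies the same operation to different pieces of the data, while task parallelism runs different calculations concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

data = [3, 1, 4, 1, 5, 9, 2, 6]

with ThreadPoolExecutor() as pool:
    # Data parallelism: the SAME operation (doubling) is applied to
    # different elements of the data at the same time.
    doubled = list(pool.map(lambda x: x * 2, data))

    # Task parallelism: DIFFERENT calculations (sum and max) are
    # submitted over the same data and run concurrently.
    total = pool.submit(sum, data)
    biggest = pool.submit(max, data)

print(doubled)                           # [6, 2, 8, 2, 10, 18, 4, 12]
print(total.result(), biggest.result())  # 31 9
```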
The importance of parallel computing includes the following:
o Parallel computing saves time, in the sense that concurrent execution of the parts of a problem reduces the total running time.
o It can solve even large, complex problems.
o Data sharing is also possible, due to the fact that one machine may be doing normal activities while transferring other files.
Challenges of parallel computing.
o The system is very expensive, due to the fact that mounting many chips requires good technology.
o It increases traffic congestion on the shared memory path (the CPU path).
o The programmer must ensure correct access to global memory.
Parallel computing is a great idea for computer experts, as it saves the time needed to execute a problem, while also allowing information sharing and the solving of large problems.