This practicum has been compiled in accordance with the requirements for the communication training of graduates of the innovative educational program.
Part of the job is handled by the manufacturer: extremely high-end I/O subsystems are arranged in topologies that minimize the effective distances between processors while also minimizing the amount of intercommunication required for the processors to get their jobs done. The other part of the job is borne by the users of the system. If they expect to get their money's worth from these highly expensive machines, they must make every effort to optimize, or "parallelize," their programs so that they can make use of the many processors. If this is not done properly, processors will sit idle while the information they need to continue executing is held up in an undetected bottleneck caused by poor parallelization. Worse yet, parallelization adds a certain, not insignificant, amount of complexity to the program, increasing the number of bugs and the number of side effects caused by changes to the code.
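The idea of parallelization and its overhead can be illustrated with a minimal Python sketch (not from the text; the task, the chunk sizes and the process count are arbitrary choices for illustration). The same CPU-bound work is run on one processor and then split across four; the measured speedup stays below 4x because coordinating the processes is itself work:

```python
# Minimal illustration of parallelization: the same CPU-bound work is
# run serially and then split across several worker processes.
import time
from multiprocessing import Pool

def busy_sum(n):
    # A deliberately CPU-bound task standing in for real computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8          # eight equal pieces of work

    start = time.perf_counter()
    serial = [busy_sum(n) for n in chunks]      # one processor
    t_serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(processes=4) as pool:             # four processors
        parallel = pool.map(busy_sum, chunks)
    t_parallel = time.perf_counter() - start

    assert serial == parallel
    print(f"serial:   {t_serial:.2f}s")
    print(f"parallel: {t_parallel:.2f}s")
    # Any gap between 4x and the measured speedup is parallelization
    # overhead: process start-up, scheduling and result collection.
```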
Activities
I. Find the equivalents in box B for the words given in box A.
II. Choose the right answer based on the information in the text:
1. A bottleneck means … .
a) a neck of a wine bottle
b) a narrow place in a road
c) a part of a process where performance is slowed down
2. Input/output devices allow the computer to … with its external environment.
a) compute
b) command
c) communicate
3. Minimizing both the effective distances between processors and the amount of intercommunication required for the processors is aimed at … .
a) reducing costs
b) increasing processor speed
c) simplifying the program
4. Side effects caused by poor parallelization result in … .
a) bugs
b) sound effects
c) problems with changing the code

III. Complete the following sentences using information from the text:
1. The three main problems that supercomputer manufacturers face are … .
2. The problem of input/output speed between the data-storage medium and memory can be solved by … .
3. The cost in research and development of processors … all the time.
4. A new gallium arsenide technology is not developing because … .
5. Parallel processing means adding … in order to give … .
6. The two problems of parallel processing are … .
7. Distributed computing systems can be the best way out for … .
8. If the buyers expect to get their money's worth from these highly expensive machines, they must … .

IV. Speak about the problems of supercomputers and the different solutions, using the graph below:
Limits: I/O speed between processors; overhead; individual processor speed.
Solutions: gallium arsenide processors; parallel processing; I/O subsystems topology improvement.

V. Questions for discussion:
1. What are the main problems of supercomputer performance? What are the ways of solving them?
2. What is parallel processing? What is the purpose of parallelization?
3. Is parallel processing a real solution to the problems? Why?
4. How can both manufacturers and customers help in solving the problem?
5. A computer with extremely high-end I/O subsystems will be inefficient in the hands of a poor user. Do you agree? Why?
6. What are the drawbacks of poor parallelization?

VI. Read the text and suggest a title. Find the key words in the text.

The National Nuclear Security Administration (NNSA) today officially dedicated two new, next-generation supercomputers that will help ensure the U.S. nuclear weapons stockpile remains safe and reliable without nuclear testing. These are the Purple and BlueGene/L systems. The dedication marks the culmination of a ten-year campaign to use supercomputers to run three-dimensional codes at lightning-fast speeds to achieve much of the nuclear weapons analysis that was formerly accomplished by underground nuclear testing. NNSA announced that the BlueGene/L supercomputer performed a record 280.6 trillion operations per second on the industry-standard LINPACK benchmark. The supercomputing community uses the LINPACK benchmark application as the measure of performance to determine rankings on the Top 500 computer list. BlueGene/L moved into classified production in February 2006 to address critical problems of materials aging. The machine is primarily intended for stockpile science molecular dynamics and turbulence calculations. High peak speed, superb scalability for molecular dynamics codes, low cost and low power consumption make this an ideal solution for this area of science. The 101-teraflop record-setting materials science calculations involved the simulation of the cooling process in a molten actinide uranium system, a material and process of importance to stockpile stewardship.
This was the largest simulation of its kind ever attempted and demonstrates that BlueGene/L's architecture can operate with real-world applications. Purple, the other half of the most powerful supercomputing twosome on earth, is a machine capable of 100 teraflops as it conducts simulations of complete nuclear weapons performance. Purple consists of a 94-teraflop classified and a 6-teraflop unclassified environment, together totaling 100 teraflops. The machine's architecture, with large memory, powerful processors and massive network bandwidth, is effective for running the newly developed 3D weapons codes needed to simulate complete nuclear weapons performance. The insights and data gained from the materials aging calculations to be run on BlueGene/L will be vital for the creation of improved models to be used for future full weapons performance simulations on Purple. The machines were designed to meet requirements in weapons simulations and materials science. The approach of dividing requirements across two machines, rather than building a single machine to meet all requirements, turned out to be an efficient and cost-effective way to meet program objectives.

Comprehensive text-related glossary
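The LINPACK benchmark mentioned in the text rates a machine by how quickly it solves a dense system of linear equations. The following toy Python sketch (an illustration only, assuming NumPy is installed; the real benchmark is far more rigorous) times a dense solve and converts the elapsed time into an operation rate:

```python
# Toy LINPACK-style measurement: time the solution of a dense n x n
# linear system Ax = b and estimate the floating-point operation rate.
import time
import numpy as np

n = 2000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)      # dense solve, as in LINPACK
elapsed = time.perf_counter() - start

# An LU-based solve costs roughly (2/3) * n^3 floating-point operations.
flops = (2 / 3) * n**3
print(f"solved {n}x{n} system in {elapsed:.3f}s "
      f"~ {flops / elapsed / 1e9:.1f} GFLOP/s")
```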
VII. Match the beginning of a sentence with an ending to produce a statement that is correct according to the text
VIII. Translate the following attribute constructions: stockpile science molecular dynamics and turbulence calculations; 101-teraflop record-setting materials science calculations; new, next-generation supercomputers; U.S. nuclear weapons stockpile; a ten-year campaign culmination; newly developed 3D weapons codes; complete nuclear weapons performance; materials aging calculations.

IX. Questions for discussion:
1. What are the best supercomputers announced by the National Nuclear Security Administration?
2. What are the problems these machines are expected to solve?
3. Try to recall the characteristics of both supercomputers mentioned in the text.
4. Why is the use of two supercomputers working together more efficient than building one super-powerful machine?
5. Do you agree that the two supercomputers described will make the world safer? Why?

X. Group work. Using the key words, try to make a denotation graph of the text. Present a summary of the text according to your graph to another group, who have to draw the graph after listening to your summary. Compare your graphs.

---------------------------------------------------------------------------------------

Unit Five

Text. COMPUTER SIMULATION

Key words: a computer simulation, mathematical model, computer modeling, analytic solutions, test objects, types of simulation

A computer simulation, a computer model or a computational model is a computer program, or network of computers, that attempts to simulate an abstract model of a particular system. Computer simulations have become a useful part of the mathematical modeling of many natural systems in physics (computational physics), chemistry and biology, of human systems in economics, psychology and social science, and of the process of engineering new technology, to gain insight into the operation of those systems or to observe their behavior. Computer simulations vary from computer programs that run a few minutes, to network-based groups of computers running for hours, to ongoing simulations that run for days. The scale of events being simulated by computer simulations has far exceeded anything possible (or perhaps even imaginable) using traditional paper-and-pencil mathematical modeling: over 10 years ago, a desert-battle simulation of one force invading another involved the modeling of 66,239 tanks, trucks and other vehicles on simulated terrain around Kuwait, using multiple supercomputers in the DoD High Performance Computer Modernization Program; another simulation ran a 1-billion-atom model, where previously a 2.64-million-atom model of a ribosome, in 2005, had been considered a massive computer simulation; and the Blue Brain project in Switzerland began in May 2005 to create the first computer simulation of the entire human brain, right down to the molecular level. Traditionally, the formal modelling, or modeling, of systems has been via a mathematical model, which attempts to find analytical solutions to problems and thereby enables the prediction of the behavior of the system from a set of parameters and initial conditions. While computer simulations might use some algorithms from purely mathematical models, computers can combine simulations with the reality of actual events, such as generating input responses, to simulate test subjects who are no longer present. While the missing test subjects are being modelled/simulated, the system they use could be the actual equipment, revealing performance limits or defects in long-term use by the simulated users.
Note that the term computer simulation is broader than computer modeling, which implies that all aspects are being modelled in the computer representation. However, computer simulation also includes generating inputs from simulated users to run actual computer software or equipment with only part of the system being modelled: an example would be flight simulators, which can run machines as well as actual flight software. Simulation in general is to pretend that one deals with a real thing while really working with an imitation. In operations research the imitation is a computer model of the simulated reality. A flight simulator on a PC is likewise a computer model of some aspects of the flight: it shows on the screen the controls and what the "pilot" (the youngster who operates it) is supposed to see from the "cockpit". Computer simulation was developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation. Computer simulation is often used as an adjunct to, or substitution for, modeling systems for which simple closed-form analytic solutions are not possible. The common feature all types of computer simulation share is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states of the model would be prohibitive or impossible. Computer models were initially used as a supplement for other arguments, but their use later became widespread.

Comprehensive text-related glossary
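The idea of generating a sample of representative scenarios, rather than enumerating every possible state, can be shown with a minimal Monte Carlo sketch in Python (the gambler model, bet count and bankroll here are invented purely for illustration and are not taken from the text):

```python
# Minimal Monte Carlo simulation: instead of enumerating every possible
# state, draw a sample of random scenarios and summarize the outcomes.
# The "model" is a gambler making 100 fair bets of 1 unit each.
import random

def one_scenario(bets=100, bankroll=50):
    # Simulate a single possible history of the system.
    for _ in range(bets):
        bankroll += 1 if random.random() < 0.5 else -1
        if bankroll <= 0:
            return 0               # ruined before finishing
    return bankroll

random.seed(42)
runs = 10_000                      # sample of representative scenarios
results = [one_scenario() for _ in range(runs)]

ruin_rate = sum(r == 0 for r in results) / runs
average = sum(results) / runs
print(f"estimated ruin probability: {ruin_rate:.3f}")
print(f"average final bankroll:     {average:.1f}")
```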
Activities
I. Complete the following sentences based on the text:
1. Modelling of systems has been achieved by means of … .
2. Computer simulation is used in … .
3. The first computer simulation helped to create … .
4. The common feature they all share is … .
5. Computer models were initially used as … .
6. Computer simulation varies from … .

II. Questions for discussion:
1. Which of the terms (simulation/modelling) includes more meanings? Why?
2. What computer programs are familiar to you?
3. What does computer simulation include?
4. What are the main differences between computer simulation and computer modeling?
5. What models can computer simulation combine?

III. Find the paragraph that mentions:
1) mathematical models as a basis of simulation;
2) different spheres where computer simulation is used;
3) the difference between the terms computer modeling and computer simulation;
4) the basis of computer models;
5) computer simulation of actual events.