English_MU_Akulina_Information Tech. A study guide for developing spoken English skills, for students of the Faculty of Information Technology and Computer Systems
PART II SUPPLEMENTARY MATERIAL Tasks for the texts:
(1) The text (article) entitled … deals with… studies… discusses… is devoted to… is about… (2) Firstly, the author speaks about… Secondly, he points out… shows… gives… mentions… Then he comments on… illustrates… (3) In conclusion, he summarises… notes… (4) To my mind, this text is of great use to… is important… In my opinion, this article is interesting…
Words and expressions that will help when discussing the texts.
TEXT I SIX COMPUTER GENERATIONS The first three generations of computers have traditionally been identified as those using vacuum tubes, transistors, and integrated circuits, respectively. The fourth generation was never so clearly delineated, but has generally been associated with the use of large scale integrated circuits that enabled the creation of microprocessor chips. The next major deviation in computer technology, therefore, could be considered (in 1980) to be the fifth generation. The development of the fifth generation of computer systems is characterized mainly by the acceptance of parallel processing. Until this time parallelism was limited to pipelining and vector processing, or at most to a few processors sharing jobs. The fifth generation saw the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at an incredible pace – by 1990 it was possible to build chips with a million components – and semiconductor memories became standard on all computers. All of the mainstream commercial computers to date have followed very much in the footsteps of the original stored-program computer, the EDVAC, attributed to John von Neumann. Thus, this conventional computer architecture is referred to as “von Neumann”. It has been generally accepted that the computers of the future would need to break away from this traditional, sequential kind of processing in order to achieve the speeds necessary to accommodate the applications expected to be wanted or required. It is expected that future computers will need to be more intelligent, providing natural language interfaces, being able to “see” and “hear”, and having a large store of knowledge. The amount of computing power required to support these capabilities will naturally be immense. Other new developments were the widespread use of computer networks and the increasing use of single-user workstations.
Prior to 1985, large scale parallel processing was viewed as a research goal, but two systems introduced around this time are typical of the first commercial products to be based on parallel processing. The Sequent Balance 8000 connected up to 20 processors to a single shared memory module (but each processor had its own local cache). The machine was designed to compete with the DEC VAX-780 as a general purpose Unix system, with each processor working on a different user’s job. However, Sequent provided a library of subroutines that would allow programmers to write programs that would use more than one processor, and the machine was widely used to explore parallel algorithms and programming techniques. The Intel iPSC-1, nicknamed “the hypercube”, took a different approach. Instead of using one memory module, Intel connected each processor to its own memory and used a network interface to connect processors. This distributed memory architecture meant memory was no longer a bottleneck and large systems (using more processors) could be built. The largest iPSC-1 had 128 processors. Toward the end of this period a third type of parallel processor was introduced to the market. In this style of machine, known as data-parallel or SIMD, there are several thousand very simple processors. All processors work under the direction of a single control unit. Scientific computing in this period was still dominated by vector processing. Most manufacturers of vector processors introduced parallel models, but there were very few (two to eight) processors in these parallel machines. In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace, stimulating a transition from the traditional mainframe computing environment toward a distributed computing environment in which each user has their own workstation for relatively simple tasks (editing and compiling programs, reading mail).
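The shared-memory style of parallelism described above, with several processors cooperating on one job as in the Sequent Balance, can be sketched in modern terms with Python’s standard multiprocessing module. This is only an illustration: the function names and the toy workload are invented here, not taken from any real system.

```python
# A minimal sketch of data-parallel work sharing: the input is split into
# chunks and each worker process handles its own chunk independently.
from multiprocessing import Pool

def square_chunk(chunk):
    # Each worker computes its part of the result on its own.
    return [x * x for x in chunk]

def parallel_squares(data, workers=4):
    # Split the input into roughly one slice per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        partials = pool.map(square_chunk, chunks)
    # Reassemble the partial results into a single answer.
    return [y for part in partials for y in part]

if __name__ == "__main__":
    print(parallel_squares(list(range(8))))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

In a distributed-memory design like the iPSC-1, each worker would instead hold its own data and exchange messages over a network, rather than reading from a shared structure.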
One of the most dramatic changes in the sixth generation will be the explosive growth of wide area networking. Network bandwidth has expanded tremendously in the last few years and will continue to improve for the next several years. T1 transmission rates are now standard for regional networks, and the national “backbone” that interconnects regional networks uses T3. Networking technology is becoming more widespread than its original strong base in universities and government laboratories as it is rapidly finding application in K-12 education, community networks and private industry. A little over a decade after the warning voiced in the Lax report, the future of a strong computational science infrastructure is bright. The federal commitment to high performance computing has been further strengthened with the passage of two particularly significant pieces of legislation: the High Performance Computing Act of 1991, which established the High Performance Computing and Communication Program (HPCCP), and Sen. Gore’s Information Infrastructure and Technology Act of 1992, which addresses a broad spectrum of issues ranging from high performance computing to expanded network access and the necessity to make leading edge technologies available to educators from kindergarten through graduate school. Review questions:
TEXT II PROGRAMMING LANGUAGE A programming language is a machine-readable artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that specify the behavior of a machine, to express algorithms precisely, or as a mode of human communication. The earliest programming languages predate the invention of the computer, and were used to direct the behavior of machines such as automated looms and player pianos. Thousands of different programming languages have been created, mainly in the computer field, and many more are being created every year. Function: A programming language is a language used to write computer programs, which involve a computer performing some kind of computation or algorithm and possibly controlling external devices such as printers, robots, and so on. Target: Programming languages differ from natural languages in that natural languages are used only for interaction between people, while programming languages also allow humans to communicate instructions to machines. Constructs: Programming languages may contain constructs for defining and manipulating data structures or controlling the flow of execution. Some authors restrict the term “programming language” to those languages that can express all possible algorithms; sometimes the term “computer language” is used for more limited artificial languages. Non-computational languages, such as markup languages like HTML or formal grammars like BNF, are usually not considered programming languages. Usage: A programming language provides a structured mechanism for defining pieces of data, and the operations or transformations that may be carried out automatically on that data. Programs for a computer might be executed in a batch process without human interaction, or a user might type commands in an interactive session of an interpreter. In this case the “commands” are simply programs, whose execution is chained together.
When a language is used to give commands to a software application (such as a shell) it is called a scripting language. Programs range from tiny scripts written by individual hobbyists to huge systems written by hundreds of programmers. Programs must balance speed, size, and simplicity on systems ranging from microcontrollers to supercomputers. Elements: All programming languages have some primitive building blocks for the description of data and the processes or transformations applied to them (like the addition of two numbers or the selection of an item from a collection). These primitives are defined by syntactic and semantic rules which describe their structure and meaning respectively. A programming language’s surface form is known as its syntax. Most programming languages are purely textual; they use sequences of text including words, numbers, and punctuation, much like written natural languages. On the other hand, there are some programming languages which are more graphical in nature, using visual relationships between symbols to specify a program. The syntax of a language describes the possible combinations of symbols that form a syntactically correct program. Implementation: An implementation of a programming language provides a way to execute programs in that language on one or more configurations of hardware, either by a compiler or by a program called an interpreter. In some implementations that make use of the interpreter approach there is no distinct boundary between compiling and interpreting. For instance, some implementations of the BASIC programming language compile and then execute the source a line at a time. One technique for improving the performance of interpreted programs is just-in-time compilation. Here the virtual machine, just before execution, translates the blocks of bytecode which are going to be used into machine code, for direct execution on the hardware. Review questions:
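The line-at-a-time interpretation mentioned above can be illustrated with a toy interpreter: each source line is read, analysed and executed before the next one is touched. The two-statement mini-language (let and print) is invented here purely for demonstration; real interpreters use a proper parser rather than Python’s eval.

```python
# A toy line-at-a-time interpreter for an invented two-statement language:
#   let <name> = <expression>
#   print <expression>
def run(source):
    env = {}        # variable store built up as lines execute
    output = []
    for line in source.splitlines():
        line = line.strip()
        if not line:
            continue
        op, rest = line.split(None, 1)
        if op == "let":                      # e.g.  let x = 2
            name, expr = rest.split("=", 1)
            env[name.strip()] = eval(expr, {}, env)
        elif op == "print":                  # e.g.  print x + y
            output.append(eval(rest, {}, env))
        else:
            raise SyntaxError("unknown statement: " + line)
    return output

program = """
let x = 2
let y = x * 10
print x + y
"""
print(run(program))  # [22]
```

Each line is handled in isolation, which is exactly why such interpreters start quickly but run slower than compiled code; just-in-time compilation closes that gap by translating hot code to machine code at run time.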
TEXT III COMPUTER-AIDED DESIGN Computer-Aided Design (CAD) is the use of computer technology to aid in the design and particularly the drafting (technical drawing and engineering drawing) of a part or product, including entire buildings. It is both a visual (or drawing) and symbol-based method of communication whose conventions are particular to a specific technical field. Current Computer-Aided Design software packages range from 2D vector-based drafting systems to 3D solid and surface modelers. Modern CAD packages can also frequently allow rotations in three dimensions, allowing viewing of a designed object from any desired angle, even from the inside looking out. Some CAD software is capable of dynamic mathematic modeling, in which case it may be marketed as CADD – computer-aided design and drafting. Software technologies: Originally software for Computer-Aided Design systems was developed with computer languages such as Fortran, but with the advancement of object-oriented programming methods this has radically changed. Typical modern parametric feature based modeler and freeform surface systems are built around a number of key C (programming language) modules with their own APIs. A CAD system can be seen as built up from the interaction of a graphical user interface (GUI) with NURBS geometry and/or boundary representation (B-rep) data via a geometric modeling kernel. A geometry constraint engine may also be employed to manage the associative relationships between geometry, such as wireframe geometry in a sketch or components in an assembly. Today most Computer-Aided Design computers are Windows-based PCs. Some CAD systems also run on one of the Unix operating systems and on Linux. Some CAD systems such as QCad, NX or CATIA V5 provide multiplatform support including Windows, Linux, UNIX and Mac OS X.
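The “associative relationships between geometry” managed by a constraint engine can be hinted at with a toy parametric model: a plate whose hole is constrained to stay at its centre, so that resizing the plate relocates the hole automatically. The class and attribute names are invented for this sketch and do not come from any real CAD kernel.

```python
# A toy parametric feature: the hole position is *derived* from the plate's
# dimensions, so it is recomputed whenever the driving parameters change.
class Plate:
    def __init__(self, width, height):
        self.width = width      # driving parameter
        self.height = height    # driving parameter

    @property
    def hole_centre(self):
        # Constraint: the hole always sits at the centre of the plate.
        return (self.width / 2, self.height / 2)

plate = Plate(100, 40)
print(plate.hole_centre)   # (50.0, 20.0)
plate.width = 120          # resize the parent geometry...
print(plate.hole_centre)   # (60.0, 20.0) ...and the dependent feature follows
```

Real constraint engines solve whole systems of such relationships (distances, tangencies, parallelism) simultaneously, but the principle is the same: dependent geometry is recomputed from driving parameters rather than stored as fixed coordinates.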
Computer-Aided Design is one of the many tools used by engineers and designers and is used in many ways depending on the profession of the user and the type of software in question. There are several different types of CAD. Each of these different types of CAD systems requires the operator to think differently about how he or she will use them, and he or she must design their virtual components in a different manner for each. The effects of CAD: Starting in the late 1980s, the development of readily affordable Computer-Aided Design programs that could be run on personal computers began a trend of massive downsizing in drafting departments in many small to mid-size companies. As a general rule, one CAD operator could readily replace at least three to five drafters using traditional methods. Review questions:
TEXT IV DATABASE A database is a structured collection of records or data that is stored in a computer system. The structure is achieved by organizing the data according to a database model. The model in most common use today is the relational model. Other models such as the hierarchical model and the network model use a more explicit representation of relationships. Depending on the intended use, there are a number of database architectures in use. Many databases use a combination of strategies. On-line Transaction Processing systems (OLTP) often use a row-oriented datastore architecture, while data-warehouse and other retrieval-focused applications like Google’s Big Table, or bibliographic database (library catalogue) systems may use a column-oriented DBMS architecture. There are also other types of database which cannot be classified as relational databases. Database management systems: A computer database relies on software to organize the storage of data. This software is known as a database management system (DBMS). Database management systems are categorized according to the database model that they support. The model tends to determine the query languages that are available to access the database. A great deal of the internal engineering of a DBMS, however, is independent of the data model, and is concerned with managing factors such as performance, concurrency, integrity, and recovery from hardware failures. In these areas there are large differences between products. A relational database management system (RDBMS) implements the features of the relational model outlined above. In this context, Date’s “Information Principle” states: “the entire information content of the database is represented in one and only one way”. Database models: Products offering a more general data model than the relational model are sometimes classified as post-relational.
The data model in such products incorporates relations but is not constrained by the Information Principle, which requires that all information is represented by data values in relations. Object database models: In recent years, the object-oriented paradigm has been applied to database technology, creating a new programming model known as object databases. These databases attempt to bring the database world and the application programming world closer together, in particular by ensuring that the database uses the same type system as the application program. This aims to avoid the overhead (sometimes referred to as the impedance mismatch) of converting information between its representation in the database and its representation in the application program. Database storage structures: Relational database tables/indexes are typically stored in memory or on hard disk in one of many forms: ordered/unordered flat files, ISAM, heaps, hash buckets or B+ trees. Review questions:
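The relational ideas in the text above, all information represented as data values in tables and queried through a language determined by the model, can be tried directly with Python’s built-in sqlite3 module. The table and the sample rows are invented examples for illustration.

```python
# A minimal relational example: one table, a few rows, one SQL query.
import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = conn.cursor()

# Per the Information Principle, everything lives as data values in relations.
cur.execute(
    "CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, year INTEGER)"
)
cur.executemany(
    "INSERT INTO students (name, year) VALUES (?, ?)",
    [("Anna", 2), ("Boris", 3), ("Clara", 2)],
)

# The DBMS answers declarative queries; we say *what* we want, not how.
cur.execute("SELECT name FROM students WHERE year = 2 ORDER BY name")
second_years = [row[0] for row in cur.fetchall()]
print(second_years)  # ['Anna', 'Clara']
conn.close()
```

How SQLite physically stores the table (it uses B-trees) is an internal engineering choice, independent of the relational model the query is written against, which is exactly the separation the text describes.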
TEXT V EMBEDDED SYSTEMS An embedded system is a special-purpose computer system designed to perform one or a few dedicated functions, often with real-time computing constraints. It is usually embedded as part of a complete device including hardware and mechanical parts. In contrast, a general-purpose computer, such as a personal computer, can do many different tasks depending on programming. Embedded systems control many of the common devices in use today. Since the embedded system is dedicated to specific tasks, design engineers can optimize it, reducing the size and cost of the product, or increasing the reliability and performance. Some embedded systems are mass-produced, benefiting from economies of scale. Physically, embedded systems range from portable devices such as digital watches and MP4 players, to large stationary installations like traffic lights, factory controllers, or the systems controlling nuclear power plants. Complexity varies from low, with a single microcontroller chip, to very high with multiple units, peripherals and networks mounted inside a large chassis or enclosure. In general, “embedded system” is not an exactly defined term, as many systems have some element of programmability. For example, handheld computers share some elements with embedded systems – such as the operating systems and microprocessors which power them – but are not truly embedded systems, because they allow different applications to be loaded and peripherals to be connected. Debugging: Embedded debugging may be performed at different levels, depending on the facilities available. From simplest to most sophisticated they can be roughly grouped into the following areas: Interactive resident debugging, using the simple shell provided by the embedded operating system (e.g. Forth and Basic).
External debugging using logging or serial port output to trace operation, using either a monitor in flash or a debug server like the Remedy Debugger, which even works for heterogeneous multicore systems. An in-circuit debugger (ICD), a hardware device that connects to the microprocessor via a JTAG or NEXUS interface. This allows the operation of the microprocessor to be controlled externally, but is typically restricted to specific debugging capabilities in the processor. Because an embedded system is often composed of a wide variety of elements, the debugging strategy may vary. For instance, debugging a software- (and microprocessor-) centric embedded system is different from debugging an embedded system where most of the processing is performed by peripherals (DSP, FPGA, co-processor).