UNIT 1. THE HISTORY OF COMPUTER
"The computer is no doubt the most amazing achievement of mankind." - Unknown
"I do not fear computers. I fear the lack of them." - Isaac Asimov
WARM-UP
1899 "Everything that can be invented has already been invented." - Charles H. Duell, director of the U.S. Patent Office.
1943 "I think there is a world market for maybe five computers." - Thomas Watson, chairman of IBM.
1949 "Computers in the future may weigh no more than 1.5 tons." - Popular Mechanics.
1957 "I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year." - The editor in charge of business books for Prentice-Hall.
1968 "But what ... is it good for?" - Engineer at the Advanced Computing Systems Division of IBM, commenting on the microchip.
1977 "There is no reason anyone would want a computer in their home." - Ken Olson, president, chairman and founder of DEC.
1980 "DOS addresses only 1 Megabyte of RAM because we cannot imagine any applications needing more." - Microsoft on the development of DOS.
1981 "640K ought to be enough for anybody." - Bill Gates.
1989 "We will never make a 32-bit operating system." - Bill Gates.
1992 "Windows NT addresses 2 Gigabytes of RAM, which is more than any application will ever need." - Microsoft on the development of Windows NT.
1994 "Indeed, it would not be an exaggeration to describe the history of the computer industry for the past decade as a massive effort to keep up with Apple." - Byte.
chips, dual core, megabytes, megahertz, motherboard, processor, speed, upgraded |
The "brain" of a computer is the 1)_______. Most of these are made by Intel and AMD, and are sometimes referred to as "2)_______". The fastest processors are 3)_______, which means that there are two processors working together. The 4) ______ of a processor is measured in 5)_______, which is usually written as MHz.
A computer's memory is measured in 6)_________. If a computer has 1,024 megabytes of memory, and the memory type is SDRAM, this is written as 1,024 MB SDRAM, and is pronounced "a thousand and twenty-four megabytes ess-dee-dram". The processor and memory modules are located on the 7)_______. Changing a computer's processor is not generally practical, but the memory can usually be 8)________.
The original IBM Personal Computer (PC)
READING
multiplication, abacus, predominantly, reckon, computation, logarithm, accurate, fundamental, descendant, engine, exceedingly, programmable, mechanism, furthermore, distinguish, ubiquitous, partnership, incorporate, routinely, representation, mankind.
-the abacus; -the slide rule; -the stepped reckoner; -punched cards; -the Analytic Engine |
the abacus |
decimal, Pascaline, room, programmable, microprocessor, transistors, cryptographic codes, binary, eliminate, alternative, repetitive calculations, calculators, computation, invention |
The History of Computers
"Who invented the computer?" is not a question with a simple answer. The real answer is that many inventors contributed to the history of computers, and that a computer is a complex piece of machinery made up of many parts, each of which can be considered a separate invention. PC technology has seen remarkable advances - and more than a few false starts and outright blunders. So let's look back and see how today's systems got the way they are. Return now to the dawn of computer history. The first computers were people! That is, the earliest mechanical computers were given this name because they performed the work that had previously been assigned to people. "Computer" was originally a job title: it was used to describe those human beings (predominantly women) whose job it was to perform the 1) … required to compute such things as navigational tables, tide charts, and planetary positions for astronomical almanacs. Imagine you had a job where hour after hour, day after day, you were to do nothing but compute multiplications. Boredom would quickly set in, leading to carelessness, leading to mistakes. And even on your best days you wouldn't be producing answers very fast. Therefore, inventors searched for hundreds of years for a way to mechanize this task.
The abacus was the earliest known tool or aid for mathematical computations, developed in the period 2700-2300 BC in Sumer. The Babylonians used it more than 4,400 years ago! Its only value was that it aided the memory of the human performing the calculation. A skilled abacus operator could work on addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower). The abacus is still in use today. A modern abacus consists of rings that slide over rods, but the older one pictured below dates from the time when pebbles were used for counting (the word "calculus" comes from the Latin word for pebble).
In 1617 John Napier invented logarithms, a technology that allows multiplication to be performed via addition. The magic ingredient was the logarithm of each operand, which was originally obtained from a printed table. But Napier also invented an 2) … to tables, where the logarithm values were carved on ivory sticks which are now called Napier's Bones. Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the programs which landed men on the moon.
Slide Rule
Leonardo da Vinci (1452-1519) made drawings of gear-driven calculating machines but apparently never built any. The first gear-driven calculating machine actually to be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. The device got little publicity because Schickard died soon afterwards.
In 1642 Blaise Pascal, at age 19, invented the 3) … as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add), but he couldn't sell many because of their cost, and they really weren't accurate (they lacked the required precision). Shown below is an 8-digit version of the Pascaline, and two views of a 6-digit version:
Just a few years after Pascal, Gottfried Leibniz (co-inventor with Newton of calculus) managed to build a four-function (addition, subtraction, multiplication, and division) calculator that he called the stepped reckoner because, instead of gears, it employed fluted drums having ten flutes arranged around their circumference in a stair-step fashion. Although the stepped reckoner employed the 4) … number system (each drum had 10 flutes), Leibniz was the first to advocate use of the 5) … number system, which is fundamental to the operation of modern computers. Leibniz is considered to be one of the greatest of the philosophers, but he died poor and alone.
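For readers curious how the number system Leibniz advocated looks in practice, here is a minimal Python sketch (the function name is ours, purely for illustration) that rewrites a decimal number in binary by repeated division by two:

```python
def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the lowest binary digit
        n //= 2                    # move on to the next digit
    return "".join(reversed(digits))

# The decimal number 10 is written 1010 in binary.
print(to_binary(10))
```

Modern computers store every number this way, as Leibniz advocated, because each binary digit maps naturally onto a two-state device such as a switch or a transistor.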
In 1801 Joseph Marie Jacquard invented a power loom that could base its weave upon a pattern automatically read from punched wooden cards, held together in a long row by rope. Descendants of these punched cards have been in use ever since.
In 1822 the English mathematician Charles Babbage proposed a steam-driven calculating machine the size of a 6) …, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. He obtained government funding for this project due to the importance of numeric tables in ocean navigation. At that time one printed set of tables was found to contain over 1,000 numerical errors, and it was hoped that Babbage's machine could 7) … such errors. But construction of Babbage's Difference Engine proved exceedingly difficult, and the project soon became the most expensive government-funded project. This was one of the main reasons why the device was never finished.
Babbage was not deterred, and by then was on to his next brainstorm, which he called the Analytic Engine. This device, large as a house and powered by 6 steam engines, would be more general-purpose in nature because it would be programmable, thanks to the punched-card technology of Jacquard. And it was Babbage who made an important intellectual leap regarding the punched cards. Babbage saw that the pattern of holes could be used to represent an abstract idea, such as a problem statement or the raw data required for that problem's solution. He saw that there was no requirement that the subject matter itself physically pass through the holes.
Furthermore, Babbage realized that punched paper could be employed as a storage mechanism, holding computed numbers for future reference. The Analytic Engine also had a key function that distinguishes computers from 8) … : the conditional statement. A conditional statement allows a program to achieve different results each time it is run. The path of the program (that is, what statements are executed next) can be determined based upon a condition or situation that is detected at the very moment the program is running.
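The conditional statement described above can be sketched in a few lines of Python (the example values are invented for illustration); the path the program takes depends on data it only sees while running:

```python
def classify(number):
    """Follow a different path depending on a condition detected at run time."""
    if number % 2 == 0:   # condition: is the number even?
        return "even"
    else:
        return "odd"

# The same program produces different results for different inputs.
print(classify(72))
print(classify(23))
```

A simple calculator always performs the same fixed sequence of operations; it is this branching ability that the Analytic Engine shared with modern computers.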
Hollerith's invention, known as the Hollerith desk, consisted of a card reader which sensed the holes in the cards, a gear-driven mechanism which could count (using Pascal's mechanism), and a large wall of dial indicators to display the results of the count. Hollerith had the insight to turn punched cards into an information-storage technology; since a card, once punched, cannot be changed, we would today call this a read-only form of information storage. Hollerith's technique was successful. He built a company, the Tabulating Machine Company, which, after a few buyouts, became International Business Machines, known today as IBM. IBM grew rapidly and punched cards became ubiquitous.
But the U.S. military desired a mechanical calculator more optimized for scientific computation and was willing to invest in automating the 9) … .
One early success was the Harvard Mark I computer which was built as a partnership between Harvard and IBM in 1944. This was the first 10) … digital computer made in the U.S. But it was not a purely electronic computer. Instead the Mark I was constructed out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50 ft rotating shaft running its length, turned by a 5 horsepower electric motor. The Mark I ran non-stop for 15 years.
The Mark I operated on numbers that were 23 digits wide. It could add or subtract two of these numbers in three-tenths of a second, multiply them in four seconds, and divide them in ten seconds. Forty-five years later computers could perform an addition in a billionth of a second! Even though the Mark I had three quarters of a million components, it could only store 72 numbers! Today, home computers can store 30 million numbers in RAM and another 10 billion numbers on their hard disk.
The application of semiconductor electronics changed the history of computers. Computer makers began to use 11) … in 1955. The IBM company marketed a computer in which 1,250 valves had been replaced by 2,220 transistors, reducing the power consumption by 95 per cent. By the way, a computer of that time cost $500,000 ($4.33 million as of 2013)! Today's Pentium 4 microprocessor contains 42,000,000 transistors in a thumbnail-sized piece of silicon.
One of the candidates for granddaddy of the modern computer was Colossus, built during World War II by Britain for the purpose of breaking the 12) … used by Germany. Britain led the world in designing and building electronic machines dedicated to code breaking, and was routinely able to read coded German radio transmissions. But Colossus was definitely not a general-purpose, reprogrammable machine.
The Harvard Mark I, the Atanasoff-Berry computer, and the British Colossus all made important contributions. American and British computer pioneers were still arguing over who was first to do what when, in 1965, the work of the German Konrad Zuse was published for the first time in English. Zuse had built a sequence of general-purpose computers in Nazi Germany, beginning in 1936.
Zuse's third machine, built in 1941, was probably the first operational, general-purpose, programmable (that is, software-controlled) digital computer. Without knowledge of any calculating-machine inventors since Leibniz (who lived in the 1600s), Zuse reinvented Babbage's concept of programming and decided on his own to employ binary representation for numbers (Babbage had advocated decimal). The Z1, Z2 and Z3 did not survive. Because these machines were unknown outside Germany, they did not influence the path of computing in America. But their architecture is identical to that still in use today.
The title of forefather of today's all-electronic digital computers is usually awarded to ENIAC (Electronic Numerical Integrator and Calculator). ENIAC was built at the University of Pennsylvania between 1943 and 1945 by John Mauchly and Presper Eckert. It filled a 20-by-40-foot room and weighed 30 tons. Note that ENIAC's first task was to compute whether or not it was possible to build a hydrogen bomb.
But things changed fast. The invention of the microprocessor made its own contribution to the development of the computer. Computers had been around for 20 years before the first microprocessor was developed at Intel in 1971. Intel didn't invent the electronic computer, but it was the first to succeed in cramming an entire computer onto a single chip.
Could John Napier, Gottfried Leibniz or Charles Babbage imagine what a computer would look like in the future and how it would influence mankind? Laptops with touchpads, netbooks, desktops, tiny home-theater PCs and Androids have taken the place of the archaic abacus, Napier's Bones, slide rules, stepped reckoners and Hollerith desks. Today new scientific and engineering problems cannot be solved with the help of obsolete techniques. A powerful stimulus for further advancement is the constantly growing intellectual potential of computers. But one question arises: can artificial intelligence completely replace a human being in all spheres of our life?
blunder |
fluted drum |
assign |
circumference |
predominantly |
loom |
tide chart |
gear-driven (machine) |
abacus |
weave |
to slide |
to punch |
pebbles |
the Difference Engine |
operand |
to deter |
ivory stick |
the Hollerith desk |
Napier's Bones |
ubiquitous |
slide rule |
rotating shaft |
calculating clock |
clutch |
stepped reckoner |
to cram |
similar meaning: keyboard, to reckon, digit, elimination, scroll wheel, button, QWERTY, obsolete, to take the place of, forefather, to replace, key, archaic, ancestor, to compute, reduction, mouse wheel, number;
opposite meaning: forefather, rough, to consent, identical, entire, descendant, partial, accurate, to argue, different.
The History of the Mouse
Pen-mouse
Today, the mouse is an essential input device for all modern computers, but it 1) had not been / wasn't so long ago that computers had no mouse and no graphical user interface. Data 2) was / were entered by typing commands on a keyboard. The mouse was 3) inventing / invented by Douglas Engelbart in 1964 and consisted of a wooden shell, circuit board and two metal wheels that came into contact with the surface it 4) had used / was being used on. It was 8 years later, in 1972, that Bill English 5) developed / had developed the design further by inventing what 6) was / is known as the "Ball Mouse" that we know today. The ball 7) replaced / replaces the wheels and was capable of monitoring movement in any direction. The ball came into contact with two rollers that in turn spun wheels with graduations on them that 8) can turn / could be turned into electrical pulses representing direction and speed. At the time Bill English 9) was working / worked for Xerox PARC (Palo Alto Research Center), the research and development center set up by Xerox to 'design the future of computing'. The mouse 10) have become / became part of the groundbreaking Xerox Alto computer system, which was the first minicomputer system to offer a graphical user interface. It 11) would be / will be another 8 years before the mouse would be developed any further. An optical mouse 12) had been developing / was developed in around 1980, eliminating the ball, which often became dirty from rolling round the desktop, negatively affecting its operation. However, 13) it was / they were far too expensive to be used widely. In fact it wasn't until around 1998, with the increase in microcontroller processing power and the reduction in component costs, that optical mice became a commercially viable alternative to the ball mouse and infiltrated the mass consumer market. Today the optical mouse 14) replaces / has completely replaced the ball mouse, being supplied as standard with all new computers.
Engelbart's mouse 15) demonstrated / was first publicly demonstrated at the 1968 Computer Conference. The mouse on old computers required a wire for connection. New computers have wireless capabilities for the mouse, and laptops even have touchpads that don't require a mouse at all.
A mouse of 1968 |
optical, roll, on, scroll up, scroll down, touchpad, joystick, single, double, pointer, hold down, repetitive strain injury, scroll wheel (mouse wheel) |
1. _____ to see pages above.
2. _____ to see pages below.
3. To select text, _____ the left button, and move the mouse pointer.
4. If you use a mouse for many hours every day, you can get _____ in your fingers.
5. With a laptop computer, plug in a mouse, or use the _____ in front of the keyboard.
6. To play some games, you need to use a _____ instead of a mouse.
7. To move up and down a page, you can _____ the mouse wheel.
8. This mouse doesn't have a ball. It's an _____ mouse.
9. One click of a mouse button is called a _____ click.
10. Two clicks of a mouse button are called a _____ click.
11. Click _____the folder to open it.
Verb | Noun | Adjective
…… | care, …… | ……, ……
…… | …… | predominant
…… | boredom, ……, …… | ……
compute | ……, ……, …… | ……, ……
execute | …… | ……, ……
…… | engine, ……, ……, …… | ……, ……, ……
…… | advance, …… | ……, ……
…… | …… | punched, ……
The History of the Computer Keyboard
Sometimes it is difficult and even impossible to name the exact date of birth of this or that component of the computer. A good example is the development of the keyboard. Let's trace its history.
As far back as the 1st century A.D., Vitruvius, in his work on architecture, described an organ with balanced keys. Some musicians applied the keyboard to stringed instruments in the first part of the 11th century. After the 15th century nearly all the makers of key-stringed instruments used the chromatic scale practically as we find it in the modern piano.
As you can see, the modern keyboard has gone through many changes, however, the basic concept of the key lay-out has been fairly consistent. This is a result of the order in which the whole tones and semi-tones are arranged, and has evolved over centuries.
The invention of the modern computer keyboard began with the invention of the typewriter. Christopher Latham Sholes, who also invented the QWERTY layout, patented the typewriter that we commonly recognize today in 1868. The Remington Company mass-marketed the first typewriters starting in 1877.
Elsewhere, punched-card systems were combined with typewriters to create what were called keypunches. Keypunches were the basis of early adding machines, and IBM was selling over one million dollars' worth of adding machines in 1931.
In 1946, the ENIAC computer used a punched-card reader as its input and output device. In 1948, the BINAC computer used an electromechanically controlled typewriter both to input data directly onto magnetic tape (for feeding the computer data) and to print results.
Several alternatives to QWERTY have been developed over the years, claimed by their designers and users to be more efficient, intuitive and ergonomic. Nevertheless, none has seen widespread adoption, due partly to the sheer dominance of available keyboards and training. The emerging electric typewriter further improved the technological marriage between the typewriter and the computer. There are a number of alternative designs, especially the various split and ergonomic ones. Today the strangest keyboard isn't really a keyboard at all. It's a laser projector the size of a cigarette lighter that projects the image of a keyboard onto any flat surface.
Note! QWERTY is the most common modern-day keyboard layout. The name comes from its first six letters.
True Touch Rollup Keyboard Maltron Ergonomic 3D keyboard
A. Think whether all kinds of keys have been mentioned. Name those that are not given here.
shift key, alt key, control key, escape key, delete key, tab key, caps lock key, backspace key |
B.
standard keyboard, ergonomic keyboard, key in (or type in), enter, data input |
A Joke
Scientists 1 a) prepared / b) were preparing an experiment to ask the ultimate question. They 2 a) had worked / b) were working for months gathering one each of every computer that 3 a) was building / b) was built. Finally the big day 4 a) was / b) is at hand. All the computers 5 a) had been linked / b) were linked together. They asked the question, "Is there a God?". Lights 6 a) started / b) starts blinking, flashing and blinking some more. Suddenly, there 7 a) was / b) were a loud crash, and a bolt of lightning 8 a) has come / b) came down from the sky, 9 a) struck / b) strucked the computers, and welded all the connections permanently together. "There 10 a) are / b) is now", came the reply.
slide rule | рахівниця (abacus)
blunder | зчеплення (clutch)
abacus | клапан (valve)
clutch | логарифмічна лінійка (slide rule)
valve | груба помилка (blunder)
In 1941 Konrad Zuse created the Z3 computing machine, which had all the properties of a modern computer.
In 1942 John Atanasoff of Iowa State University, together with his student Clifford Berry, designed and began to assemble the first electronic digital computing machine in the USA. Although this machine was never completed, it had a great influence on John Mauchly, who created the first electronic computer, ENIAC, in 1946.
At the end of 1943 the British special-purpose computing machine Colossus began to operate.
In 1944 Zuse developed the even faster Z4 computing machine.
In 1950 in Kyiv, under the direction of Academician Lebedev, the MESM, the first electronic computer in continental Europe, was created.
This list of technologies is not exhaustive; it describes only the main line of development of computing technology. At various periods in history the possibility of building computing machines on the basis of many other, now forgotten and sometimes quite exotic, technologies was investigated.
At present serious work is being done on the creation of optical computers. Another promising direction involves using the achievements of molecular biology and DNA research. Finally, one of the newest approaches, capable of bringing enormous changes to computing technology, is based on the development of quantum computers. In most cases, however, the technology from which a computer is built is far less important than the design decisions underlying it.
The computers had been incredibly expensive because |
that proved very popular. |
The IBM company introduced a smaller, more affordable computer in 1954 |
new development as a full-time PC option. |
Laptops are a relatively |
whereas new computers are sleek and smooth. |
Old computers were boxy and big, |
that proved very popular. |
It is arguable which of the early microcomputers |
they required so much hand assembly. |
The History of Computing Technology

First generation (mechanical and electromechanical devices)
Calculators: the Antikythera mechanism, the Difference Engine
Programmable devices: the weaving loom, the Analytical Engine, Mark I, Z3

Second generation (electronic vacuum devices)
Calculators: the Atanasoff-Berry calculator, IBM 604, UNIVAC 60 and 120
Programmable devices: ENIAC, Ferranti Mercury, EDVAC, UNIVAC 1, IBM 701, 702, 650, Z22

Third generation (discrete transistors and integrated circuits)
Mainframes: IBM 7090, 7080, BUNCH
Minicomputers: PDP-8, PDP-11, IBM System/32, System/36, VAX

Fourth generation (very-large-scale integrated circuits)
4-bit computers: Intel 4004, 4040
8-bit computers: Intel 8008, 8080, Motorola 6800, 6809, MOS Technology 6502, Zilog Z80
16-bit computers: Intel 8088, Zilog Z8000, WDC 65816/65802
32-bit computers: Intel 80386, Pentium, Motorola 68000, ARM
64-bit computers: Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64
Embedded computers: Intel 8048, 8051
Personal computers: desktop computer, home computer, portable computer, personal digital assistant (PDA), Tablet PC
Theoretical and experimental projects: quantum computer, chemical computer, DNA computer, optical computer, spintronic computer
A three-dimensional map of an area of the earth's surface constructed by computer.
Applications of Computers
The first computers were created exclusively for computation. Even the most primitive computers far surpass humans in this field. Incidentally, the first high-level programming language was Fortran, intended exclusively for mathematical calculations.
Databases became another sphere of computer application. They were needed above all by governments and banks, which required more sophisticated computers with developed systems for input, output and information storage. The COBOL language was developed for these purposes. Later, database management systems with their own programming languages appeared.
The third application was the control of all kinds of devices. Here development went from highly specialized (often analogue) devices to the gradual introduction of standard computer systems running control programs. In addition, an ever greater share of machinery now includes a controlling computer.
Computers have developed so far that they have become the main information tool both in the office and at home. Now almost any work with information is most often done through a computer, whether storing information or sending it over communication channels. Modern supercomputers are used for computer modelling of complex physical, biological and other processes and for solving applied problems, such as modelling nuclear reactions or climate change. Some projects are carried out by means of distributed computing, when a large number of relatively weak computers work simultaneously on small parts of a common task, thus forming a very powerful computer system.
The most complex and least developed area of computer application is artificial intelligence - using computers to solve problems for which there is no clearly defined, more or less simple algorithm. Examples of such problems are games, text translation, and expert systems.
SPEAKING
Charles Babbage befriended Ada Byron, the daughter of the famous poet Lord Byron. Though she was only 19, she was fascinated by Babbage's ideas, and through letters and meetings with Babbage she learned enough about the design of the Analytic Engine to begin fashioning programs for the still-unbuilt machine. While Babbage refused to publish his knowledge for another 30 years, Ada wrote a series of "Notes" wherein she detailed sequences of instructions she had prepared for the Analytic Engine. The Analytic Engine remained unbuilt (the British government refused to get involved with this one), but Ada earned her spot in history as the first computer programmer. Ada invented the subroutine and was the first to recognize the importance of looping. Babbage himself went on to invent the modern postal system and the ophthalmoscope, which is still used today to examine the eye.
One of the primary programmers for the Mark I was a woman, Grace Hopper. Hopper found the first computer "bug": a dead moth that had gotten into the Mark I and whose wings were blocking the reading of the holes in the paper tape. The word "bug" had been used to describe a defect since at least 1889 but Hopper is credited with coining the word "debugging" to describe the work to eliminate program faults.
In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This language eventually became COBOL which was the language most affected by the infamous Y2K problem. A high-level language is designed to be more understandable by humans than is the binary language understood by the computing machinery. A high-level language is worthless without a program - known as a compiler - to translate it into the binary language of the computer and hence Grace Hopper also constructed the world's first compiler. Grace remained active as a Rear Admiral in the Navy Reserves until she was 79.
Robert Firth
Infocom motto
LISTENING
Lunchboxes, Luggable computers, PDAs (personal digital assistants), subnotebooks, wearable computers, All-in-One PCs, feature, to inspire.
WRITING
William Henry "Bill" Gates III (born October 28, 1955) is the former chief executive and current chairman of Microsoft, the world's largest personal-computer software company, which he co-founded with Paul Allen. He is consistently ranked among the world's wealthiest people. He has also authored and co-authored several books.
Gates is one of the best-known entrepreneurs of the personal computer revolution. Gates donates large amounts of money to various charitable organizations and scientific research programs. He remains at Microsoft as non-executive chairman.
Interestingly, as a Harvard freshman Bill Gates decided to drop out of college so he could concentrate all his time on writing programs for this computer. This early experience put Bill Gates in the right place at the right time when IBM decided to standardize on Intel microprocessors for its line of PCs in 1981. The Intel Pentium 4 used in today's PCs is still compatible with the Intel 8088 used in IBM's first PC.
Steven Paul Jobs (February 24, 1955 - October 5, 2011)
1. Introduction. 2. Main Body. 3. Conclusion.
TEXT FOR LISTENING
The History of Portable Computers
As it turned out, the idea of a laptop-like portable computer existed even before it was possible to build one. Portable, or mobile, computers are by their nature generally microcomputers. Because of their size, portable computers are also commonly known as 'lunchbox' or 'luggable' computers; they can also be called 'portable workstations' or 'portable PCs'. Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology and so on. This evolution ultimately allowed computers to become even smaller, more portable and smarter than laptops: sub-notebooks, PDAs (personal digital assistants), tablets, pocket computers, smartphones, and wearable computers with hands-free interfaces, speech recognition and speech synthesis.
The IBM 5100 Portable Computer, introduced in September 1975, was perhaps the first portable computer.
In 1976 Alan Kay of Xerox PARC designed a portable PC he called the Dynabook and intended it for children. The prototype that grew out of this idea, the NoteTaker, never caught on: only about 10 units were produced.
The first commercialized laptop was the Osborne 1 in 1981, with a small 5″ monitor and a keyboard that sat inside the lid when closed. Later portable computers included the Bondwell 2, released in 1985, which was among the first with an LCD display. The first portable computers to resemble modern laptops in their features were Apple's PowerBooks, which introduced a built-in trackball and, later, a trackpad and optional color LCD screens. IBM's ThinkPad was largely inspired by the PowerBook's design, and the evolution of the two led to laptops and notebook computers as we know them.
Portable computers have been increasing in popularity over the past decade, as they do not restrict the user's mobility the way desktop computers do. Wireless Internet, extended battery life and more comfortable ergonomics have been the factors driving this increase in popularity. All-in-one PCs such as the iMac can also be considered portable computers and often have handles built into the case.
The History of Computers
blunder - a gross mistake
to assign - to appoint, to allot
predominantly - mainly, for the most part
tide chart - a chart of the times and heights of tides
abacus - a counting frame
to slide - to glide along a smooth surface
pebbles - small rounded stones
operand - a quantity on which an operation is performed
ivory stick - a small stick made of ivory
slide rule - a logarithmic calculating ruler
calculating clock - an early mechanical calculating machine
reckoner - a ready-made table used for calculation
fluted drum - a grooved cylinder
circumference - the boundary line of a circle
loom - a machine for weaving cloth
gear-driven (machine) - a machine powered through gears
to weave - to make fabric by interlacing threads
to punch - to make holes in
punched card - a card storing data as punched holes
to deter - to discourage, to hold back
the Hollerith desk - Hollerith's card-tabulating desk
ubiquitous - present everywhere
rotating shaft - a revolving axle
clutch - a coupling for engaging and disengaging a drive
to cram - to squeeze into a small space
valve - a vacuum tube (British usage)
to scroll - to move text up or down on a screen
strain injury - an injury caused by repeated strain
obsolete - out of date, no longer used
FURTHER READING
History of Computer Technology
The evolution of digital computing is often divided into generations. Each generation is characterized by dramatic improvements over the previous generation in the technology used to build computers, the internal organization of computer systems, and programming languages.
The Mechanical Era (1623-1945)
The idea of using machines to solve mathematical problems can be traced at least as far as the early 17th century. Mathematicians who designed and implemented calculators that were capable of addition, subtraction, multiplication and division included Wilhelm Schickhard, Blaise Pascal and Gottfried Leibnitz.
The first multi-purpose, i.e. programmable, computing device was Charles Babbage's Analytical Engine, which grew out of the Difference Engine he began in 1823. Although a complete working machine was never built, Babbage and his colleagues recognized several important programming techniques, including conditional branches, iterative loops and index variables.
By 1853 Georg Scheutz had constructed a machine that could process 15-digit numbers and calculate fourth-order differences. One of the first commercial uses of mechanical computers was by the US Census Bureau, which used punch-card equipment designed by Herman Hollerith to tabulate data for the 1890 census. In 1911 Hollerith's company merged with a competitor to found the corporation that in 1924 became International Business Machines.
First Generation Electronic Computers (1937-1953)
The first general-purpose programmable electronic computer was the Electronic Numerical Integrator and Computer (ENIAC), built by J. Presper Eckert and John V. Mauchly. Work began in 1943. The machine wasn't completed until 1945, but then it was used extensively for calculations during the design of the hydrogen bomb. Eckert, Mauchly, and John von Neumann, a consultant to the ENIAC project, began work on a new machine before ENIAC was finished. The main contribution of EDVAC, their new project, was the notion of a stored program. ENIAC was controlled by a set of external switches and dials; to change the program required physically altering the settings on these controls. Through the use of a memory that was large enough to hold both instructions and data, and using the program stored in memory to control the order of arithmetic operations, EDVAC was able to run orders of magnitude faster than ENIAC.
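The stored-program idea described above can be illustrated with a toy machine, sketched here in Python. The instruction encoding (opcode/operand pairs; 0 = HALT, 1 = ADD, 2 = STORE) is invented purely for illustration and does not correspond to any real machine:

```python
# Toy stored-program machine: instructions and data share one memory,
# so "reprogramming" is just writing different numbers into memory -
# the key EDVAC idea, as opposed to ENIAC's external switches and dials.

def run(memory):
    """Fetch-decode-execute loop over a flat memory of integers."""
    acc, pc = 0, 0                        # accumulator and program counter
    while True:
        op, operand = memory[pc], memory[pc + 1]
        pc += 2
        if op == 0:                       # HALT
            return memory
        elif op == 1:                     # ADD: acc += memory[operand]
            acc += memory[operand]
        elif op == 2:                     # STORE: memory[operand] = acc
            memory[operand] = acc

# Memory layout: program in cells 0-7, data in cells 8-10.
memory = [1, 8,     # ADD   mem[8]
          1, 9,     # ADD   mem[9]
          2, 10,    # STORE -> mem[10]
          0, 0,     # HALT
          20, 22, 0]
run(memory)
print(memory[10])   # -> 42
```

Because the program itself sits in cells 0-7, replacing those numbers changes what the machine computes, with no physical rewiring.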
Software technology during this period was very primitive. The first programs were written out in machine code, i.e. programmers directly wrote down the numbers that corresponded to the instructions they wanted to store in memory. By the 1950s programmers were using a symbolic notation, known as assembly language, then hand-translating the symbolic notation into machine code.
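The hand-translation step can be sketched as a simple lookup-table assembler; the mnemonics and opcode numbers below are invented for illustration, not any historical instruction set:

```python
# Hand-translation from symbolic assembly to machine code, as 1950s
# programmers did with a lookup table of opcode numbers.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 0}

def assemble(lines):
    """Translate lines like 'ADD 9' into flat numeric machine code."""
    code = []
    for line in lines:
        mnemonic, _, operand = line.partition(" ")
        code.append(OPCODES[mnemonic])
        code.append(int(operand) if operand else 0)
    return code

program = ["LOAD 8", "ADD 9", "STORE 10", "HALT"]
print(assemble(program))   # -> [1, 8, 2, 9, 3, 10, 0, 0]
```

The symbolic form is far easier for a human to write and check; the numeric form is what the machine's memory actually holds.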
As primitive as they were, these first electronic machines were quite useful in applied science and engineering. The first problem run on the ENIAC, a numerical simulation used in the design of the hydrogen bomb, required 20 seconds, as opposed to forty hours using mechanical calculators.
Second Generation (1954-1962)
Electronic switches in this era were based on discrete diode and transistor technology. The first machines to be built with this technology included TRADIC at Bell Laboratories in 1954 and TX-0 at MIT's Lincoln Laboratory. Memory technology was based on magnetic cores, which could be accessed in random order.
During this second generation many high level programming languages were introduced, including FORTRAN (1956), ALGOL (1958), and COBOL (1959). Important commercial machines of this era include the IBM 704 and its successors, the 709 and 7094.
The second generation also saw the first two supercomputers designed specifically for numeric processing in scientific applications. They were machines that overlapped memory operations with processor operations and had primitive forms of parallel processing.
Third Generation (1963-1972)
The third generation brought huge gains in computational power. Innovations in this era included the use of integrated circuits, or ICs (semiconductor devices with several transistors built into one physical component); semiconductor memories, which began to be used instead of magnetic cores; microprogramming as a technique for efficiently designing complex processors; the coming of age of pipelining and other forms of parallel processing; and the introduction of operating systems and time-sharing.
Early in the third generation Cambridge and the University of London cooperated in the development of the Combined Programming Language (CPL). CPL was large, with many features that were hard to learn. In 1970 Ken Thompson of Bell Labs developed a simplification of CPL, called simply B, in connection with an early implementation of the UNIX operating system.
Fourth Generation (1972-1984)
The next generation of computer systems saw the use of large scale integration (LSI, 1,000 devices per chip) and very large scale integration (VLSI, 100,000 devices per chip) in the construction of computing elements. At this scale an entire processor could fit onto a single chip, and for simple systems the entire computer (processor, main memory, and I/O controllers) could fit on one chip. Gate delays dropped to about 1 ns per gate.
Developments in software include very high level languages such as Prolog (programming in logic). These languages tend to use a declarative programming style as opposed to the imperative style of Pascal, C and FORTRAN. These languages are not yet in wide use, but are very promising as notations for programs that will run on systems with over 1,000 processors.
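The contrast between the two styles can be shown even within one language; the sketch below uses Python rather than Prolog, computing the same result imperatively (spelling out *how* to loop) and declaratively (stating *what* is wanted):

```python
# Imperative vs declarative style on the same task:
# "the squares of the even numbers in a list".

def squares_imperative(numbers):
    result = []
    for n in numbers:          # explicit step-by-step control flow
        if n % 2 == 0:
            result.append(n * n)
    return result

def squares_declarative(numbers):
    # Declarative reading: no explicit loop bookkeeping; the language
    # supplies the control flow from the description of the result.
    return [n * n for n in numbers if n % 2 == 0]

print(squares_imperative([1, 2, 3, 4]))    # -> [4, 16]
print(squares_declarative([1, 2, 3, 4]))   # -> [4, 16]
```

In a true logic language like Prolog the declarative style goes further still, describing relations rather than computations, but the division of labour is the same: the programmer states the goal, the language system works out the steps.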
Two important events marked the early part of the fourth generation: the development of the C programming language and the UNIX operating system, both at Bell Labs. In 1972, Dennis Ritchie, seeking to generalize Thompson's B, developed the C language. Thompson and Ritchie then used C to write a version of UNIX for the DEC PDP-11. This C-based UNIX was soon ported to many different computers, relieving users from having to learn a new operating system each time they changed computer hardware. UNIX or a derivative of UNIX is now a de facto standard on virtually every computer system.
Fifth Generation (1984-1990)
The development of the next generation of computer systems is characterized mainly by the acceptance of parallel processing. Until this time parallelism was limited to pipelining and vector processing, or at most to a few processors sharing jobs. The fifth generation saw the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at an incredible pace; by 1990 it was possible to build chips with a million components, and semiconductor memories became standard on all computers.
In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace, stimulating a transition from the traditional mainframe computing environment toward a distributed computing environment, in which each user has their own workstation for relatively simple tasks (editing and compiling programs, reading mail) but shares large, expensive resources such as file servers and supercomputers. This period also saw a marked increase in both the quality and quantity of scientific visualization.
Sixth Generation (1990- )
Transitions between generations in computer technology are hard to define, especially as they are taking place. Some changes, such as the switch from vacuum tubes to transistors, are immediately apparent as fundamental changes, but others are clear only in retrospect. Many of the developments in computer systems since 1990 reflect gradual improvements over established systems and thus it is hard to claim they represent a transition to a new “generation”, but other developments will prove to be significant changes.
This generation is beginning with many gains in parallel computing, both in the hardware area and in improved understanding of how to develop algorithms to exploit diverse, massively parallel architectures. Parallel systems now compete with vector processors in terms of total computing power and most expect parallel systems to dominate the future.
Combinations of parallel/vector architectures are well established, and one corporation (Fujitsu) has announced plans to build a system with over 200 of its high-end vector processors. Manufacturers have set themselves the goal of achieving teraflops (10¹² arithmetic operations per second) performance by the middle of the decade, and it is clear this will be obtained only by a system with a thousand processors or more. Workstation technology has continued to improve, with processor designs now using a combination of RISC, pipelining, and parallel processing. As a result, it is now possible to purchase a desktop workstation for about $30,000 that has the same overall computing power (100 megaflops) as fourth-generation supercomputers. This development has sparked an interest in heterogeneous computing: a program started on one workstation can find idle workstations elsewhere in the local network to run parallel subtasks.
One of the most dramatic changes in the sixth generation will be the explosive growth of wide area networking. Network bandwidth has expanded tremendously in the last few years and will continue to improve for the next several years.