

Stack Overflow



Why is GW-BASIC still taught in schools? [closed]

I dunno about the USA and the UK, but in India, schools still teach GW-BASIC. Yes, it's:
10 PRINT "HELLO WORLD"
20 GOTO 10

As far as my experience goes, even writing assembler is easier than this mess of a language. It could easily be replaced by something like Python, which would make it easier for students to actually understand the basic concepts of programming and help them understand the logic behind what they're doing better.

gw-basic

asked Oct 2 '09 at 18:35 by aviraldg (score 6); edited Apr 5 '12 at 14:25 by casperOne
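For comparison, the same two-line program in Python, the replacement the question proposes (an illustration added for this point, not part of the original question):

    # The question's GW-BASIC program, rewritten in Python; there is no
    # unconditional GOTO, so an infinite loop stands in for the backward jump.
    while True:
        print("HELLO WORLD")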

Comments:

- I don't know of any schools that teach this in my area (Northeast USA). – James Jones Oct 2 '09 at 18:40
- So you're proposing to have children at school learn assembly instead of GW-BASIC? Wake-up call: not all babies had punch cards; most of them were playing with teddy bears. – Anax Oct 2 '09 at 18:43
- No, I'm saying that GW-BASIC is harder than asm. – aviraldg Oct 2 '09 at 18:51
- GW-BASIC isn't harder than assembly. I've been programming for 4+ years and I still don't know assembly, while I know GW-BASIC, PHP, JavaScript, Java... – Click Upvote Oct 2 '09 at 18:59
- @aviraldg: I bet you that many children would totally get Scheme far better than college students would. Children are good at grasping difficult, abstract concepts. Like languages. I bet if you hit them early enough with recursion and lambda functions, they'll get them better than either you or I can. – J. Polfer Oct 2 '09 at 20:06

closed as not constructive by casperOne Apr 5 '12 at 14:25


As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance. If this question can be reworded to fit the rules in the help center, please edit the question.

13 Answers
Answer (score 13, accepted):

Because BASIC is the most, uhh... basic introduction to the von Neumann architecture, which is what all modern computers and (by extension) programming languages are based on. Think about it:

- Line numbers = memory addresses
- Variables = CPU registers
- Current line = CPU instruction pointer
- GOTO = jump instruction
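The mapping can be made concrete with a short sketch. The following toy interpreter (a Python illustration added for this point, not code from the answer) treats a line-numbered program exactly this way; the step cap exists only because the question's program never halts:

    # Toy interpreter: the program dict is "memory", its keys (line numbers)
    # are addresses, `pc` is the instruction pointer, and GOTO is a jump.
    def run(program, max_steps=6):
        lines = sorted(program)              # line numbers = memory addresses
        pc = lines[0]                        # current line = instruction pointer
        for _ in range(max_steps):           # capped: 20 GOTO 10 never halts
            op, arg = program[pc]
            if op == "PRINT":
                print(arg)
                later = [n for n in lines if n > pc]
                if not later:
                    break
                pc = later[0]                # fall through to the next "address"
            elif op == "GOTO":
                pc = arg                     # GOTO = jump instruction

    run({10: ("PRINT", "HELLO WORLD"),       # 10 PRINT "HELLO WORLD"
         20: ("GOTO", 10)})                  # 20 GOTO 10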


answered Oct 2 '09 at 18:40, edited Oct 2 '09 at 18:46, by zvolkov

Comments:

- I don't think that's the reason. I think it's just inertia. – Roberto Bonvallet Nov 27 '09 at 23:12
- BASIC should be thrown away. Any modern language like Python would be better suited for teaching. – Rook Jan 16 '10 at 19:17
- @Rook: BASIC tends to be very forgiving (like Ruby), which is nice for a beginning language and requires no prior knowledge; it's hard to get easier than 'print "Hello"' for an entire first program. Line numbers are no longer needed and it can be fully structured. However, forgiving languages (those with loose syntax) are less valuable the larger your team gets, and most developers eventually prefer very strict languages that detect the most errors as soon as possible, even in the editor if possible. This can be frustrating when learning, though. – Bill K Jan 4 '12 at 17:55

Answer (score 11):

Ever try teaching programming to someone with no idea what it's about? I did for 4 years. For absolutely starting out, GW-BASIC is pretty good. You can get the most action for the least effort, while still conveying basic ideas, like:


- The computer finishes one statement before starting the next. (Newbies are inclined to think the computer does everything "at once".)
- A program is like something built out of tinker-toys. There are only a few basic pieces, and you assemble them to make it do what you want. (Newbies often think that since the language has words like IF and PRINT, it will just understand whatever they type in.)
- Variables are a key concept. They have a name that you give them, and they have values that they get when the program runs. That's complicated. The name and the value are not the same thing, and there is a distinction between write-time and run-time.
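A few lines of Python (an added illustration; the answer itself names no language) pin down that last distinction:

    x = 3        # write-time: the name x appears in the source text
    x = x + 1    # run-time: the current value of x is read, then x is rebound
    print(x)     # prints 4; each statement finished before the next began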

Once you get past some basic concepts with the help of GW-BASIC, you can begin to introduce a more modern, disciplined language.

answered Oct 2 '09 at 22:11 by Mike Dunlavey

Answer (score 8):

GW-BASIC was taught to me in 7th grade, about 10 years ago. I found it was a great language and easy to experiment with as a beginner. Even the non-PC-freaks had little problem learning the language. In my opinion it is a great tool to motivate beginners to learn more advanced programming languages.

answered Oct 2 '09 at 18:49 by marg

Answer (score 7):

As far as teaching in India is concerned and why they use GW-BASIC, I can only guess (being from the USA):

1. It's cheap. Perhaps they have received old hardware with GW-BASIC on it. Hey, it's there, it's free, why not use it to teach children?
2. The teacher knows it. If the teacher knows/understands it, he/she can teach it.

At a previous employer, I met a number of people who immigrated to the USA from India and explained that the first time they worked with Windows was when they arrived over here; none of the schools (not even college/university) had it. It might depend on the school they went to, but maybe it's a matter of the available equipment. It's possible this GW-BASIC usage you speak of works the same way: they used what technology they had. Maybe it means they are, well, resourceful.

As to whether it's good that they are learning something so old, I'm not so sure it's such a good idea. But as the famous (American West) folk wisdom says, "Do with what you got. It'll pay off in the end." Better to expose them when they are young.

answered Oct 2 '09 at 19:01, edited Oct 2 '09 at 21:46, by J. Polfer

Answer (score 5):

GW-BASIC is a great language for new programmers. If someone has never done any programming before, something simple like GW-BASIC will be a lot easier for them to comprehend as compared to something like Python. Also, Java has a lot better support for object-oriented programming as compared to C++, and more commercial applications these days are written in Java than C++ [source]. Hence I would say it's a good thing they are switching to Java over C++.

answered Oct 2 '09 at 18:40 by Click Upvote

Comments:

- Well, then by that logic, one could use COBOL! Also, I believe that people should know how things work in the underlying system... which implies C++. A good C++ programmer can pick up Java without flinching, but a Java programmer will probably be in a fix if he has to learn how to do the low-level ops possible in C++. – aviraldg Oct 2 '09 at 18:46
- Why would a Java programmer need to take care of any low-level stuff? All the garbage collection etc. is taken care of in Java without the programmer needing to do anything. And COBOL is arguably a lot more difficult than GW-BASIC. – Click Upvote Oct 2 '09 at 18:52
- I think it depends on the student, and how motivated they are. With assembler, while it may be simpler, it takes longer to do "cool things." To output stuff to the screen takes a couple hundred (simple though) lines of mnemonics. In GW-BASIC, it's one (extremely simple, English-like) line. – J. Polfer Oct 2 '09 at 18:55
- The argument about it being easier is OK, but the point is that: 1) since it is easy, almost anyone can learn it and do it; 2) depending upon their performance in programming BASIC, people are put into a special "Computer Science" section. Incompetence rocks. – aviraldg Oct 2 '09 at 19:11
- Well, in my high school when we had GW-BASIC, I was the only person out of the class of 20-30 who got it. Most people hated it or they just couldn't get the programming concepts. So only people with an aptitude for programming make it past that; most people who aren't cut out for being programmers might pass the class, but they won't go forward with a programming career. – Click Upvote Oct 2 '09 at 20:05

Answer (score 5):

If someone is truly interested in programming, they will take what they learn in that class and apply it to a language learned on their own time.

There's also something to be said for starting in a language that is much less powerful than Java or C++.

answered Oct 2 '09 at 18:45 by Jon Seigel

Answer (score 5):

It's funny how fast humans forget. Remember the first time you struggled with the concept of a loop? With the idea of a variable and how it retained values? With remembering syntax?

BASIC has a relatively small built-in syntax and fairly flexible structures for loops and other constructs. I guess over all it's "loose". This helps a lot in learning.

Loose is very bad for good, stable programs. You want very little flexibility; you want patterns that you can count on and very few options (even if you don't know that this is what you want, you will understand it as soon as you have to lead a team of 5 developers from another country).

If any here haven't really considered it, the reason we don't like BASIC isn't a lack of "power" or speed; it's because it's loose, the exact same reason it's good for teaching. You don't start out running; you learn to crawl in a wobbly sort of way, then you stumble, etc. But once you are running sprints, you really want to make sure that every footfall is placed exactly where you want it, and if the guy ahead of you decides he suddenly wants to start crawling, you're screwed. Of course, if you're running along the track alone or in a small, in-sync team, it doesn't matter much what you do. Feel free to use any language you want :)

answered Oct 2 '09 at 22:22 by Bill K

Comments:

- ++ Nice way to put it. – Mike Dunlavey Nov 29 '09 at 1:27

Answer (score 4):

So you'll learn NOT to use GOTO.

answered Oct 2 '09 at 22:24 by pageman

Comments:

- +1 for humour. – aviraldg Oct 3 '09 at 4:57

Answer (score 2):

It's easy to learn. Schools don't aim to teach new technology; schools want to teach the basics of informatics.

answered Oct 2 '09 at 18:37 by SomeUser

Comments:

- I found Python 1000% easier to learn. Besides, what is the practical use of teaching children GW-BASIC? And what of C++ being replaced by Java? – aviraldg Oct 2 '09 at 18:41
- The practical use is that once someone gets the basics of programming (if/else statements, loops, etc.) in the most simple environment, they can then expand into better programming languages. – Click Upvote Oct 2 '09 at 18:44
- Do you really want to teach people how to program in a language that is dependent on whitespace? – Matthew Whited Oct 2 '09 at 19:17

Answer (score 1):

I think in my school GW-BASIC is still taught in years 6-7 (of 10), and the reason is that little girls and boys can't understand anything harder than BASIC :) Even more, in my university we program in QBasic. o_O OMG, you say? Yeah, I'm shocked too :) Oh, and they promise one semester of C++ in 4th grade... yay!

edited Jan 17 '10 at 8:40 by xCOREx

Answer (score 1):

I am from India and GW-BASIC was my first language, way back in 1995. It was fun. Things have changed now. My school now teaches another BASIC variant, QBASIC, as the first language. Then students move to C++ and Java in standards 8, 9, 10. Hopefully, Python will take over sometime.

answered Jan 16 '10 at 19:10

Answer:

As someone already pointed out, it's plain inertia. It's not so much inexpensive hardware which is the reason; it's just the mindset to continue doing whatever has been going on. Sigh.

answered Apr 12 '10 at 7:03 by Amit

Answer (score 0):

As far as GW-BASIC is concerned, I couldn't agree more. This is why a Ruby programmer known only as "_why the lucky stiff" created an amazing platform for learning to program called "Hackety Hack". He in fact had quite a lot of insight into teaching programming to young people at the Art & Code symposium: http://vodpod.com/watch/2078103-art-code-symposium-hackety-hack-why-the-luckystiff-on-vimeo

answered Oct 2 '09 at 18:47 by apiguy

Comments:

- thanks for the link! – zvolkov Oct 2 '09 at 19:02

Answer (score 0):

I think GW-BASIC is a good tool to teach programming to children. I have been teaching programming to school children for about 10 years. GW-BASIC provides an easy-to-learn environment without going into technical details. If we use some hi-fi programming language to teach kids, they will learn the programming language, not programming. Using GW-BASIC it is easy to teach programming, and we can concentrate on programming techniques rather than discussing the structures of programming languages. It has a very easy, English-like syntax, so students understand it easily.

Another thing to keep in mind is that it's an interpreter for BASIC, so we can execute instructions line by line and can execute any part of the program; this gives students a clear understanding. The direct mode of GW-BASIC is a great help in explaining memory concepts, as we can monitor the changing states of variables (memory addresses and values).

answered Apr 24 '11 at 12:38 by Sohail Nazir

Comments:

- Almost all of the above is also possible in Python. Invalid argument. – aviraldg Oct 10 '11 at 14:05
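Indeed, the line-at-a-time, inspect-as-you-go workflow the answer praises is also what an interactive Python session gives you. A hypothetical session, added here only for comparison:

    >>> radius = 5                    # build state one statement at a time
    >>> area = 3.14159 * radius ** 2
    >>> area                          # inspect any variable at any point
    78.53975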



Von Neumann architecture


From Wikipedia, the free encyclopedia

See also: Stored-program computer and Universal Turing machine § Stored-program computer

[Figure: Von Neumann architecture scheme]

The term Von Neumann architecture, also known as the Von Neumann model or the Princeton architecture, derives from a 1945 computer architecture description by the mathematician and early computer scientist John von Neumann and others, First Draft of a Report on the EDVAC.[1] This describes a design architecture for an electronic digital computer with subdivisions of a processing unit consisting of an arithmetic logic unit and processor registers, a control unit containing an instruction register and program counter, a memory to store both data and instructions, external mass storage, and input and output mechanisms.[1][2]

The meaning of the term has evolved to mean a stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the Von Neumann bottleneck and often limits the performance of the system.[3]

The design of a Von Neumann architecture is simpler than that of the more modern Harvard architecture, which is also a stored-program system but has one dedicated set of address and data buses for reading data from and writing data to memory, and another set of address and data buses for fetching instructions.

A stored-program digital computer is one that keeps its programmed instructions, as well as its data, in read-write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch leads to route data and control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions, and the Von Neumann vs. Harvard distinction applies to the cache architecture, not main memory.
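A minimal sketch of the idea in Python (an added illustration with made-up opcodes, not from the article): one flat memory holds both instructions and data, and a single fetch loop pulls everything through the same path.

    # One flat memory holds instructions and data alike, the defining
    # property of the design. Every fetch, whether of an instruction
    # or of an operand, goes through the same memory.
    memory = [
        ("LOAD", 6),     # 0: acc = memory[6]
        ("ADD", 7),      # 1: acc = acc + memory[7]
        ("STORE", 6),    # 2: memory[6] = acc
        ("PRINT", 6),    # 3: print memory[6]
        ("HALT", None),  # 4: stop
        None,            # 5: unused cell
        40,              # 6: data word
        2,               # 7: data word
    ]

    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]       # instruction fetch: a memory access
        pc += 1
        if op == "LOAD":
            acc = memory[addr]      # data fetch: the same memory
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "PRINT":
            print(memory[addr])     # prints 42
        elif op == "HALT":
            break

A Harvard machine, by contrast, would keep the instruction tuples and the data words in two separate arrays with independent access paths.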

Contents

1. History
2. Development of the stored-program concept
3. Early von Neumann-architecture computers
4. Early stored-program computers
5. Evolution
6. Von Neumann bottleneck
7. Non-von Neumann processors
8. See also
9. References (9.1 Inline; 9.2 General)
10. External links

History
The earliest computing machines had fixed programs. Some very simple computers still use this design, either for simplicity or training purposes. For example, a desk calculator (in principle) is a fixed program computer. It can do basic mathematics, but it cannot be used as a word processor or a gaming console. Changing the program of a fixed-program machine requires re-wiring, restructuring, or re-designing the machine. The earliest computers were not so much "programmed" as they were "designed". "Reprogramming", when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically re-wiring and re-building the machine. It could take three weeks to set up a program on ENIAC and get it working.[4]

With the proposal of the stored-program computer, this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation.

A stored-program design also allows for self-modifying code. One early motivation for such a facility was the need for a program to increment or otherwise modify the address portion of instructions, which had to be done manually in early designs. This became less important when index registers and indirect addressing became usual features of machine architecture. Another use was to embed frequently used data in the instruction stream using immediate addressing. Self-modifying code has largely fallen out of favor, since it is usually hard to understand and debug, as well as being inefficient under modern processor pipelining and caching schemes.

On a large scale, the ability to treat instructions as data is what makes assemblers, compilers and other automated programming tools possible. One can "write programs which write programs".[5] On a smaller scale, repetitive I/O-intensive operations such as the BITBLT image manipulation primitive, or pixel and vertex shaders in modern 3D graphics, were considered inefficient to run without custom hardware. These operations could be accelerated on general-purpose processors with "on the fly compilation" ("just-in-time compilation") technology, e.g., code-generating programs, one form of self-modifying code that has remained popular.

There are drawbacks to the Von Neumann design. Aside from the Von Neumann bottleneck described below, program modifications can be quite harmful, either by accident or design. In some simple stored-program computer designs, a malfunctioning program can damage itself, other programs, or the operating system, possibly leading to a computer crash. Memory protection and other forms of access control can usually protect against both accidental and malicious program modification.
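Continuing the toy-machine sketch above (again with made-up opcodes, purely illustrative), the early use case just described, a program incrementing the address portion of one of its own instructions, looks like this:

    # Stepping through an array by rewriting the address field of the
    # LOAD instruction in place: instructions are just data.
    memory = [
        ("LOAD", 5),         # 0: acc = memory[5]; this address field mutates
        ("PRINT", None),     # 1: print acc
        ("INCADDR", 0),      # 2: bump the address inside instruction 0
        ("JUMPLT", (0, 8)),  # 3: loop while that address is still below 8
        ("HALT", None),      # 4: stop
        10, 20, 30,          # 5-7: the array being scanned
    ]

    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "PRINT":
            print(acc)                        # prints 10, then 20, then 30
        elif op == "INCADDR":
            opcode, addr = memory[arg]        # read the instruction word,
            memory[arg] = (opcode, addr + 1)  # rewrite its address field
        elif op == "JUMPLT":
            target, limit = arg
            if memory[0][1] < limit:
                pc = target
        elif op == "HALT":
            break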

Development of the stored-program concept


The mathematician Alan Turing, who had been alerted to a problem of mathematical logic by the lectures of Max Newman at the University of Cambridge, wrote a paper in 1936 entitled On Computable Numbers, with an Application to the Entscheidungsproblem, which was published in the Proceedings of the London Mathematical Society.[6] In it he described a hypothetical machine which he called a "universal computing machine", now known as the "Universal Turing machine". The hypothetical machine had an infinite store (memory in today's terminology) that contained both instructions and data. John von Neumann became acquainted with Turing while he was a visiting professor at Cambridge in 1935, and also during Turing's PhD year at the Institute for Advanced Study in Princeton, New Jersey, during 1936-37. Whether he knew of Turing's paper of 1936 at that time is not clear.

In 1936 Konrad Zuse also anticipated, in two patent applications, that machine instructions could be stored in the same storage used for data.[7]

Independently, J. Presper Eckert and John Mauchly, who were developing the ENIAC at the Moore School of Electrical Engineering at the University of Pennsylvania, wrote about the stored-program concept in December 1943.[8][9] In planning a new machine, EDVAC, Eckert wrote in January 1944 that they would store data and programs in a new addressable memory device, a mercury metal delay-line memory. This was the first time the construction of a practical stored-program machine was proposed. At that time, he and Mauchly were not aware of Turing's work.

Von Neumann was involved in the Manhattan Project at the Los Alamos National Laboratory, which required huge amounts of calculation. This drew him to the ENIAC project during the summer of 1944. There he joined the ongoing discussions on the design of this stored-program computer, the EDVAC. As part of that group, he volunteered to write up a description of it and produced the First Draft of a Report on the EDVAC,[1] which included ideas from Eckert and Mauchly. It was unfinished when his colleague Herman Goldstine circulated it, with only von Neumann's name on it, to the consternation of Eckert and Mauchly.[10] The paper was read by dozens of von Neumann's colleagues in America and Europe, and influenced the next round of computer designs.

Hence, von Neumann was not alone in developing the idea of the stored-program architecture, and Jack Copeland considers that it is "historically inappropriate, to refer to electronic stored-program digital computers as 'von Neumann machines'".[11] His Los Alamos colleague Stan Frankel said of von Neumann's regard for Turing's ideas:

"I know that in or about 1943 or '44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 ... Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing in so far as not anticipated by Babbage ... Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities."[12]

At the time that the "First Draft" report was circulated, Turing was producing a report entitled Proposed Electronic Calculator, which described in engineering and programming detail his idea of a machine called the Automatic Computing Engine (ACE).[13] He presented this to the Executive Committee of the British National Physical Laboratory on February 19, 1946. Although Turing knew from his wartime experience at Bletchley Park that what he proposed was feasible, the secrecy surrounding Colossus, which was subsequently maintained for several decades, prevented him from saying so. Various successful implementations of the ACE design were produced.

Both von Neumann's and Turing's papers described stored-program computers, but von Neumann's earlier paper achieved greater circulation, and the computer architecture it outlined became known as the "von Neumann architecture". In the 1953 publication Faster than Thought: A Symposium on Digital Computing Machines (edited by B.V. Bowden), a section in the chapter on Computers in America reads as follows:[14]

THE MACHINE OF THE INSTITUTE FOR ADVANCED STUDIES, PRINCETON

In 1945, Professor J. von Neumann, who was then working at the Moore School of Engineering in Philadelphia, where the E.N.I.A.C. had been built, issued on behalf of a group of his co-workers a report on the logical design of digital computers. The report contained a fairly detailed proposal for the design of the machine which has since become known as the E.D.V.A.C. (electronic discrete variable automatic computer). This machine has only recently been completed in America, but the von Neumann report inspired the construction of the E.D.S.A.C. (electronic delay-storage automatic calculator) in Cambridge (see page 130).

In 1947, Burks, Goldstine and von Neumann published another report which outlined the design of another type of machine (a parallel machine this time) which should be exceedingly fast, capable perhaps of 20,000 operations per second. They pointed out that the outstanding problem in constructing such a machine was the development of a suitable memory, all the contents of which were instantaneously accessible, and at first they suggested the use of a special vacuum tube, called the "Selectron", which had been invented by the Princeton Laboratories of the R.C.A. These tubes were expensive and difficult to make, so von Neumann subsequently decided to build a machine based on the Williams memory. This machine, which was completed in June, 1952, in Princeton, has become popularly known as the Maniac. The design of this machine has inspired that of half a dozen or more machines which are now being built in America, all of which are known affectionately as "Johniacs".

In the same book, the first two paragraphs of a chapter on ACE read as follows:[15]

AUTOMATIC COMPUTATION AT THE NATIONAL PHYSICAL LABORATORY

One of the most modern digital computers which embodies developments and improvements in the technique of automatic electronic computing was recently demonstrated at the National Physical Laboratory, Teddington, where it has been designed and built by a small team of mathematicians and electronics research engineers on the staff of the Laboratory, assisted by a number of production engineers from the English Electric Company, Limited. The equipment so far erected at the Laboratory is only the pilot model of a much larger installation which will be known as the Automatic Computing Engine, but although comparatively small in bulk and containing only about 800 thermionic valves, as can be judged from Plates XII, XIII and XIV, it is an extremely rapid and versatile calculating machine.

The basic concepts and abstract principles of computation by a machine were formulated by Dr. A. M. Turing, F.R.S., in a paper read before the London Mathematical Society in 1936, but work on such machines in Britain was delayed by the war. In 1945, however, an examination of the problems was made at the National Physical Laboratory by Mr. J. R. Womersley, then superintendent of the Mathematics Division of the Laboratory. He was joined by Dr. Turing and a small staff of specialists, and, by 1947, the preliminary planning was sufficiently advanced to warrant the establishment of the special group already mentioned. In April, 1948, the latter became the Electronics Section of the Laboratory, under the charge of Mr. F. M. Colebrook.

Early von Neumann-architecture computers

The First Draft described a design that was used by many universities and corporations to construct their computers.[16] Among these various computers, only ILLIAC and ORDVAC had compatible instruction sets.

- Manchester Small-Scale Experimental Machine (SSEM), nicknamed "Baby" (University of Manchester, England), made its first successful run of a stored program on June 21, 1948.
- EDSAC (University of Cambridge, England) was the first practical stored-program electronic computer (May 1949).
- Manchester Mark 1 (University of Manchester, England), developed from the SSEM (June 1949).
- CSIRAC (Council for Scientific and Industrial Research), Australia (November 1949).
- ORDVAC (U-Illinois) at Aberdeen Proving Ground, Maryland (completed November 1951).[17]
- IAS machine at Princeton University (January 1952).
- MANIAC I at Los Alamos Scientific Laboratory (March 1952).
- ILLIAC at the University of Illinois (September 1952).
- BESM-1 in Moscow (1952).
- AVIDAC at Argonne National Laboratory (1953).
- ORACLE at Oak Ridge National Laboratory (June 1953).
- BESK in Stockholm (1953).
- JOHNNIAC at RAND Corporation (January 1954).
- DASK in Denmark (1955).
- WEIZAC in Rehovoth (1955).
- PERM in Munich (1956?).
- SILLIAC in Sydney (1956).

Early stored-program computers


The date information in the following chronology is difficult to put into proper order. Some dates are for first running a test program, some dates are the first time the computer was demonstrated or completed, and some dates are for the first delivery or installation.

- The IBM SSEC had the ability to treat instructions as data, and was publicly demonstrated on January 27, 1948. This ability was claimed in a US patent.[18] However, it was partially electromechanical, not fully electronic. In practice, instructions were read from paper tape due to its limited memory.[19]
- The Manchester SSEM (the Baby) was the first fully electronic computer to run a stored program. It ran a factoring program for 52 minutes on June 21, 1948, after running a simple division program and a program to show that two numbers were relatively prime.
- The ENIAC was modified to run as a primitive read-only stored-program computer (using the Function Tables for program ROM) and was demonstrated as such on September 16, 1948, running a program by Adele Goldstine for von Neumann.
- The BINAC ran some test programs in February, March, and April 1949, although it was not completed until September 1949.
- The Manchester Mark 1 developed from the SSEM project. An intermediate version of the Mark 1 was available to run programs in April 1949, but it was not completed until October 1949.
- The EDSAC ran its first program on May 6, 1949.
- The EDVAC was delivered in August 1949, but it had problems that kept it from being put into regular operation until 1951.
- The CSIR Mk I ran its first program in November 1949.
- The SEAC was demonstrated in April 1950.
- The Pilot ACE ran its first program on May 10, 1950, and was demonstrated in December 1950.
- The SWAC was completed in July 1950.
- The Whirlwind was completed in December 1950 and was in actual use in April 1951.
- The first ERA Atlas (later the commercial ERA 1101/UNIVAC 1101) was installed in December 1950.

Evolution

[Figure: single system bus evolution of the architecture]

Through the decades of the 1960s and 1970s, computers generally became both smaller and faster, which led to some evolutions in their architecture. For example, memory-mapped I/O allows input and output devices to be treated the same as memory.[20] A single system bus could be used to provide a modular system with lower cost. This is sometimes called a "streamlining" of the architecture.[21] In subsequent decades, simple microcontrollers would sometimes omit features of the model to lower cost and size. Larger computers added features for higher performance.
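A sketch of memory-mapped I/O in the same illustrative style as the earlier toy machine (the port address and "device" are made up): a store to one reserved address is intercepted and routed to an output device, so the CPU needs no separate I/O instructions.

    OUTPUT_PORT = 100                 # a reserved address, chosen arbitrarily

    def store(memory, addr, value):
        if addr == OUTPUT_PORT:
            print("device received:", value)  # this store *is* the output
        else:
            memory[addr] = value              # an ordinary memory write

    memory = [0] * 16
    store(memory, 3, 42)              # plain write to memory
    store(memory, OUTPUT_PORT, 7)     # same operation, but it reaches a device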

Von Neumann bottleneck


The shared bus between the program memory and data memory leads to the Von Neumann bottleneck: the limited throughput (data transfer rate) between the CPU and memory compared to the amount of memory. Because program memory and data memory cannot be accessed at the same time, throughput is much smaller than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continually forced to wait for needed data to be transferred to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, a problem whose severity increases with every new generation of CPU.

The term "von Neumann bottleneck" was coined by John Backus in his 1977 ACM Turing Award lecture. According to Backus:

"Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it."[22][23]

The performance problem can be alleviated (to some extent) by several mechanisms: providing a cache between the CPU and the main memory, providing separate caches or separate access paths for data and instructions (the so-called Modified Harvard architecture), using branch predictor algorithms and logic, and providing a limited CPU stack or other on-chip scratchpad memory to reduce memory accesses. The problem can also be sidestepped somewhat by using parallel computing, for example the Non-Uniform Memory Access (NUMA) architecture; this approach is commonly employed by supercomputers.

It is less clear whether the intellectual bottleneck that Backus criticized has changed much since 1977. Backus's proposed solution has not had a major influence.[citation needed] Modern functional programming and object-oriented programming are much less geared towards "pushing vast numbers of words back and forth" than earlier languages like Fortran were, but internally, that is still what computers spend much of their time doing, even highly parallel supercomputers. As of 1996, a database benchmark study found that three out of four CPU cycles were spent waiting for memory. Researchers expect that increasing the number of simultaneous instruction streams with multithreading or single-chip multiprocessing will make this bottleneck even worse.[24]
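The arithmetic of the bottleneck can be sketched with a back-of-envelope model (all figures below are hypothetical, chosen only for illustration, not measurements):

    cpu_ops_per_sec = 3e9       # hypothetical 3 GHz core, one op per cycle
    bus_bytes_per_sec = 10e9    # hypothetical 10 GB/s shared bus
    bytes_per_word = 8

    # If every operation needs one fresh word over the shared bus,
    # bus bandwidth, not CPU speed, sets the ceiling.
    words_per_sec = bus_bytes_per_sec / bytes_per_word        # 1.25e9
    utilization = min(1.0, words_per_sec / cpu_ops_per_sec)   # ~0.42
    print(f"CPU busy {utilization:.0%}, stalled {1 - utilization:.0%}")
    # -> CPU busy 42%, stalled 58%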

Non-von Neumann processors


- The National Semiconductor COP8 was introduced in 1986; it has a Modified Harvard architecture.[25][26]
- Perhaps the most common kind of non-von Neumann structure used in modern computers is content-addressable memory (CAM).
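The contrast with a conventional memory can be stated schematically (an added illustration, not from the article): instead of reading the value stored at an address, a CAM reports which addresses hold a given value.

    # Content-addressable lookup: find every address whose cell matches a
    # search word. A hardware CAM compares all cells in parallel; this
    # sketch does the same comparison sequentially.
    memory = [7, 3, 9, 3, 1]

    def cam_lookup(memory, word):
        return [addr for addr, cell in enumerate(memory) if cell == word]

    print(cam_lookup(memory, 3))   # [1, 3]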

See also

- CARDboard Illustrative Aid to Computation
- Harvard architecture
- Interconnect bottleneck
- Little man computer
- Modified Harvard architecture
- Random access machine
- Turing machine

References
Inline
1. von Neumann, John (1945), First Draft of a Report on the EDVAC, retrieved August 24, 2011.
2. Ganesan 2009.
3. Markgraf, Joey D. (2007), The Von Neumann Bottleneck, retrieved August 24, 2011.
4. Copeland 2006, p. 104.
5. "MFTL (My Favorite Toy Language)" entry, Jargon File 4.4.7, retrieved 2008-07-11.
6. Turing, A.M. (1936), "On Computable Numbers, with an Application to the Entscheidungsproblem", Proceedings of the London Mathematical Society, 2 (1937), 42: 230-65, doi:10.1112/plms/s2-42.1.230 (and Turing, A.M. (1938), "On Computable Numbers, with an Application to the Entscheidungsproblem. A correction", Proceedings of the London Mathematical Society, 2 (1937), 43 (6): 544-6, doi:10.1112/plms/s2-43.6.544).
7. "Electronic Digital Computers", Nature 162, September 25, 1948: 487, doi:10.1038/162487a0, retrieved 2009-04-10.
8. Lukoff, Herman (1979), From Dits to Bits...: A Personal History of the Electronic Computer, Robotics Press, ISBN 978-0-89661-002-6.
9. ENIAC project administrator Grist Brainerd's December 1943 progress report for the first period of the ENIAC's development implicitly proposed the stored-program concept (while simultaneously rejecting its implementation in the ENIAC) by stating that "in order to have the simplest project and not to complicate matters" the ENIAC would be constructed without any "automatic regulation".
10. Copeland 2006, p. 113.
11. Copeland, Jack (2000), A Brief History of Computing: ENIAC and EDVAC, retrieved January 27, 2010.
12. Copeland, Jack (2000), A Brief History of Computing: ENIAC and EDVAC, retrieved January 27, 2010, which cites Randell, B. (1972), "On Alan Turing and the Origins of Digital Computers", in Meltzer, B.; Michie, D., Machine Intelligence 7 (Edinburgh: Edinburgh University Press): 10, ISBN 0-902383-26-4.
13. Copeland 2006, pp. 108-111.
14. Bowden 1953, pp. 176-177.
15. Bowden 1953, p. 135.

16. "Electronic Computer Project", Institute for Advanced Study, retrieved May 26, 2011.
17. James E. Robertson (1955), Illiac Design Techniques, report number UIUCDCS-R-1955-146, Digital Computer Laboratory, University of Illinois at Urbana-Champaign.
18. F.E. Hamilton, R.R. Seeber, R.A. Rowley, and E.S. Hughes (January 19, 1949), "Selective Sequence Electronic Calculator", US Patent 2,636,672, issued April 28, 1953, retrieved April 28, 2011.
19. Herbert R.J. Grosch (1991), Computer: Bit Slices From a Life, Third Millennium Books, ISBN 0-88733-085-1.
20. C. Gordon Bell; R. Cady; H. McFarland; J. O'Laughlin; R. Noonan; W. Wulf (1970), "A New Architecture for Mini-Computers: The DEC PDP-11", Spring Joint Computer Conference: 657-675.
21. Linda Null; Julia Lobur (2010), The Essentials of Computer Organization and Architecture (3rd ed.), Jones & Bartlett Learning, pp. 36, 199-203, ISBN 978-1-4496-0006-8.
22. Backus, John W., "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs", doi:10.1145/359576.359579.
23. Dijkstra, Edsger W., "E. W. Dijkstra Archive: A review of the 1977 Turing Award Lecture", retrieved 2008-07-11.
24. Richard L. Sites, Yale Patt, "Architects Look to Processors of Future", Microprocessor Report, 1996.
25. "COP8 Basic Family User's Manual", National Semiconductor, retrieved 2012-01-20.
26. "COP888 Feature Family User's Manual", National Semiconductor, retrieved 2012-01-20.

General

- Bowden, B.V., ed. (1953), Faster Than Thought: A Symposium on Digital Computing Machines, London: Sir Isaac Pitman and Sons Ltd.
- Rojas, Raúl; Hashagen, Ulf, eds. (2000), The First Computers: History and Architectures, MIT Press, ISBN 0-262-18197-5.
- Davis, Martin (2000), The Universal Computer: The Road from Leibniz to Turing, New York: W. W. Norton & Company Inc., ISBN 0-393-04785-7; re-published as Davis, Martin (2001), Engines of Logic: Mathematicians and the Origin of the Computer, New York: W. W. Norton & Company, ISBN 978-0-939-32229-3.
- Backus, John (1977), Can Programming Be Liberated from the von Neumann Style?, ACM Turing Award Lecture; Communications of the ACM, August 1978, Volume 21, Number 8. Online PDF.
- C. Gordon Bell and Allen Newell (1971), Computer Structures: Readings and Examples, McGraw-Hill Book Company, New York. Massive (668 pages).
- Copeland, Jack (2006), "Colossus and the Rise of the Modern Computer", in Copeland, B. Jack, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford: Oxford University Press, ISBN 978-0-19-284055-4.

- Ganesan, Deepak (2009), The Von Neumann Model, retrieved October 22, 2011.

External links

- Harvard vs. von Neumann
- A tool that emulates the behavior of a von Neumann machine
- JOHNNY: a simple open-source simulator of a von Neumann machine for educational purposes



Navigation menu

Create account Log in Article Talk Read Edit View history

Main page Contents Featured content Current events Random article Donate to Wikipedia

Interaction

Help About Wikipedia Community portal Recent changes Contact page

Toolbox Print/export Languages


Asturianu Bosanski Catal esky Deutsch Espaol Euskara Franais Hrvatski Bahasa Indonesia slenska Italiano Latina Latvieu Magyar

Nederlands Norsk bokml Polski Portugus Romn Shqip Slovenina / srpski Srpskohrvatski / Suomi Svenska Trke Winaray Edit links This page was last modified on 24 August 2013 at 00:29. Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization. Privacy policy About Wikipedia Disclaimers Contact Wikipedia Developers Mobile view
