Talk:History of computing

From Citizendium
Revision as of 12:56, 24 May 2007 by Greg Woodhouse (My position - What should be included?)


Article Checklist for "History of computing"
Workgroup category or categories: Computers Workgroup, History Workgroup [Categories OK]
Article status: Developing article (beyond a stub, but incomplete)
Underlinked article? No
Basic cleanup done? Yes
Checklist last edited by Greg Woodhouse 11:02, 24 May 2007 (CDT)

To learn how to fill out this checklist, please see CZ:The Article Checklist.

first draft

I've copied this initial stuff out of Computers into here. It may seem a bit incomplete or awkward for now, but it's essential to reduce the size of the top-level Computer article. Someone please take this and own it! Pat Palmer 15:23, 23 April 2007 (CDT)

I'll assume the helm, lieutenant. Stand down! --Robert W King 15:28, 23 April 2007 (CDT)
Thank you! It's a big job. Pat Palmer 18:47, 23 April 2007 (CDT)
I'm considering blanking the whole article and starting over. Any objection? The Wikipedia entry was/is such a mess to sort through.--Robert W King 13:08, 24 April 2007 (CDT)

I am totally rewriting the article for more completeness, so it doesn't read like a long essay. In fact, it almost looks like the original Wikipedia version came from someone's term paper.--Robert W King 13:55, 24 April 2007 (CDT)

more source material?

There might be some useful source material in this archive. Pat Palmer 18:46, 23 April 2007 (CDT)

suggestions for structure

It might be helpful to cover each previous century separately, and then in the 20th century have a section for each decade. Or something. Pat Palmer 18:48, 23 April 2007 (CDT)

brainstormed list of items to include

Please add to this list anyone you think ought to be included somewhere; we can mark them off once dealt with. Pat Palmer 09:29, 11 May 2007 (CDT)

Now trying to start categorizing the list (I will strike items as soon as they are incorporated in the article, or a group decision has been made never to incorporate them). Pat Palmer 11:11, 12 May 2007 (CDT)

Pioneering people:

Early machines:

  • ENIAC - important early machine ADDED TO ARTICLE
  • EDVAC - important early machine ADDED TO ARTICLE
  • Colossus - important early machine ADDED TO ARTICLE

Industry things:

  • DARPA, IETF and RFCs - led to invention of networks
  • Turing awards - as important in computing as, say, the Nobel Prize is in physics
  • Apple Inc., Burroughs Corporation, IBM, Minneapolis-Honeywell (Honeywell Labs), RCA, Sperry Rand, Sylvania Electric Products, Microsoft, Tandy Radio Shack, DEC, etc. (but where do we stop? do we go on to include Sun, Oracle, SAP, Google and the newer big players?)
  • key de facto standards (where the marketplace voted in a good idea)
  • official standards bodies, and their role and key standards
  • the internet (a diametrically opposed way of doing telecom and everything else; culturally very different from telecom) - some articles already started

Software evolution:

  • Unix, Multics, Mac OS X, Windows, Linux
  • first spreadsheet - now, what was that guy's name?
  • operating system evolution (pointing off to many other articles)
  • programming language evolution (pointing off to many other articles)
  • cryptography
  • security
  • advanced software such as AI and machine learning applications (many people don't even know these things exist, or can be done, but they are becoming important in the stock market, military, security, etc.)
  • codecs (a branch of mathematics) leading to Voice over Internet Protocol (VoIP), or digital streaming media communications standards.

Computer design evolution:

  • batch processing to multiprocessing to multithreading
  • memory management, especially virtual memory
  • character sets
  • processing speedups (maybe these belong in computer architecture?)
  • changing memory technologies (mix of hardware and software, very complex)

Personal computing: (several articles already exist, though their variety and structure is still in flux)

  • invention of single-chip microprocessor
  • CP/M, Commodore, Tandy Radio Shack

Special computers:

  • NASA's computers for the Apollo moon voyages
  • supercomputers (Cray, etc.)
  • embedded computers and gadgets
  • telecom and fiber-optic communications (an entire specialized industry based almost solely on special computers, realtime software, and some specialized hardware)

Everything not fitting somewhere else:


I confess, I have been seriously slacking on this article. There's a lot more to be worked in of course, and I'll try to get into it more today and the following week.--Robert W King 09:15, 11 May 2007 (CDT)
No need to feel pressure. This is a huge topic. Books have been written (and I own some of them), he he. I think it will take a long time to get this one ready for prime time, but we have to start somewhere, so I'm starting the list above. I will have a lot more to add to it. I need to look up my old notes from when I last taught this, and we need to construct a timeline. The timeline would then branch off into deeper articles about the invention of each thing. That's one way it could be structured to prevent it from becoming book-sized, anyway. Pat Palmer 09:33, 11 May 2007 (CDT)
I think a goal for this article should more or less be to go from the very initial need to identify quantities of things, to the concept of binary, to the development of the first digital computer, and then very briefly zoom through the last 80 years or so of computing history. I don't think every single technological landmark needs to be covered in this particular article, because each could be discussed at great length in its own segment.--Robert W King 10:10, 11 May 2007 (CDT)

archive of strike-outs

User:Robert_W_King has struck out a number of items from the brainstorm list. I'm going to unstrike them on the grounds that this is brainstorming, and we should postpone evaluating until all the information is in. Furthermore, I don't necessarily agree with his reasons as stated in the edit notes. The items he struck are shown here: Pat Palmer 07:43, 12 May 2007 (CDT)

Apologies for strikeouts; it was a bad decision on my part. --Robert W King 10:05, 14 May 2007 (CDT)
  • Thomas Edison - for the Edison effect, which led to the triode, which became the vacuum tube switch. Struck with the note: "Edison would be better for something like 'History of the Transistor'." (Pat's note: this is now covered on the Electronic switch#Vacuum tube page.)
  • DARPA, IETF and RFCs - led to invention of networks. Struck with the note: "networks were developed after the first real 'digital computer'." (Pat answers: everything except the first computer was developed "after"; so what? It is part of the history of computing; it may end up being covered by other articles and just pointed to here, but we need it as a placemarker.)
I think that networks are more or less a way to move data between nodes; is this the history of computing or the history of the computer? I think I may have confused it.--Robert W King 10:09, 14 May 2007 (CDT)
  • the internet (a diametrically opposed way of doing telecom and everything else; culturally very different from telecom) - some articles already started. Struck with the note: "History of Networks". (Pat answers: same comment as above.)
(see above)
  • operating system evolution (pointing off to many other articles). Struck with the note: "Not necessarily about the development of the 'computer' in the core context." (Pat answers: I'm not sure I agree. The name here is "computing", not "computers"; OS and hardware are built to interwork very closely.)
  • Apple, IBM, Microsoft (Software company), DEC, Burroughs etc. (but where do we stop? do we go on to include Sun, Oracle, SAP, Google and the newer big players?) (Pat answers: I'm not ready to exclude software from "history of computing".)
  • security. Struck with the note: "Tangetal." (Pat answers: "tangential"? Today it's mainstream.)
What I meant by "tangetal" was that although it is highly integrated into computing, I think it should get its own topic, with a very summarized reference to that topic here.--Robert W King 10:09, 14 May 2007 (CDT)
  • codecs (a branch of mathematics) leading to Voice over Internet Protocol (VoIP), or digital streaming media communications standards (Pat answers: this is a matter for discussion; I've worked on such projects that use special signal processors, which is hardware, and also there is related software; seems like "computing" to me)
Is the implementation of signal processing the same thing as development of a codec? If these things are the same then it really ought to go onto a page such as "DSP" or some other type of signal processing entry. --Robert W King 10:09, 14 May 2007 (CDT)
Please don't strike out more of the items on the list; they are just suggestions at this point and of course are subject to debate. Instead, please register your opinions as comments below (begin with multiple colons) or as new sections on the page. I assure you we will listen to your opinions. Pat Palmer 07:43, 12 May 2007 (CDT)

ideas on how to get started

This article seems like a mission impossible. I think the trick is to keep this article as short as possible, while also making it complete by pointing off to other articles for deep detail. At the same time, we need a 'compelling narrative'; what a story it is. Good luck getting started! Pat Palmer 09:41, 11 May 2007 (CDT)

reminder to coordinate with CPU article

Some of the history of computing is currently incorporated in the CPU article. Pat Palmer 10:25, 12 May 2007 (CDT)

potential subtopics/directions

Since this article is "History of computing" and not "History of computers", I think it probably ought to talk about the development of the idea of computability (Church, Turing, Gödel), and the idea of a von Neumann machine. It also might be worth talking about the development of virtual machines and microcode (and, of course, the subsequent shift towards RISC architectures as a kind of counter-movement). What I have in mind is that developments in software and operating systems have had a profound influence on the direction(s) that computing has taken. I would also consider pointing out that computability theory started out trying to model notions of computation as carried out by humans, and only later became closely associated with computing devices.

Propose move to "History of computers"

Should this be about the history of computers themselves or the history of the use of computers? I think because the focus is more or less from a technological standpoint (if it is agreed that is the case) then I believe it should be renamed. The original WP topic also seems to cover the evolution of the computer more than computing itself. Agree/Disagree? --Robert W King 10:14, 14 May 2007 (CDT)

I intend for it to cover development of software technology as well as hardware. I am feeling a little pressured right now. It may need to be renamed, but I request that we wait just a bit to see how it forms up. The way I write is, I start brainstorming, and trying things, and out of that, a form mysteriously starts to appear. I've never been able just to lay down an outline from the get-go (at least, not one that remains stable). I tinker and experiment, and sometimes I am rewarded by creating something special. And then sometimes, that doesn't happen, of course! I envision this article, whatever it ends up being called, as pointing off to other "histories", such as "history of operating systems" or "history of virtual memory" or "history of the compiler". At present, these little histories are scattered here and yonder, often buried within the articles about the topics. I think the history of a technology is somewhat of a different undertaking than the description of what it is now, though they are related. Well... as I write here, I can see that I am confusing even myself. Rename it if you think it ought to be. But I don't consider this article anywhere near done. It's in its very early stages. Pat Palmer 11:57, 14 May 2007 (CDT)
This article is not about "use" of computers; I intend it to be about evolution of the hardware, software and industry from the 1950's until now, including networks and telecommunications, which are foundationally based on computers, albeit special ones. This is intended to be the umbrella article. These days, universities have "history of science" programs; they train historians, not technologists, to write about technology. That's the kind of person I'd like to help out here. "History of computing" is a subfield within that. Entire books exist about it. This article should refer to some of them before it's done. We need a coherent strategy for articles such as compilers, computer networks, and operating systems to break out their history into separate sections, for separate maintenance. It's relatively easy to get the current discussion of a technology done, but when you try to add the history, it bogs down and takes forever. By breaking these articles apart, we may find it easier to approve the non-history parts quickly. Pat Palmer 12:06, 14 May 2007 (CDT)
Pat, this article is now up to you. Unfortunately, my feelings about what material should be included are too radically different from your view, and I'd rather just back away than create conflict.--Robert W King 09:25, 15 May 2007 (CDT)

Charles Babbage section

I'm trying to determine how best to explain the development from using counting devices to the Pascaline to the development of Babbage's machines. I don't want to add any more to Charles Babbage (specifically about his life, work, Lovelace, etc.), but I think it's important to recognize how one piece of history relates to the next. --Robert W King 14:14, 14 May 2007 (CDT)

Well, how many people will have seen mechanical adding machines? Am I showing my age here? (I think I know the answer to that question!) At any rate, a picture of the mechanism in a mechanical adding machine would go a long way toward illustrating the concept. You could also start with the abacus and talk about using cog wheels to automate the "carry" process. Greg Woodhouse 14:21, 14 May 2007 (CDT)
Greg, you read my mind. I'm actually working on that in the Pascaline section!--Robert W King 14:24, 14 May 2007 (CDT)
Pat, I'm not very happy with the edit you made to the Charles Babbage section. I feel that a lot of pertinent information in this history of computing was taken out.--Robert W King 16:26, 14 May 2007 (CDT)

please make SEPARATE articles for each person

Charles Babbage has his own article, and most stuff should go there. I'd like to see this article contain only a short overview pointing off to Charles Babbage. I think we should do the same thing for Herman Hollerith, Blaise Pascal, and any others. This is the framework article; the guts are in other articles. Otherwise, this thing will very quickly reach an unmanageable size. Pat Palmer 16:19, 14 May 2007 (CDT)

For the record, I'm trying to keep it down to two paragraphs (roughly) to contain the concept developed by each person who contributed to the history of computing. I totally agree that there shouldn't be an entire bio; but the information that should be recognized from each individual is important enough to reference (providing some background), so that someone who actually reads the article would have a rough idea or understanding of how computers got from A to Z (Z being some variable which represents your chosen state of "now").--Robert W King 16:23, 14 May 2007 (CDT)
Additionally, the bio should cover everything about his life. In this article, it's only fair to state his contributions.--Robert W King 16:29, 14 May 2007 (CDT)
I have moved the Babbage details to the Charles Babbage article. I just want this article to give a brief intro and then point off to other articles. Pat Palmer 16:32, 14 May 2007 (CDT)

Fame -- and personal computers

Hey, looks as though this entry is shaping up wonderfully!

I do think, though, that we should avoid terms like "Famous" in subheaders for people and concepts. Many quite significant people and concepts are not well-known, and some who are quite well known are not terribly significant. I think we should exercise judgment, choose criteria, and not let "fame" do this work for us!

I changed it to "key" for the moment. That's really just a placeholder for now until someone gets time to work on it more. THANK YOU for dropping by to help. Pat Palmer 12:14, 15 May 2007 (CDT)

Also -- will there be an entry here on the history of personal computers, or ought that to be a separate topic? You could go all the way back to the Simon in 1950, and all the way up to the dual-core Macs and PCs of today. Another area not very well covered over at the "other" encyclopedia ... Russell Potter 18:05, 14 May 2007 (CDT)

I am urging authors to break out histories to subarticles; history of personal computers would make a good one (if coordinated with personal computer to avoid duplication). Feel free to start that one! I thought I'd heard it all, but I don't remember the Simon. :=) Pat Palmer 12:12, 15 May 2007 (CDT)

Shannon and boolean gates?

You may be right, but I'm not certain Shannon is the right person to credit with the idea of implementing boolean operations through switches. He certainly did pioneer information theory, but I'd have to do some research before I felt comfortable crediting him with the idea of implementing logical operations through switches. Or did I misunderstand? Greg Woodhouse 18:53, 14 May 2007 (CDT)

Here's the link: Claude Shannon; please read the 3rd sentence on the page--Shannon was, as far as I can tell, the first to note the crucial link between Boolean algebra and logic design principles! Pat Palmer 17:24, 25 April 2007 (CDT)
George Boole was actually responsible for Boolean operations. See (http://www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect4.html); his work dates circa 1854. --Robert W King 12:13, 15 May 2007 (CDT)
No one is disputing that George Boole created Boolean algebra. Claude Shannon first published work noting that Boolean algebra could be applied to hardware "logic design" (gate design). His master's thesis from MIT was published in a journal in 1938 and created quite a stir because of the idea. Pat Palmer 15:44, 15 May 2007 (CDT)

I'm not convinced. When you go down to section II of the article, the claim actually made is something different: that Shannon introduced the idea of using binary codes to represent information. In other words, the state of a switch can be represented using binary data, but that says nothing about the operation of a switch.

Now, what Shannon did do - and this is huge - is study the fundamental limits imposed on the transfer of information by available bandwidth (basically, the number of binary digits available), both in the presence of noise and on purely noiseless channels. You may know of the theoretical limit on the achievable data rate on a physical medium with a given bandwidth (this time, the width of the frequency band used) as the "Shannon limit". Greg Woodhouse 12:25, 15 May 2007 (CDT)
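(Editorial aside: for readers wanting the disputed idea made concrete, here is a toy Python sketch -- every name in it is hypothetical, and nothing is taken from Shannon's actual thesis -- of the observation usually credited to him: switching circuits obey Boolean algebra, with switches in series acting as AND and switches in parallel as OR.)

    # Toy model: a switch is a boolean (True = closed / conducting).
    def series(a, b):
        # Switches wired in series conduct only if both are closed: Boolean AND.
        return a and b

    def parallel(a, b):
        # Switches wired in parallel conduct if either is closed: Boolean OR.
        return a or b

    # A lamp wired through one switch in series with a parallel pair
    # lights exactly when the Boolean expression a AND (b OR c) is true.
    for a in (False, True):
        for b in (False, True):
            for c in (False, True):
                assert series(a, parallel(b, c)) == (a and (b or c))
    print("circuit behavior matches the Boolean expression a AND (b OR c)")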

Feel free to start the subarticle on Shannon, but please don't start talking about his information theory in the paragraph for 1938. I'm archiving someone's comments here:
To elaborate a bit, Shannon addressed the fundamental problem 
of determining how efficiently switches could operate (in 
practical terms, how long it takes to "complete the call"). 
He not only showed that there was a theoretical limit on 
efficiency, but computed it, and laid the groundwork for 
designing efficient switches (and, more generally, telecommunications networks).
This is good stuff, but it belongs in the article Claude Shannon, which the history of computing article will end up aiming at in several places. Shannon was awesome, a polymath who made major contributions in a number of areas, both theoretical and practical. I have added a reference on his master's thesis which is from the Bell Laboratories site (highly credible). I haven't read the thesis myself. For grins, search on "Claude Elwood Shannon" at Amazon.com and see what pops up! Pat Palmer 12:52, 15 May 2007 (CDT)

continuation of first draft

I'm merging Pat's existing version with mine, and the work will continue from there. See http://forum.citizendium.org/index.php/topic,931.0.html. --Robert W King 15:34, 15 May 2007 (CDT)

Yes, to all, User:Robert_W_King is going to guide this article from now on. I intend to get busy on some other projects. Thank you, Robert! Pat Palmer 15:46, 15 May 2007 (CDT)

Comments

Great start all. However, author-me removed all the headings! "Why?!?"

Because the article really reads like a collection of modular vignettes rather than a narrative. Try reading it now without the headings and see how that fact really glares out at you. Removing the barriers of those headings, as I did, can really help facilitate this article to be written as one whole, as one flowing narrative, from start to end.

Try it and see. And have a look at CZ:Introduction_to_CZ_for_Wikipedians#Get_ready_to_rethink_how_to_write_encyclopedia_articles.21 and CZ:Article_Mechanics#Narrative_coherence_and_flow.

So how about we leave those headings out a good while and work on that? I'd be happy to do the busy work of adding them back in, come time. I'll try to help "narratize" this too.

Stephen Ewen 23:02, 15 May 2007 (CDT)

Stephen, which do you think would be the best solution? In one sense, this forces the user to read the whole article to get an understanding of the topic. On the other hand, modularity creates a chronological relationship that the user can visually reference. What do you feel is the best approach? --Robert W King 08:49, 16 May 2007 (CDT)
I think that the choice need not be made. Readers can have the best of both. Create it to read as one whole. Later add headings. People who read it as a whole will have a narrative. People who read for isolated points will have their full cake too. :-) Stephen Ewen 15:11, 16 May 2007 (CDT)
What do you think of it now? --Robert W King 15:16, 16 May 2007 (CDT)

Telecom, Strowger, Shannon

I see that there has been some information lost concerning all references to telephony switches, which were early special-purpose computing devices (though not in the public consciousness). In particular, many people widely recognize Shannon's role in making it possible to realize the earliest computers. I had written some sections on this which were present a day or so ago, and I thought I provided strong references for them, although I had not had a chance to add the references on Strowger switches. Any particular reason this material has disappeared? Pat Palmer 22:20, 16 May 2007 (CDT)

I had to rearrange a lot of stuff for the timeline, and I haven't yet finished the article. It'll be in there eventually.--Robert W King 10:02, 17 May 2007 (CDT)

In reference to the Strowger switch: it's not a computer at all. See

It's more of an electrical device and should be included in a Telecomms article.--Robert W King 10:32, 17 May 2007 (CDT)

Specifically, each Strowger "switch" contains a bunch of electromechanical relays. The entire Strowger device made one decision among several, based on user input, on how to route the call. It is electromechanical, and it (along with the entire network of wiring) "computes" in every sense of the word. This has been almost lost in the history of computing because, in our generation, so few people actually worked with Strowger switches (I'm one of perhaps 10 people alive--and I had a job for 3 years maintaining several banks of these) that no one in our generation knows that much about how the whole network of them performed. The earlier generations of technologists who knew about them didn't necessarily know about computing, so the connection has not been widely made, but I can defend it. It's really a very, very interesting (to me) phenomenon. Strowger switches, and an intermediate generation of special-purpose "switchboards" made entirely of relays that were more like conventional computers (made by the Leich Co. among others, these had "registers")--these "switches" (as they were called) "computed" all telephone routing until the 1960's, when they were replaced by general-purpose computing devices (with the obvious advantage that call routes could be assigned to numbers with less rewiring). Pat Palmer 09:34, 18 May 2007 (CDT)
Pat, I'm not questioning your background at all, but I argue that by your theory, I could say anything is a computer. A bicycle, for instance, has a user input (the pedals and handlebars), a process (the gears are driven and the wheel turns) and an output (I go forward); but we know in fact a bicycle is not a computer. Additionally, we have no evidence that ENIAC or any other electromechanical devices used Strowger switches for computing. I just don't consider developed telephony equipment to be computers, unless we are talking about them in the modern sense where they are all mostly digital (and in fact are computers, but very special computers).--Robert W King 12:17, 22 May 2007 (CDT)


I used to work in telecom and with Strowger switches, and I know exactly how that all works. The telephone system, since the first days it was "automated" with dialing, has been a special form of computing. Trust me on this: those dial networks are closer to being a computer than the Pascaline was. They computed routes through a network, and the route which would be computed had to be changed by electrical wiring, no differently than ENIAC's program had to be changed by electrical wiring. I would appreciate it if you would return the material (at your leisure) and register your doubts here, and we'll seek an editor qualified in telephony (which IS a form of computing as much as networks today are) to work it out. I will also find references, eventually, but in the meantime please trust me when I say I know this material. The earliest automation of routes in telephony was in the 1920's, and this was pioneer computing, and it led to people like Shannon (who later worked at Bell Laboratories). Pat Palmer 09:24, 18 May 2007 (CDT)
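(Editorial aside: a deliberately oversimplified Python sketch of the point being argued above -- this is not Strowger's actual electromechanical design, and every name below is hypothetical. The idea is that a selector "computes" a route from user input, with the routing fixed by wiring, much as ENIAC's program was fixed by wiring.)

    # Toy model of a two-digit step-by-step selector train: the "wiring"
    # maps a dialed digit sequence to an outgoing trunk. Changing a route
    # means physically rewiring, i.e. editing this table.
    WIRING = {
        (5, 1): "trunk to exchange A",
        (5, 2): "trunk to exchange B",
        (6, 1): "operator",
    }

    def route(dialed_digits):
        # Select an outlet from user input; unwired paths get a busy signal.
        return WIRING.get(tuple(dialed_digits), "busy signal")

    print(route([5, 1]))  # -> trunk to exchange A
    print(route([9, 9]))  # -> busy signal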
Question about Shannon, Boole, etc.: should they go on the "Important contributions to computing" page or should they actually go in the article? Conceptually, Shannon's contribution was huge, but he didn't actually implement it so much as come up with the idea.--Robert W King 13:25, 17 May 2007 (CDT)
I think Shannon should go here. He "empowered" others to complete the first computer hardware designs by making a key breakthrough in understanding. It came in 1938, right before the first computers got built in the early 1940's. Pat Palmer 09:19, 18 May 2007 (CDT)

In my opinion, the interesting question is whether or not what a thing does is computing. I mean this in the very specific sense of Turing computable functions (recursive functions). A device that can add (say, a scale) may not be all that interesting unless it can be shown to implement an algorithm in an interesting way. If it does, whether it is electrical or mechanical is much less interesting than what it actually does in computational terms. The Strowger switch is difficult, but it is a) historically significant, and b) a component of a system that actually does do computation. It is clear that switches (of any kind) have been the enabling technology allowing us to build computational devices, so while the Strowger switch is not a computer by this criterion, I do believe it belongs in any account of the history of computing. Greg Woodhouse 12:41, 22 May 2007 (CDT)

Greg, well put, but honestly I believe that Strowger would better fit into a history of communications article than a history of computing article, unless we know for sure that Strowger switches were part of a device that marked a milestone in computing history. If we're talking about just switches and that technology, then you could alternatively argue that he should definitely be mentioned in a history of electronics, along with conductors, resistors, transistors, capacitors, soldering, etc. I think there are enough specifics of Strowger's switch that link him more to communications and electronics than to computing.--Robert W King 13:10, 22 May 2007 (CDT)

That's a good point. Perhaps it would be smarter to cover the historical information under telecom, as you suggest. Greg Woodhouse 15:39, 22 May 2007 (CDT)

I am going to ask you one more time, nicely, to restore the information which I authored about 1) Shannon, and 2) Strowger switches. I would then like to ask a "qualified" editor to arbitrate. I really do not see why you think you get to throw out this material. You are entitled to disagree, but you are not entitled to start a reversion war. I am on the verge of calling a constable. Please restore the deleted parts to the article, so that I may finish adding the references, and then we can seek a qualified arbitrator. Pat Palmer 20:12, 23 May 2007 (CDT)

Pat, I have attempted to enter a discussion in which I put forth an argument that Strowger does not necessarily have an impact on the history of computing and would be better referenced in a history of telecommunications. Additionally, the work on this article is far from complete, and I find your lack of patience frustrating.
This most recent statement of claims about my activity I find extremely unnerving, particularly the threatening tone. I encourage you not to make such emotional statements and to exercise the ability to participate in this debate. I find this very unprofessional. --Robert W King 09:51, 24 May 2007 (CDT)
For the record, I never suggested deleting another author's work without his or her permission! I merely meant that telecom would be a logical place to cover the Strowger switch. Shannon's work is usually treated under information theory, which has significant overlap with both fields. Greg Woodhouse 21:07, 23 May 2007 (CDT)

Editorial issues

(Editor hat on) I see no harm in mentioning the Strowger switch here, but I do agree that it makes more sense to place any detailed treatment under history of telecommunications. I went back through the page history, but am unsure which edit Pat is referring to. If one of you can point to the specific language, I can make a more informed comment. But given that this is a contentious issue, it would probably be wise to get feedback from another editor, too. Bear in mind that there is no one way to write an article, and no one right answer to a question such as this. It's really an issue of judgement. My philosophy as an editor is to respect the author's judgement where possible, and work with the author(s) and editor(s) to achieve consensus. I don't believe it's the role of the editor to "override" or trump the author's judgement in such matters. Greg Woodhouse 10:53, 24 May 2007 (CDT)

My position

My recommendation here is to include material on the historical importance of the development of switches in computing, perhaps mentioning the Strowger switch as a significant example of a switching device that is not an electronic switch. Detailed discussion of the Strowger switch and, in particular, its role in telecommunications should not be treated here but in another article. However, it is entirely appropriate for this article to link there, and it is important to recognize the synergy between the fields of telecommunications and computing, and how ideas introduced in one area may have inspired developments in the other. Now, let's try to come up with some mutually acceptable text! Greg Woodhouse 11:13, 24 May 2007 (CDT)

Greg, I find this amicable and acceptable. My only concern is that since the article so far is a narrative on the history, and the development of switches is more or less a concept, where *should* it go? Part of the reason why I haven't yet covered the vacuum tube, transistor, or any other significant part in the history of computing is because of this page. Should it go in its own section or should it go here?--Robert W King 11:20, 24 May 2007 (CDT)

That list needs some work. Several key ideas are missing, and some of the terminology there is technology-specific. At a minimum, such a list should include computability, algorithm, complexity class, information and coding theory, programming languages (syntax and semantics), compilers and translation, operating systems, and basic hardware concepts. Other basic ideas are concurrency, distributed computing, computer networks, modern computer architectures, logic, formal methods, functional and logic based languages, and that's probably just a start. Greg Woodhouse 11:53, 24 May 2007 (CDT)

I guess my question is, does it belong in the "History of Computing" article or in the article that already exists for the concepts, since those concepts are already pointed to in the History of Computing article?--Robert W King 13:28, 24 May 2007 (CDT)

The basic test, in my view, is: did this idea or invention have a significant impact on the history of computing, per se? If it had a significant impact on the development of either computer science or computer technology, then I think the answer is yes. If it did not play an important role in the development of computer science and did not significantly influence the development of computers, then I think the answer is no. I realize that this covers an awful lot of territory, and it's not realistic for an article on the history of computing to include every invention or idea that could possibly be included. It's up to the authors to include enough detail to paint a reasonably accurate picture of the history of computing without overburdening the article with information that does little for the reader. I realize that some people have argued that end matter (notes and references) should be kept short and, in some sense, I agree. But if there is material that doesn't fit naturally into the narrative, isn't so important that it simply must be included, yet provides additional insight and would help the reader gain a better understanding of the subject, then I do favor including it in the article as a note or in a (perhaps annotated) bibliography. Greg Woodhouse 13:56, 24 May 2007 (CDT)

History of computing timeline image

User_talk:Robert_W_King#History_of_Computing_Timeline

Please make recommendations and comments about the image I created for this article. It's far from complete, but I'd like to know what should be changed. Created in Excel, imported into Paint Shop Pro, and cropped. --Robert W King 12:56, 17 May 2007 (CDT)

I am not in favor of using an image like this, because it can only be updated by the person who made it originally. It would be better to find a way to represent the timeline so that others can add items to it, that is, as text. A graphic does not allow that. Pat Palmer 05:58, 20 May 2007 (CDT)
I'll see what I can come up with in my sandbox.--Robert W King 10:10, 21 May 2007 (CDT)

Table for history of computing timeline

It's about 90% of what I wanted it to look like, minus some details. But it'll work--see the code, and the pattern within should show how to create new rows of cells. --Robert W King 11:45, 22 May 2007 (CDT)

I finalized the design for the timeline. Let me know if it looks screwed up on anyone's screen. --Robert W King 16:45, 22 May 2007 (CDT)

early computing

There's a section on "early devices" but none on computing with pencil and paper. Why's that? Most computing was done on paper until some time after the early 20th century. Michael Hardy 20:16, 22 May 2007 (CDT)

Good question. Many people forget that when Alan Turing wrote about a "computer", he was not talking about modern day computers (which didn't exist). In retrospect, Turing machines do look an awful lot like idealized computers (with memory, state and a control), but Turing, Church, Post and others were simply trying to formalize the concept of computing, and computing with pencil and paper is very probably just what they had in mind! Greg Woodhouse 22:43, 22 May 2007 (CDT)

Well, certainly at some point Turing had in mind machines, but of course in conventional usage in his day, "computers" were people who did computations. Michael Hardy 16:03, 23 May 2007 (CDT)