Talk:History of computing

From Citizendium


Article Checklist for "History of computing"
Workgroup category or categories: Computers Workgroup, History Workgroup [Categories OK]
Article status: Developing article (beyond a stub, but incomplete)
Underlinked article? Yes
Basic cleanup done? Yes
Checklist last edited by Pat Palmer 15:23, 23 April 2007 (CDT)

To learn how to fill out this checklist, please see CZ:The Article Checklist.





first draft

I've copied this initial stuff out of Computers into here. It may seem a bit incomplete or awkward for now, but it's essential to reduce the size of the top-level Computer article. Someone please take this and own it! Pat Palmer 15:23, 23 April 2007 (CDT)

I'll assume the helm, lieutenant. Stand down! --Robert W King 15:28, 23 April 2007 (CDT)
Thank you! It's a big job. Pat Palmer 18:47, 23 April 2007 (CDT)
I'm considering blanking the whole article and starting over. Any objection? The wikipedia entry was/is such a mess to sort through.--Robert W King 13:08, 24 April 2007 (CDT)

I am totally re-writing the article to be more complete, so it doesn't look like a long essay. In fact it almost looks like the original wikipedia version came from someone's term paper.--Robert W King 13:55, 24 April 2007 (CDT)

more source material?

There might be some useful source material in this archive. Pat Palmer 18:46, 23 April 2007 (CDT)

suggestions for structure

It might be helpful to cover each previous century separately, and then in 20th century have a section for each decade. Or something. Pat Palmer 18:48, 23 April 2007 (CDT)

brainstormed list of items to include

Please add to this list anyone or anything you think ought to be included somewhere; we can mark them off once dealt with. Pat Palmer 09:29, 11 May 2007 (CDT)

now trying to start categorizing the list (I will strike items as soon as they are incorporated in the article, or a group decision has been made never to incorporate them) Pat Palmer 11:11, 12 May 2007 (CDT)

Pioneering people:

Early machines:

  • ENIAC - important early machine ADDED TO ARTICLE
  • EDVAC - important early machine ADDED TO ARTICLE
  • Colossus - important early machine ADDED TO ARTICLE

Industry things:

  • DARPA, IETF and RFCs - led to the invention of networks
  • Turing awards - as important in computing as, say, the Nobel is in physics
  • Apple_Inc., Burroughs Corporation, IBM, Minneapolis-Honeywell (Honeywell Labs), RCA, Sperry Rand, Sylvania Electric Products, Microsoft, Tandy Radio Shack, DEC, etc. (but where do we stop? do we go on to include Sun, Oracle, SAP, Google and the newer big players?)
  • key de facto standards (where the marketplace voted in a good idea)
  • official standards bodies, and their role and key standards
  • the internet (a diametrically opposed way of doing telecom and everything else; culturally very different from telecom) - some articles already started

Software evolution:

  • Unix, Multics, Mac OS X, Windows, Linux
  • first spreadsheet - now, what was that guy's name?
  • operating system evolution (pointing off to many other articles)
  • programming language evolution (pointing off to many other articles)
  • cryptography
  • security
  • advanced software such as AI and machine learning applications (many people don't even know these things exist, or can be done, but they are becoming important in stock market, military, security etc.)
  • codecs (a branch of mathematics) leading to Voice Over Internet Protocol (VOIP), or digital streaming media communications standards.

Computer design evolution:

  • batch processing to multiprocessing to multithreading
  • memory management, especially virtual memory
  • character sets
  • processing speedups (maybe belong in computer architecture?)
  • changing memory technologies (mix of hardware and software, very complex)

Personal computing: (several articles already exist, though their variety and structure is still in flux)

  • invention of single-chip microprocessor
  • CP/M, Commodore, Tandy Radio Shack

Special computers:

  • NASA's computers for the Apollo moon voyages
  • supercomputers (Cray etc.)
  • embedded computers and gadgets
  • telecom and fiberoptic communications (an entire specialized industry based almost solely on special computers, realtime software, and some specialized hardware)

Everything not fitting somewhere else:


I confess, I have been seriously slacking on this article. There's a lot more to be worked in of course, and I'll try to get into it more today and the following week.--Robert W King 09:15, 11 May 2007 (CDT)
No need to feel pressure. This is a huge topic. Books have been written (and I own some of them) he he. I think it will take a long time to get this one ready for prime time, but we have to start somewhere, so I'm starting the list above. I will have a lot more to add to it. I need to look up my old notes from when I last taught this, and we need to construct a timeline. The timeline would then branch off into deeper articles about the invention of that thing. That's one way it could be structured to prevent it from becoming book-sized, anyway. Pat Palmer 09:33, 11 May 2007 (CDT)
I think a goal for this article should more or less be to go from the very initial need to identify quantities of things, to the concept of binary, to the development of the first digital computer, and then zoom very briefly through the last 80 years or so of computing history. I don't think every single technological landmark needs to be covered in this particular article, because each could be discussed at great length in a segment of its own.--Robert W King 10:10, 11 May 2007 (CDT)

archive of strike-outs

User:Robert_W_King has struck out a number of items from the brainstorm list. I'm going to unstrike them on grounds that this is brainstorming, and we should postpone evaluating until all the information is in. Furthermore, I don't necessarily agree with his reasons as stated on the edit notes. The items he struck are shown here: Pat Palmer 07:43, 12 May 2007 (CDT)

Apologies for strikeouts; it was a bad decision on my part. --Robert W King 10:05, 14 May 2007 (CDT)
  • Thomas Edison - for the Edison effect, which led to the triode, which became the vacuum tube switch. (Strike reason: Edison would be better for something like "History of the Transistor".) (Pat's note: this is now covered on the Electronic switch#Vacuum tube page.)
  • DARPA, IETF and RFCs - led to the invention of networks. (Strike reason: networks were developed after the first real "digital computer".) (Pat answers: everything except the first computer was developed "after"; so what? It is part of the history of computing; it may end up being covered by other articles and just pointed to here, but we need it as a placemarker)
I think that networks are more or less a way to move data between nodes; is this the history of computing or the history of the computer? I think I may have confused it.--Robert W King 10:09, 14 May 2007 (CDT)
  • the internet (a diametrically opposed way of doing telecom and everything else; culturally very different from telecom) - some articles already started. (Strike reason: "History of Networks".) (Pat answers: same comment as above)
(see above)
  • operating system evolution (pointing off to many other articles). (Strike reason: not necessarily about the development of the "computer" in the core context.) (Pat answers: I'm not sure I agree. The name here is "computing", not "computers"; OS and hardware are built to interwork very closely)
  • Apple, IBM, Microsoft (Software company), DEC, Burroughs etc. (but where do we stop? do we go on to include Sun, Oracle, SAP, Google and the newer big players?) (Pat answers: I'm not ready to exclude software from "history of computing")
  • security. (Strike reason: Tangetal.) (Pat answers: "tangential"? today it's mainstream)
What I meant by tangetal was that although it is highly integrated into computing, I think it should get its own topic, with a very summarized reference to that topic here.--Robert W King 10:09, 14 May 2007 (CDT)
  • codecs (a branch of mathematics) leading to Voice Over Internet Protocol (VOIP), or digital streaming media communications standards (Pat answers: this is a matter for discussion; I've worked on such projects that use special signal processors, which is hardware, and also there is related software; seems like "computing" to me)
Is the implementation of signal processing the same thing as development of a codec? If these things are the same then it really ought to go onto a page such as "DSP" or some other type of signal processing entry. --Robert W King 10:09, 14 May 2007 (CDT)
Please don't strike out more of the items on the list; they are just suggestions at this point and of course are subject to debate. Instead, please register your opinions as comments below (begin with multiple colons) or new sections on the page. I assure you we will listen to your opinions. Pat Palmer 07:43, 12 May 2007 (CDT)

ideas on how to get started

This article seems like a mission impossible. I think the trick is to keep this article as short as possible, while also making it complete by pointing off to other articles for deep detail. At the same time, we need a 'compelling narrative'; what a story it is. Good luck getting started! Pat Palmer 09:41, 11 May 2007 (CDT)

reminder to coordinate with CPU article

Some of the history of computing is currently incorporated in the CPU article. Pat Palmer 10:25, 12 May 2007 (CDT)

potential subtopics/directions

Since this article is "History of computing" and not "History of computers", I think it probably ought to talk about the development of the idea of computability (Church, Turing, Gödel), and the idea of a von Neumann machine. It also might be worth talking about the development of virtual machines and microcode (and, of course, the subsequent shift towards RISC architectures as a kind of counter-movement). What I have in mind is that developments in software and operating systems have had a profound influence on the direction(s) that computing has taken. I would also consider pointing out that computability theory started out trying to model notions of computation as carried out by humans, and only later became closely associated with computing devices.

Propose move to "History of computers"

Should this be about the history of computers themselves or the history of the use of computers? I think because the focus is more or less from a technological standpoint (if it is agreed that is the case) then I believe it should be renamed. The original WP topic also seems to cover the evolution of the computer more than computing itself. Agree/Disagree? --Robert W King 10:14, 14 May 2007 (CDT)

I intend for it to cover development of software technology as well as hardware. I am feeling a little pressured right now. It may need to be renamed, but I request that we wait just a bit to see how it forms up. The way I write is, I start brainstorming, and trying things, and out of that, a form mysteriously starts to appear. I've never been able just to lay down an outline from the get-go (at least, not one that remains stable). I tinker and experiment, and sometimes I am rewarded by creating something special. And then sometimes, that doesn't happen of course! I envision this article, whatever it ends up being called, as pointing off to other "histories", such as "history of operating systems" or "history of virtual memory" or "history of the compiler". At present, these little histories are scattered here and yonder, often buried within the articles about the topics. I think the history of a technology is somewhat of a different undertaking than the description of what it is now, though they are related. Well... as I write here, I can see that I am confusing even myself. Rename it if you think it ought to be. But I don't consider this article anywhere near done. It's in its very early stages. Pat Palmer 11:57, 14 May 2007 (CDT)
This article is not about "use" of computers; I intend it to be about evolution of the hardware, software and industry from the 1950s until now, including networks and telecommunications, which are foundationally based on computers, albeit special ones. This is intended to be the umbrella article. These days, universities have "history of science" programs; they train historians, not technologists, to write about technology. That's the kind of person I'd like to help out here. "History of computing" is a subfield within that. Entire books exist about it. This article should refer to some of them before it's done. We need a coherent strategy for articles such as compilers, computer networks, and operating systems to break out their history into separate sections, for separate maintenance. It's relatively easy to get the current discussion of a technology done, but when you try to add the history, it bogs down and takes forever. By breaking these articles apart, we may find it easier to approve the non-history parts quickly. Pat Palmer 12:06, 14 May 2007 (CDT)
Pat, this article is now up to you. Unfortunately, my view of what material should be included differs too radically from yours, and I'd rather just back away than create conflict.--Robert W King 09:25, 15 May 2007 (CDT)

Charles Babbage section

I'm trying to determine how best to explain the development from using counting devices, to the Pascaline, to the development of Babbage's machines. I don't want to add any more to Charles Babbage (specifically about his life, work, Lovelace, etc.), but I think it's important to show how one piece of history relates to the next. --Robert W King 14:14, 14 May 2007 (CDT)

Well, how many people will have seen mechanical adding machines? Am I showing my age here? (I think I know the answer to that question!) At any rate, a picture of the mechanism in a mechanical adding machine would go a long way toward illustrating the concept. You could also start with the abacus and talk about using cog wheels to automate the "carry" process. Greg Woodhouse 14:21, 14 May 2007 (CDT)
Greg, you read my mind. I'm actually working on that in the pascaline section!--Robert W King 14:24, 14 May 2007 (CDT)
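For illustration, here is a minimal sketch (Python, purely hypothetical; not from any article draft) of the digit-wheel carry behaviour Greg describes above: each wheel counts 0 through 9, and rolling past 9 resets the wheel and nudges the next one along, which is exactly the "carry" a cog-wheel adder mechanizes.

 # Illustrative sketch only: a row of decimal digit wheels, as in a
 # mechanical counter or a Pascaline-style adder.
 def add_one(wheels):
     """Advance the counter by one; wheels[0] is the least significant digit."""
     i = 0
     while i < len(wheels):
         wheels[i] += 1
         if wheels[i] < 10:
             return           # no rollover, so no carry to propagate
         wheels[i] = 0         # wheel passes 9 and snaps back to 0...
         i += 1                # ...advancing the next wheel: the "carry"

 # Example: 1999 + 1 = 2000, with three carries rippling through.
 counter = [9, 9, 9, 1]        # digits of 1999, least significant first
 add_one(counter)
 print(counter)                # [0, 0, 0, 2]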
Pat, I'm not very happy with the edit you made to the Charles Babbage section. I feel that a lot of pertinent information in this history of computing was taken out.--Robert W King 16:26, 14 May 2007 (CDT)

please make SEPARATE articles for each person

Charles Babbage has his own article, and most stuff should go there. I'd like to see this article contain only a short overview pointing off to Charles Babbage. I think we should do the same thing for Herman Hollerith, Blaise Pascal, and any other. This is the framework article; the guts are in other articles. Otherwise, this thing will very quickly reach an unmanageable size. Pat Palmer 16:19, 14 May 2007 (CDT)

For the record, I'm trying to keep it down to 2 paragraphs (roughly) to contain the concept developed by each person who contributed to the history of computing. I totally agree that there shouldn't be an entire bio; but the information that should be recognized from each individual is important enough to reference (providing some background), so that someone who actually reads the article would have a rough idea or understanding of how computers got from A to Z (Z being some variable which represents your chosen state of "now").--Robert W King 16:23, 14 May 2007 (CDT)
Additionally, the bio should cover everything about his life. In this article, it's only fair to state his contributions.--Robert W King 16:29, 14 May 2007 (CDT)
I have moved the Babbage details to the Charles Babbage article. I just want this article to give a brief intro and then point off to other articles. Pat Palmer 16:32, 14 May 2007 (CDT)

Fame -- and personal computers

Hey, looks as though this entry is shaping up wonderfully!

I do think though that we should avoid terms like "Famous" in subheaders for people and concepts. Many quite significant people and concepts are not well-known, and some who are quite well known are not terribly significant. I think we should exercise judgment, choose criteria, not let "fame" do this work for us!

I changed it to "key" for the moment. That's really just a placeholder for now until someone gets time to work on it more. THANK YOU for dropping by to help. Pat Palmer 12:14, 15 May 2007 (CDT)

Also -- will there be an entry here on the history of personal computers, or ought that to be a separate topic? You could go all the way back to the Simon in 1950, and all the way up to the dual-core Macs and PCs of today. Another area not very well covered over at the "other" encyclopedia ... Russell Potter 18:05, 14 May 2007 (CDT)

I am urging authors to break out histories into subarticles; history of personal computers would make a good one (if coordinated with personal computer to avoid duplication). Feel free to start that one! I thought I'd heard it all but I don't remember the Simon. :=) Pat Palmer 12:12, 15 May 2007 (CDT)

Shannon and boolean gates?

You may be right, but I'm not certain Shannon is the right person to credit with the idea of implementing boolean operations through switches. He certainly did pioneer information theory, but I'd have to do some research before I felt comfortable crediting him with the idea of implementing logical operations through switches. Or did I misunderstand? Greg Woodhouse 18:53, 14 May 2007 (CDT)

Here's the link: Claude Shannon; please read the 3rd sentence on the page--Shannon was, as far as I can tell, the first to note the crucial link between Boolean algebra and logic design principles! Pat Palmer 17:24, 25 April 2007 (CDT)
George Boole was actually responsible for boolean operations. See (http://www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect4.html) and his work dates circa 1854. --Robert W King 12:13, 15 May 2007 (CDT)
No one is disputing that George Boole created Boolean algebra. Claude Shannon first published work noting that Boolean algebra could be applied to hardware "logic design" (gate design). His master's thesis from MIT was published in a journal in 1938 and created quite a stir because of the idea. Pat Palmer 15:44, 15 May 2007 (CDT)

I'm not convinced. When you go down to section II of the article, the claim actually made is something different: that Shannon introduced the idea of using binary codes to represent information. In other words, the state of a switch can be represented using binary data, but that says nothing about the operation of a switch.

Now, what Shannon did do - and this is huge - is study the fundamental limits imposed on the transfer of information by available bandwidth (basically, the number of binary digits available), both in the presence of noise and on purely noiseless channels. You may know of the theoretical limit on the achievable data rate on a physical medium with a given bandwidth (this time, the width of the frequency band used) as the "Shannon limit". Greg Woodhouse 12:25, 15 May 2007 (CDT)
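For concreteness, the limit described above is usually stated as the Shannon-Hartley capacity, C = B * log2(1 + S/N). Here is a small illustrative calculation (Python; the numbers are made up, just to show the shape of the formula, and are not from any source cited here):

 import math

 def shannon_capacity(bandwidth_hz, snr_linear):
     """Shannon-Hartley limit: maximum error-free bit rate for a channel
     with the given bandwidth and (linear) signal-to-noise ratio."""
     return bandwidth_hz * math.log2(1 + snr_linear)

 # Hypothetical example: a 3 kHz voice-grade line with 30 dB SNR (SNR = 1000)
 # tops out near 30 kbit/s, no matter how clever the modem.
 print(shannon_capacity(3000, 1000))   # about 29901.7 bits per second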

Feel free to start the subarticle on Shannon, but please don't start talking about his information theory in the paragraph for 1938. I'm archiving someone's comments here:
To elaborate a bit, Shannon addressed the fundamental problem of determining how efficiently switches could operate (in practical terms, how long it takes to "complete the call"). He not only showed that there was a theoretical limit on efficiency, but computed it, and laid the groundwork for designing efficient switches (and, more generally, telecommunications networks).
This is good stuff, but it belongs in the article Claude Shannon, which the history of computing article will end up aiming at in several places. Shannon was awesome, a polymath who made major contributions in a number of areas, both theoretical and practical. I have added a reference on his master's thesis which is from the Bell Laboratories site (highly credible). I haven't read the thesis myself. For grins, search on "Claude Elwood Shannon" at Amazon.com and see what pops up! Pat Palmer 12:52, 15 May 2007 (CDT)
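For readers following the exchange above, here is a tiny illustrative sketch (Python; not Shannon's notation, just a toy model with made-up names) of the correspondence his thesis is credited with: switches wired in series behave like Boolean AND, and switches wired in parallel behave like Boolean OR.

 # Toy model: a switch is a bool (True = closed/conducting, False = open).
 def series(*switches):
     """Switches wired in series conduct only if every one is closed (AND)."""
     return all(switches)

 def parallel(*switches):
     """Switches wired in parallel conduct if any one is closed (OR)."""
     return any(switches)

 # A circuit that conducts when (a AND b) OR c: a series pair of switches
 # placed in parallel with a third switch.
 for a in (False, True):
     for b in (False, True):
         for c in (False, True):
             assert parallel(series(a, b), c) == ((a and b) or c)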

continuation of first draft

I'm merging Pat's existing version with mine, and work will continue from there. See http://forum.citizendium.org/index.php/topic,931.0.html. --Robert W King 15:34, 15 May 2007 (CDT)

Yes, to all, User:Robert_W_King is going to guide this article from now on. I intend to get busy on some other projects. Thank you, Robert! Pat Palmer 15:46, 15 May 2007 (CDT)

Comments

Great start all. However, author-me removed all the headings! "Why?!?"

Because the article really reads like a collection of modular vignettes rather than a narrative. Try reading it now without the headings and see how that fact really glares out at you. Removing the barriers of those headings, as I did, can really help facilitate this article to be written as one whole, as one flowing narrative, from start to end.

Try it and see. And have a look at CZ:Introduction_to_CZ_for_Wikipedians#Get_ready_to_rethink_how_to_write_encyclopedia_articles.21 and CZ:Article_Mechanics#Narrative_coherence_and_flow.

So how about we leave those headings out a good while and work on that? I'd be happy to do the busy work of adding them back in, come time. I'll try to help "narratize" this too.

Stephen Ewen 23:02, 15 May 2007 (CDT)

Stephen, which do you think would be the best solution? In one sense, this forces the user to read the whole article to get an understanding of the topic. On the other hand, modularity creates a chronological relationship that the user can visually reference. What do you feel is the best approach? --Robert W King 08:49, 16 May 2007 (CDT)
I think that the choice need not be made. Readers can have the best of both. Create it to read as one whole; add headings later. People who read it as a whole will have a narrative. People who read for isolated points will have their cake too. :-) Stephen Ewen 15:11, 16 May 2007 (CDT)