"Personal"
Computing
By the late 1970s, computer processors and other hardware had dropped in cost and shrunk in size to such an extent that it was actually becoming feasible for an individual to own a computer, an idea that had been completely unthinkable in previous decades. Intel, the company whose early microprocessors drove many of these so-called microcomputers, had actually considered marketing a personal computer. However, it couldn't come up with an answer to the obvious question: Who would buy such a thing, and what would they do with it? At the time, computers were still primarily considered to be number-crunchers, and it seemed that few individuals really needed that much computation done. Computers were seen as "industrial strength" machinery, and, to a lot of people, suggesting that there might be a market for a "personal" computer made about as much sense as trying to sell the average person their own bulldozer!

Another part of the problem was that although computer hardware was smaller, cheaper, and more widely available, computer software was still largely "custom made." In other words, if you wanted to use a computer, that usually meant programming it yourself (or hiring someone else to do it for you). For most tasks, there was no "off-the-rack," ready-made solution. There was no real software industry, apart from the folks who were selling programming languages.
It would take a young programmer named Bill Gates to foresee the potential of a software market.