People today know that computers will be faster in the future because lithography nodes keep shrinking. Did people believe that before microprocessors?
People always say we will have better hardware in five years than we have today. Processors get better as the lithography shrinks. I have read that we have reached the limit and that further progress requires quantum computing, but that is not the topic here. The topic is that people always believe computers will develop. I also believe 2020 computers will be better than 2015 ones, the same way those were better than 2010 and 2000 ones. So we expect development and won't be surprised if hardware gets better. BUT, before microprocessors were invented, did people believe that computers would get better in the future? Did people say "we will have 1 GB of RAM in 50 years" the same way we say "we will be able to run 4K at 120 fps in the future"? Or did people just think we would have large, slow DOS computers forever, until the first microprocessor was invented?
- I Like Stories · Lv 7 · 12 months ago · Favorite Answer
Before microprocessors were invented, nobody knew or cared much about computers. Back then "computers" were mechanical devices, not unlike a mechanical typewriter on steroids. They used punch cards to crunch complex math equations; you were screwed if you got the cards out of order. These machines were huge, and few companies had them or saw the value in them. You leased time on them from IBM.
There were no DOS computers before the invention of the microprocessor. The first DOS computers used an Intel microprocessor with an 8-bit external bus; those didn't hit the market until ~1980.
I remember when the IBM XT computer came out (1983): it used two 5.25" floppy drives for storage, and you were styling if you had a 10 MB HDD. I believe it shipped with MS-DOS 2.0, and Bill Gates supposedly said nobody would ever need more than 640 KB of RAM (a quote he has since denied). LOL.
Trust me, an XT with a rudimentary word processor was vastly superior to a typewriter, which was the only other option. Spreadsheets didn't exist until several years after the emergence of PCs.
I think what you are referring to is Moore's Law, which says the density of transistors on an integrated circuit doubles roughly every two years. https://en.wikipedia.org/wiki/Moore%27s_law
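The compounding that Moore's Law describes is easy to sketch in a few lines. This is an illustrative back-of-the-envelope, not anything from the thread; the 4004's 1971 transistor count is the only real figure used:

```python
# Moore's Law sketch: transistor density doubles roughly every two years.
def projected_transistors(start_count: int, years: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count `years` ahead, doubling every
    `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# The Intel 4004 (1971) had about 2,300 transistors.
# Ten doublings (20 years) projects roughly 2.35 million:
print(round(projected_transistors(2300, 20)))
```

Real chips of the early 1990s (the 486, at ~1.2 million transistors) landed within a small factor of that projection, which is why the law held such predictive weight for so long.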
- Who · Lv 7 · 11 months ago
It's a shame you already picked an answer, because most of it is wrong.
"computers" (the actual computing part of them) were never mechanical devices , mechanical devices were only used for input and outputting data
"computers" existed well before microprocessors were invented (I went to look at a computer mid 60's and it was huge)
They didn't use cards to crunch anything; cards were used to input data.
" There were no DOS computers before the invention of the microprocessor"
This is misleading: DOS (more correctly MS-DOS) was specifically written for the microprocessor used in the IBM PC, but it was a development from CP/M, which was used as the OS in small computers before then.
Both word processors (rudimentary) and spreadsheets were available well before the PC was (I was using both in the mid-70s).
The first floppy disk wasn't 5.25-inch; it was 8-inch.
As for the question:
There are three limiting factors:
1) clock speed
2) distance in and between components
3) switching time of components
Clock speed (1) is limited by capacitance and tracking (the components and tracking can result in a lot of radiated RF and generated heat).
Switching time (3) is limited by the device technology.
- m8xpayne · Lv 7 · 12 months ago
This worked for years, but silicon has hit its limits and won't shrink past the 3 nm to 5 nm process. This is why Intel is three years late with its 10 nm processors, and it will be four years late by the time they actually ship. lol, years ago and according to Moore's Law, Intel projected to have its 7 nm processors out in 2018, but it still hasn't gotten the 10 nm process down and out the door.
So NO, computer processors are going to hit a wall unless someone finds a material that's better than silicon.
- Norm F · Lv 7 · 12 months ago
You have just given "Best Answer" to an answer that is complete crap.
- Spock (rhp) · Lv 7 · 12 months ago
Computers were getting better and better all the time, even back in the 50s and 60s. Certainly people working in the business believed they would continue to improve.
Transistors were invented in 1947. By 1957, it was plain that they were the future of computing.
Integrated circuits were invented in 1959. Moore's Law was described in a 1965 paper and heralded the gains in throughput you refer to.
Lots of ordinary folk just didn't get how powerful computers and personal computers would become, even as late as the early 1980s.
- y · Lv 7 · 12 months ago
Some saw that the tech would grow, get better, and get faster, while others never thought it would amount to much. Then there were those who believed the tech would grow and take over. You can still find many people holding each of these lines of thought.
- ioerr · Lv 7 · 12 months ago
We're already hitting the limits of what's possible using microlithography, with silicon anyway. The current focus of development seems to involve circuits incorporating 3D structures, and of course parallel architectures.
Oh right, the actual question... Well, generally people didn't think about computers at all. There were some who speculated about them, and sometimes they anticipated "powerful computers in the future" of some kind, but even many of those didn't really know much about what it was computers even do.
- the internet · Lv 7 · 12 months ago
Probably yes, but very few people then knew about computers, and even fewer knew about the idea of 'faster' computers.