I have huge respect for this guy, but the irony of this is not lost on me:
> Since most time scales are fixed by human reaction times, habits, and other physiological and psychological factors, the effect of the increased speed of technological processes was to enlarge the size of units — political, organizational, economic, and cultural — affected by technological operations. That is, instead of performing the same operations as before in less time, now larger-scale operations were performed in the same time.
We give the name "von Neumann architecture" to the very devices that have allowed us to operate--by proxy, so far--on different timescales. And we do tons of stuff these days in less time, thanks in part to von Neumann.
Had he lived today, I imagine he would still be concerned with timescales and the sizes of physical places, but from an entirely different perspective.
Wow, that story about doing more in the same time instead of the same in less time sounds straight out of the villain origin story the New Yorker wrote about Robert Mercer [1]:
> While in college, he had worked on a military base in Albuquerque, and he had showed his superiors how to run certain computer programs a hundred times faster; instead of saving time and money, the bureaucrats ran a hundred times more equations. He concluded that the goal of government officials was "not so much to get answers as to consume the computer budget."
This was offered as some justification for becoming a libertarian-activist billionaire, but it seems more likely that this is a universal experience than that Mercer was plagiarizing von Neumann.
I would think someone who goes on to found a hedge fund would understand that if you drastically lower the cost of something that produces a valuable result, the effect should be to increase demand for that value, not just to shrug and say "no, we only do one of those per day".
In the general case, how is multi-core different from saying "complicated CPU"? You still have computation in a separate box from the data and move bits back and forth.
And if we’re being precise, aren’t we all on modified Harvard systems these days? :P (I am not a computer scientist)
Well, there's the logical difference, wherein a multi-core system can compute in a sequence other than the straightforward logical sequence A -> B -> C -> D.
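As a minimal sketch (in Go, chosen here purely for illustration), four independent steps launched on separate goroutines can finish in a different order on every run; a strictly sequential execution would never show that:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	for _, step := range []string{"A", "B", "C", "D"} {
		wg.Add(1)
		go func(s string) {
			defer wg.Done()
			// Each step is independent work; the runtime may schedule
			// these on different cores, so completion order varies.
			fmt.Println("finished", s)
		}(step)
	}
	wg.Wait()
}
```

Run it a few times and the output order changes, which is the point: the logical A -> B -> C -> D ordering is no longer baked into the execution.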
If you put a bunch of von Neumann machines in a room and connect them via a network cable, wouldn't that be a trivial extension of a von Neumann architecture? And isn't that sort of what a multicore system is, considering also that access to the same memory goes through a bottleneck?
Well, the networked system would need vastly more sophisticated programming to operate at the same level of reliability. A difference in magnitude can be a difference in kind, though I haven't actually read anyone specifically addressing the point.
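To make that concrete, here is a minimal Go sketch (the peer address is hypothetical) of what a single read turns into once the memory sits behind a network cable. Every error branch below is a failure mode that a local load through the memory bus never surfaces to the programmer; a shared-memory multicore hides the equivalent coordination inside cache-coherence hardware instead:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// readFromPeer sketches what a single "memory read" becomes once the
// memory lives on another machine: each step gains a failure mode that
// a local load instruction simply does not have.
func readFromPeer(addr string) ([]byte, error) {
	conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
	if err != nil {
		return nil, fmt.Errorf("peer unreachable: %w", err) // failure mode 1
	}
	defer conn.Close()

	if err := conn.SetReadDeadline(time.Now().Add(2 * time.Second)); err != nil {
		return nil, err
	}
	buf := make([]byte, 64)
	n, err := conn.Read(buf)
	if err != nil {
		return nil, fmt.Errorf("peer stopped responding mid-read: %w", err) // failure mode 2
	}
	return buf[:n], nil
}

func main() {
	// 10.0.0.2:9000 is a hypothetical peer address, for illustration only.
	if _, err := readFromPeer("10.0.0.2:9000"); err != nil {
		fmt.Println("the shared-memory version has no analogue of:", err)
	}
}
```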