On Sat, 14 Feb 1998 01:58:13, Adam Wiggins <nightfall@user2.inficad.com> wrote:
[Brandon J. Rickman:]
>> Moore's Law: the computational power of computers doubles every <make
>> up a number between 12 and 60> months.
>
>Hrm, I've never heard any number other than 18. The actual number doesn't
>matter as much as the principle.
There has been recent talk about revising the number; technology is
advancing faster than expected, etc. I don't dispute the truth of it
(it's a Law, after all), just the implications.
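Just to put rough numbers on why the period matters (this is only
back-of-the-envelope arithmetic I'm making up, not anyone's published
figures):

#include <math.h>
#include <stdio.h>

/* Rough growth multiplier after `years`, assuming performance doubles
 * every `doubling_months` months: 2^((years * 12) / doubling_months). */
static double moore_multiplier(double doubling_months, double years)
{
    return pow(2.0, (years * 12.0) / doubling_months);
}

int main(void)
{
    double periods[3] = { 12.0, 18.0, 60.0 };  /* the bounds tossed around above */
    int i;

    for (i = 0; i < 3; i++)
        printf("doubling every %2.0f months -> %5.1fx after 5 years\n",
               periods[i], moore_multiplier(periods[i], 5.0));
    return 0;
}

Over five years that works out to roughly 32x, 10x, or 2x depending on
which number you pick, which is why the choice between 12 and 60 months
is not a trivial detail.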
>> "computational power" merely refers to a measure of how many operations
>> a chip can perform in a fixed amount of time. The higher the MIPS
>
>It does? Once again, if anyone ever uses it in this context, I assume
>they are taking it too literally. Computational power refers to the
>total capabilities of the machine. This includes quite a bit more than
>the chip. Most folks know that chips aren't the bottleneck in much of
>anything you want to do with computers nowadays. RAM, disk access speed,
>network bandwidth, and bus speed are all significantly lagging behind
>processors.
Yes, people seem to take it quite literally. If the Law isn't actually
tied to some measurable phenomenon then it isn't very useful (assuming it
isn't just a marketing slogan in the first place). Consumer knowledge
of the Law is prejudiced towards CPU speed, so while you and I know
that RAM/bus speeds/etc are now much more critical to doing what
we want to do, a certain chip manufacturer continues to make a profit
by increasing the advertised speed of its newest chips. This is
important when talking about the graphics capability of the latest
machines; you get much more bang for the buck when you optimize for
certain tasks. The introduction of the Nintendo 64 probably outdid
what Moore's Law would have predicted, but maybe that is because
Nintendo is a Japanese company and they have had decades of practice
optimizing contemporary technology. You don't get that kind of graphics
power packed into an Intel box (actually you can, but the Intergraph
machines aren't useful for much else since they have nothing but
truecolor support) because the advertised functions of the machine are
_too broad_.
>This is true of any tool. However, a generalization of the sort Moore's
>Law makes, at least in my semi-humble opinion, doesn't refer to any of this.
>If one were to buy a new sort of screwdriver which added a pull-out bar
>for extra torque, few would complain of faulty advertising just because
>they don't work out as much as they used to back when they used the old
>screwdriver, and therefore are actually able to turn the same amount or
>less with the new torque-bar feature. This doesn't impact the generalization
>that the new screwdriver can provide you with more torque.
This was my point, that Moore's Law doesn't take into account optimization.
Screwdrivers (and screws for that matter) have been optimized for
centuries (well, like _I_ really know) but there isn't much demand
for some kind of Torque Law (or Torque Reform, if you will). Moore's
Law is heavily influenced by modern culture; we don't really know where
things are headed, so we'll explain it with science. If the Law is
true, there really isn't any security in doing anything with the
technology from now into the foreseeable future, except for quick (and
risky) monetary gain.
>> Second, the amount of computational power available on the consumer market
>> is far beyond what anybody could actually use. The average non-business
>
>Oh? Explain to me why I spend so much time waiting for my computer at work,
>a state-of-the-art Pentium with plenty of RAM and a nice fast hard drive,
>to perform the routine tasks I do from day to day? Why do I wait for
>3D Studio to render long animations, or Photoshop to load up, or twenty
>minutes for Developer Studio to do a complete rebuild?
You've used too many polygons and your models are totally wrong. You're
probably using too much specular as well, please stop. ;)
>Naturally there are plenty of other things to blame here other than
>hardware. I'm one of the first to complain that modern software packages
>are bloated and top-heavy and consume far too many resources for what they
>do. Nevertheless, this still falls into the category of 'computational power'.
>Far too many times I've heard 'Oh, computers nowadays are so fast there's
>no point to even optimizing' which is what the above seems to advise in an
>incidental sort of way. It's *extremely* easy to use all the computational
>power available today, and then some. Whether this 'use' is justifiable
>or not is another argument.
I believe there is currently a generation of programmers who, having been
forced to write efficient and modular code for many years, are quietly
revolting by encouraging bloated, use-all-the-power-you-can code. The
resulting products don't scale well, which is another problem with Moore's
Law: the increase in computing power is not scalable, mostly because of
uneven improvements in bus speeds, &c. (I think parallel processing
effectively throws the Law out the window; massively parallel machines
have their own operating paradigm.)
I think efficiency is an important aesthetic
choice. Yet I have argued in the past against using
excessively clever "16-bit reverse-lookup property storage" code when
it requires changing the original design concept. When designing a
mud the computational power should be leveraged to the benefit of the
game first, on how the game operates (as opposed to how cool it looks).
If the design calls for solving quadratic equations to do a player
skill check, optimize the friggin' sqrt() function, don't tell me
to change the skill check system.
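To be concrete about what I mean (a made-up skill formula and a made-up
table, not anybody's actual code): if the check really does boil down to
a square root, a small lookup table keeps the design intact and makes
the cost negligible.

#include <math.h>

/* Hypothetical skill check: success chance tapers off with the square
 * root of the difficulty gap.  The design is untouched; only sqrt() is
 * replaced by a precomputed table for the range the game actually uses. */

#define MAX_GAP 1024
static unsigned char sqrt_table[MAX_GAP + 1];

void init_sqrt_table(void)
{
    int i;
    for (i = 0; i <= MAX_GAP; i++)
        sqrt_table[i] = (unsigned char)(sqrt((double)i) + 0.5);
}

/* Returns a 0-100 percent chance of success; the numbers are arbitrary. */
int skill_check_chance(int skill, int difficulty)
{
    int gap = difficulty - skill;
    int chance;

    if (gap <= 0)
        return 95;                   /* easy check: near-certain success */
    if (gap > MAX_GAP)
        gap = MAX_GAP;
    chance = 95 - 3 * sqrt_table[gap];
    return chance < 5 ? 5 : chance;  /* always leave a sliver of a chance */
}

Call init_sqrt_table() once at boot and each check costs a table lookup
instead of a libm call; the skill system itself never has to know.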
Computational power is underused because people don't know what the
computer is actually doing. Turn off the wallpaper. Disable Java.
This doesn't help high-end users, not until the 3D software people
build in preferences for disabling unwanted features.
>Hrm. Well, most companies I've worked for do have a real problem with not
>sending the computing resources to the folks that really need them, but
>generally the best rigs go to the 3D artists, who most certainly use all
>the power they are given. Even with a nice GLint card, plenty of RAM,
>a multi-processor machine, and all that junk - manipulating a mesh with
>upwards of five or ten thousand polygons, especially with detailed textures,
>is pig-like.
Think of it as giving you something to do with all the time you have
saved. :)
>Your market *is* the folks who run out to buy a top-of-the-line rig
>every couple of years. Actually, this brings up another point - I
>think we lost whether or not Mike was talking about client or server
>software. A mud client is more concerned with video acceleration and
>internet bandwidth than any sort of processing power. A server is worried
>about speed, RAM, and mass storage.
Mike was quite innocently talking about target client machines. I've
taken everything way out of context.
>An example of what I believe Mike was getting at - racking your brain
>trying to come up with killer optimizations is occasionally a huge waste.
>Orion and I spent the first year of our project obsessing over the amount
>of RAM and processor time our mud took. We spent long amounts of time
>trying to squeeze every last bit out of the structures we allocated,
>and building extra lists to speed up some of the game loops. This was because
>I thought it would be running on my 486-33 with 4 megs of RAM. By the
>time we were well into the project, we had it running on a Sparc of some
>sort at the university sporting a nice big RAID drive and multi-hundred
>megabytes of RAM. At that point the fact that our base server took up less
>RAM and processor time than tcsh was only amusing, and not at all useful.
>We ended up going back over and undoing a whole bunch of our optimizations
>that we labored so hard over and replacing them with what we really wanted
>to do in the first place.
Using a RAID is an optimization. You are fortunate to benefit from an
extant hardware solution; had the project been something more than a
single server this probably wouldn't have worked as well. I would
say it is better to over-optimize first and then add features than to
pack in features and request new hardware to run it on. (Of course, the
latter is a common and clever developer solution effective against
under-budgeted projects, and is possibly the strategy of certain mega-
corporations that are strong enough to drive the consumer market. It
works if you already have influence over your audience.)
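For what it's worth, here is roughly what that kind of structure-squeezing
looks like in practice (an invented mob record, not their actual code); it
is exactly the sort of thing that gets undone once the hardware turns out
to be generous:

/* A plain, readable record for a mob (illustrative only). */
struct mob_plain {
    int level;        /* 1-100                          */
    int hitpoints;    /* up to a few tens of thousands  */
    int alignment;    /* -1000..1000                    */
    int flags;        /* a handful of booleans          */
};                    /* 16 bytes on a typical 32-bit box */

/* The "squeeze every last bit" version: the same data in bitfields. */
struct mob_packed {
    unsigned level     : 7;   /* 0-127        */
    unsigned hitpoints : 16;  /* 0-65535      */
    signed   alignment : 11;  /* -1024..1023  */
    unsigned flags     : 6;
};                            /* about half the size, compiler willing */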
>> Third, designing for non-existent technology is a dumb-assed design
>> constraint.
>
>Quit beating around the bush, Brandon. Tell us what you *really* think! :)
>This is an extreme statement. Designing for non-existent technology is
>impossible. Designing for proposed technology that is currently in
>the works isn't much fun and is a gamble (as you say below), but can
>sometimes pay off large rewards. (case in point: Microsoft's first product)
>Designing for current technology with an eye on what the future holds
>is only logical.
I believe the game industry is actually designing for current technology,
but they like to delude themselves and generate consumer buzz by talking
about proposed tech. No one is going to make something for a market that
doesn't exist, because the market that _might_ upgrade their machines next
year is the market that already has the latest machines, i.e. the market
is using current technology and will at minimum be at that same level
12-24 months from now.
>> Productivity has not increased.
>
>I disagree with this quite strongly. Perhaps people are 'lazier' nowadays
>(in the same way that people are lazier since the invention of cars,
>or microwaves), but this doesn't mean that you can't do more with less
>effort. I used to spend dozens of hours trying to come up with
>very simple character animations viewable from a single angle with DPaint.
>Now I can do a good character animation from any angle in a fraction of the
>time with a 3D animation package. Have my skills as an artist improved?
>Hell no. Even
>something simple like the wand tool in modern paint programs reminds me
>of long, tedious hours spent 'cleaning up' images by hand, trying to hunt down
>stray pixels.
>
>This also depends on how you measure productivity. One might say that the
>average artist can create about the same number of art pieces per period
>of time N as they could ten years ago. The difference is that the pieces
>are probably higher-resolution, higher color depth, higher framerate, and
>more easily modifiable. Whether you see this as a huge improvement or
>not (I don't, really) is subjective, but one cannot deny the truth of the
>hard numbers (1280x1024x24 bit vs 320x200x8 bit, for example).
Productivity is tied to the entire work process, so it isn't how much
work an individual can do in one afternoon, but how much work that
one person plus some number of system administrators plus some number
of customer service reps have to do to generate one 1280x1024x24 image.
If after all that you can squeeze out more frames a day with fewer
man-hours than five years ago you are indeed more productive, or rather
you have actually leveraged the computational advantage in your favor.
At this point it all becomes a post-modern dilemma, because the work
wouldn't need to be done if computers didn't exist.
>> (Productivity was hardly even measured before computers entered the
>> workplace, so the argument is moot.)
>
>Productivity is a fundamental yardstick by which any endeavor is measured.
>This is a factor independent of computers, or businesses or humans, for
>that matter. Since computers had such an effect on it (both positive
>(good tools) and negative (internet games, *ahem*)), it became popular to
>try to measure it 'accurately'.
Hardware makers aren't so much interested in "accurate" productivity
measures as they are in "positive" measures. Lest I be taken for a
true conspiracy nut I will pause here.
>> To somehow tie this back to a list-relevant topic: Mike is advocating
>> that product cycles should be targeted towards cutting-edge machines,
>> because cutting-edge is cool? important? profitable? Someone has to
>
>I don't think that's quite what he said...you probably should have quoted
>a bit more. What I got out of it was, "Don't hold yourself back for
>fear of having something completely useless. As long as you don't go
>crazy it's likely you'll have the technology to support it, if not now,
>then soon."
I feel like making a hairline distinction between designing for
technology that will soon exist (faster chips that aren't actually on
the market but have been prototyped) and designing for technology
that doesn't exist at all. The madman who designs for nonexistent
technology is in fact inventing something new.
>That's what you get, and yeah, it's a risk. You do the best you can to
>guess. As with anything like this, taking the safe route and going for
>the lowest common denominator might end up with you either wasting time
>with pointless optimizations, or ending up with a product that appears
>obsolete next to all the others. Like it or not, computer users aren't
>much interested in obsolete programs. Just try to convince any
>hard-core Diablo player that Angband is a much better game along the
>same lines and see what response you get.
So all those repackaged Atari Classics don't make any money? Activision
doesn't make money by slapping the Zork title on new games? True,
some companies don't believe in long-term profit, they hope to make
a quick buck and sell the company to Microsoft before the employees
can unionize. But if someone had some actual numbers as to how many
quick-buck ventures fail, in particular multi-player games, we could
all have a good chuckle.
>> A short list:
>> - having a large and diverse world to explore that can be affected by
>> players
>
>Implies lots of data transmitted from the server. Internet connections
>are certainly getting vastly better in a hurry, but they are still a huge
>bottleneck. As much as I agree with you, I tend to think that anything
>which increases this load is a Bad Idea, at least for right now - especially
>if we're designing for current technologies like you suggest.
Now you've just nixed my design spec instead of trying to solve the
problem. If we design for a maximum network speed of 14.4 (a totally
obsolete speed, eh?) we can encode our content, whatever.
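To sketch what "encode our content" could mean (an illustration I'm
inventing, not a spec): a 14.4 modem moves roughly 1.7K a second, which
is plenty if the server sends compact tokens and the client expands them
from a locally stored string table.

#include <stdio.h>

/* Hypothetical client-side string table, shipped with the client so
 * the server never has to send full descriptions over the wire. */
static const char *desc_table[] = {
    /* 0 */ "You are standing in a %s forest. A path leads %s.",
    /* 1 */ "A %s troll blocks the %s exit.",
};

/* Assumed wire format: "id arg1 arg2", e.g. "0 gloomy north" -- a dozen
 * bytes instead of the hundred-odd bytes of finished prose. */
void render_message(const char *packet, char *out, size_t outlen)
{
    unsigned id;
    char a1[32], a2[32];

    if (sscanf(packet, "%u %31s %31s", &id, a1, a2) == 3 &&
        id < sizeof(desc_table) / sizeof(desc_table[0]))
        snprintf(out, outlen, desc_table[id], a1, a2);  /* expand locally   */
    else
        snprintf(out, outlen, "%s", packet);            /* unknown: show raw */
}

Something like render_message("0 gloomy north", buf, sizeof(buf)) on the
client turns a dozen bytes off the wire into a full room description; the
point isn't this particular scheme, just that the bulky text lives on the
client's disk and the 14.4 link only carries small deltas.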
- Brandon Rickman - ashes@zennet.com -
While I have never previously found a need for a .sig, this
may be considered one for the purposes of this list