Re: Scalable games
Stephen J Baker wrote:
>
>
> I think it's increasingly necessary to pre-profile the hardware on
> program start-up by doing things like drawing a gazillion tiny
> triangles (to measure the approximate polygon throughput) and a
> few hundred full-screen sized polygons (to measure approximate
> fill rate performance) and then using those numbers to tune the
> model and rendering algorithms ahead of time.
I agree with this. In fact, during a discussion with one of the
designers, I suggested something like a "benchmark" to check out the
current machine's capabilities, such as poly count and fill rate. But
why not extend this beyond graphics alone? Why shouldn't we run tests
for AI, physics, or whatever else, with each subsystem designed to work
at varying levels of complexity? What I mean is: the design should agree
on what the bare minimum is and what the "dream settings" would be, and
let the engine interpolate in between. A rough sketch of what I have in
mind follows below.
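
Something like this, maybe. The draw calls are just empty stubs standing
in for whatever the real rendering API provides, and every threshold and
setting here is an invented number, purely to show the shape of the idea:

#include <chrono>
#include <cstdio>

/* Placeholder draw routines -- in a real engine these would submit
   geometry through the rendering API and wait for the GPU to finish.
   The same timing harness could wrap an AI pathfinding stress test
   or a physics pile of boxes, per subsystem. */
static void drawTinyTriangles(int count)  { (void)count; }
static void drawFullScreenQuads(int count) { (void)count; }

/* Time one benchmark pass, in seconds. */
static double timeRun(void (*run)(int), int count)
{
    auto t0 = std::chrono::steady_clock::now();
    run(count);
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}

/* Map a measured rate onto [0,1] between the agreed "bare minimum"
   and "dream" rates. */
static double score(double rate, double minRate, double dreamRate)
{
    double s = (rate - minRate) / (dreamRate - minRate);
    if (s < 0.0) s = 0.0;
    if (s > 1.0) s = 1.0;
    return s;
}

static double lerp(double lo, double hi, double s)
{
    return lo + (hi - lo) * s;
}

int main(void)
{
    const int kTris  = 1000000;  /* a gazillion tiny triangles       */
    const int kQuads = 200;      /* a few hundred full-screen quads   */

    double triRate  = kTris  / timeRun(drawTinyTriangles,  kTris);
    double fillRate = kQuads / timeRun(drawFullScreenQuads, kQuads);

    /* Invented min/dream thresholds, just for illustration. */
    double polyScore = score(triRate,  1.0e5, 1.0e7);
    double fillScore = score(fillRate, 30.0,  600.0);

    /* Default settings interpolated between minimum and "dream". */
    printf("geometry LOD factor: %.2f\n", lerp(0.0, 1.0, polyScore));
    printf("particles/effect:    %.0f\n", lerp(50.0, 2000.0, fillScore));
    return 0;
}

The point is that each subsystem only has to publish its two endpoints;
the benchmark scores pick the spot in between.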
>
> Providing those are only setting defaults and you allow the user
> to tune them in the 'Options' screen, that should be OK. Remember,
> some gamers live by 'twitch response' and need insanely high frame
> rates whilst others may be enjoying the scenery and keenly observing
> small details in the scene for clues to something or other.
Of course, we shouldn't take away the user's right to tune things
manually, but if the engine were smart enough, we could even point out
the potential performance/detail trade-offs of whatever settings the
user chooses.
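
For instance, reusing the rates measured by the start-up benchmark, the
options screen could estimate a frame time for the chosen settings and
warn about it. The cost model below is completely made up, just to
illustrate:

#include <cstdio>

/* Very crude cost model: estimated frame time from the chosen settings,
   using rates measured by the start-up benchmark. All numbers invented. */
static double estimatedFrameTime(double trisPerFrame, double triRate,
                                 double screensOfOverdraw, double fillRate)
{
    return trisPerFrame / triRate + screensOfOverdraw / fillRate;
}

int main(void)
{
    double triRate  = 2.0e6;  /* triangles/second, from the benchmark    */
    double fillRate = 300.0;  /* full screens/second, from the benchmark */

    double t = estimatedFrameTime(80000.0, triRate, 4.0, fillRate);
    if (t > 1.0 / 30.0)
        printf("Warning: these settings may drop below 30 fps "
               "(estimated %.0f fps).\n", 1.0 / t);
    return 0;
}

That way the twitch players get their warning before they commit to the
scenery-lovers' settings, and vice versa.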
Miguel A. Osorio.