Re: [linuxgames] Poll: Video Access
Francesco Orsenigo wrote:
> Adam D. Moss:
>> If these are features that you want, go for it. There are some
>> issues with turning arbitrary pixel data into textures (standard
>> OpenGL textures are limited in maximum size [there are still
>> surviving cards with a 256x256-pixel upper limit] and to
>> power-of-two dimensions), but this can all be worked around (at
>> worst you have to split your image across several textures, taking
>> aliasing at the seams into account, and stitch them back together
>> when it comes to drawing).
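(An aside on that splitting: the arithmetic is trivial, something like the
sketch below -- the function names are mine, and 256 is just the worst-case
limit mentioned above, not a constant you should hard-code:)

```c
/* Round up to the next power of two (e.g. 200 -> 256), for padding an
 * arbitrary image up to a legal texture size. */
static unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

/* How many max_size x max_size tiles are needed to hold a w x h image
 * when it has to be split across several textures? */
static unsigned tiles_needed(unsigned w, unsigned h, unsigned max_size)
{
    unsigned tx = (w + max_size - 1) / max_size;
    unsigned ty = (h + max_size - 1) / max_size;
    return tx * ty;
}
```

For example, a 640x480 background on a card with a 256-pixel limit ends up
as a 3x2 grid of six tiles.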
> How can I estimate the texture memory usage?
System memory or graphics card memory? It's hard to pin down
which (and how much of each) will be used, as it's largely
down to the driver and the capabilities of the card.
> I've always supposed 4 bytes per pixel (RGBA)...
> If so, memory usage may grow very quickly...
It depends. You can ask OpenGL to use a specific internal
representation if you know what space/quality tradeoff you're
aiming for with a particular texture -- there are many
OpenGL-internal formats you can ask for, including RGBA packed
2/2/2/2 into the bits of one byte -- but you won't always get
exactly what you ask for.
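For a back-of-envelope estimate, 4 bytes per pixel is the safe pessimistic
assumption, and a full mipmap chain adds roughly a third on top. A sketch
(the helper is mine, and the numbers are estimates, not anything the driver
promises):

```c
/* Rough texture-memory estimate -- a sketch only; actual residency and
 * layout are up to the driver, and the internal format is only a hint. */
static unsigned long tex_bytes(unsigned w, unsigned h,
                               unsigned bytes_per_pixel,
                               int mipmapped)
{
    unsigned long base = (unsigned long)w * h * bytes_per_pixel;
    /* A full mipmap chain adds roughly 1/3 over the base level. */
    return mipmapped ? base + base / 3 : base;
}
```

So a 256x256 RGBA8 texture is about 256KB without mipmaps, while the same
image packed 2/2/2/2 into one byte per pixel would be about 64KB -- if the
driver honours the request.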
Generally the attitude of 'don't worry about it, but don't
go crazy-generous on texture sizes' works okay for most
OpenGL developers. Drivers usually have pretty good texture
caching policies; the most important issue is to not *use*
too many textures in the space of a couple of frames (plus
texture-sorting within a frame is a win). Simply *creating* a
lot of textures isn't that big a deal (but if the set is
unbounded or huge then you still need to manage it at the
application level).
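By 'texture-sorting' I just mean ordering your per-frame draw list so that
sprites sharing a texture are drawn consecutively, minimising glBindTexture
churn. Something like this (the struct and names are mine, purely
illustrative):

```c
#include <stdlib.h>

struct sprite {
    unsigned tex_id;   /* GL texture name this sprite uses */
    float x, y;        /* where to draw it */
};

static int by_texture(const void *a, const void *b)
{
    const struct sprite *sa = a, *sb = b;
    return (sa->tex_id > sb->tex_id) - (sa->tex_id < sb->tex_id);
}

/* Sort so consecutive sprites share textures; then the draw loop only
 * needs to bind a texture when tex_id changes. */
static void sort_sprites(struct sprite *s, size_t n)
{
    qsort(s, n, sizeof *s, by_texture);
}
```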
> It goes back to the give-player-color-to-unit issue: I must store the
> sprite twice: the normal sprite and the areas that must be colored...
> Is it possible to achieve this just by playing with the alpha channel
> or a similar hack?
I don't remember what answers you got originally. I'm sure that
someone would have suggested paletted textures (though they're
somewhat deprecated). I presume that you need your alpha channel
for the sprite's real outline -- sure, everyone loves masked
sprites rather than tiles.
If you want to get fancy and really want to encode all this
info in a single texture to work on all OpenGL implementations,
I think you could use a single alpha channel to encode both the
outline-mask and the tinting-levels by:
1) Initially set up your sprites with the areas to be colourized as
(generally) white or light grey. The alpha should be 255/255
(or whatever range you're requesting for the internal
representation's alpha) for the areas that you want NOT to be
drawn at all (this is the opposite of what you'd normally expect
from alpha). The alpha should be 0/255 -> 254/255 for
the amount of colour tinting wanted for an area, from 'none'
to 'very high', so you get graduated tinting as a bonus
(don't say I never give you nothin').
Set glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE )
for such textures.
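In other words, step 1 bakes both the draw-mask and the tint strength into
the single alpha byte. A sketch of the packing, assuming 8-bit RGBA (the
helper name is mine):

```c
/* Pack one sprite pixel for the scheme above: alpha 255 marks
 * "don't draw this pixel at all"; alpha 0..254 is the tint strength
 * (0 = no tint, 254 = maximum tint). */
static void pack_pixel(unsigned char rgba[4],
                       unsigned char r, unsigned char g, unsigned char b,
                       int visible, unsigned char tint /* 0..254 */)
{
    rgba[0] = r;
    rgba[1] = g;
    rgba[2] = b;
    rgba[3] = visible ? (unsigned char)(tint > 254 ? 254 : tint) : 255;
}
```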
2) Disable alpha blending (use, say, glBlendFunc(GL_ONE, GL_ZERO)) but
enable alpha testing, with a test function that DISCARDS pixels
with alpha > 254 (e.g. glAlphaFunc(GL_LEQUAL, 254.0f/255.0f)),
for the first pass (non-tinted, outline-masked).
Draw sprite, without vertex colouring (use all-white vertex colours).
3) Draw the sprite a second time with the same alpha test
function, vertex-colours set to the colour of the overall
tint (team colour, or whatever), and an alpha-using
framebuffer blend function (say
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA)).
There are just *slightly* less tortuous methods you could use,
I think, but they'd assume multitexturing hardware or Z-tricks;
this way (though I haven't tried it; I may have goofed) should
work on everything...
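To convince myself the arithmetic works, here are the two passes' per-pixel
blends simulated in plain C (values in 0..1, helper name mine): a pixel with
alpha 255/255 is discarded, alpha 0 comes out untinted, and anything in
between gets the graduated blend.

```c
/* Simulate the two-pass scheme for one pixel. Returns 1 if the pixel
 * survives the alpha test, filling out[3] with the final colour. */
static int two_pass_pixel(const float tex[4],  /* sprite texel, RGBA */
                          const float tint[3], /* team colour */
                          float out[3])
{
    float a = tex[3];
    /* Alpha test (both passes): discard alpha > 254/255, the
     * "not drawn at all" marker. */
    if (a > 254.0f / 255.0f)
        return 0;
    /* Pass 1: glBlendFunc(GL_ONE, GL_ZERO), white vertex colour, so
     * the framebuffer just gets the texel colour. */
    float fb[3] = { tex[0], tex[1], tex[2] };
    /* Pass 2: GL_MODULATE makes the source colour tint * texel, and
     * glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) blends it over
     * the first pass by the texel's alpha. */
    for (int i = 0; i < 3; i++)
        out[i] = (tint[i] * tex[i]) * a + fb[i] * (1.0f - a);
    return 1;
}
```

With a white texel and a red tint, alpha 0 yields pure white and alpha
0.5 yields a half-red blend, which is exactly the graduated tinting
described above.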
--Adam
--
Adam D. Moss . ,,^^ adam@gimp.org http://www.foxbox.org/ co:3