GDK_GL_DEPTH_SIZE value of GtkGlArea widget
- To: linuxgames@sunsite.dk
- Subject: GDK_GL_DEPTH_SIZE value of GtkGlArea widget
- From: Abhir Joshi <abhir@vsnl.net>
- Date: Fri, 5 Dec 2003 13:49:54 +0530
- Delivered-to: archiver@seul.org
- Delivered-to: mailing list linuxgames@sunsite.dk
- Delivery-date: Fri, 05 Dec 2003 03:15:47 -0500
- Mailing-list: contact linuxgames-help@sunsite.dk; run by ezmlm
- Reply-to: linuxgames@sunsite.dk
- User-agent: Mutt/1.2.5i
While creating a GtkGlArea widget, a number of attributes need to be
specified. One of them is GDK_GL_DEPTH_SIZE, and I still don't know
what exactly it means. On my machine (which has no hardware
acceleration), if I set this value to 16, surfaces are not rendered
properly: some hidden parts show through and some visible parts get
hidden, and the result seems to depend on the order in which the
surfaces are rendered. If I increase the depth size to 24, things
get a bit better, and if I set the value to around 100, the surfaces
are rendered perfectly.
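For reference, a minimal sketch of how the widget gets created with
the usual gtkglarea attribute list (the values here are only an
example of the setup I mean, not my full code):

    #include <gtkgl/gtkglarea.h>

    GtkWidget *create_area(void)
    {
        int attrlist[] = {
            GDK_GL_RGBA,
            GDK_GL_DOUBLEBUFFER,
            GDK_GL_DEPTH_SIZE, 16,  /* the value in question */
            GDK_GL_NONE
        };
        /* returns NULL if no matching visual can be found */
        return gtk_gl_area_new(attrlist);
    }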
Now when I run the same program on a machine with an i810
motherboard and h/w acceleration, with the depth size set to 16, the
surfaces again aren't rendered properly - exactly what happens on my
machine. But if I increase the depth value to anything above 16, it
fails to create the GtkGlArea widget at all. The X display depth is
16 in all of these cases.
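In case it helps diagnose this: once a GL context is current (with
gtkglarea, after gtk_gl_area_make_current() in the "realize"
handler - the exact call site here is just an assumption on my
part), the depth buffer the context actually received can be
queried:

    #include <stdio.h>
    #include <GL/gl.h>

    static void report_depth_bits(void)
    {
        GLint bits = 0;
        /* size of the depth buffer actually granted, which may be
           larger than the minimum that was requested */
        glGetIntegerv(GL_DEPTH_BITS, &bits);
        printf("depth buffer: %d bits\n", (int) bits);
    }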
When I use GLUT directly, again with a depth of 16, there are no
problems at all on machines with or without h/w acceleration.
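For comparison, this is the kind of plain GLUT setup I mean (a
sketch, not my actual program; note that GLUT_DEPTH only asks for
some depth buffer, without pinning its size):

    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
        glutCreateWindow("depth test");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }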
What could be going wrong? Am I missing some other parameter?
--
Abhir Joshi
http://education.vsnl.com/abhir/