package: nx-libs
version: head
In nx-libs/nx-X11/extras/Mesa/src/glx/x11/glxextensions.c the length passed to memset is the size of the pointer server_support (4 bytes on a 32-bit system), not the size of the buffer the pointer points to (8 bytes):
static void
__glXProcessServerString( const struct extension_info * ext,
                          const char * server_string,
                          unsigned char * server_support )
{
   unsigned base;
   unsigned len;

   (void) memset( server_support, 0, sizeof( server_support ) );
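For illustration, here is a small stand-alone sketch (invented names, not the Mesa code) showing that sizeof yields the pointer size inside the callee but the buffer size in the caller:

#include <stdio.h>
#include <string.h>

/* inside the callee the parameter has decayed to a pointer,
 * so sizeof() gives the pointer size, not the buffer size */
static void zero_bits( unsigned char * server_support )
{
   printf( "callee: sizeof(server_support) = %zu (pointer size, 4 on a 32-bit system)\n",
           sizeof( server_support ) );
   (void) memset( server_support, 0, sizeof( server_support ) );
}

int main( void )
{
   unsigned char server_support[8];

   printf( "caller: sizeof(server_support) = %zu (buffer size)\n",
           sizeof( server_support ) );
   zero_bits( server_support );
   return 0;
}

In __glXProcessServerString itself the fix would presumably be to memset the known byte count (e.g. __GL_EXT_BYTES) instead of sizeof(server_support), or to let the callers pass the buffer length in explicitly.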
Furthermore, the size of the memory area that server_support points to is declared in different ways in the code:
#define __GL_EXT_BYTES ((__NUM_GL_EXTS + 7) / 8)
unsigned char server_support[ __GL_EXT_BYTES ];
unsigned char server_support[8];
Currently __NUM_GL_EXTS = 123, so __GL_EXT_BYTES = 8.
What is expected to happen once six more values are added to the unnamed enum that defines the extension bits and __GL_EXT_BYTES exceeds 8?
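Presumably the hard-coded 8-byte buffer would then be too small for the bit field. Declaring every such buffer with __GL_EXT_BYTES, as one of the two places already does, would avoid this; a compile-time check could additionally catch the mismatch. The following is only an illustrative sketch (C11; the enum value and the function name are invented):

enum { __NUM_GL_EXTS = 64 };                  /* value invented for the sketch */
#define __GL_EXT_BYTES ((__NUM_GL_EXTS + 7) / 8)

void some_caller( void )                      /* hypothetical call site */
{
   unsigned char server_support[8];           /* the hard-coded size from the report */
   (void) server_support;

   /* C11: the build fails as soon as the enum grows past 64 entries */
   _Static_assert( sizeof( server_support ) >= __GL_EXT_BYTES,
                   "server_support is smaller than __GL_EXT_BYTES" );
}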
This questionable code was identified with cppcheck (http://cppcheck.sourceforge.net/).
Best regards
Heinrich Schuchardt