There’s a subtle pointer arithmetic bug I’ve seen a few times recently in various C modules. It stems from misunderstanding the difference between the sizeof operator and the ARRAYSIZE macro (on Windows, at least), and perhaps from a shaky grasp of pointer arithmetic itself.
#define cARRAY_ITEMS 1000
DWORD *rgPtrs [cARRAY_ITEMS];
In the snippet above, sizeof(rgPtrs) yields the size of the whole array in bytes, which depends on the pointer size of the host machine. In a 32-bit x86 build, the total size is 4 * 1000 = 4000 bytes. In a 64-bit build, it’d be twice that: 8 * 1000 = 8000 bytes.
Regardless of the host architecture, we expect ARRAYSIZE(rgPtrs) to evaluate to 1000. Depending on your build environment, the macro may ultimately expand to the following (from the latest Windows SDK version of winnt.h):
#define RTL_NUMBER_OF_V1(A) (sizeof(A)/sizeof((A)[0]))
Finally, note that the difference between two pointers (with some caveats; see your compiler documentation) is defined as a count of array elements, not bytes, unless it happens to be an array of a byte-sized type.
Here’s some sample code with this type of problem. Can you spot it?
#define cbVALUE 256
#define cSAVED 10
typedef struct _WIREDATA {
BYTE rgbValue [cbVALUE];
} WIREDATA, *PWIREDATA;
WIREDATA rgbSaved [cSAVED];
PWIREDATA pCurrent = pStart;
// Handle some number of TLVs
pCurrent += XXX;
if (pEnd - pCurrent < sizeof(rgbSaved))
memcpy(rgbSaved, pCurrent, sizeof(WIREDATA) * (pEnd - pCurrent));