I was looking up the typedef of GLsizei in the OpenGL ES 1.1 implementation on iOS and was surprised to find that it is defined as int. Some quick googling showed that this is normal (including for regular OpenGL).
I was expecting it to be defined as unsigned int or size_t. Why is it defined as a plain int?
It seems unlikely to be a problem unless you have a 4GB data structure.
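For reference, the definition in question really is just a one-line typedef. A sketch of what the OpenGL ES headers contain (the exact header and path vary by platform and SDK, so treat this as illustrative):

```c
/* GLsizei is a plain signed int, not an unsigned type or size_t. */
typedef int GLsizei;
```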
Here is an answer from someone: http://oss.sgi.com/archives/ogl-sample/2005-07/msg00003.html
Quote:
(1) Arithmetic on unsigned values in C doesn't always yield intuitively
correct results (e.g. width1-width2 is positive when width1<width2).
Compilers offer varying degrees of diagnosis when unsigned ints appear
to be misused. Making sizei a signed type eliminates many sources of
semantic error and some irrelevant diagnostics from the compilers. (At
the cost of reducing the range of sizei, of course, but for the places
sizei is used that's rarely a problem.)
(2) Some languages that support OpenGL bindings lack (lacked? not sure
about present versions of Fortran) unsigned types, so by sticking to
signed types as much as possible there would be fewer problems using
OpenGL in those languages.
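Point (1) in the quote is easy to reproduce. Here is a minimal C sketch (my own illustration, not anything from the OpenGL sources) of how subtracting two widths behaves with an unsigned type versus a signed one:

```c
#include <stdio.h>

int main(void)
{
    unsigned int width1 = 100, width2 = 200;
    int s_width1 = 100, s_width2 = 200;

    /* Unsigned arithmetic wraps around: 100u - 200u becomes a huge
       positive value (UINT_MAX - 99), which compares as "greater than
       zero" even though width1 < width2. */
    unsigned int u_diff = width1 - width2;
    printf("unsigned: %u (positive? %d)\n", u_diff, u_diff > 0);

    /* With a signed, GLsizei-style type the difference is simply -100,
       which is what callers intuitively expect. */
    int s_diff = s_width1 - s_width2;
    printf("signed:   %d (positive? %d)\n", s_diff, s_diff > 0);

    return 0;
}
```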
Both explanations seem plausible. I have been bitten more than once trying to use NSUInteger as a loop counter (hint: don't do that, especially when counting down towards zero).
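That loop-counter trap looks like this in practice. A sketch using a plain unsigned int counter; NSUInteger is also an unsigned type, so it misbehaves the same way:

```c
#include <stdio.h>

int main(void)
{
    /* Intended: count 3, 2, 1, 0 and stop. Because i is unsigned,
       i >= 0 is always true; when i reaches 0 the next --i wraps to
       UINT_MAX, so without the extra break this loop would never end. */
    for (unsigned int i = 3; i >= 0; --i) {   /* BUG: condition is always true */
        printf("unsigned counter: %u\n", i);
        if (i == 0) break;                    /* guard so the demo terminates */
    }

    /* A signed counter ends exactly where you expect. */
    for (int i = 3; i >= 0; --i) {
        printf("signed counter:   %d\n", i);
    }
    return 0;
}
```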