Hacker News

I don't mean to sound dismissive, but a lot of what you just mentioned comes from trying to use OpenGL in a way it wasn't intended.

OGL is meant for 3D work--if you want a framebuffer, an SDL_Surface or some native GUI element is probably a better fit for manual CPU hackery. The reason you find it so uncomfortable is that you're trying to use a refinery to bake a waffle. Textures, framebuffers, and polygons are all central to how 3D graphics (and thus OGL) work--they take some work to understand, but it's not wasted work if you're doing 3D pipeline stuff.

If you're trying to do simple operations, you probably shouldn't be using OGL directly. For debugging, RenderMonkey was quite a good tool.

Lacking modern language features makes keeping a clean C ABI easier. Because the constants are plain #defines over integer types, it takes very little work for any language to invoke the API. Using enums, or even better, strongly-typed/templated arguments à la C++, would harm that portability.

The many variants of the API functions are actually quite useful under certain circumstances, and it is always pretty clear how they are used (and again, remember that this is meant for a C ABI).

~

I agree that a simpler environment (like Processing, mentioned above) would be helpful. That said, OGL is hardly bad for the reasons you listed.



