An industry-standard 3D graphics API and rendering model originally based on SGI's GL. See http://www.opengl.org/ for the specifications and the list of which companies are on the architecture review board. There are quality implementations available for any platform of interest. Free (FreeAsInSpeech) implementations have rivalled many software-only proprietary implementations on workstations for years, and now Linux- and BSD-based PCs are beginning to see implementations that efficiently use their hardware.
OpenGL has been used in high-end graphics systems for CAD and scientific visualization for nearly ten years. Now that the low-end of the computing spectrum is surpassing the high-end of even five years ago, OpenGL is seeing more widespread use. Game programmers, inspired by JohnCarmack's evangelism, are finding that the OpenGL rendering model delivers high performance for their applications on PCs. Then they discover that it's also a cross-platform standard and realize it's wonderful.
The downside of OpenGL is the relatively painful standardization process for new extensions. Some say that rapidly changing graphics hardware and philosophies are straining the process. Others point to the standardized extension system and say that's the way it should be. Various new texturing operations, and the desire to combine multiple texture passes into one, sometimes drive people to non-standardized APIs like 3Dfx's Glide rather than the OpenGL extension system.
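For what it's worth, here is a minimal sketch (in C, assuming a current GL context) of how the standardized extension mechanism gets used in practice: query GL_EXTENSIONS and branch on what the driver advertises. GL_ARB_multitexture is only an illustrative choice, not a claim about any particular driver.

 #include <GL/gl.h>
 #include <string.h>

 /* Rough check for a name in the space-separated GL_EXTENSIONS string.
  * Assumes a GL context is current; a careful check would tokenize the
  * string so one extension name can't match inside another. */
 static int has_gl_extension(const char *name)
 {
     const char *ext = (const char *) glGetString(GL_EXTENSIONS);
     return ext != NULL && strstr(ext, name) != NULL;
 }

 /* Prefer the standardized ARB extension when advertised, otherwise
  * fall back to one rendering pass per texture layer. */
 void choose_texturing_path(void)
 {
     if (has_gl_extension("GL_ARB_multitexture")) {
         /* combine texture stages via glActiveTextureARB(...) */
     } else {
         /* draw the geometry once per texture layer and blend */
     }
 }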
Note on PC 3D hardware: it typically handles only one graphics context, as opposed to the nearly limitless contexts on systems like SGI's Onyx machines. This is reasonable for a typical PC application, but people looking for a cheap substitute for the high-end systems (where you sometimes display dozens of models in separate views) are often disappointed.
We were writing a television broadcast system where we went through a DirectDraw interface clone to a $6000 USD NTSC/PAL output graphics card. Broadcast customers expect proper alpha blending between overlays in order to do nifty alpha transitions, little icons, "bugs", or whatever. Weirdly, we found that the alpha from an overlay higher in z-order (closer to the viewer) "bled" through to the overlays below, which was especially apparent over a region of zero opacity "on screen" (left there to allow a realtime video layer to show through).
Judicious flipping of the various DirectDraw alpha-related flags the clone had implemented revealed nothing of use, especially given that dang video layer. A telephone call down to our extremely helpful counterpart revealed that the actual rendering was done with an underpowered OpenGL chip. He had set the alpha to the "traditional" (1 - alpha) blend (someone can fill this in with the actual macro; it's been several months, sorry). This, however, is merely "good enough" alpha blending, not correct -- correct in the sense of "this is what photons would do."
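For reference, the "traditional" setting described above is presumably the conventional non-premultiplied blend, glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); that's an inference from the "(1 - alpha)" description, not the original macro. A sketch of what it does per channel, assuming a current GL context:

 #include <GL/gl.h>

 /* Assumed configuration of the chip described above.  Per channel:
  *   color: C_out = A_src * C_src + (1 - A_src) * C_dst   (looks fine)
  *   alpha: A_out = A_src * A_src + (1 - A_src) * A_dst   (wrong: not "over")
  * The bad destination alpha is what shows up as "bleed" once the
  * framebuffer alpha is keyed against the realtime video layer. */
 void set_traditional_blend(void)
 {
     glEnable(GL_BLEND);
     glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
 }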
We spent three hours consuming the largest whiteboard in the building (the kitchen--poor marketing people couldn't grok the Greek) calculating the correct alpha formula and the correct OpenGL matrix ordering. /* Stupidly, OpenGL requires two passes to blend alpha correctly. What the hell! Blew the chip's mind at 60 fps. DirectDraw was nice though. -- SunirShah */
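For the curious, the "correct alpha formula" here is presumably the standard Porter-Duff "over" operator: C_out = A_src * C_src + (1 - A_src) * C_dst for color and A_out = A_src + (1 - A_src) * A_dst for alpha. If the source colors are premultiplied by their alpha, both channels come out right in a single pass; without premultiplication, later OpenGL (glBlendFuncSeparate, promoted in 1.4) can split the color and alpha factors, but hardware of that era couldn't, hence the two passes. A sketch, assuming a current context and premultiplied source colors:

 #include <GL/gl.h>

 /* Porter-Duff "over" in one pass, assuming premultiplied source colors
  * (C_src already multiplied by A_src):
  *   color: C_out = C_src + (1 - A_src) * C_dst
  *   alpha: A_out = A_src + (1 - A_src) * A_dst
  * Both channels are now correct, so stacked overlays composite properly
  * and the framebuffer alpha can safely key over live video. */
 void set_premultiplied_over_blend(void)
 {
     glEnable(GL_BLEND);
     glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
 }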
On the flip side, it'd work just fine on a $1M SGI box. The original designers of OpenGL took a bit of a purist stance on some things... Alpha blending is a cheap hack around real radiosity-based (or RayPool?, now) methods. At least in 3D. 2D is a bit different...
See also DirectXversusOpenGl.
See also MathGl for mathematical application of OpenGl.