Each of the definitions of "object oriented programming" tossed about on this wiki and in the literature necessarily (and often quite purposefully) excludes one or more languages that are widely considered to be OO languages. We contend that "object oriented" has long been a meaningless term, and that attempts at a precise definition at this late date will satisfy only those with a partisan agenda, while a broad definition provides no value.
Let's just give "object oriented" away. It's not doing us any good anymore.
The set of languages considered object oriented includes such odd bedfellows as EiffelLanguage and JavaScript, which have far less in common with one another than they do with the non-OO languages AdaLanguage and ToolCommandLanguage, respectively. The grouping, in other words, is fairly arbitrary. If, however, we look at these languages in terms of features, more useful distinctions become available.
Discussions of whether or not a given language is object oriented are right out. It is much more fruitful, for instance, to discuss polymorphism in C++ vs polymorphism in Smalltalk than it is to discuss which of these is or is not an object oriented language. Polymorphism, to continue the example, can be reasonably defined, and is specific enough that it can be reasoned about.
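To illustrate why polymorphism is specific enough to reason about, here is a minimal C++ sketch of subtype polymorphism via virtual dispatch. The Shape/Circle/Square names are invented for the example; the point is only the precisely statable property that the call is resolved by the object's dynamic type.

 #include <iostream>
 #include <memory>
 #include <vector>

 // The base class declares the interface; derived classes override it.
 class Shape {
 public:
     virtual ~Shape() = default;
     virtual double area() const = 0;
 };

 class Circle : public Shape {
     double radius;
 public:
     explicit Circle(double r) : radius(r) {}
     double area() const override { return 3.14159265358979 * radius * radius; }
 };

 class Square : public Shape {
     double side;
 public:
     explicit Square(double s) : side(s) {}
     double area() const override { return side * side; }
 };

 int main() {
     std::vector<std::unique_ptr<Shape>> shapes;
     shapes.push_back(std::make_unique<Circle>(1.0));
     shapes.push_back(std::make_unique<Square>(2.0));
     // Each call is resolved at run time by the dynamic type of *s,
     // not by the static type Shape. That is a claim one can state
     // and verify independently of any definition of "object oriented".
     for (const auto& s : shapes)
         std::cout << s->area() << '\n';
 }

Smalltalk offers the analogous property through message sends with no static types at all, which makes for a concrete, feature-level comparison between the two languages rather than an argument over the label.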
Based on past debates, I don't think it will be that smooth.
I think this is similar to the DefinitionOfLife battles. Rather than try to find a perfect definition, maybe treat OO as a list of features or tendencies. Something is considered OO (or OO-ish) if it has enough of the features.
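A toy way to make "enough of the features" concrete: score a language against a checklist and apply a threshold. The feature list and the threshold below are arbitrary assumptions chosen for illustration, not an agreed standard.

 #include <bitset>
 #include <cstddef>
 #include <iostream>

 // Commonly cited OO features (illustrative, not exhaustive).
 enum OoFeature { Encapsulation, Inheritance, SubtypePolymorphism,
                  MessagePassing, ClassesOrPrototypes, OoFeatureCount };

 // "OO-ish" if the language exhibits at least `threshold` of the features.
 bool looksObjectOriented(const std::bitset<OoFeatureCount>& features,
                          std::size_t threshold = 3) {
     return features.count() >= threshold;
 }

 int main() {
     std::bitset<OoFeatureCount> lang;   // hypothetical scoring of some language
     lang.set(Encapsulation);
     lang.set(SubtypePolymorphism);
     lang.set(MessagePassing);
     std::cout << std::boolalpha << looksObjectOriented(lang) << '\n';  // true
 }

Of course, which features go on the list and where the threshold sits are exactly the sort of thing people will argue about, so this trades one debate for a smaller, more tractable one.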
See also ReesOnObjectOrientedFeatures, OoLacksMathArgument, AlanKaysDefinitionOfObjectOriented, DefinitionsForOo