

Loup Vaillant

http://loup-vaillant.fr/articles/taboo-oo references:

We should stop using the term "Object Oriented".

I see two reasons: "OO" has many definitions, and we just can't pick one. For instance, by Alan Kay's definition, Erlang, a functional concurrent language, is more OO than C++. This alone should warrant a blanket ban on the term. There is a largely accepted definition of OO: roughly the style encouraged by Java (classes, encapsulation, inheritance, subtype polymorphism, and even sometimes genericity). However, while this sounds like typical OO, in practice it's little more than syntactic sugar over plain old procedural programming. Hardly a whole new paradigm.
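The "syntactic sugar over procedural programming" claim can be made concrete with a minimal sketch (the counter example below is hypothetical, not from the source): the class version and the procedural version do the same work, the class merely groups the data and the function under one name.

```python
# Procedural style: data in a plain record, functions kept outside it.
def make_counter(start):
    return {"count": start}

def increment(counter, step=1):
    counter["count"] += step
    return counter["count"]

# "OO" style: the same data and the same function, grouped by a class.
class Counter:
    def __init__(self, start):
        self.count = start

    def increment(self, step=1):
        self.count += step
        return self.count

c1 = make_counter(10)
c2 = Counter(10)
assert increment(c1) == c2.increment() == 11
```

Mechanically translating one form into the other changes the call syntax, not the computation.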

On one hand, I have overwhelming evidence that "OO" is useless in technical discussions. No one readily knows what each instance of that term is intended to mean, and when we do, it generally isn't distinctive enough. On the other hand, many programmers are still uttering that term with a straight face. They're still using it to describe programming languages and practices, to assess their merits, and to address their flaws.

Are they all nuts?

For quite some time, I refused to come to that conclusion, because it sounded suspiciously like some "they're all wrong, I'll show them" syndrome. Let's face it: the prior probability that a majority of professionals have such a basic misconception about their own field is small. I was confused. So I looked. I asked. And I finally found the answer: they are nuts. Or at least somehow incapable of changing their minds in the face of overwhelming evidence.

It took me some time to understand what was going on. I now see several reasons. One reason has to do with the old riddle: "If a tree falls in the forest, and no one hears it, does it make a sound?". Everyone should agree that in such a case there will be acoustic vibrations but no auditory experience, and be done with it. The facts have all been laid out, after all. Nevertheless, many can still feel the nagging question: "Okay, but does it make a sound?". This feels like a real question. Actually, it's a mere dispute over the meaning of "sound".

The same happens with "OO": we know, for instance, that C++ has classes and inheritance, but also primitive types, and no "top" class. We should be done with it, but since we have the label "OO", whether C++ is OO or not feels like a real question. Actually, it's a mere dispute over the meaning of "OO". OO is not something that exists independently of us, of which there is a natural definition we could find. We just failed to pick a definition among those we made up.

Another reason has to do with the aura of mystery that surrounds "OO". I often hear "OO is hard"; "Few really get OO". Wait a minute, are you saying this because you know OO, or because you don't? If you do know OO, if you actually got it, then I expect a technical explanation, so I can at least tell whether any given program is OO or not, and to what extent. It may be hard, but I'm sufficiently full of myself to think I can take it.

If you don't, please do not treat your ignorance as strong evidence that OO is actually hard. It may be true, but there are other possible causes: lack of effort, poor teachers, bad books, or lack of a key insight. Sure, "OO is hard" is very convenient: it sounds wise, and often prevents further questions. It doesn't explain anything, however. Parroting it back will only sustain the aura of mystery around "OO".

Yet another reason is the popularity of languages that brand themselves "OO": C++, Java, Python, Javascript, Self… New ones pop up daily. That doesn't help define "OO", but it does make it ever more present. All these people around OO must be onto something, right? Upon reflection, I think not. I suspect this is just some kind of network effect.

Conclusion: OO is not some arcane knowledge that most of us don't "get". "OO" doesn't intrinsically have a definition that we merely have a hard time finding. It has become a mere marketing term wrapped in a fake aura of mystery, completely unsuitable for serious discussion.

Loup Vaillant: deaths of OOP

http://yudkowsky.net/rational/technical This essay is meant for a reader who has attained a firm grasp of Bayes' Theorem. An introduction to Bayes' Theorem may be found at An Intuitive Explanation of Bayesian Reasoning. You should easily recognize, and intuitively understand, the concepts "prior probability", "posterior probability", "likelihood ratio", and "odds ratio". This essay is intended as a sequel to the Intuitive Explanation, but you might skip that introduction if you are already thoroughly Bayesian. Where the Intuitive Explanation focused on providing a firm grasp of Bayesian basics, the Technical Explanation builds, on a Bayesian foundation, theses about human rationality and philosophy of science. The Intuitive Explanation of Bayesian Reasoning promised that mastery of addition, multiplication, and division would be sufficient background, with no subtraction required. To this the Technical Explanation of Technical Explanation adds logarithms. The math is simple, but necessary, and it appears first in the order of exposition. Some pictures may not be drawn with words alone.


http://loup-vaillant.fr/articles/deaths-of-oop

http://loup-vaillant.fr/articles/classes-suck , http://loup-vaillant.fr/articles/classes-as-syntactic-sugar

http://loup-vaillant.fr/articles/taboo-oo , http://wiki.c2.com/?DefinitionsForOo , http://wiki.c2.com/?NobodyAgreesOnWhatOoIs

The five structural and three behavioral attributes of OOP, as given by Deborah J. Armstrong, are difficult to parse because they are not related to the fundamental function of hardware (mapping inputs to outputs), nor to Dave Acton's premise that hardware is the platform, not software.


Abstraction: Representing reality in a simplified form by removing certain distinctions so that we can see the commonalities.

Hardware has data and functions; what is being removed?

Class: A generalised description of similar objects that share a common structure and behaviour.

Behaviour and structure as metaphor for what?

Encapsulation: Data and behaviour are defined within an object and separated from everything else, protecting the internal representation of the object.

Clarify the metaphors in terms of how inputs are mapped to outputs on the hardware.
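One way to answer that question in input/output terms is to treat every method call as a function whose hidden extra input and output is the object's internal state. A minimal sketch (the BankAccount class is a hypothetical illustration, not from the source):

```python
class BankAccount:
    """Encapsulation: _balance is the internal representation;
    outside code interacts with it only through methods."""

    def __init__(self, balance=0):
        self._balance = balance  # hidden state: an input and output of every method

    def deposit(self, amount):
        # Maps (amount, old balance) -> (no return value, new balance).
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self):
        # Maps (old balance) -> (old balance, state unchanged).
        return self._balance

acct = BankAccount(100)
acct.deposit(50)
assert acct.balance() == 150
```

Read this way, "protecting the internal representation" just means that the state input/output of these mappings is reachable only through the listed methods.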

Inheritance: Allows the attributes and methods of one class to be based on another existing class.

Meaning that the derived class has access to the section of the hash table that the base class had access to. The last derived class has access to the entire hash table, which is equivalent to having the functions outside the struct transform the data inside the struct, as in procedural programming.
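The equivalence claimed above can be sketched directly (the Animal/Dog example is hypothetical): the derived class simply ends up seeing all of the base class's data and functions, which is the same result as keeping the functions outside a plain record.

```python
# "OO" version: Dog inherits Animal's data layout and methods.
class Animal:
    def __init__(self, name):
        self.name = name

    def describe(self):
        return self.name

class Dog(Animal):
    def speak(self):
        return self.name + " says woof"  # derived class reads base-class data

# Procedural equivalent: a plain record plus free functions over it.
def describe(record):
    return record["name"]

def speak(record):
    return record["name"] + " says woof"

rex = Dog("Rex")
rec = {"name": "Rex"}
assert rex.speak() == speak(rec) == "Rex says woof"
```

In both versions the same functions transform the same data; inheritance only changes where the functions are looked up.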


Object: An individual, identifiable item, either real or abstract, which contains information about itself and the descriptions of its manipulations.

Metaphors don't exist inside of computers; if Armstrong has no idea what she is trying to say, how would anyone else?

Polymorphism: Different objects can respond to the same message and implement it appropriately.

A function gets the same name in separate classes; the compiler stores them in separate memory locations.
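A minimal sketch of that point (the Circle/Square classes are hypothetical examples): two classes answer the same "message", and each keeps its own separately stored function.

```python
import math

class Circle:
    def __init__(self, r):
        self.r = r

    def area(self):
        return math.pi * self.r ** 2

class Square:
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side ** 2

# The same "message" (area) selects a different implementation per object.
shapes = [Circle(1.0), Square(2.0)]
areas = [s.area() for s in shapes]

# As noted above, each class holds its own function object,
# stored in a separate location:
assert Circle.area is not Square.area
```

The caller never names the implementation; the object's class determines which stored function runs.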


https://en.wikipedia.org/wiki/Parametric_polymorphism was renamed to https://en.wikipedia.org/wiki/Generic_programming
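For contrast with the per-class polymorphism above, parametric polymorphism (generic programming) means one definition that works uniformly over every element type. A small sketch using Python's typing module:

```python
from typing import List, TypeVar

T = TypeVar("T")

def first(items: List[T]) -> T:
    # One definition, valid for any element type T:
    # this is parametric polymorphism, i.e. generic programming.
    return items[0]

assert first([1, 2, 3]) == 1
assert first(["a", "b"]) == "a"
```

Unlike the `area` example, no per-type implementation exists here; the same code runs regardless of `T`.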

Tony Marston

http://www.yegor256.com/2016/12/13/mvc-vs-oop.html I disagree completely. OO programming is exactly the same as procedural programming except for the addition of encapsulation, inheritance and polymorphism. The CPU cannot tell the difference between OO code and procedural code simply because there is no difference. It executes instructions in a purely linear fashion starting at the first instruction and stepping through them one at a time until it hits a 'stop'. Human beings design solutions in a purely linear fashion - there is a start point, a number of intermediate steps, and an end point.

It does not matter whether you are writing OO or procedural code - there is always a start point, a number of intermediate steps, and an end point. In procedural code the solution is broken down into a number of functions or procedures, which have unique identities, which are called. In OO code the solution is broken down into a number of objects and methods, where the same method name can exist in different objects, and where a method can only be called by specifying its containing object.

You can take code out of a procedural function and put it into a class method and it will be executed in exactly the same way. A series of functions or methods will be executed in a linear sequence and synchronously - the caller will be suspended, the called function/method will do its stuff and return a response, at which time the caller will wake up and carry on processing.
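Marston's point can be sketched directly (the add example is hypothetical): the same body executes identically whether it lives in a free function or in a class method.

```python
def add_procedural(a, b):
    # Plain procedural function.
    return a + b

class Calculator:
    def add(self, a, b):
        # Same body moved into a method: the executed code is identical;
        # only the call syntax changes.
        return a + b

assert add_procedural(2, 3) == 5
assert Calculator().add(2, 3) == 5
```

In both cases the caller is suspended, the callee runs its body, returns a value, and the caller resumes, exactly as the paragraph describes.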

The idea that OO involves the sending of messages is completely wrong. I have worked with several messaging systems and there is a big difference between sending a message and calling a function/method. A messaging system is asynchronous, and it requires separate 'sendMessage' and 'receiveMessage' functions. The caller is not suspended while the message is being processed, and it has to execute its 'receiveMessage' function at periodic intervals to see if a response has been received yet.
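The distinction drawn above can be sketched in a toy form (the function names follow the paragraph's sendMessage/receiveMessage; an in-process queue stands in for a real messaging system, which would span threads or processes):

```python
import queue

# Synchronous call: the caller blocks until the callee returns.
def double(x):
    return 2 * x

result = double(21)  # caller is suspended here until double() finishes
assert result == 42

# Asynchronous messaging (toy sketch): separate send/receive functions,
# and the receiver must poll instead of blocking.
inbox = queue.Queue()

def send_message(q, msg):
    q.put(msg)  # returns immediately; the sender is not suspended

def receive_message(q):
    try:
        return q.get_nowait()  # non-blocking poll: may find nothing yet
    except queue.Empty:
        return None

assert receive_message(inbox) is None        # nothing has arrived yet
send_message(inbox, ("double", 21))
assert receive_message(inbox) == ("double", 21)
```

The synchronous call gives an answer on the spot; the messaging version decouples sending from receiving, which is exactly why a method call is not a "message" in the messaging-system sense.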

The idea that you should not use IF, WHILE, FOR and all those other procedural keywords in "proper" OO programming is ridiculous. You do not need a totally different thought process in order to do OOP, you just need to know how to use encapsulation, inheritance and polymorphism for maximum effect.

Links

Oop