Oop

oop category mistake
Objects are things that exist in the real world. Stuffing functions into a record with data, and calling that record plus a pointer to its memory location a "class", doesn't give you an "object" but a metaphorically induced category mistake. Transforming data, mapping inputs to outputs, is the only goal in programming, and code cannot model the real world, says Dave Acton. This activity can't be separated from the computer's constraints, such as L2 cache misses, which struct-stuffed procedural programming ignores: hardware is the platform, not software. Struct-stuffed procedural programming (OOP) attempts to improve maintainability through how it regulates access to data, but this prevents grasping the properties of that data, what is known as the Lost in Space syndrome.

Metaphor allows us to make an excursion into the unknown, but it also obfuscates the premises. The premise behind both FP and OOP (struct-stuffed procedural) is that the real world can be modeled on a Turing machine, and Dave West asserts that it can't. Academia and the Design Patterns book authors are in on the object-oriented programming category mistake because it allows them to solve artificially created problems.


 * Dadgum The problem with the "Duck extends Bird" kind of example is that it gives you no understanding of the kind of considerations you need to think about in order to decide whether the design decisions discussed above are good or bad. In fact, it actively sabotages that understanding. You can't add code to ducks; you can't refactor ducks; ducks don't implement protocols; you can't create a new species in order to separate some concerns (e.g. file I/O and word splitting); you can't fake the ability to turn a duck into a penguin by moving its duckness into an animal of some other species that can be replaced at runtime.


 * Tony Marston A Car and a Train and a Truck can all inherit behavior from a Vehicle object, adding their subtle differences. A Firetruck can inherit from the Truck object, and so on. Wait.. and so on? The thing about inheritance is that it is so easy to create massive trees of objects. But what OO-bigots won't tell you is that these trees will mess you up big time if you let them grow too deep, or grow for the wrong reasons.


 * beyond noun and verb "..... To describe object oriented programming, objects are often referred to as “nouns” and actions that determine their behavior as “verbs”. These verbs are traditionally implemented as methods, i.e. systematically coupled to the object that ‘performs’ the action....."

An object is the metaphor for combining procedures and data, the same procedures and variables that were clearly separate with a struct. In neither case does this have anything to do with nouns and verbs; it is a category mistake, as Turing machines don't have volition.


 * https://en.wikipedia.org/wiki/Identity_(object-oriented_programming) "..Identity allows the construction of a platonic ideal world, the ontology or conceptual model, that is often used as basis of object-oriented thinking ..."

Which is the meaningless nonsense you get when not grasping the object metaphor.

An object in OOP is the metaphor for functions stuffed inside a record, transforming data in that record. This is the arrangement Edsger W. Dijkstra described as an exceptionally bad idea which could only have originated in California, and what Rob Pike refers to as the Roman numerals of computing. See the Dijkstra Reddit discussion. Metaphor leads to regression of metaphor, such as "behavior" and "fields". Your computer isn't growing fields of mielies; "fields" is the metaphor for the hash table which Python wraps in a dictionary.

The procedural code acting on the hash table ("fields") inside a struct (object) has no more of a "behavior" than the same code outside of it, or than a mathematical function has: functions only map inputs to outputs; the domain is no more a noun than the range is a verb. As such they have no goals, and stuffing the function inside of a struct doesn't map the Turing machine itself into noun-and-verb space. It is a category mistake, like describing the marital status of the number seven.
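The point about the function inside versus outside the record can be sketched in Python. This is a minimal illustration; the `Account` class and all names are made up for the example:

```python
# The same domain-to-range mapping written once as a "method on an
# object" and once as a plain function over a record (a dict).

class Account:
    def __init__(self, balance):
        self.balance = balance  # data stored in the instance's hash table

    def deposit(self, amount):  # "behavior": a function with self first
        self.balance += amount
        return self.balance

def deposit(account, amount):
    """The identical mapping with data and function kept separate."""
    account["balance"] += amount
    return account["balance"]

a = Account(100)
assert a.deposit(50) == 150

record = {"balance": 100}
assert deposit(record, 50) == 150
```

Both spellings map the same inputs to the same outputs; only the location of the function differs.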

A verb doesn't belong to anything inside of a struct; it relates subject to object in metaphorical language, and Turing machines don't grasp metaphors, according to Buko Obele. Metaphor is essential to language but absent in mathematics and Turing machines. All of language functions as metaphor; there is none in software, which allows con artists like the Gang of Four to game high-IQ people with epistemological inversion. The language of nouns and verbs allows for myths and flexible metaphor; a programmer can only use precise constructs like types, functions and variables. Programming isn't an artistic endeavor. Luca Cardelli states "....In conclusion, object-oriented languages still have to learn some engineering lessons from procedural languages..." But OOP is procedural programming, in a Rube Goldberg fashion. Because of this flawed premise that OOP isn't procedural, we are presented with a false dichotomy between functional and object-oriented (struct-stuffed procedural).


 * Perlmonks oop The definition of OO is unclear: Do we allow single-dispatch? Multiple-dispatch? Single-inheritance? Multiple-inheritance? Do we have prototype-based inheritance? Some class-based model? Something more sophisticated (like Perl 6's roles)? Is everything an object? Do we call it object-oriented if you have lots of accessor methods?

https://en.wikipedia.org/wiki/Object_(computer_science) "...In computer science, an object can be a variable, a data structure, a function, or a method, and as such, is a location in memory having a value and referenced by an identifier. ...". This definition is broad to the point of encompassing all of programming itself, an attempt to hide that OOP is stuffed procedural programming.

Design patterns
Metaphorical language obfuscates that convoluted procedural programming is being done: whether the function is outside of the record or inside it, it can only map the domain to the range. Because this process doesn't represent the real world, code reuse and encapsulation aren't possible in the way the object metaphor implies (encapsulation is impossible). The computer suffers from constraints that the real world doesn't, so mirroring the real world isn't possible (Dave West slamming both OOP and FP). Look around you: many objects will be observed, but have you ever seen a function? Referring to object-oriented programming as struct-stuffed procedural programming with localized access to the hash table clarifies how it induces the Lost in Space syndrome.

Design Patterns uses "pattern" (a noun) as a dissimilar term for some design or action (a verb). This semantic fraud is mirrored by the rebranding of Stallman's "free software" as "open source". The meaning of "free" is only clarified by "public domain": patents, GPL and BSD code remain proprietary under copyright and patent law.

After using design patterns for some time, programmers no longer see things or write programs in clear and straightforward ways. The visitor pattern, for example, is a transcription of functional programming into a convoluted form in order to get around the artifact of stuffing data and functions into a record. In a conventional OOP language, having to use design patterns makes hard things even harder; but in a purely functional language, having to use pure functions to model side effects makes trivial things hard and hard things impossible. To write programs in a purely functional programming language is like living without any type of sound, light or electromagnetic waves. Everything is blind and deaf. All information must pass through pipes, connected by switch boxes called "monads" in Haskell. (See "Design Patterns Don't Work" by Yin Wang.)
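The visitor-versus-function contrast can be sketched in Python. The shape classes and names are illustrative, not from any real codebase:

```python
# OOP version: double dispatch threaded through accept/visit methods.

class Circle:
    def __init__(self, r): self.r = r
    def accept(self, visitor): return visitor.visit_circle(self)

class Square:
    def __init__(self, side): self.side = side
    def accept(self, visitor): return visitor.visit_square(self)

class AreaVisitor:
    def visit_circle(self, c): return 3.14159 * c.r ** 2
    def visit_square(self, s): return s.side ** 2

# The direct transcription: one plain function over tagged data.
def area(shape):
    kind, value = shape
    if kind == "circle":
        return 3.14159 * value ** 2
    return value ** 2  # square

assert Circle(2).accept(AreaVisitor()) == area(("circle", 2))
assert Square(3).accept(AreaVisitor()) == area(("square", 3))
```

Both compute the same mapping; the visitor version adds two extra classes and an indirection to do it.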

Consultants don't want to code
We cannot adapt the computer to the way we relate subject to object; we have to adapt to it, because computers don't have volition. "Gothic architecture", "Georgian coat" and "Victorian corset" are nouns, not verbs. Software design "patterns" lingo would have us speak of the "opening the door pattern" or the "pick up the fork pattern". These everyday activities aren't noun-like patterns but action-like verbs. The Gang of Four (http://norvig.com/design-patterns/) used Orwellian, pretentious language to make millions of dollars out of the software industry in conferences and consulting gigs. Robert Martin makes his money teaching high-IQ people how to cope with the category mistake they find themselves immersed in. University professors need to publish journal papers and sponsor PhD students; deceiving them into a category mistake gives the academics career security.

Struct-stuffed procedural programming allows for embezzlement. This, in addition to employment protection, is one reason a CEO will get so much pushback when trying to revert to a system where data and functions are separate. Directors must consider how accounting-software coders could be getting Zcash payments from company procurement officers. Both OOP and functional code, being unreadable, allow for accounting fraud. Functional ideas like Haskell's monads allow high-IQ people to demonstrate their skill, but they also exclude average-skilled programmers, who can only cope with unstuffed procedural programming, from informing the CEO about embezzlement.

Brassuan monkeys
"Understanding" pattern-based design is groupthink, from the desire not to be the stupid one out. University professor conmen capitalize on this, making Brassuan monkeys out of their students: they have been gamed into an ontological state of noun reification; in ordinary language we don't send messages to rocks. "Object" in OOP is a dissimilar term for message receptor, and OOP for message programming. This adds a layer of semantic cruft to what remains procedural in nature. A "message" is sent to a function (method) that is stuffed in below the data (fields) in a record or struct. In procedural code the memory location (pass by reference) is sent to a function, so procedural is "message" programming too. Data is inert: we don't think of "messages" when mapping data from the domain to the range with a mathematical function, any more than we would "send a message" to an object like a rock when breaking it. This "message sending" is a hoaxy metaphorical attempt at obfuscating that convoluted procedural programming is taking place.
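The claim that a "message" is an ordinary procedure call passing a reference to the record can be shown directly in Python, where the two spellings below are the same call (the `Counter` class is illustrative):

```python
class Counter:
    def __init__(self): self.n = 0
    def bump(self): self.n += 1

c = Counter()
c.bump()         # the "message" spelling
Counter.bump(c)  # the same call: a function plus a reference to the record
assert c.n == 2
```

The method syntax is sugar for the explicit function call; no messages are sent anywhere.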

In this interview fraudster Elizabeth Holmes effuses confidence; Theranos' blood-testing technology was a scam. Your university professor is, like Holmes, a fraud when referring to struct-stuffed procedural programming as OOP. Jim Cramer's sideways eye movements showed that he knew she was lying, and he did a good job of giving her enough rope to hang herself. She managed to deceive Larry Ellison, the CEO of Oracle. https://www.youtube.com/watch?v=NAKMhg1tv34 "For 15 years, Theranos' CEO & founder Elizabeth Holmes led investors on and defrauded them to the tune of $900M, growing the value of her privately held company to $9 billion, before an investigation by the Wall Street Journal began the slow steady crumbling of her empire. In the end, she settled with the SEC, paid a $500,000 fine and avoided jail time. On this episode of Talks, Eric asks Ash how she pulled off the scam."
 * https://www.youtube.com/watch?v=hSlBiMdlyyU   The Psychology of Con Artists, and How to Avoid Them | Maria Konnikova. 60 minutes documentary on cons
 * Babani Sissoko Stole $242 million from Dubai Islamic bank by using confidence tricks.
 * How to Spot Con Men Before They Spot You video
 * Roger Cook's Ten Greatest Conmen: True Stories of the World's Most Outrageous Scams
 * Sokal's hoax, Schön hoax, wiki.c2 OOP hoax, scitation.aip Schön hoax, crikey.com.au Windschuttle Quadrant hoax, blogs.crikey.com diary of a hoax

An "abstraction" is a complexity in ordinary language. In OOP coding it is used as a dissimilar term for simplification, an attempt at infusing cognitive mystique into the input-output mapping of inert data. Transforming data is the only purpose in programming, and this process does not represent the real world (Dave Acton). It involves the usage of libraries, which are reusable because they reflect ordinary language, whether they're written in an object-oriented style or not.

"Imaginary" is the synonym of fictional; in math, though, it is used as a dissimilar term for numbers in the plane and in space. An apple exists in three dimensions; in ordinary language we cull its domains to the real numbering system because that is what is usually relevant: how many apples you have, not their position in space. Representing its position on the table is done with "imaginary" or complex numbers; complex and imaginary here are dissimilar terms for numbers in the plane, its x, y coordinates (x + jy). One step further, quaternion-like extensions can specify the height of the apple as well. Science popularizers (Stephen Hawking) are like Celtic druids in their usage of "imaginary numbers"; charlatanism did not end on the shores of Gaul.

OOP is procedural programming
Object-oriented programming is basically a hash table with localized namespaces. Haskell's monads are the container objects of design patterns. One of the best ways to induce an eternal debating flamewar, and to generate consulting fees, is to have people discuss something which doesn't exist. The world of computer science is just as prone to academic fraud as sociology, biology, etc. in the race for grant money to produce papers. Apache Struts, written in OOP Java, provided the attack vector used to steal the Equifax databases; firing people at Equifax won't solve the problem. As with the NSA's Systemd Linux-to-WindowsXP morphing, we can presume that they also have a hand in forcing OOP on the industry. See Torvalds on C++.
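The "hash table with localized namespaces" description can be checked in Python, where a class's methods really do live in a dict-like namespace keyed by name (the `Greeter` class is illustrative):

```python
class Greeter:
    def hello(self):
        return "hi"

# The method is just an entry in the class's namespace mapping.
assert "hello" in vars(Greeter)

g = Greeter()
assert vars(Greeter)["hello"](g) == "hi"  # same function, looked up by key
```

Attribute lookup on an instance walks these namespace dictionaries; the "class" is the localized hash table the text describes.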

Metaphor in the context of a Turing machine is inappropriate because we aren't making an excursion into the unknown but dealing with how procedural functions are stuffed into a data record. OOP is convoluted procedural programming. Having thousands of swarming procedures isn't solved by stuffing them into planets of "encapsulated" data records orbiting the root object. In this Yegor OOP video he lists a few definitions of objects from textbooks:
 * ".....Java in a nutshell: A class is a collection of data fields that holds methods and values that operate on those fields..."
 * ".....http://goo.gl/qmRFH4, http://blog.cleancoder.com/uncle-bob/2014/11/24/FPvsOO.html Robert Martin 2014: 'Objects are not data structures'. Robert says it's not about the data but about behavior or functions....." Which is wrong: having stuffed the functions with the data into a record, you now have to deal with the domain-range mapping inconsistencies this generates within the constraints of a Turing machine.
 * "..... Bjarne, C++: An object is some memory that holds a value of some type (p. 40); it makes no reference to methods or functions....."
 * Book Smalltalk-80, "An object consists of private memory and a set of operations"  p.6


 * Yegor says 8min ".... we all agree that objects are not data structures but something which is supposed to have behavior ..."  ".... if you have getters and setters you're definitely doing something not oop. ..." 9min

Which shows how the "object" metaphor induces a category mistake: a grouping of data and functions inside of a record, as opposed to having only data in a record. Either way you're only doing procedural programming, and stuffing it inside a record doesn't give you rocks, "behavior", verbs or nouns; it gives you an "...exceptionally bad..." way to map inputs to outputs.


 * "...OOP says that bringing together data and its associated behavior in a single location (called an “object”) makes it easier to understand how a program works. FP says that data and behavior are distinctively different things and should be kept separate for clarity....." from https://www.codenewbie.org/blogs/object-oriented-programming-vs-functional-programming

Domain-range mapping isn't a "behavior" but a mathematical function; Turing machines don't have volition. Metaphorical nonsense might never be exorcised from CS. We have a false dichotomy between FP and stuffed procedural (OOP), since unstuffed procedural already keeps data and functions separate, the same way a mathematical function's domain-range mapping is separate from the data itself.

wikipedia dynamic dispatch "....Object-oriented systems model a problem as a set of interacting objects that enact operations referred to by name...."

This OOP, or convoluted procedural programming, results for example in programmers asking themselves stupid questions:
 * Should a Message send itself?
 * Or should a Sender object send messages?
 * Should a Receiver object receive messages?
 * Or should a Connection object transmit messages?

Procedural should be the only paradigm; OOP has the premise that something other than domain-range mapping of data is possible. Cloaking that premise is enabled by the fraudulent usage of metaphor. A problem is by definition problematic; it cannot be reduced. OOP is a semantic trick, like the "free software license", foisted on programmers by academia, as it gave them a reason to Rube Goldberg something understandable in the pursuit of grant money. The singleton pattern (Jeff Ward, 6min video) for example is different terminology for procedural with a global variable. It also allows for an NSA encryption attack vector; I2P is written in Java, for example. See OOP singleton.
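The singleton-as-global-variable point can be sketched in Python. The `Config` class and names are hypothetical, for illustration only:

```python
# Singleton pattern: a class that guards a single shared instance.
class Config:
    _instance = None

    @classmethod
    def instance(cls):
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

assert Config.instance() is Config.instance()  # always the same record

# The procedural spelling of the same idea: one module-level variable.
CONFIG = {}
```

Both give exactly one shared piece of mutable state per program; the pattern adds a class and a lookup method around the global.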


 * https://www.quora.com/Why-cant-I-understand-object-oriented-programming and https://aryehoffman.com/reference/structured-objects-approach/ "...An object in a structured-objects approach does not typically encapsulate its own behavior. Instead, it spreads its behavior to those objects that ask it for information.  Logic and constraints become widely spread across multiple objects, which then become interdependent....."

These are examples of how metaphor conceals that software engineering doesn't exist.

Message programming
"Everything is an object" goes the slogan, but is everything really a message? The question is meaningless under the premise that procedural domain-range mapping is all that exists, whether as pass by reference ("messages"), fields, data, functions or "methods", in a single math-like function step or an algorithmic series of input-to-output mappings.

Integers, floats and doubles in C are metaphorical objects; the "behavior" is encapsulated. We lose sight of what is happening at the machine level with all these metaphors, and of the fact that we cannot adapt the zeros and ones to the way we think. An assembler function transforms inert data on reception of a "message". By couching specific machine functions in metaphors, a logical disconnect is induced in the minds of programmers: the ones and zeros inside DDR3 do not represent the real world.

Straight C can be used to emulate a higher-level OOP language like Python. Python allows mixing of both procedural and message programming, while Java doesn't. Because OOP isn't defined, it allows for the mistake of saying that int, float etc. aren't message receptors or "objects". Is a language OOP because I can put both data fields and functions into a record? Or is it OO only if it also provides extremely late binding? How about inheritance, overloading, etc.? Must I have all of them? Any of them? There really is no such thing as an "object-oriented language". Objects can be part of a language, but they are just a small part of it. You can't really say that a language is object-oriented just because it provides objects as a feature. The so-called OO languages are solidly rooted in traditional procedural programming (PP).
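The mixing described above can be shown in a few lines of Python: the same data can be handled in a plain procedural style or a "message" style side by side, and even int arithmetic is a method call underneath.

```python
nums = [3, 1, 2]
assert sorted(nums) == [1, 2, 3]  # procedural: a free function over the data
nums.sort()                       # "message" style: a method mutating the record
assert nums == [1, 2, 3]

assert (5).__add__(2) == 5 + 2    # the int "receives the message" __add__
```

Java forces the method spelling for user types; Python leaves both available, which is the point the text makes about int and float being "objects" too.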

OOP basically stole everything from PP, renamed the terminologies and acted as if the ideas were its own. Yin Wang: "In industry, OO hasn't really proved its effectiveness with evidence. Good systems may be built in an 'OO language', but the code is often written by people who understand the problems of OO and don't embrace 'everything is an object' or 'design patterns'. Good programmers usually use workarounds in OO languages and are essentially writing in a traditional procedural style combined with bits from functional programming. So some OO languages and their tools may be pretty widely used, but the OO style doesn't really have much influence on the advancements of programming as a field." (https://yinwang0.wordpress.com/2013/12/24/oop/) Yin Wang's view is that there is no such thing as an "OO language", hence no debate is possible about procedural vs object. Every so-called OO language also contains good elements that it borrowed (or stole) from procedural languages or sometimes functional languages, so they are not completely useless.

A mathematical function maps the domain to the range, but it isn't the data. OOP attempts to combine inert data and functions into an encapsulated record or "object". Tony Marston views OOP as the same as procedural programming except for the addition of encapsulation, inheritance and polymorphism. (Nobody knows what an object is.)

"An object is a message receptor." The object instance receives a message and calls a function which transforms the data or "fields". (Alan Kay)

OOP libraries are only a collection of functions. Inside classes they are called "methods" to obfuscate this; a semantic brew is stewed to try and make it seem that OOP isn't a Calvinball way of doing procedural programming. You can spend two hours rewriting code in procedural style or spend two hours looking for a function buried in some class. Having to rewrite code is inevitable, and preferred over endless searching.

A Python class is a complex wrapper around a dictionary; if anything, you are adding more overhead than by using a simple dictionary. A Python dictionary is internally implemented with a hash table: a data structure that maps keys to values by taking the hash value of the key (by applying some hash function to it) and mapping that to a bucket where one or more values are stored. If you are working on performance-critical applications, use C or something. https://stackoverflow.com/questions/35988/c-like-structures-in-python
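The class-as-dictionary-wrapper claim can be checked directly in CPython, where a plain instance stores its attributes in an actual dict reachable as `__dict__` (the `Point` class is illustrative):

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
assert p.__dict__ == {"x": 1, "y": 2}  # the "fields" live in a hash table

p.__dict__["x"] = 10                   # mutating the dict mutates the "object"
assert p.x == 10
```

(Classes that define `__slots__` opt out of the per-instance dict; the default case above is what the Stack Overflow discussion refers to.)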

stop writing classes
https://www.youtube.com/watch?v=o9pEzgHorH0  Classes are great but they are also overused. This talk will describe examples of class overuse taken from real world code and refactor the unnecessary classes, exceptions, and modules out of them.

Steve Yegge
http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html Object Oriented Programming puts the Nouns first and foremost. Why would you go to such lengths to put one part of speech on a pedestal? Why should one kind of concept take precedence over another? It's not as if OOP has suddenly made verbs less important in the way we actually think. It's a strangely skewed perspective. As my friend Jacob Gabrielson once put it, advocating Object-Oriented Programming is like advocating Pants-Oriented Clothing.
 * https://plus.google.com/u/0/110981030061712822816/posts/KaSKeg4vQtz

It's easier for me to understand a system written in a procedural language like C even without its documentation. But the same system written in Java is very hard for me to grasp without its docs and UML. When going to an unfamiliar place, for example, it's easier to follow instructions that direct you how to get there (e.g., go straight; at the 3rd street, turn left; cross the first pedestrian crossing and you will see the train station, blah blah...) than to find objects whose state and behavior are unfamiliar to you (e.g., when you see a big blue rectangular building, it's the most famous shop; if it's not blue but is big, it's probably the hospital; neglect the shop and hospital because you must pay attention to the yellow building, which is the restaurant. Continue roaming around the city until you see the shop, the hospital, and the restaurant beside each other; when you find the right restaurant and it is open, enter and eat some food there; otherwise, go home and cook your own food!).

oop failure
from https://www.quora.com/Was-object-oriented-programming-a-failure: The Catch-22 to the whole equation, and the part which corporate execs seem to understand the least, is that object-oriented is strictly an organizational paradigm. ALL actual software is strictly procedural. No matter how the source code is laid out (wholly for the convenience of the programmer(s)), data is data, and computers don't care whether you're using OOP or procedural.

Alexander Stepanov's complaint is blistering and accurate. If you read Types and Programming Languages, you get a sense for just how much complexity objects add to your world. OOP, as commonly envisioned, doesn't play well with static or dynamic typing.

Is OOP a failure? Well, what is it? I've heard OOP given about 12 definitions, all credible in some core way, but many conflicting. Like "Scrum", it's too all over the place to justify a closed-form, final opinion. It's either highly beneficial or loathsome depending on which interpretation one uses. There's good OOP and bad OOP. This should be no surprise: in the anti-intellectual world of mainstream business software, it's mostly bad OOP. (For "Scrum", there's the same sad story.)

Separation of implementation and interface is a clear win. That's not limited to OO languages, of course. Haskell has type classes, Clojure has protocols, and Ocaml has (if you're brave) functors. Nonetheless, I'm going to score that as a clear Good Idea that OOP championed early on.

For that, Alan Kay's inspiration was the biological cell. Alan Kay is one of the best software designers alive, and has been extremely critical of modern OOP. Now, the cell: it's an intricate, convoluted machine, almost on the verge of collapsing under the weight of its own complexity. In a larger organism, cells communicate through a simpler interface: chemical signals (hormones) and electric activations. If they coupled more tightly, the organism wouldn't be valuable. Kay was not saying, "you should go out and create enormously complex systems". OOP, to him, was about how to manage it when complexity emerged. In this way, OOP and functional programming (FP) were actually orthogonal (and could support one another) rather than in conflict. It was still desirable that objects do one thing and do it well; but interfaces were intended to underscore that "one thing" when the demands on the implementation made it hard to tell what that was.

OOP and FP (and, in reality, all higher-level languages) both exist to answer the question, "How do we prevent software entropy?" See, Alan Turing's result on the Halting Problem isn't about termination or about machines and tapes. It's the first of many theorems establishing the same thing: we can't reason, in any way whatsoever, about arbitrary code. It's mathematically impossible. Obvious solution: "don't write arbitrary code." (Most code that a person would write to solve a problem is in a low-entropy region where reasoning about code is possible.) Equally obviously, no one does write "arbitrary code". Generally, we don't go very far at all into that chaotic space of "all code", and that's good. However, as the number of hands that have passed over code increases, it gets further into that high-entropy/"arbitrary code" space. FP and OOP are two toolsets designed to prevent it from getting there too fast. FP enforces simplicity by forcing people to think about state and mutability, encouraging code that can be decomposed into "do one thing" components-- mostly mathematical functions. OOP tries to make software look like "the real world" as can be understood by an average person. (CheckingAccount extends Account extends HasBalance extends Object). The problem is that it encourages people to program before they think, and it allows software to be created that mostly works but no one knows why it does. OOP places high demands on the creators of the machinery (in effect, a new DSL) that will be built to solve a problem. Because of the high demands OOP places on human care of the software, the historical solution has been to have elite programmers (architects!) 
design and peons implement; that never worked out for a number of reasons-- it's hard to separate capability from political success, the best programmers don't want to be around with lines and boxes and DDL, business requirements are still a constant source of increasing complexity (with outdated or unwanted requirements never retracted).

What went wrong? People rushed to use the complex stuff (see: inheritance, especially multiple) when it wasn't necessary, and often with a poor understanding of the fundamentals. Bureaucratic entropy and requirement creep (it is rare that requirements are subtracted, even if the original stakeholders lose interest) became codified in ill-conceived software systems. Worst of all, over-complex systems became a great way for careerist engineers (and architects!) to gain "production experience" with the latest buzzwords and "design patterns". With all the C++/Java corner-cases and OO nightmares that come up in interview questions, it's actually quite reasonable that a number of less-skilled developers would get the idea that they need to start doing some of that stuff (so they can answer those questions!) to advance into the big leagues.

procedural is oop
https://www.quora.com/Was-object-oriented-programming-a-failure Okay, ready to drop the bomb? Everything is object-like in your favourite non-OOP language. C ints are objects, and I do not mean in the Java sense. They encapsulate the underlying binary. They have well-defined behaviour, namely arithmetic. You have that polymorphism stack of short, long, long long leading to that devilish char.

Haskell functions are objects, and I do not mean in the Java sense. They encapsulate the underlying algorithm. They have well-defined behaviour, namely being callable. OOP and FP are just two representations of data and operations. It is like the two sides of a Fourier transformation: one is ugly, the other is sleek, and which is which depends entirely on the issue at hand. So let me answer the actual question: OOP is not a failure. FP may be more attractive to an algorithm designer, but I have a truckload of tasks that are stupidly difficult to express functionally. What is a failure is thinking strictly in one category or another. Some tasks require OOP, some require FP, but most can be expressed either way. Whether one or the other is a good idea depends on the use case…

 * Without loss of generality, FP just makes the analogy nicer. The argument would work for composition models and other stuff. It is all binary anyways.

cat

 * http://nuthole.com/blog/2004/02/05/musings-on-an-interview-with-alex-stepanov/ STL is not OOP.
 * http://harmful.cat-v.org/software/OO_programming/why_oo_sucks "Objects bind functions and data structures together in indivisible units. I think this is a fundamental error since functions and data structures belong in totally different worlds. If a language technology is so bad that it creates a new industry to solve problems of its own making then it must be a good idea for the guys who want to make money. This is the real driving force behind OOP."
 * http://www.smashcompany.com/technology/object-oriented-programming-is-an-expensive-disaster-which-must-end
 * https://whydoesitsuck.com/cpp-sucks-for-a-reason/ In my opinion, C++ is this weird Frankenstein velociraptor that somehow survived the dark ages of programming and is now constantly being revived and patched up. The members of the standardization committee are trying hard to make things a little better by applying tons of makeup over its wrinkles. Obviously, this doesn’t work out too well since it’s still the same ugly beast under the mask. The problem here is that C++ is old – and I mean the antique kind of old that deserves to be put into retirement.

rebol
http://www.rebol.com/article/0425.html In its purest form, OO is a model of associating behavior with state (function with data). Originally, back in 1982, it seemed like a good idea because real world objects had specific actions related to them. A pen was used to write and draw. A pencil was used to write and draw. We thought, "Wow, there's a pattern, and it seems to be quite natural." However, it was a false model. A pen does not write and draw, it takes a human to make a pen write and draw. The actions of write and draw do not belong to the pen. OOL is not a complete solution. Too many of the behaviors of objects come from (or are influenced by) sources that are external to their encapsulated definitions.
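The pen argument above can be sketched in code. In this hypothetical example (all names are mine, not from the REBOL article), the action of writing is a free function performed by a writer; the pen is plain data, not the owner of the behavior:

```python
# Hypothetical sketch: the action of writing belongs to the writer,
# not to the pen. The pen is plain data; a free function uses it.

class Pen:
    """Plain data: a pen has ink, nothing more."""
    def __init__(self, ink="blue"):
        self.ink = ink

def write(writer, instrument, text):
    """The writer performs the action, using the instrument."""
    return f"{writer} writes '{text}' in {instrument.ink}"

print(write("Ann", Pen("black"), "hello"))  # Ann writes 'hello' in black
```

Under the OO model this would instead be `pen.write(...)`, attaching to the pen an action it does not perform, which is exactly the false model the article describes.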

Yegor256

 * http://www.yegor256.com/2016/12/13/mvc-vs-oop.html
 * https://www.youtube.com/watch?v=K_QEOtYVQ7A What's Wrong With Object-Oriented Programming? (links to slideshare oop)
 * Robert Martin

http://www.yegor256.com/2016/08/15/what-is-wrong-object-oriented-programming.html Edsger W. Dijkstra (1989) "TUG LINES," Issue 32, August 1989 "Object oriented programs are offered as alternatives to correct ones" and "Object-oriented programming is an exceptionally bad idea which could only have originated in California."


 * Paul Graham (2003) http://www.paulgraham.com/hundred.html The Hundred-Year Language "Object-oriented programming offers a sustainable way to write spaghetti code."


 * http://www.yegor256.com/2016/07/14/who-is-object.html Who is an object? What is common throughout all these definitions is the word "contains" (or "holds," "consists," "has," etc.). They all think that an object is a box with data. And this perspective is exactly what I'm strongly against.

Hyperplanes

 * http://www.separatinghyperplanes.com/2014/10/on-object-oriented-programming.html
 * all evidence points to oop being a disaster links to http://wiki.c2.com/?ArgumentsAgainstOop
 * https://blog.codinghorror.com/rethinking-design-patterns/

David West
 * http://blog.berniesumption.com/software/inheritance-is-evil-and-must-be-destroyed/
 * http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay_oop_en
 * http://axilmar.blogspot.com/2014/10/object-oriented-programming-is-disaster.html rebuttal: https://en.wikipedia.org/wiki/Object-relational_impedance_mismatch
 * https://www.youtube.com/watch?v=RdE-d_EhzmA David West (59 min): functional programming doesn't work.
 * https://news.ycombinator.com/item?id=3641212
 * http://www.smashcompany.com/technology/object-oriented-programming-is-an-expensive-disaster-which-must-end
 * http://lucacardelli.name/Papers/BadPropertiesOfOO.html
 * http://web.archive.org/web/20080710144930/http://gagne.homedns.org:80/~tgagne/contrib/EarlyHistoryST.html

http://www.shenlanguage.org/

http://whiley.org/2010/06/23/rich-hickey-on-clojure-se-radio/

https://8thlight.com/blog/colin-jones/2012/06/05/on-obsessions-primitive-and-otherwise.html

Paul Graham
http://www.paulgraham.com/avg.html

http://www.paulgraham.com/hundred.html Somehow the idea of reusability got attached to object-oriented programming in the 1980s, and no amount of evidence to the contrary seems to be able to shake it free. But although some object-oriented software is reusable, what makes it reusable is its bottom-upness, not its object-orientedness. Consider libraries: they're reusable because they're language, whether they're written in an object-oriented style or not. http://queue.acm.org/blogposting.cfm?id=34658

http://www.paulgraham.com/noop.html, http://www.paulgraham.com/reesoo.html

http://harmful.cat-v.org/software/OO_programming/why_oo_sucks

http://www.artima.com/weblogs/viewpost.jsp?thread=141312

http://batsov.com/articles/2011/05/12/jvm-langs-clojure/

Yolo
https://github.com/OznOg/yolo-reloaded A C++ port of Yolo that turns its readable C code into unreadable C++.

Pattern terminology pretentiousness
http://realtimecollisiondetection.net/blog/?p=81 (grumpydude counter view) Some anonymous soul emailed me regarding my “Design patterns are from hell!” post, arguing that “somehow, knowing patterns exist is the same as knowing different data structures exist” and that “understanding the different ways for creating objects (hello creational patterns) is like understanding the implications of deciding to use a dequeue rather than an array or rather than a linked list.”

No, there are no similarities between data structures and algorithms on one side and design patterns on the other side! Rather, there are lots of distinctions but the perhaps most important one is that data structures and algorithms are language independent whereas design patterns are language dependent. Data structures and algorithms are forever, whereas design patterns are as fleeting as the object-oriented languages for which they have been (predominantly) proposed. (That fact alone should warrant little to no attention being paid to design patterns. And if you don’t understand why OO is fleeting, time to learn a second language, other than C++.)

A second important distinction is that data structures and algorithms do not come encumbered with preferred usages. They just are. A programmer has to make deliberate choices — has to think — before selecting one over the other. Thinking is what makes, or breaks, the programmer.

In contrast, design patterns are purported “master programmer advice” strongly suggesting to young or otherwise impressionable programmers that the design patterns convey important concepts, practices, or principles that have been “prethought.” However, design patterns are not “master programmer advice!” Any master programmer would know that you cannot simply dish out a small number of generic solutions for generic situations, but that every situation is (potentially) different and warrants its own (potential) solution.

Far from “master programmers,” design patterns are the work of people who do conferences, talks, and books for a living, to further their own cause; they’re the work of academics who live in their heads and have never worked on real projects to see what kind of code their abstract ideas produce when put in practice; they’re the work of people who couldn’t care less about what toxic miasma they have unleashed because they’re too busy speaking at Software Development to push their consulting gigs to the fools who bought into the snake oil.

Design patterns are spoonfeed material for brainless programmers incapable of independent thought, who will be resolved to producing code as mediocre as the design patterns they use to create it.

The problem isn’t that knowledge of patterns is completely useless and programmers are much better off spending time learning useful knowledge like data structures and algorithms, even though that’s a true statement as far as I’m concerned. The problem is that patterns are as bad as, well, guns. Guns kill people, and pattern thinking causes brain rot.

There’s a ton of people who incorrectly think and propagate that patterns are master programmer advice when they really are over-engineered solutions for deficiencies of object-oriented programming languages. I don’t like over-engineered solutions and I don’t like object-oriented programming languages, so fleeting terminology for stuff like that is not something I’m likely to promote any time.

I realize reading the mindless drivel of Design Patterns might give some programmers instant satisfaction because everyone talks about patterns and they now feel smarter having read about them, but in reality these programmers would be better off beating their heads against Knuth because that will actually make them smarter, not just feel like they are. (Knuth is a hard read, but avoiding solving hard problems isn’t the way of becoming a master programmer any more than is studying Design Patterns. If Knuth is too much, read Skiena’s book.)

I mean, if someone thinks they are a better programmer for knowing the “visitor” and “observer” patterns, but they don’t know, say, what a skiplist is, how to perform a k-nearest neighbor search, or how to apply dynamic programming to a problem then they’re fools.

And, no, I’m not overreacting. :) I’m just making sure my point comes across loud and clear, because there needs to be a lot of shouting to counterbalance all the published fraud about patterns. I was hoping the problem would just go away, but as it isn’t, I’ll use my little soapbox to make it painfully clear that I, for one, think they’re a scourge of programming/design.

It goes without saying that terminology is important, but relevant and precise terminology arises naturally, from a need. “Callback”, for example, never had this cult or passionate discourse about its being, because that term arose naturally. But there was never a natural need for labeling encapsulated global variables as “singletons” or other similarly trivial concepts as “visitor”, “observer”, or what-have-you patterns. These patterns are artificial concepts.

aThirdParty, Twylite’s comment was feeble-minded for the following reasons: 1. He suggests that all nouns are useful. But this is trivially a false statement as once we reach as many nouns as there are concepts, the nouns have lost their abstractive power and have become worthless. 2. He correctly identifies that design patterns have lengthy definitions and a limited domain, but fails to note that they are, in fact, much more limited than so; they are so specific to a particular development methodology and a particular language that they effectively have no expressive power beyond those. 3. Worse, he fails to read the actual message, not seeing — even though it is plain to see in just about every one of my comments — that it is not an issue about the descriptive power of “pattern nouns” but one of leading a whole generation of programmers astray thinking that patterns are important when they so are not. To prove his total ignorance of my point, he still posts about the descriptive powers of nouns (and even so, gets it wrong, as per points 1 and 2 above).

In other words: his comment has less relevancy to my post than Palin’s statement about Putin’s head floating in Alaskan airspace had to Couric’s question. That you did not see this feeble-mindedness in his comment I find disappointing. You are right on one thing though, I have little compassion for feeble thinking. I find my compassion is better expressed as donations to ACS than as trying to spell things out, as I did in this post, only to find e.g. that some still don’t get a simple reductio ad absurdum argument (point number one, above).

christer said, May 7, 2009 @ 12:56 am I really have no intention of replying further here, because those who haven’t gotten my message already never will. That said, some final (late) comments:

unwesen, on the silliness that is your “[design patterns are] no more than a name for an approach to pounding nails into a board” analogy. Ask yourself, do carpenters have different names for “approaches to pounding nails into a board”? (Go ask a carpenter. No, really, go do it!) Of course not! Unlike programmers, carpenters are intelligent people, and they wouldn’t even dream of doing something as moronic as assigning a name to nail pounding! Kenneth, these are all perfect examples of academic questions that have little practical value beyond getting someone a thesis. These questions and attempts at answering them do not belong on my blog. This blog is for discussing real-world issues. Greg, “language dependent” is not a singular but a plural reference to languages. And, yes, I’m 100% serious, modulo the five somewhat randomly picked subjects of the sentence.

christer said, March 16, 2010 @ 3:55 am I see you’re missing the point.

In carpentry, there is no group of “carpentry masters” who holds classes telling carpenters to use the ‘new’ “glue pattern” while poo-pooing the old way of gluing. There is no group who tries to tell other carpenters what to do or what to call it. Carpentry nomenclature occurs naturally, on the job, not from some club of theory-only carpenters selling expensive coursework and books. Indeed, it appears all other disciplines are quite sane, and it is only in software development where we have enough feeble-minded people that we have been taken in by a bunch of snake-oil salesmen selling made-up, out-of-the-blue, nonsense terminology like, say, “flyweight pattern.”

People thinking any forced pattern-name is important have been bamboozled. They have been fooled, just like people believing in the value of homeopathy, astrology, phrenology, or navel-gazing have been fooled. Just like I’m sad to see people get harmed by using homeopathic “medicine” I’m greatly saddened to see software developers harming software and their profession by applying “pattern-names” and, worse, “pattern-thinking.”

awood said, March 29, 2010 @ 9:01 pm This has got to be a cool place for a discussion that started 2-3 years ago to still be alive! I’ve wondered about this thread based on my own experience and I can’t help but agree with Christer. My first instinct was to wonder how labeling algorithms or data structures is any different than labeling patterns, but algorithms and data structures are much, much more concrete and applying them has very measurable results. And while this may seem harsh and overly judgmental, in my experience, the best programmers/engineers are the ones that think in terms of data and algorithms. Those are also often the best architects, because they truly understand what encapsulation means or they otherwise would not be able to separate data and algorithms (and I don’t consider encapsulation to be an OOP-only concept, I say this because Christer is obviously not an OOP fan!).

By contrast, programmers that litter the code with pattern usage are usually the ones that cause the most trouble. They absolutely have to label whatever code they create with the pattern they picked from the book. Those are the programmers that love to explain how an Adapter is different than a Facade or a Decorator, etc, and who include the pattern names in whatever new classes they’ve created instead of just trying to name a class with something that captures its purpose. They also think that labeling something with a pattern makes it clean, and that it’s okay for WhateverAIObjectFactory to be known by the entire codebase because it’s a recognized pattern. They go through great effort to systematically incorporate pattern names in their language and are completely unaware of the “real” problems other engineers are solving daily to make the game fast and ready for ship. To me, those guys are trouble. Patterns exist, and whether they need to be labeled or not can be debated. All I can say is, when I see very explicit pattern usage, my alarms trigger and I pay special attention! And that’s a reflex that has been burnt in me over time.

Oh, just a minor note. One thing that keeps coming up whenever subjects such as over-engineering, pattern bashing, etc, are in question, is the architecture vs performance/hardcore programmer comparison. I don’t know why, but there is this assumption that you either architect code well or make it fast, and that if it’s fast it must be unmaintainable (one of the earlier comments touched on that). I don’t know why that is. I am in favor of both architecture (not OOP style though, way more in the way of DOD) and obviously speed, but I consider them orthogonal problems. It is actually easier to optimize parts of the code when that code is well isolated, sticks to solving one thing, etc.

christer said, June 25, 2010 @ 11:12 pm Matt, there’s a big difference between the word “pattern” as it occurs in a dictionary and the word “pattern” as it is used in design-pattern contexts. It doesn’t seem you see the distinction as you claim we all develop patterns and that tricks are patterns. I couldn’t disagree more.

Do you call everyday life commonalities for “xxx pattern”, like you obviously do for a programming “design pattern?” Like, say, the “opening pattern” which can be applied to car doors, cans, and caps. I doubt you do. In fact, I doubt you label any commonalities outside of programming a pattern (in the “design pattern”-usage sense) even though you clearly could. Ask yourself why that is. No, really. Apply the opening pattern to your mind and consider deeply why we don’t see people talk about “design patterns” in carpentry (the “v-claw pattern”) or mathematics (the “substitution pattern”) or any other field and then draw the obvious conclusion.

And of course I’m overstating! The message would be muddled if I said that something is 98% bad. The message remains the same though: pattern terminology and usage thereof rots the brain. BTW, data structures have nothing to do with design patterns, and vice versa. Why mix them? Data structures (like algorithms) are language independent, design patterns are language dependent. The former (two) is timeless knowledge. The latter is perishable knowledge (with a best-before date of 1994)!

Brian Will
https://www.youtube.com/watch?v=lbXsrHGhBAU Brian Will states: OOP privileges data over action. It attempts to solve a problem by decomposing it into a bunch of data types. Procedural decomposes a problem into a series of actions (functions).
 * Encapsulation is impossible
 * https://www.youtube.com/watch?v=Lzc3HcIgXis What Programming is Never About (Informal Lecture) by Abner Coimbre (Will critiques some of his points.)
 * https://www.youtube.com/watch?v=gWv_vUgbmug Jonathan Blow "Making Game Programming Less Terrible" Talk at Reboot Develop 2017
 * https://www.youtube.com/watch?v=da_Rvn0au-g Stefan Mischook: Another story from the .com bubble era that helped teach me a coding lesson.

https://medium.com/@brianwill/object-oriented-programming-a-personal-disaster-1b044c2383ab by Brian Will 1. Encapsulation doesn't protect state coherence without huge structural burdens. In practice, real OOP codebases rarely achieve real encapsulation of partial program state, let alone the entire program state. 2. Most behaviors have no natural primary association with any particular data type. Consequently, object decomposition of application logic almost always produces unnatural associations of behavior and data as well as producing extra, unnatural data types and behaviors we otherwise wouldn't need.
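Will's second point can be sketched with a hypothetical example (names are mine, not his): a behavior that involves two objects equally has no natural home as a method of either one.

```python
# Hypothetical sketch of Will's second point: transfer() involves two
# accounts symmetrically, so it has no natural home as a method of either.

class Account:
    def __init__(self, balance):
        self.balance = balance

# As a free function the association is symmetric and honest:
def transfer(src, dst, amount):
    src.balance -= amount
    dst.balance += amount

# As a method, one account is arbitrarily privileged as "self":
#   src.transfer_to(dst, amount)   # why src and not dst?

a, b = Account(100), Account(0)
transfer(a, b, 30)
print(a.balance, b.balance)  # 70 30
```

Object decomposition forces a choice of owner where none exists, or invents an extra type (a `TransferService`, say) that exists only to hold the behavior.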

As Casey talked about, stand-alone 'objects' are a perfectly fine concept, e.g. ADT's are natural objects (data manipulated only through a defined interface). But trying to shove everything into an object mold produces Frankenstein entities with superfluous layers of abstraction.

I say all this as someone generally comfortable with high levels of performance overhead. In my experience, OOP adds complications which overwhelm the expressiveness gains of higher-level code. Linked from oop is ineffective handmade.network

wikipedia
https://en.wikipedia.org/wiki/Circle-ellipse_problem

OOP is the Root of All Evil by Jeff Ward
http://jeffongames.com "....Most colleges now teach OOP languages and OOP principles, but a completely object oriented program is frequently at odds with creating fast code. In an attempt to create abstractions and encapsulate complex systems, programmers sacrifice speed, arguing that these make more maintainable and more re-usable code. This talk will show that the principles that make re-usable and maintainable code are not necessarily the principles that are part of OOP, and that the speed / maintenance dichotomy is sometimes an illusion....."

https://www.youtube.com/watch?v=748TEIIlg14 OOP pulls data from scattered memory locations, which causes cache misses. With virtual functions the compiler can't optimize the calls away.

Yin Wang
What's wrong with OOP and FP "....OOP design patterns make hard things even harder; but in a purely functional language, having to use pure functions to model side-effects makes trivial things hard and hard things impossible. So speaking of the two evils, OOP is the lesser one because at least easy things are still easy in it. The problem with pure FP is: there exist things that are not pure. ...."

Wang concurs with Dave Acton: it is impossible to model the real world with a computer program. He got so much flak from the Haskell crowd that he deleted his pages, but copies were made by archive.org. David West, in his YouTube videos, stated that you are surrounded by many objects but no "functions". Computer science, like sociology, biology and philosophy, has been beset by charlatans.

https://web.archive.org/web/20140129035955/http://yinwang0.wordpress.com/2013/12/09/what-makes-python-static-analysis-interesting-and-hard/

Joe Armstrong

 * https://www.quora.com/Is-object-oriented-programming-actually-bad-I-have-read-articles-and-watched-some-videos-about-this-Im-at-a-beginner-level-and-know-a-little-OOP
 * https://www.infoq.com/interviews/johnson-armstrong-oop
 * http://harmful.cat-v.org/software/OO_programming/why_oo_sucks

OOP prevents reuse
https://www.quora.com/Has-object-oriented-programming-lived-up-to-its-promises-of-superior-code-reuse Encapsulation makes every object oriented project its own little private domain language. In doing that, it actively prevents re-use. If we think about a trivial example, if I create a graph package with Edge and Vertex classes, and a depth-first search algorithm and you create another with Arc and Node classes and a breadth-first search algorithm, the two search algorithms cannot then be re-used with the other package, because they have class (and function and variable) names embedded in them. In purely duck-typed languages you could maybe get away with creating some wrapper classes, but even that is a non-trivial exercise. The key problem here is that encapsulation actively encourages developers to create very specialized abstractions that only allow very specialized functionality to be applied. This prevents misuses that break invariants, but it also prevents unexpected uses that are valid.
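The Quora argument can be made concrete with a minimal sketch (class names follow the quote; everything else is hypothetical): an algorithm written against one package's vocabulary simply cannot see the other package's data.

```python
# Hypothetical sketch of the reuse problem: two graph "packages" whose
# algorithms embed their own class and attribute names.

class Vertex:                       # package A's vocabulary
    def __init__(self):
        self.edges = []

class Node:                         # package B's vocabulary
    def __init__(self):
        self.arcs = []

def depth_first(start):             # package A: speaks only Vertex/edges
    seen, stack = set(), [start]
    while stack:
        v = stack.pop()
        if v not in seen:
            seen.add(v)
            stack.extend(v.edges)   # fails on Node: it has 'arcs', not 'edges'
    return seen

try:
    depth_first(Node())             # package A's algorithm on package B's data
except AttributeError as e:
    print("not reusable:", e)
```

Even duck typing doesn't save us: the attribute names, not the types, are what the algorithm is coupled to.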

c2.com
http://c2.com/cgi/wiki?ObjectOrientationIsaHoax

http://wiki.c2.com/?ObjectOrientedProgramming

http://wiki.c2.com/?OoEmpiricalEvidence "..The problem is not complexity; people regularly perform more complex tasks with far more predictability. The problem is that software engineering doesn't yet really exist. The practice of writing software today is not really comparable to what engineers do. We are learning, but there is a lot of catching up to do. Even worse, all the engineering process in the world can't make up for a lack of basic theory. You can't engineer what isn't understood. Likewise, you can't measure what you don't understand. So "argument from authority" and "anecdotal evidence" are the best we can manage. This shouldn't be a surprise; it is a symptom of a larger malady...."

http://wiki.c2.com/?ArgumentsAgainstOop

subject oriented

 * https://en.wikipedia.org/wiki/Subject-oriented_programming "....In computing, subject-oriented programming is an object-oriented software paradigm in which the state (fields) and behaviour (methods) of objects are not seen as intrinsic to the objects themselves, but are provided by various subjective perceptions (“subjects”) of the objects...."

An attempted English translation, from the premise that procedural is all that exists, would be: subject-oriented programming is a paradigm in which the data, and the functions transforming this data inside the record, are not seen as intrinsic to this record, but are provided by perceptions (messages) of this data and functions from outside the record.

Which is incomprehensible. Man with his language relates subject to object via verbs; Turing machines don't. We have perceptions; Turing machines don't have perceptions, they have zeros and ones.

links
Asaf Shelly http://cwsof.com/blog/?p=425

State and behavior

http://weblog.raganwald.com/2007/10/too-much-of-good-thing-not-all.html  Too much of a good thing: not all functions should be object methods from noun

https://fredmameri.wordpress.com/2010/04/27/on-the-object-oriented-myth/

[https://web.archive.org/web/20050503090058/http://www.ifs.uni-linz.ac.at/~ecoop/cd/papers/ec89/ec890025.pdf Taenzer, David; Ganti, Murthy; Podar, Sunil (1989). "Problems in Object-Oriented Software Reuse" (PDF). ECOOP 89: Proceedings of the Third European Conference on Object-Oriented Programming, 1989. Cambridge University Press]

https://learn.freecodecamp.org/javascript-algorithms-and-data-structures/object-oriented-programming/use-inheritance-so-you-dont-repeat-yourself/

dadgum, perlmonks oop ,Stroustrup , Richard Mansfield , stackexchange oop

Vulkan graphics

http://gameprogrammingpatterns.com/component.html

https://channel9.msdn.com/Events/GoingNative/2013/Inheritance-Is-The-Base-Class-of-Evil

http://www.tattvum.com/don-t-grow-trees

http://wiki.c2.com/?AspectOrientedProgramming

http://wiki.c2.com/?SubjectOrientedProgramming there are no subjects, verbs or objects in programming. It is only input to output mapping.

http://loup-vaillant.fr/articles/deaths-of-oop The obvious solution is to separate code and data. Let the components be mere blobs of data, and have separate processors map over them. This gives you a database-like system: your game data is basically a giant table, with a column per type of component, and a row per game object. The processors then query the relevant columns. This separation of data and code has a number of advantages. For instance, it makes it easy to separate performance sensitive processors from the rest. That lets you write much of your game logic in a scripting language, and treat it as ordinary data —like textures and meshes. Your game is now easier to modify before you release it, and easier to mod after. from yegor on oop

https://www.gamedev.net/articles/programming/general-and-gameplay-programming/understanding-component-entity-systems-r3013  To wrap up, OOP-based entity hierarchies need to be left behind in favour of Component-Entity-Systems. Entities are your game objects, which are implicitly defined by a collection of components. These components are pure data and are operated on in functional groups by the systems.
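The database-like layout both articles describe can be sketched in a few lines (all names hypothetical): components are columns of plain data keyed by entity id, and systems are functions that map over the relevant columns.

```python
# Minimal sketch of a component-entity-system: entities are row ids,
# components are columns (plain dicts of data), and systems are functions
# that query and map over the columns they care about.

positions = {}   # entity id -> (x, y)      the "position" column
velocities = {}  # entity id -> (dx, dy)    the "velocity" column

def spawn(eid, pos, vel=None):
    positions[eid] = pos
    if vel is not None:
        velocities[eid] = vel

def movement_system(dt):
    # Operates only on entities that have both columns.
    for eid, (dx, dy) in velocities.items():
        x, y = positions[eid]
        positions[eid] = (x + dx * dt, y + dy * dt)

spawn(1, (0.0, 0.0), (1.0, 2.0))   # a moving entity
spawn(2, (5.0, 5.0))               # static scenery: no velocity column
movement_system(dt=1.0)
print(positions)  # {1: (1.0, 2.0), 2: (5.0, 5.0)}
```

Note there is no class hierarchy: an entity's capabilities are implied by which columns it appears in, and new behavior is a new system, not a new subclass.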

http://blog.jot.fm/2010/08/26/ten-things-i-hate-about-object-oriented-programming/

https://groups.google.com/forum/#!topic/dddcqrs/Icep9ujRllk Practicioners of Domain Driven Design (DDD) often ask whether something should be modelled as an Entity or as a Value Object. An Entity is a domain-object with an identity-property, so it can change over time and still be tracked. It can also be discerned from similar objects by its identity, even if they have the same state. On the other hand, a Value Object is an immutable object without an identity-property. Two Value Objects with the same state, the same value, are considered replacable, interchangeable, effectively the same. Although the distinction between Entities and Value objects has many merits, it is a so called category-mistake to use the two conjunctive (joined by 'and') or disjunctive (joined by 'or') in one sentence. A category-mistake is not a term from category theory, but coined by the linguistic philosopher Gilbert Ryle in the middle of the 20th century.
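The Entity/Value Object distinction quoted above can be sketched directly (class names are hypothetical): an Entity compares by identity, a Value Object is immutable and compares by value.

```python
from dataclasses import dataclass

class Customer:
    """Entity sketch: identity persists while state changes."""
    def __init__(self, customer_id, name):
        self.customer_id = customer_id   # the identity property
        self.name = name                 # mutable state
    def __eq__(self, other):
        # Equality is identity, not state.
        return isinstance(other, Customer) and self.customer_id == other.customer_id

@dataclass(frozen=True)
class Money:
    """Value Object sketch: immutable, interchangeable when equal in value."""
    amount: int
    currency: str

a = Customer(42, "Ann")
a.name = "Anne"                          # state changes, identity doesn't
assert a == Customer(42, "Ann")          # still "the same" customer
assert Money(10, "EUR") == Money(10, "EUR")  # same value => interchangeable
```

This only illustrates the two definitions; it takes no position on the thread's point that putting the two in one conjunctive sentence is a category mistake.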

https://www.youtube.com/watch?v=_xLgr6Ng4qQ&index=2&list=PLYG-GfK4ITZ7Q19M8ajAdOz2I2T6wLTVx Getter and setter methods are bad; never use private for data.
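A hypothetical sketch of that point (names are mine, not from the talk): a getter/setter pair that merely passes a value through regulates nothing, so the plain public field gives the same guarantees with less ceremony.

```python
# Sketch: a pass-through getter/setter pair adds ceremony, not protection.

class PointCeremony:
    def __init__(self):
        self._x = 0
    def get_x(self):          # forwards the read...
        return self._x
    def set_x(self, v):       # ...and the write, regulating nothing
        self._x = v

class Point:
    def __init__(self):
        self.x = 0            # plain data, same guarantees, no ceremony

p = Point()
p.x += 1                      # direct, honest access to the data
```

The ceremony only pays off when the accessor actually enforces an invariant; a trivial forwarder is private-for-data's sake alone.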

https://www.youtube.com/watch?v=zjkuXtiG1og&list=PLYG-GfK4ITZ7Q19M8ajAdOz2I2T6wLTVx&index=3  "But, suppose you were to introduce a virtual function. First, you would get fired from my team. But then second what would happen is, inside this (foo) struct, the compiler would silently introduce a pointer to what's called a VTable, which is a dispatch table - that is how it actually implements virtual functions. Now, you have no way of initializing that virtual table yourself, because you don't even know where.... you don't know anything about that - that's a compiler detail. And so one of the things "new" also does, is it initializes any hidden information that you did not put there, but the compiler did. And so, if you start using compiler features, C++ features, that the compiler needs to augment the structure of the class with, and which have important initialization that has to happen, you must call "new", even if you don't care about the compiler calling your constructor. Because the compiler, essentially, has a constructor that has to get called and it is initialization of that VTable."

https://www.youtube.com/watch?v=GKYCA3UsmrU It's really sad that we are only taught OOP and no other paradigms in our college. When I discovered programming I had no idea about OOP and it was really easy to build programs, but then I came across OOP: "how to deconstruct a problem statement into nouns for objects and verbs for methods" and it really messed up my thinking. I had been struggling for a long time with how to organize my code on the conceptual level; only recently I realized that OOP is the reason for this struggle. handmadehero helped a lot to bring me back to the roots of how programming is done. Remember: never push OOP into areas where it is not needed. You don't have to model your program as real-world entities because it's not going to run in the real world, it's going to run on a CPU! See Nouns and verbs oop

http://www.reocities.com/tablizer/oopbad.htm

http://purplepwny.com/blog/object-oriented_programming_sucks.html, http://wiki.c2.com/?FunctionalWeenie , http://wiki.c2.com/?ObjectWeenie Data is not action in and of itself, but does provide for an infinitude of potentialities to be achieved via functions acting across that data. Binding a finite set of functions to the data that it is intended to operate on is the antithesis of reusability. In light of the direction that most languages in widespread use have taken, this philosophy is admittedly somewhat anachronistic.
 * http://wiki.c2.com/?ObjectModel ,  http://wiki.c2.com/?FraudulentMindset
 * https://saimaterial.wordpress.com/2007/09/14/1what-is-the-difference-between-object-oriented-programming-and-procedural-programming/
 * https://stackoverflow.com/questions/552336/oop-vs-functional-programming-vs-procedural

http://lambda-the-ultimate.org/node/3265 Human languages have the ability to talk about the state of the world, and all of them contain nouns, which, rather than describing a particular thing, are supposed to indicate a set of things that share a certain property. Cars can be driven; boxes are containers; speakers produce sound. People from different cultures that have different languages, even with a difference in nomenclature, typically have common words that describe sets of things the same way. In addition, modern science has found mathematical relationships between classes of things with precision. All this to say there is such a thing as "real-world objects," it is not arbitrary (and suggests that there is a right way and wrong way of describing things.)

Most programming languages have a concept of objects, and allow programmers to define them and describe relationships between objects. The idea of objects helped programming with type-checking, encapsulation, code reduction, etc. In fact, an argument can be made that the job of the programmer is to figure out a representation (real-world object -> code -> binary data) of a problem and a process to solve the said problem.

Yet even with the amount of programs being written and problems about objects being solved, rather than converging on a complete representation of real-world objects, objects in programs seem to diverge, where an object from one project is different from an object from another project and is typically different from how a normal person thinks about the real world object.

When the representation in code is the same as a person's understanding about a real world object ("common sense" or common understanding), the person can process and reason and be productive with the code with ease (like.. integer object types). On the other hand, if the representation is not the same (unintuitive), then the person has to go through the documentation/look over each line to match up the person's internal representation with the representation in code, making programming difficult (any large scale programming project).
 * http://www.dataorienteddesign.com/dodmain/node17.html Second chapter of a book on why OOP doesn't work.

https://zaemis.blogspot.fr/2009/06/whats-wrong-with-oop.html

medium.com
https://medium.com/@richardeng/i-think-you-vastly-overstate-the-issues-surrounding-oop-291c4e4dcea4

ycombinator "Goodbye Object Oriented Programming" comments on https://medium.com/@cscalfani/goodbye-object-oriented-programming-a59cda4c0e53


 * oop comments
 * https://www.youtube.com/watch?v=71qoHW-g9bc Jamie King This video points out the issues and problems with inheritance as a primary object-oriented design pattern. Inheritance, polymorphism, and virtual functions seem complex. But once mastered, they tend to become a primary tool. However, replacing inheritance with composition leads to a more flexible and elegant design.
 * https://www.johndcook.com/blog/2011/07/19/you-wanted-banana/
 * https://mollyrocket.com/casey/stream_0019.html I always begin by just typing out exactly what I want to happen in each specific case, without any regard to “correctness” or “abstraction” or any other buzzword, and I get that working. Then, when I find myself doing the same thing a second time somewhere else, that is when I pull out the reusable portion and share it, effectively “compressing” the code. I like “compress” better as an analogy, because it means something useful, as opposed to the often-used “abstracting”, which doesn’t really imply anything useful. Who cares if code is abstract? Linked from http://www.mikedrivendevelopment.com/2014/06/compression-driven-development.html
 * Deborah Armstrong
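The composition-over-inheritance point in the Jamie King talk above can be sketched as follows (class names hypothetical): instead of a `FireTruck` subclass, the truck holds a replaceable part that supplies the behavior.

```python
# Hedged sketch of replacing inheritance with composition.

# Inheritance: behavior is fixed by the class hierarchy.
#   class FireTruck(Truck): ...      # deep trees grow from here

# Composition: behavior is a part the object holds and can swap.
class Siren:
    def sound(self):
        return "wee-ooo"

class Horn:
    def sound(self):
        return "honk"

class Truck:
    def __init__(self, noisemaker):
        self.noisemaker = noisemaker  # injected part, replaceable at runtime
    def warn(self):
        return self.noisemaker.sound()

fire_truck = Truck(Siren())
assert fire_truck.warn() == "wee-ooo"
fire_truck.noisemaker = Horn()        # swap behavior without a new subclass
assert fire_truck.warn() == "honk"
```

This is the runtime flexibility the Dadgum quote alludes to: you can "turn a duck into a penguin" by swapping the held part, which no inheritance tree allows.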