Mike Acton

hardware is the platform
http://www.opowell.com/post/talks-that-changed-the-way-i-think-about-programming/ Almost everything that's interesting in a non-trivial program is a 'cross-cutting' concern. When faced with the reality of real-world programs, promises of encapsulation tend to be broken, and object graphs quickly become tangled webs (from Brian Will).

We always hear arguments for Garbage Collection, and Virtual Machines, but what about the counter arguments?

CppCon (Mike Acton)
https://www.youtube.com/watch?v=qWJpI2adCcs Handmade hero interviewing Mike

https://www.youtube.com/watch?v=rX0ItVEVjHc The transformation of data is the only purpose of any program. A point of view shared by Buko Obele, who views a computer's sole purpose as mapping inputs to outputs (see (Noun)). Common approaches in C++ which are antithetical to this goal are presented in the context of a performance-critical domain (console game development). Additionally, the limitations inherent in any C++ compiler, and how they affect the practical use of the language when transforming that data, are demonstrated. Linked from altdevblog.com 2012/09/16.


 * There is no ideal abstract solution to the problem. (25min)
 * You cannot future proof code.

(19min) Code doesn't model the real world, and OOP forces the wrong paradigm by hiding data. It conflates two problems: maintenance (how changes to data access are managed) and grasping the properties of that data (critical for solving problems). If you increase maintainability by hiding data, you make it more difficult to grasp the properties of that data. World modeling implies some relationship to real data or transforms, and in the real world instances of a "class" are fundamentally similar, e.g. a chair is a chair. But in mapping inputs to outputs on a computer (the only actual activity taking place), "classes" are only superficially similar, so the analogy makes no sense (a category mistake, in philosophical terms).

(21min) World modeling leads to monolithic, unrelated data structures and to functions mapping that data. World modeling tries to idealize the problem, but you can't make a problem simpler than it is. It is the equivalent of self-help books for programming ... solve by analogy ... solve by storytelling. Instead, solve within the constraints of the hardware platform: it can only map inputs to outputs.

Software isn't a platform; hardware is. Code and data go into a CPU, which maps them to output data. See (Noun). With different hardware we get different solutions. FPGA solutions, for example, are difficult with OOP, and you need to parallelize code for real-time execution; the nested for loops in YOLO are executed on an FPGA. Reality isn't a hack you're forced to deal with to solve your problem; reality is the actual problem.

(22min) Code cannot be designed around a model of the world. Code is a minor issue; it's the data that is the problem. Programmers are fundamentally responsible for the data, not the code. Code is a tool for transformations: only write code that transforms the data. Grasp the data to understand the problem. There is no ideal abstract solution to the problem, and future-proofing code is impossible.
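A minimal sketch of this view of a program as pure data transformation. The function name and the depth-mapping scenario are hypothetical illustrations, not from the talk:

```cpp
#include <cstddef>

// A program reduced to its essence: input data in, output data out.
// Hypothetical example: world-space z-coordinates in, camera-relative
// depths out. No objects, no hidden state; just a transform.
void transform_depths(const float* world_z, float* view_z,
                      std::size_t count, float camera_z)
{
    for (std::size_t i = 0; i < count; ++i)
        view_z[i] = world_z[i] - camera_z;  // the whole job: inputs -> outputs
}
```

Grasping the data (how many values, how they are laid out, what range they span) is here the entire design problem; the code falls out of it.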


 * Code isn't more important than data.
 * We must solve the 90% of the problem space (the L1/L2/L3 memory caches) that the compiler can't. The idea is not to miss the L2 cache. In Acton's example, the OOP C++ version spent most of its time on L2 cache misses, while the same code written in straight C used far less; the C version is also debuggable, maintainable, and lets us reason about the cost of change. OOP ignores the finite limit of the cache, which is irrational.
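A sketch of the cache argument above, contrasting an interleaved object layout with a packed data layout. The struct and field names (`GameObjectAoS`, `GameObjectsSoA`) are hypothetical, not from the talk:

```cpp
#include <cstddef>
#include <vector>

// Typical OOP layout: hot and cold data interleaved in one object.
// An inner loop that only touches positions still drags the cold
// bytes through the cache with every object it visits.
struct GameObjectAoS {
    char  name[64];   // cold: rarely read in the inner loop
    float pos[3];     // hot
    float vel[3];     // hot
};

// Data-oriented layout: hot fields packed contiguously, so each
// 64-byte cache line carries only data the loop actually uses.
struct GameObjectsSoA {
    std::vector<float> pos_x, pos_y, pos_z;
    std::vector<float> vel_x, vel_y, vel_z;
};

void integrate(GameObjectsSoA& o, float dt)
{
    for (std::size_t i = 0; i < o.pos_x.size(); ++i) {
        o.pos_x[i] += o.vel_x[i] * dt;
        o.pos_y[i] += o.vel_y[i] * dt;
        o.pos_z[i] += o.vel_z[i] * dt;
    }
}
```

Nothing here is clever; the win comes purely from acknowledging the finite cache and laying data out for how it is actually read.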

Data oriented design
13min: DOD is a semantically tricky way of bringing back unstuffed procedural programming. If you don't understand the hardware, you can't reason about the cost of solving the problem. Everything is a data problem: usability, maintenance, debugging, etc. It's not a coding problem. Latency and throughput are only the same in sequential systems.

nameguy poster
Your rant about floating point precision, aside from being strange, didn't actually address the robustness of the time travel feature.

A modular object system that lets you add features and behaviours without changing the central structure of the code is bad. It may seem to increase development speed, but nobody has seriously demonstrated that to be true. From a design/quality standpoint, if you're inserting a feature into something like a game without considering the "cross-cutting concerns", you're basically doing a tack-on job. That means the feature is likely either bad, unnecessary, or doomed to yield a bad user experience. You are obligated to consider every interaction between components or else your software is incomplete.

Nobody here encouraged copypaste programming, although the crusade against any and all code duplication is part of the reason why the world's software is terrible.

Static analysis is good. Dynamic analysis is good, but only during development of course. Unit testing is garbage.

Ignoring the fact that virtual functions aren't necessary or useful for any purpose, they eat some indeterminate amount of space from the beginning of any structure that uses them, meaning the structure's memory layout can no longer be optimized. Calling a virtual function risks a cache miss dereferencing the object pointer, another dereferencing the vtable pointer, and a third dereferencing the function pointer. This gives virtual methods a minimum overhead of maybe ~3 cycles and a maximum overhead of 600 cycles or more, not adjusted for branch misprediction.

Iterating over a collection of objects with virtual methods will thrash the icache unless you sort by type. That means your objects must store their own type, at which point you could be using a switch. Philosophically, virtual functions are bad because they mandate that they remain abstract: their implementation is opaque to the programmer, but experience with the real world shows that you actually have to care about the implementation of things. They are better done in userland than as a language feature, even considering compiler optimizations. See Handmade Hero on virtual tables and virtual functions.
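The "store the type, use a switch" alternative mentioned above can be sketched as follows. The shape example and names are hypothetical, chosen only to illustrate the dispatch style:

```cpp
// Userland dispatch: the type tag lives in the data and a switch
// replaces the vtable. The struct layout stays fully under the
// programmer's control (no hidden vtable pointer), and sorting an
// array of Shapes by `type` keeps dispatch branches predictable.
enum class ShapeType : unsigned char { Circle, Square };

struct Shape {
    ShapeType type;   // explicit tag instead of a hidden vtable pointer
    float     size;   // radius for circles, side length for squares
};

float area(const Shape& s)
{
    switch (s.type) {
    case ShapeType::Circle: return 3.14159265f * s.size * s.size;
    case ShapeType::Square: return s.size * s.size;
    }
    return 0.0f;  // unreachable for valid tags
}
```

The implementation is visible rather than opaque, which is exactly the property the paragraph above argues for.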

youtube comments on acton
My awakening to this vectorization of data came in game development, when I built an Entity Component System from the ground up. While ECSs have their own flaws, one of their great strengths is the co-locality of like information. You start to think in terms of "streams of vector3s" or "streams of structX", as opposed to the more typical "process a list of game objects", which requires opening each object, performing last-minute logic, or worse, processing additional hierarchies. This alone (colocation and processing of data) led to a nearly 50% improvement in raw engine performance. Fortunately, it was done early; it's not an easy refactoring project...
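The "streams of vector3s" idea from the comment above can be sketched as a minimal component store. This is an assumption-laden illustration (the names `Vector3`, `TransformStore`, `move_all` are invented here), not a full ECS:

```cpp
#include <vector>

struct Vector3 { float x, y, z; };

// ECS-flavored component stream: the positions of all entities live
// in one contiguous array, so a system walks a stream of Vector3s
// instead of chasing pointers through heterogeneous game objects.
struct TransformStore {
    std::vector<Vector3> positions;  // component i belongs to entity i
};

void move_all(TransformStore& store, Vector3 delta)
{
    for (Vector3& p : store.positions) {
        p.x += delta.x;
        p.y += delta.y;
        p.z += delta.z;
    }
}
```

Systems become plain loops over packed arrays, which is where the colocation win described in the comment comes from.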

links
OOP

https://www.youtube.com/watch?v=u8B3j8rqYMw data oriented future

Handmade hero