by kkdaemas on 10/24/20, 3:27 PM with 144 comments
by atty on 10/24/20, 4:09 PM
by kibwen on 10/24/20, 4:20 PM
IMO, a more accurate term would be stateless programming. This paradigm is about minimizing state. Of course, as the OP mentions, state is often quite useful. But minimizing state, especially global state, particularly mutable state, and especially particularly global mutable state, is something even imperative language fans can get behind.
by eyelidlessness on 10/24/20, 4:59 PM
> All you have to do is pass the world state to each function and return a new state.
> True, yes, but... yuck. It can be clunky in a language with single-assignment. And what is this really gaining you over C?
I mean, if you stop there, sure that's ugly. But if you model your program with that as the foundation, then break it down (and generalize where breaking it down is general), it's pretty easy to reason about. And what it "gains you" is never having to think about something changing in a way that produces invalid or unexpected state. (This is more true in statically typed languages, of course.)

In any program of any real complexity, you will eventually be inclined to break the problem down into smaller pieces. If your smaller pieces are functions, you can be certain about the state that's returned by them. If they're stateful subroutines, then you have to think about multiple pieces at the same time.
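A minimal sketch of that foundation in Haskell; all names here (World, Input, step, and so on) are invented for illustration. The whole game is one pure function from input and old world to new world, then broken into smaller pure pieces:

data Input = Jump | Tick deriving Show
data World = World { playerY :: Int, score :: Int } deriving Show

-- The entire game step is a pure function: old world in, new world out.
step :: Input -> World -> World
step input = applyGravity . applyInput input

-- Smaller pure pieces; each can be reasoned about in isolation.
applyInput :: Input -> World -> World
applyInput Jump w = w { playerY = playerY w + 10 }
applyInput Tick w = w { score = score w + 1 }

applyGravity :: World -> World
applyGravity w = w { playerY = max 0 (playerY w - 1) }

main :: IO ()
main = print (foldl (flip step) (World 0 0) [Jump, Tick, Tick])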
by voodootrucker on 10/24/20, 4:21 PM
Totally agree with the article: sometimes functional is the right tool for the job, sometimes OOP.
A more recent example: parsing USB descriptors in Rust. The parser would have been trivial and easy to understand if I could have multiple mutable references to nodes, but alas I kept fighting the borrow checker.
I talked with a friend about why it was so hard, and he said "do it in a functional style". That made my borrow checker worries go away, but instead I now have multiple "tree pivoters" that recursively descend input trees while building output trees. It practically broke my brain, and although the result is pure, it's not nearly as readable or maintainable as a version with mutation would have been.
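For contrast, a toy Haskell sketch of the "descend the input tree while building an output tree" pattern described above. This is not the commenter's Rust parser; the Raw/Node types and tag numbers are invented, and a real USB-descriptor parser would carry far more information:

data Raw  = Raw Int [Raw]      deriving Show  -- input tree
data Node = Node String [Node] deriving Show  -- output tree

-- Each output node is built purely from the corresponding input node and
-- its already-rebuilt children; nothing is mutated in place.
rebuild :: Raw -> Node
rebuild (Raw tag children) = Node (label tag) (map rebuild children)
  where
    label 1 = "device"
    label 2 = "configuration"
    label _ = "other"

main :: IO ()
main = print (rebuild (Raw 1 [Raw 2 [], Raw 3 []]))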
by jasperry on 10/24/20, 4:26 PM
Trying to wrap an entire function's computation into one expression is a fun mental exercise, but the resulting code may be harder to read than a sequence of statements. Haskell's "do" notation seems to be an acknowledgment of this, but to me it's adding another layer of abstraction just to recover what imperative syntax gives in the first place. Why not start from the imperative model, and then find other ways to try to limit side effects and mutation? Languages like Rust may be heading in that direction; "Return of the statement" as it were.
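To make that concrete, here is the same tiny IO action written as one nested expression and again with do notation (do is sugar for the operator version); the example itself is invented:

import Data.Char (toUpper)

-- One expression, chained with >>= and a lambda.
shoutNested :: IO ()
shoutNested = getLine >>= \s -> putStrLn (map toUpper s)

-- do notation recovers statement-like sequencing.
shoutDo :: IO ()
shoutDo = do
  s <- getLine
  putStrLn (map toUpper s)

main :: IO ()
main = shoutDo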
by karmakaze on 10/24/20, 6:25 PM
by chowells on 10/24/20, 6:55 PM
Accidentally interleaved mutation is not a theoretical or academic problem. It's probably the number three source of production bugs in my dayjob's various products, behind misunderstood requirements and web browsers constantly changing the rules. It turns out that everyone, even people like me who know better and have been burned several times, will sometimes take the convenient shortcut. It's so tempting to get something done immediately by quietly mixing some mutation in an unexpected place, and it usually doesn't bite you. Then it gets ossified that way, and a year later starts biting you. This really does happen in code maintained by multiple developers over multiple years.
To be fair to the original post, though, Haskell has come a long way in making that kind of coding easy since 2007. The lens library didn't exist back then, and it's a big part of why data transformation programs are so much more pleasant in Haskell than most languages. It lets you express data access at the right level of abstraction. You get laws to enable algebraic reasoning and a broad range of utilities for composing small pieces together to solve complex problems.
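A small sketch of the kind of composable data access meant here, assuming the lens package; the Game and Player types are invented:

{-# LANGUAGE TemplateHaskell #-}
import Control.Lens

data Player = Player { _name :: String, _hp :: Int } deriving Show
data Game   = Game   { _player :: Player, _score :: Int } deriving Show
makeLenses ''Player
makeLenses ''Game

-- Lenses compose with (.), so "the hp of the game's player" is one value
-- that can be read, set, or modified without hand-written plumbing.
damage :: Int -> Game -> Game
damage n = player . hp -~ n

main :: IO ()
main = print (damage 10 (Game (Player "p1" 100) 0))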
by dexwiz on 10/24/20, 4:36 PM
Functional programming says values like spawn rate or type should be the output of an expression. Monads try to solve this, but the storage still has to be the output of an expression, which is always more complex than simple assignment.
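For what it's worth, a toy sketch of what that threading looks like with the State monad from mtl; only the spawn-rate name comes from the comment, the rest is invented. The storage really is the output of an expression: execState returns the final state.

import Control.Monad (replicateM_)
import Control.Monad.State (State, modify, execState)

-- "Assignment" becomes a value describing a state transition.
bumpSpawnRate :: State Int ()
bumpSpawnRate = modify (+ 1)

spawnRateAfterTicks :: Int -> Int
spawnRateAfterTicks n = execState (replicateM_ n bumpSpawnRate) 0

main :: IO ()
main = print (spawnRateAfterTicks 3)  -- prints 3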
There is one more fact about programming that functional style tries to ignore: it runs on physical machines. Even though some would like to describe programs purely mathematically, they are still bound by physical mechanisms. Most of those mechanisms exist to store state in memory or transfer state to another location. Abstracting over this fact makes programming painful once performance is taken into account.
by mrkeen on 10/24/20, 4:34 PM
> And each of these is messier than it sounds, because there are so many counters and thresholds and limiters being managed and sounds being played in all kinds of situations, that the data flow isn't clean by any means.

> What's interesting is that it would be trivial to write this in C.
You could make the same argument about dealing with git and all its crazy branching, rebasing, and reflogs, compared to just editing your source code in place. Or double-entry bookkeeping versus just keeping track of how much money you have. It gets awkward.
I loved the elegance of game programming in Elm a few years back, but the performance was dogshit - in my case at least - so I probably wouldn't do it again.
The real reason to stick to C/C++ in game programming is speed.
by flowerlad on 10/24/20, 7:25 PM
More on that here: https://medium.com/weekly-webtips/dysfunctional-programming-...
by nicolashahn on 10/24/20, 4:04 PM
Absolutely. Some problems are just so much simpler in an imperative style that it's worth giving up the confidence in correctness and robustness that a functional solution gives you.
That's why I love languages that have fairly powerful functional features but keep an imperative escape hatch. I think more and more languages are becoming that. Rust does it very well; the Lisps, probably Scala, and a few others I'm not thinking of may be even better, though I don't have much experience with those.
by dnautics on 10/24/20, 4:27 PM
by savanaly on 10/24/20, 4:20 PM
I would like to know more about this. I've written several games in Elm, a JavaScript counterpart to Haskell, and I guess foolishly assumed that meant I was writing them in the functional style. Now I'm curious whether I've actually been writing in an imperative style and, if so, what the functional style would look like. I would welcome general responses to this as well as any comments on the code style specifically in examples like [0].
[0] https://github.com/tristanpendergrass/legendary-barnacle/blo...
by r-w on 10/24/20, 4:20 PM
by tome on 10/24/20, 5:05 PM
by FpUser on 10/24/20, 5:41 PM
I personally have always shied away from adopting or committing myself to any strict concept. My long experience has taught me that as soon as you do, it will likely blow up in your face one way or another some time down the road.
by ookdatnog on 10/24/20, 10:14 PM
https://youtu.be/1PhArSujR_A?t=125
https://www.gamasutra.com/view/news/169296/Indepth_Functiona...
by kaashmonee on 10/24/20, 6:32 PM
by mikewarot on 10/24/20, 5:50 PM
by ngcc_hk on 10/25/20, 12:08 AM
by pmarreck on 10/24/20, 7:45 PM
by pg_bot on 10/24/20, 4:51 PM
For example, in Erlang/Elixir you can keep game state in a series of processes. You model the state changes based on messages that a process receives, and you can spawn/terminate processes as needed. Everything he described as challenging is trivial once you understand the language's paradigm.
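The same idea can be sketched in Haskell, with a lightweight thread owning the state and a channel as its mailbox; this is an analogue of the Erlang/Elixir process model rather than the real thing, and all names are invented:

import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

data Msg = Damage Int | Heal Int | Quit

-- The "process": state lives only in this loop's argument and changes
-- only in response to the messages it receives.
gameProcess :: Chan Msg -> (Int -> IO ()) -> Int -> IO ()
gameProcess inbox finish hp = do
  msg <- readChan inbox
  case msg of
    Damage n -> gameProcess inbox finish (hp - n)
    Heal n   -> gameProcess inbox finish (hp + n)
    Quit     -> finish hp

main :: IO ()
main = do
  inbox <- newChan
  done  <- newEmptyMVar
  _ <- forkIO (gameProcess inbox (putMVar done) 100)
  mapM_ (writeChan inbox) [Damage 30, Heal 10, Quit]
  finalHp <- takeMVar done
  putStrLn ("final hp: " ++ show finalHp)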
by jqpabc123 on 10/24/20, 4:54 PM
by nerdtime on 10/24/20, 6:53 PM
FP is only modular when you use combinators. If you use closures then it's no longer modular.
f x y = x + y
g y = y * 2
w x = g . f x
f and g are combinators and modular, and w is the composition of those two combinators. Expanded into a closure, w = \x -> (\y -> (x + y) * 2).
Written in that closure form it is no longer modular, because it doesn't use combinators. That closure style is actually kind of promoted by Haskell when you need to do things with side effects. It makes FP more complex than it needs to be without improving modularity.