Why Go Getting Generics Will Not Change Idiomatic Go

After years of overly-acrimonious online discussion, Go is continuing down the chute towards getting generics. I'm already seeing the inevitable "functional" libraries (plural) coming out. I'm going to explain why none of these libraries are going to materially impact what the community considers good style.

They may see some use. I may even use some of them myself on small scales. Some iconoclasts will program with them extensively, and their code will work, in the sense that it will do what it was designed to do regardless of how the code reads. But they are not going to change what generally-good style is considered to be.

Edit: May 2021 - Update for the latest generics syntax.

For context, I like Haskell. I don't get to do a lot of work in it, but I know it reasonably well; I could take a job in it if one came up. I'm not saying what I'm about to say out of ignorance of how functional programming works or how to build non-trivial programs out of those tools. If anything, it is out of an abundance of respect.

For this post, "functional programming" is the modern strain exemplified by Haskell: strong typing, pervasive immutability, and enough differences from "imperative" languages that you end up with different optimal idioms. This is as opposed to an earlier definition centered on functions being first-class citizens and closures being available. (That is no longer a very useful definition, because it has simply won. The only remaining languages that are not "functional" by this standard are that way only because they are legacy, and most of those have added these features wherever it was even remotely possible, leaving behind only pure C. Go is already "functional" by this standard, as is nearly everything else.)

So, that said: Functional constructs are highly overrated outside of functional programming languages.

They are not of zero value. I use them myself in various appropriate places in non-Go languages. But they are overrated.

Why do I consider them overrated out of the functional programming context?

In FP languages, the touchstones of what people consider FP are building blocks, the beginning of the utility of the functional paradigm, not the end.

Consider the map, filter, and reduce operations.

Rather than the sine qua non of functional programming, these are three examples of the general pattern that functional programming uses of extracting out recursion schemes into functions (or typeclasses/interfaces, depending on which language you identify with more). Defining these recursion schemes and then building on top of those schemes is arguably the core idiom of functional programming.
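
To make "building on the scheme" concrete in Go terms: once you have the fold itself, map and filter stop being primitives and become one-liners on top of it. A minimal sketch, with names I've invented for illustration:

// Reduce is the underlying recursion scheme (a left fold); Map and
// Filter are derived from it rather than written as fresh loops.
func Reduce[A, B any](f func(B, A) B, init B, xs []A) B {
	acc := init
	for _, x := range xs {
		acc = f(acc, x)
	}
	return acc
}

func MapViaReduce[A, B any](f func(A) B, xs []A) []B {
	return Reduce(func(acc []B, x A) []B {
		return append(acc, f(x))
	}, []B(nil), xs)
}

func FilterViaReduce[A any](keep func(A) bool, xs []A) []A {
	return Reduce(func(acc []A, x A) []A {
		if keep(x) {
			return append(acc, x)
		}
		return acc
	}, []A(nil), xs)
}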

Within that context, the utility of map, filter, and reduce lies not in the operations they directly perform, but in their ability to compose with other recursion schemes. If you look at the source code for libraries like parsec (a parser combinator library) or pipes (a compositional, component-oriented pipeline library), you'll find they hardly use map/filter/reduce directly at all, instead defining their own schemes.

In fact, if you take the strictest possible definition of map/filter/reduce, in that they operate only on lists/arrays, they aren't used that much at all in Haskell. Their generalizations are used all over the place. But many functional programming libraries for imperative languages will only offer the very specialized versions, because for any of several reasons, the programming language in question can't offer the generalized version.

Analyzing their costs and benefits in another environment requires starting with a fresh sheet of paper, because when ripped out of their FP home, they aren't even remotely the same functions, but new things demanding new analysis. Just because you define a function called M-A-P in your favorite language does not mean you've got Haskell's map now.
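
For reference in the examples that follow, the specialized, slice-only versions look roughly like this in the final square-bracket generics syntax. This is a sketch, not any particular library's API:

func Map[A, B any](f func(A) B, xs []A) []B {
	out := make([]B, 0, len(xs))
	for _, x := range xs {
		out = append(out, f(x))
	}
	return out
}

func Filter[A any](keep func(A) bool, xs []A) []A {
	var out []A
	for _, x := range xs {
		if keep(x) {
			out = append(out, x)
		}
	}
	return out
}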

Consider the costs of a functional programming style in Go, post-generics.

  • Major negative performance impact: In Go, suppose you have a for loop that goes through an array, does five things to each value, and puts the result in some new array if it matches some conditions.

    Some clever FP-wannabe comes along and says "Oh, the RIGHT way to do this is with map(..., filter(..., map(..., filter(..., map(...))))) because now Go has generics and this is the right way to do it!"

    But now, what used to be

    • Take element from array A.
    • Decide whether to process element.
    • Convert from type A to type B somehow.
    • Decide whether to put element in array B.

    is now

    • Take element from array A.
    • Process element and add it to new array C.
    • Repeat until array C is fully constructed.
    • Take element from C.
    • Decide whether to keep element.
    • Construct array D out of the elements we keep.
    • Take element from D.
    • Transform element.
    • Construct array E from the transformed elements.
    • Take element from E.
    • Decide whether to keep element.
    • Construct array F from the elements we keep.
    • Take element from F.
    • Transform element from F.
    • Put the transformed elements in final array B.

    Since, for the most part, memory usage dominates CPU usage for performance, this is going to be much worse, for no gain. Haskell can optimize the intermediates away with fusion, turning the sequence of calls into something more like the first case. Go isn't going to. You've got function boilerplate repeated five times, and it's harder for the optimizer to deal with (it may optimize some of it, but you're spending your "optimizer budget" on that); it's almost all disadvantage. Not quite all. But mostly. If you doubt the claim, the benchmark sketch below is the shape of the experiment to run.
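
    This is only a sketch; it assumes the hypothetical Map and Filter from earlier plus the standard testing package, and the workload is invented, so treat the shape, not the numbers, as the point:

    func BenchmarkLoop(b *testing.B) {
    	ints := make([]int, 10000)
    	for i := range ints {
    		ints[i] = i
    	}
    	b.ResetTimer()
    	for n := 0; n < b.N; n++ {
    		// One pass, no intermediate slices.
    		var out []int
    		for _, v := range ints {
    			if v%2 != 0 {
    				out = append(out, v*3)
    			}
    		}
    		_ = out
    	}
    }

    func BenchmarkChained(b *testing.B) {
    	ints := make([]int, 10000)
    	for i := range ints {
    		ints[i] = i
    	}
    	b.ResetTimer()
    	for n := 0; n < b.N; n++ {
    		// One intermediate slice per stage, plus a closure call per element.
    		_ = Map(func(v int) int { return v * 3 },
    			Filter(func(v int) bool { return v%2 != 0 }, ints))
    	}
    }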

    Other constructs similarly won't reduce boilerplate to any real degree.

    Speaking of boilerplate...

  • Awful syntax: Let's have a look at that boilerplate. I'm going to stack the deck a bit with simple operations which will maximize the visual impact of the boilerplate, but it's not that stacked. Compare (playground link):
    func NonGenericProcess(someInts []int) []string {
    	result := []string{}
    
    	for _, i := range someInts {
    		element := someArray[i]
    		if element%2 == 0 {
    			continue
    		}
    		str := strconv.Itoa(element)
    		if len(str) < 3 {
    			continue
    		}
    		result = append(result, str)
    	}
    	return result
    }

    with

    func WithFP(someInts []int) []string {
    	return Filter[string](func(s string) bool { return len(s) > 2 },
    		Map[int, string](strconv.Itoa,
    			Filter[int](func(i int) bool { return i%2 != 0 },
    				Map[int, int](func(i int) int { return someArray[i] },
    					someInts))))
    }

    The horizontal spacing is how gofmt formats it right now, not me trying to make it look bad. If anything, this makes it look too good; if we imagine these as multi-line functions, we get

    func WithFP(someInts []int) []string {
    	return Filter[string](func(s string) bool {
    		return len(s) > 2
    	},
    		Map[int, string](strconv.Itoa,
    			Filter[int](func(i int) bool {
    				return i%2 != 0
    			},
    				Map[int, int](func(i int) int {
    					return someArray[i]
    				},
    					someInts))))
    }

    which is going to be a common case... especially once you have to introduce something like func MapError[A, B any](func(A) (B, error), []A) ([]B, error) and think about what the closures passed to it will have to do when there isn't an already-existing function to hand it.
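
    For concreteness, here is the shape such a MapError would probably take; the name and the stop-at-first-error semantics are my invention, not any particular library's:

    func MapError[A, B any](f func(A) (B, error), xs []A) ([]B, error) {
    	out := make([]B, 0, len(xs))
    	for _, x := range xs {
    		y, err := f(x)
    		if err != nil {
    			// The first failure abandons the whole pipeline.
    			return nil, err
    		}
    		out = append(out, y)
    	}
    	return out, nil
    }

    Every stage that can fail now returns two values, which means it no longer composes by simple nesting; each MapError call has to be hoisted into its own statement so its error can be checked.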

    Plus we have the usual problem with code being written backwards, which is just a cost-of-business in the functional world but very confusing here.

    You need a baseline level of conciseness for closures for this style to work. To be honest, even Javascript's ((x, y) => x + y) can feel tedious after using Haskell, which has optimized its syntax for this sort of thing. Post-generics Go isn't going to be anywhere near concise enough for this to be practical.

  • I can hear some of you... hey, that's a foul! map, filter, and reduce are methods on a slice-like type! Then you get something like
    func WithMethods(someInts FPSlice[int]) FPSlice[string] {
    	return someInts.Map[int](func(i int) int { return someArray[i] }).
    		Filter(func(i int) bool { return i%2 != 0 }).
    		Map[string](strconv.Itoa).
    		Filter(func(s string) bool { return len(s) > 2 })
    }

    The proximal problem is that the current generics proposal won't support it, as methods on a type can't introduce new type parameters. Map and Reduce need to be able to introduce a destination type to work. (You can have Filter, because the only new type it involves is the hard-coded bool.)
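
    The restriction is easy to demonstrate; FPSlice and MapSlice are invented names for illustration:

    type FPSlice[T any] []T

    // Methods may not declare their own type parameters, so the thing
    // Map actually needs, introducing a destination type U, is rejected:
    //
    //	func (s FPSlice[T]) Map[U any](f func(T) U) FPSlice[U]
    //
    // The closest legal spelling hoists it to a top-level function:
    func MapSlice[T, U any](s FPSlice[T], f func(T) U) FPSlice[U] {
    	out := make(FPSlice[U], 0, len(s))
    	for _, x := range s {
    		out = append(out, f(x))
    	}
    	return out
    }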

    But the real problem is that this steps even farther away from functional programming. The recursion scheme is no longer independent; it's directly tied to a type. If you want to borrow some functional idioms as convenient in your imperative language (which is fine), that's not a big problem. If you want to program functionally, it's terrible. You can't define a new data type, implement some sort of Traversable on it, and apply your functional tool set to it. Even if you define a generic data type with Map defined on it, you're still committing to getting the same type out of it that you put in, which isn't a huge problem, but is another on the rather large pile of "little annoyances that add up to FP not being very much fun in Go" (or very many other imperative languages).

    And if anything, that's far cleaner than it will look in real life, on code that may need to deal with errors, or code where the closures can't be one-liners... and you still have all the performance issues I cited earlier.

    I get Haskell here.

    filter (\x -> length x > 2) .
        map show .
        filter (\x -> x `mod` 2 /= 0) .
        map (somearray !!)

    I get somebody wanting to write that. I do not get somebody wanting to write that Go code above.

    Haskell can turn map f1 . map f2 . map f3 into map (f1 . f2 . f3), which means it doesn't need to manifest intermediates, quite trivially. Go isn't going to, because of its priority on fast compilation. Possibly other Algol-family languages could do it... I don't know if any do, but there are a lot of languages out there, so probably at least some do... but Go won't. In Go, the best you can do is apply the fusion law by hand, as sketched below.
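
    The fusion law stated in Go terms; Compose is an invented helper, and nothing in the toolchain will apply this rewrite for you:

    // map f . map g == map (f . g)
    func Compose[A, B, C any](f func(B) C, g func(A) B) func(A) C {
    	return func(a A) C { return f(g(a)) }
    }

    Map(f, Map(g, xs)) and Map(Compose(f, g), xs) compute the same thing, but only the second skips the intermediate slice, and only because you rewrote it yourself.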

That is a whackload of costs. Does this style carry enough benefits to make up for it in Go post-generics? Some iconoclasts here and there will disagree, but I submit the community as a whole is quite inevitably going to come down on the side of no.

After all, what are the benefits here? "I can write code to operate on arrays without overflows?" map is a solution to this problem; it is not the solution to this problem. The vast majority of loops over arrays in Go use range, and can't overflow. I am not in the market for a language construct to solve the handful of times I've written loops more directly, and over half of those loops are basically

for i := 0; i < numberOfWorkers; i++ {
	go runWorker()
}

anyhow. Much of the remaining long tail consists of cases where I'm doing something so odd it probably doesn't fit the map/reduce paradigm anyhow.

What else? "Now my Go code kinda looks more like the code that is idiomatic in another language"? That's a disadvantage, not an advantage.

It's not faster. It's not going to be fewer tokens (because you'll never compete with the non-functional solutions once you add all the function boilerplate). It's not easier to refactor. It's not going to give access to advanced features like STM. It isn't a superior solution for isolating pure code from IO code (there are better ways to do that in current Go). It's not like you're going from totally unsafe C code to some totally safe solution; at most you're going from an in-practice 99% safe solution to an in-practice 99.4% safe solution. That's a fairly steep price to pay for that level of improvement.

Programming functionally isn't about just having particular functions around. By a similar line of logic, it isn't about having monads or an Either/Option type or a Maybe type. It's about using those things as building blocks in solutions to problems. But non-functional languages range from simply lacking the abstractions necessary to put those together into large structures, to making it extremely syntactically inconvenient and trivially easy to accidentally violate the constraints needed for this style of programming to make sense. (Go, for what it is worth, is in the first category. Even post-generics it lacks too many things.) Functional programming is not intrinsically good, such that no matter where you implement it, it is better than the other possibilities; it is a contingent good, dependent on very specific preconditions and a support structure that non-functional languages are missing. The same can be said about any paradigm.

Ripping functional programming out of its contingent environment and jamming it into others is not a sign of a good programmer. It is the sign of a programmer who does not understand functional programming. It is exactly the same as a programmer who comes to Haskell and immediately starts asking questions about how to write object oriented programming in it.

In Go specifically, generics are not going to cause some sort of functional programming renaissance in the community. There are still far too many disadvantages to the style in Go post-generics, and the advantages too tiny to make up for them.

Generics in Go will give us data structures, tools to manipulate channels, maps, and slices in various generic ways beyond what we have now, possibly type-safe iterators, and a handful of other useful things. I suspect there are going to be some changes to the web frameworks, some moves on the numeric computation front, and some other similar things. I've got a few little places where I can avoid a bit of casting. But overall, what is "idiomatic" isn't going to change much.

I wrote this in terms of map/filter/reduce as the easiest ways to see what I'm talking about, but for similar reasons, I'm not expecting much traction from the inevitable "Option" libraries, or other similar things. Replacing

res, err := Something()
if err != nil {
    // handle error
}

with

res := SomethingOption(Something)()
if err := res.Error(); err != nil {
    // handle error
}

isn't a win. Haskell has the structures so that

do
    x <- somethingOption arg1
    y <- otherOption x arg1
    z <- moreOption y
    return $ z + 10

is handling errors and short-circuiting, despite appearing not to. If post-generics Go could even express that properly via some method on Result (again, without the ability for methods to introduce a new type, this is not going to fly in Go), it would be a nightmare of syntax by comparison to Haskell, like the first map/filter/reduce above, as well as having performance issues. It's not going to be preferable to the current mechanisms for handling errors.
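
To make that concrete, here is roughly the best a hypothetical Result type can manage; every name below is invented for illustration, and somethingOption and friends are assumed to return a Result:

type Result[T any] struct {
	val T
	err error
}

func Ok[T any](v T) Result[T]         { return Result[T]{val: v} }
func Fail[T any](err error) Result[T] { return Result[T]{err: err} }

// AndThen must be a top-level function; as a method it could not
// introduce the new type U.
func AndThen[T, U any](r Result[T], f func(T) Result[U]) Result[U] {
	if r.err != nil {
		return Fail[U](r.err)
	}
	return f(r.val)
}

// The do block above becomes a pyramid of closures:
func pipeline(arg1 int) Result[int] {
	return AndThen(somethingOption(arg1), func(x int) Result[int] {
		return AndThen(otherOption(x, arg1), func(y int) Result[int] {
			return AndThen(moreOption(y), func(z int) Result[int] {
				return Ok(z + 10)
			})
		})
	})
}

Every bind is a full func literal with a declared parameter type and return type; the do block's four clean lines become a nested pyramid.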

This post is about Go. Some of these arguments may partially apply to other languages, but in exactly the same way I'm pointing out the analyses based on FP languages have to be redone if you switch them to a different context, these points would have to be redone in any other languages as well. Having something like Javascript's (x => x + 1) changes the syntactic convenience; a dozen other syntax differences can have their own effect. Having an optimizer that runs longer than Go's and is more capable of inlining functions or noticing it can do a fusion optimization can change the performance analysis. None of what I'm saying here is a criticism of functional programming in an appropriate environment (I quite like Haskell), none of what I'm saying here changes one tiny bit what idiomatic Rust code should look like, and so on.

This post is not a universal claim about the utility of functional programming constructs embedded in imperative programming languages... with the possible exception of my claim that they're overrated when the real lesson of FP is about generalized recursion scheme support rather than a particular set of blessed functions, though being overrated doesn't mean they aren't sometimes useful even so. When I'm in Python, I use more itertools. When I'm in Javascript, I use futures stuff. When I'm in C#, I use LINQ. But they're just occasionally useful tools; you're not "programming functionally" just because you have a couple of maps sprinkled into your imperative language. You're just refactoring your imperative code with some ideas inspired by FP.

This isn't even a claim that this sort of thing will never be useful in Go. I expect to use some maps every so often, for instance. There will be some small changes here and there.

What I am saying is simply that generic support isn't going to radically change what constitutes idiomatic Go code.

PS: I've posted things like this before. Inevitably, someone will write a Map that mitigates one of my objections or something. But the goal isn't to produce a particular solution that mitigates exactly one of my objections; the goal is to produce something so much better than the current alternatives, in practice, that the entire community agrees en masse that you've got a better way to do it, across the entire cost/benefit analysis.

I submit that while there are probably half-a-dozen different ways of writing a Map or Result in Go that mitigate one or another particular objection, no matter how you write it, it'll have major weaknesses.

When I say that, I'm thinking beyond mere tweaks to what is above; consider defunctionalizing map/filter/reduce. You would then have to write an interpreter to execute the defunctionalized data structure. This interpreter could be smart enough to avoid the issues I wrote about above with multiple array instantiation by fusing the functions internally.

But while this takes a bite out of the performance problems (ultimately, you can't solve them in Go, because closures blind the inliner in a way plainly-written code doesn't, and the payloads of these functions are rarely large enough to push the interpreter's overhead below the noise threshold), it's worse on every other dimension. Pardon me if I don't implement it to show you, as I think I've done enough work for this post. Goodness help you if you try to make the interpreter extensible; working with that will look more like "implementing a language runtime" than "writing a couple of for loops".

If you think you've got better, show me a real function, like, let's say, path.Base. Show me how it's not just a bit better, but worth switching for. Fair warning: I will be looking at performance.

PPS: This discussion is not complete without a link to the inner platform effect; all of this amounts to re-implementing functionality that the language already provides. Part of why I keep saying the alternatives need to be a lot better is to make up for the disadvantages of any inner platform.