by robgering on 10/2/17, 3:09 PM with 617 comments
by alkonaut on 10/2/17, 7:32 PM
- Quality (How many bugs)
- Dev time (How fast to develop)
- Maintainability (how easy to maintain and adapt for years, by others than the authors)
The argument is often that there is no formal evidence for static typing one way or the other. Proponents of dynamic typing often argue that Quality is not demonstrably worse, while dev time is shorter. Few of these formal studies, however, look at software over a longer perspective (10-20 years). They look at simple defect rates and development hours.
So too much focus is spent on the first two (which might not even be two separate items, as quality is certainly related to development speed and time to ship). But in my experience those two factors aren't even important compared to the third. For any code base that isn't a throwaway like a one-off script or similar, say one with 10 or 20 years of maintenance, the ability to maintain/change/refactor/adapt the code far outweighs the other factors. My own experience says it's much (much) easier to make quick and large-scale refactorings in static code bases than dynamic ones. I doubt there will ever be any formal evidence of this, because you can't make good experiments with those time frames.
by agentultra on 10/2/17, 3:32 PM
What is interesting is using the type system to specify invariants about data structures and functions at the type level before they are implemented. This has two effects:
The developer is encouraged to think of the invariants before trying to prove that their implementation satisfies them. This approach to software development asks the programmer to consider side-effects, error cases, and data transformations before committing to writing an implementation. Writing the implementation proves the invariant if the program type checks.
(Of course Haskell's type system in its lowest-common denominator form is simply typed but with extensions it can be made to be dependently typed).
The second interesting property is that, given a sufficiently expressive type system (which means Haskell with a plethora of extensions... or just Idris/Lean/Agda), it is possible to encode invariants about complex data structures at the type level. I'm not talking about enforcing homogenous lists of record types. I'm talking about ensuring that Red-Black Trees are properly balanced. This gets much more interesting when embedding DSLs into such a programming language that compile down to more "unsafe" languages.
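The red-black-tree example needs real dependent types, but a miniature of the same idea fits in TypeScript. A hedged sketch (names illustrative): the non-emptiness of a list is tracked in its type, so the partial "head" operation becomes total and the empty case is rejected before the program runs.

```typescript
// A list whose non-emptiness is part of its type: a far weaker cousin
// of the red-black-tree invariant, but the same idea in miniature.
type NonEmpty<T> = [T, ...T[]];

// Total: the type system guarantees a first element exists.
function safeHead<T>(xs: NonEmpty<T>): T {
  return xs[0];
}

const xs: NonEmpty<number> = [1, 2, 3];
console.log(safeHead(xs)); // 1

// const ys: NonEmpty<number> = [];  // rejected at compile time
```

The interesting part is not `safeHead` itself but that the invariant is stated before any implementation exists, which is exactly the workflow the comment describes.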
by catpolice on 10/2/17, 5:02 PM
For my money, I work in a primarily dynamic language and I already have a set of practices that usually prevent relatively simple type mismatches so I very rarely see bugs slip into production that involve type mismatches that would be caught by a Go-level type system, and just that level of type information would add a lot of overhead to my code.
But if I were already using types, a more expressive system could probably catch a lot of invariance issues. So I feel like the sweet spot graph is more bimodal for me: the initial cost of switching to a basic static type system wouldn't buy me a lot in terms of effort-to-caught-bugs-ratio, but there's a kind of longer term payout that might make it worth it as the type system becomes more expressive.
by simon_o on 10/2/17, 3:34 PM
"I don't see the benefit of typed languages if I keep writing code as if it was PHP/JavaScript/Go" ... OF COURSE YOU DON'T!
This is missing most of the benefits: the main benefits of a better type system aren't realized by writing the same code, they are realized by writing code that leverages the new possibilities.
Another benefit of static typing is that it applies to other peoples' code and libraries, not only your own.
Being able to look at the signatures and be certain about what some function _can't_ do is a benefit that untyped languages lack.
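A small TypeScript sketch of that "what a function can't do" reading (weaker than Haskell-style parametricity because of escape hatches like `any`, but the same flavor): the signature alone rules out whole classes of behavior.

```typescript
// From the signature alone we know a lot about what this function
// cannot do: it cannot fabricate new T values and cannot assume
// anything about them. It can only rearrange, duplicate, or drop
// elements of the input. (The function name is illustrative.)
function firstTwo<T>(xs: T[]): T[] {
  return xs.slice(0, 2);
}

console.log(firstTwo([10, 20, 30])); // [10, 20]
console.log(firstTwo(["a", "b", "c"])); // ["a", "b"]
```

In an untyped language the same function could also be logging, mutating globals, or converting elements to strings, and nothing in its declaration would tell you otherwise.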
I think the failure of "optional" typing in Clojure is a very educational example in this regard.
The failure of newer languages to retrofit nullability information onto Java is another one.
by flavio81 on 10/2/17, 5:30 PM
Camp A: Languages with mediocre static typing facilities, for example:
-- C (weakly typed)
-- C++ (weakly typed in parts, plus over-complicated type features)
-- TypeScript (the runtime is weakly typed, because it's Javascript all the way down)
Camp B: Languages with mediocre dynamic typing facilities, for example:
-- Javascript (weakly typed)
-- PHP 4/5 (weakly typed)
-- Python and Ruby (no powerful macro system to help you keep complexity well under control or take full advantage of dynamicism)
Both camps are not the best examples of static or dynamic typing. A good comparison would be between:
Camp C: Languages with very good static typing facilities, for example:
-- Haskell
-- ML
-- F#
Camp D: Languages with very good dynamic typing facilities, for example:
-- Common Lisp
-- Clojure
-- Scheme/Racket
-- Julia
-- Smalltalk
I think that as long as you stay in camp (A) or (B), you'll not be entirely satisfied, and you will get criticism from the other camp.
by fny on 10/2/17, 4:47 PM
While, yes, top-quality dynamic code will have documentation and test cases to make up for this deficiency, it's often still not good enough for me to get my answer without spelunking the source or StackOverflow.
I feel like I learned this the hard way over the years after having to deal with my own code. Without types, I spend nearly twice as long to familiarize myself with whatever atrocity I committed.
by mpartel on 10/2/17, 4:16 PM
The main use case of generics, making collections and datastructures convenient and readable, is more than enough to justify the feature in my view, since virtually all code deals with various kinds of "collections" almost all of the time. It's a very good place to spend a language's "complexity budget".
I wrote an appreciable amount of Go recently, with advice and reviews from several experienced Go users, and the experience pretty much cemented this view for me. An awful lot of energy was wasted memorizing various tricks and conventions to make do with loops, slices and maps where in other languages you'd just call a generic method. Simple concurrency patterns like a worker pool or a parallel map required many lines of error-prone channel boilerplate.
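The "generic method" missed in Go here can be sketched in TypeScript (`mapSlice` is an illustrative name, not a standard API): written once, fully type-checked at every call site, whereas pre-generics Go needs a fresh hand-written loop or `interface{}` casts each time.

```typescript
// The generic "call a method instead of writing the loop" that the
// comment above misses in Go. A and B are inferred at each use.
function mapSlice<A, B>(xs: A[], f: (a: A) => B): B[] {
  const out: B[] = [];
  for (const x of xs) out.push(f(x));
  return out;
}

console.log(mapSlice([1, 2, 3], n => n * 2)); // [2, 4, 6]
console.log(mapSlice(["a", "bb"], s => s.length)); // [1, 2]
```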
by mattnewton on 10/2/17, 3:36 PM
by evmar on 10/2/17, 4:10 PM
I have come to see that type systems, like many pieces of computer science, can either be viewed as a math/research problem (in which generally more types = better) or as an engineering challenge, in which you're more concerned with understanding and balancing tradeoffs (bugs / velocity / ease of use / etc., as described in the post). These two mindsets are at odds and generally talk past each other because they don't fundamentally agree on which values are more important (like the great startups vs NASA example at the end).
by oldandtired on 10/3/17, 12:51 AM
Though I am not a type theorist (I only dabble in compilers and language design), I have noted that many people conflate static typing and dynamic typing with other additional ideas.
Static typing has certain benefits but also has certain disadvantages, dynamic typing has certain benefits but also has certain disadvantages.
What I find interesting is that few people fall into the soft typing arena, using static typing where applicable and advantageous and using dynamic typing where applicable and advantageous.
Static typing has a tendency in many languages to explode the amount of code required to get anything done, dynamic typing has a tendency to produce somewhat brittle code that will only be discovered at runtime. The implementation of static typing in many languages requires extensive type annotation which can be problematic.
But what is forgotten by most is that static typing is a dynamic runtime typing situation for the compiler, even when the compiler is written in a statically typed language.
Instead of falling into either camp, we need to develop languages that give us the best of both worlds. Many of the features people here have raised as being part of the static typing framework have been rightly pointed out as being part of the language editors being used, and are not specifically part of the static typing regime.
Many years ago a similar discussion was held on Lambda-the-Ultimate, and the sensible heads came to the conclusion that soft typing was the best goal to head for. Yet, in the intervening years, when watching language design aficionados at work, they head towards full static typing or full dynamic typing and rarely head in the direction of soft typing (taking advantage of both worlds).
So, the upshot is that this discussion will continue to repeat itself for the foreseeable future, and there will continue to NOT be a meeting of minds over the subject.
by willtim on 10/2/17, 4:57 PM
by solatic on 10/2/17, 5:12 PM
Start-ups decide not to write MVPs in languages like Haskell or Idris not because those languages aren't "rapid" enough, but because it's too difficult to find programmers experienced in those languages on the labor market. It's already difficult enough to find competent programmers - no founder wants to make their hiring woes even more difficult.
by barrkel on 10/2/17, 3:38 PM
by mannykannot on 10/2/17, 5:14 PM
You write "Why then is it, that we don't all code in Idris, Agda or a similarly strict language?... The answer, of course, is that static typing has a cost and that there is no free lunch."
I take it that you wrote "of course" here through assuming that there must be some objective reason for the choice, and that it depends solely on strictness, but languages don't differ only in their strictness, so choices may be made objectively on the basis of their other differences, and we also know that choices are sometimes made on subjective or extrinsic grounds, such as familiarity. I don't know what proportion of professional programmers are familiar enough with Idris or Agda to be able to judge the value proposition of their strictness, but I would guess that it is rather small.
Now, to look at the sentences I elided in the above quote: "Sure, the graph above is suggestively drawn to taper off, but it's still monotonically increasing. You'd think that this implies more is better." As the graph is speculative, it cannot really be presented as evidence for the proposition you are making. I could just as well speculate that static program checking does not do much for program reliability until you are checking almost every aspect of program behavior, and that simple syntactical type checking is of limited value. That would be consistent with the fact that there is little empirical evidence for the benefit of this sort of checking, and explain why most people aren't motivated to take a close look at Idris or Agda. In this equally-speculative view of things, current language choices don't necessarily represent a global optimization, but might be due to a valley of much more work for little benefit between the status quo and the world of extensive-but-expensive static checking.
by geokon on 10/2/17, 5:44 PM
I've been thinking about the trajectory of C++ language development recently, and the emphasis has definitely been on making generics more and more powerful. You watch CppCon talks and see all this super expressive template spaghetti, and see that while it's definitely a better way to write code, the syntax is just horrifying and hard to "get over".
Just like when "auto" took off and people started thinking about having "const by default", I'm starting to think that generic by default is the way to go. The composability of generic code is incredibly powerful and needs to be more accessible.
However the other end of the spectrum: dynamic code leaves a lot of performance on the table and leads to runtime errors
by CoolGuySteve on 10/2/17, 3:29 PM
Especially when it comes to GUI programming, I really don't care if a BlueButton.Click() got called instead of RedButton.Click().
by ruskimalooski on 10/2/17, 3:35 PM
by k__ on 10/2/17, 3:29 PM
So yeah, static typing doesn't buy you much, but in some languages it's at least cheap.
by stephengillie on 10/2/17, 3:26 PM
Or, type can be specified when setting the variable:
[String]$myString = "Hello World!"
This would generate a type error:
[Int]$myString = "Hello World!"
Often, typed and untyped variables will sit together:
[Int]$EmployeeID,[String]$FullName,$Address = $Input -split ","
by coding123 on 10/2/17, 7:46 PM
Now this codebase was written with a high degree of quality (it's pretty good but not perfect), but the lack of compile (and of course runtime)-time checks has caused waste.
The second phase of my project to convert all promises to RX Observables :)
by cm2187 on 10/2/17, 3:25 PM
And I disagree with the barrier to entry argument. Static typing, by enabling rich tooling, helps a beginner (like it helped me) a lot more by giving live feedback on your code, telling you immediately where you have a problem and why, telling you through a drop down what other options are available from there, etc. Basically makes the language way more self-discoverable than having to RTFM to figure out what you can do on a class.
by seasoup on 10/2/17, 4:44 PM
by btown on 10/2/17, 3:42 PM
Disclaimer: Python user scarred by email header RFC violations
by noncoml on 10/2/17, 5:17 PM
I think Go, with its lack of algebraic types, is more of the first, helping the compiler, so I wouldn’t use it as a good example of static typing.
Haskell, OCaml and Rust would make excellent case studies, but we have nothing to compare against.
So IMHO the best way to compare static typing vs dynamic typing is by comparing Typescript against JS. And in my experience the difference when writing code is huge. It completely eliminates the code-try-fix cycle during development.
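As a minimal sketch of that code-try-fix point (the `User` shape here is hypothetical): a property typo is perfectly legal JavaScript that silently yields `undefined` at runtime, while tsc rejects it before the code ever runs.

```typescript
// Legal JS, caught by TS: misspell a property and plain JavaScript
// happily returns undefined at runtime; TypeScript stops the build.
interface User {
  firstName: string;
  lastName: string;
}

function fullName(u: User): string {
  // Writing `u.lastname` (lowercase n) here would be a compile error
  // in TypeScript, but would silently produce "Ada undefined" in JS.
  return `${u.firstName} ${u.lastName}`;
}

console.log(fullName({ firstName: "Ada", lastName: "Lovelace" })); // Ada Lovelace
```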
by thesz on 10/2/17, 4:06 PM
This is a basic intuition behind all good practices, including CI, QA, etc.
Types allow one to discover program defects (even generalized ones, when using some programming languages) in (almost) the shortest possible amount of time.
Types also allow one to constrain effects of various kinds (again, use a good language for this), and this constraining can make code simpler, safer and, in the end, more performant.
by valuearb on 10/2/17, 4:56 PM
I love everything about Swift except the compile times and occasionally inscrutable compile error messages.
I love the interactivity of Javascript, but despise the lack of types, it's like I'm sketching out the idea for a program instead of directly defining what it is. And the lack of types burns me occasionally.
by avg_programmer on 10/2/17, 5:58 PM
by _Codemonkeyism on 10/2/17, 3:45 PM
https://github.com/fthomas/refined
not only for the static checking,
scala> val i: Int Refined Positive = -5
<console>:22: error: Predicate failed: (-5 > 0).
val i: Int Refined Positive = -5
but the expressive descriptions of a domain model.
by hwayne on 10/2/17, 5:03 PM
All static typing means is that type information exists at compile time. All dynamic typing means is that type information exists at runtime. You generally need _at least_ one of the two, and the benefits each gives you are partially hobbled by the drawbacks of the other, so most dynamic languages choose not to have static typing. I also feel that dynamic languages don't really lean into dynamic typing benefits, though, which is why this becomes more "static versus no static".
One example of leaning in: J allows for some absolutely crazy array transformations. I don't really see how it could be easily statically-typed without losing almost all of its benefits.
by hellofunk on 10/2/17, 6:02 PM
In such a case, the line between these two type environments narrows.
by bad_user on 10/2/17, 5:41 PM
> "Go reaps probably upwards of 90% of the benefits you can get from static typing"
That 90% number is totally made up as well. I don't see evidence that the author actually worked with Haskell, Idris, or Agda, these being the three static languages mentioned. The article is basically hyperbole.
If I am to pull numbers out of my ass, I would say that Go reaps only 10% of the benefits you get with static typing. This is an educated guess, because:
1. it gives you no way to turn a type name into a value (i.e. what you get with type classes or implicit parameters), therefore many abstractions are out of reach
2. no generics means you can't abstract over higher order functions without dropping all notions of type safety
3. it goes without saying that it has no higher-kinded types, meaning that expressing abstractions over M[_] containers is impossible even with code generation
So there are many abstractions that Go cannot express because you lose all type safety, therefore developers simply don't express those abstractions, resorting to copy/pasting and writing the same freaking for-loop over and over again.
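Point 2 above can be made concrete with a typed `compose`, sketched here in TypeScript rather than Go (in pre-generics Go the only spelling takes and returns `interface{}`, discarding all checking):

```typescript
// Abstracting over higher-order functions without losing type safety.
// A, B, C are all inferred; a mismatched pipeline is a compile error.
function compose<A, B, C>(f: (b: B) => C, g: (a: A) => B): (a: A) => C {
  return a => f(g(a));
}

const excl = (s: string) => s + "!";
const len = (s: string) => s.length;

// Inferred as (s: string) => number; compose(excl, len) would not type-check.
const shoutLength = compose(len, excl);

console.log(shoutLength("hey")); // 4
```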
This is a perfect example of the Blub paradox btw. The author cannot imagine the abstractions that are impossible in Go, therefore he reaches the conclusion that the instances in which Go code succumbs to interface{} usage are acceptable.
> "It requires more upfront investment in thinking about the correct types."
This is in general a myth. In dynamic languages you still think about the shape of the data all the time, except that you can't write it down, you don't have a compiler to check it for you, you don't have an IDE to help you, so you have to load it in your head and keep it there, which is a real PITA.
Of course, in OOP languages with manifest typing (e.g. Java, C#) you don't get full type inference, which does make you think about type names. But those are lesser languages, just like Go and if you want to see what a static type system can do, then the minimum should be Haskell or OCaml.
> "It increases compile times and thus the change-compile-test-repeat cycle."
This is true, but irrelevant.
With a good static language you don't need to test that often. With a good static type system you get certain guarantees, increasing your confidence in the process.
With a dynamic language you really, really need to run your code often, because remember, the shape of the data and the APIs are all in your head, there's no compiler to help, so you need to validate that what you have in your head is valid, for each new line of code.
In other words this is an unfair comparison. With a good static language you really don't need to run the code that often.
> "It makes for a steeper learning curve."
The actual learning is in fact the same, the curve might be steeper, but that's only because with dynamic languages people end up being superficial about the way they work, leading to more defects and effort.
In the long run with a dynamic language you have to learn best practices, patterns, etc. things that you don't necessarily need with a static type system because you don't have the same potential for shooting yourself in the foot.
> "And more often than we like to admit, the error messages a compiler will give us will decline in usefulness as the power of a type system increases."
This is absolutely false: the more static guarantees a type system provides, the more compile-time errors you get, and a compile-time error will happen where the mistake is actually made, whereas a runtime error can happen far away, like a freaking butterfly effect, sometimes in production instead of crashing your build. So whenever you have the choice, always choose compile-time errors.
by iamleppert on 10/2/17, 7:10 PM
Wouldn't it be great if we can use the computer to figure out what the types should be by a runtime evaluation of the code and save precious human time for things only humans can do?
I don't have to think or decorate my speech with types of noun, verb, pronoun, adjective etc. when I speak, but I'm still able to communicate very effectively, because your brain is automatically adding the correct type information based on context that helps you understand what I'm saying, even with words that have multiple types. Granted, natural language is different than programming language but there was once a trend to try and make programming languages more like human language, not less so.
by platz on 10/2/17, 7:08 PM
Software failures are failures of understanding, and of imagination.
The problem is that programmers are having a hard time keeping up with their own creations.
Dynamic typing simply doesn't scale.
by jon49 on 10/6/17, 7:31 PM
I would not consider a language to be modern unless it has Type Providers; I consider this to be such an essential feature. I believe Idris and F# are the only languages that have it. People are trying to push TypeScript to add it; who knows if it will happen.
Many are saying that if you have a dynamic language you just need to be disciplined and write many tests. With good static typed languages like F# you can't even write tests on certain business logic since the way you write your code you make "impossible states impossible", see https://www.youtube.com/watch?v=IcgmSRJHu_8
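The talk's technique can be sketched even outside F#, e.g. as a TypeScript discriminated union (all names here are illustrative): the nonsense state "connected but no session" cannot be constructed, so there is nothing left for a test to check.

```typescript
// "Make impossible states impossible": a record like
// { connected: boolean; sessionId?: string } admits the nonsense
// combination "connected with no session"; this union does not.
type Connection =
  | { kind: "disconnected" }
  | { kind: "connecting"; retries: number }
  | { kind: "connected"; sessionId: string };

function describe(c: Connection): string {
  switch (c.kind) {
    case "disconnected": return "offline";
    case "connecting":   return `retry #${c.retries}`;
    case "connected":    return `session ${c.sessionId}`;
  }
}

console.log(describe({ kind: "connected", sessionId: "abc123" })); // session abc123
```

The compiler also checks the `switch` for exhaustiveness, so adding a fourth state flags every `describe`-like function that forgot to handle it.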
by hyperpallium on 10/2/17, 9:53 PM
1. performance dominates (like 80:20)
2. tooling
3. doc (becomes crucial on large projects)
4. correctness
Formal correctness doesn't really matter. Anecdotally (since that's really all we have), I find in practice, very few bugs are caught by the type-checker. Further, code is usually not typed as accurately as the language allows, i.e. the degree of type-checking is a function of the code; the language only provides a maximum. In a sense, every value has a type, even if it's not formally specified or even considered by the programmer, in the same sense that every program has a formal specification, even if it's not formally specified.
Upfront design is the price. Which is difficult to pay when the requirements are changing and/or not yet known.
by js8 on 10/2/17, 4:34 PM
By adding types (and in the extreme, dependent types), you're allowing compiler to prove more things about the code (to check correctness or generate more optimal code). If you actually need to prove more things, then it's better to leave that for a compiler rather than human.
Of course, if you're writing e.g. web scraping script, you don't need these guarantees and then you don't have to care about types. But the better engineering you want, the more static typing will help and there is no diminishing returns.
by FranOntanaya on 10/2/17, 5:21 PM
It makes the higher level types seem more transcendental than they are, and also seems to put actual validation on a second-rate level. At the end of the day, if an argument is the right scalar or interface, you'll get the same result at runtime whether you hinted it (for quality-of-life improvements) or checked it with some boilerplate validation. Worst case scenario, people will forgo encoding known stricter constraints after generally hinting the expected type.
by tabtab on 10/2/17, 9:30 PM
by cleandreams on 10/2/17, 5:42 PM
by lisper on 10/2/17, 10:54 PM
by snambi on 10/2/17, 4:52 PM
by tiuPapa on 10/2/17, 7:10 PM
by ratherbefuddled on 10/3/17, 12:08 AM
> upfront investment in thinking about the correct types
being a cost. Surely you have to do this whether the compiler will check your work or not, and if you just don't do the thinking you'll end up with bugs? Isn't this a benefit?
by zengid on 10/2/17, 7:39 PM
by z3t4 on 10/3/17, 7:35 AM
by magice on 10/2/17, 11:15 PM
Just ONE study, so don't take too much heed. That said, apparently:
* Strongly typed, statically compiled, functional, and managed-memory languages are least buggy
* Perl is INVERSELY correlated with bugs. Interestingly, Python is positively correlated with bugs. There goes the theory about how Python code looks like running pseudo-code... Snake (python's, to be more precise) oil?
* Interestingly, unmanaged-memory languages (C/C++) have a high association with bugs across the board, rather than just memory bugs.
* Erlang and Go are more prone to concurrency bugs than Javascript ¯\_(ツ)_/¯. Lesson: if you ain't gonna do something well, just ban it.
All in all, interesting paper.
by shalabhc on 10/2/17, 4:56 PM
by amelius on 10/2/17, 6:44 PM
(I'm not talking about systems which just infer types automatically).
by vhiremath4 on 10/2/17, 10:00 PM
Can someone explain this?
by woolvalley on 10/2/17, 7:07 PM
by jugg1es on 10/3/17, 3:37 AM
by danharaj on 10/2/17, 3:40 PM
A type system doesn't only describe the behavior of the program you write. It also informs you of how to write a program that does what you want. That's why functional programming pairs so well with static typing, and in my opinion why typed functional languages are gaining more traction than lisp.
How many ways are there to do something in lisp? Pose a feature request to 10 lispers and they'll come back with 11 macros. God knows how those macros compose together. On the other hand, once you have a good abstraction in ML or Haskell it's probably adhering to some simple, composable idea which can be reused again and again. In lisp, it's not so easy.
A static type system that's typing an inexpressive programming construct is kind of a pain because it just gets in the way of whatever simple thing you're trying to do. A powerful programming construct without a type system is difficult to compose because the user will have to understand its dynamics with no help from the compiler and no logical framework in which to reason about the construct.
So, a static type system should be molded to fit the power of what it's typing.
The fact that every Go programmer I talk to has something to say about their company's boilerplate factory for getting around the lack of generics tells me something. This is only a matter of taste to a point. In mathematics there are a vast possibility of abstract concepts that could be studied, but very few are. It's because there's some difficult to grasp idea of what is good, natural mathematics. The same is in programming: there are a panoply of programming constructs that could be devised, but only some of them are worth investigating. Furthermore, for every programming construct you can think of there's only going to be a relatively small set of natural type systems for it in the whole space of possible type systems.
Generics are a natural type system for interfaces. The idea that interfaces can be abstracted over certain constituents is powerful even if your compiler doesn't support it. If it doesn't, it just means that you have to write your own automated tools for working with generics. It's not pretty.
by tree_of_item on 10/2/17, 3:33 PM
by 201709User on 10/2/17, 3:18 PM
by katastic on 10/2/17, 11:38 PM
To me, it's right tool for the right job. I have no problem spinning up a static language for performance and outsourcing the scripting to a dynamic language like Python for the best of both worlds in terms of speed, and rapid development.
by zzzcpan on 10/2/17, 3:55 PM
That's not really true, just a belief. Let me give you an example to start understanding these things: the exact same program written in a very high-level and very expressive language like Perl, instead of Go, is going to have at least 3 times less code, and since defect rates per line of code are comparable, you would end up with at least 3 times fewer bugs. Suddenly the reliability argument for static typing doesn't make any sense. That's because in PL research there is a huge gap in understanding of how programmers actually think.
by guicho271828 on 10/2/17, 4:42 PM
by brango on 10/2/17, 3:48 PM
by nwellinghoff on 10/2/17, 4:46 PM