I worked with Michael on the same project, after also working with Haskell previously. On the whole I agree with the pros/cons stated in the article. Having said that, my conclusion would be a bit different: I would err on the side of Go for the majority of commercial projects.

The article mentions the impressive worst-case pause times of Go's GC. Since then we have performed some additional benchmarking. The conclusion was: it is impressive, but there are still a couple of issues that break the sub-1ms claims. We blogged about this here: https://making.pusher.com/golangs-real-time-gc-in-theory-and.... It's hard to guarantee low latency in all cases...

Michael also mentions that there is nothing like ThreadScope, or at least nothing that's easy to find. The latter is true. There is an impressive runtime system event visualiser which can be opened with `go tool trace` https://golang.org/cmd/trace/. You can see a screenshot of this in the GC blog post I linked to above. Unfortunately the only documentation is this Google Doc: https://docs.google.com/document/d/1FP5apqzBgr7ahCCgFO-yoVhk... which is tricky to find, and could be more in-depth. I'm in the middle of writing a blog post on how to use this too. Watch out for it on our engineering blog in the next month or two. https://making.pusher.com/
|
|
| |
It is, I think, going to be very difficult to enjoy writing code in a less powerful language when you are exposed to languages that hold awesome power.

In fact, this has been the basis for much writing on Lisp too. Paul Graham has written entire essays along the same lines. If you work in a job that forces the use of a less powerful language than what you've been exposed to, you can, I think, go through a sort of depression. You simply long to use the tools that you know hold much more power yet must resign yourself to the tools you have.

You can overcome the problem by being present in your work. The language might be ugly. It might be totally underpowered, with limited means of abstraction and a type system that smacks you and, sometimes, your client over the head. What you can do is, despite that, commit to writing great software with the tools you have. Commit to improving your work. Perhaps expand the ecosystem with tools borne of insights from your adventures with high-powered tools. You will have a much more enjoyable time in most languages this way. Perhaps except MUMPS, but there might be hope.
|
|
| |
I don't know how to ask this without it sounding offensive, and I don't intend it that way. But do these "hardcore" Haskellers enjoy writing programs at all? Like, are there any well-known open source apps that people actually use written in Haskell? There are tons of hobbyists and it has a following, but what are the examples of its greatness?

I ask as an old SML guy. I like the math theory, I like the promise of better, more accurate and reliable software, I think you can do some beautiful things with algebraic types, but I've been dumbfounded by the lack of great examples out there. Ada suffered/suffers the same fate: it has this "academic" pedigree for robustness but not many examples that you can get your hands on. So much expressiveness that doesn't seem to result in much expression.
|
|
| |
I've been thinking a lot lately about why Go, a quite minimalist language by modern standards, has been so incredibly successful in such a short time. It's about a decade old, and is already among the top 5 languages for people starting new OSS projects (at least in my area, which is systems management). Green-field development is being done in Go at a rate unprecedented for such a new language, I think. Node/JS might be a precedent if we say that Node is a whole new thing; even though it uses JS as its syntax, using it as a server language is pretty new (ignoring Netscape's failed attempts at it in the 90s).

I'm tempted to compare it to PHP, which exploded in popularity despite (unarguably, IMHO) better languages existing all around it at the time. Go doesn't seem like a bad language in the way that PHP was a bad language for its first 4~5 major versions, but it does find itself surrounded by many more powerful/expressive languages, and is seemingly getting much faster adoption.

I've been learning Go myself, and while I miss some of the features of my normal primary languages (Perl, lately, but also Ruby and Python), I like that I can read other people's Go code without having to look anything up, even though Go is very new to me and I have years of experience reading/writing Perl/Ruby/Python. Of course, Perl/Python/Ruby aren't a major paradigm shift away from Go as Haskell is...but it's still a question of whether it's better to write ten simplistic LOC or two possibly opaque LOC. I tend to prefer concision over verbosity (and thus hate reading Java; I get lost in the trees before I can find the forest). Maybe Go is a sweet spot for a lot of developers.

Rob Pike did a talk called Simplicity is Complicated, about this very subject...I found it a pretty convincing argument. https://www.youtube.com/watch?v=rFejpH_tAHM
|
|
| |
> has been so incredibly successful in such a short time.

1) Our industry is obsessed with new and shiny. 2) There are a lot of developers and companies who don't have the time to invest in a new paradigm or language. 3) By learning/using golang these developers and companies can look "hip" and forward-thinking with minimal effort.

With a Java background and a 10-hour investment in golang you can write decent software. With a 10-hour investment in Haskell you can't do anything.
|
|
| |
The problem is, people tell me that if I just learn Haskell, Idris, Closure, Coffescript, Rust, C++17, C#, F#, Swift, D, Lua, Scala, Ruby, Python, Lisp, Scheme, Julia, Emacs Lisp, Vimscript, Smalltalk, Tcl, Verilog, Perl, Go... then I'll finally find 'programming nirvana'.

While you might think it's worth spending 100+ hours learning Haskell, I don't (at the moment) have any reason to see Haskell as a better time investment than anything else on that list... why should I give it more than 10 hours?
|
|
| |
I don't know about you, but everything I work on is going to be something that someone will have to maintain, or will be actively developed by multiple people with different categories of expertise. Also, I usually want to build on other people's work (i.e. libraries).

So while I'd love to write big chunks of Haskell, if it's just me doing it, that code will be nuked and rewritten unless I also undertake a project of education. So accessibility is key. Go/Python/JS seem to win in that regard, and likewise in access to libraries. So unless as an organisation there is motivation to step up skilling, e.g. because there is a lot of money at stake if something breaks (e.g. trading software), or because there is a community of practice (compilers), it's hard to see Haskell gaining much ground. Likewise, all those other languages have their niche, and it is hard to see them displacing each other.
|
|
| |
100 hours is really optimistic for haskell
|
|
| |
Just a typo, but you mean Clojure? :)
|
|
|
| |
Go has one of the largest tech companies on earth actively pushing it and pouring countless programmer hours into developing tooling and support for it. That pretty much trumps all of its misfeatures in terms of gaining adoption.
|
|
| |
So, its previous analogs are Java and C#? That doesn't feel quite right, either, in that Google makes no money from Go; Java and C# are both direct moneymakers for their respective stewards. I haven't really seen Google pushing Go that hard; it feels like a skunkworks project that just happened to catch everybody's eye (including the company it came out of), rather than a grand plan to conquer the world of programming.
|
|
| |
It seems like everyone wants to believe they have Google-shaped problems, and thus should mimic Google in as many ways as they possibly can. Is some component of Go's adoption just that, whether or not it's part of some grand plan?

This happens in hiring practices, in programming languages, in infrastructure tech, in data tech, etc. It's extremely rare to encounter a data scale problem that actually requires a Hadoop-ish deployment, but they're all over the place. It's extremely rare to encounter an ops infrastructure that requires a scale and complexity where Kubernetes-ish tools make sense, but they're all over the place.

I admittedly don't fully understand the point of Go (as in existentially). I have used it, and will assuredly have to keep using it based on its increasing popularity. But it's not as good as Erlang for reliability or supporting production tools (services use cases). It's not as good as Rust or Haskell for implementation assurance (safety/security use cases). It's not as good for numerical processing as Julia or Fortran (scientific/performance use cases).

It feels a lot like the next generation of Java to me, in that it's some shape of "good enough". And when "good enough" is combined with the cachet of aligning oneself with Google's ecosystem, it's basically unstoppable.
|
|
| |
There's a lot of Java at Google, so it wouldn't be surprising for Go to be Java-esque. But I find it more like C. It has literally no accommodation for "enterprise" things. It's kind of its own thing that bubbled up out of a parallel timeline (which is kinda true, since it evolved out of the Plan 9 ecosystem and from Plan 9 developers), without much baggage from the couple of decades of commercialization of languages.

It's a pretty close to pure expression of a few language designers' vision, with very little in the way of corporate influence (that I can discern). And I think that's the point of Go; it's the language its designers wanted for the tasks they wanted to work on (web-scale services). It just happens that a lot of people want/need to work on the same sort of thing.

I agree that most people embracing massive scale are doing so way too early and probably ain't gonna need it. But that doesn't mean Go isn't a useful tool, anyway. I've watched a few videos, and read a few tutorials, and the brevity of Go for things like network servers is breathtaking. The cleverness of interfaces is...well, it really has to be seen to be understood. It's not OO, it's not functional. It's...something else. Contrary to the impression that maybe I've given in other comments, Go does have some features that aren't commonly seen elsewhere (or, if they are, they're bolted on). It has some fancy features, but only a few, and only in specific areas.

It seems like a very well thought out language, to me. Which might have been true of Java in its very early days...I'm not sure. I've never really been a Java developer, though I've read some Java code now and then. So, Go might be merely "good enough" on some fronts, but I think it's great on a few specific fronts. And it does it without requiring total buy-in on a new paradigm, which is probably its killer feature. It looks a little like Python or whatever in syntax, but acts a little like Erlang on a few specific details. Pretty good combination, and the community seems to agree.
|
|
|
| |
I agree generally with this sentiment. It's probably the biggest reason I have such a hard time falling in love with Elixir. Because Erlang is simpler, despite them being largely interchangeable, and that simplicity is extremely appealing.
|
|
| |
I wouldn't call it a grand plan, but it is one of Google's five officially supported server languages and as a result there's a lot of work being done to build tools and libraries for it. Code generation and relying on language-agnostic building and packaging systems (bazel) are both pretty common Google-isms, which show how the company's development culture influenced the priorities and direction of the language.
|
|
| |
What are the other 4? Python, C++, Java, and what other language?
|
|
|
| |
Not a server language (no NodeJS at google). The other is C.
|
|
| |
AFAIK (although I could be wrong about this) the typical 'five' languages used at Google are not specifically for servers. I have heard they are Javascript, Java, Python, Go, and C++
|
|
| |
> So, its previous analogs are Java and C#? That doesn't feel quite right, either, in that Google makes no money from Go; Java and C# are both direct moneymakers for their respective stewards.

Well, Google doesn't put as much money into Go either (as MS and Sun put into Java and C#). Java and C# had much bigger teams, much bigger budgets, and tons of marketing money spent on them, including actual ads everywhere. Just the team writing the C# documentation was probably bigger than the whole Google Go team.

But that's relative: unlike most OSS/community-driven languages (e.g. Nim or Julia), Go has a big enough team that's paid to work on Go. That makes a difference.
|
|
| |
My journey and perspective is so similar, thank you for sharing your experience
|
|
| |
Off the top of my head, Pandoc, a tool for converting between different markup/down flavors, and xmonad, a tiling window manager for linux. Both are well known and popular.
|
|
| |
I'm not sure I'd call xmonad popular. Also, I know quite a lot of people who hate it, because its configuration language is "write some Haskell and recompile", so it tends to be very fragile (people who use Haskell love it).

I also found it crashed a lot, which I would have hoped would be one of the things Haskell would solve (the problems revolved around it having to integrate with C code for linking to X, but that's part of living in the real world).
|
|
| |
Is there anything actually interesting about what they do, though? Converting markup is not a complex task.
|
|
| |
Grandparent sold pandoc short. It is actually a really nifty utility capable of converting a plethora of formats from/to each other. It is a godsend for situations when you have to create MS Office-compatible documents, and want to avoid the inevitable muscle cramps you'd get if you were to actually use Word.

You can create your own document templates, and tweak the resulting output in a million ways. At my job, I have to create a lot of documentation, often in PDF or docx, and Pandoc has become an integral part of my workflow. I can create documents within the comfortable environment of Emacs, and use Pandoc to produce output in any old format that $pointy_haired_boss may desire.
|
|
| |
Parsing is something that's very pleasant in Haskell. Other languages can use parser combinators too, of course. It's good at it though.
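For anyone who hasn't seen the style, here's a minimal sketch using the ReadP combinators that ship with GHC's base library (the grammar and names are invented purely for illustration):

```hs
import Text.ParserCombinators.ReadP
import Data.Char (isDigit)

-- Parse a comma-separated list of integers, e.g. "1, 2, 3".
number :: ReadP Int
number = read <$> munch1 isDigit

numberList :: ReadP [Int]
numberList = (number <* skipSpaces) `sepBy` (char ',' *> skipSpaces)

parseNumbers :: String -> Maybe [Int]
parseNumbers s = case readP_to_S (numberList <* eof) s of
  [(xs, "")] -> Just xs
  _          -> Nothing

main :: IO ()
main = print (parseNumbers "1, 2, 3")  -- Just [1,2,3]
```

Parsers here are ordinary values that compose with ordinary operators, which is most of what makes the style pleasant.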
|
|
|
| |
Yup! There was already a great, ready to use C parser in Haskell, and we don't have one in Rust yet.
|
|
|
| |
Yeah, fair point. That'll be a function of the fact that Haskell has tended to prioritize the concerns and problems of writing the compiler.

Complex transformations of data structures are part of the compilation activity too, but they're more difficult to spot in the wild of applications.
|
|
| |
I'd disagree. Most business logic is just transformation of data structures in disguise. Once you have a sufficiently rich set of data structures this becomes abundantly clear.
|
|
| |
Pandoc actually converts between a wide range of document formats.
|
|
| |
Moving goalposts detected.
|
|
| |
I didn't ask the original question.
|
|
| |
Others have mentioned excellent software like pandoc and xmonad. There are also projects with a smaller target audience, but that are insanely impressive compared to what you could plausibly do in basically any other language. In particular, I've used Clash and Lambda-CCC, which let you compile Haskell code to VHDL or Verilog and put it on an FPGA.

I think the number of useful tools is pretty closely correlated to popularity, and obviously Haskell has a relatively small user base. It's not like Haskell gives you super-powers and makes you 10x more productive; it mostly lets you write better software, not necessarily more software. I do feel more productive with Haskell, but it's on the order of 50-100% (depending on the task), not enough to let Haskellers match the open-source output of a very popular language like C.
|
|
| |
The Haskell community does tend to produce extremely powerful libraries but not so many user-facing tools (maybe it's just an impression).
However, the compilers for Elm and PureScript are written in Haskell :) and you probably know the FB spam filter core is also in Haskell.
|
|
|
| |
Any high-profile users of this that I might have heard of? Because on first look, PostgREST looks kind of... amazing.
|
|
| |
Depends on what you mean by high profile.
I would not say there are high-profile projects, but that's only because it's quite a new project. The latest version, which imo is the first truly powerful one, is only a month old.
But there are people using it in production for close to a year now.

Oh, and I've seen someone on HN saying it's used in something like the upcoming Kosovo land property tax system or something like that, but I am not directly aware of that.
|
|
| |
And since GraphQL is all the rage now, I'm using PostgREST here https://graphqlapi.com/ so I would say Haskell is picking up steam in terms of user-facing products.
|
|
|
| |
I have been writing Haskell professionally for about 4 years now, and I absolutely love it. I'm constantly finding myself having to answer questions of "So, what do you mean?" or "what is this" so early on in development. Never before have I had a language that is so directly tied to my understanding of problems.

On the practical side, I love having types that let me refactor so mercilessly, along with a very mature compiler with a decent suite of warnings.
|
|
| |
I think Pandoc can count as an example here - it may be a niche product, but it's well known in its niche, and it's arguably the most powerful document markup format converter out there. And it's all Haskell.
|
|
| |
I enjoy it so much I quit mainstream tech to do Haskell professionally 5 years ago.
|
|
| |
That's really cool. What sort of problems do you work on? How did you manage to start working with it professionally?
|
|
| |
Not OSS, but I know Galois uses a lot of Haskell in their work for largely government agencies (NASA, DoD, etc.).

Perhaps the types of problems Haskell is apt to solve (those requiring strict correctness, etc.) are not typically open source software projects.
|
|
| |
Maybe not Haskell, but a lot of ideas from Haskell and Haskell-like languages, such as typeclasses, Option types, type inference, etc., have found their way to Rust, and it makes programming in Rust an absolute pleasure.

Rust, I'd argue, is the correct choice for many OSS projects. In fact, for security-focused projects, it should be the only choice. C-like performance without C-like vulnerabilities? It makes no sense to choose C or C++ any more.
|
|
|
| |
Haskell is actually less expressive. It is a functional abstraction on top of what is essentially a procedural machine, and with all abstractions placed on top of lower layers you can only lose functionality as you go up rather than gain. You could say assembly language is the most expressive language out there.

The great thing about Haskell and other functional programming languages is that the functional style forces the program to be created in a form that is modular and elegant. Programs become modular and composable and re-composable in the functional world because the programmer has No Choice but to build his programs in an elegant way. It's hard to explain this to a typical programmer who hasn't dived deep into functional programming, but code reuse, modularity and elegance are significantly higher in functional programming languages than in procedural ones.

This elegance, beauty and forced organization comes at a high cost though. And the cost is thinking in a way that is outside the normal paradigm of what humans are used to. I know a lot of Haskellers think that humans can be taught to think in both ways equally; I disagree with them and believe that it is far Easier for a human to think about an algorithm procedurally than it is for him to think about it functionally. If you have several tasks to execute, what is your first thought on how to plan out those tasks? A task list (aka list of functions) or a series of composed tasks placed in a single sentence (aka expression)? The answer is obvious: humans think about the list first because procedural programming comes more naturally. This I believe is the main reason why Haskell or Lisps haven't taken over the programming world.

Functional programming is just HARDER. It takes me 10 times longer to hack out a simple program in Haskell than it does in Python because Haskell doesn't allow you to 'hack' together a program. In Haskell the application has to be an elegant diamond or the thing won't even compile! Python allows me to hack something together quickly, which is both a good and a bad thing because, as we all know, hacking together software is quick but easily leads to unmaintainable software.
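To make the contrast concrete, here is the same toy transformation written both ways in Haskell (the function names and the example are invented for illustration): once as a single composed expression, and once as a step-by-step "task list".

```hs
import Data.Char (toUpper)

-- Composed-expression style: one pipeline, read right to left.
shoutComposed :: String -> String
shoutComposed = map toUpper . reverse . filter (/= ' ')

-- Task-list style: the same steps spelled out one per line.
shoutStepwise :: String -> String
shoutStepwise input =
  let noSpaces = filter (/= ' ') input
      flipped  = reverse noSpaces
      shouted  = map toUpper flipped
  in shouted

main :: IO ()
main = do
  putStrLn (shoutComposed "hello there")  -- "EREHTOLLEH"
  putStrLn (shoutStepwise "hello there")  -- same output
```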
|
|
| |
> In haskell the application has to be an elegant diamond or the thing won't even compile!

Nonsense - you only have to make things as hard/elegant as you want them to be (and for sure, some people do play that game for better/worse). But Haskell will happily allow you to throw everything into IO; you can side-step creating a perfect model and just hack together a list of imperative IO statements. (Unless you're referring to the difference between a dynlang and a static lang; that's a difference of another category altogether.)
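To make that concrete, here is a deliberately "hacked together" toy sketch that lives entirely in IO, mutable state and all (an illustration, not a recommendation):

```hs
import Data.IORef (newIORef, readIORef, modifyIORef')
import Control.Monad (forM_)

-- Imperative-style Haskell: a mutable counter, a loop, and prints,
-- all sequenced in IO with no up-front modelling.
main :: IO ()
main = do
  total <- newIORef (0 :: Int)
  forM_ [1 .. 10] $ \i -> do
    modifyIORef' total (+ i)
    putStrLn ("added " ++ show i)
  result <- readIORef total
  putStrLn ("total: " ++ show result)
```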
|
|
| |
If you are going to "throw everything in IO" then what advantage does Haskell have?
|
|
| |
Even "just IO" in Haskell enjoys some very nice typing properties and is often easier to write and ensure proper operation than similar "just IO" programs in other languages.It's actually super pleasant!
|
|
| |
You can refactor it later, of course.

It's even possible to opt out of static type checking if you want. GHCi also provides a REPL that makes exploratory programming easy.
|
|
| |
Yes, there is a difference between "'hack' together a program" and doing proper domain modeling.

However, neither should prevent the other (i.e. "you can do X" should not be confused with "always do X"). Haskell allows you to choose. Indeed, proper domain modeling is preferred... but it's nice to have the ability to make that choice yourself based on your own constraints. Additionally, Haskell makes it very easy/pleasant to refactor from the former to the latter. (On that note, imho the best models (in business) are arrived at iteratively rather than big-design-up-front.)
|
|
| |
Laziness, generic programming, algebraic datatypes, inferred typing.
|
|
| |
> Haskell is actually less expressive. It is a functional abstraction on top of what is essentially a procedural machine and with all abstractions placed on top of lower layers you can only lose functionality as you go up rather than gain. You could say assembly language is the most expressive language out there.

That is not what is usually meant by "expressiveness" of a language. You seem to be talking about types of languages (as in the Chomsky hierarchy), by implying that there are problems you can solve in assembler but not in Haskell. That is not the case, as both are Turing-complete (Type-0). In contrast, "expressiveness" usually refers to the "ease" (one may substitute conciseness in most cases) with which higher level concepts (abstractions) are expressible in the language, and here Haskell clearly wins against assembler, and many other popular languages.
|
|
| |
The set of all possible assembly instructions is bigger than the set of all possible assembly instructions that haskell can compile to. Therefore haskell is less expressive than assembly language. That is what I mean by expressive.
|
|
| |
> The set of all possible assembly instructions is bigger than the set of all possible assembly instructions that haskell can compile to.

No it's not. The Haskell compiler is free to use whatever assembly instructions it sees fit to compile the Haskell program. Maybe you mean that effectively, the set of assembly programs (i.e. sequences of assembly instructions) which existing Haskell compilers happen to produce is limited. This is true. But it is also true that you can find an equivalent Haskell program for every possible assembly program, where equivalent means they exhibit the same behavior with respect to inputs and outputs. This is because both are Turing equivalent.

Now, it's also true that the way two equivalent programs get to their results may differ. For example, the assembly program might be faster. However, these are non-functional properties of the programs. And you were talking explicitly about functionality, so this point doesn't count towards your argument.
|
|
| |
My mistake, I shouldn't have used the word "functionality", so I concede. And yes, I meant sets of assembly instructions, thanks for the correction.

But this is the basic idea I was trying to convey: that you lose "expressiveness" as you get higher and higher in levels of abstraction.
|
|
| |
You are mistaking the meaning of expressiveness. I'll provide a short example of assembler vs. Haskell expressive power, using hello world.
This assumes an amd64-compatible processor, because I have to know the names of the registers.

```asm
section .data
str:     db 'Hello world!', 0Ah
str_len: equ $ - str

section .text
global _start

_start:
    mov eax, 4
    mov ebx, 1
    mov ecx, str
    mov edx, str_len
    int 80h
    mov eax, 1
    mov ebx, 0
    int 80h
```

In Haskell I don't have to consider any processor.

```hs
main = putStr "Hello World"
```

Expressiveness is about how much code you need for a certain behavior.
It's not about how many (possibly fucked up) source files my compiler accepts.
|
|
| |
I don't believe this is true for Haskell but I'm not a great Haskell dev. I do know Ocaml (mostly F# actually but it's super close) and this isn't at all true for OCaml which has a ton in common with Haskell. In OCaml I make lists of functions as pipe based expressions, where each step in the pipeline is a function. It's incredibly easy to reason about and matches the lists of functions view of development extraordinarily well. I know Elm (a simplified version of Haskell) supports pipe based programming also. Functional languages haven't taken over because OO and Java are still the primary way of teaching beginners to code. Very few beginners start with functional programming.
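For what it's worth, Haskell has an analogous reverse-application operator, (&) from Data.Function, so the same pipeline-of-steps style is available there too. A small illustrative sketch (the example is invented, not from anyone's codebase):

```hs
import Data.Function ((&))
import Data.Char (toUpper)

-- Each step in the pipeline is an ordinary function, applied left to right.
main :: IO ()
main =
  "functional pipelines"
    & words               -- ["functional","pipelines"]
    & map (map toUpper)   -- ["FUNCTIONAL","PIPELINES"]
    & unwords
    & putStrLn            -- prints "FUNCTIONAL PIPELINES"
```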
|
|
| |
Function composition sort of emulates a list of procedures, but it's not really the same thing. There are restrictions on function composition that aren't intuitive.

A typical task list has tons of side effects, which is not the case with your "pipelines."
|
|
| |
I don't understand the downvote. The main argument looks valid:

> If you have several tasks to execute, what is your first thought on how to plan out those tasks? A task list (aka list of functions) or a series of composed tasks placed in a single sentence (aka expression)? The answer is obvious: humans think about the list first because procedural programming comes more naturally. This I believe is the main reason why Haskell or Lisps haven't taken over the programming world.
|
|
| |
It's overly simplistic. For example, Haskell's do notation allows asynchronous programming in a linear "task list" (to use your phrase). Yet most popular imperative programming languages do not; one must instead use callbacks.

As to whether programming with expressions is fundamentally harder, I'm not so sure. Since Fortran, most languages have supported arithmetic expressions over primitive numeric types, since they are certainly not easier with an imperative approach (MOV, MOV, ADD, PUSH). Haskell allows one to easily take this further using arbitrary objects and algebras, for example, expressions with drawing primitives, parsing grammars, financial contracts etc. With a small amount of syntactic sugar, you can even make expressions that look exactly like imperative programming (Haskell's do notation).
|
|
| |
Forgot to mention that I didn't downvote either, despite disagreeing with much of the post.
|
|
| |
I'm not the downvoter, but I still don't understand the assertion that Haskell doesn't allow the "task list".

There is nothing preventing using these "task lists" (e.g. lists/steps/procedures) in Haskell. In fact, "do notation", a core language feature every beginner learns, features it.
|
|
| |
You are right, but no beginner truly understands do notation. It's just magic to them. Do notation is an abstraction on top of monads, which is an incredibly hard-to-understand abstraction on top of functional programming.

Functional programming in essence does not allow procedures, but with special, incredibly complex abstractions you can simulate procedures. If you program using said abstractions I would argue that it makes functional programming even Harder to understand.
|
|
| |
I never found monads particularly difficult. I still don't understand what the problem is with them. It's just a box with some "stuff" in, a function that makes a new box with a thing inside, and a function that takes a box and a value and gives you a new box with some different stuff inside. The hardest part for me was accepting that I can't look inside the box and don't actually need to.

I do get that there are a lot of people who'll look at monads and want to know why the heck they should bother when any "decent" or "reasonable" language just lets all their functions crap all over the filesystem whenever they like.
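The Maybe monad is probably the simplest concrete version of that "box" picture. A toy sketch (the division example is invented for illustration):

```hs
-- The "box" here is Maybe. Just puts a value in the box, and (>>=)
-- takes a box and a box-producing function and chains them together.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

calc :: Int -> Int -> Int -> Maybe Int
calc a b c = safeDiv a b >>= \x -> safeDiv x c

main :: IO ()
main = do
  print (calc 100 5 2)  -- Just 10
  print (calc 100 0 2)  -- Nothing: the failure propagates automatically
```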
|
|
| |
Look, I'm not saying Haskell is easy to learn; it's one of the steepest learning curves I know of in programming.

But I don't think do notation is what trips up beginners (not sure if this discussion is about beginners per se). It is explained by simple mechanical plug-and-chug transformations that I think beginners understand, but I am willing to hear if there is evidence otherwise. I just think it's the other things that will eventually trip them up (I was only responding to specific claims made earlier).

Also, on the point of resting on abstractions being a problem: I don't find it compelling, because I'm also sure that Python beginners don't understand what "class MyStuff(object):" is really doing, but if the abstraction isn't leaky they don't need to know right away, and they can delve into it later.
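For the curious, here is a sketch of those plug-and-chug rules in action: the same small program written with do notation and with the (>>=) calls it roughly desugars to (the example itself is invented):

```hs
-- With do notation: reads like a task list.
greet :: IO ()
greet = do
  putStrLn "What's your name?"
  name <- getLine
  putStrLn ("Hello, " ++ name)

-- Roughly what the compiler desugars it to: explicit (>>=) chaining.
greet' :: IO ()
greet' =
  putStrLn "What's your name?" >>= \_ ->
  getLine >>= \name ->
  putStrLn ("Hello, " ++ name)

main :: IO ()
main = greet >> greet'
```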
|
|
| |
I didn't downvote either, but the first paragraph seems to me definitely wrong. The rest of the post makes an argument that I don't necessarily agree with, but I think is interesting and deserving of discussion.
|
|
|
| |
While I think you're doing injustice to the haskell community, I think it's quite interesting that I personally quite frequently stumble over relevant tools that are written in go -- e.g. syncthing. The only haskell-based app I use is pandoc (which is old) after my ruby-based (and even older) converter of choice got abandoned.
|
|
| |
Replying again since I can't edit my comment: Idris, Elm and PureScript.
|
|
| |
I wrote a small javascript compiler in Haskell and it was great fun.
|
|
| |
Snowdrift.coop is written in Haskell/Yesod.
|
|
|
| |
I prefer less powerful languages (that have sufficient power to clearly express the domain).

I've found that "powerful" languages allow me to do clever things(TM), and I hate clever things(TM) retroactively when I come back to them. This gets even worse when you have multiple people on the team doing clever things(TM).
|
|
| |
>I've found that "powerful" languages allow me to do clever things(TM) and I hate clever things(TM) retroactively when I come back to them.They also allow you to do sensible things(TM) like use the one and single, battle tested, standard library's generic sort, btree, etc implementation, instead of having to roll (and then read) 1000s of your own over the span of a year... And it's a false dichotomy that you can't have the sensible things(TM) without the too-clever-for-their-own-good-things (TM) you allude too.
|
|
| |
The problem with "too clever" isn't just about cleverness that is actually in the code. It's also very much about cleverness that could potentially lurk behind any particular syntactical expression.When you look at a line of code, what can you tell about its semantics without considering non-local information? What is invariant and what could potentially be redefined? I think the answer to this question is extremely important for readability when you're not already familiar with a codebase.
|
|
| |
That's a great call-out, though I see this in most languages; Haskell and Go equally have implicit non-local behaviours. But I find the functional paradigm tends to have less of it, because of side-effect-free functions.
|
|
| |
I find laziness (coming from more imperative, non-lazy languages) to be a huge source of implicit non-local behavior. Trying to figure out how efficient my code is without specific knowledge of how certain functions in the standard library work, or how the compiler interprets my code, is impossible.
|
|
| |
You're right, the execution context is non-local. I'm actually not a big fan of laziness as the default myself. I love the option of lazy evaluation, because sometimes it makes things really easy, like for infinite sequences, but most of the time it does add complexity in reasoning about the code.

There are functional alternatives to Haskell which adopt non-lazy as the default, such as Elm, Rust, SML, Lisps, Fantom, Elixir, Scala, etc. I have to admit though, this is a bit of a trade-off situation. Working with pure functions is very simple, but to map those to impure behavior, like IO, you need something that isolates that, and without laziness, I'm not sure how you can get that to be practical.
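As a tiny illustration of the upside being traded off: with lazy evaluation an infinite sequence is an ordinary value, and only the prefix you actually demand is ever computed (toy sketch):

```hs
-- An infinite list of powers of two; laziness means only what we
-- actually consume gets evaluated.
powersOfTwo :: [Integer]
powersOfTwo = iterate (* 2) 1

main :: IO ()
main = print (take 8 powersOfTwo)  -- [1,2,4,8,16,32,64,128]
```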
|
|
| |
I'm not yet familiar enough with Haskell to be honest, but languages like Scala, C++ (to the extreme) and, to a lesser degree, C# and Swift have a lot more support for non-local redefinition of syntactic expressions than Go.

I totally agree with you about pure functions. They can be a real simplification, but only if any deviation from purity requires special syntax at the call site. Otherwise you're back in the guessing game.
|
|
| |
> but only if any deviation from purity requires special syntax at the call site.

Haskell makes it easy to write data and function types that enforce purity at the call site, throwing a type error if you make a mistake. Not just purity though; you can restrict these types to any arbitrary set of methods of your choosing. This lets you do things like parsing a blob of JSON and having all functions that depend on the result be guaranteed not to have to deal with a failed parse or otherwise invalid data. The fact that the data is good has been encoded in the types, preventing you from passing bad data by throwing a type error.
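One common way to encode that guarantee is a smart constructor: the only way to obtain the type is through the validating parse, so every function receiving it can assume the data is good. A hypothetical module sketch (names and the toy validation rule are invented):

```hs
module Email (Email, parseEmail, emailText) where

-- The constructor is not exported, so the only way to get an Email
-- is through parseEmail; downstream code can't be handed bad data.
newtype Email = Email String

parseEmail :: String -> Maybe Email
parseEmail s
  | '@' `elem` s = Just (Email s)   -- toy validation, for illustration only
  | otherwise    = Nothing

emailText :: Email -> String
emailText (Email s) = s
```

Any function that takes an Email then never has to re-check validity; handing it a raw String is a compile-time type error.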
|
|
| |
> special syntax at the call site

Definitely Haskell's game then.
|
|
| |
No one is arguing to roll your own when there is a generic sort in the standard library. The issue with clever things is that people try to make an all-purpose solution for what could be a simple one-off solution. The all-purpose version only gets 2-3 uses and is never battle tested.
|
|
| |
Clever things like a sort function that can sort both ints and floats and any new type that implements a comparison function?
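That kind of "clever" is what typeclasses give you for free: one sort works for any type with an Ord instance. A small sketch (the Priority type is invented for illustration):

```hs
import Data.List (sort, sortBy)
import Data.Ord (comparing)

-- A new type; deriving Ord supplies the comparison function.
data Priority = Low | Medium | High
  deriving (Show, Eq, Ord)

main :: IO ()
main = do
  print (sort [3, 1, 2 :: Int])           -- [1,2,3]
  print (sort [2.5, 0.1, 1.0 :: Double])  -- [0.1,1.0,2.5]
  print (sort [High, Low, Medium])        -- [Low,Medium,High]
  -- Or pass an explicit comparison, e.g. descending order:
  print (sortBy (comparing negate) [3, 1, 2 :: Int])  -- [3,2,1]
```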
|
|
|
| |
I am not arguing for Go. I am arguing that there is an issue with some complex languages. Go is trying to avoid that issue. I agree that the issue is real but also think go takes it a little too far.
|
|
| |
Or a quicksort in constant memory.
|
|
| |
Haskell, if that's where the sneer is directed, absolutely gives you that.

It's just the toy first-timer Haskell examples of quicksort (that still work, mind you) that don't. But then again, take a language like C, and consider how many things (like proper error handling) are missing from 99% of the toy tutorial examples available.
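For reference, this is the toy version being alluded to: not in-place and not constant memory, but correct and readable (a sketch of the classic first-timer example):

```hs
-- The classic beginner Haskell quicksort: clear and correct, but
-- allocation-heavy compared with an in-place imperative version.
quicksort :: Ord a => [a] -> [a]
quicksort []     = []
quicksort (p:xs) = quicksort [x | x <- xs, x < p]
                   ++ [p]
                   ++ quicksort [x | x <- xs, x >= p]

main :: IO ()
main = print (quicksort [3, 1, 4, 1, 5, 9, 2, 6 :: Int])
-- [1,1,2,3,4,5,6,9]
```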
|
|
| |
I feel like you are arguing against Java, not against Haskell.
|
|
| |
If by clever things you mean "pointfree" all over the place, then I agree, but if you mean "abstractions" in general, then that's a different thing, and this says it best:

“The purpose of abstraction is not to be vague, but to create a new semantic level in which one can be absolutely precise.” ~ Edsger Dijkstra

I think I can honestly say I did not fully get that quote until I learned Haskell.
|
|
| |
Great quote. I think if people think of level of abstraction it makes more sense. All code is an abstract representation, but to solve a problem, it's easier to work within an abstraction level that fits the problem.
|
|
| |
Agreed, but I'd take this into a different direction and argue that code generation - as it is often practiced in Go - is a Clever Thing (TM).

A language like Haskell lets me write extremely complicated code, but it also lets me concisely express the constraints I want in the form of types without getting too verbose. Go lets me do neither.
|
|
| |
I think it depends what type of "power" we are talking about. Ruby's meta programming stuff certainly falls into the problematic area. A powerful type system might be seen as quite the opposite because it can make code more predictable.
|
|
| |
Powerful languages do clever things under the hood so you don't have to(TM). All programmers use this same abstraction so there is no confusion.

Less powerful languages force the programmer to come up with their own clever abstractions, so that if you have multiple programmers you'll end up with 20 different clever ways of doing the same thing.
|
|
| |
I hate "clever things" as well, but it's worth being precise about what constitutes cleverness in different languages.Dynamic languages - ruby, python, bash, objective C etc - allow all sorts of cleverness - such as monkey patching, attribute injection, dynamic lookup - because of their dynamic runtime. All the checks about whether the code makes sense is performed at runtime. Languages with more mature/strong/static typesystems are initially restrictive - but the emphasis in cleverness is to be able to know upfront - and at compile time - if the code expressed is consistent in the way data and effects are represented and expressed. Of course - there are other kinds of code complexity -in terms of ease of understanding, and maintainability - that will always involve tradeoffs considering the task at hand and nature of the programming environment (solo, community, commercial, academic etc).
|
|
| |
This is 100% my recollection of my 2 months of Scala.
|
|
| |
> If you work in a job that forces the use of a less powerful language than what you've been exposed to, you can, I think, go through a sort of depression. You simply long to use the tools that you know hold much more power yet must resign yourself to the tools you have.

I was forced to use Go for a job, and it actually made me feel stupider, so much that I didn't know how to use the right types anymore when I moved back to a more sophisticated language. I was afraid the language wouldn't let me do what I want, just like Go. There will be a "Go programmer generation" and it's not going to be pretty. If people deem "generics too complicated to understand", what can you expect from programmers who don't want to learn anything?

The only thing I find nice about the language is implicit interfaces. But not at all costs.
|
|
| |
Go has a lot of advantages.

Yes, you are a little bit limited sometimes because of the lack of generics and the like, but at the same time, if you learn to use Go interfaces well, you end up happily using them most of the time. The parallelism primitives are great too. As much as I love complex type systems, pattern matching, algebraic data types etc., I know that if Go had those things, it wouldn't be the language I love to write every day anymore. Somehow, Go is a really practical language and you end up being very productive in it, at least in my experience.

Saying that people who write Go must be stupid is shallow thinking. Complex languages aren't the only place where complexity lives. Some people concentrate on sophisticated architectural design (like Kubernetes).

There is one thing in Go that I haven't met anywhere else: I can really understand any codebase I stumble upon in a matter of seconds. This is a huge boon to productivity.

Anyways, I never had any problems switching between Go and languages like F# or Scala on a daily basis. Rust looks interesting, though I think it will end up being too complex for a lot of people to understand, which will hinder its growth.
|
|
| |
> There is one thing in Go that I haven't met anywhere else: I can really understand any codebase I stumble upon in a matter of seconds.

Care to elaborate a bit more? I understand that when you see for(...) { b[i] = f(a[i]); } then you instinctively know that it's a loop that maps over some sequence, and so on. But let's say you have a 50-100kloc codebase with 100 files in it. In what way unique to Go does Go help you understand the big picture/coarse architecture of such a project?
|
|
| |
Oh, that it doesn't.

I meant smaller parts of code, and not for loops actually (I prefer maps, filters etc.). Things like gofmt and the simplicity (in that, small number of concepts) of the language lead to most people writing fairly idiomatic code. This way most codebases look the same and you read other people's code as if it were yours.
|
|
| |
I'm just getting into Go and have found what you said to be true. It's easy to read other people's code on GH (Java is much the same way). Compare that to something like Scala, where no one's code looks the same.
|
|
| |
I'd like to echo what you said. I tend to be a victim of analysis paralysis, spending hours thinking up the most clever design. Go offers far fewer alternatives so there goes the temptation. Given a smallish task, I simply sit down and grind out code, and a couple of thousand lines later I have a simple, self-contained executable that does what I wanted to do. I feel there's a kind of magic in Go's brute simplicity.
|
|
| |
Check out sourcegraph, it's great for reading Go and Java code and has a GitHub integration.
|
|
| |
Honestly, the biggest thing Go has going for it is gofmt. It's one of the few things I miss when I work with other languages.
|
|
| |
Doesn't every semi-decent IDE have an option to reformat your code? What makes gofmt so special?
|
|
| |
That all code is written the same from the get-go, and you can readily browse it in places like github, etc.
|
|
| |
No external choices; the language dictates style. It ships with the SDK and contributors are expected to use it, so there is a cultural agreement out of the box. It's a CLI tool, so it's immensely scriptable and easy to integrate.
|
|
| |
I think it may actually be the thing I miss the most. (together with implicit interfaces)
|
|
| |
My biggest problem with it is implicit interfaces.

I genuinely despise that about it.
|
|
| |
Why don't you like implicit interfaces? It's actually the feature I love the most.
|
|
| |
I like it as well although I can see there would be times when you might want to be explicit.
|
|
| |
I understand, but why would it be the greatest issue to despise, I'm just curious.
|
|
| |
So, do people who write timeseries databases in Go (like influxdb and prometheus), people who solve complex infrastructure problems (like k8s and docker), people who do distributed software (like consul) feel stupider because they write in Go?
|
|
| |
Probably not. As someone who finds Go cringeworthy, I still find it extremely useful for certain types of programs.
I write far simpler microservices in Go, and I don't feel stupid. It's a pretty rough and boring tool, and using it is a chore, but if the tool fits the job well enough, you can get good craftsmanship done even with a hammer.

I think it's in the same position Java found itself in the early 2000s: another boring and unimaginative language that ended up being a workhorse.
|
|
| |
A good engineer can build a bridge out of rotten trees.

But they could still build a better bridge, in less time, with proper materials.
|
|
| |
You know, I think maybe it's not a coincidence that almost all high-scale infrastructure tooling is written in Go nowadays.
|
|
| |
Almost all high-scale infrastructure tooling? Are there many databases, parallelism frameworks, indexing systems, or machine learning frameworks in Go?

I’d be interested if you can name any fields where non-Google Go projects dominate. I only know of containers.
|
|
| |
Service discovery and configuration (Consul, etcd), monitoring (Prometheus and Influx), non-container cluster management (Nomad), databases (CockroachDB), messaging (NATS, Jocko), Terraform. Other than that, I think everybody has noticed the frequent posts "xxx database written in Go". Also, most high-scale computing companies now use Go for their infrastructure parts.

EDIT: You can also check out all the stuff the CoreOS team puts out.
|
|
| |
I’ve actually never used any of them, and none of the devs or ops people I usually talk to have used them either. We had never even heard of any of those except for CockroachDB.

Are you sure these are "dominating the entire industry" products, as you mentioned before? Or is it more that they’re only used in SV?
|
|
| |
Totally not, I'm working in Poland. And, I'm sorry to say, if you haven't heard about Consul, then you're really not in a position to judge whether infrastructure tools are or aren't written in Go. Consul has been the de facto industry standard for service discovery for a few years already.

EDIT: Cockroach is actually the most SV'ish of all of those.
|
|
| |
I think you exaggerate. Consul is probably the go-to service discovery for new projects at startups and tech companies that aren't afraid of the bleeding edge, but if you count installed base and less bleeding-edge companies, ZooKeeper probably still reigns supreme.
|
|
| |
Definitely. And while ZooKeeper is common (and apparently Consul is now the hot new thing, I guess?), many places also build their own systems.

I wouldn’t call Consul "industry-dominating technology", as you’d do with ElasticSearch, Hadoop, Spark, etc.
|
|
| |
I'd call Spark a much more bleeding edge product than Consul. And no, it's not the hot new thing at all. It's just a standard tool.
|
|
| |
And the funny part is that all of them are Java. :)
Well, except Spark which is Scala.
|
|
| |
Yes, exactly. In a list of the most important tools out there, Go only turns up for a few tasks that used to be done in C++, as a layer between the OS and services.

Almost everywhere else, it’s not "industry-dominating" at all, so I’m not sure why people use the argument "Go is used everywhere and everything is in Go" (which is obviously false) to argue that Go is a good language. (Besides, C is actually used everywhere, but no one would use that as an argument to show how C is a good language.)
|
|
| |
> Consul has been the de facto industry standard for service discovery for a few years already.

So standard that it doesn’t even have a Wikipedia page? Nor is there even one for this specific type of datacenter-oriented service discovery? This is all pretty niche tech, used by a handful of startups that engineer technology for billions of users, but not used anywhere else. I’ve never seen any companies on a national scale operate with this – nor would they need to.
|
|
| |
The thing is, I know a lot of companies using it at nationwide scale (non-startups). I think there's a reason you are getting downvoted.

Anyways, service discovery like this is less niche than containers are. If containers are still niche for you, then there you go.
|
|
| |
> I think there's a reason you are getting downvoted.

Well, so are you; your comments are even so grey I can barely read them anymore (and I can’t downvote you).

Regarding the rest: containers are far less niche than Consul is. For service discovery, the market isn’t nearly as monopolized as containers are. And containers are far more used than Consul is – most places I’ve seen build their own service discovery systems, or hardcode a main node and then have services in containers register with that, etc. While for containers, at the moment, there’s almost only Docker, due to the ecosystem.
|
|
| |
etcd is also used quite a bit for service discovery, but it's written in Go too :)
|
|
| |
Are you using it personally? I never met anyone using it for that. How's your experience with it been? (in case you did use it)
|
|
| |
In the ops world, these tools are pretty much the standard. If you look at what the state of many of the tools targeted at infrastructure used to be - in the best scenario a decently maintained Python code-base - it's no surprise the ops world jumped on Go.

Go's main attraction there is that it's a simple, script-like language that compiles to easily deployed static binaries. That enables people whose main job is not necessarily coding, but who have a good scripting background, to solve problems by writing the more complex/performance-critical stuff which was previously not feasible in languages like Python, Ruby, ...

These projects requiring more performance would have required a seasoned, dedicated C/C++ coder before, but now they can be started and maintained by people initially not focusing on programming, but on solving problems they are facing in their daily job. They don't have the time to invest in learning a language with complex type systems where, initially, you'll be fighting the language itself more than solving actual problems. For programmers, this investment is useful; it's their main skill and is worth something. Ops people have other things to worry about, and just want to solve problems.

It does mean wheels are being reinvented, but the major projects now have pretty smart people working on them, hitting the limits of the language and its garbage collector, pushing them forward.
|
|
| |
> In the ops world, these tools are pretty much the standard

Thankfully, this is true only in the HN bubble. One example: Amazon.
|
|
| |
Amazon is a big enough company that it's obvious that they'll roll their own stuff usually.
|
|
| |
I am forced to use PHP at work. I hope that the stimulation I throw at my brain on lunch breaks and evenings offsets what you describe. Maybe it isn't enough and I'm on a long road down towards a point of no return.

That said, if I look back on the past 6 months, I believe I have improved. In fact, a core principle I hold dear goes as follows: "If I look back 6 months and I'm not any better, there's a big problem". At one point, it seemed to annoy my non-technical boss because he couldn't understand why some of the code I wrote 6 months ago is perhaps not as good as it could be. He has kept me around so maybe it isn't a bad thing :)

We will, I think, always have programmers who don't want to learn anything new. It's interesting that there exist "Continuing Professional Development points" in the UK for healthcare: you get a rebate for each hour spent on courses to further your education. Maybe software engineering would benefit from this given the amount of, quite significant, cost sunk into large-scale projects. Their universal credit programme should be incentive enough. Entirely different problem though.

If not Go, then what?
|
|
| |
I experience this depression constantly. It's very rare to find people or places using Erlang, the community is extremely small and seemingly getting smaller all the time, and new things attempting to approximate its core strengths (often badly or incompletely) keep coming to the market and only further exacerbating the situation.

So I end up stuck working with crappier tech, crappier tools, and dealing with the effects of fewer and fewer resources going into improving the last 5% of annoying things about Erlang. While a deluge of resources go into partially reinventing the first 50% of it over and over again in some other random language.
|
|
| |
Having now read the entirety of the originally linked post... I'm doubly depressed.

Erlang would have been a great fit for Pusher's problem space. Blargh! :-(
|
|
| |
What are your thoughts on Elixir?
|
|
| |
It's fine. Parts of it seem really weird to me.

Agents don't seem like they should be in the standard library. The pipe operator should probably have some sugar for representing the first implicit argument to the function, because it's weird to have things like Enum.map/2 become written as Enum.map/1 in that context. Protocols are bizarrely complex (you can do the same thing in Erlang with about 10% of the ceremony) and made weirder by the fact that structs are represented as maps with a special key instead of as records, so field access and key walking has an unnecessary amount of overhead comparatively.

It doesn't feel "complete", because there are pretty big chunks of the underlying system that still make you extremely aware you're really running atop Erlang (debugging, tracing, profiling, etc.), up to and including the docs themselves, which often just point you straight to the relevant Erlang documentation.

I like that it has brought a lot of new energy to the BEAM ecosystem. I like the tooling and focus on docs and testing that have come with it. I like the macro system, even though I think it's overused by the community and often to ill effect, but the system itself is much cleaner than Erlang's.

tl;dr... Sure, why not. :-)
|
|
| |
It's stoicism for programming. Accept what you cannot change (tools, language), and focus on what you can change (attitude, striving for great solutions).
|
|
| |
> Accept what you cannot change

Only, you can easily change languages... Stoicism was referring to actual things one cannot change. Like, say, death, or their status as a Roman slave. So it's less stoicism and more "our way or the highway".
|
|
| |
> Only, you can easily change languages...

That's a ridiculous blanket statement to make. Changing languages can incur a very high cost. Making peace with this as an individual seems like a useful skill to acquire. If you want to insist that there's no connection to stoicism, then fine, but don't try to pass off language choice as something that's easy to change along the way.
|
|
| |
> That's a ridiculous blanket statement to make. Changing languages can incur a very high cost.

Yes, but you can always not even get started with Go (or whatever language you don't like) to begin with, so no change cost there. Also, you can always change organizations, especially in today's "buyer's market" IT landscape. Better than working with something that you feel makes you miserable and non-productive.

And it's best to apply the principle of charity to what one responds to. Things are not always meant at their strictest interpretation. Compared to the "things you cannot change" stoicism was concerned with (like death, your past, some genetic misfortune, etc.), a programming language is nothing especially unchangeable -- which was my point.

That said, I absolutely love your libs and programs and use them all the time :-)
|
|
| |
> That said, I absolutely love your libs and programs and use them all the time :-)

I don't write Rust or Go code, but his repo seems like a great showcase for both languages - from what I read, it seems Rust is a little bit more expressive than Go, but, having not written anything significant in either, I'm not the best judge. I think Rust especially should be on my "to explore" list :)
|
|
| |
I'm skeptical of declaring what stoicism ought to be applied to. It seems like a generic framework that works at the level of perception rather than a finite whitelist of approved things.

I would consider changing organizations to be an immense cost. It would be perfectly reasonable to apply stoic principles to make peace with language choice. Many of the costs associated with using a tool one believes is inferior are related to one's own emotional reaction to it. If you can conquer that, then you've made yourself a lot more flexible. I can't imagine how anyone could claim that changing organizations is something that's easy. :-/

Thanks for your comment on charity. I admit that I came on a bit strong after rereading my comment. I sometimes get frustrated quite a bit with your commentary on Go, and I let it get the best of me. Sorry about that.
|
|
| |
Much more fundamental is the idea that suffering itself is an unchangeable fact of life. This, I think, is what makes it so similar to Buddhism. In fact, suffering is a necessary contrast to bliss, and one can induce bliss via temporary self-induced periods of suffering.

Perhaps then the stoic lesson is that we should write some machine code from time to time to really enjoy the benefits of anything above it. Or, to relate, write a bit of Go for some time when you mostly write Haskell.
|
|
| |
An individual can easily change languages.

It's significantly harder for an organization of 10+ developers to change from a single language to another single language without a technically savvy and respected manager just mandating what it will be.
|
|
| |
I agree to a certain extent. There are times when I've seen great beauty in something as mundane as pre-generics Java, when someone put together a well factored set of loosely coupled objects to accomplish a task.

Other times, though, being forced to work with obtuse tools feels like trying to construct a skyscraper out of sticks and dog crap. No matter how elegant a structure you manage to construct, doing so is a miserable experience because you're never able to transcend the feeling that you're working with twigs and poo.
|
|
| |
I understand the analogy but I think many ancient cultures did great things with twigs and poo. In fact, there are many wattle and daub structures still standing - there's one around the corner that's protected by law.
|
|
|
|
| |
> It is, I think, going to be very difficult to enjoy writing code in a less powerful language when you are exposed to languages that hold awesome power.

As someone who's written code in Haskell (even some medium sized programs) and moved on just fine, no. Frankly, Haskell can be an exercise in frustration.
|
|
| |
It depends, I imagine, on how much you enjoy working with it. If it was so painful, I think moving to another language would be refreshing as opposed to depressing.
|
|
| |
I'd concur. It's mostly around tooling for me (and Stack has helped here) but whenever I pick up Haskell for an out-of-work project now I end up putting it down again. I find Rust increasingly pleasing, although the lack of constraints on side effects is frustrating compared to Haskell's purity.
|
|
| |
You're not wrong. When you have access to what feels like "the answer" to so many problems and you repeatedly see your company experience those problems...over and over again...it can be depressing when the solution falls on deaf ears.
|
|
| |
TL;DR : get off your high horses and get to work
|
|
| |
Isn't it possible to choose the best tool AND get to work? Isn't it our responsibility, as software engineers, to ensure that those who pay for our services get the best bang for their buck? That includes choosing a programming language that actually helps develop new features faster, while guaranteeing that the system is available and robust and that maintenance costs are low.

Choosing a language that makes these things difficult is a poor choice. We should not accept subpar languages to be used just because.
|
|
| |
You're right. If you have the freedom to choose the best tool, I think you absolutely should.

I think the problem lies in the fact that many programmers don't have the luxury of making those decisions. I imagine if one of those programmers refused to use the mandated tools, they'd get the sack. Many of those programmers, again, don't have the leg room to handle that. Plus, I don't think any employer would look favourably on "I got sacked because I refused to use language X because language Y is superior". I know that isn't the kind of attitude to work I would appreciate.

The companies that allow freedom of choice in tools and languages are perhaps the best companies to work for. Unfortunately, those companies hire the best programmers in the first place, or at least set the bar with that in mind. Most programmers, including myself, are not of this calibre.
|
|
| |
Counterpoint: Programmers don't choose the best language, they choose the language they are comfortable using.

Source: at a previous job we had a "use whatever tool you want, just get the job done" policy. After two years we had everything from Perl to JavaScript in the same project.

Edit: ... and this project was a desktop application!
|
|
| |
This is true, in my experience. When I'm screening candidates, I often ask them to solve a problem in their language of choice. They choose PHP, even though I know the same solution can be expressed far more clearly and concisely in Ruby.

Perhaps we are unearthing various kinds of choices, i.e. choices for best fit vs. best convenience. Perhaps that's a case of engineering vs. something else.
|
|
| |
Despite its checkered past, isn't PHP these days a more powerful language than Ruby?
|
|
| |
That's the first time I think I've heard that. I've heard many times that PHP is much better than it used to be, but in what ways is it more powerful than Ruby?
|
|
| |
One nice feature is that it is optionally typed (PHP 5.6 had this for classes and arrays; PHP 7 added support for specifying primitive types).

Also, the newer parts of the language, such as the object system and the presence of closures and first class functions, are almost pleasant to work with: for the most part, the annoyances come from the old stuff.
|
|
| |
Not at all.

I abused Ruby's metaprogramming capabilities to build a small functional language with operators. I monkey patched symbols with operator overloading to add the typical functional operators such as composition (using * since . isn't possible). I created operators for piping, partial application, etc. Terrible abuses that probably reduced inter-library and future compatibility to 0.

I don't think that would be possible in any way with PHP without PECL (there is an extension for operator overloading magic methods), whereas it is with Ruby out of the box. If the language allows me to abuse it in such a way more than another, I reckon it holds more expressive power. Case in point: Lisp.
|
|
| |
I'm really curious about this! I don't know much about recent versions of PHP - what do you think makes it more powerful than Ruby?
|
|
| |
> Programmers don't choose the best language

Bad programmers do that. As professionals, we should refuse to implement something in the wrong language and allow somebody else to do the job.
|
|
| |
I ended up calling this engineering. Language is just a means to an end; it's about learning its limits and picking the one that best matches the problem at hand.

That said, FP/Lisp are very hard to leave. I can understand that.
|
|
| |
Or perhaps "we are programmers - if you wish for X in a language, make it so". Maybe a bit too long? I certainly don't want to sound prickly :)
|
|
| |
I find that I end up picking the excellent language engineering that is in Go almost any day over languages with many features that simply don't work well together.

Just the other day I was struggling with the hopeless nightmare of Java's lack of true first class functions (but still having some first class function like semantics), combined with embedded class restrictions, combined with generics thrown in for good measure (an AsyncTask embedded in an activity and updating a result in the activity).

Go's features are clearly specified in a very readable and fairly concise spec and guide, and its features fit seamlessly together virtually every time.
|
|
| |
> Go is just too different to how I think: when I approach a programming problem, I first think about the types and abstractions that will be useful; I think about statically enforcing behaviour

I see statements like this a lot from Haskellers and I think it's overstated. Anecdotally, after going from Python to spending 3-4 years in Haskell, then going back to a dynamic language (Elixir), I've come to the conclusion that how you think when programming is very much a learned trait that works for that language. It's neither good nor bad, but it's educational nonetheless.

Haskell and other languages like it force you to have a very unique mindset that can overpower previously learned languages not like it. And it's in no way permanent. After I stopped using Haskell I was like "ugh, I need types! wth!". I wanted to go back to Haskell only out of familiarity, but as I continued, after a short while I wasn't even thinking like that anymore. The thought rarely occurred. I stopped thinking in Haskell and started thinking in Elixir.
|
|
| |
I disagree firmly. I fought many languages, even statically typed ones like Java. The day I found out about Lisp and FP I lost it entirely. If I wasn't a young student in a Java era I'd have saved 5 years of headache. I tried, many times; I always end up thinking like mathematicians (at least partially, I have shallow math skills) and/or FP-ers. That's the only time my brain was allowed to function and walk toward solutions. It's how I finally ended up understanding C (before that I couldn't write nested loops without making a mess).

I don't think it's entirely placebo or emotional drive either, because I really liked the Java OOP world at first, and went deep into it; but got nothing.
|
|
| |
I had a very similar experience when switching languages there and back. The new one feels unnatural and ineffective at first, but then you learn how to use it effectively and adjust.

I have come to think that it is precisely when the language switch feels the worst that the learning is the most beneficial for the long term. If it is popular and you feel like it does not really work, it means you have conceptual learning to do.
|
|
| |
> I stopped thinking in Haskell and started thinking in Elixir.

https://youtu.be/yyzK6e6py9A?t=7

Seriously though, I think Elixir (with its guards and pattern-matching facilities) occupies a nice middle ground between strict static typing and the dynamic typing of procedural/OO langs.
|
|
| |
I cannot speak for Elixir, but coming from the Erlang world, I'm sure it's a fine language that has sane defaults, much like Clojure.

However, I switched from Python to Scala, and besides the performance issues and the poor handling of async I/O that I had with Python, by far the biggest problem with Python was all the insecurity while developing with it. It drove me insane, because we had a production system that always had problems due to how poorly Python's ecosystem treats errors. Just to give an example, at that time we were using BeautifulSoup to do HTML parsing, a library resembling jQuery, except that BeautifulSoup can return nulls all over the place and no matter how defensive I got in my coding, null pointer exceptions still happened. And what was the solution for async I/O in Python? Monkey patching the socket module by means of Gevent or Eventlet of course, which didn't work for native clients (e.g. MySQL's) and if this wasn't problematic enough, you had to guess what worked or not and when. The worst kind of magic possible.

When I switched to Scala, all of these problems vanished. This isn't about having tests either, because tests only test for the behavior you can foresee, unless you do property-based testing with auto-generated random data, and in this regard static languages like Scala and Haskell also rule due to the availability of QuickCheck / ScalaCheck.

You see, my Python codebase got really bad because of fear of refactoring. Yes, even with plenty of tests and good code coverage. Because actually the tests themselves became a barrier for refactoring, because in a language like Python there is no such thing as information hiding and coupled with dynamic typing, it means that your tests end up tightly coupled with implementation details, will break on correct refactoring and will be hard to fix. Nowadays when I'm doing refactoring with Scala, most of the time it simply works as soon as the compiler is happy. Some people bitch and moan about that compiler being kind of slow and it is, but the things that it can prove about your code would require specialized and expensive tools for other languages and that wouldn't do such a good job.

Going back to dynamic languages, there are also languages like Clojure, which get by with sane defaults. For example in practice Clojure doesn't do OOP-style dynamic dispatching most of the time and functions are usually multi-variadic and capable of handling nil values gracefully. This is not enforced by the language, being simply a hygienic rule accepted by the community. However, relying on such conventions requires (a) capable developers that (b) agree on what the best practices should be. And the issue that people miss when thinking of the merits of such languages is that smaller communities are always homogeneous. So when giving a language like Elixir as an example, if it hasn't driven you insane yet, well, you have to take into account that Elixir might not be popular enough.

So basically I prefer strong static typing because I'm not smart enough or wise enough to seek and follow best practices as often as I'd like and because I don't have much faith that other people in the community can do that either, especially after that community grows. The biggest irony imo is that Python is right now the antithesis of TOOWTDI.
|
|
| |
> because in a language like Python there is no such thing as
> information hiding and coupled with dynamic typing, it
> means that your tests end up tightly coupled with
> implementation details, will break on correct refactoring
> and will be hard to fix.
Would you mind providing more details about this point? I thought dynamic languages make testing easier, because you don't care about the type of the object as long as it works with the same API (if it quacks…).
This also supposedly makes it way easier to mock objects. Now there you are basically claiming (unless I misunderstand your claims) the opposite.
|
|
| |
"Dynamic languages make testing easier," paints a very broad brush. They have some very nice benefits, but also a lot of edgecases.* An argument might be an int when you expected a float. This can be addressed by casting things with `float()` in every entrypoint, but it's more defensive to require a float and error out otherwise. Either way, problems only appear at runtime.
* An object might be null and this only gets picked up at runtime - using Rust it's been very nice to have to explicitly say when things can be null. I've found it can feel like luck when a complex Python program works correctly for real-world input, but it's also very powerful at getting something done. Tradeoffs.
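For contrast, a minimal Go sketch of the first point (the `scale` function is made up purely for illustration): with static types, the int/float mismatch is a compile-time error rather than a runtime surprise, and the conversion has to be spelled out.

    package main

    import "fmt"

    // scale expects a float64; the signature both documents and enforces that.
    func scale(x float64) float64 {
        return x * 1.5
    }

    func main() {
        n := 3 // n is an int
        // fmt.Println(scale(n)) // compile error: cannot use n (type int) as type float64
        fmt.Println(scale(float64(n))) // the conversion has to be explicit
    }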
|
|
| |
> An object might be null and this only gets picked up at runtime - using Rust it's been very nice to have to explicitly say when things can be null.

This point actually frustrates me greatly about C# 7. At some point, there was talk about non-nullable references being introduced into the language, which has been put on hold for at least another version. Personally, I think that feature should have been in the language since version 1. Every runtime error due to a null reference is one too many.
|
|
| |
I have thought about creating a template struct with a single member that blows up if you try to insert null. Something like this:

    public struct NotNull<T>
    {
        private readonly T val;

        public T Value
        {
            get { return val; }
        }

        // Reject null at construction time; for non-nullable value types the check is simply never true.
        public NotNull(T v)
        {
            if (v == null)
                throw new Exception();
            val = v;
        }
    }

However you can still create a class FOO with a member NotNull<T> xxx, and xxx may be entirely uninitialized. It can then be passed from one method to another until you try to read it, so it basically does not give any guarantees.

I think this is why they pulled non-nullable types from C# - they couldn't figure out what to do in this case. If I were them I would force the constructor of FOO to blow up on exit if xxx is not populated. That would help a little bit, although it still causes problems if a method is invoked from the constructor. So yeah, it's a tough problem.

Maybe they could add a check when a field of type NotNull<T> is read to throw an exception? That way you're still possibly in trouble when reading a field, but at least you can be confident that if you receive a method parameter NotNull<T> you know it's not null.
|
|
| |
I also thought about solutions like this for a while, but frankly, every option sounds like an attempt to fix a leaking dam using duct tape.

Things like this have to be implemented properly. The problem is that this ties into the core architecture of the platform, not only the language - references are basically the most important implicit primitive there is in the CLR (implicit because not even the IL handles them like the other datatypes). I'm not very informed on the subject, but I suspect that without a proper way to declare locals and members non-nullable in the CLR, the matter would get rather complicated - there'd be no possibility to communicate the requirement through method signatures, for example. And since you could call pretty much any C# method from other languages, I'd expect various problems there.

Of course, one option would be to only handle the initialization and assignment checks in the C# compiler (I guess that would be much easier) and simply generate conditional ArgumentNullExceptions in every method, but I suspect that this would be a problem for performance reasons.
|
|
| |
You are correct. The problem here seems to be that the OP's tests depend on implementation details, which of course shouldn't happen. Presumably this wouldn't occur in a language where you could hide these implementation details (private attributes/methods, etc). That said, Python doesn't force you or encourage you to write your tests like this. Just because it's possible to access the implementation, doesn't mean you should.
|
|
| |
Tests always rely on implementation details. The implementation details are what you're testing, after all. The problem is relying on the wrong ones.

You can hide all the details you want in the language, but unless it's a pure function, there are side effects that can be relied on improperly. Oftentimes this is because of hiding implementation details. A GenerateCert function might read from a config file. It hides the detail that the config file exists, but then to test it you need to create the file in the right spot.

Yes, testing Add(int, int) is easy in any language. No one's struggling with those tests.
|
|
| |
That's simply more precarious than making an anonymous object that adheres to a statically-enforced protocol.
|
|
| |
But writing Python is a pleasure :( And writing the same thing in Go feels like wasting my wrists for nothing, especially for throwaways and simple web code...

Maybe one could write Python and turn on a "strict" flag for the module (project?) where you'd have to fill in the types and it becomes Scala-like.
|
|
| |
> But writing Python is a pleasure

This sentiment seems commonly shared, but by no means universally.
|
|
| |
Don't know if you are aware, but Python now supports optional type hints.
|
|
| |
Is there a runtime that enforces them? I've certainly found them useful in restricting autocompletion, but I've still been able to feed in accidental Nones.
|
|
| |
I have yet to try it, but there are things like mypy, and PyCharm apparently has an integrated type checker now.

http://mypy-lang.org/
|
|
| |
You kinda have this already, it's called mypy.
|
|
| |
Scala, where code you've written 6 months ago really does look like hieroglyphics to you or to anyone else right now who doesn't know Scala intimately ;)

I would agree that the Elixir community is currently very homogeneous and therefore very pleasant. To some extent, that is true of many new langs and their early communities. I disagree that this is a premature strike against it. I say enjoy it (and work within it) while it lasts, if you can.
|
|
| |
Not sure why the wink is there :-) but the Scala codebase I'm working on is 3 years old already and it is the cleanest codebase I've ever worked with. Maybe my experience is not representative of the whole ecosystem, I have certainly seen really messy Scala codebases in my ~5 years of experience with it, just like I have seen for other languages too.However Scala is the perfect example for my opinion though - people perceive it as a TIMTOWTDI language, but that's because it got an influx of developers from other communities, the most representative IMO being Java, Ruby, Erlang and Haskell. After all, a determined Java/Ruby/Erlang/Haskell developer can continue to program Java/Ruby/Erlang/Haskell in any language, especially if that language is expressive enough to allow it. Having such an influx gave rise to different opinions and mentalities of how to do things. Speaking of the web crowd, there's a big difference between Scalatra (see scalatra.org) which appeals to a Ruby crowd and relies on un-typed and unsafe APIs with baked in mutation, versus Http4s or Finch, which are very FP and typeful, versus Play Framework which takes a more balanced approach, being like a Rails framework (including the kitchen sink), but with sanity and types. And the Scala community is actually on a convergence path right now, due to better books, tools and libraries. Which for me actually doesn't matter that much, because in Scala I can rely on the compiler such that my code is not so dependent on learning and applying best practices, when compared with other languages. I was NOT saying that Elixir is pleasant because it is unpopular. I am saying that this might be a factor, that I suspect that in such instances the social aspect plays a larger role than we give it credit, especially when dynamic typing is involved, because with dynamic typing you rely on agreed upon best practices that the community must push forward, much more than static language proponents need to do. And personally I think that a language's technical merits should be judged after it gets popular enough to see an influx of developers with strong opinions from other languages. E.g. in a static language a Maybe/Option value forces you to deal with the possibility of a value missing, whether you want to or not, being also useful as documentation that's always up to date. Whereas in a dynamic language you have to rely on documentation (poorly updated because it's the last thing that people update), soft community rules, taboos, etc in order to do the right thing.
|
|
| |
I have been working with Scala for 10 months now. I don't have a problem reading my code or open source / coworkers' code.
|
|
| |
When I look at a programming language, I look at the community, how it gets stuff done, and projects that are noteworthy.

Something about Haskell strikes me as different. Despite the buzz about it, I don't see many projects for it other than shellcheck, pandoc and xmonad, and for two of those, there are better solutions around (sphinx, awesome/i3).

The other thing is the general attitude I've seen with Haskell programmers; many really tie their identity to it. They see programming as a crossword puzzle for them to solve in the short term, not as something other programmers have to read later on. They're not very empathetic to the idea that the runway's dwindling and you have to ship sooner rather than later.

In addition, I found that the Scala / Haskell developers I knew took golang to be quite the nuisance. They find gophers pesky. I think the reason why is that years of Haskell teaches them to overengineer and complicate things needlessly. It's frustrating because they're not aware of it themselves and take offense, even blame you, when you point it out to them.

Maybe I've just been unlucky. In 10 years, I've never had people who so consistently failed to ship, or were as mean and arrogant, as Scala / Haskell programmers. They take the slightest criticism as an assault on their identity.
|
|
| |
> Despite the buzz about it, I don't see many projects for it other than shellcheck, pandoc and xmonad, and for two of those, there are better solutions around (sphinx, awesome/i3)

I tried awesome for two weeks and had one crash. I've run XMonad for 5+ years and had no crashes. Just because they are supposed to accomplish the same things does not mean that they're equal. One is better and it's because of the choice of language.

It's very hard to take your post seriously when you include something like this in it and you don't bother to qualify it even in the slightest.
|
|
| |
> One is better and it's because of the choice of language.
> It's very hard to take your post seriously when you include something like this in it and you don't bother to qualify it even in the slightest.

I looked in your history and yup, you are partial to Haskell. Why do you feel a need to immediately judge? Why did you say "hard to take seriously"? Why didn't you just ask politely for more details? In my opinion, you already had your mind made up. Just like the blog poster did.

> I tried awesome for two weeks and had one crash. I've run XMonad for 5+ years and had no crashes.

That's your personal anecdote. There's no evidence that the language could have prevented the crash. And that doesn't make the window manager "better", which is subjective. What are more people using? Awesome and i3. Primarily because they don't want to deal with Haskell when they could be doing Lua or a simple config.
|
|
| |
> I looked in your history and yup, you are partial to Haskell.

It's convenient that you have no post history to go through, since you decided that the nonsense you're posting shouldn't be tied to your real account. Your bias is astounding, yet you go for the "your mind was already made up" card. I like Haskell but by no means am I tied to any single language and I'm not the one making posts about how all the users of a language seem to have undesirable traits. Even better, you then criticize using "personal anecdotes", when that's exactly what you did in order to condemn the users of two languages. You half-dampen it by saying "Maybe I'm just unlucky", but the entire point of your post is "these two communities suck".

> That's your personal anecdote. There's no evidence that the language could have prevented the crash.

Any ML-family compiler will catch more crap at compile time than any C compiler. Adding Lua on top will not solve any of that. None of that is opinion. It's just how the languages are designed and how they use types. Adding automatic memory management to that makes the gap even wider and in short, that stops the vast majority of crashes that you can get. C + Lua is great for lots of things, but everything is a tradeoff and here the tradeoff is that you're very likely to get crashes. Good intentions alone can't prevent segfaults and most C software is proof of that.
|
|
| |
You're proving my point. If it were node.js, I highly doubt I'd have a person coming out of the woodwork with cleverly placed quotes as if it's a legal threat.

Then blaming me for being biased as if you don't already have a conclusion derived. This is the behavior I experienced - to me, this feels like projection and trying to bully me into silencing what I truly witnessed. It really bothers me and makes me uncomfortable. I'm just stating my experiences.

You wanna know why I use a throwaway? Because people bully and bludgeon anyone who doesn't praise Haskell and/or Scala. And it's been unique to those language communities.

At the end of the day, your reaction is predictable. I'm not reading too deeply into what you say because, like my prior experiences, I think you have your mind made up, and in spite of proof given, you're going to be on the defense / offense.

> Any ML-family compiler will catch more crap at compile time than any C compiler...

None of these things about language mattered since awesome and i3 had language as an afterthought and focused on experience. xmonad literally has haskell mentioned as a feature; the token battlestation pic (which is pretty cool) even has a haskell book next to it. Xmonad even uses haskell for the configuration of it.

Look, I'm not trying to take a jab at hobbyists. It's just that, in my experience, I found Haskell and Scala "hobbyists" to be quite mean and it hurt my feelings a lot, especially when I dropped, far more politely than they ever spoke to me, the news that they're not focused on business goals and work product and just refactoring stuff over and over. They kind of failed at bringing product vision to light; even basic things like defining requirements felt beneath them. Again, maybe I've just been unlucky.
|
|
| |
>You're proving my point. If it were node.js, I highly doubt I'd have a person coming out of the woodwork with cleverly placed quotes as if it's a legal threat.

The irony is that you came out to defend Go from any sort of criticism without any direct argument as to why Haskell is bad. Your post talks about the people who use it and criticizes them for criticizing other programmers, yet that reflects perfectly on what you are doing. And yes, I am also defending Haskell when I see your comment. I'm sure Node.js programmers would defend Node.js. Programmers care about their language choice as much as they care about their text editor/IDE choice, and it's something that we all love to argue about.

>At the end of the day, your reaction is predictable. I'm not reading too deeply into what you say because, like my prior experiences, I think you have your mind made up, and in spite of proof given, you're going to be on the defense / offense.

So you haven't made up your mind already by criticizing Haskell before even evaluating it as a language? It's not making up your mind on it when you based all your judgements on personal anecdotes of Haskell and Scala programmers you've met before?

>xmonad literally has haskell mentioned as a feature

AwesomeWM lists Lua as a feature [1]. Maybe Haskell programmers care about that a bit more strongly, but both groups use their configuration language as a feature. Hyper lists JS/HTML/CSS as a feature as well [2].

Although I would argue the real argument between Haskell and other languages is safety -- just like Rust or Ada programmers would list those languages as a safety feature, Haskell programmers are inclined to do the same. It's pretty similar to saying "code has been tested extensively by our quality assurance team" or "code has been formally verified," etc.

[1] https://awesomewm.org/
[2] https://hyper.is/
|
|
| |
It's my experience and opinion.

Also, I looked through your history; you're a Haskell programmer. Due to the context of my thread, I'd appreciate it if you disclosed your relation so others know.

> AwesomeWM lists Lua as a feature

It's a scripting language on top that's used for configuration.

> It's pretty similar to saying "code has been tested extensively by our quality assurance team" or "code has been formally verified,"

The problem is Haskell developers aren't really getting enough shipped, especially relative to the advocacy I see of it. So how much does code correctness matter? Look, let's say I want to play FreeCiv. It being programmed in C vs Haskell vs Erlang doesn't mean much to me. But imagine if FreeCiv lacked basic functionality or code documentation, and when people mentioned these things they'd get jumped by people who advocate it being programmed in an esoteric language and who look down on those who program C as unsafe, inefficient, etc.

> So you haven't made up your mind already by criticizing Haskell before even evaluating it as a language? It's not making up your mind on it when you based all your judgements on personal anecdotes of Haskell and Scala programmers you've met before?

Hugo vs Pandoc. Awesome/i3 vs xmonad. You want to know why these solutions got more popular than their Haskell predecessors? They had devops-like documentation and tooling down. They picked languages that were easier to read so the community could participate. The discussion was about how to ship features by a version release, not how to force the language into getting some purely internal technical thing working. Users can't see the internals, so they don't care. They want something that's there for them on time and reliably.

In all these years to this day, there's been more advocating of Haskell philosophically on forums than there's been real discussion about getting stuff shipped. Try saying that for node/go/ruby/python.
|
|
| |
I am not a Haskell programmer, in fact I am not very fond of the language and have only ever used it for a single project. I find the language interesting, but I disagree with many of the fundamental design decisions and it is never my first choice for a project.
|
|
| |
On the other hand, I used XMonad for a while and it crashed frequently.

I did submit a bug report and it did get fixed, but there was another crash I couldn't be bothered to try to track down, and I just switched back to the standard Ubuntu shell.
|
|
| |
> I did submit a bug report and it did get fixed, but there was another crash I couldn't be bothered to try to track down, and I just switched back to the standard Ubuntu shell.

Interesting. What was the bug, if I may ask? Do you have a URL for it and the resolution to it?

What do you mean, you "switched back to the standard Ubuntu shell", exactly? XMonad is a window manager, it has nothing to do with shells. On top of that, do you mean you switched back to bash, which is the standard shell of most distros, or do you really mean you switched to the standard Ubuntu terminal emulator? There are a few things that need clarification, as what you've said doesn't make much sense in the context of window managers.
|
|
| |
By shell, I mean window manager.

I'll try and look up the bug.
|
|
|
| |
In over 10 years the most productive people I've worked with were able to learn Scala and ship code in Scala quickly, without worrying about whether they would rather be programming Haskell.
|
|
| |
The best programmers I've met created ML, Scala and Haskell.
|
|
|
| |
I started out liking Go. It looked like a fairly pragmatic language. As I got deeper into my evaluation project (simple api stuff) it felt more and more like cutting wood with a dull saw.

I started out liking Haskell too! But as I moved along with my small evaluation api project it felt more and more like I was trying to cut wood with a gyroscopic laser saw. It worked but it was a lot of fanfare for sawing some wood.

Picked up erlang/elixir. Looked pretty decent. Felt like cutting wood with a Japanese pull saw, so I had to use the vise clamps that came with it. It cut some fucking wood.

~ Ron Swanson out.
|
|
| |
My impression of Elixir and Erlang is that they are the only dynamic languages I'm aware of that seem to have dynamic typing for solid technical reasons. Many languages are dynamic just for user experience reasons. But the way the BEAM VM handles hot code swapping, concurrency, and memory management, etc, required using a dynamic language. (As far as I'm aware?)
|
|
| |
I think it's because of the built-in nature of message passing in the language. It's just easier to send around messages to processes. Hot code swapping would be possible with other JITed languages (BEAM isn't a JIT).

Also, since Erlang has excellent pattern matching (even on binary data!) and a supervisor system (OTP), errors are easy to recover from. Erlang takes a very pragmatic approach of building tools that deal with errors by recovering from them instead of exhaustively trying to prevent them.
|
|
| |
I had to google "Japanese pull saw".

But as someone striking out into the Elixir camp/world, this is also what I'm seeing.
|
|
| |
I hate to sound like the Rust evangelist strike force... I really do. But your complaints are exactly what it would solve... Sigh, I hate to say this, I really do. But here goes...

So have you checked out Rust?
|
|
| |
> I hate to sound like the Rust evangelist strike force... I really do. But your complaints are exactly what it would solve...

Rust doesn't have lazy evaluation, or a particularly extensive standard library. I'm not aware that Rust has heap or thread profiling any better than Go's. For everything else, though, yes.

I found Rust pretty easy to learn (coming from Java with a bit of Python background); I haven't mastered lifetimes or coherence, but I manage my day-to-day work without needing to. No GC means no GC pauses, and if there are big slow drop cascades, they are at least confined to one thread. Rustfmt seems to be mostly reasonable. Library versioning is done properly. The type system provides generics, sum types, and strong, if unconventional, control over side effects. Godoc uses source order and Markdown. Struct fields have to be initialized. Boilerplate is minimal; in particular, checking errors is one character, and sorting detects and uses an existing ordering, or takes a single function to define one.
|
|
| |
Rust's library ecosystem is expanding very rapidly. I think it will overtake Haskell's in months rather than years, if it hasn't already.
|
|
| |
Also, does anyone fancy designing a Rust Evangelism Strike Force T-shirt? Because I would absolutely buy one.
|
|
| |
How robust/mature is it as of yet? More importantly, how are compile speeds and how much of a priority are they to the compiler maintainers? I too find it quite promising in terms of "a language aiming for the benefits of Go with the expressiveness and added easier correctness of FP".

But "promising" doesn't mean I'd replace the few use-cases where Go currently shines for me (wouldn't recommend it for any-and-all development at all, but there are select cases where I currently absolutely wouldn't use anything else) as these are those few programs where I really want to write (and later, read) exactly step-by-step "almost machine instructions" and where the only "expressiveness" needed is already fully afforded by Go and quite well at that. Dev-related mostly bulk-file-processing-based custom tooling (not bad to fire off sizable amounts of automatic background transformations from one weird format to another on file-system events etc.) .... filters through which to pipe vast/continuous data streams with just-some-processing over it .... providing out-of-process certain "low-level, must be lightning fast" tasks and logic with IPC for a higher-level codebase .... it's a great tool for such cases, especially given how simple the syntax is. I.e. the basics are grasped very easily very quickly and you don't have to recall "its pattern matching syntax and quirks, its generics syntax and quirks, its etc etc". =)

I'm perplexed when teams choose to express their entire domain model or startup in Go, however.. fair game, maybe I'm missing something there
|
|
| |
> robust/matureNeeds more context. It's more mature for some things than others, like many young languages. > how are compile times Not Go, for sure, but not the worst either. > how much of a priority are they Getting them down is a high priority, but has taken some time to make a dent in. The last few releases have all posted speed ups, and "cargo check" in the last release helps. Incremental recompilation is coming, you can try its initial implementation on nightly today. There's a lot more to do, but users really want improvements, so it is and will continue to be actively worked on.
|
|
| |
> Incremental recompilation is coming

Good to hear; when that is in place, compile times matter a whole lot less.
|
|
| |
> How robust/mature is it as of yet? More importantly, how are compile speeds and how much of a priority are they to the compiler maintainers? I too find it quite promising in terms of "a language aiming for the benefits of Go with the expressiveness and added easier correctness of FP".

I never get the compile time argument. I code a big Scala application and most of its time is spent in integration testing; the 10 minute test suite would probably still run 9 minutes and 30 seconds on Go. We heavily rely on the database and cover a lot of concurrency/parallelism things in our tests - some things which we would need to test in Go, too. The compile time argument always looks good at first, but later on it's just a dumb way to prefer a language, since it never matters. (Actually, even the Go maintainers didn't care about compiler performance for a while.)
|
|
| |
Well, the thing is, with compile times of 2 seconds I can run my unit tests as fast as possible. At the same time, I'd just use a mock database for most of the tests.

It's great to be able to compile applications like kubernetes in 2 minutes.
|
|
|
| |
More to the point, Rust is trying pretty hard on the compile time front. They've almost got incremental compilation out the door, on top of this.
|
|
| |
Does kubernetes take 2 minutes to build? That seems super long. I can compile juju (1mil LOC) in like 15 seconds.
|
|
| |
I haven't compiled it myself but I've read it's something like that. Though I'm not 100% sure.
|
|
| |
With all dependencies (first-time compile) it takes way longer than 2 minutes.
|
|
| |
My comment is aimed at the OP, who has it very clear in their mind what Go's shortcomings are.

On the contrary, you haven't pointed out an objective issue that you have with Go. You offered some harsh remarks about the language but backed them up with praise. I'd say you love the language, but apologetically so. You shouldn't be; it's a great language. If ever you decide it's not for you, then look at other alternatives, Rust being one of them.
|
|
| |
> I'd say you love the language, but apologetically so. You shouldn't be; it's a great language.

I just think for me it absolutely shines for certain use-cases and sucks for many others, where I simply won't use it. The question then becomes "why Rust over [Haskell&friends]", not "why Rust over Go" ;)
|
|
|
| |
Small update on that thread: Tokio has now had an initial release, with a lot of docs. Still much more to do, of course :)
|
|
| |
Most Haskellers are fond of Rust. However, it is certainly not a Haskell replacement; it has substantially less expressive power. This is, arguably, necessary for its impressive memory model (no GC and memory safe allocation), but it does hurt programmer productivity in many cases. I always feel constrained when I use Rust, although I can't fault the language for being that way. Rust is well-designed, but even good designs come with caveats. I prefer Haskell's caveats for most common programming tasks.

Personally, as a frequent Haskell user, I'm looking forward to replacing embedded C/C++ with Rust as soon as they get the AVR toolchain figured out and there's better support for common ARM uCs.
|
|
| |
Luckily, LLVM 4.0 added support for AVR, and hopefully Rust can take advantage of this soon
|
|
|
| |
Or Swift - Swift sneaks in sum and product types via a c-structy syntax. Once you see the type system underneath the syntax (which was devised not to scare the Objective-C horses IIRC) it’s quite pleasant.
|
|
| |
I don't think anything in 'The Bad' is unfair, and those points are certainly not written from a position of ignorance.

Those are things that suck about Go; it's not specifically that they suck about Go when compared to Haskell; they just generally suck (particularly the type system stuff). However, I don't think that the situation is so bad I would go as far as to say, "Other than that, I will probably never choose to use Go for anything ever again..."

I maintain that while Go might not give smart programmers the flexibility to express their FP dreams, it has a grungy practicality both for developing and maintaining code that is quite effective. Introducing Go and building applications with it in a team is easy because it's quick to pick up for the whole team, regardless of background, it's simple to use and it's (relatively) difficult to shoot yourself in the foot in terms of distributing binaries or developing applications that run at modest scale. When Gogland (the IDE by JetBrains) comes out of EAP, it'll even have a modestly good professional IDE with integrated debugger (no Atom, you don't count when your debugger never works on any platform).

...but hey, if you don't like it, don't use it. Haskell is pretty great too.
|
|
| |
The thing with Go is that it is disappointing what Google is capable of in terms of language design versus what Apple and Microsoft have done in this field.

Sure, there were languages with such type systems before, and we managed to deliver our work with them. However, I don't want to work in 2017 as I used to work in the mid-90's, when templates were an experimental feature in C++, or the only MLs we knew were Caml Light, Miranda and SML. The world has moved on.
|
|
| |
> is that it is disappointing what Google is capable of

Man, how long will this meme survive? Go merely originated with some Google developers; it's not in any way "Google's answer to Apple's and Microsoft's strategically-important and accordingly-subsidized-and-evangelized-and-invested-in languages". Just picture a handful of (previously "accomplished" in the field, as it turns out though) guys thinking "this company has a wide mess of Python etc scripts that should really be C programs, except for the problems this would pose"..
|
|
| |
Those developers are being paid by Google and their actions within this context relate to the public image of their employer.

If Google didn't agree with their actions, or did not pay to increase the team size, they would be developing the language outside Google's walls.
|
|
| |
You may disagree, but Google's relationship with Go looks a lot more distant compared to C#-MS, Java-Oracle, Swift-Apple. Here Dart-Google looks more appropriate. Those Dart people always keep telling us about AdWords' usage of it. Google organizes Dart's conferences, whereas GopherCon doesn't even have Google as a sponsor.

Go looks far more independent from Google in branding, sponsorship and references from official Google products.
|
|
| |
That being said, Go is one of the four recommended languages at Google: Java, C++, Python and Go.
|
|
| |
Especially because Go is in most ways less powerful than Java – a native compiler for Java would have been more useful than this.
|
|
| |
What do you mean by "powerful"? How is a language powerful? Or not?
|
|
| |
From Paul Graham, and one of the best essays on the subject ever written: http://www.paulgraham.com/avg.html

As long as our hypothetical Blub programmer is looking down the power continuum, he knows he's looking down. Languages less powerful than Blub are obviously less powerful, because they're missing some feature he's used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn't realize he's looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.

When we switch to the point of view of a programmer using any of the languages higher up the power continuum, however, we find that he in turn looks down upon Blub. How can you get anything done in Blub? It doesn't even have y.

By induction, the only programmers in a position to see all the differences in power between the various languages are those who understand the most powerful one. (This is probably what Eric Raymond meant about Lisp making you a better programmer.) You can't trust the opinions of the others, because of the Blub paradox: they're satisfied with whatever language they happen to use, because it dictates the way they think about programs.
|
|
| |
So when you say a language is "powerful" you are talking about its level of abstraction? Because then we should probably try to call it something else so as not to confuse "power" for technical ability to deliver performance or productivity.
|
|
| |
Definitely, the measurement of how "powerful" a language is corresponds directly to its level of abstraction.

It does not necessarily mean that any dev can code faster – or even similarly fast. But it means that much of refactoring usually is easier (as you'll have less duplicated code), that you get more guarantees, that you have less boilerplate, etc.
|
|
| |
Then I'm not sure if "powerful" is necessarily what I look for in a language.

I have three priorities when programming:

1. Correct
2. Understandable
3. Fast

Priority 1 is non-negotiable (code should be correct, or the rest is moot). Priority 2 is negotiable in situations where priority 3 isn't; however, I've had this happen perhaps a handful of times in the last 25 years.

Being able to produce code quickly is nice, but unimportant if the code isn't likely to be correct, understandable or fast. Productivity in production code is measured in how you achieve the above. Likewise it is nice to have code that is easy to refactor and doesn't require duplication, but duplication is more permissible than the dogmatic avoidance of duplication when it is detrimental to understanding or speed. It can also lead to unnecessarily bothersome abstraction or spending too much time creating abstraction where the domain doesn't naturally lend itself to abstraction. (Note that points 2 and 1 tend to dictate that duplication should be avoided: it isn't about the typing saved, it is about correctness.)

Priority 3 could be elaborated as "quick response times and avoidance of unnecessary resource use". Any idiot can write code that does X. A craftsperson writes code that does X well. A valuable craftsperson can do X well and quickly. Be a craftsperson.

For rapid prototyping you can throw the rules slightly overboard, but when you write production code things count. Abstractions and language mechanisms that can give you static guarantees are good when they help you achieve the above. They are counter-productive when they conflict with any of the above. It is easy to become counter-productive when you try to satisfy some dogma.
|
|
| |
I mean that you will need to write more, and more duplicated, code to express the same thing, and it'll be less clear and less safe.

Generics, and annotation processing with code generation integrated in the compiler, are two example features supported by Java that improve this. Haskell obviously is a lot better even.

Additionally, it's also a question of whether, in the language, any state of the program is immediately obvious from the code – or if the syntax hides things (such as Go's multi-return errors vs. Java's exceptions, although Java could use an "ignore" keyword to ignore any error happening in a block or statement).

A powerful static type system can express everything a dynamic type system can, and more. (That's why I tend to err towards static typing.)
|
|
| |
Very few people who have tried errors by value or Either monads (they are basically the same concept in some sense) think that exceptions are better. Exceptions cause your code to be non-linear. They also make errors seem like something exceptional (pun intended), whereas you should really carefully handle each error and decide what it implies.

Also, just something I noticed after almost two years of Go: thanks to library quality I usually end up writing much less Go than Java/Scala for similar applications.
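For illustration, a minimal Go sketch of that errors-as-values style (the readPort helper and the PORT variable are made up, not from any particular codebase): the error is an ordinary return value, and each call site decides what a failure implies, so the control flow stays linear.

    package main

    import (
        "fmt"
        "os"
        "strconv"
    )

    // readPort returns the error as an ordinary value alongside the result.
    func readPort(s string) (int, error) {
        port, err := strconv.Atoi(s)
        if err != nil {
            return 0, fmt.Errorf("invalid port %q: %v", s, err)
        }
        return port, nil
    }

    func main() {
        port, err := readPort(os.Getenv("PORT"))
        if err != nil {
            // This call site decides what the failure implies: fall back to a default.
            port = 8080
        }
        fmt.Println("listening on", port)
    }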
|
|
| |
I think Go has made a lot of people re-think their assumptions about what makes a language a good language.

I have yet to figure out precisely what it is that makes me more productive in Go (which I have relatively little experience with) than Java (which I've programmed since the first beta).
|
|
| |
Make sure to write some blog post about it when you do! I'd be very curious to hear what it is.
|
|
| |
At least one thing I can say right away is that they have taken care of a lot of small hygiene factors. Things like the rapid compiler, the "go" command that has a lot of nice things built in, the way they did away with formatting discussions (although I can't articulate how it differs from other languages that try to do the same... probably the tooling).

The Go community also prefers libraries over frameworks. Coming from Java, and its humongous, cancerous frameworks, that's a BIG selling point. I notice that when someone posts a link to a "Go framework" I get slightly ticked off and I feel resistance even before I've clicked the link.

As for the language: I don't know yet. It just works for me. :)
|
|
| |
Go does not have Either monads, or real errors by value. It has a multi-assignment syntax similar to tuple return, but nothing else.

For Java, with a tiny library, you can have real Either monads: https://github.com/spencerwi/Either.java

I use them in lots of code. For Go, you end up with an assignment syntax where you have to manually check, and get no compiler safety. At all.
|
|
| |
You do have errors by value. You return, for example, two values: the response and the error.

What do you mean by no compiler safety, exactly?
|
|
| |
I mean that the compiler does not check whether I'm handling the error. I can easily just ignore the error and work with the value, and therefore have code that breaks when the value isn't there.

Any code that compiles should not be able to ever segfault or have NullPointerExceptions or similar.
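For example, a minimal sketch using only the standard library (the file name is arbitrary): both discarding the error with the blank identifier and dropping the error returned by Close compile without any complaint, and the nil dereference only shows up at runtime.

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        f, _ := os.Open("config.json") // error explicitly discarded; compiles fine
        // If the file is missing, f is nil and the next line panics at runtime.
        fmt.Println(f.Name())
        f.Close() // returns an error that is silently dropped; the compiler says nothing
    }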
|
|
| |
Well, I never had segfaults.

Anyway, it's true, I miss pattern matching on errors. However, in practice, you just get used to it. I haven't had a situation in the past half year where I'd forget to handle the error / check it.
|
|
| |
> However, in practice, you just get used to it. I haven't had a situation in the past half year where I'd forget to handle the error / check it.

But with the same argument, you could just as well use PHP7. And this still doesn't allow you to replace all stdlib datatypes with your own implementations without using custom precompilers.
|
|
| |
Excuse me, what do you mean by 'replace all stdlib datatypes with your own implementations'. Why would I do that?
|
|
| |
To interface via an FFI with a game engine, which uses a custom allocator, and you need to pass datatypes between it and your own code? (As I've done in Java!)

To replace existing implementations with your own, because you want a LinkedArrayList (a linked list of blocks)? Or because you want a TreeMap instead of a HashMap, or a Set with a different implementation? Because you want to implement your own Either type to better handle return values (see https://github.com/spencerwi/Either.java )?
|
|
| |
This is an interesting point actually, because it gets raised every time and I actually thought the same before starting with Go.

Thing is, in practice you usually don't need special generic containers. And when I actually need one, I'll just code up a non-generic implementation. When I need a complex type system I'll use Scala.
|
|
| |
Well, I use them a lot – I've written lots and lots of generic code in my projects, and reuse it across many dozens of projects.

This allows me to have fully reactive collections in Java, lazy reactive collections, and more. I can just connect to a socket transmitting updates with netty, write a transformer, apply those to a lazy reactive collection, and have the changes directly appear in the list in the UI, fully threadsafe. Writing those things again, and again, every time, would be far more work, and increase the time I need to write code massively.

I've actually used Go a bit, and tried reimplementing several of my projects in Go, but either it wasn't easily possible, or I had to translate dozens of generic classes into hundreds of non-generic versions, with lots of duplicated code. And then when I found a bug, I fixed it in one place and had to copy the fix to hundreds of files.
|
|
| |
I'd love to take a look at your codebase if it's hosted on github (or something comparable).
|
|
| |
Currently not, as I only used it in smaller projects in the past.In the future I’ll use it likely with QuasselDroid-NG, an app of mine that’s very early in development, and for which I developed it. The use case there is that we have a list of IRC messages, constantly updating and adding messages at any point of the list, and we want to update the UI and animate. The client can also cache, or preload, messages at any point, or throw them away. Additionally, users can filter, search, or choose to disable some messages. Other messages might get merged, or synthetic ones inserted. So I have a backing store of messages that can constantly change, I have on top of that a stream of transformations taking part, and on top of that an interface presenting a single, consistent Java List, but also triggering the relevant updates on Android’s recyclerview. Most importantly, you can just add a message to the lowest list, it will be properly filtered, transformed, animated, and displayed, and this properly threadsafe across UI thread, backend thread, and filtering threadpool, while being faster than the solutions Realms and co provide for such updated UI.
|
|
| |
I'm not sure if I understand, are you actually using Go for your UI backing or using it as a backend for your application?
|
|
| |
The application described above is Java. It runs on Android and desktop Java with JavaFX.

I tried to reimplement the desktop version with Go, due to native UI bindings. And failed.
|
|
| |
Well, I don't see any problems with the logic part. Receiving, filtering, transforming, and notifying someone else about incoming messages (would probably do it in a few threads, sending messages between each other).

For the UI part: no wonder, Go, at least currently, is useless for UIs.
|
|
| |
> (would probably do it in a few threads, sending messages between each other).The problem is that I try to abstract these things away – in Java, I have a general framework, and just do var c = new ObservableCollection<Message>();
var f = new AsyncObservableFilter<Message>();
f.setSource(c);
f.addFilter(new DayChangeMessageFilter());
f.addFilter(new CollapseJoinLeaveMessageFilter());
f.addFilter(new RemoveIgnoredMessagesFilter());
var adapter = new ObservableAdapter<Message, MessageViewHolder>();
adapter.setCollection(f.getResultCollection());
and then I can do
c.add(new Message(new Date(), "cube2222", "Well, I don't see any problems with the logic part."));
And this works for any type – I can use the same code for observable collections of IRC networks, of channels, etc. And I do use it for those. And if I change the filters, it also does the minimum amount of work necessary to update the list, and not a single CPU cycle more. I’m using it with over 11 different types in the Java version, and I’d have to maintain 11 identical versions, differing only in their types, in the Go version. You can understand how much of a pain in the ass this is. Oh, by the way, this also abstracts away the entire thread handling, while still doing it in a reasonable way. And the backing collection doesn’t have to be a normal ObservableCollection storing data in memory – I also have ones using SQLite or caches as backends, and I can add listeners for scrolling up or down to load data, or to unload it if it’s not needed anymore. In Go, I’d have to manually reify all of these, and if I find a bug in one, I’d have to update all of the others, too.
|
|
| |
Of course I can't say for 100%, as I don't know your codebase. But I think you could easily do this with Message being an interface defining the common functionality of your messages (probably something like render(), timestamp(), id()). Then you could have your DayChangeFilter use the timestamp, the join/leave filter do a type assertion, and the remove-ignored filter check the id. If the genericism for you is also about abstracting away the source of the messages: in Go you usually abstract sources as io.Readers. Each filter can easily be an intermediate channel which only forwards the messages that pass. Or just write a filter(func (m Message) bool) function which abstracts that away, as sketched below. Though I understand you may want to use your reactive collections and streams, and that isn't really a case for Go. Have I understood your use case correctly?
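Here's a minimal sketch of what I mean, assuming a hypothetical Message interface (Render/Timestamp/ID are just placeholders for whatever your messages expose, not anyone's actual code) and a plain slice-based filter helper:

package main

import (
	"fmt"
	"time"
)

// Message is the hypothetical interface suggested above; the method set is
// an assumption for illustration only.
type Message interface {
	Render() string
	Timestamp() time.Time
	ID() int
}

// filter keeps only the messages for which keep returns true.
func filter(in []Message, keep func(Message) bool) []Message {
	out := make([]Message, 0, len(in))
	for _, m := range in {
		if keep(m) {
			out = append(out, m)
		}
	}
	return out
}

// joinLeave is one concrete message type, used to show the type assertion.
type joinLeave struct {
	id int
	at time.Time
}

func (j joinLeave) Render() string       { return "user joined/left" }
func (j joinLeave) Timestamp() time.Time { return j.at }
func (j joinLeave) ID() int              { return j.id }

func main() {
	msgs := []Message{joinLeave{id: 1, at: time.Now()}}
	// A "collapse join/leave"-style filter expressed as a type assertion:
	visible := filter(msgs, func(m Message) bool {
		_, isJoinLeave := m.(joinLeave)
		return !isJoinLeave
	})
	fmt.Println(len(visible)) // 0: the join/leave message was dropped
}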
|
|
| |
> Generics, and annotation processing with code generation integrated in the compiler, are two example features supported by Java that improve this.
I think it is fair to point out that generics were not part of the first Java spec. They were added in version 5, see [1].
[1] https://en.wikipedia.org/wiki/Generics_in_Java
|
|
| |
Of course not. Early Java was comparable to today’s Go in many ways, and the generics story is part of that.
|
|
| |
I feel like some of the criticism is unwarranted, specifically:
> The tooling is bad
I feel like the tooling is really impressive, considering the age of the language. Remember that Haskell is something like 25+ years old. Go has done quite a bit in a short time - I can only hope it will get better too.
> Zero values are almost never what you want
I've always felt the defaults to be spot on. Anyway, it's my responsibility to initialise these values, even on struct fields.
> A culture of “backwards compatibility at all costs”
I agree the current (package management) landscape is dire, but this should change, hopefully this year, with godep. As it is now, I have found success using glide for dependency management, so that's what I would recommend for now.
Apart from that, I can agree on a lot of things - I'm specifically annoyed by the whole generics thing. The way I read the people involved with the project is: "it would be nice to have, but the implementation is awkward, so we won't admit to it being nice to have".
|
|
| |
> Remember that Haskell is something like 25+ years old.
Yet, I'd argue that tooling is still one of the worst aspects of the language. The whole cabal/stack thing is a mess.
|
|
| |
Yet at least we can use proper versions instead of Git URLs that are exposed in source code.
|
|
| |
And yet I can be up and running and compiling with Go faster than Haskell every time, ready to ship production binaries if I want. With Haskell the process for me has repeatedly been: "okay I'm going to follow this tutorial...okay I need to install it...okay cabal is complaining about versions or exceptions....okay let's try stack....okay this example needs some includes...okay I don't have quite the right ver..." Then 4 hours later I remember that I can't deploy any of this to my servers, give up, and go back to waiting for someone to make a sane "get started" bundle for an FP language. Haskell's package management might exist - but it sucks, and I'll take go get/govendor over pretty much anything currently out there.
|
|
| |
This is one of the things I love about Go. Basically any random tool written in Go I can have installed and runnable in 3 seconds from the time I know the GitHub URL. All but the largest applications are made to be compatible with 'go get', and so most of the time it just works.
|
|
| |
To be honest there's a lot of old information out there about Haskell's packaging. Today, I just use stack and it's completely fire-and-forget. The default template will build you a little executable and a companion library, and you can just modify it from there.
stack new my-project
cd my-project
stack build
stack test
stack exec my-project-exe
stack repl
|
|
| |
> And yet I can be up and running and compiling with Go faster than Haskell every time, ready to ship production binaries if I want.
You can. Today. But leave that code in a repo for a month, and it will stop compiling, because some of your dependencies got updated.
|
|
| |
Not any of my code (or any of the projects I know) - the vendor/ solution is not "smart", but it's simple and comprehensible. My dependencies are locked and explicit - if you can clone my repository, then you also get the dependencies. And no, I really don't care about wasted disk space measured in megabytes.
|
|
| |
Incorrect. You apparently stopped at 'that looks like a URL!' and dreamed up the rest of the implementation to fit some narrative.
|
|
| |
How much have you used Stack with Haskell? The kind of version-related problems you’re talking about are exactly what it’s supposed to fix, and my experience has been that it mostly Just Works, at least if you’re not trying to do something unusual like, say, cross-compiling. Have you found otherwise?
|
|
| |
These import paths aren't really URLs. They're local import paths, where Go's out-of-the-box tooling additionally allows for far simpler package install/update flows when one chooses to have these local paths replicate remote-repo URL paths. One doesn't have to, but it essentially turns "any VCS" into what Hackage/Stackage would be in the Haskell world. Sure, no curation, but `go get repo-url` has its charms too for quick iterations and experimentation. Vendoring is also entirely trivial (have a fork, update it from the original when it seems sensible to do so, undo when it turns out not to be). And guess what, with that you have "reproducible builds". I like the Hackage/Stackage model, too, as long as there are always some folks tending to those repos. But the `go get` model is a fine workflow too.
|
|
| |
Those URLs will change if upstream happens to change to another SCM provider or tooling. Whereas in any sensible package management system, the package names and versions would be unaffected by such changes.
|
|
| |
This is a fundamental misunderstanding of how Go dependencies work. Dependencies are simply package names. It just happens that the names are meaningful and tell you, the developer, where you can get the package. Using vendoring and/or godep/glide, you then strictly version and manage your dependencies. `go build` doesn't just run `git clone` or something.
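To make that concrete, here's a minimal sketch, assuming Go 1.6+ vendor/ support and a hypothetical github.com/example/lib package (the layout in the comments describes the standard resolution order, not any particular project):

// Project layout (hypothetical):
//
//   myproject/
//     main.go
//     vendor/
//       github.com/example/lib/   <- pinned copy, committed or managed by godep/glide
//
// `go build` resolves the import below against vendor/ first, then GOPATH;
// it never clones anything from the network by itself (only `go get` fetches).
package main

import (
	"fmt"

	"github.com/example/lib" // an import path that merely happens to look like a URL
)

func main() {
	fmt.Println(lib.Version) // hypothetical exported identifier
}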
|
|
| |
So what is the approach when package X happens to change its Git path, change to another SCM server, or migrate the project to the next cool SCM tool? On any sensible package management system, as a user of said package, I wouldn't need to change anything.
|
|
| |
Vendoring is how you do dependency management with Go. Using vendoring you would also not need to change anything.
|
|
| |
So you vendor revision a04377dfebbf79f0ff43b5da2b21b835f1495803, and then, by the time you decide to upgrade to a newer revision, the project has moved from GitHub to GitLab. How do you upgrade with zero code changes?
|
|
| |
stack isn't a mess at all; it's very well-designed, robust, and stable. cabal is a mess (or, say, a furball) that is properly "handled" by just using stack.
|
|
| |
I've tried to use stack many times; I think I'm up to 5 attempts now? Each time, I've invested many hours trying to make it build; trying to make it find/fetch GHC; trying to make it find/recognise the GHC it's just downloaded; talking to others in IRC/GitHub issues/etc. for help; reading through the source code; trying to hack around brokenness (stack filling up the temp dir, GHC misusing bash, etc.). Each time my patience has worn out and I've just used Cabal instead. I know that stack must work on some people's machines, and that's great for them. I don't understand at all why it's regarded as some sort of simple, works-everywhere, "end of cabal hell" thing. Haskell's infrastructure is probably its biggest pain point; Cabal isn't great, but it's at least predictable enough and old enough to have stable, reliable workflows. Stack's approach seems to be solving problems by bundling the solutions as yet another stack feature; that doesn't help when stack itself flat-out doesn't work :(
|
|
| |
Are you on NixOS by any chance? There is a problem with /run filling up that you can fix by using TMPDIR=/tmp stack whatever
I switched back to Arch a while back, and I'm not sure if this problem still exists on the NixOS side. (Of course, you can always choose a bigger /run size in configuration.nix.)
|
|
| |
> Are you on NixOS by any chance?
Yes. That's the issue I alluded to with "stack filling up temp dir". Off the top of my head, other issues include:
- Bootstrapping GHC from old versions. This tries to run a script with a non-existent `/bin/bash` shebang. A more portable version would be `/usr/bin/env bash`, but since bootstrapping relies on out-of-date versions, this can't be changed. Since such paths can vary depending on the OS, the right thing to do is to use what the OS provides, rather than making yet another attempt at cross-OS/distro packaging for a one-off app.
- When using "Nix integration" to try and avoid GHC bootstrap hell, the "lts" releases written in various projects' "stack.yaml" files are assumed to correspond to a key in the `(import <nixpkgs> {}).haskell` attribute set. Except that a) different projects use a variety of these identifiers, which may or may not correspond to actual keys, and b) nixpkgs doesn't actually contain any of these identifiers anymore: https://github.com/NixOS/nixpkgs/issues/14897
|
|
| |
BS. Tell me how to compile individual packages from a .cabal file in stack. Tell me how to pass configure args through stack to cabal. Tell me how to get stack to install an executable that isn't just named after the package (e.g. if I attach a version number to the executable). These are reasonable things you can't do because stack is immature.
|
|
|
| |
> I feel like the tooling is really impressive, considering the age of the language. Remember that Haskell is something like 25+ years old. Go has done quite a bit in a short time - I can only hope it will get better too.
That's like hiring a toddler who can do sums for an accounting position. Showing promise is good, but not the same as being good right now.
|
|
| |
When a decent software engineer builds a project, they take into consideration not only the status quo of the technology, but also its promise, momentum, and community development speed. You don't want to pick a stack with waning manpower, because you'll soon be left with legacy software on your hands. Hence, I do think that, for most server-side projects, the "promise" (as you call it) is very relevant and a true deciding factor.
|
|
| |
Yes, but that's a red herring. Outside of maybe the JavaScript world, the available tooling isn't divided into "immature" and "near legacy". Haskell itself is a good example of a mature language still growing in popularity.
|
|
| |
All I'm saying is that it's unfair to discredit the argument that Haskell has been around for 25 years and hence has had more time to evolve a more complete tooling offering. It is a valid argument. If Haskell didn't exist and had only been launched a few years ago alongside Go, what would the status of its tooling be today? That would be a fairer comparison. Obviously one can't measure it objectively, but you do have a comparable case with Elixir, I believe. And it's clear that Go surpasses Elixir in traction. So let's not discredit what Go has achieved in such a short time. It definitely has more happening for it now than Haskell or Elixir.
|
|
| |
This is sounding like a golang version of "Leave Britney Alone". The author wasn't accusing the creators of Go of being incompetent and unable to produce good tools. All that was said is that currently, the tooling is not as good as Haskell's. That Go is a newer language is not a valid argument for rebutting that claim - it's only an explanation for it.
|
|
| |
This article's comments on code generation and generics are a good exposition of what I don't like about Go: it seems internally inconsistent, as though there is one set of features for the Go development team, and a lesser set for everyone else. The nice thing about Haskell and the Lisps is that they are consistently the same thing all the way down through every layer of abstraction, even at the bottom. There is no point where you reach the "magic" or "forbidden" layer where the paradigm changes and it turns into a different language. The problem with Go is that the code we programmers get to write feels like a DSL on top of the "real" Go, which uses constructs and "magic" we aren't allowed to take advantage of.
|
|
| |
TL;DR: "Go isn't like Haskell, and that means it's not as good."
I get that we all have favourite languages, but for me it is not amazingly helpful to try and compare them like this. I'm sure if you're a Haskeller and you're eyeing up Go, being forewarned might be helpful, but here's another idea: don't compare. Just use. Take it at face value. Figure out what becomes easy, what becomes hard. I came at Go from 10+ years of professional Ruby development, and it was a tough punch to the stomach at times. I still think in Ruby more often than I think in Go. But I know the two aren't really comparable for most things.
|
|
| |
I don't understand. How can they not be comparable? They're the same type of tool, built with the same goal (writing any kind of software). In fact, Go itself was born because its creators felt the current languages weren't good at handling the current computing environment - comparisons with other languages are at the core of its existence. Finally, why even choose Go if it's not because it has some comparative advantage? Might as well stick with what we already know.
|
|
| |
They're not the same type of tool. Go is mainly aimed at distributed computing environments and infrastructural tools. I really don't see a sensible Consul/Kubernetes/Docker alternative being written in Haskell.
|
|
| |
I don't see why not; Haskell has green threads, channels, and everything that Go has. The only argument I can see being made is that laziness and the current GC implementation aren't well suited to it, and even then, those problems can be addressed with strictness pragmas and a different GC, respectively.
|
|
| |
> I really don't see a sensible Consul/Kubernetes/Docker alternative being written in Haskell.
The only reason for this is the unpredictable latency caused by the current garbage collector in GHC. This should be avoidable once GHC has linear types [1] and someone rewrites its GC to take advantage of them.
[1] https://news.ycombinator.com/item?id=13866787
|
|
| |
> Don't compare. Just use. Take it at face value. Figure out what becomes easy, what becomes hard.
Isn't this precisely what the author did? The bad parts he mentioned were mostly "this is stuff that the compiler could've caught, but it didn't and thus gave me a hard time". If we were to carry this argument further, it would probably just boil down to the familiar debate of how strict a type system should be. The author is just a proponent of expressive type systems, and he plainly puts it in the title: "a Haskeller's perspective".
|
|
| |
I learned Go in the manner you describe. It's still a disappointing language. I don't understand the powerful Stockholm Syndrome which appears to grip most Gophers. For example, compared to MRI, Go has better GC and execution speed, much less flexibility, better concurrency primitives but not better concurrency libraries (EventMachine), and is incredibly verbose. If you refuse to compare languages, then you are asserting that at least one of the languages is somehow magically incomparable and thus criticism-free, and Go is not so special or good that it should get any pedestal-driven treatment.
|
|
| |
What was hard to get used to when you came from Ruby? I came from Java, but it should be said that the Java style I've been using since the mid-2000s (some functional influences and a lot of experience-driven minimalism) made learning Go really easy. I should be missing immutable types and types that can only be instantiated if their state is consistent, but for some reason I didn't. Perhaps it is that the Go mindset is just very easy to adapt to? One thing I would say, though, is that you can feel that Go was a more deliberate language. Unlike most popular languages, it wasn't really a language that grew out of inexperience or as a learning exercise for the language implementers. The authors already had a fairly good idea of what they wanted to accomplish, and they had the attention to important details that others tend to miss due to lack of experience.
|
|
| |
I found this comparison really interesting. Yes, it is snarky and condescending at times (someone really needs to write the corollary to that Blub essay, about programmers spending more time working on powerful abstractions and purity than on real work), but it's useful to have an outside perspective - sum types seem nicer to me than returning (type, error), the criticism of containers is fair, etc.
|
|
| |
I guess you have looked at Crystal then? Much of the same expressibility and stdlib as Ruby, but natively compiled and with channels.
|
|
| |
The author states:
> "Strict evaluation is typically better for performance than lazy evaluation (thunks cause allocation, so you’re gambling that the computation saved offsets the memory cost), but it does make things less composable."
Can anyone tell me what a "thunk" is in this context, and also why it causes performance problems? The article linked in that sentence results in a 404.
|
|
| |
A thunk is an unevaluated value. Better: it's an unevaluated piece of code that returns a value when it's required. For instance, you can write
x = [1..]
which gives you a list of all the positive integers. Because of laziness (which thunking enables), this line runs just fine. The compiler allocates a thunk for x, saying "okay, I know what x is now". But x is never fully evaluated (because we don't really have enough memory!): it's only evaluated as far as needed. Indeed, typing
head x -- this is 1
makes the interpreter print the first element of the list. The rest is still in a thunk. In memory, we have something like
[1,<unevaluated thunk>]
Now you can try doing
head (tail x) -- this is 2
which evaluates the list only as far as the second element. Right now, in memory, the list looks something like
[1,2,<unevaluated thunk>]
You can do similar things with basically any other datatype, enabling cool stuff like [1]. However, having to evaluate (or "force") thunks repeatedly in tight loops can obviously degrade performance: printing a Haskell list like [1..10] requires you to evaluate the first element, print it, then force the next thunk to print the 2, and so on. Comparing this to the equivalent in basically any other (strict) language, we see that a lot of indirection is removed there, because the endless wrapping/unwrapping is replaced with a simple linked list of 10 elements. (I'm sure GHC is smart enough to optimize the thunking away sufficiently in this toy case.) One might also say that a thunk is a "promise" (I'm not familiar with the JS concept of the same name, which I think is related to async things instead) to give you a certain value by calculating it only when you need it (and if you never do, a thunk that's still unevaluated when the program ends is simply GC'd away).
[1]: http://jelv.is/blog/Lazy-Dynamic-Programming/
|
|
| |
Thanks for the great explanations. Cheers.
|
|
| |
Thunks are values that have yet to be evaluated. For example, '3 + 3' could be represented as a thunk, and there's no cost other than memory because you have not computed '6'. So you're skipping the cost of computation for a value you may never use, but you still have to represent the expression in memory somewhere. This can lead to large thunks building up, eating up memory.
|
|
| |
A thunk is saved state whose evaluation has been postponed.
a = 1 + 2
if (condition) {
print(a)
}
In a lazy system, if `condition` is not true, then `a` is never evaluated. This contrived example would be a poor place to have lazy evaluation though, as the overhead of deferring would exceed the cost of computing 1 + 2 up-front.
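For a rough intuition in Go-ish terms (purely an illustrative sketch; GHC's actual thunk representation is more involved), the deferred `a = 1 + 2` above could be modelled as a closure whose result is computed and cached only on first demand:

package main

import "fmt"

// thunk holds a suspended computation; the closure itself costs a heap
// allocation, which is the memory overhead being discussed.
type thunk struct {
	compute func() int
	value   int
	forced  bool
}

func delay(f func() int) *thunk { return &thunk{compute: f} }

// force evaluates the suspended computation the first time it is needed
// and caches the result for later uses.
func (t *thunk) force() int {
	if !t.forced {
		t.value = t.compute()
		t.forced = true
		t.compute = nil
	}
	return t.value
}

func main() {
	a := delay(func() int { return 1 + 2 }) // nothing computed yet, but memory is used
	condition := true
	if condition {
		fmt.Println(a.force()) // 3: evaluated on first demand
		fmt.Println(a.force()) // 3: reuses the cached value
	}
	// If condition were false, 1 + 2 would never be computed at all.
}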
|
|
| |
I liked this conclusion: "Go is just too different to how I think." It is in a way a mirror image of my experience with Go: it is not so much that Go is a great language (it has its fair share of flaws), but that it is quite compatible with the way I think.
|
|
| |
> GHC’s garbage collector is designed for throughput, not latency. It is a generational copying collector, which means that pause times are proportional to the amount of live data in the heap. To make matters worse, it’s also stop-the-world.
This is pretty much unacceptable in today's world of low-latency (web) apps. How active is GHC's development? Would it be possible to efficiently run Haskell using Go's runtime environment, i.e. by changing the compiler backend?
|
|
| |
If you don't have too much data in the heap, you are safe. Even if you have a lot of data there, you are most probably fine too. As an example, see https://bazqux.com/ and the discussion here: https://news.ycombinator.com/item?id=5961570 . The author was able to survive an unexpected Slashdot effect from being on the front page of Hacker News without even noticing it. He told me he got up one evening and found that there were tens of thousands of new users actively trying his site. So he added a couple of servers and went to sleep. GHC development is very active. And the answer to your last question is no.
|
|
| |
Okay, but surviving a Slashdot effect is different from having a low-latency server of course :)
|
|
| |
He didn't tell me that anyone complained about delays, etc. I think the delays weren't that noticeable then.
|
|
| |
> How active is GHC's development?
There are two active areas of development which could solve this problem (long pause times with a large working set):
* Compact regions -- these provide off-heap storage that can be manually memory-managed. They are shipping with GHC 8.2. See the paper here if you're interested: http://ezyang.com/compact.html
* Linear types -- a type system extension which allows the programmer to specify values which can only be used a single time. This allows a kind of type-safe manual memory management where the compiler statically checks that values are freed. This work is further off, but there is currently a prototype being developed in GHC. Check out this blog post for more information: http://blog.tweag.io/posts/2017-03-13-linear-types.html
> Would it be possible to efficiently run Haskell using Go's runtime environment, i.e. by changing the compiler backend?
It's an interesting thought, although I suspect this would be very challenging in practice. Haskell's lazy evaluation, with thunks instead of concrete values, probably wouldn't play nicely with Go's heap layout. I would also suspect Go's runtime has been quite special-cased to Go the language. It would be nice if the Go runtime system were opened up in the same way the JVM is. At this point it might be simpler to translate Haskell source syntax into Go. My overall feeling is that putting more work into the standard GHC runtime system would be a more efficient use of time.
|
|
|
| |
> There is no way to specify a version [of an imported package]
You can do this in your version-control system (e.g. git), via a process called "vendoring". It's ugly but, using one of the popular tools, quite workable.
|
|
| |
> It's ugly
What do you find ugly about it?
|
|
| |
You don’t properly separate and deduplicate dependencies, you can end up with legal issues (if you ever decide to redistribute your source), etc. One of the few dependency systems I’ve seen that works well is Gradle with Maven dependencies, using Java 9’s versioned modules. Proper versioning, proper dependencies, etc.
|
|
| |
The idea of copying multiple other repos into mine. I have used git submodules to avoid copying, but if one library has its own vendored copies, you can end up with two versions of the same dependency, which is even more ugly.
|
|
| |
Pulling no punches:
> Other than that, I will probably never choose to use Go for anything ever again, unless I’m being paid for it. Go is just too different to how I think: when I approach a programming problem, I first think about the types and abstractions that will be useful; I think about statically enforcing behaviour; and I don’t worry about the cost of intermediary data structures, because that price is almost never paid in full.
Wow.
|
|
| |
The cost profile for data structures is different in Go, though, since it tries to stack allocate very aggressively. GHC, like Java and many other classical GC'd languages, tends toward heap allocation.
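To make that concrete, here's a small hypothetical sketch on the Go side; building it with `go build -gcflags=-m` prints the compiler's escape-analysis decisions:

package main

// point is a tiny struct used to illustrate stack vs. heap allocation.
type point struct{ x, y int }

// sumLocal keeps p entirely local, so the compiler can typically keep it
// on the stack (reported as "does not escape").
func sumLocal() int {
	p := point{1, 2}
	return p.x + p.y
}

// escape returns a pointer to p, so p escapes to the heap
// (reported as "moved to heap: p").
func escape() *point {
	p := point{3, 4}
	return &p
}

func main() {
	_ = sumLocal()
	_ = escape()
}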
|
|
| |
Good writeup, but not an unexpected conclusion. I'd be very interested to see a similar writeup by a Haskeller on Rust, since the two are closer in some aspects (static enforcement of behavior, strong typing, well designed concurrency), but different in other key ones (strict vs lazy, GC vs manual, etc).
|
|
| |
Haskell always makes me sad. In real-world apps you always need hacks. I do anyway. In an imperative language I can be proud when my code only has a couple hacks in it. With Haskell I just end up feeling nasty about my code if there's even one hack there, and it makes me like coding less. (For toy apps with no deadline Haskell is great; the above regards Haskell-at-work, as a freelancer).
|
|
| |
For me it’s quite the opposite: Haskell allows me to have local hacky sections that I can then hide in a module that exposes a safe API. Since refactoring is fairly easy with compiler assistance, I don’t even have to break up my 100-line-long work-in-progress algorithm with temporary names until it works – and once I have a working solution, converting it to something readable is mostly manually extracting definitions and giving them useful names.
|
|
| |
That's nice. I haven't used it enough in a hacky way to come up with a list of good hacking patterns. Like I said, I always feel dirty, like I may as well be writing C, and then go do something else. Someone needs to write a book "REAL Real-World Haskell", with effective patterns for that kind of thing.
|
|
| |
Well, the fact that pretty much no one uses Haskell anywhere should give you some hints about the language itself.
|
|
| |
> But Go does have generics, for the built-in types. Arrays, channels, maps, and slices all have generic type parameters.
Doesn't this mean you could implement a generic tree type if you fix the underlying data structure to be an array/map? (Not a Go programmer yet, but honestly curious.)
|
|
| |
> Doesn't this mean you could implement a generic tree type if you fix the underlying data structure to be an array/map?
Nope. There are only a few built-in functions that accept type parameters; they must be called with concrete types, and they only work with certain aggregate types. For example, "make" can only be used to create slices, maps, or channels, so you can create a slice of 10 bytes with "make([]byte, 10)", but you cannot write "make(T)" or even "make([]T, 10)" where T is a type variable, and you also can't do "make(Tree)" where Tree is, e.g., some struct type.
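A minimal sketch of that limitation (Tree here is just a placeholder struct, not any real API):

package main

import "fmt"

// Tree is a placeholder user-defined type; make cannot create it.
type Tree struct {
	left, right *Tree
	value       int
}

func main() {
	b := make([]byte, 10)     // ok: slice of a concrete element type
	m := make(map[string]int) // ok: map with concrete key/value types
	c := make(chan Tree, 4)   // ok: channel of a concrete type

	fmt.Println(len(b), len(m), cap(c))

	// Not expressible (in the Go being discussed here):
	//   make([]T, 10)  // there is no type variable T to abstract over
	//   make(Tree)     // make only accepts slice, map, and channel types
}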
|
|
| |
I think it's better to not even think of them as "generic functions". Think of them as language keywords which happen to have the syntax of functions.
|
|
| |
Defining the accessor methods would be the pain point there. The caller has to cast the returned object back to the correct type.
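For example, a minimal sketch with a hypothetical Stack container whose element type has to collapse to interface{}:

package main

import "fmt"

// Stack is a hypothetical container; without user-defined generics its
// element type can only be interface{}.
type Stack struct {
	items []interface{}
}

func (s *Stack) Push(v interface{}) { s.items = append(s.items, v) }

// Pop returns the last element as interface{}; the static element type is lost.
func (s *Stack) Pop() interface{} {
	if len(s.items) == 0 {
		return nil
	}
	v := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v
}

func main() {
	var s Stack
	s.Push(42)

	// The caller has to assert the concrete type back; a mistake here is
	// only caught at runtime, not by the compiler.
	n, ok := s.Pop().(int)
	fmt.Println(n, ok) // 42 true
}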
|
|
| |
I'll argue that Python PEP 8 was probably the precursor to gofmt.
|
|
| |
TL;DR:
> I will probably never choose to use Go for anything ever again
|
|
| |
Off-topic: what kind of static website generator did he use? It looks clean, and I'm looking for something similar for my own use.
|
|
|
| |
The author mentioned that code generation "introduces additional, non-standard syntax". One can say exactly the same about generics. Most useful type systems with generics are Turing-complete. Essentially they introduce their own language for types, with often very weird rules and syntax, that one has to master on top of the base language. With code generation I can program my types using the same language I use for code, with fewer things to learn.
|
|
| |
> One can say exactly the same about generics
No, it's the same language with a bigger vocabulary and grammar, on which everybody agrees. And generics can be quite sane, as exemplified by the languages in the ML family, which have been around for decades. Java also didn't have generics. They added them eventually, much later, in version 5, but then due to backwards-compatibility concerns they added them with invariance at the declaration site and complex wildcard rules at the use site. So the irony of this situation is that Go will add generics - it's inevitable for a (quasi) static language once it grows in usage and ecosystem. But when they do add those generics, they'll be broken due to backwards-compatibility concerns, becoming yet another counterexample for generics, picked up by the next Go / Java that will reinvent the wheels again.
|
|
| |
The ugliness of Java generics comes from type erasure. It allowed Java to stay compatible with the JVM. Go has no such restrictions, and I do not see why truly minimal generics in the style of Virgil cannot be added to Go at some point.
|
|
| |
No, type erasure is actually an (unplanned) feature, because it didn't cripple the runtime for other languages. On this one people miss the forest for the trees. .NET's type reification is about introducing runtime metadata and checks, which you only need when your type system is not strong / expressive enough, which makes you want to do `isInstanceOf` checks. Well, guess what, needing to check that an instance is a List<int> is a failure of the language ;-) This issue is also mixed up with specialization for primitives. But that's actually unrelated, because you don't need reification to do specialization. And actually you don't need runtime support either, as specialization can be done at compile time. Also, in actual practice with Java, type reification is only a small usability issue. The real clusterfuck has been those wildcards for expressing use-site variance.
> It allowed to stay compatible with JVM. Go has no such restrictions
That's circular logic, given that the JVM release cycle is tightly linked to Java the language and has evolved in response to new features of Java. No, the actual reason was to preserve compatibility with older code that wasn't using generics (e.g. List vs List<int>) without forking the standard library into pre- and post-generics functionality (like .NET has done). Go will have exactly the same problem.
|
|
| |
Luckily, Java’s generics are actually changing towards an even more powerful system with Java 10. It’s disappointing how Go, in every way, is just Java 2 with a handful of additional libraries. Simply using Java 2, adding some libs, and compiling with GCJ would net the exact same binaries, with far less work.
|
|
| |
> With code generation I can program my types using the same language I use for code, with fewer things to learn.
With code generation you are introducing your own compiler, DSL, and all that bullshit that becomes yet another dependency you have to manage. That's busy work, that's bureaucracy, that's brittle. Now your codebase depends on pragma statements, manifests, and, obviously, a specific syntax that aren't even managed by the default compiler; that's the opposite of simplicity. At least C has standard macros, and most developers use the same syntax for them. Go is an extremely divisive language, a fact which will hinder its adoption, as criticism of the language will get harsher as more people get exposed to it. The good thing is that, as Google relies more and more on Go, peer pressure from other Googlers might force changes in the language.
|
|
| |
In Go one can use a template library to add custom generics using already-known syntax. Essentially it will be Go source with familiar template annotations. Surely it is still a non-standard language, but the setup is trivial, and the extra flexibility of a custom DSL may suit some projects better than a hypothetical Go-with-generics.
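As a minimal sketch of that approach with the standard library's text/template (the List template and the type names are made up for illustration; a real setup would write the output to a .go file behind a go:generate directive):

package main

import (
	"os"
	"text/template"
)

// listTmpl is the "generic" container, written once as a template over
// the element type.
const listTmpl = `// Code generated for element type {{.Elem}}.
type {{.Name}} struct {
	items []{{.Elem}}
}

func (l *{{.Name}}) Add(v {{.Elem}}) { l.items = append(l.items, v) }
func (l *{{.Name}}) Len() int        { return len(l.items) }

`

func main() {
	t := template.Must(template.New("list").Parse(listTmpl))
	// Instantiate the template once per concrete element type.
	for _, inst := range []map[string]string{
		{"Name": "IntList", "Elem": "int"},
		{"Name": "StringList", "Elem": "string"},
	} {
		if err := t.Execute(os.Stdout, inst); err != nil {
			panic(err)
		}
	}
}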
|
|
| |
> In Go one can use a template library to add custom generics using already-known syntax. Essentially it will be Go source with familiar template annotations. Surely it is still a non-standard language, but the setup is trivial, and the extra flexibility of a custom DSL may suit some projects better than a hypothetical Go-with-generics.
Your answer addresses none of the points I was making. Your custom DSL is something you have to maintain on your own, and it's obviously yet another thing that fragments the Go community. Imagine if C had no macros and everybody used their own pre-processor that you need to download before compiling a C lib. Go is in exactly the same situation. It's obviously "not a feature", nor simple, nor readable, nor smart, nor practical.
|
|
| |
> Most useful type systems with generics are Turing-complete.
The easy solution for that is to put parametric polymorphism in the module language, not the type system. ML, Ada, and Modula-3 all did it successfully.
|
|
| |
I think comparing Go and Haskell is like comparing two incomparable, different species -- a fish vs. a cat. Why? Because Haskell is an interpreted language while Go is a compiled one. An interpreted language doesn't care much about performance, as it isn't designed for that purpose, while on the other hand a compiled language does. As a result, an interpreted language tends to be more 'elegant' and has lots of convenient features at the cost of performance. A concrete example is when you talk about preventing unacceptable data types in Haskell. They could make it so in Go, but the performance cost would be undesirable. IIRC, I read that they designed Go to be practical instead of 'elegant', the reason being so that people can learn it easily, making it a good alternative to other compiled languages like C++, whose learning curve is hugeeee and ugly!
|
|
| |
Haskell is a compiled language.
|
|
| |
There is no such thing as a "compiled language" or an "interpreted language", as this is a property of language implementations: interpreters (GHCi, Hugs) and compilers (GHC, JHC).
|
|
| |
Haskell is compiled, and most of the safety features (i.e. everything in the type system) are checked statically at compile time. Preventing unacceptable data types happens during compilation. The same could be done in Go with no performance cost.
|
|
|