Hellogopher – Just “git clone” and “make” any Go project | Hacker News

Original source (news.ycombinator.com)
Tags: golang go make news.ycombinator.com
Clipped on: 2017-04-22

Projects like these and their popularity are indicative of how unapproachable the Go development environment is to new users. One of the first things I did when introducing Go at work was write a bunch of scripts that solved this exact issue for our team, and I suspect that a solution like this has been written a million times over already.

I love the Go language but the development environment tooling is just awful (aside from some great things like gofmt). The requirement of having files in specific folders to be able to build and the lack of a sane vendoring system are both problems that keep popping up and spawning new solutions over and over. There is no excuse for a modern language to force people to learn how to use the tools before they can even start using the language. Tools should work for the user, not the other way around. Other languages get this right, so why can't Go?

Kudos to the author for spending some time to document and publish their solution to this specific issue. If these projects get enough traction, like govendor, perhaps the language authors will at some point incorporate the most popular solutions into the language platform. At least it'll save everyone else from rewriting the same tool.


Then you'll be happy to hear that part of Russ Cox's 2017 resolutions [0] is to make the go tool work without GOPATH (worded in a way that essentially matches the hellogopher motto), and there's a nice proposal in that sense [1].

Moreover, an integrated package management tool to build the vendor folder is coming soon [2]. We're getting there, now that the problem is getting more understood!

[0]: https://research.swtch.com/go2017

[1]: https://github.com/golang/go/issues/17271

[2]: https://blog.gopheracademy.com/advent-2016/saga-go-dependenc...


Why is Go so late to the party on this? Other languages do not seem to have these kinds of problems.

The more I experience Go, the more I feel like the core team existed in a bubble, away from the past 25 years of advancement in programming languages. They're stuck solving the problems they had with C, without the benefit of learning from or improving on any other modern languages or toolsets.


The go command has worked pretty well for me ever since it was introduced. If you set GOROOT, GOPATH, and PATH=$GOPATH/bin, and place your projects under domain/dir/project, then everything just works automagically. Without any script.

> Other languages get this right, so why can't Go?

Which one? I mean, just compare this with other languages/systems that supposedly get it right, e.g. Java with Maven or node.js with npm. They are complex monsters under the hood, ready to break for sometimes uncontrollable reasons.

I really like the Go philosophy of being able to build and install cross-platform without any Makefile or even autoconf. As a developer I can choose whether to work with plain vi, Sublime or Eclipse with Go plugin.

The only scripts I ended up writing were unit test automation (find -d . and then calling go test) and .deb pkg generation.

IMHO the Go devs should rather communicate more about why things are designed the way they are. Another often disputed design decision is the lack of generics. I really miss them as well, but maybe Go is just not well suited for every problem. Writing a linear algebra library is probably better done in C++ or Scala than in Go. Go is really good for building reliable server software, however.


> The go command has worked pretty well for me ever since it was introduced. If you set GOROOT, GOPATH, and PATH=$GOPATH/bin, and place your projects under domain/dir/project, then everything just works automagically. Without any script.

Yeah, it is really that simple. In my experience, most people are confused by this precisely because of its simplicity: it feels like you must have missed a step or two coming from other toolchains.


I started using Go last year and I was very pleased with how approachable and quick to learn it was. Including the tools. The role of $GOPATH was a bit new to me, but apart from that it has mostly been very smooth sailing.

I'd be interested in reading a bit more about the problems you experienced. Can you recommend any articles or blog postings that describe what kinds of problems people tend to have?


People may just not want to have their source code in a special hierarchy where their own code is mingled with the source code of its dependencies.

I also use a Makefile to set up an ad hoc GOPATH to work around this.


Me too. I work under the assumption that a packager can just grab the tarball of the latest stable tag from the repo's releases page on GitHub, unpack it somewhere, and `make && make install`. I converged upon the same approach of setting up an ad hoc GOPATH.


I'm not a big fan of the make && make install route.

I'm not thrilled about the GOPATH either. Not least because there is no really clean solution for actively developing code inside the GOPATH and working with GitHub.

Also, I would have liked dependencies to be automatically downloaded and installed in a local cache rather than me having to crap it into the GOPATH by manually issuing a "go get". I had hoped we were done with that.

But I can live with it because it requires no explanation to new developers on a project.

Using some hack to "solve" problems means forcing developers to learn even more stuff before getting started on projects. One might say "but it is only one small thing", but that has sort of been the mantra in the Java world. And it isn't so small when you spend 2 weeks learning Java and 5 years learning about all the little things that have been put into any non-trivial project.


> But I can live with it because it requires no explanation to new developers on a project.

Even if it does, I'm a big fan of still mentioning it in the README. For my Go projects, it's usually just one sentence: "This project is go-gettable, or you can make && make check && make install". Avoids a lot of useless confusion.


I can understand your problem. Thanks.

Actually, I'm starting to think that the solution is to have a per-project GOPATH instead, and to build the switching into the shell or the editor. Because messing with Makefiles is just about the last thing I'd like to do (it never stops there; it always escalates).


People are always quick to criticize GOPATH, but when you look at other comparable languages (not JIT "scripting" languages but competing ones like Java and C++), Go is actually reasonably good in terms of the difficulty of its tools.

I think the problem with Go is that it's attracted a lot of developers from languages like Python, where you just install the runtime, put a shebang at the top of the code, and call it as if it were a native ELF. No AOT language can work that simply. But given a choice between Makefiles and the Go CLI, I'd take the Go design every time.


Hey all, author here. Happy to take any feedback, in particular “I tried and it didn’t work out of the box” stories. I will NOT tell you to RTFM.

I recommend reading https://github.com/cloudflare/hellogopher#why for more insight.

There’s also a gif demo https://twitter.com/FiloSottile/status/822745605806112768

EDIT: uploaded the keynote I gave at Golab where I present the problem and introduce this project https://vimeo.com/200469720


Thanks for making this, hope it catches on. As a non-gopher, whenever I came across a project where I needed to build a Go binary, I'd get blocked by the GOPATH, and since then I've just avoided such projects.


I did something similar for one of my projects (github.com/lunixbochs/usercorn) if you want another perspective. My primary goal was that `make` would work out of the box in most environments (on OS X / Linux), even if you did not have Go installed at all.

This includes downloading the Go binaries for your platform, building three C libraries, fixing up the GOPATH, and automatically determining packages for `go get`.

https://github.com/lunixbochs/usercorn/blob/master/Makefile


Am I right to understand that it's "build from anywhere" as long as "anywhere" is not Windows? (Please don't tell me about MinGW/Cygwin if the whole idea is to "make life simpler".)


Sadly, correct.

There is no reason a standalone tool couldn't reproduce this behavior on Windows, and the eventual integrated solution definitely will, but it's just beyond the scope of hellogopher and outside my expertise.


I'm also a little confused. Is this a fix for the standard Go installer not setting a GOPATH on some platforms, so that "go get" doesn't work without manually configuring GOPATH?

Or does it actually do anything else (except "break" Windows builds - or rather, introduce a system complementary to "go get" that isn't quite as cross-platform as Go itself)?


This is not about platforms (or about go get), this is about users who don't want to, or don't know how to, correctly use GOPATH.

It doesn't break anything, since you can still use go get, go build, or anything that worked before. It doesn't help Windows users, but it doesn't hurt them.


But as of 1.8 there is a default GOPATH, negating that.


A default GOPATH doesn't help a user who does "cd dev; git clone https://github.com/coolprojects/project; cd project; go build" (or make).

The default GOPATH does not help there. Sure, if a user knows how to use Go's insane global-project-directory bullshit then they don't need this, but since it's Go-specific bullshit, they might not.

Every sane language lets you put each unrelated project in whatever directory you want. Only Go dictates some crazy tree and defaults to creating it for you in $HOME.


Right. But in order to build a Go project, you need the Go compiler. And the Go compiler comes with "go get". So my question was more: if a user has installed Go (properly), shouldn't "go get" be the sane, easy, one way to install a project?

And isn't "fixing" "go get" better than adding a dependency on make?

(And I guess this project in a sense is a way to fix "go get", but it seems rather fragile; any one project might end up with a broken Makefile by accident.)


A batch or PowerShell script potentially could (and it could even be named make.bat, so that the instructions would stay the same). Windows does actually support symlinks, though the caveat is that creating them requires elevated privileges. See e.g.: https://en.wikipedia.org/wiki/NTFS_symbolic_link. I suppose some PowerShell magic could invoke the admin password prompt, though there's also the question of whether it'd be good to condition users to grant elevated privileges to some random just-downloaded script.


Anywhere in the FS of your OS. From the readme:

> They expect to just clone a repository anywhere, and be able to build it


The annoyance/confusion/hurdle of GOPATH for beginners in the language will finally be resolved with the upcoming Go 1.8 release.

https://beta.golang.org/doc/go1.8#gopath


The default GOPATH is great, but it only helps running "go get", it doesn't solve developing in arbitrary locations on the filesystem. I address it at the end of my introductory talk: https://vimeo.com/200469720


I think Go works very well without Makefiles (IMHO even better). You can just enter `go build .`. You can install the stuff using your favorite package manager, e.g. dpkg; then it's even possible to uninstall your stuff.


Thanks. Now I get a console full of:

  thing.go:10:2: cannot find package "whatever/thing/frob" in any of:
  	/usr/lib/go-1.7/src/whatever/thing/frob (from $GOROOT)
  	($GOPATH not set)
So I go looking for GOPATH documentation, and it tells me I need to put all my source dependencies in one directory. That's unlike everything else on the planet, but OK. I do that, and set GOPATH=~/go.

Now:

  thing.go:10:2: cannot find package "whatever/thing/frob" in any of:
  	/usr/lib/go-1.7/src/whatever/thing/frob (from $GOROOT)
  	/home/ctz/go/src/whatever/thing/frob (from $GOPATH)
OK, so do I need to set GOROOT as well?

  thing.go:4:2: cannot find package "bytes" in any of:
  	/home/ctz/go/src/bytes (from $GOROOT)
  	($GOPATH not set)
  thing.go:8:2: cannot find package "container/list" in any of:
  	/home/ctz/go/src/container/list (from $GOROOT)
  	($GOPATH not set)
That seems worse.

Now somebody teaches me about 'go get'. Great!

  package whatever/thing/frob: unrecognized import path "whatever/thing/frob" (import path does not begin with hostname)
(To be clear, I can now build all the Go code I want. This post is more indicative of my experience getting started a few years ago. In my opinion, GOPATH is a terrible design decision.)


You can check your dependencies into vendor/, in case they are complex. However, it's recommended to have only a few dependencies.

Yes, you need to install your deps separately (if the maintainer decides not to use vendor/ and not check it into git). With C, the classic ./configure && make && make install also expects you to install your dependencies manually. With Go you have a canonical way to do it. Also, if you don't plan to hack on it, it's probably a good idea to go with binary packages. If there are none, the tool is probably not exactly meant for production anyway.


Then why is this necessary - why isn't it just:

  all:
      go build


All this is necessary because people don't expect or don't want to put their code/projects under $GOPATH/src. My take on it is don't fight the GOPATH, because a lot of Go tooling expects your code to be in a valid GOPATH.


This is one of the main reasons I hate working with Go projects. I have about 50 different git repos checked out on my machine right now, written in different languages. Over the years I've worked out a structure for these repos that helps me keep track of what's where and switch between them easily.

But then go comes along and is like 'no, you MUST check this out in this particular location'. And also that location has to be 6 directory layers deep, for some reason. It's just pointlessly broken when compared to every other language, where a project is self-contained within a directory.


Ok, but when you hate them, why do you check them out manually? You can just mkdir $GOPATH; go get {pkg1, pkg2, ..., pkg50}. I mean, nobody complains that npm -g puts stuff into some weird node directory that you should never ever touch manually. Or did you ever compile a random Java project that doesn't use Maven?

I think regarding convenience Go is top notch. Regarding "weirdness" it's in the middle between Unix C and Java.


> Ok, but when you hate them, why do you check them out manually? You can just mkdir $GOPATH; go get {pkg1, pkg2, ..., pkg50}.

That still has the same issue: all my projects follow a logical structure, except for anything written in Go / with a component written in Go, which lives at $RANDOM_LOCATION.

Plus, I think `go get` doesn't work with our internal company git server, but I'm not 100% sure about it.


> That still has the same issue: all my projects follow a logical structure, except for anything written in Go / with a component written in Go, which lives at $RANDOM_LOCATION.

Wellll... if you are not reeeally using Go but only writing short snippets or smaller projects, you can place the code anywhere you like. (Except for the external deps AFAIK, but even that might work with ./vendor/.) In your imports you then need to use relative paths. I haven't used this coding style in some time, but you can totally do that. Of course you lose some goodies, like having super isolated modules, but it sounds like you're not after that anyway.

Disclaimer: I've used this coding style for a few 1 kLoC+ projects, but I later changed to the recommended way.

> Plus, I think `go get` doesn't work with our internal company git server, but I'm not 100% sure about it.

You can still manually git/hg/... clone but yeah...


As someone who's more fond of this approach than Go's native one, it's funny how we've come full circle (even if this never gains widespread adoption).

Thank goodness for the inclusion of sensible vendoring, by the way! Go's defaults broke a lot of older code because of the native vendoring system (if you can call it that).


Sorry, maybe this is a stupid question... but does this require me to store all the dependencies under ./vendor in the repo?

I would really prefer it if ./vendor were .gitignored (like .virtualenv or node_modules) and `make deps` would parse the source and fetch the deps, sort of like `go get` does. Bundling all the dependencies into your own repo just feels wrong to me. Especially if it's not git submodules (and I've heard submodules are frowned upon by those who have tried them; I haven't had such a necessity, so I don't hold any opinion). At least, very similar approaches led to some really bad experiences I had in the early 2000s with PHP (it was typical to bundle vendored deps there because of the relative lack of package management).


That depends on the tool you want to use to manage your vendor folder, so no, hellogopher does not enforce an opinion on that.

For example, you could modify the Makefile to run "gvt restore" before building. However, I warmly recommend checking vendor/ in: not doing so requires the user to have gvt (or whatever tool), breaks go get, and adds unnecessary external dependencies (think left-pad).

More details here https://github.com/FiloSottile/gvt#alternative-not-checking-...


I see. I guess bare `go get` just can't fetch a dependency into `./vendor/`, so either a third-party tool or checking in is required.

Are there any suggestions/best practices on how to maintain `vendor/` under source control? I mean, handling updates, branch merges, and all the disaster that accidental local patches may bring; that sort of stuff?


The real problem with go get here is that it can't pick a version, only the latest. Checking in vendor/ is much safer because your dependencies can't break you.

There are no good upstream suggestions on how to deal with vendor/.

My preference is to use http://glide.sh/ and store 'glide.yaml' and 'glide.lock' in the repo, at which point the vendor folder can be recreated at the correct versions without needing to parse code or check in vendor/.

Glide is also really good at handling updates, pulling from alternate sources, tracking alternate branch names, etc etc.


This looks awesome. I haven't tried it yet, but hopefully it supports writing test coverage files for multiple packages, i.e. I already split the code up into separate packages and put each package in a corresponding subfolder. Using `go test ./...` I want to collect coverage reports for all my packages while ignoring `vendor`.

So my workaround is something like the following, where I generate a coverage report for each package, to later concatenate them into a final coverage report:

  for i in $(go list ./... | grep -v 'vendor'); do
    coverName="$(basename $i).coverraw"
    go test --ldflags '-extldflags "-static"' -v -race -coverprofile="${coverName}" "${i}" | tee -a "${test_outfile}"
  done


I've recently done that in the go-ipfs project for cross-package coverage (where tests in one package are counted towards coverage in all the others). You can see the results here: https://github.com/ipfs/go-ipfs/blob/ce00f384dc244f04ec7d87e...

It is quite a bit of coverage and Makefile magic, but it generates coverage results. The same file also allows us to wire sharness testing (CLI tests) into the coverage report, which is quite important for us, as we don't test the CLI in unit tests, just in sharness.

-------

The magic happens here: https://github.com/ipfs/go-ipfs/blob/ce00f384dc244f04ec7d87e...

Each package has its own coverage target; for each package I collect the dependencies that are in our testing scope, and then I create a go test cover call that includes those dependencies.


Just run "make cover" :)

It does exactly what you want, and it handles overlapping coverage blocks (if two packages have tests that run the same portion of code), which naive concatenation would mess up. (It uses gocovmerge behind the scenes, but that is hidden from you.)


Awesome!

It would be nice to include something like go2xunit so you can hook the results up to the Jenkins coverage reports.


For Jenkins-compatible JUnit test results you must run the concatenated stdout of your `go test` through:

  go2xunit -input "${test_outfile}" -output "${GO_TEST_JUNIT_FILENAME}"
For Jenkins-compatible coverage reports you must concatenate all cover profiles and use `gocov` for conversion:

  echo "creating coverage report"
  all_coverage_rawfile=$(mktemp tmptest.XXX)

  # concatenate all raw report
  echo -e "mode: atomic\n$(cat *.coverraw | grep -v 'mode: atomic')" > "${all_coverage_rawfile}"
  go tool cover -html="${all_coverage_rawfile}" -o "${GO_COVERAGE_FILENAME_HTML}"
  gocov convert "${all_coverage_rawfile}" | gocov-xml > "${GO_COVERAGE_FILENAME_XML}"
Edit: fix typo


We do that on the Cloudflare CI, but I deemed it too specific for hellogopher. (I can be convinced otherwise.)

Instead, you run hellogopher with CI=1, and it drops logs and artifacts in fixed locations that you can then post-process as you like.


Ah. OK. Yeah. That works as well for me. Thank you


I have also spent some time making bash scripts that copy my Go project into a temporary folder on a Linux machine and compile it there.


One thing that I really like about this is that it makes building Go things via Homebrew that much easier. I've just been converting the tools and things I install via a personal tap to use this - great effort!


I have long used Docker to build all my Go code so I can put it where I wish, and where it makes sense for projects that are not all Go. Looking forward to a permanent fix upstream eventually.


I don't understand how HN ranks posts: this was at the very top with just 9 points and no comments. How does that work, exactly?


If the post gets upvotes in a short amount of time, it ranks higher.


  score = (votes - 1) / pow(item_hour_age + 2, gravity)

This means newer posts carry more weight, and the score decays as the post ages. The gravity constant can be adjusted to control how fast that happens. For example, with gravity = 1.8, a 2-hour-old post with 10 votes scores 9 / 4^1.8 ≈ 0.74.



