How do you figure? Go's packaging is wayyyy better than Python's. I've done considerable work with each and while Go's ecosystem has warts here and there, it's far from disastrous. I can't say that about Python.
If nothing else, Go lets you distribute a static binary with everything built in, including the runtime. Python's closest analog is PEX files, but these don't include the runtime and often require you to have the right `.so` files installed on your system, and they also don't work with libraries that assume they are unpacked to the system packages directory or similar. In general, it also takes much longer to build a PEX file than to compile a Go project. Unfortunately, PEX files aren't even very common in the Python ecosystem.
"packaging" refers to the way the language manages dependencies during the build and import process, not how you distribute programs you have built.
Python has a deservedly poor reputation here, having churned through a dozen major overlapping different-but-not-really-different tools in my decade and a half of using it. And even the most recent one is only about a year into wide adoption, so I wouldn't count on this being over.
Go tried to ignore modules entirely, using the incredibly idiosyncratic GOPATH approach, got (I think) four major competing implementations within half as long, finally started converging, then Google blew a huge amount of political capital countermanding the community's decision. My experience with Go modules has been mostly positive, but there's no really major new idea in them that needed a decade to stew, nor one that justified that much emotional energy. (MVS is nice, but it's an incremental improvement over lockfiles, especially as go.sum ends up morally a lockfile anyway.)
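For what it's worth, the core of MVS is small enough to sketch. This toy version (module names and version numbers invented; real MVS re-reads each module's requirement list at its selected version, which this simplification skips) just takes, for each dependency, the maximum of the stated minimums:

```go
package main

import "fmt"

// req states the minimum version of a dependency a module needs.
type req struct {
	mod string
	min int
}

// deps is a hypothetical module graph: each module lists the
// minimum versions it demands of its dependencies.
var deps = map[string][]req{
	"app":  {{"liba", 2}, {"libb", 1}},
	"liba": {{"libc", 3}},
	"libb": {{"libc", 1}},
}

// mvs walks the requirement graph from root and selects, for every
// dependency, the maximum of all minimum versions demanded of it.
func mvs(root string) map[string]int {
	selected := map[string]int{}
	var visit func(mod string)
	visit = func(mod string) {
		for _, r := range deps[mod] {
			if r.min > selected[r.mod] {
				selected[r.mod] = r.min
			}
			visit(r.mod)
		}
	}
	visit(root)
	return selected
}

func main() {
	// libc resolves to 3: the max of the minimums (3 from liba, 1 from libb).
	fmt.Println(mvs("app")) // map[liba:2 libb:1 libc:3]
}
```

The appeal is determinism without a solver: given the same requirement graph, the build always picks the same versions, which is why go.sum can get away with being "morally a lockfile" rather than a real one.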
I'm slowly deprecating a python system at work and replacing it with elixir. We don't use containerization or anything, and installing the python system is a nightmare. You have to set up virtualenvs, not to mention celery and rabbit, and god help you if you're trying to operate it and you forget something or other.
With elixir, you run "mix release" and the release pipeline is set up to automatically gzip the release (it's one line of code to include that). Shoot the gzip over (actually I upload to s3 and redownload), unzip, and the entire environment (the dependencies, the vm, literally everything) comes over. The only thing I have to do is run `sudo setcap cap_net_bind_service=+ep` on the vm binary inside the distribution, because linux is weird about binding low ports, and, as they say, "it just works".
I fully agree with this assessment, but I don’t see how this puts Python’s story on par with Go’s. While GOPATH was certainly idiosyncratic, it generally just worked for me. While go modules aren’t perfect and the history was frustrating, they generally work fine for me. Python feels like an uphill battle by comparison.
If Go sticks with modules and doesn't keep making significant changes (e.g. the proxy introduced in 1.13 was not handled well), then it will be better than Python.
But if Python finally "picks" poetry, sticks with it for a few years and incrementally fixes problems rather than rolling out yet another new tool, that will also be better.
You can only identify the end of the churn for either retroactively. Python just looks worse right now because it's been around longer.
go: tends to wait and implement something once the problem is understood. it took 2 years after the go maintainers decided to solve dependency management, and as of the latest release it's finally been labelled production ready.
and honestly the proxy issues were not real. go modules were still optional; you could just turn it off.
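concretely, the opt-outs looked like this (these env vars are real in Go 1.13+, shown here as a sketch of the "just turn it off" options):

```shell
# Bypass the module proxy and fetch modules straight from their origin VCS:
export GOPROXY=direct
# Skip checksum-database lookups against sum.golang.org:
export GOSUMDB=off
# Or opt out of module mode entirely and fall back to GOPATH:
export GO111MODULE=off
```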
python is how old now? couple decades? and it has only gotten worse over time.
Another thing Python and Go unfortunately have in common is a community (not necessarily core developers) with knee-jerk reactions to any criticism.
> go: tends to wait and implement something once the problem is understood.
Go's modules provide no additional "understanding" over any of the other Bundler-derived solutions in the world. MVS was the primary innovation, but wanting checksum validation means I have to track all the same data anyway.
> took 2 years after go maintainers decided to solve the dependency issues
This is revisionist history. There were other official "solutions" before ("you don't need it", "vgo is good enough", and "we'll follow community decisions"). If this one sticks, it's fine. But you can't say it's good now just because it's the one we have now - it's good now if it's the one we still manage to have in five years.
Go's track record is not "good" (in that regard I think only Cargo qualifies). At best it's "mercifully short."
> and honestly the proxy issues were not real.
Documentation was poor, the needed flags changed shortly before release, the design risks information leaks, and the entire system should not have been on by default for at least one more minor version.
> python is how old now? couple decades? and it has only gotten worse over time.
Yeah, that's exactly why I said "Python just looks worse right now because it's been around longer." It hasn't gotten worse though, it just also hasn't stopped churning. And if Go doesn't stop churning, in 10 years it will look the same.
The age argument works both ways - multiple major versions of Python predate Bundler. Go has no excuse for taking so long to reinvent "Bundler with incidentals", just like every other language.