Go away Python

(lorentz.app)

287 points | by baalimago 12 hours ago

59 comments

  • hamishwhc 9 hours ago
    The author’s point about “not caring about pip vs poetry vs uv” is missing that uv directly supports this use case, including PyPI dependencies, and all you need is uv and your preferred Python version installed: https://docs.astral.sh/uv/guides/scripts/#using-a-shebang-to...
    • meander_water 9 hours ago
      Actually you can go one better:

        #!/usr/bin/env -S uv run --python 3.14 --script
      
      Then you don't even need python installed. uv will install the version of python you specified and run the command.
      • rikafurude21 8 hours ago
        alternatively, uv lets you do this:

          #!/usr/bin/env -S uv run --script
          #
          # /// script
          # requires-python = ">=3.12"
          # dependencies = ["foo"]
          # ///
        • semi-extrinsic 6 hours ago
          The /// script block is actually specified in PEP 723 and supported by several other tools apart from uv.
          • nemosaltat 1 hour ago
            The last time I commented extolling the virtues of uv on here, I got a similar reply, pointing out that PEP 723 specs this behavior, and uv isn’t the only way. So I’ll try again in this thread: I’m bullish on uv, and waiting for Cunningham.
          • yjftsjthsd-h 2 hours ago
            That's good to hear; do you know what other tools support it?
            • semi-extrinsic 2 hours ago
              From what I can tell, Hatch, PDM, pipx and pip-run also support it.
        • nemosaltat 1 hour ago
          I’ve started migrating all of my ~15 years of one-off python scripts to have this front matter. Right now, I just update when/if I use them. I keep thinking that if I were handier with grep/sed/regex etc, I’d try to programmatically update .pys system-wide. But many aren’t git tracked/version controlled, just lying in whatever dir they service(d). I’ve several times started a “python script dashboard” or “hacky tools coordinator” but stop when I remember most of these are unrelated (to each other) and un/rarely used. I keep watching the chatter and thinking this is probably an easy task for codex, or some other agent, but these pys are “mine” (and I knew^ how they worked when I wrote^ them) and also, they’re scattered and there’s no way I’m turning an agent loose on my file system.

          ^mostly, some defs might have StackOverflow copy/pasta

          • giancarlostoro 46 minutes ago
            You could run ripgrep on your file system root to find most of them, it's insanely fast, then feed the results to Claude or something to generate a script to do it for you.
        • ncouture 3 hours ago
          This is an awesome feature for quick development.

          I'm sure the documentation of this featureset highlights what I'm about to say, but if you're attracted to the simplicity of writing Python projects that are initialized using this method, do not use this code in staging/prod.

          If you don't see why this is not production friendly, it's for the simple reason that when you create deployable artifacts packaging a project (or a dependency of a project) that uses this method, reproducible builds become impossible.

          This will also lead to builds that pass your CI but fail to run in their destination environment, and vice versa, because they download their dependencies on the fly.

          There may be workarounds and I know nothing of this feature so investigate yourself if you must.

          My two cents.

        • zahlman 6 hours ago
          This isn't really "alternatively"; it's pointing out that in addition to the shebang you can add a PEP 723 dependency specification that `uv run` (like pipx, and some other tools) can take into account.
      • dietr1ch 6 hours ago
        Yeah, but you need `uv`. If we are reaching out for tools that might not be around, then you can also depend on nix-shell,

            #! /usr/bin/env nix-shell
            #! nix-shell -i python3 --packages python3
        • mystifyingpoi 6 hours ago
          Yeah, but you need Nix. If we are reaching out for tools that might not be around, then you can also depend on `curl | sudo bash` to install Nix when not present.

          (this is a joke btw)

          • jonhohle 5 hours ago
            Yeah, but you need curl, sudo, and bash…
            • lioeters 3 hours ago
              "Give me a 190-byte hex0 seed of x86 assembly, and I shall compile the rest of the world." - Archimedes
            • JodieBenitez 29 minutes ago
              ... you must first invent the universe
        • kokada 1 hour ago
          The issue I have with `nix-shell` is that the evaluation time is long, so if you need to run the script repeatedly it may take a long time. `nix shell` at least fixes this issue by caching evaluations, but I think uv is still faster.
        • Cyph0n 3 hours ago
          This comes with the added benefit that your environment is reverted as soon as you exit the Nix shell.
      • jonhohle 5 hours ago
        That shebang will work on GNU/Linux-based systems, but might not work elsewhere. I know that’s the most popular target, but it may not work on macOS, BSDs, or even busybox.
        • rented_mule 4 hours ago
          I just tried the one you are replying to and it worked great on macOS. I frequently use a variant of this on my Mac.
      • Supermancho 4 hours ago
        > Then you don't even need python installed. uv will install the version of python you specified and run the command

        What you meant was, "you don't need python pre-installed". This does not solve the problem of not wanting to have (or limited from having) python installed.

    • the__alchemist 4 hours ago
      I solved this in 2019 with PyFlow, but no one used it, so I lost interest. It's an OSS tool written in Rust that automatically and transparently manages Python versions and venvs. You just set up a `pyproject.toml`, run `pyflow main.py` etc, and it just works. It installs and locks dependencies like Cargo, installs and runs the correct Python version for the project, etc.

      At the time, Poetry and Pipenv were the popular tools, but I found they were not sufficient; they did a good job abstracting dependencies, but not venvs and Python version.

      • greensh 2 hours ago
        sounds awesome. Just out of interest, why do you think pyflow didn't catch on, but UV did?
        • the__alchemist 2 hours ago
          My best guess: I'm bad at marketing, and gave up too soon. The feedback I received was generally "Why would I use this when Pip, Pipenv and Poetry work fine?". To me they didn't; they were a hassle due to not handling venvs and Py versions, but I didn't find many people who had had the same problem.
        • the_mitsuhiko 2 hours ago
          Polish and that uv gets you entire python interpreters automatically without having to compile or manually install them.

          That, in retrospect, was what made rye temporarily attractive and popular.

    • benrutter 9 hours ago
      I thought that too, but I think the tricky bit is if you're a non-python user, this isn't yet obvious.

      If you've never used Clojure and start a Clojure project, you will almost definitely find advice telling you to use Leiningen.

      For Python, if you search online you might find someone saying to use uv, but also potentially venv, poetry or hatch. I definitely think uv is taking over, but it's not yet ubiquitous.

      Ironically, I actually had a similar thing installing Go the other day. I'd never used Go before, and installed it using apt only to find that version was too old and I'd done it wrong.

      Although in that case, it was a much quicker resolution than I think anyone fighting with virtual environments would have.

      • idoubtit 8 hours ago
        That's my experience. I'm not a Python developer, and installing Python programs has been a mess for decades, so I'd rather stay away from the language than try another new tool.

        Over the years, I've used setup.py, pip, pipenv (which kept crashing though it was an official recommendation), manual venv+pip (or virtualenv? I vaguely remember there were 2 similar tools and none was part of a minimal Python install). Does uv work in all of these cases? The uv doc pointed out by the GP is vague about legacy projects, though I've just skimmed through the long page.

        IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times. I've also seen projects with incomplete dependencies (installed through Conda, IIRC) which were a major pain to get working. For many years, the only simple and sane way to run some Python code was in a Docker image, which has its own drawbacks.

        • lexicality 8 hours ago
          > Does uv work in all of these cases?

          Yes. The goal of uv is to defuck the python ecosystem and they're doing a very good job at it so far.

          • prox 6 hours ago
            What are the big offenders right now? What does uv unfuck?

            I only work a little bit with python.

            • lexicality 13 minutes ago
              In my experience every other python tool has a variety of slightly to extremely painful behaviours that you have to work around or at least be aware of.

              Sometimes it's things like updating to Fedora 43 and every tool you installed with `pipx` breaking because it was doing things that got wiped out by the system upgrade, sometimes it's `poetry update --only dep1` silently updating dep2 in the background without telling you because there was an update available and even though you specified `--only` you were wrong to do that and Poetry knows best.

              Did you know that when you call `python -m venv` you should always pass `--upgrade-deps` because otherwise it intentionally installs an out of date version of pip and setuptools as a joke? Maybe you're not using `python -m venv` because you ran the pyenv installer and it automatically installed `pyenv-virtualenv` without asking, which overrides a bunch of virtualenv features because the pyenv team think you should develop things in the same way they do regardless of how you want to develop things. I hate pyenv.

              So far the only problem I've had with `uv` is that if you run `uv venv` it doesn't install pip in the created virtualenv because you're supposed to run `uv pip install` instead of `pip install`. That's annoying but it's not a dealbreaker.

              Outside of that, I feel very confident that I could give a link to the uv docs to a junior developer and tell them to run `uv python install 3.13` and `uv tool install ruff` and then run `uv sync` in a project and everything will work out and I'm not going to have to help them recover their hard drive because they made the foolish mistake of assuming that `brew install python` wouldn't wreck their macbook when the next version of Python gets released.

            • kevin_thibedeau 4 hours ago
              pip and venv. The Python ecosystem has taken a huge step backwards with the preachy attitude that you have to do everything in a venv. Not when I want to have installable utility scripts usable from all my shells at any time or location.

              I get that installing to the site-packages is a security vulnerability. Installing to my home directory is not, so why can't that be the happy path by default? Debian used to make this easy with the dist-packages split leaving site-packages as a safe sandbox but they caved.

              • kstrauser 2 hours ago
                Regarding why not your home directory: which version of Foo do you install, the one that Project A needs or the incompatible one that Project B needs?

                The brilliant part about venvs is that A and B can have their completely separate mutually incompatible environments.

                • kevin_thibedeau 45 minutes ago
                  They have their place. But the default shouldn't force you into a "project" when you want general purpose applicability. Python should work from the shell as readily as it did 20 years ago. Not mysteriously break what used to work with no low-friction replacement.
                • andoando 54 minutes ago
                  Why can't we just have something like npm/gradle/maven dependencies? What makes python any different?
                  • lexicality 6 minutes ago
                    A python virtualenv is just a slightly more complicated node_modules. Tools like PDM, Poetry and uv handle them automatically for you to the point where it effectively is the same as npm.

                    The thing that makes Python different is that it was never designed with any kind of per-project isolation in mind and this is the best way anyone's come up with to hack that behaviour into the language.

                    • andoando 4 minutes ago
                      I mean, what about the language prohibits this though?

                      All you have to know is where to import packages from, as far as I understand.

                      I'd imagine even just a script that does

                        export PYTHONPATH=/python_modules
                        python x.py

                      should do the trick.

                      Transitive dependencies I suppose?
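
                      A rough Python-side sketch of the same idea, assuming a hypothetical ./python_modules directory populated with something like `pip install --target=python_modules requests`:

                        import sys
                        from pathlib import Path

                        # Prepend a local, project-specific package directory to the import path;
                        # roughly what exporting PYTHONPATH does, minus the shell step.
                        sys.path.insert(0, str(Path(__file__).resolve().parent / "python_modules"))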

              • halostatue 2 hours ago
                For years, pipx did almost all the work that I needed it to do for safely running utility scripts.

                uv has replaced that for me, and has replaced most other tools that I used with the (tiny amount of) Python that I write for production.

          • aeurielesn 7 hours ago
            That's giving way too much credit to uv.
            • kraddypatties 4 hours ago
              I'm interpreting this as "uv was built off of years of PEPs", which is true; that being said the UX of `uv` is their own, and to me has significantly reduced the amount of time I spend thinking about requirements, modules, etc.
            • karel-3d 6 hours ago
              uv is really that good.
            • NeutralCrane 5 hours ago
              It really isn't
        • simonw 8 hours ago
          > IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times.

          One of the neatest features of uv is that it uses clever symlinking tricks so if you have a dozen different Python environments all with the same dependency there's only one copy of that dependency on disk.

          • zahlman 5 hours ago
            Hard links, in fact. It's not hard to do, just (the Rust equivalent of) `os.link` in place of `shutil.copy` pretty much. The actually clever part is that the package cache actually contains files that can be used this way, instead of just having wheels and unpacking them from scratch each time.

            For pip to do this, first it would have to organize its cache in a sensible manner, such that it could work as an actual download cache. Currently it is an HTTP cache (except for locally-built wheels), where it uses a vendored third-party library to simulate the connection to files.pythonhosted.org (in the common PyPI case). But it still needs to connect to pypi.org to figure out the URI that the third-party library will simulate accessing.
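
            A rough sketch of the shape of the trick in Python (illustrative only, not uv's actual code):

              import os
              import shutil

              def materialize(cached_file: str, dest: str) -> None:
                  """Place one cached package file into an environment, sharing bytes when possible."""
                  dest_dir = os.path.dirname(dest) or "."
                  if os.stat(cached_file).st_dev == os.stat(dest_dir).st_dev:
                      os.link(cached_file, dest)       # hard link: one copy on disk, shared by every env
                  else:
                      shutil.copy2(cached_file, dest)  # different filesystem: fall back to a real copy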

        • runjake 6 hours ago
          I would not be putting up with Python if not for uv. It’s that good.

          Before uv came along I was starting to write stuff in Go that I’d normally write in Python.

          • QuercusMax 4 hours ago
            Coming from Java (I've been a mostly Java guy since around 2001), I've been away from Python for a while, and my two most recent work projects have been in Python; both switched to uv around the time I joined. Such a huge difference in time and pain - I'm with you here.

            Python's always been a pretty nice language to work in, and uv makes it one of the most pleasant to deal with.

            • runjake 1 hour ago
              I don't even like Python as a language (it's growing on me, but only a little).

              It's just so useful: uv is great and there are decent quality packages for everything imaginable.

        • whimsicalism 4 hours ago
          uv solved it, it’s safe to come back now.
      • regularfry 5 hours ago
        There's definitely a philosophical shift that you can observe happening over the last 12-15 years or so, where at the start you have the interpreter as the centre of the world and at the end there's an ecosystem management tool that you use to give yourself an interpreter (and virtual environments, and so on) per project.

        I think this properly kicked off with RVM, which needed to come into existence because you had this situation where the Ruby interpreter was going through incompatible changes, the versions on popular distributions were lagging, and Rails, the main reason people were turning to Ruby, was relatively militant about which interpreter versions it would support. Also, building the interpreter such that it would successfully run Rails wasn't trivial. Not that hard, but enough that a convenience wrapper mattered. So you had a whole generation of web devs coming up in an environment where the core language wasn't the first touchpoint, and there wasn't an assumption that you could (or should) rely on what you could apt-get install on the base OS.

        This is broadly an extremely good thing.

        But the critical thing that RVM did was that it broke the circular dependency at the core of the problem: it didn't itself depend on having a working ruby interpreter. Prior to that you could observe a sort of sniffiness about tools for a language which weren't implemented in that language, but RVM solved enough of the pain that it barged straight past that.

        Then you had similar tools popping up in other languages - nvm and leiningen are the first that spring to mind, but I'd also throw (for instance) asdf into the mix here - where the executable that you call to set up your environment has a '#!/bin/bash' shebang line.

        Go has sidestepped most of this because of three things: 1) rigorous backwards compatibility; 2) the simplest possible installation onramp; 3) being timed with the above timeline so that having a pre-existing `go` binary provided by your OS is unlikely unless you install it yourself. And none of those are true of Python. The backwards compatibility breaks in this period are legendary, you almost always do have a pre-existing Python to confuse things, and installing a new python without breaking that pre-existing Python, which your OS itself depends on, is a risk. Add to that the sniffiness I mentioned (which you can still see today on `uv` threads) and you've got a situation where Python is catching up to what other languages managed a decade ago.

        Again.

        • bee_rider 4 hours ago
          It is sort of funny, if we squint just the wrong way, “ecosystem management tool first, then think about interpreters” starts to look a lot like… a package manager, haha.
      • MarsIronPI 4 hours ago
        > If you've never used Clojure and start a Clojure project, you will almost definitely find advice telling you to use Leiningen.

        I thought the current best practice for Clojure was to use the shiny new built-in tooling? deps.edn or something like that?

        • fulafel 1 hour ago
          Clojure CLI (aka deps.edn) came out in 2018 and in the survey "how do you manage your dependencies?" question crossed 50% usage in early 2020. So for 6-8 years now.
        • codemonkey-zeta 2 hours ago
          deps.edn is becoming the default choice, yes. I interpreted the parent comment as saying "you will see advice to use leiningen (even though newer solutions exist, simply because it _was_ the default choice when the articles were written)"
      • houzi 9 hours ago
        Do you think a non-python user would piece it together if the shebang line reveals what tool to use?
      • zahlman 5 hours ago
        > you might find someone saying to use uv, but also potentially venv, poetry or hatch.

        This is sort of like saying "You might find someone saying to drive a Ford, but also potentially internal combustion engine, Nissan or Hyundai".

        • evilduck 5 hours ago
          Only to those already steeped in Python. To an outsider they're all equally arbitrary non-descriptive words and there's not even obvious proper noun capitalization to tell apart a component from a tool brand.
          • zahlman 5 hours ago
            It's always rather irritating to me that people make these complaints without trying to understand any of the under-the-hood stuff, because the ultimate conclusion is that it's somehow a bad thing that, on a FOSS project, multiple people tried to solve a problem concurrently.
            • NetMageSCW 2 hours ago
              That’s especially ironic given that inside Python part of the philosophy is “There should be one-- and preferably only one --obvious way to do it.” So why does Python’s external environment seem more like something that escaped from a Perl zoo?
              • kstrauser 2 hours ago
                The one obvious way is the underlying virtualenv abstraction. Everything else just makes that part easier or more transparent.
              • zahlman 1 hour ago
                What kstrauser said.

                But with much more detail: it seems complicated because

                * People refuse to learn basic concepts that are readily explained by many sources; e.g. https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... [0].

                * People cling to memories of long-obsolete issues. When people point to XKCD 1987 they overlook that Python 2.x has been EOL for almost six years (and 3.6 for over four, but whatever)[1]; only Mac users have to worry about "homebrew" (which I understand was directly interfering with stuff back in the day) or "framework builds" of Python; easy_install is similarly a long-deprecated dinosaur that you also would never need once you have pip set up; and fewer and fewer people actually need Anaconda for anything[2][3].

                * There is never just one way to do it, depending on your understanding of "do". Everyone will always imagine that the underlying functionality can be wrapped in a more user-friendly way, and they will have multiple incompatible ideas about what is the most user-friendly.

                But there is one obvious "way to do it", which is to set up the virtual environment and then launch the virtual environment's Python executable. Literally everything else is window dressing on top of that. The only thing that "activating" the environment does is configure environment variables so that `python` means the virtual environment's Python executable. All your various alternative tools are just presenting different ways to ensure that you run the correct Python (under the assumption that you don't want to remember a path to it, I guess) and to bundle up the virtual environment creation with some other development task.
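
                A minimal sketch of that workflow, driven from Python for illustration (POSIX paths; on Windows the interpreter lives at .venv\Scripts\python.exe, and "requests" / "my_script.py" are just placeholders):

                  import subprocess
                  import venv

                  venv.create(".venv", with_pip=True)   # 1. set up the environment (stdlib, one call)
                  subprocess.run([".venv/bin/python", "-m", "pip", "install", "requests"], check=True)
                  subprocess.run([".venv/bin/python", "my_script.py"], check=True)   # 2. launch its interpreter directly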

                The Python community did explicitly provide for multiple people to provide such wrappers. This was not by providing the "15th competing standard". It was by providing the standard (really a set of standards designed to work together: the virtual environment support in the standard library, the PEPs describing `pyproject.toml`, and so on), which replaced a Wild West (where Setuptools was the sheriff and pip its deputy).

                [0]: By the way, this is by someone who doesn't like virtual environments and was one of the biggest backers of PEP 582.

                [1]: Of course, this is not Randall Munroe's fault. The comic dates to 2018, right in the middle of the period where the community was trying to sort things out and figure out how to not require the often problematic `setup.py` configuration for every project including pure-Python ones.

                [2]: The SciPy stack has been installable from wheels for almost everyone for quite some time and they were even able to get 3.12 wheels out promptly despite being hamstrung by the standard library `distutils` removal.

                [3]: Those who do need it, meanwhile, can generally live within that environment entirely.

        • fwip 2 hours ago
          I imagine by this they meant `python -m venv` specifically, using that interface directly, rather than through another wrapper CLI tool.
          • zahlman 1 hour ago
            Fair.

            The way I teach, I would start there; then you always have it as a fallback, and understand the system better.

            I generally sort users into aspirants who really should learn those things (and will benefit from it), vs. complete end users who just want the code to run (for whom the developer should be expected to provide, if they expect to gain such a following).

      • NeutralCrane 5 hours ago
        uv has been around for less than two years. It’s on track to become the default choice, it’s just a matter of time.
    • embedding-shape 7 hours ago
      I've moved over mostly to uv too, using `uv pip` when needed but mostly sticking with `uv add`. But as soon as you start using `uv pip` you end up with all the drawbacks of `uv pip`, namely that whatever you pass after can affect earlier dependency resolutions too. Running `uv pip install dep-a` and then `... dep-b` isn't the same as `... dep-b` first and then `... dep-a`, or the same as `uv pip install dep-a dep-b`, which, coming from an environment that does proper dependency resolution and has workspaces, can be really confusing.

      This is more of a pip issue than a uv one though, and `uv pip` is still preferable in my mind, but it seems Python package management will forever be a mess; not even the bandaid uv can fix things like these.

      • micik 1 hour ago
        i found uv frustrating. i dont know what problem it's trying to solve. it's not a tool for managing virtualenvs, but it does them as well. i guess it's a tool for dependency management. the "uv tool" stuff. kinda weird. i gave it an honest try but i was working around it with shell functions all the time.

        in the end i went back to good old virtualenvwrapper.sh and setting PYTHONPATH. full control over what goes into the venv and how. i guess people like writing new tools. i can understand that.

        • embedding-shape 1 hour ago
          Maybe I "entered" the Python ecosystem at a different time, but I never used virtualenvwrapper.sh nor set PYTHONPATH manually, ever. When I first came into contact with Python, I think doing `virtualenv venv && source venv/bin/activate` was what was recommended to me at the time. Eventually I used `python -m venv`, but always also with `pip` and a `requirements.txt`. I pretty much stuck with that until maybe 1 year ago, when I started playing around with `uv`, and for me, I just use `uv venv|pip|init|add` from uv, and nothing else from any other tools, and generally do pretty basic stuff.

          Maybe for more complex projects and use cases it's harder, but it's a lot faster than just pip and pyproject.toml is a lot nicer to manage than `requirements.txt`, so that's two easy enough wins for me to move over.

      • sieep 6 hours ago
        I've been away from python for a while now. I was under the impression uv was somehow solving this dependency hell. What's the benefit of using uv/pip together? Speed?
        • embedding-shape 5 hours ago
          As far as I can tell, `pip` by itself still doesn't even do something as basic as resolving the dependency tree first and then downloading all the packages in parallel, as a basic example. The `uv pip` shim does.

          And regardless of whether you use only uv, or pip-via-uv, or straight up pip, dependencies you install later step over dependencies you installed earlier, and no tool so far seems to try to solve this, which leads me to conclude it's a Python problem, not a package manager problem.

        • chuckadams 6 hours ago
          `uv pip` is still uv, it's just uv's compatibility layer for pip.
    • JodieBenitez 5 hours ago
      you don't even need your preferred python version, uv will download it.
    • tgv 7 hours ago
      Won't those dependencies then be global? With potential conflicts as a result?
      • auxym 7 hours ago
        uv uses a global cache but hardlinks the dependencies for your script into a temp venv that is only for your script, so it's still pretty fast.
      • stephenlf 7 hours ago
        Nope! uv takes care of that. uv is a work of art.
        • tgv 7 hours ago
          Then I should seriously take a look at it. I figured it was just another package manager.
    • zahlman 6 hours ago
      There are really so many things about this point that I don't get.

      First off, in my mind the kinds of things that are "scripts" don't have dependencies outside the standard library, or if they do are highly specific to my own needs on my own system. (It's also notable that one of the advantages the author cites for Go in this niche is a standard library that avoids the need for dependencies in quick scripts! Is this not one of Python's major selling points since day 1?)

      Second, even if you have dependencies you don't have to learn differences between these tools. You can pick one and use it.

      Third, virtual environments are literally just a place on disk for those dependencies to be installed, that contains a config file and some stubs that are automatically set up by a one-liner provided by the standard library. You don't need to go into them and inspect anything if you don't want to. You don't need to use the activation script; you can just specify the venv's executable instead if you prefer. None of it is conceptually difficult.

      Fourth, sharing an environment for these quick scripts actually just works fine an awful lot of the time. I got away with it for years before proper organization became second nature, and I would usually still be fine with it (except that having an isolated environment for the current project is the easiest way to be sure that I've correctly listed its dependencies). In my experience it's just not a thing for your quick throwaway scripts to be dependent on incompatible Numpy versions or whatever.

      ... And really, to avoid ever having to think about the dependencies you provide dynamically, you're going to switch to a compiled language? If it were such a good idea, nobody would have thought of making languages like Python in the first place.

      And uh...

      > As long as the receiving end has the latest version of go, the script will run on any OS for tens of years in the future. Anyone who's ever tried to get python working on different systems knows what a steep annoying curve it is.

      The pseudo-shebang trick here isn't going to work on Windows any more than a conventional one is. And no, when I switched from Windows to Linux, getting my Python stuff to work was not a "steep annoying curve" at all. It came more or less automatically with acclimating to Linux in general.

      (I guess referring to ".pyproject" instead of the actually-meaningful `pyproject.toml` is just part of the trolling.)

      • kstrauser 5 hours ago
        > Third, virtual environments are literally just a place on disk for those dependencies

        I had a recent conversation with a colleague. I said how nice it is using uv now. They said they were glad because they hated messing with virtualenvs so much that preferred TypeScript now. I asked them what node_modules is, they paused for a moment, and replied “point taken”.

        Uv still uses venvs because it’s the official way Python stores all the project packages in one place. Node/npm, Go/go, and Rust/cargo all do similar things, but I only really hear people grousing about Python’s version, which, as you say, you can totally ignore and never ever look at.

        • zahlman 5 hours ago
          From my experience, it seems like a lot of the grousing is from people who don't like the "activation script" workflow and mistakenly think it's mandatory. Though I've also seen aesthetic objections to the environment actually having internal structure rather than just being another `site-packages` folder (okay; and what are the rules for telling Python to use it?)

          The very long discussion (https://discuss.python.org/t/pep-582-python-local-packages-d...) of PEP 582 (https://peps.python.org/pep-0582/ ; the "__pypackages__" folder proposal) seems relevant here.

          • kstrauser 4 hours ago
            I've heard those objections, too. I do get that specific complaint: it's another step you have to do. That said, things like direnv and mise make that disappear. I personally like the activation workflow and how explicit it is, as you're activating that specific venv, or maybe one in a different location if you want to use that instead. I don't like sprinkling "uv run ..." all over the place. But the nice part is that both of those work, and you can pick whichever one you prefer.

            It'll be interesting to see how this all plays out with __pypackages__ and friends.

            • zahlman 4 hours ago
              > But the nice part is that both of those work, and you can pick whichever one you prefer.

              Yep. And so does the pyenv approach (which I understand involves permanently adding a relative path to $PATH, wherein the system might place a stub executable that invokes the venv associated with the current working directory).

              And so do hand-made subshell-based approaches, etc. etc.

              In "development mode" I use my activation-script-based wrappers. When just hacking around I generally just give the path to the venv's python explicitly.

              • kstrauser 3 hours ago
                I use your "hacking around" method for things like cron jobs, with command lines like:

                  * * * * * /path/to/project/.venv/bin/python /path/to/project/foo.py
                
                It's more typing one time, but avoids a whole lot of fragility later.
    • t43562 7 hours ago
      ....but you have to be able to get uv, and on some platforms (e.g. a Raspberry Pi) it won't build because the version of rust is too old. So I wrote a script called "pv" in python which works a bit like uv - just enough to get my program to work. It made me laugh a bit, but it works anywhere, well enough for my use case. All I had to do was embed a primitive AI-generated TOML parser in it.
      • zahlman 5 hours ago
        > All I had to do was embed a primitive AI generated TOML parser in it.

        The standard recommendation for this is `tomli`, which became the basis of the standard library `tomllib` in 3.11.
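
        For instance, reading a `pyproject.toml` with it takes just a few lines (assuming Python 3.11+; on older versions `import tomli as tomllib` offers the same API):

          import tomllib  # standard library since 3.11; the third-party "tomli" package has the same API

          with open("pyproject.toml", "rb") as f:  # tomllib wants a binary file handle
              config = tomllib.load(f)

          print(config.get("project", {}).get("dependencies", []))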

  • forgotpwd16 5 hours ago
    Go has explicitly rejected adding shebang support, mandating this hack, because supporting it was considered an "abuse of resources"[0]. Instead `gorun`, which Pike also called a mistake, is recommended. And you can alter this method so as not to need to hardcode a path:

        /// 2>/dev/null ; gorun "$0" "$@" ; exit $?
    
    >Good-old posix magic. If you ask an LLM, it'll say it's due to Shebangs.

    Well, ChatGPT gives the same explanation as the article, which is unsurprising considering this mechanic has been repeated many times.

    >none other fits as well as Go

    Nim, Zig, and D all have a `-run` argument and can be used in a similar way. Swift, OCaml, and Haskell can directly execute a file, with no need to provide an argument.

    [0]: https://groups.google.com/d/msg/golang-nuts/iGHWoUQFHjg/_dbL...

  • vovavili 4 hours ago
    >I don't want to have virtual environments and learn what the difference between pip, poetry and uv is. I don't care. I just want to run the code.

    So this is a skill issue, the blog post. `uv run` and PEP 723 solved every single issue the author is describing.

    • rufius 3 hours ago
      While this is true, it is often stunning to me how long it took to get to `uv run`.

      I have worked with Python on and off for 20+ years and I _always_ dreaded working with any code base that had external packages or a virtual environment.

      `uv run` changed that and I migrated every code base at my last job to it. But it was too late for my personal stuff - I already converted or wrote net new code in Go.

      I am on the fence about Python long term. I’ve always preferred typed languages and with the advent of LLM-assisted coding, that’s even more important for consistency.

      • callc 1 hour ago
        Well said. I’m in the same boat of being on the fence about python. I’ve been burned too many times in the past.

        And even if uv perfectly solved all of our woes, it still seems worse than languages that solve packaging and deployment with first-party tools.

        There’s only so much lipstick and makeup you can put on a pig…

    • oceansky 3 hours ago
      Any old enough language will have competing libraries that you need to learn in the long run too.
    • Mawr 1 hour ago
      It's a UX issue. The author is correct — nobody cares about all the mumbo-jumbo of virtualenvs or whatever other techno-babble.

      The user

      just

      wants

      to run

      the damn program.

      > `uv run` and PEP 723 solved every single issue the author is describing.

      PEP 723 eh? "Resolution: 08-Jan-2024"

      Sure, so long as you somehow magically gain the knowledge to use uv, then you will have been able to have a normal, table-stakes experience for whole 2 years now. Yay, go Python ecosystem!

      Is uv the default, officially recommended way to run Python? No? Remember to wave goodbye to all the users passing the language by.

      • wiseowise 1 minute ago
        > Is uv the default, officially recommended way to run Python?

        Yes, seems like it is de facto default, officially recommended way.

      • vovavili 34 minutes ago
        I don't see your point. The kind of user who will struggle to type out `uv run` will find it even more difficult to type out `//usr/local/go/bin/go run "$0" "$@"; exit`. Neither approach is the "default, officially recommended way to run" scripts.

        I strongly encourage you to read the article to acquire the context for the conversation before commenting, which is what I assume is happening here.

      • worksonmine 53 minutes ago
        > The user just wants to run the damn program.

        I don't agree, the user wants to run the program in a way the user wants to, but is frustrated when it doesn't.

        If all dependencies were installed on the machine the script would run no problem. I have some scripts with dependencies that are installed on the system.

        The author writes:

        > The built in tooling within the go ecosystem is another large selling point. We don't need a .pyproject or package.json to configure ad-hoc formatting and linters, backed by pipelines to ensure consistency.

        Maybe shebangs are not the solution to that problem? They're a convenience for running scripts as executables, but the user is supposed to set up the environment. Then he continues to explain that go has a great stdlib, which makes it perfect for scripting. This is the reason I usually reach for python for complex scripts: the stdlib is big enough to solve most of my problems.

        Now that node includes sqlite the choice isn't as easy, but I wouldn't be pissed at node and javascript if I have to set up the environment to make sure the script runs. I understand how it runs and where it gets its dependencies. If I forget to run `npm i` before running the scripts, that's my error; I prefer errors that remind me of my stupidity over magic.

    • andrewmcwatters 3 hours ago
      [dead]
  • PaulRobinson 9 hours ago
    Mad genius stuff, this.

    However... scripting requires (in my experience) a different ergonomic than shippable software. I can't quite put my finger on it, but bash feels very scriptable, go feels very shippable, python is somewhere in the middle, ruby is closer to bash, rust is up near go on the shippable end.

    Good scripting is a mixture of OS-level constructs available to me in the syntax I'm in (bash obviously is just using OS commands with syntactic sugar to create conditionals, loops and variables), and the kinds of problems where I don't feel I need a whole lot of tooling: LSPs, test coverage, whatever. It's languages that encourage quick, dirty, throwaway code that allows me to get that one-off job done the guy in sales needs on a Thursday so we can close the month out.

    Go doesn't feel like that. If I'm building something in Go I want to bring tests along for the ride, I want to build a proper build pipeline somewhere, I want a release process.

    I don't think I've thought about language ergonomics in this sense quite like this before, I'm curious what others think.

    • pragma_x 1 hour ago
      I know what you mean.

      For me, the dividing line is how compact the language representation is, specifically if you can get the job done in one file or not.

      I have no doubt that there's a lot of Go jobs that will fit in a 500 line script, no problem. But the language is much more geared towards modules of many files that all work together to design user-defined types, multi-threading, and more. None of that's a concern for BASH, with Python shipping enough native types to do most jobs w/o need for custom ones.

      If you need a whole directory of code to make your bang-line-equipped Go script work, you may as well compile that down and install it to /usr/local/bin.

      Also the lack of bang-line support in native Go suggests that everyone is kinda "doing it wrong". The fact that `go run` just compiles your code to a temporary binary anyway, points in that direction.

    • dingdingdang 9 hours ago
      Talking about Python "somewhere in the middle" - I had a demo of a simple webview gtk app I wanted to run on a vanilla Debian setup last night.. so I did the canonical-thing-of-the-month and used uv to instantiate a venv and pull the dependencies. Then attempted to run the code.. mayhem. Errors indicating that the right things were in place but that the code still couldn't run (?) and finally Python Core Dumped.. OK. This is (in some shape or form) what happens every single time I give Python a fresh go for an idea. Granted, Golang is more verbose (and I don't particularly like the go.mod system either) but once things compile.. they run. They don't just attempt to run, or require some OS-specific hack.
      • fireflash38 7 hours ago
        Gtk makes that simple python program way more complex since it'll need more than pure-python dependencies.

        It's really a huge pain point in python. Pure python dependencies are amazingly easy to use, but there's a lot of packages that depend on either c extensions that need to be built or have OS dependencies. It's gotten better with wheels and manylinux builds, but you can still shoot your foot off pretty easily.

      • skeledrew 7 hours ago
        I'm pretty sure the gtk dependencies weren't built by Astral, which, yes, unfortunately means that it won't always just work, as they streamline their Python builds in... unusual ways. A few months ago I had a similar issue running a Tkinter project with uv, then all was well when I used conda instead.
      • zelphirkalt 8 hours ago
        How were the dependencies specified? What kind of files were provided for you to instantiate the venv?
      • logicallee 8 hours ago
        I haven't had the same issue with anaconda. Give it a try.
        • dns_snek 8 hours ago
          I've had similar issues with anaconda, once upon a time. I've hit a critical roadblock that ruined my day with every single Python dependency/environment tool except basic venv + requirements.txt, I think. That gets in the way the least but it's also not very helpful, you're stuck with requirements.txt which tends to be error-prone to manage.
    • cl3misch 8 hours ago
      > bash obviously is just using OS commands with syntactic sugar

      No, bash is technically not "more" OS than e.g. Python. It just happens that bash is (often) the default shell in the terminal emulator.

      • xg15 7 hours ago
        Have to disagree, "technically" yes, both are interpreted languages, but the ergonomics and mental overhead of doing certain things are wildly different:

        In python, doing math or complex string or collection operations is usually a simple oneliner, but calling shell commands or other OS processes requires fiddling with the subprocess module, writing ad-hoc streaming loops, etc - don't even start with piping several commands together.

        Bash is the opposite: As long as your task can be structured as a series of shell commands, it absolutely shines - but as soon as you require custom data manipulation in any form, you'll run into awkward edge cases and arbitrary restrictions - even for things that are absolutely basic in other languages.
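
        For example, the bash pipeline `ps aux | grep python` becomes something like this (the pattern the subprocess docs recommend for replacing shell pipelines):

          import subprocess

          # Roughly: ps aux | grep python
          ps = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE)
          grep = subprocess.Popen(["grep", "python"], stdin=ps.stdout, stdout=subprocess.PIPE)
          ps.stdout.close()                 # let ps receive SIGPIPE if grep exits first
          output, _ = grep.communicate()
          print(output.decode())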

        • t43562 7 hours ago
          The subprocess module is horrendous, but even if it were great, bash is simpler. Just think about trying to create a pipe of processes in python without the danger of blocking.
      • skeledrew 7 hours ago
        I love Python and dislike Bash, but just look at the difference between listing a folder in Bash vs Python, for example.
    • skybrian 7 hours ago
      Maybe the ergonomics of writing code is less of a problem if you have a quick way of asking an LLM to do the edits? We can optimize for readability instead.

      More specifically, for the readability of code written by an LLM.

  • flufluflufluffy 8 hours ago
    I don’t really understand the initial impetus. I like scripting in Python. That’s one of the things it’s good at. You can extremely quickly write up a simple script to perform some task, not worrying about types, memory, yada yada yada. I don’t like using Python as the main language for a large application.
    • mr_toad 8 hours ago
      I love scripting in Python too. I just hate trying to install other people’s scripts.
      • enriquto 3 hours ago
        > hate trying to install other people’s scripts.

        This phrasing sounds contradictory to me. The whole idea of scripts is that there's nothing to install (besides one standard interpreter). You just run them.

        • lucb1e 2 hours ago
          By that logic, you don't install an OS, you just put the bootloader and other supporting files on your storage medium of choice and run it
      • zahlman 5 hours ago
        > I just hate trying to install other people’s scripts.

        This notion is still strange to me. Just... incompatible with how I understand the term "script", I guess.

        • Brian_K_White 3 hours ago
          You don't understand the concept of people running software written by other people?

          One of my biggest problems with python happens to be caused by the fact that a lot of freecad is written in python, and python3 writes `__pycache__` directories everywhere a script executes (which means everywhere, including all over the inside of all my git repos, so I have to add `__pycache__` to all the .gitignore ) and the env variable that is supposed to disable that STUPID behavior has no effect because freecad is an appimage and my env variable is not propagating to the environment set up by freecad for itself.

          That is me "trying to install other people's scripts" the other people's script is just a little old thing called FreeCAD, no big.
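
          For what it's worth, the in-process equivalent of that environment variable, PYTHONDONTWRITEBYTECODE, can also be set from Python itself (e.g. in a FreeCAD macro or startup script), so it applies even when the environment doesn't propagate:

            import sys

            # Same effect as PYTHONDONTWRITEBYTECODE=1 / python -B, but set from inside the
            # interpreter; stops __pycache__ being written for modules imported after this point.
            sys.dont_write_bytecode = True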

          • zahlman 3 hours ago
            > That is me "trying to install other people's scripts" the other people's script is just a little old thing called FreeCAD, no big.

            What I don't understand is why you call it a "script".

            > and python3 writes `__pycache__` directories everywhere a script executes (which means everywhere, including all over the inside of all my git repos, so I have to add `__pycache__` to all the .gitignore )

            You're expected to do that anyway; it's part of the standard "Python project" .gitignore files offered by many sources (including GitHub).

            But you mean that the repo contains plugins that FreeCAD will import? Because otherwise I can't fathom why it's executing .py files that are within your repo.

            Anyway, this seems like a very tangential rant. And this is essentially the same thing as Java producing .class files; I can't say I run into a lot of people who are this bothered by it.

      • flanked-evergl 7 hours ago
        If they use https://packaging.python.org/en/latest/specifications/inline... then it becomes a breeze to run with uv. Not even a thing.
        • hu3 6 hours ago
          but then you need uv

          it's not as portable

          • networked 6 hours ago
            Inline script metadata itself is not tied to uv because it's a Python standard. I think the association between the two comes from people discovering ISM through uv and from their simultaneous rise.

            pipx can run Python scripts with inline script metadata. pipx is implemented in Python and packaged by Linux distributions, Free/Net/OpenBSD, Homebrew, MacPorts, and Scoop (Windows): https://repology.org/project/pipx/versions.

            • zahlman 5 hours ago
              Yes, many things can use inline script metadata.

              But a script only has one shebang.

              • networked 4 hours ago
                Perhaps a case for standardizing on an executable name like `python-script-runner` that will invoke uv, pipx, etc. as available and preferred by the user. Scripts with inline metadata can put it in the shebang line.

                I see it has been proposed: https://discuss.python.org/t/standardized-shebang-for-pep-72....

                • zahlman 4 hours ago
                  I get the impression that others didn't really understand your / the OP's idea there. You mean that the user should locally configure the machine to ensure that the standardized name points at something that can solve the problem, and then accepts the quirks of that choice, yes?

                  A lot of people seem to describe a PEP 723 use case where the recipient maybe doesn't even know what Python is (or how to check for a compatible version), but could be instructed to install uv and then copy and run the script. This idea would definitely add friction to that use case. But I think in those cases you really want to package a standalone (using PyInstaller, pex, Briefcase or any of countless other options) anyway.

    • graemep 8 hours ago
      It seems to be Linux specific (does it even work on other unix like OSes?) and Linux usually has a system Python which is reasonably stable for things you need scripting for, whereas this requires go to be installed.

      You could also use shell scripting, Python, or another scripting language. While Python is not great at backward compatibility, most scripts will have very few issues. Shell scripts are backward compatible, as are many other scripting languages (e.g. TCL), and they are more likely to be preinstalled. If you are installing Go you could just install uv and use Python.

      The article does say "I started this post out mostly trolling" which is part of it, but mostly the motivation would be that you have a strong preference for Go.

    • andoando 46 minutes ago
      I just don't get how JS is any worse as a scripting language.

      bla bla bla

      node bla.js

    • Brian_K_White 3 hours ago
      Python is great for the coder, and unholy garbage for everyone else.

      If you care about anyone but yourself, don't write things in python for other people to distribute, install, integrate, run, live with.

      If you don't care about anyone else, enjoy python.

    • Kuinox 7 hours ago
      You do have to worry about types, you always do. You have to know what this function returns and what you can do with it.

      When you know the language well, you don't need to search for this info for basic types, because you remember them.

      But that's also true for typed languages.

      • phantasmish 5 hours ago
        This is more than just trivially true for Python in a scripting context, too, because it doesn’t do things like type coercion that some other scripting languages do. If you want to concat an int with a string you’ll need to cast the int first, for example. It also has a bunch of list-ish and dict-ish built in types that aren’t interchangeable. You have to “worry about types” more in Python than in some of its competitors in the scripting-language space.
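
        A small illustration (the error wording is CPython's):

          x = 7
          # "count: " + x            # raises TypeError: can only concatenate str (not "int") to str
          print("count: " + str(x))  # the int has to be cast explicitly
          print(f"count: {x}")       # or let an f-string do the formatting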
  • esjeon 7 hours ago
    Expected a rant, got a life-pro-tip. Enough for a good happy new year.

    That said, we can abuse the same trick in any language that treats `//` as a comment.

    List of some practical(?) languages: C/C++, Java, JavaScript, Rust, Swift, Kotlin, ObjC, D, F#, GLSL/HLSL, Groovy

    Personally, among those languages, GLSL sounds most interesting. A single-GLSL graphics demo is always inspiring. (Something like https://www.shadertoy.com/ )

    Also, let’s not forget that we can do something similar using a block comment (`/* … */`). An example in C:

      /*/../usr/bin/env gcc "$0" "$@"; ./a.out; rm -vf a.out; exit; */

      #include <stdio.h>

      int main() { printf("Hello World!\n"); return 0; }

    • thechao 39 minutes ago
      For C/++ just use "#!". When TCC first came out, we used this exact technique for "C scripting". It requires a dirty SO methodology (you can't really control linking well).

      For larger projects (the exe), the shebang points to a C build file, which when compiled, knows the root path; that C build script then looks for a manifest, builds, links, and fork()s. A good a/m timestamp library with direct ccache support can spin up as fast as a script even on big projects.

      Again, this is all a bad idea bc it's hard to control your environment.

      I guess we were doing all this in the mid 2000s? When did TCC come out?

    • frizlab 7 hours ago
      For Swift there’s even a project[1] that allows running scripts that have external dependencies (posting the fork because the upstream is mostly dead).

      I think it’s uv’s equivalent, but for Swift.

      (Also Swift specifically supports an actual shebang for Swift scripts.)

      [1] https://github.com/xcode-actions/swift-sh

    • kibwen 5 hours ago
      You don't need to abuse comments like this for Rust, because it supports shebangs directly.
  • fiyec30375 4 hours ago
    I thought this was going to be a longer rant about how python needs to... Go away. Which, as a long-time python programmer and contributor, and at one time avid proponent of the language, is a suggestion I would entertain. I think all of ML being in Python is a colossal mistake that we'll pay for for years.

    The main reasons being it is slow, its type system is significantly harder to use than other languages, and it's hard to distribute. The only reason to use it is inertia. Obviously inertia can be sufficient for many reasons, but I would like to see the industry consider python last, and instead consider typescript, go, or rust (depending on use case) as a best practice. Python would be considered deprecated and only used for existing codebases like pytorch. Why would you write a web app in Python? Types are terrible, it's slow. There are way better alternatives.

    • solatic 3 hours ago
      100%.

      With that said... there is a reason why ML went with Python. GPU programming requires C-based libraries. NodeJS does not have a good FFI story, and neither does Rust or Go. Yes, there's support, but Python's FFI support is actually better here. Zig is too immature here.

      The world deserves a Python-like language with a better type system, a better distribution system, and not nearly as much dynamism footguns / rope for people to hang themselves with.
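
      As a small illustration of how low that FFI barrier is in Python, calling into a C library takes only the standard library (a sketch; it assumes a libm that ctypes can locate on the system):

        import ctypes
        import ctypes.util

        # Load the C math library and call cos() directly: no build step, no bindings package.
        libm = ctypes.CDLL(ctypes.util.find_library("m"))
        libm.cos.restype = ctypes.c_double
        libm.cos.argtypes = [ctypes.c_double]
        print(libm.cos(0.0))  # 1.0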

    • krzyk 4 hours ago
      Typescript?

      Why replace a nice language like python with anything coming out of javascript?

      • chpatrick 3 hours ago
        Its type system is miles better than Python and it has some basic stuff Python doesn't have like block scope. Functional programming is also intentionally kind of a pain in Python with the limited lambdas.

        If TypeScript had the awesome python stdlib and the Numpy/ML ecosystem I would use it over Python in a heartbeat.

        • josephg 2 hours ago
          Typescript also has significantly better performance. This is largely thanks to the browser wars funnelling an insane amount of engineering effort toward JavaScript engines over the last couple decades. Nodejs runs v8, which is the JavaScript engine used by chrome. And Bun uses JSC, written for safari.

          For IO bound tasks, it also helps that JavaScript has a much simpler threading model. And it ships an event based IO system out of the box.

        • tehjoker 3 hours ago
          you can define a named closure in python, i do it from time to time, though it does seem to surprise others sometimes. i think maybe it's not too common.
      • christophilus 4 hours ago
        Typescript is a really nice language even though it sits on a janky runtime. I’d love a subset of typescript that compiles to Go or something like that.
        • polynomial 1 hour ago
          Isn't that what Project Corsa is supposed to solve?
      • hu3 3 hours ago
        TypeScript is great!

        Has shortcomings like all languages but it brought a lot of advanced programming language concepts to the masses!

      • fiyec30375 4 hours ago
        Typescript is ubiquitous on the web, and there are some amazing new frameworks that reuse typescript types on the server and client (trpc, tanstack). It's faster (than python), has ergonomic types, and a massive community + npm ecosystem. Bun advances the state of the art for runtime performance (Anthropic just bought it and uses it for Claude Code).
        • bdangubic 3 hours ago
          Did you just write both:

          > The only reason to use it is inertia

          and

          > Typescript is ubiquitous in web

          :-)

          • sinkasapa 3 hours ago
            Those are both valid reasons to use both languages. The "only" (whether true or not) is what the argument hinges on. It is roughly the same as saying that the only advantage of X is that it is popular, but Y is also popular and has additional advantages, therefore, Y is better than X. That is a valid argument, whether the premises are true or not.
            • bdangubic 2 hours ago
              I do not disagree but if you are going to say that "X" is only used because of "Y", maybe if you are pitching "Z" instead of "X" do not start with the "Y" :)
      • IshKebab 3 hours ago
        Typescript is a lot nicer than Python in many ways. Especially via Deno, and especially for scripting (imports work like people want!).

        There are some things that aren't as good, e.g. Python's arbitrary precision integers are definitely nicer for scripting. And I'd say Python's list comprehension syntax is often quite nice even if it is weirdly irregular.

        But overall Deno is a much better choice for ad-hoc scripting than Python.

        • chpatrick 3 hours ago
          • IshKebab 3 hours ago
            I am aware, but Python has that by default. In Javascript it's opt-in and less ergonomic. E.g. try loading a 64-bit integer from JSON.
            • josephg 2 hours ago
              I agree, but bigints are missing from json because the json spec defines all numbers as 64 bit floats. Any other kind of number in JSON is nonstandard.

              JavaScript itself supports bigint literals just fine. Just put an ‘n’ after your number literal. Eg 0xffffffffffffffn.

              There’s a whole bunch of features I wish we could go in and add to json. Like comments, binary blobs, dates and integers / bigints. It would be so much nicer to work with if it had that stuff.

              • IshKebab 35 minutes ago
                > the json spec defines all numbers as 64 bit floats

                It absolutely doesn't. It doesn't impose any limits on number precision or magnitude.

    • rahen 3 hours ago
      "> I think all of ML being in Python is a colossal mistake that we'll pay for for years.

      Market pressure. Early ML frameworks were in Lisp, then eventually Lua with Torch, but demand dictated the choice of Python because "it's simple" even if the result is cobbled together.

      Lisp is arguably still the most suitable language for neural networks for a lot of reasons beyond the scope of this post, but the tooling is missing. I’m developing such a framework right now, though I have no illusions that many will adopt it. Python may not be elegant or efficient, but it's simple, and that's what people want.

      • Joker_vD 3 hours ago
        Gee, I wonder why the tooling for ML in Lisp is missing even though the early ML frameworks were in Lisp. Perhaps there is something about the language that stifles truly wide collaboration?
        • rahen 3 hours ago
          I doubt it considering there are massive Clojure codebases with large teams collaborating on them every day. The lack of Lisp tooling and the prevalence of Python are more a result of inertia, low barrier to entry and ecosystem lock-in.
      • wild_egg 1 hour ago
        What sort of tooling is missing in Lisp? I'd love to check out your framework if you've shared it somewhere
        • rahen 1 hour ago
          Lisp isn't missing anything, it's a natural fit for AI/ML. It’s the ecosystem's tooling that needs catching up.

          The code hasn't reached RC yet, but I'll definitely post a Show HN once it's ready for a preview.

    • dkarl 4 hours ago
      I'd love to replace Python with something simple, expressive, and strongly typed that compiles to native code. I have a habit of building little CLI tools as conveniences for working with internal APIs, and you wouldn't think you could tell a performance difference between Go and Python for something like that, but you can. After a year or so of writing these tools in Go, I went back to Python because the LOC difference is stark, but every time I run one of them I wish it was written in Go.

      (OCaml is probably what I'm looking for, but I'm having a hard time getting motivated to tackle it, because I dread dealing with the tooling and dependency management of a 20th century language from academia.)

      • ufmace 25 minutes ago
        Rust might be worth a look. It gets much closer to the line count and convenience of the dynamic languages like Python than Go, plus a somewhat better type system. Also gets a fully modern tooling and dependency management system. And native code of course.
      • rangerelf 3 hours ago
        Have you tried Nim? Strongly and statically typed, versatile, compiles down to native code via C, interops with C trivially, has macros and stuff to twist your brain if you're into that, and is trivially easy to get into.

        https://nim-lang.org

        • dkarl 3 hours ago
          That looks very interesting. The code samples look like very simple OO/imperative style code like Python. At first glance it's weird to me how much common functionality relies on macros, but it seems like that's an intentional part of the language design that users don't mind? I might give it a try.
      • archargelod 3 hours ago
        You can replace Python with Nim. It literally ticks all your boxes (expressive, fast, compiled, strongly typed). It's as concise as Python, and IMO, Nim syntax is even more flexible.

        https://nim-lang.org

        • kevin_thibedeau 3 hours ago
          And compilation is fast enough that you can run it as a script with shebang methods.
      • loic-sharma 54 minutes ago
        You might want to try Dart. It is simple, has great tooling, and compiles to native code.

        Disclaimer: I work on Flutter at Google.

      • eru 3 hours ago
        Yes, Go can hardly be called statically typed, when they use the empty interface everywhere.

        Yes, OCaml would be a decent language to look into. Or perhaps even OxCaml. The folks over at Jane Street have put a lot of effort into tooling recently.

      • Hasnep 3 hours ago
        I bounced off OCaml a few years ago because of the state of the tooling, despite it being almost exactly the language I was looking for. I'm really happy with Gleam now, and recommend it over OCaml for most use cases.
        • ZenoArrow 1 hour ago
          Did you consider using F#? The language is very similar to OCaml, but it has the added benefit of good tooling and a large package ecosystem (can use any .NET package).
        • dkarl 3 hours ago
          I always assumed a runtime specialized for highly concurrent, fault-tolerant, long-running processes would have a noticeable startup penalty, which is one of the things that bothers me about Python. Is that something you notice with Gleam?
          • antsinmypants 2 hours ago
            I tried out Gleam for Advent of Code this year. There was a significant difference in startup times, about 13 ms for Python and 120 ms for Gleam.

            If you want something with minimal startup times then you need a language that compiles to native binaries like Zig, Rust or OCaml.

        • IshKebab 3 hours ago
          Can you use Gleam for ad-hoc scripting? In my mind that means two requirements that most languages fail at.

          1. You can import by relative file path. (Python can't.)

          2. You can specify third party dependencies in a single file script and have that work properly with IDEs.

          Deno is the best option I've found that has both of those and is statically typed.

          I'm hoping Rust will eventually too but it's going to be at least a year or two.

      • yawaramin 1 hour ago
      • fiyec30375 3 hours ago
        I suppose you could try typescript which can compile to a single binary using node or bun. Both bun and node do type stripping of ts types, and can compile a cli to a single file executable. This is what anthropic does for Claude code.
    • strunz 3 hours ago
      I swear the only people who care about Python types are in Hacker News comments. I've never actually worked with or met someone who cared so much about it, and the ones who care at all seem just fine with type hints.
      • hu3 3 hours ago
        Perhaps some people that cared moved to other languages.

        And part of those who still complain are momentarily stuck with it.

        Just like survivorship bias. It's productive to ponder on the issues experienced by those who never returned.

      • josephg 1 hour ago
        The people we happen to work with is an incredibly biased sample set of all software engineers.

        As an example, almost everyone I’ve worked with in my career likes using macOS and Linux. But there are entire software engineering sub communities who stick to windows. For them, macOS is a quaint toy.

        If you’ve never met or worked with people who care about typing, I think that says more about your workplace and coworkers than anything. I’ve worked with plenty of engineers who consider dynamic typing to be abhorrent. Especially at places like FAANG.

        Long before typescript, before nodejs, before even “JavaScript the good parts”, Google wrote their own JavaScript compiler called Closure. The compiler is written in Java. It could do many things - but as far as I can tell, the main purpose of the compiler was to add types to JavaScript. Why? Because googlers would rather write a compiler from scratch than use a dynamically typed language. I know it was used to make the early versions of Gmail. It may still be in use to this day.

    • CamperBob2 3 hours ago
      > I think all of ML being in Python is a colossal mistake that we'll pay for for years.

      If ML fulfills its promise, it won't matter in the least what language the code is/was written in.

      If it doesn't, it won't matter anyway.

    • tayo42 3 hours ago
      How much does python really impact ML? All of the libraries are wrappers around C code that uses GPUs anyway, it's distributed, and inference can be written in faster languages for serving anyway?
      • roadside_picnic 2 hours ago
        You're thinking only about the final step where we're just doing a bunch of matrix computation. The real work Python does in the ML world is automatic differentiation.

        Python has multiple excellent options for this: JAX, Pytorch, Tensorflow, autograd, etc. Each of these libraries excels for different use cases.

        I also believe these are cases where Python the language is part of the reason these libraries exist (whereas, to your point, for the matrix operations pretty much any language could implement these C wrappers). Python does make it easy to perform meta-programming and is very flexible when you need to manipulate the language itself.

    • robomartin 3 hours ago
      > I think all of ML being in Python is a colossal mistake that we'll pay for for years.

      > The main reasons being it is slow, <snip>, and it's hard to distribute.

      Don't forget that Python consumes approximately 70x more power when compared to C.

      • fwip 2 hours ago
        Not really applicable to ML. The massive amount of compute running on the GPU is not executing in Python, and is basically the same regardless of host language.
    • vovavili 4 hours ago
      This is a needlessly dismissive and narrow-minded attitude. Have you ever tried to use FastAPI?
      • yunnpp 3 hours ago
        You say he's narrow-minded, but you focus on the least relevant thing of everything he said, speed, and suggest that, somehow, something with "fast" in its name will fix it?

        Speed is the least concern because things like numpy are written in C; the overhead you pay is in the glue code and FFI. The lack of a standard distribution system is a big one. Dynamic typing works well for small programs and teams but does not scale when either dimension is increased.

        But pure Python is inherently slow because of language design. It also cannot be compiled efficiently unless you introduce constraints into the language, at which point you're tackling a subset thereof. No library can fix this.

        • vovavili 3 hours ago
          Very little of what you're claiming is relevant for FastAPI specifically, which for a web app isn't too far in speed from an equivalent app written in Go. You need to research the specifics of the problem at hand instead of making broad but situationally incorrect assumptions. The subject here is web apps, and Python is very much a capable language in this niche as of the end of 2025, in terms of speed, code elegance and support for static typing (FastAPI is fully based on Pydantic) - https://www.techempower.com/benchmarks/#section=test&runid=7...
        • spockz 3 hours ago
          > But pure Python is inherently slow because of language design. It also cannot be compiled efficiently unless you introduce constraints into the language, at which point you're tackling a subset thereof. No library can fix this.

          A similar point was raised in the other python thread on CPython the other day, and I’m not sure I agree. For sure, it is far from trivial. However, GraalVM has shown us how it can be done for Java with generics. At a high level: take the app, compile it, and run it. The compilation takes care of any literal use of generics; running the app takes care of initialising classes and memory; instrumentation during runtime can be added to cover runtime invocations of generics otherwise missed. Obviously, it takes getting a lot of details right for this to work. But it can be done.

      • fiyec30375 3 hours ago
        Yes. Have you ever tried trpc?
        • vovavili 3 hours ago
          Implying that the existence of your tool of preference in another programming language makes other equally impressive tools something akin to a "[colossal] mistake that we'll pay for for years", "simply motivated by inertia", is way below the level of discussion I would expect from Hacker News.
          • carderne 2 hours ago
            To be fair, your comment didn't add much either.

            Their main criticisms of Python were:

            > it is slow, its type system is significantly harder to use than other languages, and it's hard to distribute

            Your comment would have been more useful if it had discussed how FastAPI addresses these issues.

            • vovavili 2 hours ago
              I would have given the OOP the effort and due respect in formulating my response if it was phrased in the way you're describing. It's only fair that comments that strongly violate the norms of substantive discourse don't get a well-crafted response back.
  • kazinator 49 minutes ago
    That's a neat trick; I don't think I've seen that before. It can be made to work for just about any language that has // comments.

    It does rely on //, which is implementation-defined according to POSIX. On some systems //usr could refer to some kind of network path.

    Last sentence here:

    3.254 Pathname

    A string that is used to identify a file. In the context of POSIX.1-2024, a pathname may be limited to {PATH_MAX} bytes, including the terminating null byte. It has optional beginning <slash> characters, followed by zero or more filenames separated by <slash> characters. A pathname can optionally contain one or more trailing <slash> characters. Multiple successive <slash> characters are considered to be the same as one <slash>, except it is implementation-defined whether the case of exactly two leading <slash> characters is treated specially.

    [IEEE Std 1003.1, 2024 Edition]

    It really is better for a language to either have # comments, or else support #! as a special case in a file that is presented for execution. You're also not launching an extra shell instance. (Too bad this // trick cannot use the "exec" shell command to replace the shell with the go program.)

  • adonovan 4 hours ago
    > The one big problem: gopls. We need the first line of the script to be without spaces...

    Specifically the problem here is automated reformatting. Gopls typically does this on save as you are editing, but it is good practice for your CI system to enforce the invariant that all merged *.go files are canonically formatted. This ensures that the user who makes a change formats it (and is blamed for that line), instead of the hapless next person to touch some other spot in that file. It also reduces merge conflicts.

    But there's a second big (bigger) problem with this approach: you can't use a go.mod file in a one-off script, and that means you can't specify versions of your dependencies, which undermines the appeal to compatibility that motivated your post:

    > The primary benefit of go-scripting is [...] and compatibility guarantees. While most languages aims to be backwards compatible, go has this a core feature. The "go-scripts" you write will not stop working as long as you use go version 1.*, which is perfect for a corporate environment.

    > In addition to this, the compatibility guarantees makes it much easier to share "scripts". As long as the receiving end has the latest version of go, the script will run on any OS for tens of years in the future.

    • kardianos 4 hours ago
      True, but major versions are locked in through the import path and should be compatible.
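
      For anyone unfamiliar with how that locking works, here is a minimal sketch (example.com/widget is a made-up module path, not a real dependency): the major version is baked into the import path itself, independent of any go.mod.

          // Sketch only: illustrates Go's semantic import versioning with a
          // hypothetical module. Whatever resolves these imports can never swap
          // in an incompatible major version, because v2+ lives at its own path.
          package main

          import (
              _ "example.com/widget"    // hypothetical module, major version 0/1
              _ "example.com/widget/v2" // the same module's v2 line, a distinct import path
          )

          func main() {
              // Nothing to run; the point is the import paths above.
          }
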
  • g947o 7 hours ago
    While you are at it, might as well do this for C++ or assembly. You hate scripting so much and would rather go to great lengths to use a compiled language and throw away all the benefits of a scripting language and scripting itself, just because you don't like the language, not because of technical merit. Congratulations, you just wasted many hours of your precious time.

    > The price of convenience is difficulties to scale

    Of course, they never scale. The moment you start thinking about scaling, you should stop writing code as throwaway scripts but build them properly. That's not an argument to completely get rid of Python or bash. The cost of converting Python code to Go is near zero these days if there is a need to do so. Enough has been said about premature optimization.

    > Anyone who's ever tried to get python working on different systems knows what a steep annoying curve it is.

    If you need 10 libraries of certain versions to run a few lines of Python code, nobody calls that a script anymore. It becomes a proper project that requires proper package management, just like Go.

    • BobbyJo 7 hours ago
      There is a much larger gap in language ergonomics between python and C++ than between python and golang. Compile time and package management being some of the major downsides to C++.

      "You'd rather drive a compact car than an SUV? Might as well drive a motorcycle then!"

    • array_key_first 3 hours ago
      The main problem with python for system scripts is that even in that domain it's not a very good choice.

      Perl is right there, requires no installation, and is on practically every unix-like under the sun. Sure it's not a good language, or performant, or easy to extend, but neither is python, so who cares. And, if anything, it's a bit more expressive and compact than python, maybe to a fault.

  • fsmv 6 hours ago
    I made one of these too! I decided not to use // because I use gofmt auto formatting in my editor and it puts a space between the // and the usr. This one isn't changed by gofmt:

        /*?sr/bin/env go run "$0" "$@"; exit $? #*/
    • tandr 32 minutes ago
      It works, but for the life of me I cannot fully explain the first 3 symbols. /*?sr/bin/env finds /usr by expanding *? to the first matching directory. But why not just /*usr/ instead?
  • llmslave2 10 hours ago
    I love it. I'm using Go to handle building full stack javascript apps, which actually works great since esbuild can be used directly inside a Go program. The issue is that it's a dependency, so I settled for having a go mod file and running it directly with Go. If somehow these dependencies could be resolved without an explicit module configured (say, it was inline in the go file itself) it would be perfect. Alas, it will probably never happen.

    That being said...use Go for scripting. It's fantastic. If you don't need any third party libraries this approach seems really clean.

    • ahartmetz 10 hours ago
      [flagged]
      • dangoodmanUT 8 hours ago
        dwight shrute detected
        • ahartmetz 2 minutes ago
          At least spell my name correctly ffs, it's right there in the org chart
      • llmslave2 10 hours ago
        Yes. Yes, I'm doing all of that with Javascript :P
      • mstipetic 10 hours ago
        Don't be that guy.
        • ahartmetz 9 hours ago
          I am going to be that guy.

          I make computers do things, but I never act like my stuff is the only stuff that makes things happen. There is a huge software stack of which my work is just the final pieces.

          • mstipetic 9 hours ago
            The term "full stack" has a widely well understood meaning, you're being pedantic
            • SunlitCat 9 hours ago
              The problem with calling it “full stack” (even if it has a widely understood meaning) is that it implicitly puts the people doing the actual lower-level work on a pedestal. It creates the impression that if this is already “full stack,” then things like device drivers, operating systems, or foundational libraries must be some kind of arcane magic reserved only for experts, which they aren’t.

              The term “full stack” works fine within its usual context, but when viewed more broadly, it becomes misleading and, in my opinion, problematic.

              • ahartmetz 8 hours ago
                Or, alternatively, it ignores and devalues the existence of these parts. In both cases, it's a weird "othering" of software below a certain line in the, ahem, full stack.
            • ahartmetz 9 hours ago
              It doesn't for me and I don't think that my subculture of computing uses similarly myopic terms.
              • konart 8 hours ago
                >It doesn't for me

                And it's okay. It doesn't mean it should be this way for everyone else.

                It is pretty common (and been so for at least two decades) for web devs to differentiate like so: backend, frontend or both. This "both" part almost always is replaced by "full stack".

                When people say this they just mean they do both parts of a web app and have no ill will or neglect towards systems programmers or engineers working on a power plant.

              • llmslave2 9 hours ago
                Where did your subculture come from, Pedanticville?
                • ahartmetz 8 hours ago
                  Mostly not web-based software, written in compiled languages
                  • flir 6 hours ago
                    A grown-up ;)
          • bheadmaster 8 hours ago
            I agree with you in sentiment - the term "full-stack" is odd and a little too grandiose for its meaning.

            But it is already established in the industry, and fighting it is unlikely to yield any positive outcomes.

  • magicalhippo 9 hours ago
    You can do the same[1] with .Net Core for those of us who like that.

    [1]: https://learn.microsoft.com/en-us/dotnet/csharp/fundamentals...

    • zahlman 5 hours ago
      That's explicit support rather than using the same // hack. The language is specifically ignoring a shebang even though it doesn't match the usual comment syntax.
    • rr808 9 hours ago
      dotnet was always really good for this. There were a bunch of third party tools that have done this since the 90s like toolsack.

      I think Java can run uncompiled text scripts now too

  • evanmoran 4 hours ago
    In a similar way, I changed all of my build and deployment scripts to Go not long ago. The actual benefit was that utility functions used by the service could be shared with deployment. So I could easily share code to determine if services/DBs were online or to access cloud secrets in a uniform way. It also made all the error checks much clearer (did the curl fail because it’s offline or malformed?).

    Additionally, it is even more powerful when used with go modules. Make every script call a single function in the shared “scripts” module and they will all be callable from anywhere symmetrically. This will ensure all scripts build even if they aren’t run all the time. It also means any script can call scripts.DeployService(…) and they don’t care what dir they are in, or who calls it. The arguments make it clear what paths/configuration is needed for each script.
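
    A minimal sketch of what one of those scripts could look like (the example.com/infra/scripts module and its DeployService function are hypothetical, just to show the shape):

        package main

        import (
            "log"
            "os"

            "example.com/infra/scripts" // hypothetical shared module holding the real logic
        )

        func main() {
            if len(os.Args) < 2 {
                log.Fatalf("usage: %s <service>", os.Args[0])
            }
            // The script is only a thin entry point; paths and configuration are
            // passed in explicitly, so it behaves the same from any directory.
            if err := scripts.DeployService(os.Args[1]); err != nil {
                log.Fatal(err)
            }
        }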

  • trvv 4 hours ago
    You can lose the ugly "exit" at the end and space it right for formatting too

        // 2>/dev/null; exec go run "$0" "$@"
  • alkh 1 hour ago
    Have to post this monstrosity that lets you run a python script either with uv or, if uv is not installed, directly with python (for some of my colleagues):

        #!/usr/bin/env bash
        """:"
        if command -v uv > /dev/null
        then exec uv run --script "$0" "$@"
        else
        exec python3 "$0" "$@"
        fi
        ":"""

  • dherman 2 hours ago
    Ha, I just tried the same trick with Rust:

      //$HOME/.cargo/bin/rustc "$0" && ${0%.rs} "$@" ; exit
      
      use std::env;
      
      fn main() {
          println!("hello, world!");
          for arg in env::args() {
              println!("arg: {arg}");
          }
      }
    
    Total hack, and it litters ./ with the generated executable. But cute.
  • itopaloglu83 6 hours ago
    > Sidetrack: I actually looked up what the point of arg0 even is since I failed to find any usecases some months back and found this answer.

    I think arg0 was always useful, especially when developing multifunctional apps like busybox, which changes its behavior depending on the name it was executed as.
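
    A minimal, self-contained sketch of that pattern in Go (the names are made up): build it once, symlink it under different names, and it dispatches on argv[0].

        package main

        import (
            "fmt"
            "os"
            "path/filepath"
        )

        func main() {
            // Dispatch on the basename we were invoked as, busybox-style.
            switch name := filepath.Base(os.Args[0]); name {
            case "hello":
                fmt.Println("Hello!")
            case "goodbye":
                fmt.Println("Goodbye!")
            default:
                fmt.Printf("invoked as %q; try symlinking this binary as hello or goodbye\n", name)
            }
        }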

  • petercooper 6 hours ago
    Cute trick! I pointlessly wondered if I could make it work with Ruby and you kinda can, if you can tolerate a single error message before the script runs (sadly # comments don't work as shells consider them comments too):

        =begin
        ruby $0; exit
        =end
    
        puts "Hello from Ruby"
    
    Not immediately useful, but no doubt this trick will pop up at some random moment in the future and actually be useful. Very basic C99 too, though I'm not sure I'd want to script with it(!):

        //usr/bin/cc $0 && ./a.out && exit
  • age123456gpg 10 hours ago
    Official stance on supporting interpreter mode, for reference: https://github.com/golang/go/issues/24118
  • rtpg 8 hours ago
    You don't even need to end the file in `.go` or the like when using shebangs, and any self-respecting editor will be good at parsing out shebangs to identify file types (... well, Emacs seems to do it well enough for me)

    no need to name your program foo.go when you could just name it foo

    • kbolino 4 hours ago
      The `go run` tool will not execute (or even recognize) a file that does not end in .go, so this is not good advice.
  • yawaramin 1 hour ago
    I have a better idea: `ocaml script.ml` ;-)

    Get started here: https://dev.to/yawaramin/practical-ocaml-314j

  • ikrenji 2 hours ago
    I never had any trouble setting up or working with venvs. I don't even feel the need to learn what uv is because it's such a non problem for me.
  • w4rh4wk5 10 hours ago
    Back in the day, I saw this done with C files, which were compiled on the fly to a temporary file and run.

    Something like //usr/bin/gcc -o main "$0"; ./main "$@"; exit

    • ernst_klim 9 hours ago
      Tcc even supports that with `#!/usr/local/bin/tcc -run`, although I don't understand people who use c or go for "scripting", when python, ruby, TCL or perl have much superior ergonomics.
      • w4rh4wk5 9 hours ago
        This was a relatively old project that used a C program as build system / meta generator. All you needed was a working C compiler (and your shell to execute the first line). From there, it built and ran a program that generated various tables and some source code, followed by compiling the actual program. The final program used a runtime reflection system, which was set up by the generated tables and code from the first stage.

        The main reason was to do all this without any dependencies beyond a C compiler and some POSIX standard library.

  • marifjeren 6 hours ago
    > don't want to have virtual environments and learn what the difference between pip, poetry and uv is

    Oh come on, it's easy:

    Does the project have a setup.py? if so, first run several other commands before you can run it. python -m venv .venv && source .venv/bin/activate && pip install -e .

    else does it have a requirements.txt? if so python -m venv .venv && source .venv/bin/activate && pip install -r requirements.txt

    else does it have a pyproject.toml? if so poetry install and then prefix all commands with poetry run ...

    else does it have a pipfile? pipenv install and then prefix all commands with pipenv run ...

    else does it have an environment.yml? if so conda env create -f environment.yml and then look inside the file and conda activate <environment_name>

    else does it have a uv.lock? then uv sync (or uv pip install -e .) and then prefix commands with uv run.

    • zahlman 5 hours ago
      > Oh come on, it's easy: (satire)

      If you've checked out a repo or unpacked a tarball without documentation, sure.

      If you got it from PyPI or the documentation indicates you can do so, then you just use your tooling of choice.

      Also, the pip+venv approach works fine with pyproject.toml, which was designed for interoperability. Poetry is oriented towards your own development, not working with someone else's project.

      Speaking of which, a project that has a pipfile, environment.yml, uv.lock etc. and doesn't have pyproject.toml is not being seriously distributed. If this is something internal to your team, then you should already know what to do anyway.

      • NetMageSCW 2 hours ago
        Any time you have to resort to the no true scotsman fallacy you are telling me everything I need to know to run in the other direction.
        • zahlman 2 hours ago
          It is not "no true scotsman" to point out that tons of projects are put on GitHub etc. without caring about whether others will actually be able to download and "install" and use the code locally, and that it's unreasonable to expect ecosystems to handle those cases by magic. To the extent that a Python ecosystem exists and people understand development within that ecosystem, the expectations for packaging are clear and documented and standard.

          Acting as if these projects using whatever custom tool (and its associated config, by which the tool can be inferred), where that tool often isn't even advertised as an end-user package installer, are legitimate distributions is dishonest; and acting as if it reflects poorly on Python that this is possible, far more so. Nothing prevents anyone from creating a competitor to npm or Cargo etc.

    • 9dev 6 hours ago
      And you consider that easy?
  • chrisweekly 5 hours ago
    Related tangent: I recently learned about Mise^1 -- a tool for managing multiple language runtime versions. It might ease some of the python environment setup/mgmt pains everyone complains about. It apparently integrates with uv, and can do automatic virtualenv activation....

    1. https://mise.jdx.dev/lang/python.html

    via https://gelinjo.hashnode.dev/you-dont-need-nvm-sdkman-pyenv-...

  • codelikeawolf 5 hours ago
    > Did you (rightfully) want to tear your eyes out when some LLM suggested that you script with .mjs?

    I respectfully disagree with this sentiment. JS is a fantastic Python replacement for scripts. Node.js has added all kinds of utility functions that help you write scripts without needing external dependencies. Bun, Deno, and Node.js can execute TS files (if you want to bring types into the mix). All 3 runtimes are sufficiently performant. If you do end up needing external dependencies, they're only a package.json away. I write all my scripts in JS files these days.

  • chrismorgan 9 hours ago
    One suggestion: change `exit` to `exit $?` so an exit code is passed back to the shell.
  • rewilder12 3 hours ago
    I've tried Go scripting but would still prefer python (uv is a game changer tbh). My go-to for automation will always be powershell (on linux) though. It's too bad PowerShell has the MSFT ick keeping people away from adopting it for automation. I can convince you to give it a try if you let me
  • emersion 9 hours ago
    The following would probably be more portable:

        ///usr/bin/env go run "$0" "$@"; exit
    
    Note, the exit code isn't passed through due to: https://github.com/golang/go/issues/13440
    • zahlman 5 hours ago
      Does the third leading slash do something?
    • incognito124 8 hours ago
      To quote the blog in question:

      > How true this is, is a topic I dare not enter.

      • loosescrews 1 hour ago
          The blog says that in regard to finding bash with env. My reading is that it does not make the same claim regarding finding go with env. bash is commonly found at /bin/bash (or a symlink there exists) as it is widely used in scripts and being available at that path is a well known requirement for compatibility. Go does not so much have a canonical path, and I have personally installed it at a variety of paths over the years (with the majority working with env). While I agree with the author of the blog that using env to find bash may or may not improve compatibility, I also agree with the parent comment that using env to find go probably does improve compatibility.
  • jas39 8 hours ago
    May I...

        augroup fix
          autocmd!
          autocmd BufWritePost *.go
            \ if getline(1) =~# '^// usr/bin/'
            \ | call setline(1, substitute(getline(1), '^// ', '//', ''))
            \ | silent! write
            \ | endif
        augroup END

  • commandersaki 3 hours ago
    One thing I hate about Python executables, at least the ones I've seen installed on Debian/Ubuntu, is that the ones in /usr/bin are wrappers that execute something from your site-packages.

    I just want to see the full script where I execute it.

    • kstrauser 2 hours ago
      Can you show an example of that?
  • chamomeal 3 hours ago
    For another excellent scripting solution, one that has fast startup (no compilation), uses a real language, is easy to upgrade beyond a script, and has tons of excellent dependencies baked in, look no further than babashka! It’s a clojure interpreter with first-class support for scripting stuff. Great built-in libs for shelling out to other programs, file management, anything http related (client and server), parsing, html building, etc.

    Babashka is my go-to tool for starting all new projects now. It has mostly everything you need. And if it’s missing anything, it has some of the most interesting and flexible dependency management of any runtime I’ve ever seen. Via the “pod protocol” any process (written in go/rust/java whatever) can be exposed as a babashka dependency and bundled straight in. And no separate “install dependencies” command is required, it installs and caches things as needed.

    And of course you keep all of the magic of REPL-based development. It’s got built-in nREPL support, so just by adding ‘--nrepl-server 7888’ to your command, you can connect to it from your editor and edit the process live. I’m building my personal site this way and it’s just SO nice.

    Sorry for the rant but when superior scripting solutions come up, I have to spread the love for bb. It’s too good to not talk about!!

  • throw-12-16 10 hours ago
    I've been meaning to port some dotfiles utils over to go, I think I'll give this a shot.
  • sharno 4 hours ago
    You can also use jbang to run java scripts with dependencies

    https://www.jbang.dev/

  • networked 7 hours ago
    You can use https://github.com/erning/gorun as a Go script runner. It lets you embed `go.mod` and `go.sum` and have dependencies in Go scripts. This is more verbose than Python's inline script metadata and requires manual management of checksums. gorun caches built binaries, so scripts start quickly after the first time.

    Example:

      #! /usr/bin/env gorun
      //
      // go.mod >>>
      // module foo
      // go 1.22
      // require github.com/fatih/color v1.16.0
      // require github.com/mattn/go-colorable v0.1.13
      // require github.com/mattn/go-isatty v0.0.20
      // require golang.org/x/sys v0.14.0
      // <<< go.mod
      //
      // go.sum >>>
      // github.com/fatih/color v1.16.0 h1:zmkK9Ngbjj+K0yRhTVONQh1p/HknKYSlNT+vZCzyokM=
      // github.com/fatih/color v1.16.0/go.mod h1:fL2Sau1YI5c0pdGEVCbKQbLXB6edEj1ZgiY4NijnWvE=
      // github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
      // github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
      // github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
      // github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
      // github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
      // golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
      // golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
      // golang.org/x/sys v0.14.0 h1:Vz7Qs629MkJkGyHxUlRHizWJRG2j8fbQKjELVSNhy7Q=
      // golang.org/x/sys v0.14.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
      // <<< go.sum
    
      package main
    
      import "github.com/fatih/color"
    
      func main() {
          color.Green("Hello, world!")
      }
    
    The shebang line can be replaced for compatibility with standard Go tooling:

      /// 2>/dev/null ; gorun "$0" "$@" ; exit $?
      //
      // go.mod >>>
      // ...
    • karel-3d 6 hours ago
      I was looking for something like this a few times! great
  • bushbaba 5 hours ago
    This is great. Means my scripts in a golang repo can also be written in Golang vs bash/python. It can even import libs from my project.

    Awesome!

  • arccy 5 hours ago
    i tried this and security pinged me about behavior based security rules firing because it looks like an infostealer...
  • liveoneggs 7 hours ago
    what's even cooler is when the language comes with first class support for this: https://www.erlang.org/docs/18/man/escript

    Or the venerable https://babashka.org/

  • df0b9f169d54 4 hours ago
    See also https://blog.cloudflare.com/using-go-as-a-scripting-language... and https://gist.github.com/posener/73ffd326d88483df6b1cb66e8ed1... . They explained that the direct use of "go run" was not good in some scenarios. Does that still apply today?

    > go run does not properly return the script error code back to the operating system and this is important for scripts, because error codes are one of the most common ways multiple scripts interact with each other and the operating system environment.

  • spacecow 3 hours ago
    > Sidetrack: I actually looked up what the point of arg0 even is since I failed to find any usecases some months back and found this answer[0]. Confused, and unsatisfied by the replies, I gave up trying to understand "why arg0?" as some sort of legacy functionality.

    I struggle to think of how the answers provided here could be clearer or more satisfactory. Why write an article if you're going to half-ass your research? Why even mention this nothingburger sidetrack at all...? (Bait?)

    [0] https://stackoverflow.com/questions/24678056/linux-exec-func...

  • paulddraper 4 hours ago
    Go is poor for cheap scripts for one reason: error handling.

    It’s great for “robust” code, not for quick things that you’re okay with exploding in the default way.

  • dare944 4 hours ago
    > I started this post out mostly trolling

    So your goal was to waste your reader's time. Thanks.

  • semiinfinitely 2 hours ago
    skill issue
  • api 7 hours ago
    Tangent but... I kinda like the Python language. What I don't like about Python is the way environments are managed.

    This is something I generally believe, but I think it's particularly important for things like languages and runtimes: the idea of installing things "on" the OS or the system needs to die.

    Per-workspace or per-package environment the way Go, Rust, etc. does it is correct. Installing packages globally is wrong.

    There should not be such a thing as "globally." Ideally the global OS should be immutable or nearly so, with the only exception being maybe hardware driver stuff.

    (Yes I know there's stuff like conda, but that's yet another thing to fix a fundamentally broken paradigm.)

    • zahlman 5 hours ago
      > This is something I generally believe, but I think it's particularly important for things like languages and runtimes: the idea of installing things "on" the OS or the system needs to die.

      Python has been trying to kill it for years; or at least, the Linux distros have been seeking Python's help in killing it on Linux for years. https://peps.python.org/pep-0668/ is the latest piece of this.

      • api 3 hours ago
        I feel like this principle could be codified as "the system is not a workspace."

        The use of the system as a workspace goes back to when computers were either very small and always personal only to one user, or when they were very big and administrated by dedicated system administrators who were the only ones with permission to install things. Both these conditions are obsolete.

        • NetMageSCW 2 hours ago
            But "the system is not a workspace" acts like resources are free. Everything that’s wrong with a modern computer being slower than one from 30 years ago at running user applications has its roots in this kind of thing. It’s more obvious on mobile devices but desktops still suffer. Android needed more RAM and had worse power utilization until a lot was done to move toward native compiled code and background process control. Meanwhile Electron apps think it’s okay to run multiple copies of Javascript environments like working RAM is free and performance isn’t hurt.
          • kstrauser 2 hours ago
            Perhaps, but that's not really relevant here. Python's virtualenvs wouldn't increase RAM usage any more than using the system-wide environment.
  • knodi 4 hours ago
    Yes, you can but should you?
  • rubymamis 7 hours ago
    What about Mojo?
    • lgas 6 hours ago
      What about it?
  • Zababa 6 hours ago
    >This second method is, by the way, argued to increase compatibility as we utilize env to locate bash, which may not be located at /bin/bash. How true this is, is a topic I dare not enter.

    At least it seems important on NixOS, I had to rewrite a few shebangs on some scripts that used /bin/bash and didn't work on NixOS.

  • mlmonkey 3 hours ago
    I remember when I first experienced golang, I tried compiling it.

    The compilation command returned immediately, and I thought it had failed. So I tried again and same result. WTF? I thought to myself. Till I did an `ls` and saw an `a.out` sitting in the directory. I was blown away by how fast the golang compiler was.

  • jxbdbrhcb 6 hours ago
    argv0 is very necessary for use cases like busybox
  • avidphantasm 7 hours ago
    Now try to call some C++ code from your Go script…
    • dana321 3 hours ago
      Rust can do that
  • solumos 9 hours ago
    > I started this post out mostly trolling, but the more I've thought about it's not a terrible idea.

    I feel like this is the unofficial Go motto, and it almost always ends up being a terrible idea.

    • hu3 5 hours ago
      for more terrible ideas in 2026 then!
  • timcavel 2 hours ago
    [dead]
  • assanineass 8 hours ago
    [dead]
  • wiseowise 10 hours ago
    [flagged]
  • flanked-evergl 7 hours ago
    Using `uv` with python is significantly safer and better. At least you get null safety. Sure, you can't run at the speed of light, but at least you can have some decent non-halfarsed-retrofitted type checking in your script.
    • kbolino 4 hours ago
      In what way does Python have more null safety than Go? Using None will cause exceptions in basically all the same places using nil will cause panics in Go, and Python similarly lacks the usual null-safe operators like traversal (?.), coalescing (??), etc.

      You can abuse the falsity of None to do things like `var or ""`, but this ground gets quite shaky when real bools get involved.

    • tgv 7 hours ago
      I think you're mistaking Go for some other language.
  • xg15 7 hours ago
    So the entire reason why this is not a "real" shebang and instead takes the roundtrip through the shell is because the Go runtime would trip over the # character?

    I think this points to some shortcomings of the shebang mechanism itself: that it expects the shebang line to be present and to adhere to a specific structure - but then passes the entire file, line included, to the interpreter, which has to process (and hopefully ignore) the line again.

    I know that situations where one piece of text is parsed by multiple different systems are intellectually interesting and give lots of opportunities for cleverness - but I think the straightforward solution would be to avoid such situations.

    So maybe the linux devs should consider adding a new form for the shebang where the first line is just stripped before passing the file contents to the interpreter.

    • eichin 3 hours ago
      Linux already did one better: binfmt_misc (see https://blog.cloudflare.com/using-go-as-a-scripting-language... for a much cleaner way of using it to run gorun on executable *.go files).
    • wyufro 7 hours ago
      It doesn't pass the file contents at all, it passes the file path.
      • kbolino 4 hours ago
        Yep, this is a common misunderstanding, and the blog post itself repeats it.

        The only way to "pass the file contents" would be through the standard input stream, but the script might want to use stdin like normal, so this isn't an option.

  • shevy-java 3 hours ago

        Try the following in sh:
    
        ////////usr/local/go/bin/go 
    
    Well, how about this: I use ruby or python. And not shell.

    Somehow I have been doing so since +25 years. Never regretted it. Never really needed shell either. (Ok, that's not entirely true; I refer to shell scripts. I do use bash as my primary shell, largely due to simplicity; I don't use shell scripts though, save for keeping a few legacy ones should I be at a computer that has no support for ruby, python or perl. But this is super-rare nowadays.)