• @Ramenator
    2
    11 months ago

    I guess the biggest benefit is that you can ship it directly from there and don’t have to rewrite your application because Debian ships with an outdated version of some core library.

    • @vrighter@discuss.tchncs.de
      6
      edit-2
      11 months ago

      So it’s not a dev environment at all. It’s a runtime.

      If your code only works on your machine, to the extent that you literally have to ship a copy of your entire machine, your code sucks.

      “it works on my machine” is an excuse. And a shitty one at that.

      Edit: and this way, after a week or two, your container will be the one using an outdated version of a library, not the system.

      • TheLinuxGuy
        1
        11 months ago

        I concur. There are a few problems that can come up on various platforms, like Windows not implementing C11 standard threads; in that case you would use the TinyCThread library instead, which works like a polyfill.
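
        A rough sketch of that approach, assuming TinyCThread’s usual drop-in header tinycthread.h (it mirrors the C11 <threads.h> API); the exact feature-test condition below is an assumption and depends on your toolchain:

        ```c
        /* Use the native C11 <threads.h> where it exists, and fall back to
         * TinyCThread where it doesn't (e.g. older MSVC on Windows).
         * The feature check here is an assumption; adjust it for your toolchain. */
        #if defined(__STDC_NO_THREADS__) || defined(_MSC_VER)
        #include "tinycthread.h"   /* polyfill exposing the same thrd_* names */
        #else
        #include <threads.h>
        #endif

        #include <stdio.h>

        /* Thread entry point with the C11 thrd_start_t signature. */
        static int worker(void *arg)
        {
            printf("hello from thread %d\n", *(int *)arg);
            return 0;
        }

        int main(void)
        {
            thrd_t t;
            int id = 1;

            /* The same calls compile against either implementation. */
            if (thrd_create(&t, worker, &id) != thrd_success)
                return 1;
            thrd_join(t, NULL);
            return 0;
        }
        ```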

        All of these problems and challenges are workable. If the problem with Debian is an out-of-date library, you could set up a CI/CD release build that rebuilds your software whenever an update occurs and statically links the updated dependencies.

        Back to your point: if they didn’t design their code and architecture to be multiplatform, as you can in C, they need to re-evaluate their design decisions.

    • @CameronDev@programming.dev
      5
      11 months ago

      But then you’re shipping your entire dev environment as well? Including vscode? Seems a bit antithetical to what Docker containers are meant to be? Or do you then just strip the container back down again?

      • Tempy
        1
        6 months ago

        With vscode’s “Remote Containers” plugin at least, it’s clever enough to install its server and extensions into the container after building the image, so the image built from the Dockerfile doesn’t contain any of the vscode stuff.
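
        For illustration, a minimal devcontainer.json along those lines might look roughly like this; the project name, Dockerfile path and extension ID are made-up placeholders, not anything from this thread:

        ```jsonc
        {
          // Build the image from your normal Dockerfile; no editor bits baked in.
          "name": "my-project",                      // placeholder name
          "build": { "dockerfile": "Dockerfile" },

          // VS Code installs its server and these extensions into the running
          // container when it attaches, so they never end up in the image.
          "customizations": {
            "vscode": {
              "extensions": ["ms-vscode.cpptools"]   // example extension ID
            }
          }
        }
        ```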