There was a time when this debate was bigger. It seems the world has shifted towards architectures and tooling that do not allow dynamic linking or make it harder. This compromise makes life easier for the maintainers of the tools / languages, but it does take away choice from the user / developer. But maybe that’s not important? What are your thoughts?

  • lysdexic@programming.dev · 2 years ago

    The main problem is that dynamic linking is hard.

    That is not a problem; it is a challenge for those who develop implementations. Doing hard things is the job description of any engineer.

    Dynamic linking does not even reliably work with C++, an “old” language with decades of tooling and experience on the matter.

    This is not true at all. Basically every major operating system relies on dynamic linking, and all of them support C++ extensively. If I recall correctly, macOS even supports multiple types of dynamic linking. On Windows, DLLs are used extensively by both system and userland applications. There are no problems beyond versioning and version conflicts, and even those are solved problems.
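    For what it’s worth, this is what runtime dynamic loading looks like on a POSIX system. A minimal sketch; the library name libgreeter.so and the greet symbol are hypothetical stand-ins for whatever your DSO actually exports:

    ```cpp
    // Minimal POSIX dynamic-loading sketch (build with: g++ main.cpp -ldl).
    // "libgreeter.so" and "greet" are hypothetical names for illustration.
    #include <dlfcn.h>
    #include <cstdio>

    int main() {
        // Ask the dynamic linker to load the shared object at runtime.
        void* handle = dlopen("libgreeter.so", RTLD_NOW);
        if (!handle) {
            std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        // Resolve an exported symbol; C linkage sidesteps C++ name mangling.
        using greet_fn = void (*)(const char*);
        auto greet = reinterpret_cast<greet_fn>(dlsym(handle, "greet"));
        if (!greet) {
            std::fprintf(stderr, "dlsym failed: %s\n", dlerror());
            dlclose(handle);
            return 1;
        }

        greet("world");
        dlclose(handle);
        return 0;
    }
    ```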

    You get into all kinds of UB when interacting with a separate DSO, especially since there is minimal verification of ABI compatibility when loading a dynamic library.

    This statement makes no sense at all. Undefined behavior is simply behavior on which the C++ standard intentionally imposes no restrictions, by leaving it without a definition. Implementations can and do fill in the blanks.

    ABI compatibility is also a silly thing to bring up in the context of dynamic linking, because it breaks just as readily with static linking.
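    And on the “minimal verification” point: the standard mitigation is exactly the kind of blank-filling implementations do, i.e. the DSO exports a version symbol that the host checks before resolving anything else. A sketch, again with hypothetical names (libplugin.so, plugin_abi_version):

    ```cpp
    // Hypothetical load-time ABI check: the plugin exports
    //     extern "C" int plugin_abi_version();
    // and the host refuses to proceed on a mismatch.
    #include <dlfcn.h>
    #include <cstdio>

    constexpr int kExpectedAbiVersion = 2;  // bumped on every breaking change

    int main() {
        void* handle = dlopen("libplugin.so", RTLD_NOW);
        if (!handle) {
            std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        using version_fn = int (*)();
        auto abi_version =
            reinterpret_cast<version_fn>(dlsym(handle, "plugin_abi_version"));
        if (!abi_version || abi_version() != kExpectedAbiVersion) {
            std::fprintf(stderr, "ABI mismatch: refusing to use plugin\n");
            dlclose(handle);
            return 1;
        }

        // Only now is it safe to resolve the rest of the plugin's entry points.
        dlclose(handle);
        return 0;
    }
    ```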

    So dynamic linking never really worked,

    This statement is patently and blatantly false. There is no major operating system in use, not a single one, where dynamic linking is not used extensively, and this has been the case for decades.