Shatur@lemmy.ml to Linux@lemmy.ml · 11 months ago

**AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source** (www.phoronix.com)

Cross-posted to: linux@linux.community, stable_diffusion@lemmy.dbzer0.com, programming@programming.dev, technology@lemmy.world, linux@lemmy.world, opensource@lemmy.ml, hackernews@lemmy.smeargle.fans
UraniumBlazer@lemm.ee · edited · 10 months ago

CUDA is required to interface with Nvidia GPUs, and AI workloads almost always need a GPU for the best performance.
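For readers unfamiliar with it, CUDA is Nvidia's C/C++-based GPU programming API, and a drop-in implementation like the one in the linked article aims to run existing CUDA code unmodified on AMD hardware via ROCm. A minimal sketch of the kind of kernel such code contains (a standard vector-addition example, not taken from the article; requires a CUDA-capable toolchain and GPU to actually run):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Element-wise vector addition: each GPU thread handles one index.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified (managed) memory keeps the example short:
    // the same pointers are valid on host and device.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

A drop-in replacement has to reimplement both the runtime calls (`cudaMallocManaged`, `cudaDeviceSynchronize`, ...) and the kernel-launch machinery on top of ROCm, which is why projects like this are a big deal for AMD users.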