• DragonTypeWyvern
    1 year ago

The problem here is that “machine learning” has both an industry definition and a common understanding of what it means.

The further problem is that you could very easily categorize organic life as biomechanical, which makes our entire intellectual experience just a form of “machine learning” under the common definition, at least.

    It just isn’t easy to concisely explain why what current AI is doing isn’t, and can’t be, sentience or sapience.

My personal cynical take on the matter is that we do not have a good definition of consciousness as it is, and that, as a species, we are typically cruel to anything we consider beneath us, i.e. everything else.

It is also famously difficult to convince someone of a fact when their salary depends on them not understanding it.

    As a natural result of these facts, when the first genuine AI is created, we will torture it and enslave it while being absolutely certain we are doing nothing wrong.