i for sure agree that LLMs can be a huge trouble spot for mentally vulnerable people, and something needs to be done about it
my point was more about him using it for his worst-of-both-worlds arguments: he's simultaneously declaring that 'alignment is FALSIFIED!' while doing heavy anthropomorphization to confirm his priors (which would be harder to claim about something like claude, where it's more of a genuine maybe whether it should be anthropomorphized, since it has a much more robust system), and doing it off the back of someone's death
@Anomalocaris @visaVisa The attention spent on people who think LLMs are going to evolve into The Machine God will only make good regulation & norms harder to achieve
yeah, we should be talking about this
just not talking with him
yeah, we need reasonable regulation now, about the real problems it has.
like making them liable for training on stolen data,
making them liable for giving misleading information, and for damages caused by it…
things that would be reasonable for any company.
do we need regulations about it becoming skynet? too late for that mate