https://xcancel.com/tsarnick/status/1882927003508359242
Eliezer Yudkowsky says he would like to be a post-human someday, but that the way to get there is by experimenting on augmenting biological intelligence, via adult gene therapy targeting the human brain on suicide volunteers who may end up schizophrenic, rather than by taking a “leap of death” into unconstrained AI development
(found via flipping through LW for sneerable posts/comments)
https://www.lesswrong.com/posts/sT5MX8jK9tHiBM5NK/re-taste
No it doesn’t, you fools, you absolute rubes
wat
Your future region of the spacetime diagram is inside a locker, nerd