• 0 Posts
  • 894 Comments
Joined 1 year ago
Cake day: March 22, 2024

  • Well found.

    Also I love that the conversation almost certainly started with a comment about how everyone assumes they’d be in the king’s court when the vast majority of people would have been some variant of peasant farmer for most of history. But somehow he still would have totally been the Chief Rabbi, been given the most beautiful woman, and generally lived like a king. I wasn’t there, obviously, but either he missed the point or they all missed the point. Even when talking specifically about how you can’t choose the circumstances of your birth or their consequences, he still can’t imagine himself not being the king.

  • Longer read than I had realized but worth every word. Very well done.

    In other words, we may eventually reach a sort of wealth singularity, a point when the wealth of a few grows so exponentially that it basically reaches the point of infinity.

    I actually question whether this has already happened. The wealthy already have access to enough money that they don’t actually need to sell assets - to give anything up - in order to get credit. Just taking away Elon’s money doesn’t make him stop being Elon. It doesn’t take away his connections, his charisma, his loyal follower base, etc. Even if he did get taken down in court, any financial consequence wouldn’t hurt his power base nearly as much as the reputational shift would (see also Orange Man). Their net worth may not be literally infinite, but I can’t think of any additional power or prestige they could command if it were.

  • I do think Ed is overly dismissive of the impact that AI hype has had on the job market - not because the tools are actually good enough to replace people, but because the business idiots who make hiring decisions believe they are. I think Brian Merchant had a piece not long ago about how mass layoffs may not be happening, but there’s a definite slowdown in hiring, particularly for the kind of junior roles we would expect to see impacted.

    I think this actually strengthens Ed’s overall argument, though, because the business idiots making those decisions are responding to the thoughtless coverage that so many journalists have given the hype cycle, just as so many of the people who lost everything on FTX believed journalists’ credulous coverage of crypto. If we’re going to have a dedicated professional/managerial class separate from the people who actually do things, then the work of journalists like this becomes one of their only connections to the real world, just as it’s the only connection that people with real jobs have to the arcane details of finance or the deep magic that makes the tech we all rely on function. By abdicating their responsibility to actually inform people in favor of uncritically repeating the claims of people trying to sell them something, journalists are actively contributing to all of it, and the harms reach even farther than Ed writes here.

  • Your bonus points link is even dumber than you’re suggesting. The first half of the tweet:

    I don’t want to live in the world of “Camp Of The Saints”.

    I don’t want to live in the world of “Atlas Shrugged”.

    I don’t want to live in the world of “The GULag Archipelago”.

    I don’t want to live in the world of “Nineteen Eighty-Four”.

    I don’t want to live in the “Brave New World”.

    I want to live in the world of Hyperion, Ringworld, Foundation, and Dune

    I don’t want bad things! I want good-ish things!

    Also I’ve never read Ringworld or Hyperion, but the other two - Foundation and Dune - span literal millennia and show wildly different societies over that period. Hell, showcasing that development is the entire point of the first set of Foundation stories. Just… You can absolutely tell this sonofabitch doesn’t actually read.

  • I mean, you could make an actual evo psych argument about the importance of being able to model the behavior of other people in order to function in a social world. But I think part of the problem at this point is also in the language. Like, anthropomorphizing computers has always been part of how we interact with them: churning through an algorithm means it’s “thinking”, an unexpected shutdown means it “died”, sending signals through a network interface means it’s “talking”, and so on. But these GenAI chatbots (chatbots in general, really, but it’s gotten worse as their ability to imitate conversation has improved) make it far too easy to assign them actual agency and personhood, and it would be really useful to have a similarly convenient way of talking about what they do and how they do it without that baggage.