This is a slightly emotional response off the back of a recent discussion with a heavily TESCREAList family member, which concluded with his belief that there is a very small number of humans with incredible information-processing abilities who know the real truth about humanity’s future. He knows I hate Yudkowsky; I know he considers him one of the most important voices of our time. It’s not fun listening to someone I love and value heading into borderline-Scientology territory. I kind of feel like, just as with Peterson a few years ago, this is the next post-truth battle on our hands.

  • -dsr-@awful.systems · 1 year ago

    The thing about rationalists is that they are fully invested in irrational beliefs, which they prefer not to examine. In other words, just like most people, but with a specific terminology that, if they use it properly, identifies them as one of the elect.

    I suggest that whenever your relative talks about EA, you talk about kindness. When they bring up longtermism, point out that you have to survive in the short term to reach the long term, so working on better policies now is rather important. If they start in on life extension, note that until quite recently, all the major advances in average human lifespan came from reducing infant mortality, and be prepared to explain the demographic transition.

    When they go extropian, say that it’s a nice vision of the future, but your kids are unlikely to see it unless we fix the world we’re currently in.

    But most of all, point out that multiplying infinitesimals by infinities to justify any course of action (a) is Pascal’s Wager and (b) justifies every course of action equally – so it can’t justify any one in particular.
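
    To make that arithmetic concrete, here’s a minimal sketch in Python (the names and probabilities are my own, purely illustrative, not from anyone’s actual argument) of why the multiplication decides nothing:

    ```python
    # Illustrative sketch: expected value with an infinite payoff.
    # Any action assigned a nonzero chance of an infinite-utility outcome
    # gets an infinite expected value, including mutually exclusive actions.

    INFINITE_PAYOFF = float("inf")

    def expected_value(p: float) -> float:
        """Expected value of an action with probability p of the infinite payoff."""
        return p * INFINITE_PAYOFF

    print(expected_value(1e-30))   # inf  ("do X to save the far future")
    print(expected_value(1e-300))  # inf  ("do the exact opposite of X")
    # Both come out infinitely "justified", so the calculation ranks nothing.
    ```

    Since every option ties at infinity, the expected-value machinery gives no guidance at all; that’s the sense in which it justifies everything and therefore nothing.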