Hello all. People were very kind when I originally posted the start of this series. I’ve refrained from spamming you with every part but I thought I’d post to say the very final installment is done.

I got a bit weird with it this time, as I felt like I had an infinite amount to say, all of which only barely got to the underlying point I was trying to make. It's ridiculous how much of what I wrote I also ended up cutting.

Anyway, now the series is done I'm going to move on to smaller, discrete pieces as I work on my book about Tech Culture's propensity towards far-right politics. I'll be dropping interesting stuff I find and examples of Right Libertarians saying ridiculous things, so follow along if that's your jam.

  • BlueMonday1984@awful.systems · 3 months ago

    Ah, hell yeah, the much-anticipated finale.

    Gonna give particular praise to the opening, because this really caught my eye:

    Tech culture often denigrates humans through its assumptions that human skills, knowledge and functions can be improved through their replacement by technological replacements, and through transhumanist narratives that rely on a framing of human consciousness as fundamentally computational.

    I’ve touched on the framing of human consciousness part myself - seems we may be on the same wavelength.

    As for the whole “replacement by technological replacements” part…well, we’ve all seen the AI art slop-nami; it’s crystal fucking clear what you’re referring to.

    • UnseriousAcademic@awful.systems (OP) · 3 months ago

      Forgot to say: yes, AI-generated slop is one key example, but I’m often also thinking of other tasks that are presumed to be basic because humans can be trained to perform them with barely any conscious effort. Things like self-driving vehicles, production line work, call center work etc. Like the fact that full self-driving still requires supervision: what often happens with tech automation is that it creates things that de-skill the role or perhaps speed it up, but still require humans in the middle to do things that are simple for us but difficult to replicate computationally. Humans become the glue, slotted into all the points of friction and technical inadequacy, to keep the whole process running smoothly.

      Unfortunately this usually leads to downward pressure on the wages of those humans, and to the expectation that they match the theoretical speed of the automation rather than recognising that the human is the actual pace setter, because without them the pace would be zero.

    • Don Piano@feddit.org · 3 months ago

      It seems to me that when you say “human minds are computational things”, you can mean this in several ways, which can be roughly categorized by what your ideas of “minds” and of “computational things” are.

      You can use “computational things” as an extremely expansive category, capable of containing vast complexity but potentially completely impractical to fully recreate on a drawing board. In this use, the person saying it would often agree with the statement, but it wouldn’t belittle the phenomenon that is the human mind.

      Or you can use “human minds” in a way that sees them as something relatively simple - kinda like a souped-up 80486 computer, maybe. Nothing all too irreplaceable or special, in any case. Maybe an Athlon can be sentient and sapient! Most who say it like that would probably disagree with the sentiment, because it small-mindedly minimizes people.

      Then there’s the tech take version, which somehow does both: “Computation is everything and everything is computation, but also I have no appreciation for complexity nor a conceptualization of everything I don’t see about the human mind”. Within the huge canvas of what can be conceived of if you think in computational terms, they opt for tiny crayon scribbles.

      • Don Piano@feddit.org · 3 months ago

        Shorter: “Minds are computers” can imply views of (1) minds as simpler than they are, (2) computers as potentially very complex and general, or (3) both.

        1 and 3 are not only wrong but also bad.

    • UnseriousAcademic@awful.systems (OP) · 3 months ago

      Funnily enough, that was the bit I wrote last, just before hitting post on Substack. A kind of “what am I actually trying to say here?” moment. Sometimes I have to switch off the academic bit of my brain and just let myself say what I think to get to clarity. Glad it hit home.

      Thanks for the link. I’m going to read that piece and have a look through the ensuing discussion.

  • Mii@awful.systems · 3 months ago

    Thanks for this series. I really enjoyed reading it (even though it reminded me that Yud’s Dust-Specks-vs.-Torture bullshit exists, which I had successfully banished from my mind).

    I remember watching Devs a few years back and I think you put everything I felt about that show into words much better than I ever could have.

    • UnseriousAcademic@awful.systems (OP) · 3 months ago

      I really should have done a full risk assessment before invoking the dust specks mind virus, my apologies.

      Thanks for the kind feedback, I’m glad that my thoughts resonated with people. Sometimes I start these things and wonder if I’ve just analysed my way into a weird construct of my own creation.

  • AnarchistArtificer@slrpnk.net · 2 months ago

    I think one of my favourite parts was this footnote in part 4:

    "Though the idea that politics is ingrained into material design choices is a growing consensus within recent Science and Technology studies work. I once wrote a piece on the role of software design in shaping people’s interpretation of texts - because that’s how I get wild on Friday nights.

    (N.b. link preserved from original)

    This is obviously a joke, but the truth in the joke is that you’re a huge nerd, and it makes me happy to see people like you on the internet. There was a while when I perceived the tech people you critique as “my people”, but ultimately the euphoria of finding community dissolved into isolation and dread as I grew to understand the impact of the cultish culture of tech.

    • UnseriousAcademic@awful.systems (OP) · 2 months ago

      the truth in the joke is that you’re a huge nerd

      Oh absolutely. I think part of my fascination with all of this is that I could quite easily have gone down the tech bro hype train route myself. I’m naturally very good at getting into the weeds of tech and understanding how it works. I love systems (love factory, strategy and logistics games), love learning techy skills purely to see how things work, etc. I taught myself to code just because the primary software for a particular form of qualitative analysis annoyed me. I feel I am a prime candidate for this whole world.

      But at the same time I really dislike the impoverished viewpoint that comes with being only in that space. There’s just some things that don’t fit that mode of thought. I also don’t have ultimate faith in science and tech, probably because the social sciences captured me at an early age, but also because I have an annoying habit of never being comfortable with what I think, so I’m constantly reflecting and rethinking, which I don’t think gels well with the tech bro hype train. That’s why I embrace the moniker of “Luddite with an IDE”. Captures most of it!