• 3 Posts
  • 343 Comments
Joined 1 year ago
Cake day: December 7th, 2023



  • Oh hey, someone else who uses Comic Code - greetings!

    I remember when I first saw it, I laughed - and then it grew on me. That turned into “I can’t believe I’m buying a derivative of Comic Sans”, but it really is a nice monospaced font.

    The only thing I didn’t like was having to figure out how to use Font Patcher to make a copy of it that supports Nerd Fonts glyphs, but it was a one-and-done process.

    (I also don’t really like how it looks in my IDE the few times I find myself on Windows, but I don’t really blame the font for that one - looks perfect in the same IDE on Linux…)




  • COSMIC is definitely a really neat desktop, and I’m looking forward to the stable release!

    I have had the alphas installed and usually give each revision a go for a day or two. Currently there are definitely some oddities with graphical glitches, such as after suspending the system - but I like where it’s headed.

    For me right now the biggest roadblock is the lack of a night light mode (blue light filtering). I’ve gotten so used to relying on it in other desktops.

    I do have some crappy $20 blue-light-filtering glasses, but they’re… not great, or comfortable to wear for any extended amount of time. And my monitors’ built-in color-shifting mode is also pretty lackluster.


  • I think downvotes are criticism/judgment - even if it’s more of a silent type (in lieu of actually replying, as you pointed out).

    Even the standpoint Reddit originally tried to take - “you should only use downvotes to indicate that a comment/post is off topic for the community” (naively, IMO; you can’t stop it from becoming an “I disagree” button, but I digress) - is still what I’d consider criticism. Regardless of whether the vote means “off topic” or a general “I disagree”, it’s still an indication of disapproval of the commenter.

    Criticism of course comes in a lot of forms and varies in degree - I wouldn’t say downvotes are a strong form of criticism, but they are one nonetheless.

    That’s just my view of it, at least; I can’t see how they wouldn’t be a form of criticism. You also shouldn’t use them as a “this breaks the rules” indicator - that should be a report rather than a vote, IMO, otherwise it’s far less likely to be acted upon/handled.


  • Ooh! Is that swap implementation the default? I got back into LE for the launch of the newest season, and while I haven’t had any problems on my Ally or Deck yet, I just finished the campaign so I’m barely into endgame - I hear the issues start as you get deeper into monos…

    Funnily enough, I use Cachy on my desktop but don’t recall seeing anything about this - still, I’m definitely happy to run it on my Ally too if it helps avoid potential crashes down the line.



    I can only speak for myself, but honestly I’ve never been able to figure out the root of why math is so complex to me and so difficult to keep track of and understand. The only thing that seems to have a “rational” explanation is… selective memory. It has been a burning question of mine for so long.

    For a while I just said “it’s too arbitrary and not logical” - except math is built on logic: 1 + 1 is clearly 2, because if I hold up one finger on one hand and then raise a finger on my other hand, I’m holding two fingers.

    (Imaginary numbers though can fuck off)

    I got into programming long ago because it is logical - there’s (almost) always a reason why a computer does $THING, and even if I can’t tell you what it is, someone surely can. Though generally the answer is “someone told it to do the wrong thing”. If I dig deep enough, I can usually find the answer. My life is full of questions I’ll probably never have answers to, and I found refuge in the fact that here, I can get them.

    However… computers follow a set of rules, just like mathematicians do, so for me to call math arbitrary would just be wrong. Sure, a lot of the rules and formulas certainly seem arbitrary to me, but there’s a reason why they are the way they are, and it can be tracked down just like you can track down why a computer does $THING.

    When it comes to numbers though, my brain just doesn’t seem to hold on to it properly. I can randomly recall weird functions and quirks in libraries that I use - even remember plenty of arbitrary “things” like Vim motions… Yet ask me what nine times seven is and I can’t tell you what the answer is without doing the weird finger trick.

    So the only explanation I can come up with for that is just selective memory. I like computers, and as such my brain is willing to actually memorize these things; I’ve never liked math, so my brain doesn’t see a reason to “memorize math”.

    It really frustrates me, because math and computer science intersect in a lot of ways, and I’ll always be held back by this. Games, for example, run really well on your GPU because GPUs happen to be excellent at math, specifically in parallel. Encryption? Fancy math equations! Almost everything at a low level comes down to math.

    Similarly, for as much as I love logical things, I could never hold the concepts of logic gates in my head. I mean, logic is literally in the name! Even when I was heavily into Minecraft I couldn’t pick it up through Redstone.

    As such, I think the “logic” argument doesn’t hold up for me as well as I’d like to think it does. The analyst in me wants the reason to be something as tidy as “math is illogical”, because that’s easier to admit and sounds better than “I just don’t like math”. Even worse, perhaps that subconsciously stops me from liking it, thus blocking me from ever being able to excel at it… And yet, here we are (or rather, “here I am”).
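
    (Tangent for anyone wondering what “holding logic gates in your head” even involves - here’s a throwaway sketch, in Python rather than Redstone, of the classic exercise of building every other gate out of NAND alone. The function names are just my own.)

    ```python
    # Everything below is derived from a single primitive: NAND.
    def nand(a: bool, b: bool) -> bool:
        return not (a and b)

    # NOT is NAND with both inputs tied together
    def not_(a: bool) -> bool:
        return nand(a, a)

    # AND is just NAND followed by NOT
    def and_(a: bool, b: bool) -> bool:
        return not_(nand(a, b))

    # OR via De Morgan: invert both inputs, then NAND them
    def or_(a: bool, b: bool) -> bool:
        return nand(not_(a), not_(b))

    # XOR from four NANDs - the standard construction
    def xor(a: bool, b: bool) -> bool:
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    print(xor(True, False))  # True
    print(xor(True, True))   # False
    ```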



  • Welcome to Lemmy!

    For me the first Linux distribution I used was Ubuntu 8.04 - though I never installed it on physical hardware, just in a VM (VirtualBox, IIRC; a real install didn’t come until Ubuntu 8.10). I was in my early teens, had discovered Linux, and found it interesting. I used the Wubi tool to install it through Windows and updated the bootloader to keep Windows as the default (with a one-second timeout), since it was the family computer - I think my family would’ve shat their pants if they rebooted the PC and were greeted with Linux, heh.

    A few years later, though, an old secondary family laptop (the “someone else is using the other computer” spare/backup) running Vista had gotten so buggy and bogged down that I installed Kubuntu for my family, and they happily used it until the laptop was eventually retired. It never got them to really look into permanently switching to Linux, but I think that’s more than fine - I’ve never been one to “proselytize” Linux: if it’s the right tool for you, fantastic; if not, no hard feelings. In that case, it was simply the better tool than a bogged-down, buggy Vista.

    As for nowadays, it’s CachyOS on my desktop (I’m not married to it, but it’s been working alright for me for about a year now), SteamOS on my Deck, Fedora on my secondary laptop (an old Intel MacBook), and Bazzite on my ROG Ally. Windows is still installed on a secondary drive on my desktop, but I very rarely have to boot into it.




    You’d basically have to do the modifications at build time rather than at runtime, so you’d need to edit the image definition (or effectively create an “extension” of the image) - at least in the case of uBlue/Fedora Atomic-based distributions. Each one has its own system for doing this (VanillaOS, for example, works similarly IIRC).

    (There is the rpm-ostree layering system, though from what I know its usage is discouraged.)
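
    (For context, “editing the image definition” in that workflow roughly means maintaining a Containerfile along these lines - purely an illustrative sketch on my part; the base image tag and the package are placeholders, so check the uBlue custom-image docs for the real structure.)

    ```dockerfile
    # Illustrative sketch only: "extending" a uBlue base image at build time.
    # The base image and package here are placeholders, not recommendations.
    FROM ghcr.io/ublue-os/silverblue-main:latest

    # Layer extra packages into the image itself (build time, not runtime),
    # then commit the changes so they ship as part of the image.
    RUN rpm-ostree install htop && \
        ostree container commit
    ```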

    This is pretty much why I don’t use atomic/immutable distributions on my main system - they can still be tinkered with, but doing so ends up requiring a lot of setup. The last time I checked, creating custom images based on the uBlue images was quite complex, and the documentation left me pretty confused. In theory I shouldn’t have any issues with it - I work with containers all the time, both at work and in my personal projects - but it just didn’t “click” for me at the time.

    It’s been a while though, so I’ll need to revisit it at some point - I just don’t really have the time right now to learn an entire workflow just to make tweaks to my system. That said, I’m perfectly happy with Bazzite on my ROG Ally, where I don’t need to tweak the base system (same with my Steam Deck running SteamOS - atomic distributions are great for these devices/use-cases).

    I have also tried out NixOS a few times, but same issue - it requires a lot of time investment to get the hang of the Nix ecosystem. For what it’s worth, I find the idea of atomic distributions intriguing and I see their appeal, but it just isn’t for me at the moment.


  • Russ@bitforged.space to linuxmemes@lemmy.world · Linux For Life · 10 days ago

    If you hit Ctrl+Alt+Delete very quickly in succession (I believe it’s 7 times in a row), it will bail out of the stop job and proceed with shutting down.

    Learned that trick because I was so tired of seeing that happen, ha. During that research I swear I recall seeing that it’s a KDE/SDDM issue, but I might be getting some wires crossed there (so don’t quote me on that 😅).
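
    (If it helps anyone: the behavior I’m describing maps to a couple of knobs documented in systemd-system.conf(5). The values below are just a sketch of where to look, not a recommendation - defaults may differ per distro.)

    ```ini
    # /etc/systemd/system.conf (excerpt, illustrative values)
    [Manager]
    # What to do after rapid repeated Ctrl+Alt+Del presses (the "7 times" trick)
    CtrlAltDelBurstAction=reboot-force
    # Cap how long systemd waits on a hanging stop job before giving up
    DefaultTimeoutStopSec=30s
    ```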




  • You absolutely do not need AI in order to sound different in one context versus another. I mean, I highly doubt most people on Lemmy speak to their bosses in the exact way that they write their comments here.

    Hell, I’d be surprised if they spoke to their friends and family the same way all the time (yes, I’m aware that you can generally be more lax around friends - but there’s a time and place for it, whereas comments on message boards tend to just be lax all the time).

    That very concept has been around far longer than “AI” has.


    I really don’t think there was any malice intended by them. Pretty sure the intent was more along the lines of “Yes, it has gotten better. Here’s a quick demonstration using the current conversation as context.” (which reads very similarly to what they said).

    They could’ve left it at “yes, it’s gotten better”, but I suppose it’s similar to the idea that a picture is worth a thousand words - and it’s kinder than “ugh, your grammar is terrible”. Of course, no one should expect perfect grammar on Lemmy or similar platforms.

    (Unless I’m just missing a giant ‘whoosh’ moment here - in that case, I’m sorry)