• 0 Posts
  • 30 Comments
Joined 1 year ago
Cake day: July 10th, 2023



  • I feel the OOP debate got a bit out of hand. I hate OOP as well, as a paradigm.

    But I love objects. An object is just a struct that can perform operations on itself. It’s super useful. So many problems lend themselves to the use of objects.

    I’ve been writing a mix of C and C++ for so long I don’t even know where the line is supposed to be. It’s “C with objects”. I probably use only 1% of the functionality of C++, but that 1% is a huge upgrade from bare C IMO.


  • I was more referring to the fact that everything is immutable by default. As someone who’s just starting to get old (40) and literally grew up with C, it’s just ingrained in me that a variable is… variable.

    If I want a variable to be immutable I would declare it const, and I’m just not used to the opposite. So when playing with Rust, the tutorial said that “most people find themselves fighting with the borrow checker” and sure enough, that’s what I ended up doing!

    I like the concepts behind it, it really encourages writing safe code, and I feel like it’s not just going to be a fad language but will likely end up underlying secure systems of the future. Linux kernel rewrite in Rust when?

    It’s just that personally I don’t have the flow of writing code like I would in C/C++; I’m just not used to it. The scoping, and the way you pass variables and can sort of “use up a reference” so it’s not available anymore, just feels cumbersome compared to passing &memory_location and getting on with it, lol


  • Rust is heresy. Everything should be mutable, the way that God intended it to be!

    Seriously though as someone who has mainly done embedded work for decades and got used to constrained environments, the everything is immutable paradigm seems clunky and inelegant. I don’t want to copy everything all the time.

    Now if you’ll excuse me, these null pointers aren’t going to dereference themselves




  • evranch@lemmy.ca to linuxmemes@lemmy.world · Indeed · 3 months ago

    Interesting, I’ve installed it on quite a few machines now, all with widely varying hardware. Aside from my development/gaming rig I’ve got a shop laptop which is used by various goons to view shop drawings and look up parts, one the ex-wife still hasn’t managed to break, one is my 9-year-old daughter’s, and another is a potato that runs my 3D printer (to be fair, this one is fossilized and doesn’t get updates).

    All are working great with no setup effort and no maintenance, so I guess it’s a classic case of YMMV. I wouldn’t have used Arch for any of those use cases, except maybe the 3D printer.



  • Used to, for one package: stupid tax-filing software that won’t run under Wine, likely because it’s shitty garbage that was written in VB. The forms don’t reflow properly.

    I had enough of the two systems trying to clobber each other’s bootloaders, and this year I’m running Tiny10 in a VM instead. The forms STILL don’t reflow properly in anything except VMware. Don’t ask me why; it’s financial software, and it always comes out broken and gets patched just in time to file before the deadline.

    Steam’s Proton and modern AMD drivers have been super effective in allowing me to do all my gaming on Linux now, and all my dev work always was. Don’t see much reason for Windows these days.


  • evranch@lemmy.ca to linuxmemes@lemmy.world · Indeed · 3 months ago

    This is why I run Manjaro, which I never hear any love for here for some reason. It has the rolling releases and cutting-edge updates of Arch, but with the ease of use and reliability of Debian. Insert a bootable USB and you have a fully functional system in a couple of minutes.

    Manjaro just works, from gaming to development, and I’ve never been forced to jump through hoops to install a hardware driver or a newer library that isn’t part of the release, like with Debian or Ubuntu.

    I’ve been using Linux for over 20 years and have never seen a distro so trouble-free.




  • evranch@lemmy.ca to linuxmemes@lemmy.world · me♾️irl · 4 months ago

    I’ve never run Arch itself but have been super happy with Manjaro. They do the testing and batch up the updates for you. Six months in on several different machines with no issues at all; honestly better than any Debian-based desktop I’ve run.

    Almost anything I’ve ever wanted has been in either the main repo or AUR, no more hassle with stale versions of this or that when I want to run some hot new software of the week. Everything just works.

    However, as mentioned elsewhere, it’s all Debian all the time for my servers, where stability is the name of the game.


  • evranch@lemmy.ca to linuxmemes@lemmy.world · Here we go · 5 months ago

    That doesn’t sound too different from the regular Unix paradigm where all your config is stored in your home directory. I’ve wiped my root partition many times over the last decade, but usually everything in my desktop environment is just the same as it was, aside from the migration of dotfiles into ~/.config, which was honestly overdue.

    Unless NixOS is kind of like Ansible and is a build script for the whole system, package management and all? Haven’t tried it myself.

    My concern would be slow buildup of unused packages if that’s the case. It’s nice to wipe out that junk on an upgrade.


  • A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models, you have to review basically all output, as they can confidently introduce small errors. Often the code will compile and run, but with subtle mistakes that cause output to drift, or the aforementioned long-run overflow type errors.

    Those are the errors that junior or lazy coders will never notice and will walk away from, causing hard-to-diagnose failures down the road. And since the code “looks fine”, reviewers would need to really go over it with a fine-toothed comb, which only happens in critical industries.

    I will only use AI to write comments and documentation blocks, and to get jumping-off points for algorithms I don’t keep in my head (“write a function to sort this array”). It’s better than Stack Exchange for that, IMO.


  • I tried using AI tools to do some cleanup and refactoring of some legacy embedded C code and was curious if it could do any optimization or knew any clever algorithms.

    It’s pretty good at figuring out the function of the code and adding comments, and it did some decent refactoring of some sections to make them more readable.

    It has no clue how to work in a resource-constrained environment, or about the main concepts that separate embedded from everything else: namely, that the code has to be able to run “forever”, operate in realtime on a constant flow of sensor data, and that nobody else is taking care of your memory management.

    It even explained to me that we could do input filtering by using big arrays for simple averaging on a device with only 1 kB of RAM, or use a long long for a never-reset accumulator without worrying about what will happen because “it will be years before it overflows”.

    AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems the whole world is going to get a lot more glitchy.
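    For contrast, the standard embedded answer to the filtering problem above is an exponential moving average, which needs a single accumulator rather than a big sample array and stays bounded, so it can run “forever”. This is a generic sketch, not the actual code being discussed:

```c
#include <assert.h>
#include <stdint.h>

/* Fixed-point exponential moving average: the usual embedded
 * alternative to averaging over a big sample array. State is one
 * 32-bit accumulator, so it fits a 1 kB RAM part and never grows.
 * EMA_SHIFT picks the smoothing factor alpha = 1 / 2^EMA_SHIFT. */
#define EMA_SHIFT 3

typedef struct {
    int32_t acc; /* filtered value, scaled by 2^EMA_SHIFT */
} ema_t;

static void ema_init(ema_t *f, int16_t first_sample) {
    f->acc = (int32_t)first_sample << EMA_SHIFT;
}

static int16_t ema_update(ema_t *f, int16_t sample) {
    /* acc += alpha * (sample - filtered): bounded by the sample range
     * scaled by 2^EMA_SHIFT, so there is no long-run overflow. */
    f->acc += sample - (int16_t)(f->acc >> EMA_SHIFT);
    return (int16_t)(f->acc >> EMA_SHIFT);
}
```

    With EMA_SHIFT = 3 the filter behaves roughly like averaging the last 8 samples, using a few bytes of state instead of a sample buffer.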



  • We’re talking about replacing lost content here, though. As such, you can treat the streaming services as a “backup” by re-ripping your whole collection if you lose it.

    I’m actually doing this now as part of a library cleanup. Zotify + beets are a great combo to pull down vast quantities of music and properly sort and tag it.

    Then I stream it to my phone in my truck using Ampache and Ultrasonic, which does have a local buffering option.

    However, if you have some exotics that you ripped from rare discs, demos or prereleases, live recordings with sentimental value, etc., I would suggest keeping those properly backed up. I don’t have many of these, but the ones I do have are backed up both to the cloud and offsite.