• 0 Posts
  • 23 Comments
Joined 1 year ago
Cake day: June 20th, 2023


  • I was on the phone with our ISP after our internet service went out. The rep asked me if the box had a green light on it - yes - then asked me to plug a light into the same outlet and confirm the power was on. I said, “Look, I understand you have to follow a script, but you literally just asked me to confirm the power light on the box was on. Clearly the power is working.”

    Same ISP sends me an email whenever we have a power outage letting me know that our internet might not work when the power is out. (I’ve joked that this email arrives before the ceiling fans have come to a stop.) But when my internet goes down, they’re completely clueless. “Ohhhh it must be that your power is out even though we monitor that closely and aren’t showing a power outage right now!”



  • This is a tough one. The problem with local-only backups is: what if there’s a fire?

    I use Amazon Glacier to store my pictures. It’s $0.0036 / GB per month, so I pay less than $2/month for ~535 GB of storage that I’m using right now. There is also a cost for downloading, but if I need it, I’m going to be happy to pay it (and the costs aren’t crazy). Uploads are free.

    (The other problem with Glacier is that it’s not really an end-user-friendly experience, nor is it something easily automated. I use SimpleAmazonGlacierUploader, a Java program someone wrote, to do it. You can also upload to S3 and have it archive things to Glacier automatically - I’ve never tried this but it should work.)
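
    For the S3 route, my understanding is that you set a lifecycle rule on the bucket that transitions objects to Glacier, and then ordinary uploads become the archival step. Something like this sketch with boto3 - untested by me, and the bucket name and one-day delay are placeholders:

    ```python
    # Sketch only (I haven't tried this): set an S3 lifecycle rule that moves
    # everything in a bucket to the GLACIER storage class a day after upload.
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "my-photo-archive"  # placeholder bucket name

    s3.put_bucket_lifecycle_configuration(
        Bucket=BUCKET,
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-to-glacier",
                    "Status": "Enabled",
                    "Filter": {"Prefix": ""},  # apply to the whole bucket
                    "Transitions": [{"Days": 1, "StorageClass": "GLACIER"}],
                }
            ]
        },
    )

    # After that, a plain upload lands in S3 and transitions on its own.
    s3.upload_file("IMG_1234.jpg", BUCKET, "pictures/IMG_1234.jpg")
    ```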

    I considered getting my brother or a friend to build two storage servers (with RAID5 or something) that we’d each keep at home and just sync to each other. That’s a good option if you have a friend or family member willing to do it (or at least host your offsite box). Downsides: the cost to build it, the time to build and maintain it, the cost to replace things that break, plus electricity. I’ve been using Glacier for many years, so by now that theoretical backup system might have cost less overall, but I also haven’t had to worry about it.


  • I was mistaken in my earlier comment. It’s my desktop machine that has 50 GB for everything but /home, and it’s running very short on space. The server, which is what I was actually thinking of, has 25 GB for / (not including /var, /tmp, or /home), and I’m using just over half of that space.




  • I have to admit, I only have the barest understanding of this flatpak, snap, docker, etc. business. I’ve been using Linux since the late 90s and missed this development. I haven’t been following what’s happening on the development end, I suppose - in part because there hasn’t been much need to; Linux has gotten so good.

    But while I’m a power user (hey, I used Slackware until about 2015), I’ve found that I much prefer not having to spend hours and hours administering my machines every few weeks or months.

    Sorry for the long comment. But this has been bugging me for a long time and you triggered me. No need to try to answer my questions if you’re not feeling it, I’m just dumping. You can stop reading here, if you like.

    I’ve used a few AppImages for limited cases - BalenaEtcher to burn HomeAssistant onto SD cards for my Raspberry Pi, and I think the scanning software I use is also an AppImage. The idea of bundling the libraries and binaries together is a good one for certain cases, such as software I don’t use regularly. BalenaEtcher strikes me as a perfect use case for AppImage, because I don’t want to spend time installing it, keeping it up to date, and uninstalling it when I’m likely only going to use it once or twice. I never even move it out of my Downloads directory - just download, run, and delete.

    Ubuntu (I use Kubuntu) moved Firefox to a snap some time back. I get it, sandboxing, not a bad idea. But I’m pretty sure I had one copy installed by root and one erroneously installed by my user account, possibly from forgetting to “sudo” during an update one day (I’m really not sure how it happened). And that latter one, if it existed, was almost certainly sitting in my /home directory somewhere, because my user account doesn’t have authority to write to /usr or /opt or anywhere like that. I didn’t plan to install software in /home, didn’t allocate space for it, and don’t really like the concept in general. (I’ve since switched to debs for Firefox, and I think I got the snaps cleaned up.)

    If we’re going to do images installed by users, /opt seems like a much better choice, albeit with some controls - maybe /opt/username/ with permissions set per user; I’d be okay with the user account being able to install there while being unable to screw up system files. My current backup strategy involves grabbing everything in /home with a few very specific exceptions, and clearly I don’t need the current release of Firefox in my backup.

    I have OpenProject (community edition) installed for keeping track of a restoration project I’m working on, and I’m pretty sure I used docker to install it. I have to admit it was easy to install (but so are debs 99.9% of the time), but now I’m wondering about the best way to get the data back out so I can migrate that software to my server (it’s running on my desktop because my server was that 2008 computer). I assume I can back up and restore, but I haven’t yet looked into this. Or heck, maybe it’s possible to just move the Docker image, the way I moved the HomeAssistant KVM image. It looks like the data is stored in a separate volume (which I interpret to mean a file that acts as a virtual disk, similar to how KVM has a virtual disk for the OS and apps in the virtual machine). Also, I’m not clear whether docker images update automatically or whether I should be updating them manually.
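
    From what I can tell (I haven’t actually tried this yet), the usual trick for getting data out of a named volume is to mount it into a throwaway container and tar it up. Roughly this, using the Docker SDK for Python - the volume name here is a guess, check `docker volume ls`:

    ```python
    # Rough sketch, untested: tar up a named Docker volume by mounting it
    # read-only into a throwaway Alpine container.
    import docker

    client = docker.from_env()

    VOLUME = "openproject_pgdata"     # hypothetical volume name
    BACKUP_DIR = "/home/me/backups"   # host directory where the tarball lands

    client.containers.run(
        "alpine",
        "tar czf /backup/openproject-volume.tgz -C /data .",
        volumes={
            VOLUME: {"bind": "/data", "mode": "ro"},
            BACKUP_DIR: {"bind": "/backup", "mode": "rw"},
        },
        remove=True,  # delete the helper container when it exits
    )
    ```

    Restoring on the server would presumably be the same thing in reverse: create the volume there and untar into it.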

    Then there’s Zwift. Zwift is a virtual cycling program that runs on Windows, Mac, Android, and iOS. No Linux client, which isn’t a surprise. I have a whole Windows 10 computer in the basement that only runs Zwift, and it’s the only Windows machine I use. But someone created a docker image of Zwift! I tried it on my Linux desktop machine a while back, and it worked. Very cool! But Zwift updates the program regularly, introducing new bugs and features - does the maintainer of that image have to do anything when that happens? What if he or she loses interest? It’d be nice to ditch Windows, but I have no idea if that docker image will remain usable indefinitely.

    I think Zwift runs under Wine there. So it seems the docker image bundles the Zwift Windows client and some Wine libraries, while everything underneath Wine is supplied by the Kubuntu install…but I’m really not sure. Theoretically I don’t need to know, until something breaks.

    I have yet to use a flatpak, I think.

    I’ve considered asking about all of this in the Linux community here on Lemmy, but there’s probably an article with an overview of it somewhere, and I just need to search for it.


  • I was starting an install of Debian the other day, and it suggested 25 GB as the root partition (including /usr, but not /var, /home, or /tmp). I had to laugh. My server has a 50 GB partition for that purpose and it’s around half full.

    I aborted the installation. Might try again later today. (Switching this machine from Kubuntu, using a new drive, so it’s not critical that it be done at a certain time.)


  • It’s never perfect!

    I have a temperature sensor in a box here that I haven’t yet installed…hmmmm, where could I put it?

    Actually, if Shelly ever gets me the stuff I ordered IN NOVEMBER, I would replace the one for the pellet stove with that…but last night I saw a Tuya Zigbee air quality monitor on Amazon for under $25 that would be even better…

    Oh wait, I installed a Zigbee switch the other day but haven’t added the lights it controls to my floor plan view!


  • Wellllllll it depends on what you want to do. I have some devices from Shelly (a company I’m currently pissed off at) that allow me to make a lot of devices smart. For example, I’m using one as a thermostat for our pellet stove, controlled by HA, with a fairly complex script setting the pellet stove temperature based on the time of day, outside temperature, and other factors (roughly the shape of the sketch below). The Shelly switches come with firmware that’s pretty good, but you can install Tasmota (I think) on them, which is open source (I still use the original Shelly firmware). Those are Wifi devices.
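
    The script itself lives in HA, but the gist of the setpoint logic is roughly this shape (a simplified sketch, not my actual automation - the numbers and the “someone home” factor are made up for illustration):

    ```python
    # Simplified sketch of the pellet-stove setpoint logic; values are invented.
    from datetime import time

    def pellet_stove_setpoint(now: time, outside_temp_c: float, someone_home: bool) -> float:
        """Pick a target room temperature for the pellet stove."""
        if not someone_home:
            return 15.0                      # away: just keep the chill off
        if now >= time(22, 0) or now < time(6, 0):
            base = 17.0                      # overnight setback
        else:
            base = 20.5                      # daytime comfort temperature
        if outside_temp_c < -10:
            base += 0.5                      # nudge it up when it's bitter outside
        return base
    ```

    Roughly speaking, HA then switches the Shelly on when the room sensor reads below the setpoint and off once it’s above it, with a little hysteresis so it doesn’t chatter.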

    Similarly, I have some power-monitoring plugs (the washer is plugged into one to monitor power usage and alert me when the cycle is done - basically the threshold check sketched below), and those are made by a company called Sonoff. They’re (relatively) easy to flash to Tasmota; I’ve gotten to the point where I can do one in a few minutes, without soldering. I have another Sonoff device for a ceiling fan that I flashed Tasmota onto, and it also works well, with one limitation (in the hardware, not the software): the light is only on or off, no dimming. These are also Wifi devices.
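
    The “cycle is done” alert just watches the plug’s power reading; the underlying check is basically this (a sketch - the threshold and timing values are invented, not pulled from my actual automation):

    ```python
    # Sketch of the washer-done check; threshold and sample counts are invented.
    def washer_finished(power_samples_w, running_threshold=10.0, idle_samples=12):
        """power_samples_w: recent power readings in watts, newest last.
        Finished = power was above the threshold at some point, and the
        last `idle_samples` readings have all dropped below it."""
        if len(power_samples_w) <= idle_samples:
            return False
        recent = power_samples_w[-idle_samples:]
        earlier = power_samples_w[:-idle_samples]
        was_running = any(p > running_threshold for p in earlier)
        now_idle = all(p < running_threshold for p in recent)
        return was_running and now_idle
    ```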

    Zigbee and Z-Wave devices are supported too, and since those devices don’t have direct internet access, they don’t raise the same sort of concerns as Wifi devices.

    Proprietary stuff is supported as well, mainly because there’s so much of it - the big name is Tuya, a Chinese company that makes a ton of cheap Wifi devices (also Zigbee, but those are less of an issue). Some of them can be flashed with something more open like ESPHome and some cannot; it depends on the device. The default for them is cloud control, but there are various options for controlling them locally.

    My light switches and most of my temperature sensors are Zigbee. As mentioned I have some Wifi devices for various things.

    Things like thermostats: you’re basically stuck with proprietary options because there aren’t many open source ones. I mentioned the pellet stove, which I control with the Shelly paired with a Zigbee temperature and humidity sensor (Aqara brand), and it works well, but I’m not sure I’d be comfortable turning my house’s main HVAC over to HA - it’s more complex, and if HA goes down my house is going to get hot or cold. I use an ecobee, but other brands are supported as well.






  • So they are producing drivers for their cards, and I still don’t understand your comment. They’re not fully open source, which is a valid concern, but you said they “often don’t give much of a shit about linux”…they’re literally producing drivers for their cards for Linux, just like they do for Windows. I’m not sure what else you want them to do.



  • Yeah, very odd. A few weeks ago, I retired a computer with 4 GB of RAM that had been doing server duties, running Debian. It was doing a great job until I tried running a virtual machine on it (for Home Assistant); that was just killing the poor thing. The processor was a Core 2 Quad introduced in 2008, so I got plenty of life out of that setup.