• 0 Posts
  • 13 Comments
Cake day: August 6th, 2023


  • Yes, absolutely. And they can drag Canonical into it as well if they wish, though it’s harder. Being UK-based doesn’t protect them from the long arm of US law: arresting any US personnel, freezing and seizing their funds, putting out arrest warrants for those in the UK, and harassing them with the fear of arrest and rendition to the US if they travel to a third country (for a conference, a vacation, etc.; most would buckle rather than live under that). Additionally, the US could sanction them for non-cooperation by making it illegal for US companies to sell them products and services, for US citizens to work for or aid them, and so on.

    They can go after community-led projects too: just send the feds to the houses of some senior US developers and threaten and intimidate them, hinting at imminent arrest and a prison sentence unless they stop contact and work with parties from whatever countries the US chooses to name. Raid their houses, seize their electronics, detain them for hours in poor conditions. There are lots of ways to apply pressure that don’t even have to stand up to extensive legal scrutiny (they can keep devices and other property, and the people would have to sue to get them back).

    The code itself likely exists in multiple places, so if someone wanted to fork from, say, next week’s builds for an EU build, they could, and there would be little the US could do to stop that. But it could stop cooperation and force these developers to apply technical measures that attempt to block downloads from IP addresses known to belong to sanctioned countries of its choosing.

    It’s not as if the US can slam the door, take its Linux home, and leave China, the EU, and Russia with nothing. They’d still have old builds and code and could develop off those, though with international cooperation broken it would be a fragmented process prone to various teething issues.


  • If you’re just backing up and not serving this data, just get 2-3 4TB drives (new, recertified, or used) and an external dock. Test each drive, back up to it, then test again, checking SMART both times. Place one drive with a relative or trusted friend. Connect and power up each of the drives at least once annually, refresh the data with anything new at that time, check the SMART stats, and consider running at least a quick SMART test to ensure none are mechanically failing, then back to being unplugged. Really, every 3-6 months would be ideal for powering on and checking SMART, but I wouldn’t pester a relative that often for the off-site one; 1-2 times a year should be fine for that.

    This strategy protects you from cryptolocker malware by not leaving any of the copies live and accessible.

    • What’s the cheapest and most flexible NAS I can make from eBay or local? What kind of processors and what motherboard features?

    Cheapest or most flexible: choose one. If you want the absolute cheapest, but not that flexible, buy a used office PC; ThinkCentre and Dell OptiPlex machines are the most reliable. Depending on the model they may accommodate anywhere from 1 to, if you’re lucky, 4 drives (though commonly only 2) via that many SATA ports (often half the ports are SATA 1.0/2.0 intended for DVD drives, so you may not get full speed). Finding space inside for more than 1 drive can also be a problem depending on form factor, but mid-tower models often have room for 2, with space for a third lying on the case itself if you really want to push it.

    Most flexible, I suppose, would be someone else’s old NAS build; a used case with room for at least 4 3.5" drives gives you a little room to expand.

    • What separate guides should I follow to source the drives? What RAID?

    You don’t need RAID; it’s not a backup solution. RAID is for high data availability and integrity. If you really want to, you can set up a RAID 1, I suppose, though know this means you’d require at minimum 4 disks for your data and one copy, and 6 disks for two copies.

    As to sourcing the drives, there are various companies; ServerPartDeals is a well-known and decent one, though their presently available sizes may be larger than what you’re after. Whether the drive is brand new, recertified, or bought used on eBay, the recommendation is: test, test, test. Even new drives can be bad. Run a full SMART test at least once, check the SMART data, and make sure there are no failure indicators. If you want to be really thorough: check the SMART data when you get the drive and note anything concerning, run an extended/full SMART test, then format the drive with the quick-format option unchecked (the slower option writes zeros across the whole drive), fill it with your data, run another extended/full SMART test, and check the SMART values once more before putting it away. Re-test and check SMART at least annually if you’re keeping the drives cold.
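    Those recurring SMART checks can be scripted. A minimal sketch, assuming the plain-text attribute table printed by `smartctl -A` (smartmontools); the attribute names and sample output here are illustrative, and real output varies by drive vendor:

```python
# Flag worrying SMART attributes from `smartctl -A` text output.
# Attribute names below are common pre-fail indicators; vendors vary.

WORRY = {"Reallocated_Sector_Ct", "Current_Pending_Sector",
         "Offline_Uncorrectable", "Reported_Uncorrect"}

def failing_attributes(smartctl_output: str) -> dict:
    """Return the worry-list attributes whose RAW_VALUE is nonzero."""
    flagged = {}
    for line in smartctl_output.splitlines():
        parts = line.split()
        # Attribute rows: ID# NAME FLAG VALUE WORST THRESH TYPE UPDATED WHEN_FAILED RAW_VALUE
        if len(parts) >= 10 and parts[0].isdigit() and parts[1] in WORRY:
            try:
                raw = int(parts[9])
            except ValueError:
                continue  # raw values like "0/0" need vendor-specific parsing
            if raw > 0:
                flagged[parts[1]] = raw
    return flagged

sample = """\
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       8
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       0
"""
print(failing_attributes(sample))  # {'Reallocated_Sector_Ct': 8}
```

    A nonzero reallocated or pending sector count on any of the annual checks is your cue to replace that drive.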

    • What backup style should I follow? How many cold copies? How do I even handle the event of a fire?

    At least two copies, ideally three, with at least one copy off-site for things such as fire. If you don’t have a relative, friend, or workplace where you can stash an off-site copy, your option is basically cloud backup, which for 4TB wouldn’t be too costly. Backblaze Personal ($100 a year) would allow this much IF you keep one copy connected to a computer that runs their app and is turned on at least monthly. Note they will delete your data if you go more than 30 rolling days without syncing, so if there is a disaster you have limited time to either get another drive and download it again, or contact them and pay to have a copy shipped to you before it’s deleted.

    You could also, I suppose, invest in a fireproof safe, though that doesn’t protect against a burglary where they steal your safe thinking it has valuables in it. You really need a copy off-site. Another option would be a bank safe-deposit box, though that’s probably more costly.

    One way to get friends to help is to buy more storage space than you need, say two 8TB drives, and offer to back up a copy of their stuff at your house. That way you have a copy of their stuff plus yours at your house, and they have the same copy at theirs. You could also use separate drives.

    • Most are redundant video files that are in old encodings or not encoded at all

    All re-encoding, unless it’s from lossless to lossless, induces degradation. For archival purposes I’d suggest against re-encoding unless it’s to another lossless format, or the source is lossless or very high bitrate (>20 Mbps video for SD or 1080p HD) and you’re keeping a high bitrate in the new encoding. Also avoid hardware encoding; it’s faster but introduces more degradation and is less precise than software encoding. Removing duplicates is another matter.
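    One way to check whether a file clears that high-bitrate bar is to compute its average bitrate from size and runtime. A rough sketch (this gives the overall bitrate including audio and container overhead, so the pure video bitrate is a bit lower):

```python
def avg_bitrate_mbps(file_size_bytes: int, duration_seconds: float) -> float:
    """Average overall bitrate in megabits per second (decimal megabits)."""
    return file_size_bytes * 8 / duration_seconds / 1_000_000

# A 25GB rip of a 100-minute feature:
print(round(avg_bitrate_mbps(25_000_000_000, 100 * 60), 1))  # 33.3 (Mbps), comfortably above 20
```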


  • MakeMKV can rip the DVDs without touching the contents. I’d suggest either an ISO or, more helpfully, the contents in folder layout, which should be preserved under a top-level folder named for the disc with the .vob files at the bottom level.

    You certainly can use Handbrake, but it re-encodes, and if you have no experience it’s easy to mess up (among other things, de-interlacing doesn’t always work right without tweaking, so for archiving it’s typically best not to re-encode DVDs before sharing).

    -If- you do choose to use Handbrake (again, I wouldn’t recommend it for archiving; it takes skill, and there’s a reason full DVD rips are still useful to people today while someone’s best-attempt AVI file from 15 years ago looks awful and is considered useless given the low bitrates and old codecs), I’d plead that you use software, not hardware, encoding: choose x264 or x265 (10-bit for x265), use the slow preset at CRF (constant quality) 16, make sure de-interlacing is set right on auto, and pass through the audio in its original Dolby Digital as well as the VobSub subtitles. But it really is best not to encode and just copy.

    You can share directly to the DHT swarm by just creating your own torrent, and eventually people will find it, assuming it’s named correctly in the format <movie name (year)>.
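    For reference, a .torrent file is just a bencoded dictionary, and a trackerless (DHT-only) one simply omits the `announce` key. A minimal sketch of the bencoding itself — in practice use an existing tool (e.g. mktorrent), which also computes the piece hashes this skeleton omits:

```python
def bencode(value) -> bytes:
    """Encode ints, str/bytes, lists, and dicts in bencode format (BEP 3)."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        # Per the spec, dictionary keys are byte strings in sorted order
        items = sorted((k.encode() if isinstance(k, str) else k, v)
                       for k, v in value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(type(value))

# Skeleton of a trackerless torrent's metadata (piece hashes omitted for brevity):
meta = {"info": {"name": "Movie (1999)", "piece length": 262144}}
print(bencode(meta))  # b'd4:infod4:name12:Movie (1999)12:piece lengthi262144eee'
```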

    Don’t duplicate other people’s work if you can help it. There are various sites for sharing this type of stuff; I don’t want to get in trouble so I won’t name the one I have in mind, but there is one listed in the piracy community megathread, a Russian, semi-private one. I would search disc titles first to make sure what you’re doing doesn’t already exist, and focus on archiving and sharing original, non-re-encoded copies of discs that don’t presently exist elsewhere.


  • Interesting project. Thanks for the link and I do appreciate it and could see some very good uses for that but it’s not quite what I meant.

    Unfortunately, as it notes, it works as a companion to reverse proxies, so it doesn’t solve the big hurdle there: handling a secure, working flow (specifically ingress) of Jellyfin traffic into a network as a turn-key solution. All this does is change the authorization mechanism, but my users don’t have an issue with writing down passwords and emails. It still leaves the burden of:

    • choosing and setting up the reverse proxy,
    • certificates for that,
    • paying for a domain so I can properly use certificates for encryption,
    • making sure that works,
    • the chore of updating the reverse proxy, refreshing certs (and things breaking if we forget or the process fails), etc.

    That’s a hassle and a half even for technically proficient users, and the point at which most other people would give up.

    By contrast with Plex how many steps are there?

    1. Install (going to skip media library setup as Jellyfin requires that too so it’s assumed)
    2. Set up any port settings, open any relevant ports on firewall, enable remote access in setting with a tickbox
    3. Set up users
    4. Done. It now works and doesn’t need to be touched; it will handle connecting clients directly to the server. Users just install the Plex client, log in to their account, and they have access.

    By contrast, this still requires the hoster to set up a reverse proxy (a major hassle if done securely with certificates, plus the expense of a domain, which works out to probably $5 a year), then have their users point their Jellyfin clients at a domain name (possibly a hard-to-remember one, as majesticstuffbox[.]xyz is a lot cheaper than the .com/.org/.net equivalents or a shorter, more to-the-point domain), authenticate, and so on. It’s many, many more steps, more software, more configuration, and more chances for the hosting party to mess something up.

    My point was that I and many others would rather take the $5 a year we’d spend on a domain name and pay it for this kind of turn-key solution for ourselves and our users, even if provided by a third party. Were Jellyfin to integrate this as an option, it could provide some revenue for them and bring the kinds of people who don’t want to mess with reverse proxies and certificates into their ecosystem and off Plex.


  • There is AFAIK no way to do this.

    Apple has never open-sourced the APIs and interfaces, and they only work on Macs and Windows. For this you will need either a Windows install (I recommend a separate drive so it doesn’t break the Linux bootloader) or a Windows VM (persistent or not) with USB passthrough. I’m not even sure how well the VM route works, but it probably should. You don’t even have to have a license for Windows; you can just run it in the VM for this purpose alone, but it does mean setting aside at least 40GB on your drive for the VM image, plus more if you want to do things like back up the phone.


  • Jellyfin needs to partner with someone people can pay a very low, reasonable, and/or one-time fee to enable remote streaming without the fuss of either dangerous port forwarding or the complexity of reverse proxies (paying for a domain name, the setup itself including certificates, keeping it updated for security purposes).

    And no, a VPN is not a solution. The difficulty for non-technical users of setting up a VPN is too high (if it’s even possible: on smart TVs it almost never is, and I don’t think devices like the Apple TV and other streaming boxes often support them), and it’s an unwanted annoyance even for technical users.

    I’m not talking about streaming videos through someone else’s servers or using their bandwidth. I’m talking about the connection phase between clients and servers, where Plex acts like an enhanced dynamic-DNS service with authentication. An agent on the local media server sends the third party’s remote web service the IP address, the port configured for use, the account or server name, etc. When a client tries to connect, it goes to this remote web service with the server-name/username info; the web service authenticates the client, then gives it the current IP address and any other information necessary. The service also sends some data about the connecting client to the local Jellyfin server to enable the connection, and from then on the local Jellyfin server and the client talk directly and stream directly.
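    The rendezvous flow described above can be sketched as a toy in-memory service. All names and fields here are mine, purely illustrative of the concept — this is not Plex’s or Jellyfin’s actual API:

```python
# Toy rendezvous service: media servers register their current address;
# authenticated clients look it up, then connect to the server directly.
# Streaming never touches this service, only the address lookup does.

registry = {}   # server_name -> (ip, port), refreshed by the server's agent
accounts = {}   # auth token -> set of server names that user may reach

def register_server(server_name: str, ip: str, port: int) -> None:
    """Agent on the local media server phones home with its current address."""
    registry[server_name] = (ip, port)

def lookup(server_name: str, token: str):
    """Client asks where the server currently lives; auth is checked first."""
    if server_name not in accounts.get(token, set()):
        raise PermissionError("not authorized for this server")
    return registry[server_name]

accounts["alice-token"] = {"home-jellyfin"}
register_server("home-jellyfin", "203.0.113.7", 8096)
print(lookup("home-jellyfin", "alice-token"))  # ('203.0.113.7', 8096)
```

    The per-lookup cost is a dictionary read, which is why the hosting burden for such a service would be tiny compared to relaying any actual video.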

    Importantly, the cost of running this authentication and IP-address-tracking scheme would be minimal per Jellyfin server. You could charge $5/year for up to 20 unique remote clients and come out ahead with a slight profit, which could be put back into Jellyfin development and things like their own hosting costs for code. Even better, if they offered a lifetime option at $60-$80, they’d get a decent chunk of cash up front to use for development (with reasonable use restrictions per account, so someone hosting on Hetzner or wherever and serving 300 people with 400 devices would need to pay more, because they’re clearly doing this for profit and can afford to throw more money at Jellyfin).

    Until Jellyfin offers something that JUST WORKS like that, it’s not going to be a replacement for Plex. Whatever other improvements they offer users, it’s still a burden for the server runner to set up remote streaming in a way that isn’t incredibly dangerous (port forwarding), doesn’t involve paying money to third parties and/or the trouble of running your own reverse proxy, and doesn’t involve walking users through a complicated setup process for each device that you have to repeat if you change anything major, like your domain name when using a VPN.


  • This is most likely cached images: for example, emojis from your instance or other instances you’ve viewed, and possibly other things like post thumbnails, but definitely emojis. Caching helps reduce costs by ensuring your instance doesn’t have to re-send you the same images over and over each time you close the browser.

    Lemmy doesn’t yet generate enough content daily, so most people who check in twice a day and scroll a bit through the pages will almost certainly encounter several posts with images they’ve already seen before. I’ve had many cases where just a bit of scrolling brings up 3-4-day-old posts I’ve seen before, so caching the associated images could save at least 3-4 transfers of those images per user in cases like that, which adds up for a non-profit, ad-free service like Lemmy.


  • It just does more, and more easily. It styles things better and makes them look more professional with a click. It can do certain things, like nested tables in Word, that Writer cannot. Excel is much more powerful than Calc: it has more functions and more refined ones, it’s easier to work with, and it has more and prettier chart options. And you can create sortable tables in Excel. There are many other cases.

    Now, for the last two, the die-hards will whine and whinge about how you should just use dedicated software for creating charts and a database, but sometimes you just want to make something quick; sometimes that’s overkill for what you need. Grandpa doesn’t need to learn how to deal with databases just to make a sortable list of books he’s read. He can just use Excel, and the LibreOffice people telling him to pound sand, because they won’t add that feature to Calc since it “doesn’t belong there,” means he and many other people don’t use Calc; they use MS Office. Likewise, when the LibreOffice defense force tells people making graphs and charts to just use dedicated software: many corporate types, business people, and white-collar workers don’t understand those tools and may not be able to get them installed. What they understand, and what they already have, is MS Office, and it works and has lots of pretty, professional, very slick options that don’t make them look bad in office meeting presentations.

    Just on the sortable-tables front, I can’t tell you how many times I’ve run into hobby projects based on an Excel file with tables that rely on being sortable, from stat-sheet creators to fan-made mini-databases (<2000 rows) for some game.

    It’s useful for those who need the very bare basics: being able to open and read simple MS Word documents, CSV files, and Excel files, and to write an occasional letter. But the moment you need to go beyond basic formatting, or deal with files that have it, you run into issues.

    You have this gulf of usability. It’s useful for people at the very bottom of the basic-needs pole, the barely computer-literate types who think Facebook is the internet, and it’s useful for highly technically competent people who can and do use other dedicated software, often without GUIs, to solve problems. It’s a frustration for the middle 60% of the population: more than basically computer literate, but not scientifically trained, not CS or IT.



  • i haven’t yet encountered an AP that is capable of providing all of the features that i currently use. ie ad blocking; personal vpn;

    pfSense does both of these. pfBlockerNG in particular is a very powerful network ad blocker with lots of lists. pfSense can also run VPNs; it supports OpenVPN and WireGuard in both client and server mode, and you can set up multiple instances, e.g. one client and one server.

    web hosting; and cloud-like internet accessible storage via ssh tunnel (in addition to others).

    If you just need personal services, it would be best to run something local: set up a WireGuard tunnel on pfSense that gives access to your network, and VPN in to access things remotely. If you need to share with others, I suppose this can become a problem.



  • If the computers have any value, it would be better to just buy a new, quality, modern ATX power supply of the right size for the case (take dimensions for fit, and ensure it’s rated for at least as many watts as the old one, though it can be more) and do a drop-in replacement. Just make sure the power supply comes with some Molex adapters; if it’s a semi- or fully modular design, the maker usually sells additional cables, so you can buy more Molex plugs if the 3-8 they give you aren’t enough.

    That said, power supplies can of course be repaired by anyone with soldering skills and sufficient electrical-engineering knowledge and experience. They shouldn’t be repaired by amateurs, because they can store enough charge to kill or maim a person who doesn’t know what they’re doing. Unless having all-original/period equipment in the machines is important, though, it would be cheaper to just replace the power supplies.


  • What are you average file sizes for movies and series?

    Movies? 4-8GB for most 60-120-minute features. TV shows: for live action about 800MB/episode, 500MB/episode for animation; for hour-long stuff, probably 1.2-1.6GB/episode.

    What would you do? what’s your target size for movies and series? What bitrate do you go for in which codec?

    HEVC 10-bit. I generally target the 7,000-9,000kbps bitrate range. For animation that can be as low as 3,000-5,000kbps. Sometimes a bit higher for very grainy old films; occasionally a little lower for grain-free modern digital camera work that hasn’t had digital grain added.

    If you want to maximize space savings without losing quality, you have to understand what needs more bitrate and what can do with less. Across the board you could do something like CRF 20, but you’d have outliers that don’t get enough bitrate and others that still end up at 14,000kbps.

    The above is for 1080p content.
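    As a sanity check on those targets, average bitrate maps straight to file size. A quick sketch (decimal units; audio and container overhead add a bit on top):

```python
def size_gb(bitrate_kbps: int, minutes: float) -> float:
    """Approximate file size in GB for a given average bitrate and runtime."""
    return bitrate_kbps * 1000 / 8 * minutes * 60 / 1e9

# 7,000-9,000kbps over a 100-minute feature lands right in the 4-8GB range quoted above:
print(size_gb(7000, 100), size_gb(9000, 100))  # 5.25 6.75
```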

    If you can stand HD video content at 2,000-3,000kbps, more power to you, but on a large TV I can tell. Even being reckless and not caring about future-proofing, I think less than 6,000kbps is a bad idea for anything but TV shows, and even for those, outside animation, you want a minimum of 2,000kbps at 1080p.