  • I hate the fact that none of the big names support CalDAV natively. DAVx5 is cool and all, but app developers really need to step up their shit and support CalDAV already. Not just Microsoft Exchange and Google Calendar but CalDAV as well. It’s not like they need to rebuild their apps from scratch.

    At this point you might just be better served by a web app instead of a native mobile app. Maybe K-9 Mail’s transformation into Thunderbird Mobile will bring some good news, but I’m not holding out much hope.

    Maybe we should, under the EU’s DMA, force anyone that bundles a calendar/note app with their phone OS to support CalDAV as well as any proprietary protocol of their choice.
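
    To be fair, CalDAV itself is not rocket science: it’s basically WebDAV plus iCalendar. As a rough sketch (hypothetical server URL and credentials, using Python’s requests library), pulling every event from a calendar boils down to a single HTTP REPORT request:

```python
import requests  # third-party HTTP library: pip install requests

# Hypothetical CalDAV collection URL and credentials, purely for illustration.
CALENDAR_URL = "https://cal.example.com/dav/calendars/user/personal/"

# RFC 4791 calendar-query REPORT: ask for every VEVENT in the collection.
QUERY = """<?xml version="1.0" encoding="utf-8"?>
<c:calendar-query xmlns:d="DAV:" xmlns:c="urn:ietf:params:xml:ns:caldav">
  <d:prop><d:getetag/><c:calendar-data/></d:prop>
  <c:filter>
    <c:comp-filter name="VCALENDAR">
      <c:comp-filter name="VEVENT"/>
    </c:comp-filter>
  </c:filter>
</c:calendar-query>"""

response = requests.request(
    "REPORT",
    CALENDAR_URL,
    data=QUERY,
    headers={"Depth": "1", "Content-Type": "application/xml; charset=utf-8"},
    auth=("user", "password"),
)

print(response.status_code)  # 207 Multi-Status on success
print(response.text[:500])   # WebDAV XML wrapping plain iCalendar event data
```

    That’s the whole protocol in a nutshell: HTTP requests, a bit of XML and iCalendar payloads. Hardly a rebuild-from-scratch effort for an app that already talks to Exchange and Google Calendar.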




  • From personal experience working in a Microsoft ecosystem, it’s mostly a matter of being able to hire the right people.

    There is a near-infinite source of IT workers that have some expertise with Microsoft software and services. And those kinds of numbers simply don’t exist for the Linux world, especially with all the different configurations out there.

    Medium-sized organizations have to employ a strategy of throwing enough idiots at a problem in order to keep things running. This also creates some of the issues they need the idiots for because no one has detailed knowledge of how things work.

    My attempts at proposing a Linux-based application server have been met with all sorts of “but our domain policy”, “we can’t guarantee continuity”, “none of my people know how to admin this stuff” type responses.

    It definitely is a matter of mindset, but there is also a big commitment to make when switching systems over to Linux. And that is a choice managers will only make if the benefits are clearly illustrated in a business case.


  • While I’m not exactly an expert user of AutoCAD (my background is architecture, industrial design and full stack development), I know enough about the software to tell it’s built on a lot of legacy spaghetti code.

    It’s the same for Solidworks, which I know through and through, including the shitty VBA scripting environment. My CAD teachers always used to say the software is built like a wooden playhouse that has been extended over the years to include a second story, a slide, a swingset and a roof extension. But underneath it all, it is still the same “don’t fix it if it ain’t broke” codebase that Dassault has taken their chances on since the ’90s.

    The second someone invests any kind of money into an open source alternative, the way Blender has done for the mesh modeling industries, both Autodesk and Dassault Systèmes stand to lose their respective monopolies on 2D and 3D CAD.

    But the trend is not limited to CAD software; it is also highly prevalent among software providers for governmental tasks, most of whom sell the same products for years without iterating on their codebase. The result is that government organisations have to deal with shitty software that requires their individual users to connect to the database directly (yes, you heard that right: every user has to manually input database credentials that include all grants on all of the relevant datasets). Most of these cronies are reselling badly thought out software, where they’ve outsourced the development to third-world shitholes. It is a goddamn miracle that there aren’t more major incidents with government organisations.

    The only solution for this kind of bullshit is open standards that encourage an open source approach to these kinds of critical applications, where more parties are actually encouraged to build their own software and where the business model is built around being a service provider rather than a magical black box salesman.

    If you’re able to stop worrying about generating revenue from your intellectual property and focus on generating revenue from the service you provide around your product… you’ll automatically build a better product.



  • I haven’t dabbled that much in PCB design, but I have seen some good things in KiCAD. All my EE homies assure me Altium’s the way to go for now, though. Most of them also happen to be big F(L)OSS nerds, so I’m curious to see where KiCAD goes in the future.

    FreeCAD is an awesome attempt at building a parametric CAD modeler, though it will need a lot of polish to be truly usable; the UX side of things in particular could do with a lot of improvement. As far as I know, the most difficult part of a parametric modeler to program is the actual geometry kernel, which is why so many modelers are based on Parasolid, including the recent hybrid modeler Plasticity. For a F(L)OSS parametric CAD modeler to truly succeed, some genius needs to build an open geometry kernel that performs at least close to on par with Parasolid. But that takes a special kind of autist to pull off. Either that, or the engineering world needs to collectively decide this needs to happen.

    As much as I hope FreeCAD becomes the open source alternative everyone is looking for, it is trying to be everything at once and that might be too ambitious for the current state of the project. I’m secretly hoping we also get a new project sometime soon with a smaller scope.
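
    For what it’s worth, FreeCAD is fully scriptable in Python, which is a nice way to get a feel for what “parametric” actually means: geometry driven by named properties that you can change and recompute at will. A minimal sketch, meant to be run from FreeCAD’s own Python console (document and object names are just examples):

```python
import FreeCAD
import Part  # loads the Part workbench types such as Part::Box

doc = FreeCAD.newDocument("ParametricDemo")

# A simple parametric solid: a plate whose dimensions are plain properties.
plate = doc.addObject("Part::Box", "Plate")
plate.Length = 100.0  # millimetres
plate.Width = 60.0
plate.Height = 5.0
doc.recompute()       # build the geometry from the parameters

# Change a parameter and recompute; the model rebuilds itself,
# which is the whole point of a parametric modeler.
plate.Height = 8.0
doc.recompute()
```

    The heavy lifting underneath is done by the Open CASCADE kernel, which is exactly the part that has a hard time keeping up with Parasolid.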


  • The problem is mostly a lack of competition in specific fields. And the companies that own the monopoly in their respective niches make it so that any form of competition is either…

    • immediately acquired and killed
    • handicapped by market dependencies on patented features
    • unable to generate business because customer processes are completely dependent on proprietary solutions

    Most of these applications have codebases that are FUCKING ANCIENT. Let’s take a look at Solidworks for example, the industry standard for computer-aided design in the manufacturing industry. Under the hood, it’s still the same software from the 1990s. And there is no incentive for Dassault Systèmes to rewrite the codebase.

    Lots of these giant monopolistic software products have turned into Frankenstein-esque monstrosities over the years. I often tell people they are built like backyard playhouses that have been expanded by building an extra story on top, adding a swingset, adding a slide, extending the roof and attaching a rope ladder to the side.

    All of this makes for more functionality, but no one has really thought about the structural integrity of the original playhouse. In a direct parallel, many of these programs have unmaintainable code that no one dares touch because “hey, it works, and we need to keep it that way, because if we break it we’re no longer getting paid”.

    These companies unintentionally hold their business model hostage by choosing profits over innovation and investment in an adaptable codebase.

    Which is why it is near impossible for them to support technologies that differ from their original install base. And it is also why they have every incentive to make sure they stay in the lead: they know damn well that open source movements that get some support and take flight are dangerous to their market share, and by extension their profits.

    Blender is probably one of the best examples of what good open source software will do to an industry. The day someone develops a parametric CAD solution that’s platform agnostic and based on open standards we’ll see a lot of engineers ditch Windows for Linux.


  • I’m not blind to the fact that Windows has a terrible search experience, but I won’t say it’s the worst. Out of the box it is fucking useless, but if you take the time to index specific paths and make use of the metadata options in Explorer, you can actually get some decent results in a reasonable amount of time.

    Apple had the right idea with Spotlight; it’s just sad that neither party manages to properly integrate that functionality into its file manager.

    My search needs are mostly covered by Voidtools’ Everything, which can search the whole NTFS partition in a matter of milliseconds, with realtime results. The caveat is that you have to know the name of the file you’re looking for. Otherwise I just use PowerToys Run for my search and application-launching needs, which is what Spotlight could have been if it were made by passionate nerds.

    I do realise this makes my argument lose its bite somewhat, but it comes down to user experience. MacOS has a terrible out-of-the-box experience that can’t be fixed, so you have to use something else instead. Windows has a terrible out-of-the-box experience that can be fixed to some degree if you take some time to learn how it works, but you can still opt not to bother with any of it and use something else instead.

    Linux was always going to be the clear winner here.

    Now for the dock icon strategy: try doing that repeatedly with multiple instances/windows of the same application and compare the experience to the “never combine, show labels” taskbar in Windows. I guarantee there is going to be a clear winner in terms of usability. As always, under Linux this is not an issue because you can just do whatever the fuck you want.

    The troubleshooting bullshit is a pattern seen in all of Apple’s products. They have a habit of hiding all important information when something goes wrong, and I have had this complaint about every iDevice I’ve had the displeasure of touching over the years. iPhone update failed? Tough shit, have a red message saying “something went wrong, try again”. Fan controllers randomly stuck at 100% speed on an iMac, with no way to get any information about the sensors. None of the system tools provide any information beyond the bare minimum. I’ve come to the point where I just refuse to help any family member having trouble with an Apple device, because it turns into a multi-hour wild goose chase.

    I’d argue that an overall stable experience with occasional vague issues you can’t troubleshoot at all is worse than slightly more frequent issues that actually lead to a solution. Apple’s products by design teach the user nothing about technology, because there’s no entry point to the knowledge itself. Meanwhile Windows, while flawed, does give the user an incentive to learn about proper maintenance and troubleshooting, which leads to more competent users overall.

    Last but not least, the command line: I love using package managers on Linux, and lately also on Windows with winget. It has quickly become the main way I install and manage most utilities. MacOS has options, but none of them are neatly integrated and all of them have to be installed separately.

    Even on Windows I use command line utilities where I can and GUI functionality where it makes sense. While the realm of possibilities is not as broad as what the GNU/Linux world provides, I at least feel I have a great deal of control over what I do. MacOS is an impostor that has murdered a UNIX distribution and is wearing its skin. The terminal experience feels like a remnant from the early days that they never bothered to put any more love into, but can’t get rid of either. Just like some of those Windows 3.0 components you can still find in modern versions of the OS.

    I’m 100% done with Apple and their products, because they make everything I’m trying to do slightly more difficult and annoying than the alternative. And those are just my issues with the way they do software; I also have very valid issues with the way they design hardware and the way they conduct business (ethics, monopolies, their overall effect on tech markets in general).


  • You’re getting ratio’d but you’re right. Core parts of the user experience are steaming piles of dogshit while people praise MacOS for its many gimmicks.

    • Finder is an absolute pile of shit and gets first-time users addicted to bad habits. It takes digging through hidden settings to even approach the out-of-the-box functionality that Windows Explorer offers, and it still can’t match Explorer’s full potential for file management. The integrated search is so unpredictable and fuzzy that they went and made Spotlight its own thing.
    • Window management is a nightmare if you’re actually trying to do multiple things at once without switching windows. MacOS went years without window snapping, and they still managed to make it suck when they finally added it. Not once have they considered stealing great ideas from the tiling window managers; Apple simply decided to reinvent the wheel and make it square.
    • Got multiple applications running at the same time while minimized? Lol, get fucked. The only way to know what’s actually running at a glance is the shitty little dot below the dock icon, and restoring a specific window either takes way too many clicks or requires you to know the magic keyboard shortcut for untangling your windows (another gimmick they added later in order to make the OS bearable).
    • Got any sort of issue during startup? Here, take this black screen with a single icon. Not even a slight hint as to what the actual problem might be or whether you should worry about it. MacOS might seem like a stable OS, but that is mainly because it is very well integrated with the limited set of hardware it can actually run on. If any real issues do come up, the troubleshooting experience is basically just a giant “get fucked” sign pointing to the nearest Apple store.
    • Sometimes simplicity is a good thing, but designing something to be accessible usually means severely limiting the depth an experienced user can go into. Every aspect of the OS and the tools that come with it share this overall problem: there’s just not much depth to what you can do with them. Can’t have a steep learning curve if there’s just nothing to learn.

    And I feel like none of these complaints are unreasonable. I like using the right tools for the right jobs, which is why I run Windows for heavy productivity and engineering work; desktop Linux has come a long way, but it just doesn’t (yet) have the required toolset to support engineering workflows. Programming of any kind and more complex data wrangling, on the other hand, are best done on Linux. My server needs are also best covered by Linux, as most distributions can be run without all the bloat that Windows comes with. And I am sure as shit not paying for Windows Server.

    I just can’t find a valid reason for using MacOS. It seems to combine the worst of both worlds into an OS that feels like a trial version of actually using a computer to get things done.


  • Seconded. Depending on what your goals are with transcoding, you might want to reconsider your strategy.

    Hardware encoding (with a GPU) is mostly useful for realtime transcoding applications like streaming video. There are definitely some caveats that come with the realtime performance, and you’ll find that NVENC-encoded video is almost always inferior to its slower, software-encoded equivalent.

    So let’s talk codecs: while h.265 might seem like the holy grail, it is far more computationally intensive to encode than h.264. In some cases the difference in encoding time can be as high as 3-5x. Not really worth it if all you’re gaining is a slightly smaller file size.

    Your results will vary with the media you’re encoding and with your encoder quality settings, tuning and encoding speed. As a rule of thumb: slower encoding speeds equal more efficiently compressed video (a.k.a. relatively higher quality at a lower file size).

    Handbrake is my choice of software for encoding video. It includes pretty much everything you could ever want if you’re not looking for niche codecs and exotic video formats.

    I find myself mostly using x264 because it is relatively fast and still provides awesome results. My encoding speed is always set to “slow” or “veryslow” (not much difference for my setup). I usually dial in the quality using the preview function in Handbrake, which transcodes just a short section of the video that I then use for pixel peeping and checking for any major artifacts that would ruin the content. The resulting file also provides an estimate of how large the final transcoded file will be. Once you’re happy with the quality setting, you can opt to mess with the encoder tuning; there are different presets for film, animated content and such. I usually use the film tune when transcoding live-action media.

    All this generally leaves me with pretty compact file sizes for 1080p media. And transcoding usually happens at a rate of 60-75 fps depending on the resolution. Going up from “slow” to “medium” improves fps by about 25% and increases file size by about 10%. The ideal balance is up to you.
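
    For anyone who would rather script this than click through Handbrake, roughly the same settings map directly onto an x264 encode with FFmpeg. A minimal sketch (file names are placeholders; assumes an FFmpeg build with libx264 on your PATH):

```python
import subprocess

# Quality-targeted (CRF) x264 encode: "slow" preset, film tune, audio passed through.
# CRF 20 is just a starting point: lower values mean higher quality and bigger files.
cmd = [
    "ffmpeg",
    "-i", "input.mkv",   # placeholder source file
    "-c:v", "libx264",
    "-preset", "slow",
    "-tune", "film",
    "-crf", "20",
    "-c:a", "copy",      # keep the original audio untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```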

    Advanced tips: try using VMAF (an objective video quality metric developed by Netflix) to score and compare your different encoding settings. VMAF is neatly integrated into FFMetrics, which is a GUI for FFmpeg and a couple of video analysis algorithms. I also use MPV (an open source media player) with FFmpeg filter arguments to play videos synchronized in a 2x1 or 2x2 matrix, which makes it easy to compare the results for quality side by side.
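
    If FFMetrics isn’t your thing, the same VMAF scoring can be run straight through FFmpeg, assuming your build includes libvmaf (file names below are placeholders; the first input is the encode under test, the second is the reference):

```python
import subprocess

# libvmaf prints the aggregate VMAF score (0-100, higher is better)
# at the end of the FFmpeg log.
cmd = [
    "ffmpeg",
    "-i", "encode_under_test.mkv",  # placeholder: the transcoded file
    "-i", "reference.mkv",          # placeholder: the original source
    "-lavfi", "libvmaf",
    "-f", "null", "-",              # analyse only, don't write an output file
]
subprocess.run(cmd, check=True)
```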


  • You might want to consider setting up a VPN tunnel to your own network. The main benefit is that you can access your home network as if you were connected to it locally, which makes switching between mobile data and WiFi a non-issue.

    This requires some sort of VPN server and usually a single port-forwarding rule for the protocol your VPN software of choice uses. For the simplest default configuration of OpenVPN, this means forwarding UDP port 1194 to your OpenVPN server.

    Keeping things simple, there are generally two types of VPN you can set up (there’s a rough config sketch at the end of this comment):

    • split tunnel VPN, which gives you access to your home network while internet traffic still goes out directly.
    • full tunnel VPN, which sends all of your traffic through your home router.

    It is a little more complicated than that, and there’s more nuance to it, such as whether or not to use your own DNS server, but all that is best left to some further reading.

    I’ve set up an OpenVPN server myself, which is open source and completely free to mess around with. (Save for maybe some costs for registering your own domain or a DDNS service. Those are all optional though, and mainly provide convenience and continuity benefits. You can definitely just set up a VPN server and connect with your external IP address.)
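
    To make the split/full tunnel distinction a bit more concrete, here is a rough sketch of the routing-related directives in an OpenVPN server config, written out by a small Python script so you can flip between the two flavours. All addresses and file names are made-up examples, and a real config also needs certificates, keys and so on:

```python
# Minimal sketch of the routing-related part of an OpenVPN server.conf.
# Subnets, DNS address and file name are made-up examples; certificates,
# keys and the rest of a working config are omitted here.

FULL_TUNNEL = False  # flip to True to route *all* client traffic through home

conf_lines = [
    "port 1194",
    "proto udp",
    "dev tun",
    "server 10.8.0.0 255.255.255.0",  # subnet handed out to VPN clients
]

if FULL_TUNNEL:
    # Full tunnel: send all client traffic (and optionally DNS) through the VPN.
    conf_lines += [
        'push "redirect-gateway def1 bypass-dhcp"',
        'push "dhcp-option DNS 192.168.1.1"',
    ]
else:
    # Split tunnel: clients only get a route to the home LAN;
    # everything else goes out over their normal connection.
    conf_lines.append('push "route 192.168.1.0 255.255.255.0"')

with open("server.conf.example", "w") as conf:
    conf.write("\n".join(conf_lines) + "\n")
```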


  • Wow that’s a cool setup, I’ll definitely steal some ideas.

    I’m used to slinging lots of data around, and one of the more helpful tools for general-purpose automation has been n8n, though it might be of limited use if you’re not trying to glue all kinds of services together. I also host actualbudget to keep track of finances. Both are running comfortably in their own little Docker containers.

    I’m currently looking into setting up Nextcloud and experimenting some more with presence detection for Home Assistant. I’m considering CO2 sensors, which will either tell me that my home is properly ventilated or which rooms are occupied.