• 0 Posts
  • 30 Comments
Joined 1 year ago
Cake day: July 2nd, 2023


  • accideath@lemmy.world to linuxmemes@lemmy.world · "Shit..." · 1 month ago

    Chromium, however, was a Google product from the very beginning, one that Google open-sourced themselves. Linux is too big, with too many non-profit and for-profit companies and tons of independent individuals participating in its development, for any one person or company to control it outright.

    I mean, sure, for-profit companies like Red Hat, SUSE, and Canonical do have some influence, but not so much that you can't ignore their contributions if you don't like them.

    For example, some Ubuntu-based distros (e.g. Mint) prevent snap from being installed the Ubuntu way (i.e. without asking) because it goes against their philosophy. And if that's still too much Ubuntu for you, there's a Debian Edition of Mint. And if that's still too same-y for you, there are dozens of other distros based on Slackware, RHEL/Fedora, Arch, Gentoo, etc. There are even Linux distros without GNU.

    So, unless Muskiboy buys Linus Torvalds, I think the Linux community could easily ignore him building his own xOS.


  • accideath@lemmy.world to linuxmemes@lemmy.world · "toxic help forum" · 1 month ago

    Yeah, but GIMP is particularly difficult to learn. A few years ago, when I first needed something more complex than paint.net, I of course first downloaded GIMP because it's free. It was difficult to use, to say the least. But sure, I didn't have any experience with more complex image editors. However, just to see what the difference is, I also downloaded Photoshop and didn't have any trouble at all. Everything I needed to do was easily understandable and the UI was very easy to use. I hadn't used either of them before, and I haven't used GIMP since. (Also tried Krita, btw; I found it only mildly easier to use than GIMP, still miles behind Adobe.)

    That isn't to say that professional open source software can't be intuitive and well designed. Today I used Kdenlive for the first time because Premiere didn't support the codec+container combo I needed, and it was a very pleasant experience. A very familiar interface, if you've used any video editor before. I didn't go in-depth, but it didn't immediately alienate me like GIMP did.


  • The 970 works for encoding h.264 only. My recommendation: if you have a 7th gen or later Intel CPU with an iGPU, use that. Otherwise, sell the 970 and get one of these (in this order):

    • Intel Arc A310
    • GTX 1650
    • GTX 1630
    • Quadro P1000
    • Quadro T400
    • GTX 1050 Ti
    • GTX 1050
    • Quadro P620
    • Quadro P600
    • Quadro P400

    The Intel card has the best encoder, followed by Nvidia Turing, then Pascal. I recently sold my 970 and got a 1050 Ti for the same price. Works great with Jellyfin. If you need to tone map HDR, you probably shouldn't get anything with much lower performance than that. If it's just some UHD to HD or h.265 to h.264 for compatibility, even the P400 will work well.
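    To make the hardware-encoding point concrete, here's a minimal sketch (my own illustration, not from the original comment) of what a Jellyfin-style transcode boils down to: handing h.264 encoding off to the GPU via ffmpeg. File names and bitrate are placeholders, and it assumes an ffmpeg build with NVENC/QSV support.

        # Minimal sketch: GPU-accelerated h.264 transcode via ffmpeg,
        # roughly what Jellyfin does under the hood. Assumes ffmpeg with
        # NVENC (Nvidia) or QSV (Intel iGPU/Arc) support; names and
        # bitrate are placeholders.
        import subprocess

        def transcode_h264(src: str, dst: str, intel: bool = False) -> None:
            encoder = "h264_qsv" if intel else "h264_nvenc"
            subprocess.run([
                "ffmpeg", "-i", src,
                "-c:v", encoder,   # GPU encoder instead of CPU (libx264)
                "-b:v", "8M",      # target bitrate, match it to your upload
                "-c:a", "copy",    # leave the audio untouched
                dst,
            ], check=True)

        transcode_h264("movie_uhd.mkv", "movie_h264.mkv")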


  • A few reasons.

    For one, storing multiple versions of the same film takes up a lot of storage, which is more expensive than a cheap 40€ GPU for transcoding. And I definitely wanna keep the highest quality I can. Besides, transcoding on the fly is more flexible, ensuring the best possible quality at any time, instead of having to pick between the good and the shit version.

    And secondly, I usually only need transcoding when I don't watch on my home setup (or when some friends watch on my server). My upload isn't as high as some of my films' bitrates, and some clients do not support h.265 or HDR, thus needing transcoding and/or tone mapping.
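    Since tone mapping came up: a rough sketch (my addition, not the commenter's actual setup) of the HDR-to-SDR tone-mapping step using ffmpeg's zscale/tonemap filters, for clients that support neither HDR nor h.265. It assumes an ffmpeg build with libzimg; file names are placeholders.

        # Sketch: tone-map 10-bit HDR (PQ/BT.2020) down to 8-bit SDR BT.709
        # h.264. Requires ffmpeg built with the zscale (libzimg) filter.
        import subprocess

        VF = (
            "zscale=t=linear:npl=100,"        # linearise the PQ transfer
            "format=gbrpf32le,"               # float RGB for the tonemap filter
            "zscale=p=bt709,"                 # BT.2020 primaries -> BT.709
            "tonemap=tonemap=hable:desat=0,"  # compress HDR highlights into SDR
            "zscale=t=bt709:m=bt709:r=tv,"    # BT.709 transfer/matrix, TV range
            "format=yuv420p"                  # 8-bit 4:2:0 for compatibility
        )

        subprocess.run([
            "ffmpeg", "-i", "movie_hdr.mkv",
            "-vf", VF,
            "-c:v", "libx264", "-crf", "20",
            "-c:a", "copy",
            "movie_sdr.mkv",
        ], check=True)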



  • I mean, calling them Mac(Book) does clarify that they run macOS. And historically, "Mac and PC" have been used to differentiate between Windows and macOS, not just by Mac users. I've never met anyone who insisted that MacBooks aren't laptops. People just call them MacBooks because that's what they are…



  • In short: because HDR needs additional metadata to work. You can watch HDR content on an SDR screen, but it's horribly washed out; it looks a bit like log footage. The HDR metadata tells the screen how bright/dark the image actually needs to be. The software issue is correct support for said metadata.

    I'd speculate (I'm not an expert) that the reason for this is that it allows more granularity. Even the 1024 steps of brightness that 10-bit colour can produce are nothing compared to the millions-to-one contrast of modern LCDs or the near-infinite contrast of OLED. Besides, screens come in a range of peak brightnesses. I suppose doing it this way lets manufacturers interpret the metadata in the way that looks best on their screens.

    And also, with your solution, a brightness value of 1023 would always be the max brightness of the TV. You don't always want that if your TV can literally flashbang you. Sure, you want the sun to be peak brightness, but not every white object is as bright as the sun… That's the true beauty of a good HDR experience: it looks fairly normal, but reflections of the sun or a fire in a dark room just hit differently when the rest of the scene stays much darker yet is still clearly visible.
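    To make the "1024 steps vs. absolute brightness" point concrete, here's a small sketch (my own illustration, not part of the original comment) of the PQ transfer function (SMPTE ST 2084) that HDR10 uses: each 10-bit code value encodes an absolute luminance in nits, up to 10,000, and the HDR metadata then tells a less capable display how to tone-map values it can't physically reach.

        # Sketch: the PQ EOTF (SMPTE ST 2084) used by HDR10. Constants are
        # from the standard; the function maps a 10-bit code value to an
        # absolute luminance in cd/m² (nits).
        m1 = 2610 / 16384
        m2 = 2523 / 4096 * 128
        c1 = 3424 / 4096
        c2 = 2413 / 4096 * 32
        c3 = 2392 / 4096 * 32

        def pq_code_to_nits(code: int, bit_depth: int = 10) -> float:
            e = code / (2 ** bit_depth - 1)   # normalised signal, 0..1
            p = e ** (1 / m2)
            y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
            return 10000.0 * y                # PQ tops out at 10,000 nits

        # code 1023 -> 10000 nits, code 769 -> ~1000 nits, code 512 -> ~93 nits
        for code in (512, 769, 1023):
            print(code, round(pq_code_to_nits(code), 1))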


  • HDR or High Dynamic Range is a way for images/videos/games to take advantage of the increased colour space, brightness and contrast of modern displays. That is, if your medium, your player device/software and your display are HDR capable.

    HDR content is usually mastered with a peak brightness of 1000 nits or more in mind, while Standard Dynamic Range (SDR) content is mastered for 80-100 nit screens.



  • People who aren't "sensitive" to 3D movies either lack stereoscopic vision (typical, for example, for people who were cross-eyed from birth, even if they had corrective surgery), or they can see the effect and simply don't care for it.

    If you're not "sensitive" to 4K, that would suggest you're not capable of perceiving fine details and thus don't have 20/20 vision. Given, of course, that you were looking at 4K content on a 4K screen at a size and distance where the human eye should generally be capable of distinguishing those details.


  • HDR is not just marketing. And, unlike what other commenters have said, it's not (primarily) about higher colour bit depth. That's only a small part of it. The primary appeal is the following:

    Old CRTs and early LCDs had very little brightness and very low contrast, and video mastering and colour spaces reflected that. Most non-HDR ("SDR") films and series are mastered with a monitor brightness of 80-100 nits in mind (depending on the exact colour space), so the difference between the darkest and the brightest part of the image can also only be about 100 nits. That's not a lot. Even the cheapest new TVs and monitors exceed that by more than double. And of course, you can make the image brighter and artificially increase the contrast, but that's akin to upscaling DVDs to HD or HD to 4K: you can't add detail that was never mastered in.

    What HDR set out to do was provide a specification for video mastering that takes advantage of modern display technology. Modern LCDs can get as bright as multiple thousands of nits, and OLEDs have near-infinite contrast ratios. HDR provides a mastering process with usually 1000-2000 nits peak brightness (depending on the standard), and thus also higher contrast (the darkest and brightest parts of the image can be over 1000 nits apart).

    Of course, to truly experience HDR, you'll need a screen that can take advantage of it: OLED TVs, bright LCDs with local dimming zones (to increase the possible contrast), etc. It is possible to benefit from HDR even on screens that aren't as good (my TV is an LCD without local dimming and only 350 nit peak brightness, and it does make a noticeable difference, although not a huge one), but for "real" HDR you'd need something better. My monitor, for example, is VESA DisplayHDR 600 certified, meaning it has a peak brightness of 600 nits plus a number of local dimming zones, and the difference in supported games is night and day. And that's still nowhere near the peak of HDR capabilities.

    tl;dr: HDR isn't just marketing or higher colour depth. HDR video is mastered to the increased capabilities of modern displays, while SDR ("normal") content is not.

    It's more akin to the difference between DVD and Blu-ray. The difference is huge, as long as you have a screen that can take advantage of it.


  • aren‘t sensitive to 4K video

    So you’re saying you need glasses?

    But yes, it does make a difference how much of your field of view is covered. If it's a small screen and you're relatively far away, 4K isn't doing anything. And of course, you need a 4K-capable screen in the first place, which is still not a given for PC monitors, precisely due to their size. For a 21" desktop monitor, it's simply not necessary. Although I'd argue that less than 4K on a 32" screen that's about an arm's length away from you (like on a desktop) is noticeably low-res.
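    A rough back-of-the-envelope check of that size/distance argument (the numbers are my own illustration, not the commenter's): 20/20 vision resolves about one arcminute, i.e. roughly 60 pixels per degree, so anything below that is visibly soft.

        # Sketch: pixels per degree of visual angle for a 16:9 screen,
        # taking ~60 ppd as the 20/20 acuity threshold. Sizes and distances
        # below are illustrative.
        import math

        def pixels_per_degree(diag_in: float, horiz_px: int, vert_px: int,
                              distance_in: float) -> float:
            aspect = horiz_px / vert_px
            width_in = diag_in * aspect / math.hypot(aspect, 1)  # physical width
            total_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
            return horiz_px / total_deg

        # 32" 4K at ~28" (arm's length):  ~72 ppd -> above the ~60 ppd limit
        # 32" 1440p at the same distance: ~48 ppd -> below it, noticeably soft
        print(round(pixels_per_degree(32, 3840, 2160, 28)))
        print(round(pixels_per_degree(32, 2560, 1440, 28)))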




  • accideath@lemmy.world to linuxmemes@lemmy.world · "Steam on Linux" · 3 months ago

    The trend still goes towards HDR, since it's not just an effect in games. Nearly any modern TV can decode HDR metadata, and most streaming services support HDR. Of course, entry-level TVs and monitors cannot take advantage of HDR as much, but as better TVs get cheaper, that'll spread even more. My TV isn't particularly amazing, but the difference between HDR content and SDR content is clear. If I have a choice, I never watch the SDR version.

    HDR isn’t just an effect like bloom. It’s a way of using the capabilities of modern TVs in a way SDR can’t. HDR is made for taking advantage of OLED, quantum dots, high contrast, local dimming, higher colour gamuts and/or the high brightness consumer screens reach nowadays.

    So if you wanna bet, I'd personally bet on HDR being the standard in 10 years, because screen tech usually only gets better and HDR is the software/firmware implementation that takes advantage of those hardware improvements.


  • I figured if utilities like Rectangle / Magnet / Middle were fair game, then PT certainly were :P

    Fair enough xD. And thank god there are people adding missing functionality to both OSes.

    I don’t know about opening windows on new virtual desktops, but you can easily switch virtual desktops with 3/4 finger swipes on a trackpad.

    Fair, but I find virtual desktops rather useless without an easy way to put windows there. In macOS it's one click for fullscreen or a click and a drag for non-fullscreen (which I rarely use; usually it's fullscreen).

    I think the biggest think MacOS suffers with is supporting not just one large screen like you have, but having multiple monitors.

    Possibly. It's The Apple Way, I suppose: MacBooks and iMacs. And if you have multiple displays, they'd better all be the same Apple Studio Displays or Pro Displays. In all fairness, I did notice that when I have multiple monitors connected, I tend to primarily use one, and the second just to dump a single program on that I don't need to touch, like Discord, a website, a document or a reference image. But I'm a one-big-monitor guy, through and through, anyway.


  • pretty much all of these have been addressed through Powertoys

    Fair. Powertoys are awesome but since they’re not part of the OS, I never much got into them. Might have alleviated some of my issues…

    I would argue that you go fullscreen all the time because the titlebar does bother you.

    Partially, maybe. But primarily because it also gets rid of the window top bar with the close buttons and title, and that's something Windows just can't do. Fullscreen on macOS hides it until needed. I would absolutely fullscreen programs on Windows, if I could, in the way macOS does (fullscreen to a new virtual desktop).

    It’s a little flabbergasting to me that someone with any desire for tinkering would go to MacOS, but building and continuing to run a hackintosh certainly does sound like a challenge, fair enough.

    Well, let's just say I came for the challenge and stayed for the peace of mind. When I started hackintoshing, I was still at school, but by now I don't really wanna think about stuff like that on my main computer. For example, on Windows I used to keep my libraries on a secondary HDD, because I didn't trust Windows not to fuck itself up to the point where I'd need to reinstall it, and that saved my butt a number of times. The yearly Windows nuking was almost ritualistic. I don't even think about that on macOS.

    I’ve spent far more time trying to get Mac stuff working with third party accessories than I have had to deal with Windows stability issues, and I appreciate that my Windows machine can still work flawlessly with perfectly good PC accessories I bought decades ago, whereas Apple tries to break compatibility as often as possible

    Well, I don't have many accessories and such. And even macOS doesn't break compatibility with storage devices, mice and keyboards. I have benefited from Windows' backwards compatibility in the form of 20-year-old driver software for a film scanner; however, there is a modern third-party program for macOS which also does the trick. Also, one point for macOS on compatibility would be printers. I've never had to track down driver software for printers, as long as they're new enough to support USB or network printing. Plug it in, macOS finds it and you just print. On Windows, finding old drivers is often nearly impossible…

    To each their own, but having to constantly work full screen and lose context for everything else I’m doing is not a workflow I see myself getting used to anytime soon.

    I mean, you can splitscreen two programs in fullscreen, and switching between multiple fullscreen apps is as easy as a swipe on the trackpad, a mouse gesture on my MX Master, or cmd+arrows. Imo that's much more efficient and quick on small screens. And on large screens like my 32" monitor, the space you lose to the menu bar is so negligible that it doesn't matter. And then the also quickly activated Mission Control is great for switching between open windows. On the other hand, I found Windows' implementation of virtual desktops to be almost useless, and I find it much more distracting to deal with a lot of open windows on a laptop-sized display on Windows, because you cannot easily put them on a secondary desktop and swipe from one to the other.


  • I use a lot of software, like the Adobe suite, office apps, scanning, managing and editing photos, or managing my (Linux-based) server, etc.

    macOS's vastly superior search is worth a lot to me. So are built-in features like bulk renaming files, merging PDFs and a fast built-in PDF viewer that isn't a browser. I'm also a huge fan of the Stacks feature for keeping my desktop tidy, something I never managed to do on Windows. Also, Quick Look is very neat, especially if you work with lots of images, videos or documents.

    I also prefer Apple's office suite to Microsoft Office by a long shot. (Speaking of vertical space: MS Office takes up a lot with the ribbon bar.) It's not as feature-rich, but it has everything I could need and is much cleaner and less bloated.

    And for managing open programs: I just don't. If I want a program to be closed, I close it manually (cmd+Q); otherwise I just leave it open (the program, not the program window), especially for programs I use a lot. macOS is very good at RAM management in my experience, and having a new window open immediately is great.

    Also, I prefer my programs running in fullscreen, especially on my MacBook, and macOS's easy gesture-based navigation for switching between virtual desktops and launching Mission Control to switch between apps quickly and fluidly is great. It even works with my Logitech MX Master mouse. And with third-party apps like Magnet, there's even window snapping.

    The titlebar doesn't bother me. If I need the space, I just go fullscreen. Mostly I don't, because I have a rather large monitor, and even on my MacBook screen it's not that huge. It's like a third of the thickness of the Windows taskbar, and programs don't need their own menu bar because it's global.

    And I've actually used Windows a lot, even recently. I've just this month finally replaced it with Linux on my gaming PC. I miss the Windows 7 era because it was still bloat-free, coherent, and a desktop OS through and through. Windows 8 was a joke, and Windows 10 (which I downgraded to again after 11 turned out even buggier) is fine in comparison, but it's full of ads, Microsoft's overemphasis on telemetry, and pushed features and programs that no one (or at least not I) wants: Cortana, Copilot, Bing, Edge, OneDrive, Office 365, that weird weather/news widget in the taskbar, etc. They don't even allow you to use a local account anymore unless you use a workaround in the terminal. Dealing with audio devices is a mess, doubly so if you throw Bluetooth into the mix. I've already mentioned the keyboard input problem in 11. For some reason, games tend to minimize from time to time a few minutes after launching, … The list goes on. Even small things get on my nerves, like Windows not being able to correctly use MB and MiB, which were defined by the IEC decades ago and are used correctly by every other noteworthy OS.
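    For the MB/MiB point, a quick illustration (my numbers, not from the comment above): drives are sold in decimal SI units, but Windows divides by 1024³ and still labels the result "GB", hence the classic "my 500 GB drive only shows 465 GB".

        # Sketch: SI (decimal) vs IEC (binary) units for a "500 GB" drive.
        drive_bytes = 500 * 10**9      # 500 GB as sold (SI gigabytes)
        gb = drive_bytes / 10**9       # 500.0 GB    (decimal, SI)
        gib = drive_bytes / 2**30      # ~465.66 GiB (binary, IEC)
        print(f"{gb:.0f} GB == {gib:.2f} GiB")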

    macOS and Windows have wildly different philosophies, and if you're used to one, it's not that easy to switch. But Windows' shortcomings, combined with my own curiosity and knack for tinkering with PCs, drove me towards macOS years ago, when I started using it on a hackintosh that, in my memory, ran more stable and faster than Windows on the same machine… it was just a bit too much hassle to keep updated.