• 0 Posts
  • 35 Comments
Joined 1 year ago
Cake day: July 2nd, 2023

  • You're going a step too far again, though. The average newbie would insta-panic at the thought of using the terminal. Needing a command to install drivers or to update is already too hard.

    Arch based distros like Manjaro, EndeavourOS or even SteamOS, for that matter, are great (I used Manjaro myself in the past until I settled on Fedora/Nobara), and the AUR can make acquiring software a lot easier. However, the moment something breaks, a newbie will be lost, and the Arch Wiki won't save someone who doesn't know what to look for in the first place.

    If anything, my recommendation for absolute beginners (as long as their hardware isn't state of the art and they don't primarily want to game) would be Mint. It's easy to set up, has a nifty (and graphical) driver installer, has a default DE that is close enough to Windows not to confuse someone who hasn't used anything else in their life, and it shares enough DNA with Ubuntu that most tutorials out there just work, without having shit like snap in there.


  • accideath@lemmy.world to linuxmemes@lemmy.world · Oh, come on! · 20 days ago

    Sure, but it's not a rarity that forum answers expect you to be very familiar with Linux file structures and terminal commands. If you're a beginner who runs into an issue (as beginners do), you oftentimes need to find a tutorial and then tutorials that explain the tutorial. It gets even worse if you're not on a Debian/Ubuntu based distro (although, to be fair, if you're a newbie, that's sorta asking for trouble).






  • accideath@lemmy.world to linuxmemes@lemmy.world · Shit... · 5 months ago

    Chromium was, however, a Google product from the very beginning, one that Google open-sourced themselves. Linux is too big, with too many non-profit and for-profit companies and tons of independent individuals participating in its development, for any one person or company to control it outright.

    I mean, sure, for-profit companies like Red Hat, SUSE and Canonical do have some influence, but not so much that you can't ignore their contributions if you don't like them.

    For example, some Ubuntu based distros (e.g. Mint) prevent snap from being installed the Ubuntu way (i.e. without asking) because it goes against their philosophy. And if that's still too much Ubuntu for you, there's a Debian Edition of Mint. And if that's still too same-y for you, there are dozens of other distros based on Slackware, RHEL/Fedora, Arch, Gentoo, etc. There are even Linux distros without GNU.

    So, unless Muskiboy buys Linus Torvalds, I think the Linux community could easily ignore him building his own xOS.


  • accideath@lemmy.world to linuxmemes@lemmy.world · toxic help forum · 5 months ago

    Yea, but GIMP is particularly difficult to learn. A few years ago, when I first needed something more complex than paint.net, I of course downloaded GIMP first because it's free. It was difficult to use, to say the least. But sure, I didn't have any experience with more complex image editors. However, just to see what the difference was, I also downloaded Photoshop and didn't have any trouble at all. Everything I needed to do was easily understandable and the UI was very easy to use. I hadn't used either of them before, and I haven't used GIMP since. (Also tried Krita btw; found it only mildly easier to use than GIMP, still miles behind Adobe.)

    That isn't to say that professional open source software can't be intuitive and well designed. Today I used Kdenlive for the first time because Premiere didn't support the codec+container combo I needed, and it was a very pleasant experience. A very familiar interface if you've used any video editor before. I didn't go in-depth, but it didn't immediately alienate me like GIMP did.


  • The 970 works for encoding H.264 only. My recommendation: if you have a 7th gen Intel CPU with an iGPU or later, use that. Otherwise, sell the 970 and get one of these (in this order):

    • Intel Arc A310
    • GTX 1650
    • GTX 1630
    • Quadro P1000
    • Quadro T400
    • GTX 1050 Ti
    • GTX 1050
    • Quadro P620
    • Quadro P600
    • Quadro P400

    The Intel card has the best encoder, followed by Nvidia Turing, then Pascal. I recently sold my 970 and got a 1050 Ti for the same price. Works great with Jellyfin. If you need to tone map HDR, you probably shouldn't get anything with much lower performance than that. If it's just some UHD to HD or H.265 to H.264 for compatibility, even the P400 will work well.
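
    For reference, this is roughly the kind of job the encoder ends up doing. A minimal Python sketch calling ffmpeg directly (not Jellyfin's actual command line; filenames, bitrate and preset are made-up assumptions, and it assumes an ffmpeg build with NVENC support):

        # Sketch: downscale a UHD H.265 file to 1080p H.264 on an NVENC-capable card.
        # Filenames and settings are placeholders, not Jellyfin's real parameters.
        import subprocess

        subprocess.run([
            "ffmpeg",
            "-hwaccel", "cuda",             # decode on the GPU where possible
            "-i", "input_uhd_hevc.mkv",     # hypothetical UHD H.265 source
            "-vf", "scale=1920:1080",       # UHD -> HD for less capable clients
            "-c:v", "h264_nvenc",           # H.264 encode on the NVENC block
            "-preset", "p5",                # NVENC quality/speed presets p1-p7
            "-b:v", "8M",                   # keep the bitrate below the upload limit
            "-c:a", "aac", "-b:a", "192k",  # re-encode audio for compatibility
            "output_1080p_h264.mp4",
        ], check=True)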


  • A few reasons.

    For one, storing multiple versions of the same film takes up a lot of storage, which is more expensive than a cheap 40€ GPU for transcoding. And I definitely wanna keep the highest quality I can. Besides, transcoding on the fly is more flexible, ensuring the best possible quality at any time, instead of having to pick between the good version and the shit one.

    And secondly, I usually only need transcoding when I'm not watching on my home setup (or when some friends watch from my server). My upload isn't as high as some of my films' bitrates, and some clients don't support H.265 or HDR, thus needing transcoding and/or tone mapping.
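
    Rough numbers for the storage point, as a small Python back-of-the-envelope (all figures are illustrative assumptions, not measurements):

        films = 300                # hypothetical library size
        extra_1080p_gb = 10        # additional pre-transcoded 1080p copy per film
        hdd_eur_per_tb = 25        # ballpark bulk HDD price
        gpu_eur = 40               # used low-end transcoding GPU, as above

        extra_tb = films * extra_1080p_gb / 1000
        print(f"Extra storage: {extra_tb:.1f} TB ≈ {extra_tb * hdd_eur_per_tb:.0f}€ "
              f"vs. a {gpu_eur}€ GPU that transcodes on the fly.")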



  • I mean, calling them Mac(Book) does clarify that they run macOS. And historically, “Mac and PC” have been used to differentiate between Windows and macOS, not just by Mac users. I've never met anyone who insisted that MacBooks aren't laptops. People just call them MacBooks because that's what they are…



  • In short: because HDR needs additional metadata to work. You can watch HDR content on SDR screens, but it's horribly washed out; it looks a bit like log footage. The HDR metadata tells the screen how bright/dark the image actually needs to be. The software issue is correct support for said metadata.

    I'd speculate (I'm not an expert) that the reason for this is that it enables more granularity. Even the 1024 steps of brightness that 10-bit colour can produce are nothing compared to the millions-to-one contrast of modern LCDs or even the near-infinite contrast of OLED. Besides, screens come in a range of peak brightnesses. I suppose doing it this way lets each manufacturer interpret the metadata in a way that flatters their own screens.

    And also, with your solution, a brightness value of 1023 would always be the max brightness of the TV. You don't always want that if your TV can literally flashbang you. Sure, you want the sun to be peak brightness, but not every white object is as bright as the sun… That's the true beauty of a good HDR experience: it looks fairly normal, but reflections of the sun or a fire in a dark room just hit differently when the rest of the scene stays much darker yet is still clearly visible.
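
    To make the "1023 isn't just the TV's max" point concrete: in the PQ transfer function (SMPTE ST 2084) used by HDR10, each 10-bit code value maps to an absolute luminance, not to a fraction of the panel's peak. A small Python sketch (constants are from the spec; the sample code values are just illustrative):

        # PQ (SMPTE ST 2084) EOTF: 10-bit code value -> absolute luminance in nits.
        m1 = 2610 / 16384
        m2 = 2523 / 4096 * 128
        c1 = 3424 / 4096
        c2 = 2413 / 4096 * 32
        c3 = 2392 / 4096 * 32

        def pq_to_nits(code: int) -> float:
            e = code / 1023.0                      # normalised signal, 0..1
            ep = e ** (1.0 / m2)
            y = max(ep - c1, 0.0) / (c2 - c3 * ep)
            return 10000.0 * y ** (1.0 / m1)

        for code in (0, 520, 769, 1023):
            print(f"code {code:4d} -> {pq_to_nits(code):7.1f} nits")
        # ~520 lands around 100 nits (roughly SDR reference white), 769 around
        # 1000 nits, and 1023 is the format's 10,000 nit ceiling; the display's
        # tone mapping decides how values above its own peak get rendered.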


  • HDR or High Dynamic Range is a way for images/videos/games to take advantage of the increased colour space, brightness and contrast of modern displays. That is, if your medium, your player device/software and your display are HDR capable.

    HDR content is usually mastered with a peak brightness of 1000 nits or more in mind, while Standard Dynamic Range (SDR) content is mastered for 80-100 nit screens.



  • People who aren't “sensitive” to 3D movies either lack stereoscopic vision (typical, for example, for people who were cross-eyed from birth, even if they had corrective surgery), or they can see the effect and simply don't care for it or don't like it.

    If you're not “sensitive” to 4K, that would suggest you're not capable of perceiving fine details and thus don't have 20/20 vision. Given, of course, that you were looking at 4K content on a 4K screen at a size and distance where the human eye should generally be capable of distinguishing those details.


  • HDR is not just marketing. And, unlike what other commenters have said, it's not (primarily) about greater colour bit depth. That's only a small part of it. The primary appeal is the following:

    Old CRTs and early LCDs had very little brightness and very low contrast, and video mastering and colour spaces reflected that. Most non-HDR (“SDR”) films and series are mastered with a monitor brightness of 80-100 nits in mind (depending on the exact colour space), so the difference between the darkest and the brightest part of the image can also only be about 100 nits. That's not a lot. Even the cheapest new TVs and monitors exceed that by more than double. And of course, you can make the image brighter and artificially increase the contrast, but that's the same as upscaling DVDs to HD or HD to 4K.

    What HDR set out to do was provide a specification for video mastering that takes advantage of modern display technology. Modern LCDs can get as bright as multiple thousands of nits, and OLEDs have near-infinite contrast ratios. HDR provides a mastering process with usually 1000-2000 nits peak brightness (depending on the standard), and thus also far higher contrast (the darkest and brightest parts of the image can be over 1000 nits apart).

    Of course, to truly experience HDR, you'll need a screen that can take advantage of it: OLED TVs, bright LCDs with local dimming zones (to increase the possible contrast), etc. It is possible to profit from HDR even on screens that aren't as good (my TV is an LCD without local dimming and only 350 nits peak brightness, and it does make a noticeable, though not huge, difference), but for “real” HDR you'd need something better. My monitor, for example, is VESA DisplayHDR 600 certified, meaning it has a peak brightness of 600 nits plus a number of local dimming zones, and the difference in supported games is night and day. And that's still not even near the peak of HDR capabilities.
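
    To put rough numbers on that contrast point, a small Python sketch (all values are illustrative assumptions, not measurements of any particular display or master):

        import math

        def stops(brightest_nits: float, darkest_nits: float) -> float:
            # Dynamic range between two luminance levels, in photographic stops.
            return math.log2(brightest_nits / darkest_nits)

        # SDR-style master: ~100 nit reference white, modest LCD black level
        print(f"SDR-ish: {stops(100, 0.1):.1f} stops")
        # HDR master shown on an OLED: ~1000 nit highlights, very deep blacks
        print(f"HDR-ish: {stops(1000, 0.005):.1f} stops")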

    tl;dr: HDR isn't just marketing or higher colour depth. HDR video is mastered to the increased capabilities of modern displays, while SDR (“normal”) content is not.

    It’s more akin to the difference between DVD and BluRay. The difference is huge, as long as you have a screen that can take advantage.


  • “aren't sensitive to 4K video”

    So you’re saying you need glasses?

    But yes, it does make a difference how much of your field of view is covered. If it's a small screen and you're relatively far away, 4K isn't doing anything. And of course, you need a 4K capable screen in the first place, which is still not a given for PC monitors, precisely due to their size. For a 21" desktop monitor, it's simply not necessary. Although I'd argue that less than 4K on a 32" screen that's about an arm's length away from you (like on a desktop) is noticeably low res.
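
    The usual rule of thumb is roughly 60 pixels per degree for 20/20 vision (one pixel per arc-minute). A quick Python sketch for a 32" 16:9 screen at about arm's length (the viewing distance and the ~60 px/deg threshold are assumptions/rules of thumb):

        import math

        def pixels_per_degree(h_res: int, diag_in: float, dist_cm: float, aspect: float = 16 / 9) -> float:
            # Average horizontal pixels per degree of visual angle at a given distance.
            width_cm = diag_in * 2.54 * aspect / math.hypot(aspect, 1)
            fov_deg = 2 * math.degrees(math.atan(width_cm / (2 * dist_cm)))
            return h_res / fov_deg

        # 32" 16:9 monitor viewed from ~60 cm
        for h_res, label in ((1920, "1080p"), (2560, "1440p"), (3840, "4K")):
            print(f"{label}: {pixels_per_degree(h_res, 32, 60):.0f} px/deg")
        # Only the 4K panel gets near ~60 px/deg at that distance, which is why
        # anything less looks noticeably low-res on a 32" desktop monitor.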