• humanspiral@lemmy.ca · ↑7 · 2 hours ago

    I’m skeptical of the author’s credibility and vision of the future if he hasn’t even reached blink-tag technology in his progress.

  • Maxxie@lemmy.blahaj.zone · ↑29 ↓1 · edited · 3 hours ago

    (let me preach a little, I have to listen to my boss gushing about AI every meeting)

    Compare AI tools: now vs 3 years ago. All those 2022 “Prompt engineer” courses are totally useless in 2025.

    Extrapolate into the future and realize that you’re not losing anything valuable by not learning AI tools today. The whole point of them is that they don’t require any proficiency. It “just works”.

    Instead, focus on what makes you a good developer: understanding how things work, knowing which solution fits which problem, centering your divs.
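
    For instance, the canonical task (a minimal sketch; flexbox is only one of several ways, and the class name is a placeholder):

    ```css
    /* Center a child both horizontally and vertically inside .parent */
    .parent {
      display: flex;
      justify-content: center; /* horizontal */
      align-items: center;     /* vertical */
      min-height: 100vh;       /* the container needs height for vertical centering to show */
    }
    ```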

    • Rusty@lemmy.ca · ↑15 · 5 hours ago

      I don’t think it was supposed to replace everyone in IT, but every company used to have system administrators or IT administrators who worked with physical servers, and now they’re gone. You can say that the new SREs are their replacement, but it’s a different set of skills, more similar to SDE than to system administration.

      • MinFapper@startrek.website · ↑2 · 2 hours ago

        And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code or something.

    • Elrecoal19@lemmy.world · ↑5 ↓1 · 5 hours ago

      Yeah, AI is going to put some people out of work, but in turn it will open up lots of more specialized positions. And the people whose positions are lost could adapt to AI (for example, by working on training it instead of just being let go).

      • Jankatarch@lemmy.world · ↑4 · edited · 4 hours ago

        There is still a difference.

        Cloud was FOR the IT people. Machine learning is for predicting patterns from data.

        Maybe stock predictors will adapt or be replaced, but the average programmer didn’t have to switch to Replit just because it’s a “cloud IDE”.

      • Ferk@programming.dev · ↑4 · edited · 4 hours ago

        I mean, isn’t that what “get on or get left behind” means?

        It does not necessarily mean you’ll lose your job. Nor does “get on” mean you have to become a specialist in it.

        The post specifically picks on things that didn’t catch on (or that caught on for a period of time but were eventually superseded), but does not apply the same test to other, successful technologies.

        • Elrecoal19@lemmy.world · ↑2 · edited · 4 hours ago

          Yeah, I realized it suffers from (inverse) survivorship bias, only pointing out the ones that didn’t survive.

          Didn’t one company claim something like “the internet is a fad” or “touchscreen phones are a fad” and then go bankrupt/become irrelevant because they didn’t adapt?

          • thanks AV@lemmy.world · ↑2 · 2 hours ago

            “touchscreen phones are a fad”

            BlackBerry? I was like 10 at the time, so this is based on my memory of who had what phone, but that seems like the right guess.

  • superkret@feddit.org · ↑31 · 6 hours ago

    “This technology solves every development problem we have had. I can teach you how with my $5000 course.”

    Yes, I would like to book the $5000 Silverlight course, please.

    • not_woody_shaw@lemmy.world · ↑1 · 9 minutes ago

      I use it to discuss the pros and cons of potential refactorings, then laugh as it botches the implementation.

    • sidelove@lemmy.world · ↑16 · 5 hours ago

      Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don’t feel like looking up. Any significant generation tends to go off the rails fast.

      • Omgpwnies@lemmy.world · ↑2 · 2 hours ago

        I’ve been using it to write unit tests. I still need to edit them to mock out some things and change a bit of logic here and there, but it saves me probably 50–75% of the time it used to take, just from not having to hand-write all that code.
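
        For context, the kind of test I end up with after that hand-editing looks roughly like this (a sketch in Jest/TypeScript; `getUserName`, `./user`, `./api`, and `fetchUser` are hypothetical names, not real project code):

        ```ts
        import { getUserName } from "./user";
        import * as api from "./api";

        // Auto-mock the api module; the mocking is the part I usually still write by hand.
        jest.mock("./api");

        test("getUserName falls back to 'guest' when the API call fails", async () => {
          (api.fetchUser as jest.Mock).mockRejectedValue(new Error("network down"));
          await expect(getUserName(42)).resolves.toBe("guest");
        });
        ```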

      • T156@lemmy.world · ↑6 · 5 hours ago

        Getting it to format documentation for you seems to work a treat. Nothing too complex, just “move this bit here, split that into points”.

      • andioop@programming.dev · ↑9 · edited · 5 hours ago

        I do wonder about inventions that actually changed the world or the way people do things, and whether there is a noticeable pattern that distinguishes them from inventions that came and went and were lost to history, or that got adopted but never reached mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.

        • Lightfire228@pawb.social · ↑7 · 5 hours ago

          “Quality work will always need human craftsmanship”

          I’d wager that most revolutionary technologies are either those that expand human knowledge and understanding or (to a lesser extent) those that increase replicability (like assembly lines).

          • Transtronaut@lemmy.blahaj.zone · ↑4 · 4 hours ago

            It’s tricky, because there’s no hard definition for what it means to “change the world”, either. To me, it brings to mind technologies like the Internet, the telephone, aviation, or the steam engine. In those cases, it seems like the common thread is to enable us to do something that simply wasn’t possible before, and is also reliably useful.

            To me, AI fails on both those points. It doesn’t really enable us to do anything new. We already had chat bots, we already had Photoshop, we already had search algorithms and autocomplete. It can do some of those things a lot more quickly than older technologies, but until the hallucination problems are solved it doesn’t seem reliable enough to be consistently useful.

            These things make it come off more as a potential incremental improvement that is still in its infancy than as something truly revolutionary.

            • zqwzzle@lemmy.ca · ↑3 · 4 hours ago

              Well, it’ll change the world by consuming a shit ton of electricity and using even more precious water to fill the data centres. So “changing the world” is correct in that regard.

        • Elrecoal19@lemmy.world · ↑6 · 5 hours ago

          I think AI will definitely have an impact on how shit is done, but probably not in the way AI bros think. It might not revolutionize the world, but it may become a standard.

          I don’t know enough about AI or about the entire IT world, so I can’t 100% affirm or deny anything, though.

    • Kissaki@programming.dev · ↑18 · edited · 8 hours ago

      I’d love to read a list of those instances/claims/tech

      I imagine one of them was low-code/no-code?

      /edit: I see such a list is what the posted link is about.

      I’m surprised low-code/no-code isn’t in that list.

      • jubilationtcornpone@sh.itjust.works · ↑9 · 6 hours ago

        “We’re gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!”

        Several months later…

        “Well that was a complete waste of time.”

    • 3abas@lemm.ee · ↑13 ↓9 · 8 hours ago

      None of those examples is relevant.

      Those examples are specific tools or specific implementation patterns; AI in development is a tool.

      It doesn’t dictate how to write software or what the written code will look like; it’s a tool that speeds up your code writing. It catches typos and silly bugs that would take hours to debug, it can generate useful unit tests, it cleans up and applies my code style way better than CodeMaid or ReSharper ever could, and it’s taken care of so much tedious shit and made software development fun again.

      Vibe coding is not the future of development. But if you aren’t learning to use AI as a tool in development, you are going to be left behind.

      It’s more apt to compare it to IDEs. Sure, you can still write your entire app in vim and compile it in the terminal, but you would have been very foolish to deny that the future of development was in IDEs.

      • qqq@lemmy.world · ↑2 · 2 hours ago

        Pretty much everyone I work with uses vim, emacs, Sublime, or VS Code. I like IDEs and use them for… well, Java, but I wouldn’t argue that they’ve made the other tools obsolete or that you’re a fool for sticking with the old ones. If it ain’t broke and all that. It actually seems like more people are moving back to pluggable text editors over IDEs.

        I’ve used AI tools a bit. They’ve really helped me drop in code that would previously just be a bunch of TODOs; they get you up and writing the core parts much faster, so you can see if the idea even works. They’ve also really helped answer specific questions, or at least led me towards the answer. They’ve also straight-up lied to me quite a bit. It’s a weird tool.

        I think the OP image is pretty wrong with the comparison it makes. LLMs/AI are a class of technology that are most definitely not going anywhere unless something dramatic happens. Some people, myself included, feel uneasy about the way they’re created and the fact that people in powerful positions completely misunderstand them, and I think that leads to the hope that they’re just a fad.

      • TheOneCurly@lemm.ee · ↑13 · 6 hours ago

        You’re describing exactly how all these web tools worked. “HTML, CSS, and JS are too hard to do manually. Here’s a shiny new tool that abstracts all that away and lets you get right to making your site!” Except they all added additional headaches and security concerns, and failed to cover edge cases, so you still needed to know how to do all that HTML, CSS, and JS anyway. That’s exactly how LLM-generated code works now. It’ll be useful and common for a while, then the technical debt will pile up and pile up, and eventually everyone will look around, think “what the hell were we thinking”, and tear it all down.

  • fuzzzerd@programming.dev · ↑3 ↓2 · 4 hours ago

    I don’t remember progressive web apps having anywhere near the level of fanfare of the other things on this list, and as someone who has built several PWAs, I feel their usefulness is undervalued.

    More apps in the app store should be PWAs instead.
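
    For anyone who hasn’t built one, the moving parts are small (a minimal sketch; the `/sw.js` path and log messages are illustrative, and you’d still need a web app manifest alongside this):

    ```ts
    // Register a service worker so the app can be installed and work offline.
    if ("serviceWorker" in navigator) {
      navigator.serviceWorker
        .register("/sw.js")
        .then((reg) => console.log("PWA service worker active at", reg.scope))
        .catch((err) => console.error("service worker registration failed:", err));
    }
    ```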

    Otherwise this list is great and I love it.

  • teodorista@lemm.ee · ↑27 ↓3 · edited · 11 hours ago

    Thanks for summing it up so succinctly. As an aging dev, I’ve seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.

  • someacnt@sh.itjust.works · ↑10 ↓2 · 9 hours ago

    It pains me so much when I see my colleagues pay OpenAI to do their programming assignments… they find it faster to ask GPT than to learn things properly. Sadly, I can say nothing to them, or I would risk worsening my relations with them.