• zerakith@lemmy.ml · 3 months ago

    It is probably good that the open-source community is exploring this; however, I’m not sure the technology is ready (or ever will be), and it risks undermining the labour-intensive work of producing high-quality subtitling for accessibility.

    I use autogenerated subtitles quite a lot, and I’ve noticed they really struggle with key things like regional/national dialects, subject-specific words, and situations where context would allow improvement (e.g. a word invented solely in the universe of the media). So it’s probably managing 95% accuracy, which is that danger zone where it’s good enough that no one checks it but bad enough that it can be really confusing if you are reliant on them. If we care about accessibility, we need to care about the subtitles being high quality.

    • markinov@lemmygrad.ml · 3 months ago

      While good-quality subtitles are essential, VLC can’t ensure that; it’s the responsibility of the production studio. AI subtitles in VLC are for the many videos that don’t have any subtitles at all. The pushback shouldn’t be against VLC implementing AI, but against production studios replacing translators or transcribers with AI (like Crunchyroll tried last year).

      Also, while transcription and subtitle editing are labour-intensive work, using AI to help the editors shouldn’t be discouraged; it can increase their productivity by automating repetitive tasks so that they can focus on quality.

  • FuckyWucky [none/use name]@hexbear.net · 3 months ago

    This means that they most likely went for lighter AI models that use fewer resources, so that they run smoothly without putting too much strain on the machine.

    Pretty good. Captions are one of the legitimate uses of “AI”.
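
    For a sense of what “lighter AI models that use fewer resources” can look like in practice, here is a minimal sketch using the open-source openai-whisper Python package and its smallest checkpoint. It only illustrates the general approach; it is not VLC’s actual code, and the file name is a placeholder.

    ```python
    # Hedged sketch: fully local subtitle generation with a small model.
    # Assumes the openai-whisper package (pip install openai-whisper) and ffmpeg.
    import whisper

    # "tiny" (~39M parameters) sits at the lightweight end of the trade-off the
    # quote describes: less accurate, but it runs on a plain CPU without putting
    # much strain on the machine.
    model = whisper.load_model("tiny")

    result = model.transcribe("episode.mkv")  # whisper uses ffmpeg to decode the audio
    for seg in result["segments"]:
        print(f"{seg['start']:7.2f}s --> {seg['end']:7.2f}s  {seg['text'].strip()}")
    ```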

  • S13Ni@lemmy.studio · 3 months ago

    This isn’t a bad thing by default if it’s something you only use when you choose to, when no other subtitles are available. I hate AI slop too, but people just go into monkey-brain rage mode when they read “AI” and stop processing any further information.

    I’d still always prefer human-translated subtitles where possible. That said, right now I’m looking into translating an entire book via an LLM, because it would be the only way to read it: it isn’t published in any language I speak. I speak English well enough that I don’t really need subtitles; I just like having them on so I won’t miss anything.

    For English-language movies, I’d probably just watch without subtitles if the only ones available were AI-generated, since I don’t really need them; they’re more of a nice-to-have in case I miss something. For languages I don’t understand, it might be useful, although I wager it will be quite bad for less common languages.

    • The Doctor@beehaw.org · 3 months ago

      There’s a difference between LLM slop (“write me an article about foo”) and using an LLM for something that’s actually useful (“listen to the audio from this file and transcribe everything that sounds like human speech”).
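
      As a concrete illustration of the useful case (a sketch only, assuming the openai-whisper package rather than anything VLC actually ships), the speech segments such a model returns map almost directly onto a standard .srt subtitle file:

      ```python
      # Hedged sketch: transcribe human speech locally and write it out as .srt.
      # Assumes the openai-whisper package; "movie.mp4" is a placeholder path.
      import whisper

      def srt_timestamp(seconds: float) -> str:
          """Format seconds as the HH:MM:SS,mmm timestamps SRT expects."""
          ms = int(round(seconds * 1000))
          h, ms = divmod(ms, 3_600_000)
          m, ms = divmod(ms, 60_000)
          s, ms = divmod(ms, 1_000)
          return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

      model = whisper.load_model("base")
      result = model.transcribe("movie.mp4")

      with open("movie.srt", "w", encoding="utf-8") as srt:
          for i, seg in enumerate(result["segments"], start=1):
              srt.write(f"{i}\n{srt_timestamp(seg['start'])} --> {srt_timestamp(seg['end'])}\n")
              srt.write(seg["text"].strip() + "\n\n")
      ```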

      • S13Ni@lemmy.studio · 3 months ago

        Exactly. I know someone who is really smart and works in machine learning, and when I listen to him in isolation, AI sounds like a genuinely useful thing. Most people just aren’t smart like that, and most applications of AI are not very useful.

        One of the things I often think about is that AI makes it very easy and fast to do things that shouldn’t be done, things that would previously have taken more effort or craft than some people could muster. Now they can easily throw together a website for whatever grift they’re pushing.

    • superkret@feddit.org · 3 months ago

      “Do one thing well” is what gives you software like mutt, which requires several other programs to be actually useful, all of which have to be configured separately to work together, with wildly different syntax.

    • RedstoneValley@sh.itjust.works · 3 months ago

      VLC has always had a ton of features: network device playback, TV, streaming server, files, physical media, music player, effects, recording, AV format conversion, subtitles, plugins, and so on.

    • SoulWager@lemmy.ml · 3 months ago

      What do you mean by active component? Is processing the audio being played back to add subtitles active?

      • Despotic Machine@lemmy.dbzer0.com · 3 months ago

        Is processing the audio being played back to add subtitles active?

        Not sure where you are confused. If any part of this feature is active by default I will disable it.

        • limelight79@lemm.ee · 3 months ago

          The way you wrote this, I thought you meant that if it required a cloud service you would turn it off. But now I think you’re just saying you wouldn’t use this feature.

          I share the confusion over your definition of “active”. You got all defensive when someone asked, so now no one really knows what you meant.

  • IronKrill@lemmy.ca · 3 months ago

    Not against this feature, but this quote made me laugh:

    … once this is in place, people won’t have to scour the internet for sourcing subtitles to their favorite movies, shows, or even anime.

    As if MTL will get anywhere near the nuance of a properly made human translation.

    • Ferk@lemmy.ml · 3 months ago

      Personally, I would be happy even if it didn’t translate at all but could give a half-decent transcription of at least English speech into English text. I prefer having subtitles even when I speak the language, because they help in noisy environments and/or when the characters mumble or have strong accents.

      However, even that would likely be difficult with a lightweight model. Even big companies like Google often struggle with their autogenerated subtitles: whenever there is very context-specific terminology or an uncommon name, they fumble. And layering translation on top of an already incorrect transcript multiplies the nonsense, even if the translation itself were technically correct.
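
      For what it’s worth, Whisper-style models do have a crude knob for the terminology problem: you can prime the decoder with the odd names and in-universe terms before transcribing. A hedged sketch with the openai-whisper package (the names and file are invented placeholders, and this says nothing about how VLC will expose its feature):

      ```python
      # Sketch: nudging a Whisper-style model toward context-specific terms.
      # initial_prompt conditions the decoder, so unusual names are more likely
      # to be spelled the way you expect. Everything below is a placeholder.
      import whisper

      model = whisper.load_model("small")
      result = model.transcribe(
          "episode_03.mkv",
          initial_prompt="Characters: Zarneth, Veyra; places: Port Haelin, the Duskfall Accord.",
      )
      print(result["text"])
      ```

      It helps with names and jargon, but it doesn’t fix the underlying accuracy problem.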

  • metaStatic@kbin.earth · 3 months ago

    I’ve seen some pretty piss-poor implementations in streaming apps, but if anyone can get it right, it’s VLC.

  • Juntti@sopuli.xyz · 3 months ago

    I wonder how good it is.

    Does it translate from audio or from text?

    Does it translate multiple languages? If a video contains languages A, B, and C, does it translate all of them to X?

    Does the user need to set the input language?

  • coolmojo@lemmy.world · 3 months ago

    What would be actually cool is if it could translate foreign movies based on the audio and add English subtitles to them.
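
    That workflow already exists in the open-source Whisper family of models: their “translate” task takes speech in another language and emits English text directly, with the input language auto-detected. Below is a minimal sketch using the openai-whisper package; no claim that VLC’s feature is built this way, and the file name is a placeholder.

    ```python
    # Hedged sketch: foreign-language audio in, English subtitle text out.
    # Assumes the openai-whisper package; the input language is auto-detected.
    import whisper

    model = whisper.load_model("small")
    # task="translate" makes the model output English regardless of the spoken language.
    result = model.transcribe("french_film.mkv", task="translate")
    for seg in result["segments"]:
        print(f"[{seg['start']:.1f}s] {seg['text'].strip()}")
    ```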

  • Fonzie!@ttrpg.network · 3 months ago

    Oh so that wasn’t a joke from their booth.

    This seems really out of place, but locally run auto-subtitles from ethically sourced AI would be great.

    It’s just that there are two very big conditions in that sentence.

      • Fonzie!@ttrpg.network · 3 months ago

        JetBrains’ AI code suggestions were trained only on code whose authors gave explicit permission, but that’s the only example I know off the top of my head. Most chat-oriented LLMs (ChatGPT, Claude, Gemini…) were almost certainly trained using corporate piracy.

      • smayonak@lemmy.world · 3 months ago

        There are a number of open-weight, open-source models out there with all of their training data sourced from the public domain. Look up BLOOM and Falcon. There are others.