Is it like a rough inference of what’s being said based on mouth movements, or is it more precise somehow? Would it be a mistake to think you knew exactly what was said by reading lips (even if you were good at it)?

  • deur@feddit.nl · 1 year ago

    As someone who was born with unilateral moderate->severe hearing loss, I can read lips. The experience is likely unique across the hearing loss spectrum / time of onset, and some people may be able to learn that skill themselves, idk. I’m sure 100% deaf people experience it in their own interesting way.

    To me it’s not anything I consciously do, and it’s not something that’s really that visible to me. The fact that I can still hear, just not as well as people with normal hearing, affects how it works. The way I’d explain it for me is kinda like this:

    Sometimes I can’t hear enough to tell what is being said. One way my brain naturally deals with this is by reading the speaker’s lips and using that to help filter and understand what it’s hearing. I can kinda apply it as a skill, like with muted videos and people I can’t hear because of distance, but it doesn’t work that well and isn’t worthy of trust.

    So for me it’s more of a sense, not something I do or think about. However, it’s basically the least-effort way to understand speech that isn’t clear enough. This is in contrast to another way I/my brain goes about it, which is trying really hard to figure out what it just heard.

    To answer your last question, yes, it is likely a mistake. There’s a YouTube channel about that whole concept, called Bad Lip Reading. They dub over videos with audio that matches the lips well enough.

    To put my experience into perspective, which might work for at least a few people: closed captions (subtitles). I mean… I’ve never asked anyone else, but y’all aren’t just reading them, right? To me they just clarify the speech subconsciously (for the most part), rather than me reading them off the screen when I need them. Captions are weird… Who knows if this is accurate to my experience or similar to others.

    • MrShankles@reddthat.com · 1 year ago

      Your experience seems very much like my own. I don’t have hearing loss, but what I assume is an auditory processing issue with speech.

      It’s much easier for me to understand what someone is saying when I can see their mouth and microexpressions. If my back is turned, I don’t always catch everything. Sometimes I keep hearing the “wrong thing” no matter how much I ask them to repeat it… so I just learned to repeat back whatever nonsensical thing I “heard”, and that either helps me process what they were trying to say, or they repeat it back slower and more clearly. It’s frustrating sometimes, especially in noisier environments with a lot of other stimuli… that’s when seeing someone’s lips helps the most for me.

      And of course, I love subtitles. Otherwise I have to blast the TV and still miss things. The subtitles just clarify what I’m pretty sure I heard, or what I missed. I’m not just reading my way through everything, unless it’s in another language… which then does feel like a switch in the way I “see” subtitles.

    • Punkie@lemmy.world · 1 year ago

      Similar for me: when my hearing started to go in my 30s, the doctor said, “You already know how to lip read.” I didn’t believe him until he tested me with “Am I saying ‘top’ or ‘cup’?”; when he covered his mouth, I couldn’t tell which one he was saying.

    • AnalogyAddict@lemmy.world · 1 year ago

      This, completely. I didn’t even know how much I depended on reading lips until everyone worth listening to was wearing a mask.

      • deur@feddit.nl · 1 year ago

        Yeah, it really sucked: speech slightly muffled by the mask and no lips to read.

        • rosymind@leminal.space · 1 year ago

          Same. My mind makes up random words for sounds all the time, so if I don’t have context or lips to read, the sentences I hear people say are just wild.

          I repeat things back to people wearing masks so that I can be sure I understand them. It annoys some of them, but whatever.

    • OneWomanCreamTeam@sh.itjust.works · 1 year ago

      As far as being similar to others’ experiences: I don’t have any significant hearing loss, but you basically just described my subjective experience reading lips (and subtitles).

    • Mouselemming@sh.itjust.works · 1 year ago

      When you say subtitles do you mean closed captions? Because I agree those are a boost for me to follow what I’m also seeing and hearing the person say. But with subtitles they’re speaking a different language so lip-reading isn’t helpful and hearing just adds tone of voice.

    • ALostInquirer@lemm.ee (OP) · 1 year ago

      Thanks! I appreciate the perspective on this, as lip-reading is kinda like “eye-reading” to me in that I’ve struggled to understand what’s involved.

      To put my experience into perspective, which might work for at least a few people: subtitles. I mean… I’ve never asked anyone else, but y’all aren’t just reading them, right? To me they just clarify the speech subconsciously (for the most part), rather than me reading them off the screen when I need them. Subtitles are weird… Who knows if this is accurate to my experience or similar to others.

      This also helps me understand, as I often do watch stuff with subtitles to help better follow dialogue, and I’m usually not closely reading them all throughout.

  • Lvxferre@lemmy.ml · 1 year ago

    It’s inference based on mouth movements, but it isn’t as rough as it seems: context plays a huge role in disambiguation, just like it would for you with homonyms that you hear. It’s just that the number of words that look similar when you read lips is larger than the number of words that sound the same, since some sounds are distinguished by articulations that you can’t immediately see. For example, [f] vs. [v]: you won’t see the vocal folds vibrating for the latter, so “fine” and “vine” look almost* the same.

    Also, the McGurk effect hints that everyone uses a bit of lip reading on an unconscious level; for most of us [users of spoken languages], it mainly serves to disambiguate and reinforce the acoustic signal.

    *still not identical - for [v] the teeth will touch the bottom lip a bit longer.

  • CluckN@lemmy.world · 1 year ago

    I think exact lip reading is impossible, but with enough practice you can get pretty good at identifying words. A good example of lip reading producing false words is Bad Lip Reading.

  • Contramuffin@lemmy.world · 1 year ago

    No hearing loss here, but I’m semi-alright at reading lips. It’s somewhat guesswork, but you can make out a decent amount of info, depending on how clearly the other person enunciates their words.

    I suspect most or all people already do lip reading to some extent, but you can definitely “train” yourself to read lips better.

    I mainly look for consonants since those are the easiest to identify (the shape you make when you make an m sound looks super different from when you make a t sound, for instance). There’s a slight bit of guessing involved, since several consonants have the same mouth shape (m and b, for instance). Sometimes, vowels can throw you a bone and be really easy to read (the a in apple, for instance, has you open your mouth very wide), but I generally struggle to read most vowels. The rest is just piecing together what was said based on context clues.

  • Otter@lemmy.ca · 1 year ago

    I’ve heard that it’s easier if you’re familiar with the person, past that I’m curious too!

  • CaptObvious · 1 year ago

    As a linguist, I suspect that everyone lipreads to some extent as a conversation repair mechanism. Accuracy probably depends on skill and context. Family members with hearing loss are pretty good at understanding a speaker that they can see clearly, even when there’s no sound information available at all.

  • Pulptastic@midwest.social · 1 year ago

    I can’t do it at all; I’m scouring this thread for tips. I suspect it’s pattern recognition my brain has not yet been trained to do.

  • ℕ𝕖𝕞𝕠@midwest.social · 1 year ago

    All I know is that when people started wearing masks, suddenly I had trouble understanding them. I guess I’d picked it up subconsciously alongside my hearing loss.

    • ALostInquirer@lemm.ee (OP) · 1 year ago

      For anyone not wanting to click, it’s a short video from the National Geographic channel titled: “What It’s Like to Read Lips” and it’s good! It definitely reinforces how I’m not great at reading lips. 😅

    • Pulptastic@midwest.social · 1 year ago

      Oh man that was awful. I watched it with sound off and did not get a single word. It is indecipherable flapping.