• ɔiƚoxɘup@infosec.pub · 1 day ago

    I’ve had this exact thought. I think the urge to tell the story of the world is a parental instinct.

    Whenever the kiddos ask a question, I don’t stop answering and describing until they get bored… Sometimes beyond that.

    • tatterdemalion@programming.dev · 1 day ago

      I have the same urge, but rather to be a teacher/mentor than a parent. Too bad the US doesn’t want to pay teachers what they’re worth, or I’d strongly consider a career change.

      • SorteKanin@feddit.dk (OP) · 16 hours ago

        I’ve had the exact same thought, and it’s not just in the US. Teachers are woefully underpaid in general. I think part of the problem is that there are so many teachers that it would be very expensive for society as a whole to give them a higher salary. I would totally do teaching if it paid better, though.

  • Hammocks4All@lemmy.ml · 1 day ago

    It’s pretty fascinating that babies are largely the same across big timescales and just learn the culture of the time. They are ready to learn “utopia”; we just have to figure out how to teach it.

    • oldfart@lemm.ee · 1 day ago

      Haha, that’s so far from reality though. Dad is talking about boring things again, I will cover my ears and scream.

      • cynar@lemmy.world · 18 hours ago

        Children are often a mirror to our true selves. My daughter is fascinated by the world and loves learning. She’s a sponge for new knowledge. The only limitation is building up the layers of knowledge needed to understand what you’re trying to teach.

        • oldfart@lemm.ee · 17 hours ago

          Children are often a mirror to our true selves.

          Woah, that was a good roast.

      • ZoopZeZoop@lemmy.world · 1 day ago

        My 6 year old loves to learn about math, the sciences, and a few other subjects to a lesser degree, plus practical stuff like driving/traffic behavior. He mainly likes the biological sciences and astronomy, but also some physics and engineering.

          • ZoopZeZoop@lemmy.world · 14 hours ago

            I don’t know how much is innate and how much we fostered, but we read a ton with him from a very young age and made it into little quiz games and fill-in-the-blank questions. He’s always liked knowing the answers. My 3 year old is less interested, but I think she’ll absorb a lot just by being around her brother. We do read to her, of course, but she chooses different books than the 6 year old does (and even did at 3).

  • mannycalavera@feddit.uk · 2 days ago

    I was expecting the baby to pee in his face or something. Why have you tricked me into feelings, internet? Why?

    • skulbuny@sh.itjust.works · 2 days ago

      I’ve already resigned myself to the fact that I will probably die alone in an apartment, not found for weeks, because no one checks in on me other than my parents. I won’t kill myself, but I’m not good at socializing.

      • TheRealKuni@lemmy.world · 1 day ago

        Most people aren’t good at things they don’t practice. But lots of people get really good at things they weren’t good at.

  • tacosanonymous@lemm.ee · 2 days ago

    I feel like my knowledge would be too basic. I’d get the gist of it, but not well enough to replicate it, and they’d just humor me because they’d think I’m special.

    • ɔiƚoxɘup@infosec.pub · 1 day ago

      I run up against that with my kiddos, so I just show them how to get answers and how to learn. When they get older, I’ll teach them more about vetting sources and sussing out misinformation. I’ve already taught them as much about that as I can for their age.

  • FaceDeer@fedia.io · 2 days ago

    We’ve already got LLMs that can simulate conversing with those dead people to some degree, so I wouldn’t say they’re beyond the reach of any technology. In a few years the simulations might be good enough that you can’t tell the difference.

    • Rayquetzalcoatl@lemmy.world · 2 days ago

      No, they are beyond the reach of any technology. An Isaac Newton themed chatbot isn’t actually the spirit of Isaac Newton. It’s a chatbot.

    • OsrsNeedsF2P@lemmy.ml · 2 days ago

      People downvoted you, but I don’t think they touched on the main idea: I don’t want to show Einstein modern physics for my entertainment, I want to teach it to him so he can be amazed.

    • qjkxbmwvz@startrek.website · 2 days ago

      good enough simulations that you can’t tell the difference.

      This would require having actual conversations with those dead people to compare against, which we obviously can’t do.

      There is simply not enough information about a dead person to train a comprehensive model of how they would respond in arbitrary conversations. You might be able to train with some depth in their field of expertise, but the whole point is to talk about things they have no experience with, or at least things that weren’t known then.

      So sure, maybe we get a model that makes you think you’re talking to them, but that’s no different than just having a dream or an acid trip where you’re chatting with Einstein.

      • MajorHavoc@programming.dev · 1 day ago

        There is simply not enough information about a dead person to train a comprehensive model of how they would respond in arbitrary conversations.

        True. And even if we did, most of them would be super racist, anyway. Just like chatbots from a few years ago!

        Wait, maybe we do have the necessary technology… Hooray? Lol.

        • ɔiƚoxɘup@infosec.pub · 1 day ago

        Well, I’d think you’d test that model against living authors with similar inputs, make comparisons, and refine the process until nobody can tell the difference. We’ll never get all the way there, but I bet we’ll get far enough that nobody will be able to tell.

        As for when… Who knows?

    • Cypher@lemmy.world · 2 days ago

      Talking to an LLM is no better than speaking to a clever parrot. It might say the correct words, but it has no understanding, so there is no value in teaching it beyond a petty parlour trick.

          • TheFinn@discuss.tchncs.de · 20 hours ago

            They might mimic those things in a convincing fashion in a few years, but there wouldn’t be a reason for them to exist. There’s no person behind the curtain, or inside the multilayered, statistically weighted series of if statements.