• Fal@yiffit.net

    They kinda are though lol.

    Not really. Explain what you do differently in -10F temperatures that you wouldn’t do in 0F temperatures in normal life. I don’t want to hear about how you would choose a different sleeping bag or prepare your snowshoes differently or some shit. When your day consists of commuting to work, going to the grocery store, then going home, what meaningful difference do any values below 0F have?

    But what about sauna.

    What about it?

    What about really cold weather.

    What about it?

    What about cooking.

    What about it?

    Hell, what about my PC.

    What about it?

    What about when I have a fever

    This is actually the perfect example. Above 100 is a fever. Below is fine

    What about really hot weather

    What about it?

    The temperatures are about much more than the fuzzy idea of normal-ish weather in certain places on earth.

    Not in 99% of how people use the temperature.

    And your examples of cooking and your PC are not what we’re talking about. We’re talking about human environmental temperature. But in fact, cooking is another good use for F. You generally only care about a few specific temps: 350F and 400F. Anything else is nuance and basically only matters on the 25-degree marks, so 375 and 425. It’s actually a pretty great scale for cooking, with broiling generally maxing out at 500 (unless you’re talking about a very specific application, like pizza ovens or some shit).
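
    For the record, a quick sketch of where those F marks land in C. The marks are the ones mentioned above; the conversion is just the standard formula:

    # Rough illustration: the Fahrenheit oven marks mentioned above and their
    # Celsius equivalents, using the standard F-to-C conversion.
    def f_to_c(f):
        return (f - 32) * 5 / 9

    for mark in (350, 375, 400, 425, 500):
        print(f"{mark}F ~= {f_to_c(mark):.0f}C")
    # -> roughly 177C, 191C, 204C, 218C, 260C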

    • Kusimulkku@lemm.ee

      Yes, negative numbers exist, and numbers beyond 100. But they’re not that important.

      The 100-degree scale is about describing the normal range that humans interact with their environment in.

      “Well what about all these things outside of this range people use in their daily life?”

      What about it?

      LOL

      And your examples of cooking and your PC are not what we’re talking about. We’re talking about human environmental temperature.

      I’m making the case that your “human environmental temperature” is a shit reason to pick Fahrenheit because we have all these things that surprisingly don’t conform to it. So you’ll have to go outside the 0-100 range anyway. So you won’t get any “benefit” from it, even when the “benefit” was dubious to begin with.

      But in fact, cooking is another good use for F. You generally only care about a few specific temps: 350F and 400F. Anything else is nuance and basically only matters on the 25-degree marks, so 375 and 425. It’s actually a pretty great scale for cooking, with broiling generally maxing out at 500 (unless you’re talking about a very specific application, like pizza ovens or some shit).

      Wait till you see international ovens and cooking manuals. It’s gonna blow your mind.

      • Fal@yiffit.net

        I’m making the case that your “human environmental temperature” is a shit reason to pick Fahrenheit because we have all these things that surprisingly don’t conform to it. So you’ll have to go outside the 0-100 range anyway. So you won’t get any “benefit” from it, even when the “benefit” was dubious to begin with.

        It’s better to pick the scale that does conform to it for the vast majority of applications, and then just deal with the others, either by using C or just dealing with it. For every one time you need to deal with your computer’s temps, you’ll interact with the environmental temperature a thousand times. And neither C nor F is inherently better for describing CPU temps.

        Wait till you see international ovens and cooking manuals. It’s gonna blow your mind.

        Oh, I forgot to pull out my cooking manual. Yeah C is MUCH better.

        • Kusimulkku@lemm.ee

          It’s better to pick the scale that does conform to it for the vast majority of applications, and then just deal with the others, either by using C or just dealing with it. For every one time you need to deal with your computer’s temps, you’ll interact with the environmental temperature a thousand times. And neither C nor F is inherently better for describing CPU temps.

          I mean neither conforms very well, that’s the whole point. And what’s the deal with 0-100, why is that so beneficial in your opinion?

          And neither C nor F is inherently better for describing CPU temps.

          Well yeah, it was simply about the 0-100 thing.

          Oh, I forgot to pull out my cooking manual. Yeah C is MUCH better.

          Wait till you see the ovens. It’s incredible. There are usually only a few temps you need to care about, and they move in 20-degree marks. Incredible, I know.
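
          For comparison, a quick sketch of where the usual metric oven marks land in F. The 160–240C range in 20-degree steps is an assumption about a typical dial, not a number from the thread; the conversion is standard:

          # Rough illustration: assumed Celsius oven marks (160-240C in
          # 20-degree steps) and their Fahrenheit equivalents.
          def c_to_f(c):
              return c * 9 / 5 + 32

          for mark in (160, 180, 200, 220, 240):
              print(f"{mark}C ~= {c_to_f(mark):.0f}F")
          # -> 320F, 356F, 392F, 428F, 464F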

      • bermuda@beehaw.org

        Wait till you learn that there are other things to do in life than bitch about temperature systems.

        • Kusimulkku@lemm.ee

          This would probably be more impactful if you didn’t just jump into a discussion that didn’t involve you to make this observation lol

          • bermuda@beehaw.org

            I like how you respond to somebody saying you’re bitching by bitching more. Continue crying about it by yourself, please.

            • Kusimulkku@lemm.ee

              You replied to me, unprompted. Can’t go complaining that I replied after that; that’d be silly.