EDIT: A few days ago I bought an HP X34. It’s a 3440x1440 ultrawide with a 165 Hz refresh rate. Obviously my setup can’t hit a constant 165 fps in every game, but I’m comfortably getting 100+ in most games, and 120+ in Forza Horizon 5 or Doom Eternal. Can’t complain :)

I’m looking to buy a new monitor, switching from two 16:9 displays to a single 21:9. Everywhere I read, the opinion is that on an ultrawide it doesn’t make sense to go lower than 1440p, which I guess holds true for 34" monitors.

However, I’m worried my 3060 Ti won’t be enough for that many pixels. Right now I’m enjoying consistently smooth framerates playing at 1920x1080.

How much should I really worry about making this switch? My other option is to go for a 2560x1080 monitor that’s smaller (29").
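
For scale, here’s a rough pixel-count comparison of my current resolution against the two options I’m weighing up (just back-of-the-napkin Python arithmetic):

```python
# Rough pixel counts for the resolutions being weighed up here.
resolutions = {
    '1920x1080 (current)': (1920, 1080),
    '2560x1080 (29" ultrawide)': (2560, 1080),
    '3440x1440 (34" ultrawide)': (3440, 1440),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f'{name}: {px / 1e6:.2f} MP, {px / base:.2f}x my current pixel load')
```

So the 34" 1440p would be roughly 2.4x the pixels I’m pushing now, while the 29" 1080p is only about 1.33x.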

  • rexxiteer@lemmy.world · 1 year ago

    It completely depends on the games you play.

    Personally, I’d just go with the higher-resolution display. If you notice your GPU struggling at that resolution, you can always turn down the game’s render resolution, but you’d still be on a larger monitor.

    The larger issue when gaming is probably that some games don’t play nice with ultrawide monitors, so your mileage may vary.

    • MHcharLEE@lemmy.world (OP) · 1 year ago

      I’m fully prepared to have to play some games in 16:9 if that’s the case. I naturally want to get the full experience where possible, but I’m ready to compromise.

      My potential concern with lowering the resolution is that 1080p doesn’t scale cleanly into 1440p (it’s a non-integer ratio). It’s a bigger issue with text than with games, but I haven’t had a chance to see that supposed problem with my own eyes.

      • Aloomineum@kbin.social · 1 year ago

        I have an ultrawide and there are lots of workarounds for games that don’t directly support it. You can look up the Flawless Widescreen program for some games; for others, there’s often a patch or mod that works.

        Outside of fighting games like Street Fighter, I can’t remember the last game I didn’t get working with ultrawide.

        Hope you enjoy it!

      • godofpainTR@lemmy.world · 1 year ago

        Most (recent) games have a way to reduce the render resolution while keeping the UI at native resolution. And for those that don’t, lowering the settings a bit should be enough with your GPU.
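
        To put rough numbers on that (a quick Python sketch, assuming the common linear per-axis scale slider; exact behaviour varies per game):

        ```python
        # What a render-resolution slider works out to on a 3440x1440 panel
        # (plain arithmetic; actual slider behaviour varies per game).
        native_w, native_h = 3440, 1440

        for scale in (1.00, 0.85, 0.75, 0.65):
            w, h = round(native_w * scale), round(native_h * scale)
            print(f'{scale:.0%} render scale -> {w}x{h} ({w * h / 1e6:.2f} MP)')

        # 1920x1080 is about 2.07 MP, so ~65% scale is roughly a 1080p-sized
        # GPU load while the UI stays at native 3440x1440.
        ```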

  • MedicareForSome@lemmy.ml · 1 year ago

    I run a 3070 with the Gigabyte G34WQC (21:9 1440p). The 3070 is basically the same card as the 3060 Ti.

    I would say the major limitation is VRAM. The card has enough horsepower to comfortably run most games. With DLSS I expect >100 fps.

    However, it’s definitely showing its age. Current consoles have effectively 12 GB of VRAM, so the 8 GB is sort of teetering on the edge. If it were a 16 GB card, it would be fine for years to come.

    If all you want is no stuttering, it will be great. I never see below 80 in demanding games. However, driving a full 144 fps will be an issue.

    • MHcharLEE@lemmy.world (OP) · 1 year ago

      That’s what I needed. Thank you. I’ll upgrade the GPU sooner or later (probably later); I knew what I was getting myself into with 8 GB of VRAM.

      Speaking of the G34WQC, I’m eyeing that monitor among a few others. Is the antiglare coating really as grainy as some people claim? Supposedly it makes text look fuzzy on bright backgrounds.

      • MedicareForSome@lemmy.ml · 1 year ago

        I feel like a lot of the complaints about the G34WQC come from monitor snobs. I would argue it’s one of the best bang-for-your-buck monitors on the market, and certainly the best budget ultrawide. Personally, I’ve yet to experience these issues. The coating being grainy is the first I’m hearing of it, and I haven’t noticed it on mine; I code and read on it without problems. As for text, I’ve seen people complain about the curve, but I handle that just by snapping windows to one side. It essentially serves as two monitors.

        Another big complaint is black smearing/ghosting. I haven’t noticed this at all, and I run dark mode on Windows. I assume it’s real, but I just don’t have an eye for it. I use an IPS monitor for work because I need color accuracy, and switching back to the G34WQC I notice that blacks look better, but I don’t see any smearing or inconsistency.

        Out of the box the monitor doesn’t look its best, but that’s true of all cheap-o monitors. It can be fixed in software, though; you have to calibrate it. I use the ICC profile from RTINGS.

        According to their tests, this brings color accuracy up substantially. That isn’t specific to this monitor, though; they really should all be calibrated.

        I got it for $350 on sale a few years ago and it was a great purchase.

        • MHcharLEE@lemmy.world (OP) · 1 year ago

          Oh, I’m by no means a monitor snob. It took me two years to realize my IPS monitor wasn’t true 8-bit but used FRC instead, haha.

          I checked again and that antiglare complaint is about the M34WQ, the flat IPS version of your monitor. I mixed them up.

  • Giddy@aussie.zone · 1 year ago

    Definitely don’t buy a 1080p monitor at those sizes. For me, 27" is the largest I would go at 1080p, and even that is a stretch.

    • MHcharLEE@lemmy.world (OP) · 1 year ago

      29" 21:9 at 1080p has about the same pixel density as 24" 16:9 at 1080p, which is what I’m using right now.
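
      Quick sanity check with the standard PPI formula, diagonal pixels over diagonal inches (a small Python sketch):

      ```python
      import math

      # PPI = diagonal resolution in pixels / diagonal size in inches
      def ppi(width_px, height_px, diagonal_in):
          return math.hypot(width_px, height_px) / diagonal_in

      print(f'24" 16:9 1080p: {ppi(1920, 1080, 24):.1f} PPI')  # ~91.8
      print(f'29" 21:9 1080p: {ppi(2560, 1080, 29):.1f} PPI')  # ~95.8
      print(f'34" 21:9 1440p: {ppi(3440, 1440, 34):.1f} PPI')  # ~109.7
      ```

      So roughly 96 vs 92 PPI, basically the density I’m used to, and the 34" 1440p would actually be noticeably sharper.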

  • widowhanzo@lemmy.fmhy.ml · 1 year ago (edited)

    Just go for it. I’m running a 2080 with a 4K 144 Hz monitor and it runs OK, and when it doesn’t, I can reduce the details or enable DLSS or something. A 3060 Ti should be perfectly fine for 3440x1440.

    But then again, it really depends on the games you play. If you’re currently struggling to keep 60 fps at 1080p, then of course it’s gonna run worse at a higher resolution. And there are benefits to the larger display and higher resolution outside of games as well.

    Don’t get a 1080p 29"; it’s gonna be smaller overall than two 24" monitors. A 29" ultrawide is the same height as a 24", just wider, but not twice as wide.

    • MHcharLEE@lemmy.world (OP) · 1 year ago

      Appreciate the insight.

      The benefits outside of games are half the reason I’m considering an ultrawide. The 29" 1080p is the lesser choice, but it’s still on the table, since I don’t exactly use my current second monitor. I’m fine with downsizing, but I’d also be happy with a flashy, big, wide 34" monitor.

  • SaveComengs@lemmy.federa.net · 1 year ago

    I run a 3060 Ti with an M34WQ and it works fine. I get decent (60+) fps in all the games I’ve played, though recently I’ve mostly been playing undemanding games.

    • MHcharLEE@lemmy.world (OP) · 1 year ago

      The M34WQ is the flat IPS one, right? I asked about it in this thread, but my question was about the wrong monitor.

      On your M34WQ, how is the antiglare coating, in your opinion? Is it as grainy and distracting as people claim it to be?

      • SaveComengs@lemmy.federa.net · 1 year ago

        I don’t find it annoying at all, even though I sometimes have the sun hitting the panel directly. When that happens, I just turn up the brightness with Twinkle Tray and it’s fine.

        The colours are really vibrant, and I like it overall.

  • WereCat@lemmy.world · 1 year ago

    I used a 3060 Ti at 1440p 165 Hz until recently; the card is definitely adequate if you don’t need high/ultra settings all the time. Obviously some new games won’t run that well, but everything should still be very playable with some tweaking.

  • Cat@kbin.social · 1 year ago

    I’m running that GPU right now with two 4K monitors and a TV at 2K, all at 60 Hz. It runs great.

  • CaptPretentious@lemmy.world · 1 year ago

    There’s no single yes/no answer; it very much depends on the workload. Even knowing the game isn’t enough, because there are also the settings in said game, any mods that might be running, any additional apps that might be using the video card, and whether your CPU can actually feed the video card.

    Personally, I’d say ultrawide monitors just aren’t worth it for gaming since most games won’t support it. Honestly, if you’re looking for more, a single 4K monitor might be the better route? Just a thought.

  • Giddy@aussie.zone · 1 year ago

    I’m running 3x WQHD (2560x1440) monitors on an ancient RX 580. Currently playing a lot of Diablo 4 on one screen at low-medium settings with no trouble.

  • GetsThruBuckner@lemmy.world · 1 year ago

    I’ll give my personal input…

    I play at 3440x1440 with a 3070 and really feel like it’s held back by the 3070, especially when you consider all these unoptimized games now. If you’re just looking to play at 60 fps, you might be alright for a while. I often lock more demanding games to 60 fps now, because otherwise it pushes the card to 99% usage and makes games feel awful.

    • R0cket_M00se@lemmy.world · 1 year ago

      Same setup for me, and yeah, I feel like I’m good to go for now, especially since I don’t really play a lot of the games that are optimized to shit, but an upgrade will need to happen eventually.

      I’d suggest at least a 3070 Ti, or going AMD, for someone trying to get into entry-level 3440x1440.