I just bought a “new” homelab server and am considering adding in some used/refurbished NVIDIA Tesla K80s. They have 24 GB of VRAM and tons of compute power for very cheap if you get them used.

The issue is that these cards run super hot and require an extra cooling setup. I was able to find this fan adapter kit on eBay. But I still worry that if I pop one or two of these bad boys in my server, the fan won’t be enough to overcome the raw heat put off by the K80.

Have any of you run this kind of card in a home lab setting? What kind of temps do you get when running models? Would a fan like this actually be enough to cool the thing? I appreciate any insight you guys might have!

  • wobblywombat@kbin.social
    1 year ago

    I looked at this route and didn’t have a use case that merited the hassle, heat, and cost. I use my CPU to tinker and spin up cloud resources if I need more oomph. Can I ask, what are you planning to do with the compute power?

    • wagesj45@kbin.socialOP
      1 year ago

      One thing I’d like to do is generate a podcast by scraping news articles, generating a script of two characters discussing the content of the article, then rendering it with an AI voice. With larger context lengths becoming a thing, I think it’s doable now, especially with so much VRAM in one of these things. I’d also like to run a Stable Diffusion bot for the small chatroom I host for me and my friends. I’m sure I’ll be able to come up with some more uses as time goes on.
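
      For what it’s worth, the glue code for that podcast pipeline is pretty light. Here’s a minimal stdlib-only sketch: pull paragraph text out of an article’s HTML, build a prompt asking a local LLM for a two-host dialogue, and split the generated script into (speaker, line) pairs you’d feed to whatever TTS engine you pick. The host names and the helper names here are just placeholders, not any particular API.

      ```python
      import re
      from html.parser import HTMLParser


      class _TextExtractor(HTMLParser):
          """Crude article scraper: collect the text inside <p> tags."""

          def __init__(self):
              super().__init__()
              self._in_p = False
              self.chunks = []

          def handle_starttag(self, tag, attrs):
              if tag == "p":
                  self._in_p = True

          def handle_endtag(self, tag):
              if tag == "p":
                  self._in_p = False

          def handle_data(self, data):
              if self._in_p:
                  self.chunks.append(data.strip())


      def extract_article_text(html: str) -> str:
          """Return the article body as one plain-text string."""
          parser = _TextExtractor()
          parser.feed(html)
          return " ".join(c for c in parser.chunks if c)


      def build_dialogue_prompt(article: str, hosts=("Alice", "Bob")) -> str:
          """Prompt for a local LLM to write a two-host script.

          Host names are placeholders; swap in your own characters."""
          return (
              f"Write a podcast script in which {hosts[0]} and {hosts[1]} "
              f"discuss the following article. Prefix each line with the "
              f"speaker's name and a colon.\n\nARTICLE:\n{article}"
          )


      def split_script(script: str):
          """Split an LLM-generated script into (speaker, line) pairs.

          Each pair would then be handed to a TTS voice per speaker."""
          pairs = []
          for line in script.splitlines():
              m = re.match(r"^(\w+):\s*(.+)$", line.strip())
              if m:
                  pairs.append((m.group(1), m.group(2)))
          return pairs
      ```

      The only model-dependent parts are the LLM call between `build_dialogue_prompt` and `split_script`, and the TTS step at the end, so it’s easy to swap backends as you experiment.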