I got sick again, so the financial update and this thread are both late. I'll get the financial update posted at a later point, or I might just combine it with January's, since there's not that much to report as far as I can tell.

  • MangoKangaroo@beehaw.org · 11 months ago
    The weekend was pretty rough for me. I’m currently going through a medication taper, and that combined with some other life stuff had me super depressed for a few days. I was actually gonna go see my dad for the first time in a decade, but I decided to cancel because I was down and couldn’t be bothered to drive for four hours and sleep somewhere that wasn’t my home. (Also the weather along the coast is crap right now.)

    Other than that, I’ve kept myself distracted with some planning for a new home lab. I’m starting to give a fuck about fashion, so I’ve got some new stuff coming in that I’m excited to try. I’ve also got a razor coming so I can do my first real head shave that isn’t just the 1/8" that my clippers can manage.

    • silentdanni@beehaw.org · 11 months ago
      Ah man, dealing with depression can be really hard. I hope things are getting better for you and you’re happily trying out your new clothes.

      What are your plans for your new homelab, if I may ask?

      Hang in there, sir.

      • MangoKangaroo@beehaw.org · 11 months ago
        Thank you for your kind words. Every day gets brighter.

        For the homelab, I’m not 100% sure yet. I’m at least going to be getting a Synology NAS to replace my ancient LenovoEMC unit. I really wanted to get some hardware for running Llama 2 and KoboldCpp, but I’m struggling to find something that’s equal parts quiet (I live in a studio), affordable(ish), and up to the minimum specs I’d need. I was unironically considering a Mac Mini with a rack converter because of the energy efficiency and powerful iGPU, but sadly they only ship with up to 32GB of RAM. Since my reading suggests I’d want at least 64GB of RAM for Llama 2’s 70B version, I’ll have to find another way of doing things. I just wish I didn’t live in a studio so I could grab a secondhand rackmount server without worrying about noise levels. 😭
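        (For anyone curious where that 64GB figure comes from, here's a rough back-of-the-envelope sketch. The bits-per-parameter figures are common rule-of-thumb values for quantized GGUF-style models, not exact requirements, and the real footprint also includes the KV cache and runtime overhead.)

```python
# Rough RAM estimate for holding a 70B-parameter model's weights in memory
# (e.g. Llama 2 70B under KoboldCpp). Quantization bit-widths below are
# rule-of-thumb assumptions, not exact figures for any specific file.

PARAMS = 70e9  # 70 billion parameters

def weight_gib(bits_per_param: float) -> float:
    """Approximate size of the weights alone, in GiB."""
    return PARAMS * bits_per_param / 8 / (1024 ** 3)

for label, bits in [("fp16", 16), ("8-bit", 8), ("~4.5-bit (Q4)", 4.5)]:
    print(f"{label:>14}: ~{weight_gib(bits):.0f} GiB for weights alone")
```

        Even at 4-bit quantization the weights alone land around 37 GiB, so once you add context cache and OS overhead, 64GB of system RAM is a sensible floor for CPU inference.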

        • silentdanni@beehaw.org · 11 months ago
          I have the same problem; my flat is only about 50 m². Judging by the way things are going, I think there’s a chance Nvidia will release some consumer-grade hardware meant for LLMs in the near-ish future. Until they reveal their next lineup, I’m just sticking to using the cloud for running LLMs, even though it may seem like a poor financial decision.

          I’m also hoping to get my hands on some Raspberry Pis. I would like to build a toy k3s cluster at some point and maybe run my own Mastodon instance. :)
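          (The nice thing about k3s for a Pi cluster is that bootstrapping is basically two commands. A minimal sketch, assuming the Pis can reach each other on the LAN; the hostname `pi-master.local` is made up, but the install script and the `K3S_URL`/`K3S_TOKEN` variables are k3s's documented join mechanism.)

```shell
# On the first Pi (the control-plane node):
curl -sfL https://get.k3s.io | sh -

# Grab the join token the server generates:
sudo cat /var/lib/rancher/k3s/server/node-token

# On each additional Pi, point at the server and pass that token:
curl -sfL https://get.k3s.io | K3S_URL=https://pi-master.local:6443 \
    K3S_TOKEN=<token-from-above> sh -

# Back on the server, confirm the workers joined:
sudo k3s kubectl get nodes
```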

          • MangoKangaroo@beehaw.org · 11 months ago
            Well, at least I’m not the only one whose homelab ambitions are being crushed by their apartment layout. I think I’m going to end up with a 2U compute rack, which means I’ll probably limp along on one or two low-profile consumer GPUs. Now if only I could work out the details of the actual rack server hardware…

            A Raspberry Pi cluster is interesting! My only real exposure to using Pis in a homelab was an old 1B I was using for Pi-hole. It was great right up until it stopped working.