• 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: July 2nd, 2023

  • Sure, but I’m just playing around with small quantized models on my laptop with integrated graphics, and the RAM was insanely cheap. It just interests me what LLMs that can run on such hardware are capable of. For example, Llama 3.2 3B only needs about 3.5 GB of RAM and runs at about 10 tokens per second; while it’s in no way comparable to the LLMs I use for my day-to-day tasks, it doesn’t seem that bad (there’s a minimal sketch below this comment for anyone curious). Llama 3.1 8B runs at about half that speed, which is a bit slow but still bearable. Anything bigger than that is too slow to be useful, but still interesting to try for comparison.

    I’ve got an old desktop collecting dust that has a pretty decent GPU with 24 GB of VRAM. It’s noisy and power-hungry (an older-generation dual-socket Intel Xeon setup) and still incapable of running large LLMs without additional GPUs. Even if it were capable, I wouldn’t want it turned on all the time because of the noise and heat in my home office, so I haven’t even tried running anything on it yet.
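
    For anyone who wants to try the same thing, here’s a minimal CPU-only sketch using llama-cpp-python; the GGUF filename is just a placeholder for whichever quantized build of Llama 3.2 3B you download, and n_ctx/n_threads are values you’d tune for your own machine.

    ```python
    # Minimal CPU-only sketch with llama-cpp-python (pip install llama-cpp-python).
    # The model path below is a placeholder, not a file that ships with anything.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./llama-3.2-3b-instruct-q4_k_m.gguf",  # placeholder filename
        n_ctx=2048,     # context window; larger values need more RAM
        n_threads=4,    # roughly match your CPU core count
    )

    out = llm("Explain in one sentence what a quantized model is.", max_tokens=64)
    print(out["choices"][0]["text"])
    ```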


  • The only time I can remember 16 GB not being sufficient for me is when I tried to run an LLM that required a tad more than 11 GB while I had just under 11 GB of memory available because of the other applications that were running (a quick way to check that headroom is sketched at the end of this comment).

    I guess my usage is relatively lightweight: a browser with at most about 100 open tabs, a terminal, a couple of other applications (some of them Electron-based) and sometimes a VM that I allocate maybe 4 GB to or something. And the occasional Age of Empires II DE, which even runs fine on my other laptop from 2016 with 16 GB of RAM in it. I still ordered 32 GB so I can play around with local LLMs a bit more.
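
    If it helps anyone, a rough sketch of that headroom check before loading a model (using psutil, which is an extra dependency and just my own quick check, not something any model runtime requires):

    ```python
    # Rough check of free memory before loading a model (pip install psutil).
    # The ~11 GB figure is only the example from the comment above.
    import psutil

    needed_gib = 11.0  # approximate RAM the model is expected to need
    avail_gib = psutil.virtual_memory().available / 2**30

    if avail_gib < needed_gib:
        print(f"Only {avail_gib:.1f} GiB free; close some applications first.")
    else:
        print(f"{avail_gib:.1f} GiB free; that should be enough.")
    ```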


  • WSL 1 is a compatibility layer that lets Linux programs run on the Windows kernel by translating Linux system calls into Windows system calls, so in that sense I understand the name: it’s a Windows subsystem for Linux [compatibility]. It doesn’t use the Linux kernel at all. With WSL 2 they’re using a real Linux kernel in a virtual machine, so the name doesn’t make much sense there anymore.
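
    A rough way to check which one you’re running in is to look at the kernel release string the environment reports; this is a heuristic sketch that relies on Microsoft’s current naming convention (WSL 1 reports a release containing “Microsoft”, WSL 2 one containing “microsoft-standard”), not on any official interface.

    ```python
    # Heuristic WSL detection based on the reported kernel release string.
    # This relies on current naming conventions, not a stable API.
    import platform

    release = platform.uname().release.lower()
    if "microsoft-standard" in release:
        print("WSL 2: real Linux kernel running in a lightweight VM")
    elif "microsoft" in release:
        print("WSL 1: Windows kernel translating Linux system calls")
    else:
        print("Not WSL (or the naming convention changed)")
    ```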



  • On average, none, though 2.5 Gbps is getting more and more common and Wi-Fi is catching up too. You could max out multiple slower devices at the same time without hitting the limit of your uplink. I don’t have a use case for that myself, so I’d only upgrade from my current 1 Gbps to something faster if the price were comparable. That doesn’t mean others don’t have a use case for it.


  • Agreed. In the past you would pay for calls and text messages, and data was often unlimited at the higher tiers; but since nobody pays extra for calling and texting anymore, they’re now charging for data instead. Luckily they can’t charge extra for EU roaming anymore.

    Data caps on landlines are something I haven’t seen for a very long time in my EU country. The last time I had a subscription with a data cap must have been with a 56k modem, if ever. Cable and DSL might have had fair-use policies back in the day (or maybe they still do, who knows), but no hard cap. Or at least not that I can remember.

    Internet nowadays is way too important to have data caps, especially at home. 5G should definitely be next. Differentiate in speed all you want, but ditch the caps.