Low Budget Workstation

Hi there,

I want to set up an LLM workstation to start developing my own agents and tools and to experiment. I travel a lot and don’t have a big budget to spend at the moment.

I saw the NVIDIA Jetson Orin Nano Super and it looks cool, but I’m not sure if it is the best option for my needs.
I use Linux and like to have freedom and not be tied to a specific ecosystem. There are very few reviews of this board, and none of them cover agentic development in depth.

I also read that an NVIDIA 3060 should be enough for my needs, but I would have to either use it as an eGPU, which performs poorly, or build a mini workstation. The latter is a very attractive option, and I wouldn’t mind spending a bit more money if it truly fits my needs.

So what do I need/want??

I want to be able to develop agents and integrate them via the CLI for sysadmin and cybersecurity purposes. I would like a decent level of inference performance so I can play and explore as much as possible, figure out exactly what I will need in the future, and develop tools that will scale once I have a beefier setup.

I’m also interested in coding agents, but I guess I would need the capacity to train a model to achieve what I have in mind, and I don’t know how realistic it is to expect to train a model on such a low budget. At the very least, I would like to run something that lets me get rid of Cursor.

I really want to get hands-on ASAP, but I’m afraid of making an investment I will end up regretting once I dive deeper into LLMs. That’s why I’m writing this post: hopefully I can get some feedback and guidance on the best way to start this project given my circumstances and needs.

For hardware consultations or fine-tuning, I think it’s best to ask questions on the HF Discord or Unsloth’s Discord.

NVIDIA Jetson Orin Nano Super and it looks cool, but I’m not sure if it is the best option for my needs.

It’s cool, but not well-suited to general LLM work. It’s geared more toward edge devices, so I think it’s better to go with a GPU this time.

an NVIDIA 3060 should be enough

Yeah. I’m using a 3060 Ti too. With 8 GB of VRAM you can manage some things; ideally you want 12 GB or 16 GB, since the more VRAM you have, the more you can do. For anything short of high-end use, VRAM size matters more than clock speed.
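For a rough sense of why VRAM is the limiting factor, here is a back-of-the-envelope sketch. The overhead factor is my own assumption (weights dominate memory, with roughly 25% extra for KV cache and runtime), so treat the numbers as estimates rather than benchmarks:

```python
# Back-of-the-envelope VRAM estimate for running a local LLM.
# Assumption: model weights dominate; add ~25% for KV cache and runtime overhead.

def estimate_vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.25) -> float:
    """Approximate VRAM (GB) needed to load the weights plus runtime overhead."""
    weight_gb = params_billions * bits_per_weight / 8  # 1B params at 8 bits is roughly 1 GB
    return weight_gb * overhead

# A 7B model at 4-bit quantization fits in 8 GB; at 16-bit it needs a 16 GB+ card.
for bits in (4, 8, 16):
    print(f"7B model @ {bits}-bit: ~{estimate_vram_gb(7, bits):.1f} GB")
```

That is why a 12 GB card like the 3060 lets you run 7B to 13B models quantized, while anything much bigger quickly pushes you toward cloud GPUs.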

how realistic it is to expect to train a model on such a low budget

I think this might be helpful.

BTW, setting aside security concerns, renting cloud GPUs for fine-tuning is straightforward. Google Colab, for instance.
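If you go the Colab route, a parameter-efficient method like QLoRA is the usual way to fine-tune on a single rented GPU. Here is a minimal sketch, assuming the Hugging Face transformers/peft/bitsandbytes stack; the model name is just a placeholder:

```python
# Minimal QLoRA-style setup for fine-tuning on a single cloud GPU (e.g. Colab).
# Assumes: pip install transformers peft bitsandbytes accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_name = "Qwen/Qwen2.5-1.5B-Instruct"  # placeholder: any small causal LM

# Load the base model in 4-bit so it fits in modest VRAM
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
model = AutoModelForCausalLM.from_pretrained(model_name, quantization_config=bnb, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Train small LoRA adapters instead of the full weight matrices
lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # usually well under 1% of total parameters
```

From there, Unsloth or the TRL SFTTrainer handles the actual training loop. The key point is that you only train the adapters, so a free or cheap GPU is enough for small models.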

About OSS Coding Assistant

Wow, all this is awesome! Thank you very much!! I also posted this on the Discord server!

This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.