I currently use a base MacBook Air M1 (8 GB RAM, 256 GB SSD). It is now a hard bottleneck for AI work due to RAM, GPU limits, and thermals.
Budget is ~2600 EUR.
My original plan was a refurbished MacBook Pro M4 Pro (48 GB, 1 TB) from Canada, but it is not available. The M4 Max is available, but at ~4000 CAD it feels unjustifiably expensive.
My professors advised that large training should be done in the cloud, but a local Mac should still have at least a base M4 with 16 GB RAM and a 512 GB SSD, preferably an M4 Pro with 32 GB RAM and a 1 TB SSD.
I plan to keep my next laptop 5–6 years. Longevity matters more than peak performance.
I am now considering a split setup:
Refurbished base MacBook Pro M4 (16 GB RAM, 512 GB SSD) for daily work and portability. Price is ~1100 EUR.
A desktop PC for heavier workloads, built via PCPartPicker, with ~1500 EUR budget. NVIDIA GPU for CUDA, AMD CPU to control cost.
PC parts would be bought in Germany. Mac would be bought refurbished in Canada via family.
Main questions:
Is the Mac + desktop split the smarter long-term choice for an AI student?
Is base M4 with 16 GB enough if heavy work is offloaded?
Hi @IPelo, could you share more about the AI work you will be doing? Will you mainly be running models, or also training them? How large are the model and the training data?
Also, does your university provide a GPU server for you to use? When I was doing AI research, my institution provided access to one, which was a lifesaver: even though the models I built were considered small, I easily maxed out the 24 GB of VRAM on my 3090 workstation during training.
In general, you should consider how big your model is and how big your training set is. While I don't work on AI anymore, I was excited about the Framework Desktop with AMD's Strix Halo chip back when I did. It takes the same unified-memory approach as Apple's Macs, without the Apple memory tax. The Strix Halo GPU might not be as strong as an NVIDIA card, but being able to fit your whole batch into VRAM speeds up training far more than raw compute does when the alternative is being bottlenecked on the memory bus, shuttling each batch from CPU RAM to GPU VRAM. You could technically lower the batch size, but that affects results (for better or worse; batch size is a training hyperparameter you need to tune).
TL;DR: the Framework Desktop with Strix Halo is a great option for AI devs without breaking the bank. The integrated GPU has access to lots of memory, which allows larger models and datasets. It is obviously not as fast as NVIDIA's xx90 cards, but most AI devs would rather have their training run slowly than not at all due to insufficient memory. Ideally, you should also look at your university's GPU server for even larger models, such as foundation models.
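To put some rough numbers behind the "maxed out 24 GB during training" point, here is a back-of-the-envelope sketch (my own rule of thumb, not from this thread; it assumes fp16 weights and gradients, Adam with fp32 master weights and moments, and ignores activations, which only make things worse):

```python
def training_vram_gb(params_billions: float,
                     weight_bytes: int = 2,   # fp16/bf16 weights
                     grad_bytes: int = 2,     # fp16 gradients
                     optim_bytes: int = 12) -> float:
    """Rule-of-thumb VRAM for full fine-tuning, ignoring activations.

    Mixed-precision Adam typically keeps an fp32 master copy of the
    weights plus two fp32 moment tensors, i.e. ~12 bytes per parameter.
    """
    per_param = weight_bytes + grad_bytes + optim_bytes
    return params_billions * 1e9 * per_param / 1024**3

# Even a "small" 1B-parameter model wants ~15 GB before activations,
# which is how a 24 GB RTX 3090 fills up so quickly during training.
print(f"{training_vram_gb(1.0):.1f} GB")  # → 14.9 GB
```

The exact bytes-per-parameter figures vary by optimizer and precision setup, but the shape of the estimate is why "does the batch fit in VRAM" dominates the buying decision.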
My AI work is still early-stage university level. Mostly learning workflows, prototyping, smaller models, some reinforcement learning, and working with datasets. I do train models, but nothing close to foundation scale.
My university does have GPU servers, but the waiting times are very long, so they’re not always practical for regular experimentation.
I agree that memory and VRAM are the main bottlenecks. That's why I'm leaning toward a desktop for training. I'm probably aiming for a 16 GB VRAM GPU, but in the more affordable ~400 EUR class (RTX 5060 Ti), not the ~900 EUR tier (RTX 5080).
Also, even on the PC side, costs add up fast. In Germany, 2×16 GB RAM can be around 300 EUR, which makes me want to spend smart and avoid overshooting specs I won’t fully use yet.
The Strix Halo / unified memory point is interesting too. I hadn’t seriously considered the Framework desktop before, but having a big shared memory pool for larger batches sounds like a real advantage.
Yes, I was also unaware before, just like you. I didn't know there were platforms like Kaggle and Google Colab for heavy AI compute workloads. I use a very old gaming laptop with an RTX 3060 6 GB, so I can run small workloads locally; for heavier projects I usually use Colab Pro. My friends either have a bulky gaming laptop like mine or a MacBook Pro (some even a MacBook Air), and they generally rely on Colab more.
My friend has an M3 Pro MacBook Pro, which he mostly uses to train small quantised models and for local inference; all his other work is done in the cloud. If you ask me, MacBooks are known for being long-lasting laptops: your M1 MacBook Air surviving five years is proof of how well a MacBook can serve a user. What I liked about my friend's Mac is that, thanks to its 36 GB of RAM, he can run big LLMs locally. My laptop maxes out at 3B models or heavily quantised 7B models; his MacBook can run 13B and even 32B models (for example, Qwen 2.5 32B quantised to Q4 ran at 12-15 tokens per second, not great but still usable, whereas my Windows laptop would simply crash).
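Those model-size ceilings line up with some quick arithmetic. As a hedged sketch (my own estimate, not from this thread; the overhead factor is an assumption loosely covering the KV cache and runtime buffers, and the real figure depends on context length and runtime):

```python
def inference_ram_gb(params_billions: float,
                     bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM needed to load a quantised model for inference.

    `overhead` (~20%) is a rough allowance for the KV cache and runtime
    buffers; actual usage varies with context length and the runtime used.
    """
    return params_billions * 1e9 * (bits_per_weight / 8) * overhead / 1024**3

# A 32B model at ~4.5 bits/weight (a typical Q4 scheme) needs roughly
# 20 GB: it fits in 36 GB of unified memory, but not in 6 GB of VRAM.
print(f"{inference_ram_gb(32, 4.5):.0f} GB")  # → 20 GB

# A 7B model at fp16 (~16 bits/weight) already overflows a 6 GB card,
# while the same model heavily quantised squeezes in.
print(f"{inference_ram_gb(7, 16):.1f} GB vs {inference_ram_gb(7, 4.5):.1f} GB")
```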
You can go for the M4 Max, but learning CUDA is also important. I usually practise CUDA in my university's computer labs, which have better NVIDIA RTX GPUs such as the 5080 (16 GB of VRAM) and smaller 5060 cards for lighter tasks. Do you have a pass for your university's computer lab? If you can get one and are allowed to use it at least once a week, then an M4 Pro MacBook Pro can be a great choice; even a 32 GB RAM model will work well for your use case. What MacBook options do you have?
Building a PC right now is a very bad choice: you will pay an unreasonable amount for RAM alone, and you know RAM is crucial for AI dev. Anyway, what kind of work do you expect your laptop to handle locally? Anything specific?
Yeah, I was in the same boat at first. I do use Colab for heavier stuff and it definitely helps, but I don’t always want to rely on it for everything, especially when I’m experimenting a lot.
Unfortunately, the uni GPU lab isn’t very practical for me. You have to reserve it and the waiting time is around two weeks, so it’s not something I can depend on regularly. That’s also why learning CUDA there hasn’t really happened yet.
For the Mac options, I'm realistically looking at a refurbished M4 Pro MacBook Pro with 48 GB RAM and ~512 GB storage. The 1 TB and 2 TB configs just aren't available right now, and the M4 Max feels too overpriced for what I'd actually use.
Running LLMs locally is definitely something I’d like to try. A few friends do it and it looks really useful for learning, but my current M1 Air just can’t handle it at all, so that’s part of why I’m upgrading.
I can manage using cloud when needed, but I don’t want to be fully blocked locally either. After thinking more about PC build costs, especially RAM, I’m also less convinced that building a desktop right now is the best move.
So at the moment I’m leaning more toward a higher-RAM M4 Pro as a balanced option rather than going all-in on a desktop or paying extra for an M4 Max.
Honestly, the M4 Pro MacBook Pro with 48 GB of memory will be great for you; the only limitation is the SSD, so you will have to buy an external drive over time.
The M4 Max could work out for you, but it's ridiculously priced (around $4,000, I'd guess). If you could find an M4 Pro with a 1 TB option, that would be the perfect laptop; if not, 512 GB is also fine. A quick question though: how much of a price difference is there between this refurbished M4 Pro and a new MacBook Pro from Apple's website or certified resellers?
That is really a lot; with that price difference you could buy yourself a base iPhone 17. Usually I would suggest going with 1 TB, but the refurbished 48 GB RAM / 1 TB SSD variant was out of stock last you checked, right? Then the only sensible option is the 512 GB SSD variant, though it's worth checking whether other resellers have the 1 TB version. My conclusion: buy the M4 Pro. If you can't find a 1 TB option, get the 512 GB version; if you need more space later, you can always buy an external SSD. My laptop has a 512 GB SSD, of which 150 GB is just games, and my disk is now almost full. 512 GB can work for you if you don't store everything on your laptop, like 5-6-year-old photos, videos, etc.