
[r/LocalLLaMA] Does going from 96GB -> 128GB VRAM open up any interesting model options?

Impact: 1/10

Summary

A user expanded their local AI setup from 96GB of VRAM (a single RTX Pro 6000) to 128GB by adding an RTX 5090 over a Thunderbolt 4 dock. They are asking for recommendations on coding-focused models or quants that fit in 128GB but were out of reach at 96GB, prioritizing model choice over inference speed for now.
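
Whether the extra 32GB matters comes down to simple arithmetic: weight memory is roughly parameter count times bits-per-weight divided by 8, before KV cache and runtime overhead. Below is a minimal back-of-envelope sketch of that calculation; the parameter counts and bit-widths are illustrative assumptions chosen for this example, not recommendations from the thread.

```python
# Rough VRAM estimate for a quantized model's weights.
# All sizes here are illustrative assumptions; real usage also depends
# on KV cache, context length, activation buffers, and runtime overhead.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for a model with `params_b`
    billion parameters stored at `bits_per_weight` bits each."""
    return params_b * bits_per_weight / 8

# Hypothetical size classes, not specific model recommendations.
size_classes = [(70, "70B dense"), (123, "~120B dense"), (235, "235B MoE")]

for params_b, label in size_classes:
    for bits in (4.0, 5.0, 8.0):
        gb = weights_gb(params_b, bits)
        in_96 = "fits 96GB" if gb < 96 else "needs >96GB"
        in_128 = "fits 128GB" if gb < 128 else "needs >128GB"
        print(f"{label:>12} @ {bits:.0f}-bit: ~{gb:6.1f} GB  ({in_96}, {in_128})")
```

At these rough numbers, the new headroom mainly opens up ~120B-class models at 8-bit and ~235B-class models at 4-bit quants, neither of which clears 96GB once any KV cache is accounted for.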

Continue Reading

Explore related coverage about community news and adjacent AI developments:

- [r/ML] [D] MYTHOS-INVERSION STRUCTURAL AUDIT
- [r/LocalLLaMA] karpathy / autoresearch
- [r/ML] [R] Agentic AI and Occupational Displacement: A Multi-Regional Task Exposure Analysis (236 occupations, 5 US metros)
- [r/ML] Building behavioural response models of public figures using Brain scan data (Predict their next move using psychological modelling) [P]

