youreusingitwrong@programming.dev (OP) · 1 month ago
Just a 1080, though it handles 7B models just fine; it could probably also work with a 14B.
youreusingitwrong@programming.dev (OP) · 1 month ago
Zero; as said, I'd prefer to self-host.
youreusingitwrong@programming.dev to Free Open-Source Artificial Intelligence@lemmy.world, English · 1 month ago
Easy to setup locally hosted LLM with access to file system
7 comments
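A back-of-the-envelope sketch of why a 1080 handles 7B comfortably while 14B is borderline (assumptions: the GTX 1080 has 8 GB of VRAM, a 4-bit quant stores roughly 0.5 bytes per parameter, and runtime/KV-cache overhead is around 1.5 GB; exact numbers vary by quant and context length):

```python
# Rough VRAM estimate for a 4-bit-quantized model.
# ASSUMPTIONS (not from the thread): ~0.5 bytes/param at Q4,
# plus ~1.5 GB for KV cache and runtime overhead.
def vram_needed_gb(params_billion, bytes_per_param=0.5, overhead_gb=1.5):
    return params_billion * bytes_per_param + overhead_gb

GTX_1080_VRAM_GB = 8  # the card mentioned in the comment

for size in (7, 14):
    need = vram_needed_gb(size)
    verdict = "fits" if need <= GTX_1080_VRAM_GB else "tight (needs partial CPU offload)"
    print(f"{size}B @ Q4: ~{need:.1f} GB -> {verdict} on {GTX_1080_VRAM_GB} GB")
```

Under these assumptions a 7B Q4 model needs about 5 GB and fits with room to spare, while a 14B lands around 8.5 GB, which is why it "could probably" work only with some layers offloaded to system RAM.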