This post was submitted on 09 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


I have a cluster of 4 A100 GPUs (4x80GB) and want to run meta-llama/Llama-2-70b-hf. I'm a beginner and need some guidance.

- Need a script to run the model.

- Is 4x A100 enough to run the model, or is it more than required?

Need the model for inference only.
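
For reference, a minimal inference sketch using the Hugging Face `transformers` library. This assumes `transformers`, `accelerate`, and `torch` are installed, and that access to the gated `meta-llama/Llama-2-70b-hf` repo has been granted on the Hub. At fp16 the 70B weights take roughly 140 GB, so they fit comfortably across 4x 80 GB A100s when sharded with `device_map="auto"`; the prompt and generation settings below are placeholders.

```python
# Minimal sketch: run meta-llama/Llama-2-70b-hf for inference on 4x A100 80GB.
# Assumes transformers, accelerate, and torch are installed and that the
# gated model has been approved for your Hugging Face account.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-70b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" lets accelerate shard the ~140 GB of fp16 weights
# across all visible GPUs; no manual layer placement is needed.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Example prompt (placeholder).
prompt = "Explain the difference between a GPU and a CPU in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

For higher throughput than plain `transformers` generation, a dedicated inference server such as vLLM or text-generation-inference with tensor parallelism across the four GPUs is a common next step.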

HenkPoley@alien.top · 1 year ago