this post was submitted on 26 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


It seems to me that the next big boom for cloud computing will be offering to train and host models that understand the unique business domains of the customers they serve.

Are the smart corporations already training local LLMs to understand and answer questions about their business, or is this space too new to accommodate them?

I feel like some of you may be missing a huge business opportunity. You may not realize the value of what you have already researched.

[–] Belnak@alien.top 1 points 11 months ago

My perspective as a Fortune 500 IT solutions architect: why would I spend a few million dollars and a year of project time building out local infrastructure that'll already be outdated by the time it's installed, when I can just give my developers and data team permissions on Azure so they can immediately access the same or better resources for a fraction of the cost? Scale is value, and cloud service providers will always have far greater scale.

[–] Rutabaga-Agitated@alien.top 1 points 11 months ago

Yeah, that might fit in the US, but not in Europe. Dependencies can lead to problems, especially if there's ever a conflict. I wouldn't want to run important infrastructure that depends solely on US services.

[–] Belnak@alien.top 1 points 11 months ago

All major cloud providers have data centers in Europe.

[–] Rutabaga-Agitated@alien.top 1 points 11 months ago

You are right. But if you have a Chinese customer, for example, different problems can come up, like with NVIDIA and GPU export restrictions. Independence is key for a lot of players.
