this post was submitted on 13 Feb 2024
501 points (97.2% liked)

[–] Neato@ttrpg.network 1 points 2 years ago (4 children)

Why are they that big? Is it more than code? How could you get to gigabytes of code?

[–] General_Effort@lemmy.world 51 points 2 years ago

Currently, AI means artificial neural networks (ANNs). That's only one specific approach. What an ANN boils down to is one huge system of equations.

The file stores the parameters of these equations, arranged into what math calls matrices. A parameter is simply a number by which something is multiplied. Colloquially, such a file of parameters is called an AI model.
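To make that concrete, here is a minimal Python/NumPy sketch of one layer of such a network (all the sizes and values are made up for illustration):

```python
import numpy as np

# One "layer" of a neural network is just y = W @ x + b:
# every output is a weighted sum of the inputs, plus an offset.
W = np.array([[0.2, -1.5],   # weights: parameters the inputs are multiplied by
              [0.7,  0.3]])
b = np.array([0.1, -0.4])    # biases: parameters that get added on

x = np.array([1.0, 2.0])     # some input
y = W @ x + b                # the "system of equations" being computed
print(y)                     # [-2.7  0.9]
```

A real model stacks many such layers, so almost the entire file is these W and b numbers.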

2 GB is probably an AI model with 1 billion parameters at 16-bit precision. Precision is how many digits you have; the more digits, the more precisely you can specify a value. At 16 bits (2 bytes) per parameter, 1 billion parameters come to about 2 GB.
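The back-of-the-envelope arithmetic, as a sketch (the 1 billion parameter count is the assumption from above):

```python
# File size ≈ number of parameters × bytes per parameter.
params = 1_000_000_000        # assumed: 1 billion parameters
bits_per_param = 16           # 16-bit precision -> 2 bytes each
size_bytes = params * bits_per_param // 8
print(size_bytes / 10**9)     # 2.0 -> about 2 GB
```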

When people talk about training an AI, they mean finding the right parameters, so that the equations compute the right thing. The bigger the model, the smarter it can be.
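As a toy illustration of what "finding the right parameters" means (gradient descent on a single made-up parameter; real training does the same thing with billions of them):

```python
# Toy "training": find the parameter w so that y = w * x fits the data.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]   # made-up points on y = 3x

w = 0.0                       # start with a wrong parameter
for _ in range(100):
    # nudge w downhill on the average squared error (gradient descent)
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.05 * grad
print(round(w, 3))            # ~3.0 -- the "right" parameter has been found
```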

Does that answer the question? It's probably missing a lot.

[–] Aatube@kbin.social 10 points 2 years ago* (last edited 2 years ago) (1 children)

It's basically a huge graph/flowchart.

[–] acockworkorange@mander.xyz -4 points 2 years ago (1 children)

It's really nothing of the sort.

[–] Aatube@kbin.social 12 points 2 years ago (1 children)
  1. Specifying weights, biases and shape definitely makes a graph.
  2. IMO having a lot of more-preferred and more-deprecated routes is quite close to a flowchart, except there are a lot more routes. The principles of how these work are quite similar.
[–] General_Effort@lemmy.world -1 points 2 years ago (1 children)
  1. There are graph neural networks (meaning NNs that work on graphs), but I don't think that's what is used here.

  2. I do not understand what you mean by "routes". I suspect that you have misunderstood something fundamental.

[–] Aatube@kbin.social 3 points 2 years ago (1 children)
  1. I'm not talking about that. What are weights, biases and shape if not a graph?
  2. By routes, I mean that the paths through the graph don't necessarily converge and that it is often more tree-like.
[–] General_Effort@lemmy.world 3 points 2 years ago* (last edited 2 years ago) (1 children)

You can see a neural net as a graph in that the neurons are connected nodes. I don't believe that graph theory is very helpful, though. The weights are parameters in a system of linear equations; the numbers in a matrix/tensor. That's not how the term is used in graph theory, AFAIK.

ETA: I can only make sense of what you say about "routes" (= paths?) if I assume that you misunderstood something. Otherwise, I simply don't know what it's referring to.

[–] Natanael@slrpnk.net 2 points 2 years ago (1 children)

If you look at the nodes that are most likely to trigger for given inputs, then you can draw paths.

[–] General_Effort@lemmy.world 2 points 2 years ago

I still don't know what this is supposed to mean for neural nets. I think it reflects a misunderstanding.

[–] Amir@lemmy.ml 7 points 2 years ago

They're composed of many big matrices, whose size scales quadratically with their dimensions: a 32x32 matrix is 4x the size of a 16x16 matrix.
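A quick way to check that scaling, as a sketch (NumPy and 16-bit floats are just illustrative choices):

```python
import numpy as np

# Doubling a matrix's side length quadruples its memory footprint.
small = np.zeros((16, 16), dtype=np.float16)
big = np.zeros((32, 32), dtype=np.float16)
print(small.nbytes, big.nbytes)    # 512 2048
print(big.nbytes / small.nbytes)   # 4.0
```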

[–] 9point6@lemmy.world 6 points 2 years ago* (last edited 2 years ago) (1 children)

The current wave of AI is built around Large Language Models, or LLMs. These are basically a metric fuckton of calculation results generated by running a load of input data through the system in different ways. Given that these are often things like text, pictures or audio that have been distilled down into numbers, you can imagine we're talking about a lot of data.

(This is massively simplified, by someone who doesn't entirely understand it themselves)