this post was submitted on 14 Dec 2025
19 points (100.0% liked)

Technology


If GPT-style (decoder-only transformer) models are text predictors, why don't keyboard apps on PCs and phones offer a GPT as a text-prediction option? They could be more accurate than the widely used n-gram models.
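For context on the baseline the question mentions: an n-gram predictor is just a frequency table over short word sequences. Here is a minimal sketch of a bigram (2-gram) next-word predictor; the corpus string and function names are illustrative, not taken from any real keyboard app.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Build a next-word frequency table from whitespace-tokenized text."""
    table = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def predict(table, word, k=3):
    """Return the k most frequent words seen after `word`."""
    return [w for w, _ in table[word].most_common(k)]

# Toy example: suggest completions after typing "the"
table = train_bigram("the cat sat on the mat and the cat ran")
print(predict(table, "the"))  # ['cat', 'mat']
```

This is why n-gram models are cheap enough for keyboards: prediction is a table lookup, with no per-keystroke neural network inference.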

top 6 comments
[–] limerod@reddthat.com 6 points 1 day ago

Have you heard of FUTO Keyboard? It embeds a small LLM for exactly this task. It also has an option to retrain on your own typing data.

[–] TachyonTele@piefed.social 18 points 2 days ago

Probably power. Running transformer text predictors takes a lot of compute, and adding one to every single phone would probably mean a huge uptick in resource usage.

I'm just guessing though.

[–] iloveDigit@piefed.social 9 points 2 days ago

You're absolutely correct. Battery and RAM usage are too hard to manage for now; phone makers will probably wait until the tech matures before experimenting with this publicly.
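The RAM concern above can be put in rough numbers. A back-of-envelope sketch, with assumed (not measured) figures: weights alone for a language model take parameter count times bytes per parameter, so even a "small" 100M-parameter model at fp16 needs on the order of 200 MB of RAM, before activations or KV cache.

```python
def model_memory_mb(params, bytes_per_param=2):
    """Approximate RAM just to hold the weights (fp16 = 2 bytes/param).
    Ignores activations, KV cache, and runtime overhead."""
    return params * bytes_per_param / 1e6

# Hypothetical on-device model sizes, for scale:
print(model_memory_mb(100_000_000))    # 200.0  (MB for a 100M-param model)
print(model_memory_mb(1_000_000_000))  # 2000.0 (MB for a 1B-param model)
```

By comparison, an n-gram table for a keyboard typically fits in a few megabytes, which is part of why it remains the default.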

[–] Ulrich@feddit.org 14 points 2 days ago

They are, and have been since before GPT existed.

[–] iloveDigit@piefed.social 9 points 2 days ago

Yeah, that's also true. Autocorrect on Android was already like a micro GPT.

[–] adespoton@lemmy.ca 12 points 2 days ago

They’re not just text predictors; they’re text transformers. So, more for translating a text from French to English, or giving you a summary of what a large input text is trying to say.

They can do prediction, but not well, and not without a lot of extra computation, because they’re really trying to summarize everything you’ve said, including the bit you haven’t written yet.
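The "extra computation" point has a concrete shape: self-attention in a decoder-only transformer re-attends over the whole typed context at every step, so its cost grows quadratically with context length. A rough FLOP estimate (formula simplified; constants and projection layers omitted):

```python
def attention_flops(n_tokens, d_model):
    """Rough FLOPs for one self-attention pass: computing the n x n score
    matrix (~n^2 * d) plus the weighted value sum (~n^2 * d)."""
    return 2 * n_tokens * n_tokens * d_model

# Doubling the amount of typed text roughly quadruples attention cost:
print(attention_flops(200, 512) / attention_flops(100, 512))  # 4.0
```

An n-gram lookup, by contrast, costs the same per keystroke no matter how long the message gets, which is a big practical advantage on battery-powered devices.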