As someone at a company that's still using free AI credits in its commercial products and hasn't figured out how it's going to price the shit once the credits run out... this AI market looks a lot like the Uber subsidies.
Like a year or two from now, probably any AI stuff that isn't self-hosted is going to be 100% inaccessible to normal people due to cost. It's just a question of how hard they're going to fight to keep the currently free-to-download LLM models off the internet once this happens.
We're seeing this all over the tech and tech adjacent space. Can't grow forever at a loss, especially not with increased interest rates and a potential economic downturn.
My guess: if you want decent services, we're going to end up needing to pick a few (or a suite of the basics) to pay for on a monthly basis and cut out all the "free" stuff that is getting/will get enshittified.
In my eyes they put themselves in an awkward position by garnering a reputation for always collecting more user data than justified, and at this point I assume they do the same with paid products, as it's an industry norm. However, I'm not OK with that and will never pay when the product doesn't respect privacy. The saying used to be "if you don't pay, you're the product," but it's increasingly shifting to: you're the product, and you also have to pay so that our shareholders can experience more infinite growth.
I don't understand this. Hasn't Intel or Nvidia (or someone else) been making claims about their next CPUs having AI functionality built-in?
Nah, once ML inference and training chips are purpose-built, it'll be built into devices. AI models are the mainframes of today.
They're already trying this, sort of.
They know charging for total access will cause a riot, so instead they're enshittifying the whole experience and holding access to the current non-shit experience hostage behind monthly fees.
At this point, with so many tech giants introducing ads to their services and increasing subscription prices, I think we can expect some kind of subscription fee to access assistants with AI/LLM capability. It would make sense to offer a 'basic' version of these services for free since people have already invested in the hardware, but I wouldn't be surprised if these companies suddenly blocked the smart functionality unless you pay.
This is the best summary I could come up with:
The emerging generation of "superhuman" AI models is so expensive to run that Amazon might charge you to use its Alexa assistant one day.
In an interview with Bloomberg, outgoing Amazon executive Dave Limp said that he "absolutely" believes that Amazon could start charging a subscription fee for Alexa, and pointed to the cost of training and running generative artificial intelligence models for the smart speaker's new AI features as the reason why.
Limp said that the company had not discussed what price it would charge for the subscription, adding that "the Alexa you know and love today is going to remain free" but that a future subscription-based version is "not years away."
Generative AI models require huge amounts of computing power, with analysts estimating that OpenAI's ChatGPT costs $700,000 a day or more to run.
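For scale, that daily figure can be turned into a rough per-query cost. Only the $700,000/day estimate comes from the article; the query volume and per-user usage below are purely hypothetical assumptions, so treat this as a back-of-envelope sketch, not reported data:

```python
# Back-of-envelope cost sketch. Only the $700,000/day figure is from the
# article; the query volume and usage rate are hypothetical assumptions.
DAILY_COST_USD = 700_000               # analyst estimate cited in the article
ASSUMED_QUERIES_PER_DAY = 10_000_000   # hypothetical volume, not from the article

cost_per_query = DAILY_COST_USD / ASSUMED_QUERIES_PER_DAY
print(f"~${cost_per_query:.3f} per query")  # ~$0.070 per query

# A hypothetical user making 20 queries/day, 30 days/month, would cost roughly:
monthly_cost_per_user = cost_per_query * 20 * 30
print(f"~${monthly_cost_per_user:.2f} per user per month")  # ~$42.00
```

Even with generous assumptions about volume, numbers like these hint at why ad-free, subscription-free access is hard to sustain.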
Limp, Amazon's senior VP of devices and services, announced, a month before the launch of the new products, that he would step down from his role at the company after 13 years.
Insider's Ashley Stewart reported that former Microsoft exec Panos Panay is expected to replace Limp.
The original article contains 298 words, the summary contains 182 words. Saved 39%. I'm a bot and I'm open source!