this post was submitted on 14 Apr 2026
476 points (98.8% liked)

Technology


Buried in the story was a deceptively simple question: does your AI agent count as an employee?

At a recent conference, Microsoft executive Rajesh Jha floated a provocative idea. In a future where companies deploy fleets of AI agents, those agents may need their own identities — logins, inboxes, and even seats inside software systems. If so, AI wouldn't shrink software revenue. It could expand it.

top 50 comments
[–] pinball_wizard@lemmy.zip 2 points 2 days ago

Absolutely everything in modern technology is rent seeking from people who really only need a free copy of OpenOffice Calc.

[–] HertzDentalBar@lemmy.blahaj.zone 9 points 3 days ago* (last edited 3 days ago)

If they require licenses, then they should be taxed like employees, and since those employees earn no wages, the tax should be 100%.

[–] the_riviera_kid@lemmy.world 8 points 3 days ago

So they are going to sell themselves a license?

[–] lowspeedchase@lemmy.dbzer0.com 139 points 5 days ago

Reads: Our flagship operating system and services have gotten to the point of such terrible shite for humans that we need to pivot to a less discerning customer base.

[–] deliriousdreams@fedia.io 97 points 5 days ago (9 children)

If the AI Agent counts as an employee then the company "employing" it is liable for what it does.

My guess is the argument will be that "it's a tool", not an employee, and therefore they take no responsibility. Though I'm sure that argument is not going to fly for very long. If your air hammer harms someone because the person operating it wasn't using it correctly, you're still liable.

[–] village604@adultswim.fan 7 points 5 days ago* (last edited 5 days ago) (1 children)

What? Companies aren't liable if the user doesn't follow the instructions or warnings and hurts themselves.

DeWalt isn't liable because I was using their mini chainsaw while holding a branch with my bare hand and the saw bounced and cut me. I'm liable for being stupid.

[–] deliriousdreams@fedia.io 8 points 5 days ago

I don't think you understand the context of the situation I was proposing. I am not supposing that DeWalt would be liable. But let's say we work in a shop together and I'm using an air hammer to I dunno. Punch rivets. If I as an employee of that shop use the air hammer and something involving the air hammer happens to my coworker or a customer or whatever, it is extremely likely that the company I work for would be on the hook. Could they try to penalize me personally? Yes. Could the person who was injured sue me personally? Certainly. Would the company be off the hook if the air hammer malfunctioned causing injury? Maybe - And at that point I would expect the manufacturer to be liable. But my comment never mentioned the manufacturer.

The context was companies using AI as a tool not companies manufacturing AI.

[–] bookmeat@fedinsfw.app 71 points 5 days ago (1 children)

Jesus, you don't announce that kind of thing until you have your customers locked in! Amateur.

[–] FauxLiving@lemmy.world 15 points 5 days ago* (last edited 5 days ago)

The customers are already locked in: every company hoping to run the same rent-seeking play around AI is buying up all of the compute and storage hardware on the planet, which prices consumers out of everything except the soon-to-be-overpriced subscription services they offer.

[–] utopiah@lemmy.world 36 points 4 days ago (1 children)

That's the beauty of totally arbitrary restrictions, you can change them as you want.

Pay by seat? Pay by client? Pay by byte of data stored? Pay by backup location?

... pay by moonphase? Pay by AI personality? Pay by virtual AI seat?

Such BS but why wouldn't Microslop extend its business model. It worked well so far. It's not about software, or datacenter, or AI, it's just about entrenchment.

[–] tehfishman@lemmy.world 3 points 4 days ago (1 children)

It's also a billing strategy that only works in a monopoly situation. If there were healthy competition and no vendor lock-in for the office suite of tools, Microsoft wouldn't even be able to float this as an idea.

[–] utopiah@lemmy.world 2 points 3 days ago

The one thing Microslop excels at is precisely lock-in.

[–] edgemaster72@lemmy.world 51 points 5 days ago (1 children)

MicroSlop: We have this AI for you to use so you can reduce workforce and associated costs

Also Sloppy: j/k, fuck you pay me

[–] pdxfed@lemmy.world 6 points 5 days ago

Omniscient, omnipotent Business Leaders: "what? There is a catch?!?"

[–] StitchInTime@piefed.social 10 points 4 days ago

Microsoft can do whatever they want. So can I, and I have no want or need of their products.

[–] brucethemoose@lemmy.world 18 points 4 days ago* (last edited 4 days ago)

On a technical level, that makes zero sense.

AI “agents” are basically just fancy prompts with a tool calling harness. They are infinitely replicable, at zero cost, with no intrinsic value; the cost comes from the generic CPU host, and the API calls to GPU servers, databases, or whatever else that are all centralized anyway.
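To make the point concrete, here's a minimal sketch of what a tool-calling "agent" amounts to: a prompt template, a table of tools, and a dispatch loop. Everything here is hypothetical for illustration; `stub_model` stands in for the remote LLM API call, which is where the real (centralized) cost lives.

```python
import json

# The "agent" is just configuration: a system prompt and a table of tools.
TOOLS = {
    "add": lambda a, b: a + b,
}

SYSTEM_PROMPT = 'You may call tools by emitting JSON: {"tool": name, "args": [...]}'


def stub_model(prompt: str) -> str:
    """Stand-in for an LLM API call; real agents pay per token here."""
    if "[tool result" in prompt:
        return "The answer is 5."
    if "What is 2 + 3?" in prompt:
        return json.dumps({"tool": "add", "args": [2, 3]})
    return "I don't know."


def run_agent(question: str, max_steps: int = 5) -> str:
    """The entire 'agent': loop until the model stops asking for tools."""
    transcript = SYSTEM_PROMPT + "\n" + question
    for _ in range(max_steps):
        reply = stub_model(transcript)
        try:
            call = json.loads(reply)  # model asked to use a tool
        except json.JSONDecodeError:
            return reply              # plain text = final answer
        result = TOOLS[call["tool"]](*call["args"])
        transcript += f"\n[tool result: {result}]"
    return transcript


print(run_agent("What is 2 + 3?"))
```

Note that nothing above has any intrinsic per-copy value: the loop is trivially replicable, and all the expense sits behind the `stub_model` boundary.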


Wanna hear a dirty secret?

“AI” cost is going to zero.

Model capabilities aren’t scaling, but inference efficiency is exploding, thanks to resource-constrained labs and a steady stream of published breakthroughs. The endgame of the current bubble is mediocre but useful tools anyone can host themselves, dirt cheap. Maybe a bit more reliable and refined than what we have now, but about as “intelligent.”

And guess what?

Microsoft can’t profit off that. None of the Tech Bros can.

Point being, this exec is either delusional or jawboning so the world doesn't realize that "AI" is a dumb utility/aid and they can't make any profit off it.

[–] LordMayor@piefed.social 44 points 5 days ago (1 children)
  1. Integrate AI into the OS
  2. Demand purchase of a Windows license for the AI in the OS
  3. GOTO 2

It’s an infinite amount of money from every customer!

[–] pinball_wizard@lemmy.zip 7 points 5 days ago* (last edited 5 days ago)

It’s an infinite amount of money from every customer!

But it's okay, because there's infinite money to be saved by laying off technical expert staff.

[–] NewNewAugustEast@lemmy.zip 25 points 5 days ago (1 children)

I have always hated the term "seats". Get bent microsoft.

[–] SpatchyIsOnline@lemmy.world 29 points 5 days ago

So the "amazing tool of the future" that's "going to make software developers obsolete" is also going to need to buy software licenses?

Which one is it Microslop?

[–] WanderingThoughts@europe.pub 35 points 5 days ago (2 children)

The agent immediately runs a cost-benefit analysis, moves everything to open source solutions, and contracts a coding AI agent to write a simple conversion interface.

[–] adespoton@lemmy.ca 9 points 5 days ago

Or… the agent hallucinates that it has a valid license.

[–] pinball_wizard@lemmy.zip 7 points 5 days ago* (last edited 5 days ago)

Yes! This is legitimately one of the ways the bubble may burst. Particularly if the AI gets substantially smarter, and just starts recommending full switches to existing libraries and software suites - at a cost of exactly one token, instead of churning out thousands of lines of slop code that require ongoing tokens to maintain.

[–] CatAssTrophy@safest.space 16 points 4 days ago (1 children)

This gets close to an idea I heard long ago that I think has some merit.

Hire an employee? You must not only pay them, but cover taxes to have them there. Buy a robot to replace them? It's a business expense, no taxes!

Okay, pay taxes for your robot usage. Use that money to fund UBI, social programs and/or retraining people for other jobs.

[–] muusemuuse@sh.itjust.works 9 points 4 days ago (4 children)

Then they’ll just make one robot do multiple things. Suddenly the big company only has one taxable employee.

[–] CatAssTrophy@safest.space 6 points 4 days ago* (last edited 4 days ago)

Depends. If the tax is based on jobs replaced, not on the abstract number of robots that exist, it would have an impact. Also, monolithic solutions tend to be inherently less efficient than similarly developed specialized ones, so consolidating robot models for a tax benefit would impose its own limit on their efficiency.

It's an issue that could be accounted for, if there were sufficient political will. If taxes from automation were committed to public good, there would likely be pretty widespread acceptance.

[–] db2@lemmy.world 27 points 5 days ago (1 children)

A house of cards built on top of ten other houses of cards. What could possibly go wrong.

[–] greyscale@lemmy.grey.ooo 10 points 5 days ago

A house of cards which, in turn, is itself a house of cards.

Governments using Azure scares the shit out of me, having read that.

[–] pdxfed@lemmy.world 26 points 5 days ago* (last edited 5 days ago)

The natural extension of a non-open internet ala Reddit and charging developers for API pulls.

[–] Justdoingmybest@lemmy.ca 20 points 5 days ago

I am going to advise my Copilot that it cannot afford to keep using Microsoft Office and that it has to switch to LibreOffice for reasons of affordability.

[–] DarkSurferZA@lemmy.world 11 points 4 days ago

Mmm, interesting. Would the AI companies then need to buy a license for all the information they stole to train their AI? Or would they need to buy a license every time someone uses Micro-slop AI to ask it a question about something that has been trademarked?

Or does licensing only apply to their software?

[–] SaharaMaleikuhm@feddit.org 10 points 5 days ago

Sounds good. I was not interested anyways

[–] greyscale@lemmy.grey.ooo 11 points 5 days ago (1 children)

Do AI sit in "seats" 🤭 and is it per-agent or per-agent-instance? Or per-agent-instance-second?

"All of those embodied agents are seat opportunities," Jha said, envisioning organizations with more agents than humans — each effectively a user that must pay for a software license, or "seat" in industry lingo.

He's been watching Pantheon, I think.

Can the AI take the in-office seats so I can go back to being productive at home instead of listening to my coworker loudly talk to a garage door salesman on the phone?

[–] WesternInfidels@feddit.online 9 points 5 days ago

This is going to wind up granting AI agents a piecemeal, half-assed, legal-fiction version of "personhood," like corporations have. The AIs will wind up with freedoms like: They can spend all the money they want, that's "free speech."

And the fleshy unfortunates among us still won't have a right to a living wage, to medical care, etc.

Lmao ok sure buddy

[–] Solaris1220@lemmy.world 9 points 5 days ago

I don’t know why, but this headline made me laugh so hard

[–] catdog@lemmy.ml 8 points 5 days ago (1 children)

So if I use Windows pre-installed Copilot, I need to buy two Office Licenses, a Copilot subscription, and a Windows license?

[–] forrgott@lemmy.zip 6 points 5 days ago

Yes. But actually, no.

This is about enterprise licensing, not retail (home users). But otherwise, yeah, that's the basic idea they're proposing.

When it comes to a consumer using Copilot, it's all about having a new way to manipulate you into voluntarily handing them more of your personal data (which they will sell on the scummy as hell market created just for enabling surveillance capitalism).

[–] ch00f@lemmy.world 7 points 5 days ago

I don't understand, why wouldn't the AI simply write its own version of whatever software it needs to license?

[–] homesweethomeMrL@lemmy.world 6 points 5 days ago

HAHAHAHAHAHA
