Individual data points like "I take pilates", "I work nights and weekends", and "I live in Smalltown, ST" might not mean anything on their own, but if you can connect this data to a single person, then realize there's only one pilates studio in Smalltown, then look up their hours and notice there's only one day class on weekdays, you can make a reasonable guess as to a regular time when a person is away from home. This is called data brokerage.
This is a comically contrived example; the real danger is in the association of countless data points spread across millions of correlated identities. It's not just your data, it's the association of your data with that of your friends and family. Most people are constantly streaming their location, purchases, beliefs, and affiliations out to anyone who cares enough to look. Bad actors can collate this data and use it to take advantage of them, and the victims' only recourse is to ask for prohibitive legislation. As if we don't already have prohibitive legislation.
Anonymity is expensive, inconvenient, and fragile, but it's the only mechanism that protects individuals from the information economy, which I would put right next to ecology in terms of critical 21st-22nd century social problems. It also helps us resist censorship, but that's a different essay.
The issue isn't the composition of the object but rather property and contract. The prosthetic limb comparison isn't bad in my opinion, except this would be an experimental prosthetic limb that patients agreed to test with full knowledge and consent that it could be removed without their permission.
Again, I would hate to be in that position, but if I had agreed to it, I would understand that my legal options were limited. Again, this isn't a company ruining someone's life over a little money; this is a corporation unable to continue operating. Again, please consider the fact that a corporation which could treat epilepsy went bankrupt because it couldn't afford to do business in the regulatory environment of the health industry. I don't understand how adding more subjective laws with hand-waved economic foundations is supposed to help this situation.
I can't imagine what it's like to live with epilepsy, nor to have a debilitating disease reenter your life after you'd become accustomed to its management. In her position, I imagine I would be doing everything I could to regain access to life-changing technology. Sympathy for Rita Leggett doesn't make this story "dystopian," nor is it a violation of anyone's rights.
It was a trial! All participants agreed to have the device removed. If they didn't, they'd be walking around with unsupported hardware in their brains, because the system that hardware was connected to was dissolved. Representing this legal outcome as a human rights violation is a predictable dilution of human rights.
Ienca likens it to the forced removal of organs, which is forbidden in international law.
There's a vital difference between the removal of a body part and the removal of a tool you agreed to host, on condition of its release, before changing your mind. NeuroVista used novel technology to make meaningful progress in the treatment of epilepsy! Our response to this should be to encourage others like them, not to build bureaucratic restrictions hindering new innovators.
Companies should have insurance that covers the maintenance of devices should volunteers need to keep them beyond the end of a clinical trial, for example.
Who would insure this requirement?! Indefinite support of novel technology? Be serious. This article absolutely breezes over NeuroVista's bankruptcy as if it were a little inclement weather. The fact is that biotech research is nearly illegal by default. Try to restrain your distaste for industrialization long enough to imagine starting and running this company: