“What am I without my legs?” “What am I without my eyes?” “What am I without my arms?”
What counts as “the real me” has been evolving for decades, if not centuries. I’m not volunteering for brain implants, but I’m not writing the idea off at some point in the future, either. As for AI, this is going to be more of the ML variety, not the LLM variety. Think more of “neurochemical levels have been trending in a certain direction for too long, release opposing neurochemicals to halt the spiral” and less of a little voice inside your head giving quite possibly incorrect answers to whatever you’re thinking about.
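To make the distinction concrete, here’s a minimal sketch of what I mean by the ML variety: a dumb closed-loop trend detector, not anything resembling a language model. Everything in it (the `read`/`stimulate` hooks, the window size, the thresholds) is a hypothetical illustration, not any real device’s API:

```python
import statistics
from collections import deque

# Hypothetical parameters, purely for illustration.
WINDOW = 288         # e.g. one reading every 5 minutes for 24 hours
SLOPE_LIMIT = -0.05  # "trending downward for too long" threshold (arbitrary units)
COOLDOWN = 24        # readings to wait after an intervention before re-triggering

def slope(samples):
    """Least-squares slope of equally spaced samples."""
    n = len(samples)
    xs = range(n)
    mx = statistics.fmean(xs)
    my = statistics.fmean(samples)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, samples))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

history = deque(maxlen=WINDOW)
cooldown = 0

def on_reading(level, stimulate):
    """Feed in one sensor reading; fire the (hypothetical) stimulator
    only on a sustained trend, never on a momentary dip."""
    global cooldown
    history.append(level)
    if cooldown > 0:
        cooldown -= 1
        return
    if len(history) == WINDOW and slope(history) < SLOPE_LIMIT:
        stimulate()          # counteracting release
        cooldown = COOLDOWN  # let things settle before considering another
```

The point is how boring this is: a sliding window, a regression slope, a threshold, a cooldown. No model of what you’re thinking, just a thermostat for neurochemistry.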
This is absolutely risky stuff, but less risky than recurring electroshock therapy? Hard for me to say. Note that the article is from nearly two decades ago, but similar articles have been in the news within just the last couple of weeks.
Those are some good nuances that definitely require a nuanced response, and they forced me to refine my thinking, thank you! I’m actually not claiming that the brain is the sole boundary of the real me; rather, it is the majority of me, with my body as a contributor. The real me does change as my body changes, just in less meaningful ways. Likewise, some changes in the brain change the real me more than others. However, regardless of what does or doesn’t constitute the real me (and believe me, the philosophical rabbit hole there is one I love to explore), in this case I’m really just talking about the straightforward, immediate implications of a brain implant for my privacy. An arm implant would also be quite bad in this regard, but a brain implant is clearly worse.
There have already been systems that can display very rough, garbled images of what people are thinking of. I’m less worried about an implant that tells me what to do or controls me directly, and more worried about an implant that has a pretty accurate picture of my thoughts and reports it to the authorities. It’s surely possible to build a system that can approximate positive or negative mood states, and in combination this is very dangerous. If the government can tell that I’m happy when I think about Luigi Mangione, then it can respond to that information however it wants. Eventually, in the same way that I am conditioned by the panopticon to stop at a stop sign, even in the middle of a desolate desert where I can see for miles around that there are no cars, no police, no cameras, nothing that could possibly make a difference to my running the stop sign, the system will condition automatic compliance in thoughts themselves. That is, compliance is brought about not by any actual exertion of power or force, but merely by the omnipresent possibility of its exertion.
(For this we only need moderately complex brain implants, not sophisticated ones that actually control us physiologically.)
I absolutely think that privacy within your own mind should be inviolable (trusting corporations and even government to agree is laughable). Iain Banks’ Culture series explores some of these implications, as well as who should be in control of your mental state. It’s messy and hard, and is one of the reasons I currently wouldn’t get a brain implant. I might change my mind if I had ALS, for instance.