
Content provided by The Nonlinear Fund. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Nonlinear Fund or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

LW - When is a mind me? by Rob Bensinger

23:40
 

Link to original article
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: When is a mind me?, published by Rob Bensinger on April 17, 2024 on LessWrong.

xlr8harder writes: In general I don't think an uploaded mind is you, but rather a copy. But one thought experiment makes me question this. A Ship of Theseus concept where individual neurons are replaced one at a time with a nanotechnological functional equivalent. Are you still you?

Presumably the question xlr8harder cares about here isn't the semantic question of how linguistic communities use the word "you", or predictions about how whole-brain emulation tech might change the way we use pronouns. Rather, I assume xlr8harder cares about more substantive questions like: If I expect to be uploaded tomorrow, should I care about the upload in the same ways (and to the same degree) that I care about my future biological self? Should I anticipate experiencing what my upload experiences? If the scanning and uploading process requires destroying my biological brain, should I say yes to the procedure?

My answers: Yeah. Yep. Yep, this is no big deal. A productive day for me might involve doing some work in the morning, getting a sandwich at Subway, destructively uploading my brain, then texting some friends to see if they'd like to catch a movie after I finish answering e-mails. ¯\_(ツ)_/¯

If there's an open question here about whether a high-fidelity emulation of me is "really me", this seems like it has to be a purely verbal question, and not something that I would care about at reflective equilibrium. Or, to the extent that isn't true, I think that's a red flag that there's a cognitive illusion or confusion still at work. There isn't a special extra "me" thing separate from my brain-state, and my precise causal history isn't that important to my values.
I'd guess that this illusion comes from not fully internalizing reductionism and naturalism about the mind. I find it pretty natural to think of my "self" as though it were a homunculus that lives in my brain, and "watches" my experiences in a Cartesian theater. On this intuitive model, it makes sense to ask, separate from the experiences and the rest of the brain, where the homunculus is. ("OK, there's an exact copy of my brain-state there, but where am I?")

E.g., consider a teleporter that works by destroying your body, and creating an exact atomic copy of it elsewhere. People often worry about whether they'll "really experience" the stuff their brain undergoes post-teleport, or whether a copy will experience it instead. "Should I anticipate 'waking up' on the other side of the teleporter? Or should I anticipate Oblivion, and it will be Someone Else who has those future experiences?"

This question doesn't really make sense from a naturalistic perspective, because there isn't any causal mechanism that could be responsible for the difference between "a version of me that exists at 3pm tomorrow, whose experiences I should anticipate experiencing" and "an exact physical copy of me that exists at 3pm tomorrow, whose experiences I shouldn't anticipate experiencing".

Imagine that the teleporter is located on Earth, and it sends you to a room on a space station that looks and feels identical to the room you started in. This means that until you exit the room and discover whether you're still on Earth, there's no way for you to tell whether the teleporter worked. But more than that, there will be nothing about your brain that tracks whether or not the teleporter sent you somewhere (versus doing nothing). There isn't an XML tag in the brain saying "this is a new brain, not the original"! There isn't a Soul or Homunculus that exists in addition to the brain, that could be the causal mechanism distinguishing "a brain that is me" from "a brain that is not me".
There's just the brain-state, with no remainder. All of the same functional brain-states occur whether yo...

1645 episodes




 
