
Thread: There Is No God

  1. #91
    Quote Originally Posted by Uriel View Post
    We've just created an exact copy of you in a computer, that feels and thinks like you and thinks that it is you, but you're still here and are still going to die.
    I'm not entirely convinced that will work in the end. I have a feeling that our minds are built to work when certain other conditions are met. And I think one of those conditions is our body. Like, uh, that we have one.

    I wouldn't be surprised if, should we ever emulate the human mind or copy our own minds, the resulting system eventually went mad. I don't think our minds can work without that three-dimensional point of reference.

    We're probably going to discover that highly advanced robots that copy our minds need to look as much like us as possible, for their own sake as much as ours. I think we need that identity to stay sane, and anything that copies us will probably need it too.

  2. #92
    As for us living forever because of advances in medicine, I doubt that will happen. First, if it does happen, not everyone is going to know about it. There is no way that anyone making more than 100k a year is going to want every Tom, Dick, and Harry living forever.

    Second, how are they going to deal with the problem of the brain? You can't just regrow it. I'm taking a wild guess, but doesn't human memory work in such a way that making the brain repair itself as it aged would make you forget things?

    We're getting to where we can regrow parts in a lab with stem cells. Actually making shit stop aging is a bit far off imo. Unless someone has read something I haven't (which is possible).

  3. #93
    Quote Originally Posted by Fe 26 View Post
    I'm not entirely convinced that will work in the end. I have a feeling that our minds are built to work when certain other conditions are met. And I think one of those conditions is our body. Like, uh, that we have one.

    I wouldn't be surprised if, should we ever emulate the human mind or copy our own minds, the resulting system eventually went mad. I don't think our minds can work without that three-dimensional point of reference.

    We're probably going to discover that highly advanced robots that copy our minds need to look as much like us as possible, for their own sake as much as ours. I think we need that identity to stay sane, and anything that copies us will probably need it too.
    To add to this, if you were able to download your mind to something electronic, it wouldn't be you. It would probably emulate your brain under ideal conditions, which your brain doesn't run under. It also wouldn't account for the impact your physical health has on your mind. Say you are depressed, with a lot of testosterone in your body. That is going to have an impact on how you think. If your mind is copied to a computer, it will probably be run under ideal physical conditions. You'll never have depression, and you will have a normal level of hormones in your body.

    Another thing to consider is the impact of the hardware on your mind. Some of what you are is how fast you think, and how many things you can think of at one time. The electronic you will be able to think at a different speed than you, and probably about more things at once.

    About the only thing an electronic copy of you will share with you is your memories (and even those will be slightly different for the electronic you). Even if they are able to simulate all the other factors (which might be considered immoral in the future: "is it moral to give a copy of yourself depression?"), you will still be different from your electronic copy, if only because of the way modern computers work. There would have to be some very basic and fundamental changes in computing to get a 1-to-1 copy. It can't be done the way things are now.
    Last edited by Fe 26; 24 May 2011 at 03:59 AM.

  4. #94
    Quote Originally Posted by Opaque View Post
    Actually the best argument against it would be the fact that every time your cells divide you lose a little bit of DNA and getting that to stop is going to be a bitch.
    My sister is part of a team that already did that.

    http://www.scripps.edu/news/press/080309.html

    The article doesn't hammer it home like it should, but the clone mouse did not have the aged DNA of its twin/father. The DNA resets to new. If you don't feel like growing a whole damn animal, just inject some back into the original animal. Got some cancer issues to work out... But even if I can't live forever, I'm confident I'll have my hair back before I die.

    It's going to rock so hard having people with senile 100-year-old brains strutting around in ripped twenty-year-old bodies.

  5. #95
    No, they didn't do what I said. You understand so little about this topic that what I said and what they did appear to be the same thing to you. What they did was show that every cell in your body has the DNA needed to make every part of your body, which is awesome, but not the problem I was talking about and also, in no way, a solution to it either. I'll explain, since you obviously don't know shit about shit:

    DNA polymerase can only travel in one direction when duplicating DNA, from the 5' end to the 3' end, and it cannot attach to the very tip of the DNA. Because of these two factors (and some others we won't go into), a small section is always, 100% of the time, lost on duplication. It's a small amount, but it happens every time, in every creature ever observed, and over a long enough time span you could essentially run out of functioning DNA. The thing is, DNA can only be copied, it can't be rewritten from scratch, and when information is lost it's lost forever, at least in that cell. The good news is that the ends of your DNA, and a mouse's DNA, carry TONS of telomeres (non-coding sections of DNA), and many things, like those mice for example, have not had the chance to burn through the telomere sections and get into functioning DNA.

    To stop this from happening you would need to either A) create a new version of DNA polymerase that can attach to the very tips of DNA, or B) create a different enzyme that attaches telomeres onto DNA at a more controlled rate than we see in cancer cells. Either of those would need to be injected into a gamete or dispersed around the body via some advanced nano machines or viruses. It would have to hit every cell, eventually, for it to truly work.
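    For anyone who wants to see the arithmetic, the end-replication problem above can be sketched as a toy counter: each division shaves a fixed chunk off the telomere buffer, and only once the buffer is gone does replication start eating functional DNA. The telomere length and per-division loss below are made-up illustrative numbers, not measured biology.

```python
# Toy model of the end-replication problem.
# Both constants are illustrative assumptions, not measured values.
TELOMERE_REPEATS = 10000   # non-coding buffer (base pairs) at the chromosome end
LOSS_PER_DIVISION = 70     # base pairs lost from the end on each cell division

def divisions_until_coding_loss(telomere_len, loss_per_division):
    """Count how many divisions the telomere buffer can absorb before
    replication would start chewing into functional (coding) DNA."""
    divisions = 0
    while telomere_len >= loss_per_division:
        telomere_len -= loss_per_division
        divisions += 1
    return divisions

print(divisions_until_coding_loss(TELOMERE_REPEATS, LOSS_PER_DIVISION))  # prints 142
```

    The point of the toy model is just that the loss is cumulative and one-way: nothing in ordinary replication ever adds the buffer back, which is why the post argues you'd need a new or modified enzyme to extend the ends.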

    Making that happen, without having cancer or something terrible happen, is going to be intense. It's also going to require, for the first time ever, us creating a functioning enzyme which fits into the DNA replication cycle without fucking shit up, which is a monumental task if there ever was one.

    To clarify:

    Quote Originally Posted by Cheebs View Post
    The DNA resets to new.
    If you mean that it magically regained all the telomeres lost to replication during the father's/twin's life cycle, then, no. In fact, the DNA doesn't even do anything different once they transported it; all DNA ever does is sit there and code for proteins. The rest of the cell, in response to its environment relative to the body or lack thereof, determines what happens with the proteins that are coded and what type of cell it will be. DNA never chooses what it does, and it does not reset. You are grossly misrepresenting the incredible work your sister is doing by talking about it improperly.
    Last edited by Opaque; 24 May 2011 at 05:00 AM.

  6. #96
    Quote Originally Posted by Fe 26 View Post
    To add to this, if you were able to download your mind to something electronic, it wouldn't be you. It would probably emulate your brain under ideal conditions, which your brain doesn't run under. It also wouldn't account for the impact your physical health has on your mind. Say you are depressed, with a lot of testosterone in your body. That is going to have an impact on how you think. If your mind is copied to a computer, it will probably be run under ideal physical conditions. You'll never have depression, and you will have a normal level of hormones in your body.

    Another thing to consider is the impact of the hardware on your mind. Some of what you are is how fast you think, and how many things you can think of at one time. The electronic you will be able to think at a different speed than you, and probably about more things at once.

    About the only thing an electronic copy of you will share with you is your memories (and even those will be slightly different for the electronic you). Even if they are able to simulate all the other factors (which might be considered immoral in the future: "is it moral to give a copy of yourself depression?"), you will still be different from your electronic copy, if only because of the way modern computers work. There would have to be some very basic and fundamental changes in computing to get a 1-to-1 copy. It can't be done the way things are now.
    I'm not seeing the negative here. You're basically saying that if the singularity does come to pass, not only are we all going to live forever, but we're also getting the mental equivalent of Captain America's super soldier serum. (And as for the issue of needing a three-dimensional frame of reference you brought up in the previous post, one word: Matrix.)

    My big issue with the whole singularity notion is, suppose we do manage to create consciousness-duplicating A.I.s that pass the Turing test... even then, how would we ever know there really was a "ghost in the machine"? How would we tell that there actually was a mind in there, thinking, experiencing, feeling, and not just creating output that very convincingly mimics the appearance of such?

    Quote Originally Posted by Gohron View Post
    I like doing stuff with animals and kids

  7. #97
    The answer to that last question is simple: if you create a real AI, you are not programming in specific responses. If behaviors and such emerge naturally without being taught or referenced, then it's legit.

    The singularity is simply about creating something smarter than us though, not necessarily human or having a soul in any way. Really, giving it things like emotions is an obviously terrible idea.

  8. #98
    Quote Originally Posted by The Gas View Post
    I'm not seeing the negative here. You're basically saying that if the singularity does come to pass, not only are we all going to live forever, but we're also getting the mental equivalent of Captain America's super soldier serum.
    That parallel would only work if Captain America was a clone, and they shot the man that donated the DNA shortly after he donated.

    That is the crux of the singularity idea. If you really downloaded yourself into whatever, it wouldn't be you. It would be a copy. You're still a big sack of organs that will eventually die. And your copy isn't a very good copy of you. It's going to be something much better than you, with your memories. But it won't be you. It won't like girls with big asses that talk dirty, because the ideal you won't have whatever chemical makeup makes you attracted to such things. It won't stop and get upset about anything that was due to emotional trauma. If anything, it will be a sentient biography of you. Like an electronic ghost, or those paintings from the Harry Potter movies.


    Quote Originally Posted by The Gas View Post
    (And as for the issue of needing a three-dimensional frame of reference you brought up in the previous post, one word: Matrix.)
    Are you really using the Matrix as a counterpoint?

    And if we go as far as simulating a three-dimensional world for our copied selves to live in, I have to ask: what is the point? What exactly is the point of all that computing power? And it would require a fuck ton. You'd be simulating the human mind AND a world it can deal with. You'd be simulating every little sensory input, for every single person, for every second of every day.

    And what is the point of that? Code running on code. Simulations bouncing off of simulations. That sounds like maybe the greatest joke mankind could play on the universe. The biggest ego masturbation ever conceived. You love yourself soooo much that you're going to make a copy of yourself, and it is so frail that you're going to let it live in a perfect code world so it won't go mad at the realization of having no body and living in a box.

    Quote Originally Posted by The Gas View Post
    My big issue with the whole singularity notion is, suppose we do manage to create consciousness-duplicating A.I.s that pass the Turing test... even then, how would we ever know there really was a "ghost in the machine"? How would we tell that there actually was a mind in there, thinking, experiencing, feeling, and not just creating output that very convincingly mimics the appearance of such?
    What does it matter? All living creatures are little more than a combination of systems working together or against each other.

    You asking that is like a musician asking, "How do I know the guitar sim software really sounds like a Fender Champ and isn't just pretending to sound like a Fender Champ?"

    Well, it is never going to stop pretending to be something. That is the nature of simulations and software. It is always going to be a system of equations. If it ever became sentient, that would only mean it has become more complex and incorporated feedback.

  9. #99
    What in the sweet name of fuck is everyone talking about?
    Quote Originally Posted by C.S. Lewis
    Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.

  10. #100
    Quote Originally Posted by Uriel View Post
    The singularity is simply about creating something smarter than us though, not necessarily human or having a soul in any way. Really, giving it things like emotions is an obviously terrible idea.
    The thing is, if machines become super intelligent, they will develop morals and emotions.

    The vanity of people is amazing. Do you really think of your mind as something special and outside of the world of machines? Your mind is a computer like any other. Your emotions are your mind crunching variables faster than you are consciously aware, and the resulting feelings are the variables or answers your mind has assigned to the outcome. When you feel bad, that is your mind setting a weight to something. The same when you feel good.

    And truth be told, we have no guarantee that machines will be any better off than we are. People fuck up and are stupid because they live in a world where they don't have all the information they need, or they ignore information they shouldn't. Even if machines can process more information than us, they are still going to have about the same amount of information that we have.
    Last edited by Fe 26; 24 May 2011 at 08:51 AM.
