Okay, so I read the article, and honestly it's the same thing as the movie Dominguez showed us the trailer for, so I kind of got a double dose of it. That's good, though, because Wyatt, Davies, Jeff, Aleks, Niel, and I had a pretty good debate. Here are some things I picked up on.
My position is that there should not be a singularity, and that it is too dangerous to even consider trying. Why? Because we don't know what the singularity will bring. That's what everyone is saying, and yet the scientists (or at least some of them) say they're willing to take the risk. And what risk is this? Loss of humanity, loss of freedom, death, etc. People say you can't claim those things will happen, but I say they can't claim they won't. Would you really want to risk it? Your life is worth so much, and your life is actually a very good one. True, technology can improve so much, but I think striving for the singularity is like buying a Super Lotto ticket, and if you don't win... hello, slavery to the very machines you wanted. How could anyone say yes to the singularity with even the slightest chance of death or loss of humanity?

That was one question that came up at Davies's house: when do you lose your humanity? I think it's the moment you switch out a finger for a machine finger. That finger doesn't have life. One thing we said and all agreed on was this: humanity is made up of two things, life (just living and breathing) and consciousness. Don't you agree?
Oh, and in case you couldn't tell, I think Bernard is onto the very thing that will save us from the singularity. It is not imminent, whatever Kurzweil says: "He believes that this moment is not only inevitable but imminent."
I wish I could stay, because I have so many other points, but we'll probably talk in class. Bye bye!