I want to welcome our machine overlords, and HURRY UP AND DIE!

Posted on February 11, 2013



This was going to be two posts but the first burbled in my head too long and merged with the second.  Since they tie together, vaguely, I did just that.  Probably with duct tape, like the people in the basement.  It’s been a long week.

I walk around with the Hitchhiker's Guide to the Galaxy.  Not Douglas Adams' book.  Although actually the book too.  But the item that forms the core conceit of the book.  I suspect most anyone who reads this does as well.  You do have some form of smartphone, correct?  It's connected to the internet.  You can access Wikipedia.  They're the same thing.  Especially the part about being wildly inaccurate.

Calm yourself children, the last was a joke.  I know Wikipedia tends to get corrected very quickly.

But we’re already starting to offload memory functions into machines.  My smartphone can talk to me and tell me where I am, what I’m supposed to be doing, and what I can do where I am.  If I’m wearing a headset it can speak directly into my ear canals, almost directly to my brain.  So I can look up anything I need to.  And I’m lucky enough that I typically only have to do it once.  But a side effect of this is that when I don’t have my phone I feel noticeably more stupid.  There was a period this fall when my phone took a bath and what I wanted to replace it with wasn’t available for two more weeks.  Those were a long two weeks.

Charlie Stross’ Accelerando touches on what may happen when we can actually offload whole portions of memory and consciousness to devices.  The results are mixed.  One of the characters is able to offload a lot of what he sees as non-critical operations to his various devices, but when the main interface unit is stolen he’s reduced to a barely functioning imbecile.  He is so reliant on the devices that he can’t function even as well as he did before he had them.  His reliance on technology didn’t make him stupider, but the loss of it rendered him useless.

Also, if you haven’t read Accelerando, click on the link and download it.  The book is awesome, it won awards, and Señor Stross has it on his website for free.  So read it.  I’ll wait.

Still waiting.

Manfred Macx’s problem is that he was offloading to external storage and processing.  The interface tools were built in, but someone was able to steal his “disc”.  It doesn’t matter how good your virtual defenses are if someone can bash you in the back of the head and steal your smart glasses.

The obvious solution to issues like this is to build the hardware in.  It’s presented in both positive and negative lights throughout Sci-Fi.  Sometimes I find myself looking forward to when I’ll be able to access the computer that has been implanted in me.

Then Moore’s Law rears up and beats me about the head and shoulders.

Building a computer into a human body may work.  And it may happen.  Actually, we already do.  We put machinery into people all the time.  The problem I see with doing so with devices a person will use for memory and thought processing is that it builds obsolescence into a human.  More so than it already is.

I can buy a memory card the size of my pinkie nail that holds more data than every computer I used, combined, from 1990 to 2000.  My current home PC probably has more processing power than DARPA did in 1980.  That last is just a guess, I don’t know offhand.  Miniaturization isn’t a problem.  I suspect we’ll be able to figure out how to interface with internal biological systems.  But what about Moore’s Law?  What happens when your hardware isn’t up to snuff?  What happens when 2 or 4 or 10 years have passed and the new guy in the department really is smarter and better than you because she’s younger and has the latest generation of hardware?  You can’t study your way out of obsolete tech.  How often will you go under the knife to get upgrades?  How long can you be out of work for upgrades?  How long will the recovery be?  It takes a while to bounce back from surgery.  A lot of authors hand-wave these issues away by proclaiming SCIENCE very loudly.
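To put rough numbers on that obsolescence gap, here’s a back-of-the-envelope sketch.  It assumes the classic Moore’s Law doubling period of roughly two years, and that doubled hardware translates into doubled “smarts” — both are simplifications, not claims about real implants:

```python
# Back-of-the-envelope Moore's Law gap: how much more capable is
# hardware implanted `years` later, if capability doubles every
# `doubling_period` years?  (Both assumptions are simplifications.)

def capability_ratio(years, doubling_period=2.0):
    """Relative capability of brand-new hardware vs. hardware `years` old."""
    return 2 ** (years / doubling_period)

for years in (2, 4, 10):
    print(f"{years} years behind -> new hardware is "
          f"{capability_ratio(years):.0f}x more capable")
# 2 years behind -> new hardware is 2x more capable
# 4 years behind -> new hardware is 4x more capable
# 10 years behind -> new hardware is 32x more capable
```

Ten years between surgeries and the new hire’s hardware is, on this crude model, 32 times yours.  That’s the gap no amount of studying closes.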

This is where I’m going to veer into the second half of the title, which is actually what was going to be the first post.

There are days I feel like Moses standing just outside the promised land.  I can see it.  I can smell it.  But some of the people I’m traveling with are fucking up and delaying.  I can’t go until they die and I might not make it long enough.

So to a large portion of the baby boomers and a lot of the older Gen X’ers, please hurry up, fuck off, and die.

Sometimes I feel bad about wishing that on my parents’ friends.  Sometimes.  But society isn’t equipped to deal with a generation staying in power so long.  We’re not far removed from a time when the ages we’re now routinely reaching were a 1% thing, not the norm for the group.  We don’t have the equipment to deal with such a large group of old bastards being in power so long.  It was OK when you had a few old people around for advice, wisdom, insight, whatever.  But now that large groups of the dumb ones and the smart ones get to hang around, it’s an issue.  They got into power and they don’t want to let go.  That’s natural.  Humans like to be in charge.  The problem is that the methods of staying in charge are becoming increasingly regressive.  Boomers are actively campaigning against scientific progress in a number of fields: climate, biology, sociology.  I have no problem with disagreeing about findings in an experiment.  I disagree with not letting the experiment proceed because you got yours, and god damn, if it was good enough for me then fuck off, youngster, it’s good enough for you.

The stem cell debacle at the beginning of the century was part of it.  The complete denial and attempted defunding of climate research is another.  Hell, not letting anyone study why so many people in the US die from gun injuries and the refusal to invest ANYTHING in our infrastructure is also part of it.

The future costs money.  The future is an investment from a preceding generation to the ones that will follow.  The boomers are a generation that grew up knowing the world was going to be blasted into an irradiated ball of glass.  You can’t invest in the future when you know there isn’t one.  And now, now they’re old.  They’re old and they’re set in their ways.  They’re old and set in their ways and they’re not going to change because fuck you if you can’t take a joke.  If they can’t see a benefit to themselves they don’t want to pay for it.

We’re doing this.  And this.  And this.  We are doing things that were only the province of Science Fiction just 5 years ago.  And if we didn’t have a generation of old bastards standing in the road yelling, “Stop!” as loud as possible we might be doing more.

So fuck off and die because you are between me and the singularity.

And I want the next one.  We’ve had singularities before.  Fire, tools, agriculture, the printing press.  They were all moments in the development of the species when the past was no longer prologue.  History could not repeat because there was nothing like those moments in history.  Singularities in physics break physics.  You can see to them but not past them.  Singularities in history broke paradigms (I feel like a douche for using that word but it works so well).  We don’t know what happens inside the black hole and no one could have seen New York as a direct result of throwing seeds into the ground over and over again.

I think the singularity will be a synthesis of biology and technology.  I think we will be cyborgs.  Which brings me back to how the upgrades will happen.

See, eventually I get to the point.

Ray Kurzweil thinks we’ll create artificial minds within 35 years.  I’m reading the book right now and haven’t found any glaring holes.  On the other hand, fusion and AI are always just around the corner.  That said, if we create an artificial mind we’re not far from the ability to transfer our minds into it.  If we can transfer our minds in, then we can transfer our minds out.  If we can transfer our minds in, we can hold them in stasis.

We’re starting to print organs.  We’ve successfully transplanted lab-grown organs, and we’re close to printing them outright.  It’s becoming less a problem of theory and more a problem of engineering.  And whatever our flaws, humans have always been decent engineers.  We’ve cloned organs.  We’re probably near cloning, or copying with bioprinters, an entire person.

If I can transfer your consciousness to a machine, and then print a new body around your upgraded computer hardware, I’ve just done the upgrade as painlessly as possible.

There are, possibly, moral issues.  What is to be done with the old body?  What rights does the new body have before a mind is copied to it?  What is murder if you can print a new body and load a backup mental state?  What about war?  What about suicide terrorism?

What about computer viruses that can now infect a person?  Is a person criminally negligent for not keeping their antivirus software current if they’re toombied and hurt someone? (Credit to David Weber and John Ringo for that one)  Can computer viruses be considered biological warfare?

We can’t know until we get there.  That’s why it’s a singularity.  We don’t know what’s on the other side until we pass through.  We can guess, but there is nothing on this side to tell us.

I want to know.  I can see it.  I can smell it.  I just want people to get out of the way so we can make it.

That is all.

I was wrong.  That’s not all.

I’m not all Rapture of the Nerds about the singularity.  Humans are still humans and will be in whatever guise we take once we can do the things I think we’ll be able to do, and humans are mostly sucky bags of suck.  Our history is littered with hellish periods of anarchy and strife, with only a few glittering moments of growth.  That there are more glittering moments now does not mean there will be more in the future.  But nothing levels the field like knowledge and information, and having the Guide buried in the core of your being is a pretty good way to work.

Now that is all.
