Monday, August 20, 2007

Robot Rights

I've been reading about the ethical question of whether robots with some modicum of self-determination might deserve some or all of the same rights as humans. Although we may be some distance from this technologically, it isn't unreasonable--given the pace of development in robotics and artificial intelligence--to assume that at least limited-function robots will be flitting around homes and businesses within a generation.

The most important question that comes with the issue of rights, in my opinion, is that of personal responsibility. In other words, who is responsible if a robot intentionally injures or kills someone? When dealing with humans, our current laws hold the individual responsible. But will we tend to blame the engineers or programmers when a robot goes awry? After all, they determined the robot's wiring and installed the code that makes it function. And if we do blame the robot's "creators", does that call the humans' creator(s) into question as well?

If we accept that nature endows each human with genetic wiring that predisposes us to certain personality traits and a certain capacity for intelligence, and that our environment provides the programming for that wiring, then why wouldn't a deity and our parents be responsible for our actions, rather than us? This is a question that philosophers and legal minds will no doubt struggle with over the next two to three decades, but it seems clear that the introduction of robotic intelligence will begin our questioning of ethical responsibility for all sentient entities.
