Thursday, January 1, 2009


Laws We Must Not Forget (Continued...)

The Three Laws of Robotics

are a set of rules written by Isaac Asimov, which most of the robots in his novels and stories are designed to obey. In this universe, the laws are "mathematical formulas imprinted on the positronic brain pathways" of the robots (what we would now call ROM). First appearing in the story "Runaround" (1942), they state:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
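The strict precedence among the three laws can be illustrated as a lexicographic comparison: the First Law always outranks the Second, which always outranks the Third. The following Python fragment is a toy sketch of that idea; the `Action` fields and the `choose` function are invented for this illustration and do not come from Asimov.

```python
from dataclasses import dataclass

@dataclass
class Action:
    injures_human: bool   # would this action injure a human being?
    obeys_order: bool     # does it comply with a human's order?
    preserves_self: bool  # does it protect the robot's own existence?

def choose(actions):
    """Pick the action that best satisfies the laws in strict priority order.

    Tuple comparison in Python is lexicographic, so the First Law term
    dominates the Second, which dominates the Third.
    """
    def score(a: Action):
        return (not a.injures_human, a.obeys_order, a.preserves_self)
    return max(actions, key=score)
```

Because the score is a tuple, an action that obeys an order but endangers the robot always beats one that preserves the robot while disobeying, mirroring how the Third Law is subordinate to the Second.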

This wording of the laws is the conventional way they are stated in the stories; their true form would be an equivalent, and more complex, set of instructions in the robot's brain.

Asimov attributed the Three Laws to John W. Campbell, saying Campbell had formulated them during a conversation held on December 23, 1940. Campbell, however, maintained that Asimov had already conceived them, and that between the two of them they had simply expressed them in a more formal way.

The Three Laws appear in a large number of Asimov's stories: they figure throughout his Robot series, as well as in several related stories and in the series of novels featuring Lucky Starr. They have also been used by other authors working in Asimov's fictional universe, and they are frequently referenced in other works, both in science fiction and in other genres.

Zeroth Law

The Zeroth Law of Robotics is a variation on the Laws of Robotics that first appeared in Isaac Asimov's book Robots and Empire. This law could only be enunciated by a dying robot, R. Giskard Reventlov, after having to kill a human being while trying to defend a future benefit for humanity as a whole.

This dilemma leads a much more advanced humaniform robot, R. Daneel Olivaw, to begin assimilating the law, which would end up being defined as: "A robot may not injure humanity or, through inaction, allow humanity to come to harm." The Three Laws of Robotics were thus modified to be subordinate to this new law.

The existential problem that arises is the definition of humanity: since it is not a mathematical value, it is difficult to gauge when a robot must decide on a course of action. Traces of this difficulty can be found in novels such as Foundation's Edge and Robots and Empire, in which a Solarian robot defines a human being as only a creature possessing certain linguistic features. Luis Alberto Rojas Leal

