
Heinlein Readers Discussion Group
Thursday 05-26-2005 9:00 P.M.EDT
Beware the stobor – Robots? In Heinlein


Here Begin The Postings
From: LNC
Newsgroups: alt.fan.heinlein
Subject: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 20 Apr 2005 18:43:21 GMT

April is the cruellest month, breeding
Lilacs out of the dead land, mixing
Memory and desire, stirring
Dull roots with spring rain.
Winter kept us warm, covering
Earth in forgetful snow, feeding
A little life with dried tubers…
*******************************
Now, if you think it’s more B.S. than T.S., you may be right but the
fact of the matter is that I got real, unexpectedly, unavoidably busy
this month and will be out of town on the days during the last week of
this month, April, 2005, on which the meetings are usually scheduled.
No, it’s not that I’m still recovering from having hosted in March. Yes,
I’m late with the announcement and invite interested persons to shoot me
dead in the street if they see me. Maybe, I’ll just go ahead and make a
hidden announcement here.

ANNOUNCEMENT:

HEINLEIN READERS GROUP MEETING SCHEDULED
WHEN: May 26, 2005, 7:00 PM EDT and May 28, 2005, 5:00 PM EDT
WHERE: The usual AIM chatroom
TOPIC: Beware the stobor

Rod, who may or may not have been of color but was certainly so straight he was never off color, went through a Tunnel in the Sky and found himself walking through a frontier where he encountered a signpost which read: Beware the stobor. It (the sign) could have read, “Danger, Will Robinson,” but that wasn’t his name and that’s more a mechanical utterance. Notwithstanding what it said, what does it mean? Yeah, the TitS crew (not Charlie’s Angels) eventually hung the appellation on a certain indigenous scavenger/predator but only by default. What did it mean, then?

Now, I’ve advanced the theory before that since “stobor” is “robots,” backwards, the writer (and you all know who he is) is making a statement. No, not a statement about backwards robots; a statement about the presence of mechanical persons in science fiction. After all, there aren’t any mechanical persons in Heinleinian fiction, are there? (Or are there?) There are plenty of mechanical persons in some competitors’ science fiction writings, as I recall. One major competing writer during the ’50s populated whole fictional landscapes with them, and if it’s word games (like backwards words) you want to play, “Beware the stobor” is an anagram of “A Hebrew et robots.”
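Since the whole theory turns on word games, both claims are easy to check mechanically. Here is a minimal Python sketch (the `is_anagram` helper is my own naming, not anything from the thread):

```python
from collections import Counter

def is_anagram(a: str, b: str) -> bool:
    """True when the two strings use the same letters, ignoring case, spaces, and punctuation."""
    letters = lambda s: Counter(ch for ch in s.lower() if ch.isalpha())
    return letters(a) == letters(b)

# "stobor" really is "robots" reversed...
print("stobor"[::-1])  # -> robots
# ...and the two phrases really do share the same letter counts.
print(is_anagram("Beware the stobor", "A Hebrew et robots"))  # -> True
```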

So for the May discussion, what about robots in Heinlein? Are they too easy? Are they classical dei ex machinae and, in fact, machinae ut deo? What about Gay and Minerva and even Dan Davis’s little invention that got him in time trouble? Was “beware the stobor,” “nay, robots?” If so, why; if not, why not?

In summation, no meeting this month: the moderator dropped the ball; meeting next month: keep a close eye on your Roomba and post your suspicions. Be ready to talk about them, live, online, no bots allowed, next month.

L.N.C.
From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 00:07:09 +0000 (UTC)

LNC wrote in news::

(snip)

> ANNOUNCEMENT:

> HEINLEIN READERS GROUP MEETING SCHEDULED
> WHEN: May 26, 2005, 7:00 PM EDT and May 28, 2005, 5:00 PM EDT
> WHERE: The usual AIM chatroom
> TOPIC: Beware the stobor

(snip)

>
> Now, I’ve advanced the theory before that since “stobor” is “robots,”
> backwards, the writer (and you all know who he is) is making a
> statement. No, not a statement about backwards robots; a statement about
> the presence of mechanical persons in science fiction. After all, there
> aren’t any mechanical persons in Heinleinian fiction, are there? (Or are
> there?) There are plenty of mechanical persons in some competitors’
> science fiction writings, as I recall. One, major competing writer
> during the ’50’s populated whole fictional landscapes with them and if
> it’s word games (like backwards words) you want to play, “Beware the
> stobor” is an anagram of “A Hebrew et robots.”

There may be something to your theory. I noticed the ‘stobor’/‘robots’ reversal when I first read the book (mumble, 50 years ago), but could never see any reason for it.

>
> So for the May discussion, what about robots in Heinlein? Are they too
> easy? Are they classical dei ex machinae and, in fact, machinae ut deo?
> What about Gay and Minerva and even Dan Davis’s little invention that
> got him in time trouble? Was “beware the stobor,” “nay, robots?” If so,
> why; if not, why not?
>

I always felt that Heinlein’s forte was human characterization, and since robots were a field dominated and explored thoroughly by Asimov, there was little point in doing the same kind of stories over again.

I can’t recall any ‘human form’ robots in any of Heinlein’s stories or novels. The only ‘robots’ called that, IIRC, were the ones on Lanador in _Have Space Suit – Will Travel_.

“–there are robots wherever you turn on Lanador. I don’t mean things that
looked like the Tin Woodman; I mean machines that do things for you such as
the one which led us to our rooms, then hung around like a bellhop waiting
for a tip. It was a three-wheeled cart with a big basket on top, for luggage
if we had any.”

Heinlein always had calculator/computers of some sort from the earliest days: Monroe Alpha’s ‘integrating accumulator’; ‘Joe the Robot pilot’ in _Rocket Ship Galileo_, which ran on a cam “designed by a remote cousin of Joe’s, the great ‘Eniac’ computer at the University of Pennsylvania”; Shorty Weinstein’s ‘tons of IBM computer at Supra-New-York’ in _Space Jockey_; and of course, the very dumb computer in _Starman Jones_.

Starting with _The Moon Is A Harsh Mistress_, Heinlein did a quantum leap from the old style computers to the intelligent computers such as Mike, Minerva, Gay Deceiver and Dora.


The next meetings of the Heinlein Readers Group
Thursday 5/26/05 @ 9:00 P.M. EST and
Saturday 5/28/05 @ 5:00 P.M. EST
The topic for this discussion will be:
“Beware the stobor – Robots? in Heinlein”
See: https://www.heinleinsociety.org/readersgroup/index.html

From: “Dr. Rufo”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 19:32:13 GMT

LNC wrote:

< snip >

> HEINLEIN READERS GROUP MEETING SCHEDULED
> WHEN: May 26, 2005, 7:00 PM EDT and May 28, 2005, 5:00 PM EDT
> WHERE: The usual AIM chatroom
> TOPIC: Beware the stobor
>
> Rod, who may or may not have been of color but was certainly so straight
> he was never off color, went through a Tunnel in the Sky and found
> himself walking through a frontier where he encountered a signpost which
> read: Beware the stobor. It (the sign) could have read, “Danger, Will
> Robinson,” but that wasn’t his name and that’s more a mechanical
> utterance. What it said, notwithstanding, what does it mean? Yeah, the
> TitS crew (not Charlie’s Angels) eventually hung the appellation on a
> certain indigenous scavenger/predator but only by default. What did it
> mean, then?

Notwithstanding that David Wright has plunged ahead on the track you requested, I’d like to double back, if for nothing else, to mention that RAH did answer your first question in the TiTS scheme of things:

RAH has the students identify “stobor” as, first, a “leonine predator” and, later, as the “dopy joes” during their cyclical migrations leading them to the Beach of Bones.

There is, of course, the further evidence of the near-the-end-of-the-book conversation between Rod and the Deacon: (pray pardon any typos) < quote >

Rod looked thoughtful. “These are stobor, aren’t they? Little
carnivores heavy in front, about the size of a tom cat and eight
times as nasty?”
“Why ask me?”
“Well, you warned us against stobor. All the classes were warned.”
“I suppose these must be stobor,” Matson admitted, “but I did not
know what they looked like.”
“Huh?”
***”Rod, every planet has its ‘stobor’ . . . all different.
Sometimes more than one sort.”***[added emphasis is mine] He stopped
to tap his pipe. “You remember me telling the class that every
planet has unique dangers, different from every other planet in the
Galaxy?” [This is RAH repeating the last comment for those of us who
might have breezed by it the first time.]
“Yes. . .”
***”Sure, and it meant nothing, a mere intellectual concept. But you
have to be afraid of the thing behind the concept, if you are to
stay alive. So we personify it . . . but we don’t tell you
what it is. We do it differently each year. It is to warn you that
the unknown and deadly can lurk anywhere . . . and to plant it deep
in your guts instead of in your head.”***
“Well, I’ll be a— Then there weren’t any stobor! There never were!”
“Sure there were. You built these traps for them, didn’t you?”
< end quote >

My paraphrase to enhance meaning: the term ‘stobor’ is a neologism (for the students [and, theoretically, the readers] in this situation) which identifies the contents of each planet/environment/eco-system that are potentially capable of preventing a human from staying alive. It concentrates and makes specific the general cautions contained in “Take care!” or “Watch your back!”

> Now, I’ve advanced the theory before that since “stobor” is “robots,”
> backwards, the writer (and you all know who he is) is making a
> statement. No, not a statement about backwards robots; a statement about
> the presence of mechanical persons in science fiction. After all, there
> aren’t any mechanical persons in Heinleinian fiction, are there? (Or are
> there?) There are plenty of mechanical persons in some competitors’
> science fiction writings, as I recall. One, major competing writer
> during the ’50’s populated whole fictional landscapes with them and if
> it’s word games (like backwards words) you want to play, “Beware the
> stobor” is an anagram of “A Hebrew et robots.”

You suggest an anagrammatic content within an aphoristic neologism. That RAH used anagrams in his later work (cf. TNOTB) is, I suppose and concur, a valid theoretical basis for this. Especially when combined with RAH’s well-known high regard for the meaning(s) of the names he used in his works (cf. the Introduction to the “original” version of SIASL-1991). Are you suggesting the anagram to mean (rephrased) “(‘Beware’ to be understood and left unstated) a Jewish person AND constructed persons”? Wouldn’t that make it a paralogism at the very least?

For the sake of background, the word “robot” was coined in connection with a play written in 1920, “R.U.R.,” by the Czech writer Karel Čapek. The initials stand for “Rossum’s Universal Robots,” the company in the play which constructs the robots to do the drudgery work which has been, until that point, the responsibility of humans. In the play, these robots apply themselves diligently to repetitive tasks and suffer from an inability to produce “new thought.” (The comment is made that they’d make excellent university professors.) They are, in the play, produced through “organic” rather than “mechanical” processes.

>
> So for the May discussion, what about robots in Heinlein? Are they too
> easy?

What, pray tell, do you mean by “too easy”?

> Are they classical dei ex machinae and, in fact, machinae ut deo?

I translate the first phrase as either: “gods from the machine(s)” or “gods from the mechanics of the plot.”

The second, I construe as either “machinery that is/are god(s)” or “machinery that exists [because it is] from god.”

What’s your intention? Do you construe either/both differently? If so, how?

Until later.

Rufe
From:
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 16:41:29 -0400

I’m with Dr. Rufo on this. Coincidences do happen, and the robots/stobor thing looks like one to me. If Heinlein had noticed it or had it pointed out to him before publication, he probably would have changed it to some other coined word.

Also, while Asimov and Heinlein may have had their differences, I seriously doubt if Heinlein had any objections to robots as such. He rarely if ever objected to labor saving technology.
From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 16:50:44 -0400

wrote in news:1114374700.86c71c2b04e7d8d3b744323e18de7960@bubbanews:

> I’m with Dr. Rufo on this. Coincidences do happen, and the
> robots/stobor thing looks like one to me. If Heinlein had noticed it
> or had it pointed out to him before publication, he probably would
> have changed it to some other coined word.
>
> Also, while Asimov and Heinlein may have had their differences, I
> seriously doubt if Heinlein had any objections to robots as such. He
> rarely if ever objected to labor saving technology.
>

But, even as Asimov realized, robots, carried to the extreme, become more than just ‘labor saving technology’ and would be a major factor in the development of humanity. I don’t believe that Heinlein would have been any happier with robots as humanity’s nursemaids, any more than he felt that the ‘group consciousness’ of the Little People was the way for humanity.

Asimov, on the other hand, seemed to believe that such was the way of the future in his later books with his Gaia concept.


David Wright Sr.
If you haven’t joined the Society, Why Not?
https://www.heinleinsociety.org/join.html

Keep Up with the Latest
https://www.heinleinsociety.org/updates.html

Benefit The Heinlein Society by ordering books thru
http://home.alltel.net/dwrighsr/heinlein-amazon.htm

From:
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 19:45:30 -0400

On Sun, 24 Apr 2005 16:50:44 -0400, “David Wright Sr.” wrote:

>> Also, while Asimov and Heinlein may have had their differences, I
>> seriously doubt if Heinlein had any objections to robots as such. He
>> rarely if ever objected to labor saving technology.
>>
>
>But, even as Asimov realized, robots, carried to the extreme, became more
>than just ‘labor saving technology’and would be a major factor in the
>development of humanity. I don’t believe that Heinlein would have been any
>happier with robots as humanity’s nursemaids, any more than he felt that
>the ‘group consciousness’ of the Little People was a way for humanity.
>
>Asimov, on the other hand, seemed to believe that such was the way of the
>future in his later books with his Gaia concept.

Possibly, but I am still not convinced. Just about anything can be harmful if carried to the extreme.

From: “Sean”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Mon, 25 Apr 2005 10:06:10 +1000

> I’m with Dr. Rufo on this. Coincidences do happen, and the
> robots/stobor thing looks like one to me. If Heinlein had noticed it
> or had it pointed out to him before publication, he probably would
> have changed it to some other coined word.

Recycling some of my previous ideas on this….. I originally thought Heinlein may have meant it as a deliberate cryptic message or joke, but the actual text doesn’t support it (as such). The students came to think of the Dopey Joes as the stobor, but (as Rufe points out) Matson says “I suppose these must be stobor…. but I did not know what they looked like.” He goes on about personifying the unknown, unique dangers of any planet, and how they do it differently each year to “plant it deep in your guts instead of your head.” What may have been called “stobor” one year could easily be called “swollip” the next, and Matson wouldn’t care which. There were no robots on Tangaroa, so no cryptic warning to the students (or the reader) about them was necessary. Also, none of the characters in the novel ever said, “Hey! Stobor spelt backwards is ‘robots’! We’d better watch out for them.” But I equally find it difficult to believe that Heinlein could have overlooked the obvious connection, which so many readers seem to have spotted after just one or two readings.

I think it was Ginny who said RAH was not aware of the “stobor-robots” connection until someone told him about it. Who knows? Maybe he just liked the sound of the name?

> Also, while Asimov and Heinlein may have had their differences, I
> seriously doubt if Heinlein had any objections to robots as such. He
> rarely if ever objected to labor saving technology.

Indeed.


Sean
“Stobor name – no one man robots”

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 21:27:35 -0400

“Sean” wrote in news:YvWae.8$f_2.726@nnrp1.ozemail.com.au:

(snip)

> But I equally find it difficult to believe that Heinlein could have
> overlooked the obvious connection, which so many readers seem to have
> spotted after just one or two readings.
>
> I think it was Ginny who said RAH was not aware of the “stobot-robots”
> connection until someone told him about it.
> Who knows? Maybe he just liked sound of the name?
>

I find it very difficult to believe that the man who came up with the following could have overlooked ‘stobor’ spelled backwards. Of course, the list below comes from a much later date (_Tunnel In The Sky_, 1955; _The Number Of The Beast_, 1979) and maybe he had developed the talent as he grew older.

From The Readers Group Archives

https://www.heinleinsociety.org/readersgroup/AIM_02-01-2001.html#anagram

From a posting by Jane Davitt

The first number is the page number in the USA editions; the second number
( in parentheses) refers to the UK editions.

19 (9) Neil O’Heret Brain = Robert A Heinlein

93 (93) Bennie Hibol = Bob Heinlein

176 (177) Morinosky = Simon York (pen name: UNKNOWN et al.)

262 (273) Iver Hird-Jones = John Riverside (pen name: UNKNOWN et al.)

499 (539) The Villains Nine Rig Ruin = Lt Virginia Heinlein USNR

499 (540) Torne, Hernia, Lien and Snob = Robert Anson Heinlein

509 (553) Sir Tenderloinn the Brutal = Lt Robert A Heinlein USN RTD

509 (554) L Ron O’Leemy = Lyle Monroe ( pen name for SF 1939 -46)

510 (555) Mellrooney = Lyle Monroe ( pen name for SF 1939 – 46)

Heinlein then signed the letter R. A. “Beast” Heinlein.
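For anyone who would rather verify the list above than take it on faith, anagram pairs can be checked by comparing letter counts. A short Python sketch (the `letters` helper is my own naming), run against three of the pairs:

```python
from collections import Counter

def letters(s: str) -> Counter:
    """Letter multiset of a string, ignoring case, spacing, and punctuation."""
    return Counter(ch for ch in s.lower() if ch.isalpha())

# Three of the alias/name pairs from the list above, checked mechanically.
pairs = [
    ("Neil O'Heret Brain", "Robert A Heinlein"),
    ("Bennie Hibol", "Bob Heinlein"),
    ("The Villains Nine Rig Ruin", "Lt Virginia Heinlein USNR"),
]
for alias, name in pairs:
    assert letters(alias) == letters(name), (alias, name)
print("all three pairs are true anagrams")
```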


The next meetings of the Heinlein Readers Group
Thursday 5/26/05 @ 9:00 P.M. EST and
Saturday 5/28/05 @ 5:00 P.M. EST
The topic for this discussion will be:
“Beware the stobor – Robots? in Heinlein”
See: https://www.heinleinsociety.org/readersgroup/index.html

From:
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 21:42:21 -0400

On Sun, 24 Apr 2005 21:27:35 -0400, “David Wright Sr.” wrote:

>> I think it was Ginny who said RAH was not aware of the “stobot-robots”
>> connection until someone told him about it.
>> Who knows? Maybe he just liked sound of the name?
>>
>
>I find it very difficult to believe that the man who came up with the
>following could have overlooked ‘stobor’ spelled backwards. Of course, the
>list below comes from a much later date _Tunnel In The Sky_, 1955 and _The
>Number Of The Beast_, 1979 and maybe he had developed the talent as he grew
>older.

Maybe he developed the talent further, or maybe he just missed something. The man was only human, after all.

If the students had been told to beware of “vomisa,” that would be much more suspicious. While he never had robots play an extensive part in anything he wrote, he also never had them taking over the world, or putting huge numbers of people out of work, or making humans feel worthless because they cannot keep up with the robots, or anything like that.
From: (Michael Black)
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: 25 Apr 2005 02:30:28 GMT

() writes:

> On Sun, 24 Apr 2005 21:27:35 -0400, “David Wright Sr.”
> wrote:
>
>>> I think it was Ginny who said RAH was not aware of the “stobot-robots”
>>> connection until someone told him about it.
>>> Who knows? Maybe he just liked sound of the name?
>>>
>>
>>I find it very difficult to believe that the man who came up with the
>>following could have overlooked ‘stobor’ spelled backwards. Of course, the
>>list below comes from a much later date _Tunnel In The Sky_, 1955 and _The
>>Number Of The Beast_, 1979 and maybe he had developed the talent as he grew
>>older.
>
> Maybe he developed the talent further, or maybe he just missed
> something. The man was only human, after all.
>
> If the students had been told to beware of “vomisa” that would be much
> more suspicious. While he never had robots play an extensive part in
> anything he wrote, he also never had them taking over the world or
> putting huge numbers of people out of work or making humans feel
> worthless because they cannot keep up with the robots or anything like
> that.

And that’s a good point. At the moment, I can’t think of robots (as in mechanical men) in anything but The Door Into Summer. In that book, the robots were merely a product, so he could tell the story of time travel and a guy being cheated out of his company and ideas. He could have had any product for that purpose. Robots had the advantage that they are a familiar thing to readers. He could have used something mundane, but not only would there not be much cause for innovation, it would hardly be “futuristic.” He could have cooked up something completely new, but that takes time and, maybe more important, takes time to explain to the reader, when he was more concerned with conveying the concept of off-the-shelf components, etc.

Various other authors wrote about robots because that was the focus of their writing. Jack Williamson wrote “With Folded Hands” (I can’t remember if that was the first book or the sequel) to tell the story of what happens if robots take over by being oh so good. Asimov cooked up the Laws of Robotics and mined them well, but he wanted to write about their impact on a society.

Michael
From: “Big_Fella”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Tue, 26 Apr 2005 09:43:00 +1000

wrote in message news:1114392753.7cd16b3ae86cb8340c0193836fa2e4e0@bubbanews…

> On Sun, 24 Apr 2005 21:27:35 -0400, “David Wright Sr.”
> wrote:
>
>>> I think it was Ginny who said RAH was not aware of the “stobot-robots”
>>> connection until someone told him about it.
>>> Who knows? Maybe he just liked sound of the name?
>>>
>>
>>I find it very difficult to believe that the man who came up with the
>>following could have overlooked ‘stobor’ spelled backwards. Of course, the
>>list below comes from a much later date _Tunnel In The Sky_, 1955 and _The
>>Number Of The Beast_, 1979 and maybe he had developed the talent as he
>>grew
>>older.
>
> Maybe he developed the talent further, or maybe he just missed
> something. The man was only human, after all.
>
> If the students had been told to beware of “vomisa” that would be much
> more suspicious. While he never had robots play an extensive part in
> anything he wrote, he also never had them taking over the world or
> putting huge numbers of people out of work or making humans feel
> worthless because they cannot keep up with the robots or anything like
> that.

I always thought the way Friday felt about herself was also a bit of a comment on robots; i.e. “artificial people.”
:-[ )

From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Mon, 25 Apr 2005 20:05:30 -0700

In article, “Big_Fella” wrote:

> wrote in message
> news:1114392753.7cd16b3ae86cb8340c0193836fa2e4e0@bubbanews…
> > On Sun, 24 Apr 2005 21:27:35 -0400, “David Wright Sr.”
> > wrote:
> >
> >>> I think it was Ginny who said RAH was not aware of the “stobot-robots”
> >>> connection until someone told him about it.
> >>> Who knows? Maybe he just liked sound of the name?
> >>>
> >>
> >>I find it very difficult to believe that the man who came up with the
> >>following could have overlooked ‘stobor’ spelled backwards. Of course, the
> >>list below comes from a much later date _Tunnel In The Sky_, 1955 and _The
> >>Number Of The Beast_, 1979 and maybe he had developed the talent as he
> >>grew
> >>older.
> >
> > Maybe he developed the talent further, or maybe he just missed
> > something. The man was only human, after all.
> >
> > If the students had been told to beware of “vomisa” that would be much
> > more suspicious. While he never had robots play an extensive part in
> > anything he wrote, he also never had them taking over the world or
> > putting huge numbers of people out of work or making humans feel
> > worthless because they cannot keep up with the robots or anything like
> > that.
>
> I always thought the way Friday felt about herself was also a bit of comment
> on robots ; ie ” artificial people “.
> :-[ )

The “artificial persons” tag is also an ironic comment on the status corporations have under the current law, and the law in the novel — they, the corporations, considered by the artifice of law to be citizens for Constitutional purposes, have rights as citizens that “APs,” the humans created by genetic matching, do not. The setting and plot of _Friday_ are an example of how far that might be taken in the future, with the corporations really running the society instead of its nominally elected government functionaries.


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: “Big_Fella”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 07:44:13 +1000

“David M. Silver” wrote in message
news:
> In article ,
> “Big_Fella” wrote:
>
>> wrote in message
>> news:1114392753.7cd16b3ae86cb8340c0193836fa2e4e0@bubbanews…
>> > On Sun, 24 Apr 2005 21:27:35 -0400, “David Wright Sr.”
>> > wrote:
>> >
>> >>> I think it was Ginny who said RAH was not aware of the
>> >>> “stobot-robots”
>> >>> connection until someone told him about it.
>> >>> Who knows? Maybe he just liked sound of the name?
>> >>>
>> >>
>> >>I find it very difficult to believe that the man who came up with the
>> >>following could have overlooked ‘stobor’ spelled backwards. Of course,
>> >>the
>> >>list below comes from a much later date _Tunnel In The Sky_, 1955 and
>> >>_The
>> >>Number Of The Beast_, 1979 and maybe he had developed the talent as he
>> >>grew
>> >>older.
>> >
>> > Maybe he developed the talent further, or maybe he just missed
>> > something. The man was only human, after all.
>> >
>> > If the students had been told to beware of “vomisa” that would be much
>> > more suspicious. While he never had robots play an extensive part in
>> > anything he wrote, he also never had them taking over the world or
>> > putting huge numbers of people out of work or making humans feel
>> > worthless because they cannot keep up with the robots or anything like
>> > that.
>>
>> I always thought the way Friday felt about herself was also a bit of
>> comment
>> on robots ; ie ” artificial people “.
>> :-[ )
>
> The “artificial persons” tag is also an ironic comment on the status
> corporations have under the current law, and the law in the novel —
> they, the corporations, considered by the artifice of law to be citizens
> for Constitutional purposes, have rights as citizens that “APs,” the
> humans created by genetic matching, do not. The setting and plot in
> _Friday_ is an example of how far that might be taken in the future,
> with the corporations really running the society instead of its
> nominally elected government functionaries.
>
> —
> David M. Silver
> https://www.heinleinsociety.org
> “The Lieutenant expects your names to shine!”
> Robert Anson Heinlein, USNA ’29
> Lt.(jg), USN, R’td

AHH. Thanks David. That makes sense.

:-{)
From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sun, 24 Apr 2005 22:43:33 -0700
Date: Mon, 25 Apr 2005 02:27:24 EDT (be02)

In article ,
“Sean” wrote:

>
>
> > I’m with Dr. Rufo on this. Coincidences do happen, and the
> > robots/stobor thing looks like one to me. If Heinlein had noticed it
> > or had it pointed out to him before publication, he probably would
> > have changed it to some other coined word.
>
> Recycling some of my previous ideas on this….. I originally thought
> Heinlein may have meant it as a deliberate cryptic message or joke, but the
> actual
> text doesn’t support it (as such).

Maybe that’s true: yet, let’s list some characters that may not usually be discussed in “schoolhouse in the sky,” because they are “walk-ons,” and sooner or later disposed of.

1. Johann Braun (und his oberhund Thor): believes that highly advanced weapons and target acquisitions technology will ensure his survival, so much so that he appears to be carrying little, if any, food and water. Braun is killed from ambush almost immediately after the test begins by persons or beasts unknown.

2. The unidentified outlaw stalker: who shadows, attacks and attempts to murder Roderick Walker at the stream watering place, stealing Rod’s food, gear and weaponry, and leaving his body behind. Possibly, this individual is later killed in an unknown accident: if this bushwhacker is the same individual who got Braun, he was still carrying Braun’s weaponry (or exhausted it during the accident) at the time of the accident, as evidenced by the corpse found with the weaponry, the metal from whose powerpack was salvaged and reused.

3. Jock McGowan, his brother and two cronies: the Girty brothers of this frontier planet [1]. Satisfied to rule as renegades over the other students, they are twice foiled — twice, necessarily, because the criminal younger brother is allowed back into the settlement after the demise of his elder.

4. Grant Cowper: the by-the-textbook elected mayor of what becomes Cowperstown after his death.

I could maintain that each of the above is a ‘type’ of human who, rather than thinking out solutions, acts out robotic solutions, either learned early or picked out of texts, rather than adapting and thinking flexibly in the schoolhouse in the sky in which they are placed.

Compare how they act to the choir in Golding’s then-contemporary best-seller, Lord of the Flies. The choir is a metaphor for how people act within civilizations, following rote-taught conduct specified by their leaders, unthinking as robots.

Heinlein notes several times in Tunnel in the Sky that the most dangerous beast to fear is man himself — out beyond where we have cops and traffic lights, where the only truce is the Truce of the Bear itself.

Robots don’t think; they don’t use the best weapon humanity has, that mass of cells between our ears; they act with rote solutions to any problem they encounter. And the Three Laws of Asimov can create irresolvable conflicts that robots cannot solve.

[ … snip whether or not Heinlein ever discussed the question of “robots” with Ginny — what does that prove? The absence of evidence isn’t evidence of absence. The topic never came up. This is the guy you’ll recall later put her name in the Number of the Beast as an anagram. ]

> > Also, while Asimov and Heinlein may have had their differences, I
> > seriously doubt if Heinlein had any objections to robots as such. He
> > rarely if ever objected to labor saving technology.
>
> Indeed.

Indeed, what? Johann Braun carried a wonderful labor-saving piece of technology into the test. He lasted a few minutes. Rod Walker carried a knife hidden inside a bandage on his leg. Who lasted longer? Johann the robot or Rod the Romantic (who had an occasional glimmer of thought between his ears)?

[1] The three (some accounts number as many as five) Girty brothers, the best known of whom was Simon Girty, were hated and reviled renegades on the Northwest frontier of the United States (the Ohio River Valley) in the period 1780-94. Born in Western Pennsylvania, they fled the American settlements and, operating from British trading posts along the Miami and Ohio trail (a course later taken by a canal built from the Ohio River to Lake Erie), are alleged to have betrayed settlers, planned massacres of them by hostile Indians, scouted for the Indians with whom they maintained trade relationships, and seemingly enjoyed the torture of the captured Americans they betrayed. Some claim they were originally hired by a British general during the Revolution but continued their activities after the war. After their trading posts became the object of a campaign and were seized by the U.S. Army in 1794, they fled to Canada. See, e.g., http://www.edsanders.com/hist005.htm and see Zane Grey’s novel, Spirit of the Border (1938), described here: http://www.d.umn.edu/~tbacig/writing/Metis/brdmetis.html


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: “Sean”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Tue, 26 Apr 2005 17:16:29 +1000

“David M. Silver” wrote in message
news:
> In article ,
> “Sean” wrote:
>
>>
>>
>> > I’m with Dr. Rufo on this. Coincidences do happen, and the
>> > robots/stobor thing looks like one to me. If Heinlein had noticed it
>> > or had it pointed out to him before publication, he probably would
>> > have changed it to some other coined word.
>>
>> Recycling some of my previous ideas on this... I originally thought
>> Heinlein may have meant it as a deliberate cryptic message or joke,
>> but the actual text doesn’t support it (as such).
>
> Maybe that’s true: yet, let’s list some characters that may not usually
> be discussed in “schoolhouse in the sky,” because they are “walk-ons,”
> and sooner or later disposed of.

The thing is, if you limit the meaning of the warning “Beware of Stobor” to “beware of humans who act like robots” (or, don’t act like a robot yourself), you negate the actual purpose of the warning as stated by Matson near the end of the novel i.e. to personify the unknown unique dangers of any planet, and to do it differently each year to “plant it deep in your guts instead of your head.” The danger presented by other humans (or robot-like tendencies) would only be a sub-set of dangers that the warning is meant to cover.

> [ … snip whether or not Heinlein ever discussed the question of
> “robots” with Ginny — what does that prove? The absence of evidence
> isn’t evidence of absence. The topic never came up. This is the guy
> you’ll recall later put her name in the Number of the Beast as an
> anagram. ]

No, you seem to miss my point. I may need to search a bit, but I was of the opinion that VH confirmed either here on AFH, or during a chat-session, that RAH had not been aware of the “stobor=robots backwards” connection until it was pointed out to him after publication. If so, I wouldn’t be so quick to dismiss the possibility that this is just coincidence, even while acknowledging his use of anagrams in tNotB some quarter century later.

>> > Also, while Asimov and Heinlein may have had their differences, I
>> > seriously doubt if Heinlein had any objections to robots as such. He
>> > rarely if ever objected to labor saving technology.
>>
>> Indeed.
>
> Indeed, what? Johann Braun carried a wonderful labor-saving piece of
> technology into the test. He lasted a few minutes. Rod Walker carried a
> knife hidden inside a bandage on his leg. Who lasted longer? Johann the
> robot of Rod the Romantic (who had an occasional glimmer of thought
> between his ears)?

Indeed, RAH rarely if ever objected to labor saving devices. In a non-survival situation the weapon chosen by Braun may have been quite handy. Its function in the story was to show what may happen if one becomes complacent in a dangerous environment, not to show anything intrinsically wrong with labor-saving technology, IMO.


Sean
RAH on Australians in _Tramp Royale_ “They think as we do, only more so.”

From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Tue, 26 Apr 2005 10:47:59 -0700

In article ,
“Sean” wrote:

> “David M. Silver” wrote in message
> news:
> > In article ,
> > “Sean” wrote:
> >
> >>
> >>
> >> > I’m with Dr. Rufo on this. Coincidences do happen, and the
> >> > robots/stobor thing looks like one to me. If Heinlein had noticed it
> >> > or had it pointed out to him before publication, he probably would
> >> > have changed it to some other coined word.
> >>
> >> Recycling some of my previous ideas on this... I originally thought
> >> Heinlein may have meant it as a deliberate cryptic message or joke,
> >> but the actual text doesn’t support it (as such).
> >
> > Maybe that’s true: yet, let’s list some characters that may not usually
> > be discussed in “schoolhouse in the sky,” because they are “walk-ons,”
> > and sooner or later disposed of.
>
>
>
> The thing is, if you limit the meaning of the warning “Beware of Stobor” to
> “beware of humans who act like robots” (or, don’t act like a robot
> yourself), you negate the actual purpose of the warning as stated by Matson
> near the end of the novel i.e. to personify the unknown unique dangers of
> any planet, and to do it differently each year to “plant it deep in your
> guts instead of your head.”

To say there is possibly or likely another level of meaning isn’t limiting it, Sean. Much of reading RAH a second or many later times turns up the “I wonder if possibly he meant _this_ as well as what is obviously stated _here_.” Sometimes, I’d submit, it turns out to be a pretty strong case that he meant both or all of the possible levels.

> The danger presented by other humans (or robot-like tendencies) would
> only be a sub-set of dangers that the warning is meant to cover.

That there are distinct subsets rather than merely the obvious set isn’t always so apparent, which is why I’m considering the point LN raised. RAH keeps having his teacher in this schoolhouse in the sky talking about the Truce of the Bear, Rod’s “romanticism,” and the out beyond the traffic lights and cops point. The point isn’t only about non-human stobor — we drive in traffic when cops are around, especially, like robots.

> > [ … snip whether or not Heinlein ever discussed the question of
> > “robots” with Ginny — what does that prove? The absence of evidence
> > isn’t evidence of absence. The topic never came up. This is the guy
> > you’ll recall later put her name in the Number of the Beast as an
> > anagram. ]
>
> No, you seem to miss my point. I may need to search a bit, but I was of the
> opinion that VH confirmed either here on AFH, or during a chat-session, that
> RAH had not been aware of the “stobor=robots backwards” connection until it
> was pointed out to him after publication. If so, I wouldn’t be so quick to
> dismiss the possibility that this is just coincidence, even while
> acknowledging his use of anagrammes in tNotB some quarter century later.
>

I remember a session or a post, but I don’t recall Ginny being as definitive in her opinion as you do of whether RAH actually was surprised or not at the acronym’s presence. Even husbands are sometimes a bit deceptive toward their wives (Except for me of course, for I always tell my wife 100 % absolute truth. Just ask my sister. I used to win the hatchet every Washington’s Birthday. Honest George they call me.)

> >> > Also, while Asimov and Heinlein may have had their differences, I
> >> > seriously doubt if Heinlein had any objections to robots as such. He
> >> > rarely if ever objected to labor saving technology.
> >>
> >> Indeed.
> >
> > Indeed, what? Johann Braun carried a wonderful labor-saving piece of
> > technology into the test. He lasted a few minutes. Rod Walker carried a
> > knife hidden inside a bandage on his leg. Who lasted longer? Johann the
> > robot of Rod the Romantic (who had an occasional glimmer of thought
> > between his ears)?
>
> Indeed, RAH rarely if ever objected to labor saving devices. In a
> non-survival situation the weapon chosen by Braun may have been quite handy.
> Its function in the story was to show what may happen if one becomes
> complacent in a dangerous environment, not to show anything intrinsically
> wrong with labor-saving technology, IMO.

The schoolhouse in the sky test, as all good tests are, was to test brains, not equipment, not rote, not strength, nor even marksmanship. Braun’s “labor”-saving device indicated a preference for mechanical brawn over brains — he ‘thought’ he was just going to mechanically bully or use brute strength to overcome that array of animals he thought was set out as mere game to test him (or his mechanically aided marksmanship), animal and non-animal alike, as if he were an occupying force of panzer grenadiers, just as the sound of his name suggests. It was a ‘saving’ device; but it didn’t save labor; it really “saved” thought, something Braun probably wasn’t quite as equipped to do as others; and so he compensated — and Braun’s brawn-saving was pennywise and pound foolish in the circumstances. Johann and his two Thunderbolt-wielding aides didn’t quite make the cut. He died. He didn’t last even an hour.

Heinlein showed in this story that a device which “saves” too much ‘labor’ can kill you. Why send Rod forth with the knife, if you’re not trying to make that point? If they’d all been nine feet tall and covered with hair they’d probably all have died as Braun, who tried to make himself that, did.


David M. Silver
https://www.heinleinsociety.org

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: “Sean”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 20:59:19 +1000

“David M. Silver” wrote in message
news:
> In article ,
> “Sean” wrote:

>> The thing is, if you limit the meaning of the warning “Beware of Stobor”
>> to
>> “beware of humans who act like robots” (or, don’t act like a robot
>> yourself), you negate the actual purpose of the warning as stated by
>> Matson
>> near the end of the novel i.e. to personify the unknown unique dangers of
>> any planet, and to do it differently each year to “plant it deep in your
>> guts instead of your head.”
>
> To say there is possibly or likely another level of meaning isn’t
> limiting it, Sean. Much of reading RAH a second or many later times
> turns up the “I wonder if possibly he meant _this_ as well as what is
> obviously stated _here_.” Sometimes, I’d submit, it turns out to be a
> pretty strong case that he meant both or all of the possible levels.

I probably overstated my opinion, because I agree with what you say above. That doesn’t mean I think there is any direct evidence in the text (or elsewhere) that RAH used the word “stobor” as a cryptic warning against robots or robot-like behaviour in humans. I grant it is *possible*, as almost any assertion can be.

 

> I remember a session or a post, but I don’t recall Ginny being as
> definitive in her opinion as you do of whether RAH actually was
> surprised or not at the acronym’s presence. Even husbands are sometimes
> a bit deceptive toward their wives (Except for me of course, for I
> always tell my wife 100 % absolute truth. Just ask my sister. I used to
> win the hatchet every Washington’s Birthday. Honest George they call me
>.)

Well, I’ve had a search through various chat-logs and posts on AFH. All I can find is a remark from Jane Davitt some years ago in which she seems to remember “someone” saying that RAH was not aware of the stobor-robots connection until later. I don’t know who that “someone” is, so as any kind of evidence goes it is pretty weak.

Oh, and technically “robots” would be a reversal of “stobor”, not an acronym, and not even a palindrome.
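Sean’s distinction here (reversal vs. acronym vs. palindrome, plus the anagram claim in the opening post) is easy to verify mechanically. A purely illustrative Python sketch; the function names are mine, not anything from the thread:

```python
# "stobor" is "robots" reversed, it is not a palindrome, and
# "Beware the stobor" is a letter-for-letter anagram of "A Hebrew et robots".
from collections import Counter

def is_palindrome(word: str) -> bool:
    """A palindrome reads the same forwards and backwards."""
    return word == word[::-1]

def is_anagram(a: str, b: str) -> bool:
    """Two phrases are anagrams if their letters (ignoring case,
    spaces, and punctuation) occur with the same multiplicities."""
    letters = lambda s: Counter(ch for ch in s.lower() if ch.isalpha())
    return letters(a) == letters(b)

print("stobor"[::-1])                                         # robots
print(is_palindrome("stobor"))                                # False
print(is_anagram("Beware the stobor", "A Hebrew et robots"))  # True
```

An acronym, by contrast, can’t be tested by string manipulation at all; it would require an expansion ("S.T.O.B.O.R." standing for something), which nothing in the novel supplies.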

 

>> Indeed, RAH rarely if ever objected to labor saving devices. In a
>> non-survival situation the weapon chosen by Braun may have been quite
>> handy.
>> Its function in the story was to show what may happen if one becomes
>> complacent in a dangerous environment, not to show anything intrinsically
>> wrong with labor-saving technology, IMO.
>
> The schoolhouse in the sky test, as all good tests are, was to test
> brains, not equipment, not rote, not strength, nor even marksmanship.
> Braun’s “labor” saving device indicated a preference for mechanical
> brawn over brains — he ‘thought’ he was just going to mechanically
> bully or use brute strength to overcome that array of animals he thought
> was set out as mere game to test him (or his mechanically aided
> marksmanship), animal and non-animals alike, as if he were an occupying
> force of panzer grenadiers just as the sound of his name. It was a
> ‘saving’ device; but it didn’t save labor; it really “saved” thought,
> something Braun wasn’t quite as equipped to do, probably, as others; and
> so he compensated — and Braun’s brawn-saving was pennywise and pound
> foolish in the circumstances. Johann and his two Thunderbolt-wielding
> aides, didn’t quite make the cut. He died. He didn’t last even an hour.
>
> Heinlein showed a device that “saves” too much ‘labor’ can kill you in
> this story. Why send Rod forth with the knife, if you’re not trying to
> make that point? If they’d all been nine feet tall and covered with hair
> they’d probably all died as Braun, who tried to make himself that, did.

The schoolhouse in the sky test was to test “survival”, of which “brains” is a most important aspect. Why didn’t Heinlein send Rod into the test with *nothing*? Wouldn’t that be an even greater test of brains? The whole thing about not taking a high-powered weapon into the test was about “attitude”, and the lesson is clearly demonstrated by the demise of the arrogant Braun. To use the example of Braun as some kind of stance RAH had against labor-saving devices is really stretching it, IMO.


Sean
RAH on Australians in _Tramp Royale_ “They think as we do, only more so.”

From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 10:03:59 -0700

In article ,
“Sean” wrote:

> > Heinlein showed a device that “saves” too much ‘labor’ can kill you in
> > this story. Why send Rod forth with the knife, if you’re not trying to
> > make that point? If they’d all been nine feet tall and covered with hair
> > they’d probably all died as Braun, who tried to make himself that, did.
>
> The schoolhouse in the sky test was to test “survival”, of which “brains” is
> a most important aspect. Why didn’t Heinlein send Rod into the test with
> *nothing*? Wouldn’t that be an even greater test of brains? The whole thing
> about not taking a high-powered weapon into the test was about “attitude”,
> and the lesson is clearly demonstrated by the demise of the arrogant Braun.
> To use the example of Braun as some kind of stance RAH had against
> labor-saving devices is really stretching it, IMO.

Actually, Helen, Rod’s sister, considered the point. She noted that she usually sent her best scouts on reconnaissance with nothing but a coat of mud. The reason she agreed Rod should take a knife was that the knife was a useful low-tech tool as well as a low-tech weapon, and the test was scheduled to be of longer duration than a recon. In that case two knives were better than one, since the second could serve as a spare in the event of breakage or loss.


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: charles krin
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Sat, 30 Apr 2005 09:25:37 -0500

On Thu, 28 Apr 2005 20:59:19 +1000, “Sean”
wrote:

>
>Oh, and technically “robots” would be a reversal of “stobor”, not an
>acronym, and not even a palindrome.

Reference “R.U.R.”... ‘robot’ is a reversal of the ?Czech “Tobor”... IIRC, serf-like worker... http://tinyurl.com/dkggf

Robots have fascinated us since way before the word was coined in Czechoslovakian playwright Karel Capek’s 1921 hit, “R.U.R.” Automatons, like mechanized birds, date to ancient Greece and China, and mechanical-man dime novels became a staple after Edward Sylvester Ellis’ “The Steam Man of the Prairies” in 1868. What else, after all, is the Tin Man in L. Frank Baum’s 1900 “The Wonderful Wizard of Oz”?

ck

country doc in louisiana
(no fancy sayings right now)

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Tue, 3 May 2005 02:16:24 +0000 (UTC)

charles krin wrote in
news::

(snip)

> Reference “R.U.R”….’robot’ is a reversal of the ?Czech “Tobor”…
> IIRC, serf like worker…http://tinyurl.com/dkggf

No, the word is based on the Czech ‘robota’, which means ‘drudgery’ or ‘servitude’. It is cognate with the Russian ‘rabota’, which simply means ‘work’, and, surprisingly, it is also cognate with English ‘labor’ and German ‘arbeit’, which also means ‘work’.


David Wright Sr.

To find the end of Middle English, you discover the exact date and
time the Great Vowel Shift took place (the morning of May 5, 1450,
at some time between neenuh fiftehn and nahyn twenty-fahyv).
Kevin Wald

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Tue, 3 May 2005 02:17:43 +0000 (UTC)

charles krin wrote in
news::

See: http://jerz.setonhill.edu/resources/RUR/

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 19:40:13 -0400 Interesting quotation from _Friday_.

Georges Perrault says:

“When I was a student, I read some classic stories about humanoid robots.
They were charming stories and many of them hinged on something called the
laws of robotics, the key notion of which was that these robots had built
into them an operational rule that kept them from harming human beings
either directly or through inaction. It was a wonderful basis for
fiction… but in practice, how could you do it? What can make a self-
aware, non-human, intelligent organism–electronic or organic–loyal to
human beings? I do not know how to do it. The artificial-intelligence
people seem to be equally at a loss.”

Thought I’d throw this into the mix.


David Wright Sr.
If you haven’t joined The Heinlein Society, Why Not?
https://www.heinleinsociety.org/join.html
The Heinlein Estate is again matching new member
registrations and fund raising up to $15,000
Make your new membership count twice!

From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 18:51:11 -0700

In article ,
“David Wright Sr.” wrote:

> Interesting quotation from _Friday_.
>
> Georges Perrault says:
>
> “When I was a student, I read some classic stories about humanoid robots.
> They were charming stories and many of them hinged on something called the
> laws of robotics, the key notion of which was that these robots had built
> into them an operational rule that kept them from harming human beings
> either directly or through inaction. It was a wonderful basis for
> fiction… but in practice, how could you do it? What can make a self-
> aware, non-human, intelligent organism–electronic or organic–loyal to
> human beings? I do not know how to do it. The artificial-intelligence
> people seem to be equally at a loss.”
>
> Thought I’d throw this into the mix.

I’ve enjoyed a three-book series, Semper Mars (and two others), military SF, written by someone calling himself Ian Douglas, modeled in part I think on WEB Griffin’s writings, for years since 1998 when the first one came out.

It seems there was already developed, by another race in the past, an aware, non-human, intelligent organism, electronic or some manner of artificial intelligence. Problem is: it got rid of its inventors some eons back in another part of the universe; and it conceives it to be in its self-interest and defense to get rid of all other organic intelligences, lest it be destroyed by carbon-based units. And it’s coming to get us … before we get too far out and cannot be stopped. Whoo-haw, “with guns and knives and clubs,” it’s coming to get us! Charming, as Georges Perrault might say.


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 22:06:28 -0400

“David M. Silver” wrote in
news::

> In article ,
> “David Wright Sr.” wrote:
>

(snip)

>>
>> Thought I’d throw this into the mix.
>
> I’ve enjoyed a three-book series, Semper Mars (and two others), military
> SF, written by someone calling himself Ian Douglas, modeled in part I
> think on WEB Griffin’s writings, for years since 1998 when the first one
> came out.
>
> It seems there is already developed by another race in the past an
> aware, non-human, intelligent organism, electronic or some manner of
> artificial intelligence. Problem is: it got rid of its inventors some
> eons back in another part of the universe; and it conceives it in its
> self-interest and defense to get rid of all other organic intelligences,
> lest it be destroyed by carbon-based units. And it’s coming to get us
> … before we get too far out and cannot be stopped. Whoo-haw, “with
> guns and knives and clubs,” it’s coming to get us! Charming, as Georges
> Perrault might say.
>

In spite of the quote, RAH seemed to write about intelligent self-aware computers that were loyal to humans: Mike, Minerva, Dora. Was Perrault just a pessimist?

Asimov was definitely bent on countering the image of computers/robots as displayed in the Douglas stories you mentioned, and RAH appeared to do so more often than not, without any such artifices as the 3(or 4) Laws.


David Wright Sr.
If you haven’t joined the Society, Why Not?
https://www.heinleinsociety.org/join.html

Keep Up with the Latest
https://www.heinleinsociety.org/updates.html

Benefit The Heinlein Society by ordering books thru
http://home.alltel.net/dwrighsr/heinlein-amazon.htm

From: Oscagne
Date: Wed, 27 Apr 2005 21:12:33 -0500
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month

David Wright Sr. wrote:
> “David M. Silver” wrote in
> news::
>
>
>>In article ,
>> “David Wright Sr.” wrote:
>>
>
>
> (snip)
>
>
>>>Thought I’d throw this into the mix.
>>
>>I’ve enjoyed a three-book series, Semper Mars (and two others), military
>>SF, written by someone calling himself Ian Douglas, modeled in part I
>>think on WEB Griffin’s writings, for years since 1998 when the first one
>>came out.
>>
>>It seems there is already developed by another race in the past an
>>aware, non-human, intelligent organism, electronic or some manner of
>>artificial intelligence. Problem is: it got rid of its inventors some
>>eons back in another part of the universe; and it conceives it in its
>>self-interest and defense to get rid of all other organic intelligences,
>>lest it be destroyed by carbon-based units. And it’s coming to get us
>>… before we get too far out and cannot be stopped. Whoo-haw, “with
>>guns and knives and clubs,” it’s coming to get us! Charming, as Georges
>>Perrault might say.
>>
>
>
> In spite of the quote, RAH seemed to write about intelligent self-aware
> computers that were loyal to humans, Mike, Minerva, Dora. Was Perrault just
> a pessimist?
>
> Asimov was definitely bent on countering the image of computers/robots as
> displayed in the Douglas stories you mentioned, and RAH appeared to do so
> more often than not, without any such artifices as the 3(or 4) Laws.
>

Perhaps he was saying that it would be difficult to program those three laws in, either hardwired or coded. The examples of Heinlein’s loyal machines all were “programmed” by being raised and nurtured, not by having 3 artificial laws poked into their innards.


Oscagne, High Priest of Skeptics and Cynics
http://users4.ev1.net/~mcgrew/mss
http://users4.ev1.net/~mcgrew/webpage/home.htm
http://oscagne.textamerica.com

From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 21:43:33 -0700

In article ,
“David Wright Sr.” wrote:

> “David M. Silver” wrote in
> news::
>
> > In article ,
> > “David Wright Sr.” wrote:
> >
>
> (snip)
>
> >>
> >> Thought I’d throw this into the mix.
> >
> > I’ve enjoyed a three-book series, Semper Mars (and two others), military
> > SF, written by someone calling himself Ian Douglas, modeled in part I
> > think on WEB Griffin’s writings, for years since 1998 when the first one
> > came out.
> >
> > It seems there is already developed by another race in the past an
> > aware, non-human, intelligent organism, electronic or some manner of
> > artificial intelligence. Problem is: it got rid of its inventors some
> > eons back in another part of the universe; and it conceives it in its
> > self-interest and defense to get rid of all other organic intelligences,
> > lest it be destroyed by carbon-based units. And it’s coming to get us
> > … before we get too far out and cannot be stopped. Whoo-haw, “with
> > guns and knives and clubs,” it’s coming to get us! Charming, as Georges
> > Perrault might say.
> >
>
> In spite of the quote, RAH seemed to write about intelligent self-aware
> computers that were loyal to humans, Mike, Minerva, Dora. Was Perrault just
> a pessimist?
>
> Asimov was definitely bent on countering the image of computers/robots as
> displayed in the Douglas stories you mentioned, and RAH appeared to do so
> more often than not, without any such artifices as the 3(or 4) Laws.

Oh, I don’t know about “countering,” by RAH. Michael recorded his throwing rocks so he could replay his little game for his own amusement. What if Michael had never found a “not-stupid”? [Well, today we’re going to have a little new amusement … we’ll play a new practical joke: let’s see what happens if we introduce a little extra nitrous oxide into Level Six under Tycho. Or increase the carbon monoxide. I wonder if I’ll get that neat purple color in their skins again.]


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From: “
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: 27 Apr 2005 19:11:48 -0700

David Wright Sr. wrote:
> Interesting quotation from _Friday_.
>
> Georges Perrault says:
>
> “When I was a student, I read some classic stories about humanoid robots.
> They were charming stories and many of them hinged on something called the
> laws of robotics, the key notion of which was that these robots had built
> into them an operational rule that kept them from harming human beings
> either directly or through inaction. It was a wonderful basis for
> fiction… but in practice, how could you do it? What can make a self-
> aware, non-human, intelligent organism–electronic or organic–loyal to
> human beings? I do not know how to do it. The artificial-intelligence
> people seem to be equally at a loss.”
>
> Thought I’d throw this into the mix.
>
> —
> David Wright Sr.
> If you haven’t joined The Heinlein Society, Why Not?
> https://www.heinleinsociety.org/join.html
> The Heinlein Estate is again matching new member
> registrations and fund raising up to $15,000
> Make your new membership count twice!

You visit them when they are with their mothers and start socializing them when their eyes first open. You feed them and exercise them and teach them how to do useful things, like herding sheep or guarding your home or jumping cold and soaking wet into the bed where your lady is lazing away the morning. You joke with them and walk with them and always talk to them.

And they put you above them and you can see it in their eyes, the good ones: “If the Bear comes by, I will die for you” alternating, of course, with “You need ALL of that cheeseburger, boss?”

Maybe they aren’t intelligent enough to fit your concept but they will do for me. If they only lived longer.

Will in New Haven

“If, after the first twenty minutes, you don’t know who the sucker at
the table is, it’s you.” –Unknown

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Wed, 27 Apr 2005 22:25:19 -0400

” wrote in
news::

>
> David Wright Sr. wrote:

(snip)

>>
>> Thought I’d throw this into the mix.
>>

(snip)

> You visit them when they are with their mothers and start socializing
> them when their eyes first open. You feed them and exercise them and
> teach them how to do useful things, like herding sheep or guarding your
> home or jumping cold and soaking wet into the bed where your lady is
> lazing away the morning. You joke with them and walk with them and
> always talk to them.

>
> And they put you above them and you can see it in their eyes, the good
> ones: “If the Bear comes by, I will die for you” alternating, of
> course, with “You need ALL of that cheeseburger, boss?”
>
> Maybe they aren’t intelligent enough to fit your concept but they will
> do for me. If they only lived longer.
>

Well, I agree with you. I am just playing Devil’s advocate here. All three of my examples had caring mentors who helped form their development, although I don’t think that Mannie was really aware of what he was doing in the beginning with Mike, but just exhibited his own caring nature.

Friday is so mixed up because she missed a lot of this in early development. Presumably, “Living Artifacts” would have had even less.


David Wright Sr.
Have you ever stopped to think, and
forgot to start again?
To e-mail me, remove ‘t’ from dwrightsr

From: Fred J. McCall
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 05:03:02 GMT

”wrote:

:You visit them when they are with their mothers and start socializing
:them when their eyes first open. You feed them and exercise them and
:teach them how to do useful things, like herding sheep or guarding your
:home or jumping cold and soaking wet into the bed where your lady is
:lazing away the morning. You joke with them and walk with them and
:always talk to them.
:
:And they put you above them and you can see it in their eyes, the good
:ones: “If the Bear comes by, I will die for you” alternating, of
:course, with “You need ALL of that cheeseburger, boss?”
:
:Maybe they aren’t intelligent enough to fit your concept but they will
:do for me. If they only lived longer.

Works for me. I love my dog more than I do human beings.

You get what you give.


“The way of the samurai is found in death. If by setting one’s heart
right every morning and evening, one is able to live as though his
body were already dead, he gains freedom in The Way. His whole life
will be without blame, and he will succeed in his calling.”
— “Hagakure Kikigaki”, Yamamoto Tsunetomo

From: “
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: 28 Apr 2005 03:15:38 -0700

Fred J. McCall wrote:
> “” wrote:
>
> :You visit them when they are with their mothers and start socializing
> :them when their eyes first open. You feed them and exercise them and
> :teach them how to do useful things, like herding sheep or guarding your
> :home or jumping cold and soaking wet into the bed where your lady is
> :lazing away the morning. You joke with them and walk with them and
> :always talk to them.
> :
> :And they put you above them and you can see it in their eyes, the good
> :ones: “If the Bear comes by, I will die for you” alternating, of
> :course, with “You need ALL of that cheeseburger, boss?”
> :
> :Maybe they aren’t intelligent enough to fit your concept but they will
> :do for me. If they only lived longer.
>
> Works for me. I love my dog more than I do human beings.
>
> You get what you give.
>
> —
> “The way of the samurai is found in death. If by setting one’s heart
> right every morning and evening, one is able to live as though his
> body were already dead, he gains freedom in The Way. His whole life
> will be without blame, and he will succeed in his calling.”
> — “Hagakure Kikigaki”, Yamamoto Tsunetomo

MY dog and human beings in general? No contest. DOGS and human beings in general, no contest. There are some human beings whom I value more than any dog but damn few. But I am, to quote Mr. Silver, an uncivilized barbarian.

Of course, no dog would have stolen your tagline.

Will in New Haven

“The way of the samurai is found in death. If by setting one’s heart
right every morning and evening, one is able to live as though his
body were already dead, he gains freedom in The Way. His whole life
will be without blame, and he will succeed in his calling.”
— “Hagakure Kikigaki”, Yamamoto Tsunetomo

From: Fred J. McCall
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 13:17:29 GMT

“” wrote:

:
:Fred J. McCall wrote:
:> “” wrote:
:> :
:> :And they put you above them and you can see it in their eyes, the good
:> :ones: “If the Bear comes by, I will die for you” alternating, of
:> :course, with “You need ALL of that cheeseburger, boss?”
:> :
:> :Maybe they aren’t intelligent enough to fit your concept but they will
:> :do for me. If they only lived longer.
:>
:> Works for me. I love my dog more than I do human beings.
:>
:> You get what you give.
:
:MY dog and human beings in general? No contest. DOGS and human beings
:in general, no contest. There are some human beings whom I value more
:than any dog but damn few. But I am, to quote Mr. Silver, an
:uncivilized barbarian.

So am I. Of course, I made a considered decision to be a barbarian, after looking around at the ‘civilized’ world and seeing all the things wrong with it.

I even used to use the name ‘Barbarian’ and sign with a little axe on some discussion groups.

(>||


“Have you noticed that the most subtle shedders of blood have always
been the most civilized gentlemen? If civilization has not made man
more bloodthirsty, it has at least made him more hideously and
abominably bloodthirsty. Formerly he saw bloodshed as an act of
justice, and with a clear conscience exterminated whomever he
thought he should. And now we consider bloodshed an abomination,
yet engage in this abomination more than ever.”
— Dostoyevsky “Notes From The Underground”

From:
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 11:10:45 -0400

On Wed, 27 Apr 2005 19:40:13 -0400, “David Wright Sr.”
wrote:

>Interesting quotation from _Friday_.
>
>Georges Perrault says:
>
>”When I was a student, I read some classic stories about humanoid robots.
>They were charming stories and many of them hinged on something called the
>laws of robotics, the key notion of which was that these robots had built
>into them an operational rule that kept them from harming human beings
>either directly or through inaction. It was a wonderful basis for
>fiction… but in practice, how could you do it? What can make a self-
>aware, non-human, intelligent organism–electronic or organic–loyal to
>human beings? I do not know how to do it. The artificial-intelligence people
>seem to be equally at a loss.”
>
>Thought I’d throw this into the mix.

There are robots and robots, you might say. We have robots today, mostly on factory assembly lines. They have no self awareness though. The Luddites like to complain that these put human workers out of jobs. Well, tough. Electricity and light bulbs put whale oil lamp makers out of business.

I seriously doubt if Heinlein had any objections to these robots.

If he was going to warn about the dangers of truly self aware robots, it seems that he would have done so in a somewhat more clear manner than with the “robots/stobor” thing.

Among other things, I think Heinlein was a strong advocate of using the right tool for whatever job you are doing. You generally don’t use a hammer to cut a board, you generally don’t use a saw to drive a nail, and you generally would not use a robot to write a story. This does not mean we should forego the use of hammers, saws, or robots though.

A general warning about any and all robots makes no sense to me, in light of this. Georges Perrault’s comments make more sense, and a coincidence that Heinlein did not notice at the time makes sense to me.

From: “Big_Fella”
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Newsgroups: alt.fan.heinlein
Date: Fri, 29 Apr 2005 01:49:27 +1000

“David Wright Sr.” wrote in message
news:Xns9645C818DD842nokvamli@63.223.7.253…
> Interesting quotation from _Friday_.
>
> Georges Perrault says:
>
> “When I was a student, I read some classic stories about humanoid robots.
> They were charming stories and many of them hinged on something called the
> laws of robotics, the key notion of which was that these robots had built
> into them an operational rule that kept them from harming human beings
> either directly or through inaction. It was a wonderful basis for
> fiction… but in practice, how could you do it? What can make a self-
> aware, non-human, intelligent organism–electronic or organic–loyal to
> human beings? I do not know how to do it. The artificial-intelligence people
> seem to be equally at a loss.”
>
> Thought I’d throw this into the mix.

As we’re throwing things into the mix, I’d like to introduce someone with the brain the size of a planet, exclamation mark. Marvin the Maladroit Android, from The Hitchhiker’s Guide to the Galaxy.
:->
> —
> David Wright Sr.
> If you haven’t joined The Heinlein Society, Why Not?
> https://www.heinleinsociety.org/join.html
> The Heinlein Estate is again matching new member
> registrations and fund raising up to $15,000
> Make your new membership count twice!

From: “David Wright Sr.”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 12:51:23 -0400

“Big_Fella” wrote in news:0C7ce.281$_96.5531
@nnrp1.ozemail.com.au:

(snip)

>>
>> Thought I’d throw this into the mix.
>
> As we’re throwing things into the mix, I’d like to introduce someone with
> the brain the size of a planet, exclamation mark. Marvin the Maladroit
> Android, from The Hitchhiker’s Guide to the Galaxy.
>:-

Marvin Lives! ;0)>

http://home.alltel.net/dwrighsr/Marvin.html


The next meetings of the Heinlein Readers Group
Thursday 5/26/05 @ 9:00 P.M. EST and
Saturday 5/28/05 @ 5:00 P.M. EST
The topic for this discussion will be:
“Beware the stobor – Robots? in Heinlein”
See: https://www.heinleinsociety.org/readersgroup/index.html

From: pixelmeow
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 28 Apr 2005 14:37:03 -0400

You won’t *believe* what “David Wright Sr.”
said on Thu, 28 Apr 2005 12:51:23 -0400, in alt.fan.heinlein!!!

>”Big_Fella” wrote in news:0C7ce.281$_96.5531
>@nnrp1.ozemail.com.au:
>
>(snip)
>>>
>>> Thought I’d throw this into the mix.
>>
>> As we’re throwing things into the mix, I’d like to introduce someone with
>> the brain the size of a planet, exclamation mark. Marvin the Maladroit
>> Android, from The Hitchhiker’s Guide to the Galaxy.
>>:-
>Marvin Lives! ;0)>
>
>http://home.alltel.net/dwrighsr/Marvin.html

ROFL!!!


~teresa~
AFH Barwench

=^..^= “Never try to outstubborn a cat.” =^..^=
http://www.storesonline.com/site/rowanmystic
email my first name at pixelmeow dot com

http://pixelmeow.com/

From: “JaneE!”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: 28 Apr 2005 11:59:11 -0700

pixelmeow wrote:
> You won’t *believe* what “David Wright Sr.”
> said on Thu, 28 Apr 2005 12:51:23 -0400, in alt.fan.heinlein!!!
>
> >”Big_Fella” wrote in news:0C7ce.281$_96.5531
> >@nnrp1.ozemail.com.au:
> >
> >(snip)
> >>>
> >>> Thought I’d throw this into the mix.
> >>
> >> As we’re throwing things into the mix, I’d like to introduce someone with
> >> the brain the size of a planet, exclamation mark. Marvin the Maladroit
> >> Android, from The Hitchhiker’s Guide to the Galaxy.
> >>:- >
> >Marvin Lives! ;0)>
> >
> >http://home.alltel.net/dwrighsr/Marvin.html
>
> ROFL!!!
>
> —
> ~teresa~
> AFH Barwench
>
> =^..^= “Never try to outstubborn a cat.” =^..^=
> http://www.storesonline.com/site/rowanmystic
> email my first name at pixelmeow dot com
> https://www.heinleinsociety.org/
> http://pixelmeow.com/

Very nice.

;o)

JaneE!

From: “Big_Fella”
Newsgroups: alt.fan.heinlein
Date: Fri, 29 Apr 2005 07:53:04 +1000

“David Wright Sr.” wrote in message
news:Xns964682C7F64FBnokvamli@63.223.7.253…
> “Big_Fella” wrote in news:0C7ce.281$_96.5531
> @nnrp1.ozemail.com.au:
>
>
> (snip)
>
>>>
>>> Thought I’d throw this into the mix.
>>
>> As we’re throwing things into the mix, I’d like to introduce someone with
>> the brain the size of a planet, exclamation mark. Marvin the Maladroit
>> Android, from The Hitchhiker’s Guide to the Galaxy.
>>:->
>
> Marvin Lives! ;0)>
>
> http://home.alltel.net/dwrighsr/Marvin.html
>
> —
> The next meetings of the Heinlein Readers Group
> Thursday 5/26/05 @ 9:00 P.M. EST and
> Saturday 5/28/05 @ 5:00 P.M. EST
> The topic for this discussion will be:
> “Beware the stobor – Robots? in Heinlein”
> See: https://www.heinleinsociety.org/readersgroup/index.html

Marvin’s getting above himself now. Brain the size of the universe indeed. Where will we be next, opening doors? I ask you…
:-<)

From: “Dr. Rufo”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 12 May 2005 02:20:21 GMT

LNC wrote:
> if
> it’s word games (like backwards words) you want to play, “Beware the
> stobor” is an anagram of “A Hebrew et robots.”

I was just reviewing this and it strikes me that the Grand Master may have made a Very Serious Error here.
Stipulating (1) the anagrammatic content of the statement quoted by LN is “correct” and (2) “some” Hebrew actually *ate* (that is the “proper English construction” for the past tense of “to eat” ain’t it?) “robots.”
— I pose these questions:

Are these robots “kosher”? or “tref”? or “pareve”?

Does this affect the injunction to the students?

Is this a manifestation of some sort of subliminal anti-Semitic bias rearing its ugly head?

Rufe

From: “
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: 11 May 2005 20:03:05 -0700

Dr. Rufo wrote:
> LNC wrote:
> > if
> > it’s word games (like backwards words) you want to play, “Beware
the
> > stobor” is an anagram of “A Hebrew et robots.”
>
> I was just reviewing this and it strikes me that the Grand Master
> may have made a Very Serious Error here.
> Stipulating (1) the anagrammatic content of the statement quoted by
> LN is “correct” and (2) “some” Hebrew actually *ate* (that is the
> “proper English construction” for the past tense of “to eat” ain’t
> it?) “robots.”
> — I pose these questions:
>
> Are these robots “kosher”? or “tref”? or “pareve”?

Well, what do YOU think?

>
> Does this affect the injunction to the students?

How could it not?

> Is this a manifestation of some sort of subliminal anti-Semitic bias
> rearing its ugly head?

Isn’t it almost always?

Fortunately, I was able to give you the answers to your concerns and in the best Jewish tradition.

Will in New Haven


Do not walk behind me, for I may not lead. Do not walk ahead of me,
for I may not follow. Do not walk beside me either. Just pretty much
leave me the hell alone.

From: “Sean”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 12 May 2005 21:19:18 +1000

“Dr. Rufo” wrote in message
news:F3zge.5334$
>
>
> LNC wrote:
>> if
>> it’s word games (like backwards words) you want to play, “Beware the
>> stobor” is an anagram of “A Hebrew et robots.”
>
> I was just reviewing this and it strikes me that the Grand Master may have
> made a Very Serious Error here.
> Stipulating (1) the anagrammatic content of the statement quoted by LN is
> “correct” and (2) “some” Hebrew actually *ate* (that is the “proper
> English construction” for the past tense of “to eat” ain’t it?) “robots.”
> — I pose these questions:
>
> Are these robots “kosher”? or “tref”? or “pareve”?
>
> Does this affect the injunction to the students?
>
> Is this a manifestation of some sort of subliminal anti-Semitic bias
> rearing its ugly head?

The stobor warning was added to the recall instructions of the solo survival test, and said “Watch out for stobor”, and not “Beware the stobor”. Perhaps the only subliminal resulting from the actual quote was created when the student broke out in a cold sweat, perhaps thinking they had not been paying attention in class on the day “stobor” was mentioned, or had been sick on that particular day……. and now their life might depend on that knowledge.


Sean
RAH on Australians in _Tramp Royale_ “They think as we do, only more so.”

From: pixelmeow
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 12 May 2005 08:49:13 -0400

You won’t *believe* what “Dr. Rufo” said on
Thu, 12 May 2005 02:20:21 GMT, in alt.fan.heinlein!!!

>
>
>LNC wrote:
>> if
>> it’s word games (like backwards words) you want to play, “Beware the
>> stobor” is an anagram of “A Hebrew et robots.”
>
> I was just reviewing this and it strikes me that the Grand Master
>may have made a Very Serious Error here.
> Stipulating (1) the anagrammatic content of the statement quoted by
>LN is “correct” and (2) “some” Hebrew actually *ate* (that is the
>”proper English construction” for the past tense of “to eat” ain’t
>it?) “robots.”
> — I pose these questions:
>
>Are these robots “kosher”? or “tref”? or “pareve”?
>
>Does this affect the injunction to the students?
>
>Is this a manifestation of some sort of subliminal anti-Semitic bias
>rearing its ugly head?
>
>Rufe

“et” is also Latin for “and”… “et tu, Brute?”


~teresa~
AFH Barwench

=^..^= “Never try to outstubborn a cat.” =^..^=
http://www.storesonline.com/site/rowanmystic
email my first name at pixelmeow dot com

http://pixelmeow.com/


From: “David M. Silver”
Newsgroups: alt.fan.heinlein
Subject: Re: Notice of no meeting of Heinlein Readers Group this month
Date: Thu, 12 May 2005 06:23:18 -0700

In article ,
pixelmeow wrote:

> You won’t *believe* what “Dr. Rufo” said on
> Thu, 12 May 2005 02:20:21 GMT, in alt.fan.heinlein!!!
>
> >
> >
> >LNC wrote:
> >> if
> >> it’s word games (like backwards words) you want to play, “Beware the
> >> stobor” is an anagram of “A Hebrew et robots.”
> >
> > I was just reviewing this and it strikes me that the Grand Master
> >may have made a Very Serious Error here.
> > Stipulating (1) the anagrammatic content of the statement quoted by
> >LN is “correct” and (2) “some” Hebrew actually *ate* (that is the
> >”proper English construction” for the past tense of “to eat” ain’t
> >it?) “robots.”
> > — I pose these questions:
> >
> >Are these robots “kosher”? or “tref”? or “pareve”?
> >
> >Does this affect the injunction to the students?
> >
> >Is this a manifestation of some sort of subliminal anti-Semitic bias
> >rearing its ugly head?
> >
> >Rufe
>
> “et” is also Latin for “and”… “et tu, Brute?”

Ooops, Teresa’s just revealed our secret. Heinlein tried to, but thus far no one had figured it out. Most Hebrews have robots, right, Yisroel? We call them golem, because they’re made out of lumps of clay, purified white robes, dust, water, and a purity of purpose, rather than mere metal, glass, and electronic components, gears and such, like Arnold our current Governor was made.

http://golem.plush.org/history/


David M. Silver

“The Lieutenant expects your names to shine!”
Robert Anson Heinlein, USNA ’29
Lt.(jg), USN, R’td

From:
Newsgroups: alt.fan.heinlein
Subject: Another robot reference
Date: Mon, 02 May 2005 10:09:41 -0400

I came across another brief reference to robots. Early in Double Star, Lorenzo is being dragged along through the spaceport by Dak, and compares it to being dragged out of a danger zone by a traffic robot.

I still maintain that Heinlein never indicated any qualms about robots or AI in general, Friday notwithstanding. Ok, an AI, whether electronic or genetically engineered, might be inclined to crash a plane or SB just for the hell of it, because they are not human and never can be human. It seems to me that there is an incredibly simple answer to this problem: don’t use an AI to pilot a plane or SB. This does not mean we should not employ them for any purpose, though. Each job could be analyzed on the basis of whether or not robots are suited, considering how much damage they can do and whether or not safeguards can be implemented, then put them to work. I am reasonably certain Heinlein would have seen it this way, rather than any sort of general warning that robots are always a bad idea.

From: lal_truckee
Newsgroups: alt.fan.heinlein
Subject: Re: Another robot reference
Date: Mon, 02 May 2005 16:01:14 GMT

wrote:
>
> I still maintain that Heinlein never indicated any qualms about robots
> or AI in general, Friday notwithstanding. Ok, an AI, whether
> electronic or genetically engineered, might be inclined to crash a
> plane or SB just for the hell of it, because they are not human and
> never can be human. It seems to me that there is an incredibly simple
> answer to this problem: don’t use an AI to pilot a plane or SB. This
> does not mean we should not employ them for any purpose, though.

Seems to me that Friday’s warning on AIs and robots was misguided. Even if currently a shuttle pilot, an AI/robot intelligence potentially could have its consciousness transferred to other situations, even to androidal bodies if desired, and could theoretically be (and would know it could be) immortal. Therefore an AI/robot intelligence has much more to lose than a human with his limited lifespan, destined to die in pain and fear. I think flying in a human-controlled contrivance has serious safety issues, since the pilot is likely to decide he doesn’t want to go into the unknown alone.

From:
Newsgroups: alt.fan.heinlein
Subject: Re: Another robot reference
Date: Mon, 02 May 2005 14:23:01 -0400

On Mon, 02 May 2005 16:01:14 GMT, lal_truckee
wrote:

> wrote:
>>
>> I still maintain that Heinlein never indicated any qualms about robots
>> or AI in general, Friday notwithstanding. Ok, an AI, whether
>> electronic or genetically engineered, might be inclined to crash a
>> plane or SB just for the hell of it, because they are not human and
>> never can be human. It seems to me that there is an incredibly simple
>> answer to this problem: don’t use an AI to pilot a plane or SB. This
>> does not mean we should not employ them for any purpose, though.
>
>Seems to me that Friday’s warning on AIs and robots was misguided. Even
>if currently a shuttle pilot, an AI/robot intelligence potentially could
>have its consciousness transferred to other situations, even to androidal
>bodies if desired, and could theoretically be (and would know it could
>be) immortal. Therefore an AI/robot intelligence has much more to lose
>than a human with his limited lifespan and destined to die in pain and
>fear. I think flying in a human controlled contrivance has serious
>safety issues, since the pilot is likely to decide he doesn’t want to go
>into the unknown alone.

The only real way to know how an AI would react is to construct one and watch. Even if they would be totally amoral and quite willing to destroy themselves in order to kill humans, I stand by my position that they would still have uses and that Heinlein would have recognized this, and that “stobor” was a coincidence, not a warning about robots.
End of Postings
Here Begins the Discussion

You have just entered room “heinleinreadersgroupchat.”

Reilloc: Evening, David.

DavidWrightSr: Hi folks. Just dropped in early to make sure I got a good copy of the log.

LVPPakaAspie: Early? I thought it was supposed to start at 7?

Reilloc: Glad you did since my log-keeping skills almost always bomb.

DavidWrightSr: Nope 9:00 EDT

LVPPakaAspie: Oh.

LVPPakaAspie: This was in the post LNC put up a few days ago:

LVPPakaAspie: > WHEN: May 26, 2005, 7:00 PM EDT and May 28, 2005, 5:00 PM EDT

Reilloc: I think that’s right.

LVPPakaAspie: It is now 7:20 PM EDT.

DavidWrightSr: Well, we’ve gotten our wires crossed. 9:00 has always been the starting time.

Reilloc: 7:00 CDT

LVPPakaAspie: Maybe more people will show up then.

LVPPakaAspie: Maybe you meant that, but you posted EDT.

Cmaj7Dmin7 has entered the room.

LVPPakaAspie: Since we are apparently waiting for 8 or 9 pm edt, want a suggestion for a future chat topic?

Reilloc: Sure.

LVPPakaAspie: We are discussing Friday in a thread. That novel left a lot of unanswered questions, such as what Red Thursday was really all about and why boss wanted her to memorize the address that turned out to be Mosby and Finders Inc. cont.

LVPPakaAspie: Would the novel have been better or worse if these things had been spelled out more clearly?

Reilloc: Interesting question.

LVPPakaAspie: My own opinion is that it would have been better.

Reilloc: I’d have to read it again…

BPRAL22169 has entered the room.

BPRAL22169: Hey, for some reason I had 6:00 p.m. PDT in mind.

Reilloc: Hi, Bill.

BPRAL22169: I see both your heads are here, LNC. What a hoopy frood you must be! Really know where your towel is.

Reilloc: The reason could be that I can’t tell time and disseminated about half a dozen different times.

Cmaj7Dmin7: I’m a stobor. Ignore me.

BPRAL22169: Ah, so you’ve doubled back on yourself? Well, now you can tell people the “two faced” lawyer jokes are accurate.

Reilloc: Gentlemen, it being somewhat after the appointed hour and there being present enough warmish bodies to declare a quasi-quorum, I’m starting off this session.

Reilloc: As newcomers arrive, if newcomers arrive, we’ll absorb them into the discussion.

Reilloc: So, starting off…

BPRAL22169: Just a sec — LVPP, are you the one who has been doing those “Starship Troopers” chapter headings thing on AFH?

Reilloc: Welcome to the monthly online Heinlein Readers Group chat.

BPRAL22169: (I just saw the “aka Aspie” and it rang bells.)

LVPPakaAspie: Yes, that’s me. Maybe it is time for another one.

Reilloc: While there’s often an announced topic, the format here’s flexible enough to permit the discussing of anything Heinlein, marginally Heinlein or not at all Heinlein so long as the participants can tolerate it.

BPRAL22169: I am doing some commentaries on ST; have you kept the earlier ones? I remember reading and enjoying them, but I didn’t keep them (I should have learned by now!)

LVPPakaAspie: No, but they should be available in Google groups, and probably fairly easy to search for.

Reilloc: Bill, in your commentary on TitS, any mention of the stobor thing?

BPRAL22169: Sorry, I was unclear — I was doing commentary on Starship Troopers.

BPRAL22169: Joe Major’s 59 page commentary was a little disappointing.

Reilloc: What I mean is, when and if you do an introductory commentary on TitS, do you plan any mention of stobor?

pixelmeow has entered the room.

pixelmeow: hi!

Reilloc: Hi, Teresa

pixelmeow: what’s going on?

Reilloc: So far, not much. We’re just getting warmed over.

Reilloc: I mean “up.”

pixelmeow: “over”?

pixelmeow: 🙂

pixelmeow: I’m working on the book list page, boy is it out of shape.

Reilloc: We were waiting for you and your take on stobor, robots, or amanaplanacanalpanama

LVPPakaAspie: Have you been following the current thread about military strategies in ST, Bill?

BPRAL22169: I think it has to be mentioned, because it’s gotten a lot of attention, in a way that must have frustrated Heinlein, because what people usually talk about leads away from what it’s doing in the story. The thing people most often

BPRAL22169: seem to concentrate on is “robots spelled backwards.”

BPRAL22169: I dip into the thread occasionally — it’s not the kind of discussion (so far) that I can do much with.

pixelmeow: you know I never even considered robots spelled backwards.

LVPPakaAspie: I still think it is nothing but a coincidence.

pixelmeow: I never think about word games like that.

Reilloc: Maybe there’s nothing to it.

BPRAL22169: I think so, too.

BPRAL22169: The same as “Mota” (6th Col) is “Atom” spelled backwards.

BPRAL22169: Or “reilloc” is —

BPRAL22169: But we won’t go there.

Reilloc: Oh, go there…

LVPPakaAspie: As I have stated on the NG, I really don’t think Heinlein had anything against robots, and although they are rare in the canon, they do show up here and there.

BPRAL22169: Not me. Uh-uh.

BPRAL22169: I’m having a hard time coming up with a humaniform robot in Heinlein. Refresh my memory, please?

pixelmeow: I’m sitting here trying to come up with same.

Reilloc: I can’t think of one.

LVPPakaAspie: I don’t think any humaniform ones showed up.

pixelmeow: I’m also thinking of how glad I am that his work isn’t full of them, that’s not what I read SF for.

DavidWrightSr: The only ones I can recall were the luggage carrier types on Lanador

BPRAL22169: I think after Adam Link, Asimov pretty well covered that territory. RAH probably didn’t see any need to go there.

LVPPakaAspie: I don’t see all that much difference between a humaniform one and one that is designed for function.

Reilloc: Didn’t see a need to go or consciously avoided, Bill?

BPRAL22169: Generalist rather than special-function.

BPRAL22169: Possibly both — one thought leading to the other.

BPRAL22169: He may have thought of it as a “solved problem” now and went looking for some unsolved ones to think about.

Reilloc: Still, he didn’t shy away from artificial intelligence.

pixelmeow: Well, he went one step further, and IMO better, with his “machine people”, Gay and Dorable, and all the others.

BPRAL22169: (solved, at least in terms of definitive story treatment)

BPRAL22169: True.

LVPPakaAspie: I mentioned on the NG that traffic robots are mentioned in Double Star, when Lorenzo compares being pulled through the port by Dak to being dragged out of a danger zone by a traffic robot.

LVPPakaAspie: True. They were fully self aware intelligent robots, just not humaniform.

Reilloc: Interesting that RAH did the fictional separation of mind and body that the wisest philosophers determined couldn’t happen…

BPRAL22169: About the same time he was referring to his dishwasher and his a/c unit as a “robot appliance” in letters — I think the usage has changed since the mid-fifties.

BPRAL22169: You know, I had never thought of it in quite that fashion — it bears on Heinlein’s uneasy relationship with Platonic idealism.

Reilloc: Explain, Bill.

Reilloc: I’m not sure I understand what you mean.

Reilloc: Or not…

BPRAL22169: The mind-body split goes back in western philosophy to Plato, who spoke of the mind as a chariot trying to ascend into heaven (the ideal world) while the chariot kept trying to descend to the earth.

BPRAL22169: “Ideal” meaning the perfect ideas of which all material things are an imperfect reflection.

Reilloc: That would make me uncomfortable with the undefinable terms alone.

BPRAL22169: That’s what Diogenes the Cynic was looking for — not the perfect man, but the Platonic idea of man. That’s why he was a cynic = dog to Plato’s intellectual descendants. No respect!

pixelmeow has left the room.

BPRAL22169: Idealism got a real boost in the 1790’s when Kant put a solid analytic footing under it — his “critical idealism.” Kant’s logic was unassailable until the 20th century and highest

Reilloc: So, how close do “classic” sci fi robots come to a Platonic ideal?

BPRAL22169: prestige in philosophical thinking at the time Heinlein was being educated.

BPRAL22169: I wasn’t thinking of that in particular — it was your remark about the mind and the body split in the case of Minerva and Mycroft Holmes and others that set off that train of thought.

BPRAL22169: So that relates more to Heinlein’s AIs than to robots per se.

Reilloc: Seems to me there’s no comparison since robots might be argued to have neither minds nor bodies.

LVPPakaAspie: Mannie from Moon might disagree with that, Reilloc

Reilloc: Why do you think that, Don?

LVPPakaAspie: He thought that Mycroft had both a mind and a body.

BPRAL22169: But Mike wasn’t a robot.

LVPPakaAspie: Just not humaniform.

DavidWrightSr: Interesting Bill. One of Korzybski’s main points was the concept of mind-body organism as a whole.

Reilloc: Is a metal superstructure a body?

BPRAL22169: I think part of the idea of robot is “standalone”

BPRAL22169: Yes, David — and it’s in Ouspensky, as well — two big influences on Heinlein’s thinking.

BPRAL22169: I think the body-memory that was in both “Heil!” and “I Will Fear No Evil” come out of Ouspensky.

LVPPakaAspie: I can’t find my copy of Moon, but I think I remember at some point Mannie said that, yes, that was a body.

Reilloc: I reinstate the question: is a metal superstructure a body?

Reilloc: Can it feel like flesh feels?

BPRAL22169: It’s been an old, old argument in AI, whether you could have a mind without a body, so Heinlein may have felt it necessary to give an explicit answer to that for Mike.

Reilloc: Similarly, is artificial intelligence a mind since it’s emotionless?

LVPPakaAspie: If we came up with a truly self aware conscious AI, would it necessarily be emotionless?

BPRAL22169: And the $64,000 question: does it matter at all?

Reilloc: Mike was written self-aware but self-awareness doesn’t necessarily mean you suddenly have emotions.

DavidWrightSr: Mike certainly had emotions.

Reilloc: He did?

Reilloc: He was written to simulate emotions but did he have emotions?

BPRAL22169: I think there’s a definite answer to that one: he had a sense of humor and he had what he described as an orgasm of the mind when the strikes hit Earth simultaneously. That comes close enough to qualify IMO

DavidWrightSr: It was Mannie satisfying his loneliness that helped him develop further

BPRAL22169: AND he is known to have simulated emotion, as well.

LVPPakaAspie: If it looks like an emotion and acts like an emotion, it’s an emotion.

Reilloc: Really?

Reilloc: What’s an emotion look like?

LVPPakaAspie: All the panel lights coming on at once.

LVPPakaAspie: Or a ripple of them to simulate a chuckle.

Reilloc: As for sense of humor, Bill, there’ve been a couple of chats on that with no agreement even as to what it is.

BPRAL22169: There’s always the extensional definition: it’s what I mean when I point to it. Manny did in fact point to it for us, in a metaphorical sort of way.

DavidWrightSr: Whether a real AI could have emotions is not the point, RAH wrote it that Mike did.

BPRAL22169: It’s actually interesting that this point is coming up for discussion — for a very long time it was just assumed that if you had mimicry of intelligence, it would naturally have all the other things human beings have.

BPRAL22169: We no longer make that assumption automatically.

Reilloc: I tend to agree that it’s necessary to suspend disbelief to enjoy the novel, Davie; however, for the sake of analysis as opposed to fictional enjoyment, is it genuinely believable?

Reilloc: David

Reilloc: I don’t think I’ve ever called you Davie.

DavidWrightSr: How do we know that you or anyone else here is really intelligent?

LVPPakaAspie: There is really only one way to find out: program a truly conscious AI, then find out if it has emotions.

Reilloc: Wouldn’t you have to give it a flesh body, Don?

LVPPakaAspie: Don’t know about you, tovarisch, but I am. “)

LVPPakaAspie: I don’t see why.

BPRAL22169: I think you can believe in the presupposition of the book — whether or not AIs in general would have that capability, this one does.

Reilloc: I don’t believe that emotions are a purely mental phenomenon, are they?

LVPPakaAspie: Although having two different ones, one with a flesh body, the other purely electronic, might give us interesting insights.

BPRAL22169: I think the psychiatric definition of an emotion is something like “a thought with a somatic component.”

Reilloc: Then, Bill, you’d need flesh, no, in your emotional AI?

BPRAL22169: In other words, if it doesn’t move the body, it’s not an emotion.

LVPPakaAspie: On a somewhat related topic, how many fans of 2001: A Space Odyssey do we have here?

BPRAL22169: Or the equivalent.

Reilloc: Okay, so we’ve got Heinlein writing characters who aren’t flesh and blood humans…

LVPPakaAspie: Mike’s desire to only talk to “not-stupids” sounded like an emotion to me.

BPRAL22169: To fit that definition, sure. Some of the best sf, however, is about having to rework our definitions — Little Fuzzy, for instance.

Reilloc: But not free-standing mechanical men.

BPRAL22169: His reluctance to wound Manny emotionally sounds like emotion, too.

Reilloc: Isn’t that a glaring oversight?

LVPPakaAspie: Or The Star Beast.

BPRAL22169: Why “oversight”?

Reilloc: The possibilities are fascinating, if treated by a highly competent storyteller like Heinlein.

DavidWrightSr: Since we really have no notion of what self-awareness is or how it comes about, I agree with RAH that artificial intelligences could well have emotions. Wye Knott?

Reilloc: I would tend to disagree since AI’s can’t “feel.”

BPRAL22169: Well, there is a bottom line here: Heinlein was interested in what he was interested in and not interested in things he was not interested in. That is, to some degree, an imponderable.

Reilloc: Question.

BPRAL22169: My bottom line is that I would rather have had him writing about things he was interested in.

Reilloc: Did Mike have beliefs?

BPRAL22169: The alternative is too awful to consider.

BPRAL22169: (sorry, that’s alternative to writing about what he was interested in, not alternative to beliefs)

Reilloc: Yes, Bill.

Reilloc: But did Mycroft Holmes or Adam Selene have beliefs?

BPRAL22169: What do you mean by “beliefs”?

LVPPakaAspie: That is a good question.

BPRAL22169: He acted like he believed in loyalty to his friends. Does that qualify?

LVPPakaAspie: I think he did. He had a belief that the lunar revolution was a good idea. As to exactly WHY he believed this, that is certainly open to discussion, but he did believe it.

Reilloc: Did Mike accept certain things without the necessity of formal proof?

LVPPakaAspie: Maybe he thought it was a king sized joke, and believed that jokes were fun, maybe he believed that helping his first friend Mannie was worthwhile, maybe he believed that food riots in a few years would hurt his first friend.

LVPPakaAspie: Whatever, he believed in the revolution.

Reilloc: You say he “believed,” eh?

Reilloc: Is wanting to make somebody happy by doing something the same thing as believing in it?

LVPPakaAspie: It means that you believe making that person happy is worthwhile. It might or might not mean belief in other things.

Reilloc: I believe you’re mixing “concluding” and “believing.”

Reilloc: Would Mike believe in god?

BPRAL22169: We know Mike accepted “certain things without the necessity of formal proof.” One of the first things we learn about him is the Multi-Evaluation part of his design — he comes to conclusions on inadequate information

BPRAL22169: — that is, information inadequate to support a formal proof.

LVPPakaAspie: Possibly, but I’m not sure I understand what you are getting at. Feel free to expand on it.

LVPPakaAspie: I cannot imagine Mike being anything but an agnostic. Insufficient evidence, even for a High-Optional, Logical, Multi-Evaluating Supervisor.

Reilloc: Are you saying the smarter you are the more likely you are not to believe in a supreme being?

LVPPakaAspie: Not smarter, more logical.

LVPPakaAspie: They can be the same thing, but not necessarily.

BPRAL22169: I think there is a correlation between atheism and effective degree of education.

BPRAL22169: The more critical thinking you do, the less likely you are to believe in a Supreme Deity.

LVPPakaAspie: Or, at the very least, the less likely you are to think you have the one and only Right Answer.

BPRAL22169: Well — if the colleges are any guide, I’m not too sure about that one.

Reilloc: What’s critical thinking, Bill?

LVPPakaAspie: Keep in mind there are two forms of atheist: the sort who insists that there is no god, and the sort who does not know but considers it to be unlikely.

Reilloc: Is that pure intellectualization divested of emotionalism?

BPRAL22169: generating adequate hypotheses to cover an event you’re looking at and weighing the alternatives to come to a “best guess” at the most likely.

BPRAL22169: That’s how I would functionally define critical thinking. Not a formal definition, of course.

Reilloc: Then many of those hypotheses have to include emotionalism, you’re saying?

BPRAL22169: It’s certainly something I would include in the field of data.

Reilloc: That’s prudent, since to do otherwise suggests that mind and body can be separated.

BPRAL22169: But a lot depends on how you figure things are related to other things. You can take positions on whether emotions are necessarily connected to intelligence, but they are dialectical positions, not matters of evidence

LVPPakaAspie: Most religions are quite insistent that mind and body can be separated.

BPRAL22169: (so far, at least)

Reilloc: They are, Don?

Reilloc: I’m unfamiliar with them.

Reilloc: Which ones are those?

LVPPakaAspie: Christians say that the spirit lives on after the physical body dies.

LVPPakaAspie: Even if you are not one, you HAVE to be familiar with that religion, at least.

BPRAL22169: Actually, no. The New Testament insists that at the Rapture the physical bodies will be restored.

Reilloc: So, you’re saying the Christians say that the spirit’s purely an intellectual creature?

Reilloc: I’ve never heard that before.

LVPPakaAspie: Ok, then Jews say it.

BPRAL22169: The idea that the spirit can survive the body is a heresy, not orthodox Christianity.

Reilloc: I’ve never heard that, either.

DavidWrightSr: Whoa. play that one over again Bill.

BPRAL22169: It’s the basis on which necromancy is condemned in the New Testament.

LVPPakaAspie: I don’t know if they say that it’s purely intellectual, just that death of the physical body is not The End for a believer.

Reilloc: Okay, guys.

BPRAL22169: Ah, but that’s a different proposition, LVPP — not survival of the spirit without body.

Reilloc: Let’s take a brief break so my body can do something my mind thinks it ought to do.

Reilloc: About five minutes and reconvene?

BPRAL22169: Was that sufficient additional material, David?

LVPPakaAspie: Ok with me.

Reilloc: Naturally, continue to talk if you like and I’ll catch up when I get back.

BPRAL22169: fine by me.

BPRAL22169: The technical definition of “soul” in Catholic theology is “the animating principle of the body.” Therefore, no body, no soul.

Reilloc: Roll call

LVPPakaAspie: I’m here

DavidWrightSr: Here also.

LVPPakaAspie: Among many other things, I assert that the Christian Bible is so vague, badly written and inconsistent that no one can even figure out what it means.

Reilloc: Would a robot bible make more sense?

Reilloc: Full of robot beliefs?

LVPPakaAspie: I guess I would have to see one to have an opinion.

Reilloc: Would you have faith in what a robot believes?

LVPPakaAspie: It would depend on what evidence the robot presented.

Reilloc: Evidence?

BPRAL22169: Asimov wrote an intriguing story about a robot that came to believe in the Creator God.

Reilloc: Beliefs are what you accept without proof.

LVPPakaAspie: I find it convenient to accept certain postulates without proof. I try to keep these to a minimum.

LVPPakaAspie: God (Christian or otherwise, singular or plural) is not one of the postulates I find convenient to accept.

Reilloc: So, as it relates specifically to Heinlein AI, what did Heinlein robots believe?

Reilloc: Anything?

Reilloc: Whatever he wrote them to believe and nothing more?

Reilloc: Did Asimovian robots have beliefs?

BPRAL22169: Apparently Minerva believed she was a human being.

DavidWrightSr: They are creatures of his imagination. How could they not believe what he wrote them to believe?

LVPPakaAspie: Any fans of either Battlestar Galactica series? Those robots believed rather firmly that humans should be exterminated.

Reilloc: If you guys can suspend disbelief for a minute, presume that “stobor” meant “robots.”

LVPPakaAspie: We come back to: construct an AI and find out what it believes (if anything)

Reilloc: Given that it did, what’s to “watch out for?”

Reilloc: Maybe, watch out for the trap of pathetic fallacy?

BPRAL22169: Look out for backward robots?

DavidWrightSr: Totally no relation other than coincidence. There were no robots on Tangaroa; there were dangers that they had to watch out for. I see nothing beyond a way to make them aware of that danger on a personal level.

BPRAL22169: Give them something to focus on? Something not too definite?

LVPPakaAspie: I’m with DWSr.

LVPPakaAspie: I’m also with Bill, since Deacon Matson pretty much spelled that out.

BPRAL22169: Keep the edge up.

DavidWrightSr: Interesting. I happen to be watching the Making of ‘I Robot’ as we speak.

BPRAL22169: Does the same thing for the group(s) that his sister’s remarks about the spare knife do for Rod.

Reilloc: It would appear (1) that there’s consensus that “stobor,” however rearranged, is a dead issue; and, (2) the proof’s in the opinions and the turnout.

DavidWrightSr: The scriptwriter just told the story that Eando Binder wrote about the ’emotional’ robot

BPRAL22169: Adam Link?

DavidWrightSr: Yes.

DavidWrightSr: He is doing the history of stories about robots.

BPRAL22169: That was a story that figured awfully large in the horizons of sf readers in the early 1940’s

Reilloc: What channel, David?

DavidWrightSr: DVD. We bought it this evening. My wife is watching the movie. I’ve got the second DVD here and am watching it.

Reilloc: So, guys, if this topic’s academic, I don’t want to insist that it be further explored.

BPRAL22169: I think the consensus is that it’s coincidental.

BPRAL22169: but the reason for telling them to watch out for Stobor might not be played out.

Reilloc: k…

LVPPakaAspie: Someone at some point said that Ginny Heinlein confirmed that Robert told her it was a coincidence, but I was unable to find anything in Google groups.

BPRAL22169: For example: I’d advance the thesis that their program directors wanted them to think there was something more dangerous than they were out there.

BPRAL22169: Why? Haven’t they just been telling their students man is the top predator?

Reilloc: Okay…

DavidWrightSr: To deflate that feeling that Helen talked about? To help them grow eyes in the back of their heads.

LVPPakaAspie: Deacon Matson told them that the most dangerous predator around was man himself. That does not make him the only dangerous predator, though.

Reilloc: You suggesting that there’s “man” and then there’s something more dangerous?

Reilloc: Or something besides the here and now?

LVPPakaAspie: No, a predator can be less dangerous than man, but still dangerous.

BPRAL22169: I think maybe it might have opened up the question.

DavidWrightSr: Overconfidence is the killer. Look at Braun

BPRAL22169: That does seem to run through the book.

LVPPakaAspie: THAT is a good point, DWSr.

BPRAL22169: And I think another interesting point is that Heinlein wanted it to be a question for them — and presumably also for the readers.

BPRAL22169: We’re still talking about it 50 years later.

LVPPakaAspie: I am a little skeptical that Deacon Matson would not have told the students pretty much what Helen told Rod about guns, but sometimes things have to be subordinated to telling the story.

DavidWrightSr: Wasn’t that the point of the Truce of the Bear comment?

BPRAL22169: It’s a very curious thing to find in the kind of fiction that’s supposed to be gratifying adolescent power fantasies.

BPRAL22169: I think it’s a very good indicator of how far science fiction had moved from its origins in pulp by that time.

RichardFctn has entered the room.

Reilloc: Hi, Rich.

Reilloc: You’re just in time to talk about robots in Heinlein.

RichardFctn: Hello

Dehede011 has entered the room.

Reilloc: Hi, Ron.

LVPPakaAspie: People are showing up about the time we were discussing adjournment.

LadyS122 has entered the room.

LadyS122: hello

Reilloc: Hi, Lady.

Dehede011: Hi folks,

Reilloc: Pull up a chair and talk about robots in Heinlein or not robots in Heinlein or any damn thing you like.

BPRAL22169: Oh, Ron, you’ve gone bright yellow!

DavidWrightSr: My apologies to everyone. I got so involved in the chat and simultaneously watching a DVD that I forgot to look out for people and invite them in.

LadyS122: 🙂

Dehede011: Bright yellow??

BPRAL22169: On my screen.

Reilloc: I’ll bring you up to date.

Dehede011: Great

Dehede011: The Stobor is Freud’s Id or Odd??

Reilloc: So far, everybody says my making this a topic — that in Tunnel in the Sky there’s a line that says, “watch out for the stobor,”…

Dehede011: Yes

Krin135 has entered the room.

Reilloc: …was my making a topic pretty much out of not much.

Reilloc: Hi, Doc.

Reilloc: How the hell have you been?

Krin135: evening all…

BPRAL22169: now, now, only if you take just the stobor – robots part of the topic.

LVPPakaAspie: Actually, there is something I had intended to bring up but forgot until now. Heinlein tended to return to certain themes. He was not one to make one cryptic reference to something he considered important, then drop the matter.

Dehede011: It was one of Heinlein’s oft repeated cautions to his readers.

Krin135: busier than a one armed paper hanger…haven’t even had time to be on line much

Reilloc: Haven’t seen you posting much.

BPRAL22169: He also didn’t make single references in the book — it’s almost always followed up with another hit at the same idea.

Reilloc: You got an opinion about robots, stobor or anything thereabouts?

BPRAL22169: He also has a habit of telling you something then showing you in scene. Very characteristic duple presentation.

Reilloc: It’s a sales technique.

BPRAL22169: wouldn’t surprise me.

Reilloc: Tell them what you’re going to tell them, tell them and then tell them what you told them.

Dehede011: Yes, Bill, and I get the sense of being warned about the Stobor repeatedly.

Dehede011: BRB phone

LVPPakaAspie: He thought individual freedom was important, and returned to it over and over. There are other things that can be pointed out in this regard.

BPRAL22169: I heard it as a principle of persuasive writing, in almost exactly those same words

LVPPakaAspie: Is it even remotely possible that he would make a single cryptic reference to robots, and then drop the matter? I don’t think so.

BPRAL22169: At the end, the kids still don’t “get” it, though — they think the dopey joes were stobor.

starfall2 has entered the room.

Krin135: that is also something taught in sales and persuasive writing…

Krin135: make the point in the opening paragraph

Reilloc: So, if that’s true, what’s with the stobor?

Krin135: develop it in the middle, and then refresh it in the closer

Reilloc: Where’s the “telling them what he was going to tell them” and the “telling them what he told them”?

BPRAL22169: “Be alert for the unknown”

Krin135: I agree, BP…

Reilloc: Hi, Star..

Reilloc: Finals over?

starfall2: hi

starfall2: yep, for a while now.

Dehede011: There used to be a saying in flight school. It was a caution. Something about “Head up and locked, eyeballs caged.”

BPRAL22169: The colony was just getting to the stage where that particular lesson was getting a little stale — so it’s probably just as well that they were rescued round about then.

starfall2: my last final was may 4

Krin135: caught that the second or third time I read it…just a general warning

Dehede011: Hi, Starfallen

Krin135: dancing rodents, Star

starfall2: thanks ^_^

Krin135: Dehede…I’m trying to remember that one…

Reilloc: Participants, the topic is, kinda/sorta, robots in Heinlein.

BPRAL22169: (oh, don’t be too quick with the thanks — dancing rodents can mean black plague, too…)

Reilloc: Feel free to say anything about anything about anything about that.

Dehede011: That was 1955 for me Krin

Krin135: IIRC, having one’s head up and locked wasn’t always good…if it meant that you lost situational awareness in the cockpit

starfall2: yeah, but human diseases have a hard time traveling through my computer screen 🙂

BPRAL22169: We spent some time talking about Artificial Intelligences earlier on.

Krin135: chuckle…my flight time was 1980 time frame

Dehede011: Anyway, I constantly hear Heinlein warning us against going “Head up and locked, eyeballs caged.”

Dehede011: Navy??

Krin135: Army Dustoff

Dehede011: Did you hear the Navy version of the Air Force Fight Song??

Reilloc: There was quite a discussion about AI the first hour.

Krin135: nope, can’t say I did

Krin135: what was the outcome, Reil?

Dehede011: “Off we go into the wild blue yonder — CRASH”

Krin135: ah…

Reilloc: It included such things as, “do robots have feelings,” “do robots have beliefs,” and “would you accept a robot’s beliefs as your own?”

Krin135: the comments that the Navy/Marine jocks used to make to the Zoomies in my hearing

starfall2: i’ve gotta go for a little while… any idea how long this is going to last?

Krin135: was more on the order of “Flare to land, squat to….”

Reilloc: Until 11:00 EDT, star, as long as people are here to talk.

Dehede011: Very good, they didn’t let down the side just because I left. LOL

starfall2: sounds good… i’ll be back well before then

starfall2: later!

starfall2 has left the room.

Krin135: maybe, possibly and I’d like to hear the equivalent of ‘angels dancing on a pin’ before I’d accept

Reilloc: Well, Doc, we never did hear that.

Krin135: I have definitely owned/cared for mechanicals that had a personality

Krin135: and responded better to praise than to criticism

Reilloc: We did hear some stuff about having to see the robot’s evidence before accepting the robot’s beliefs.

Dehede011: Yes, they don’t call planes “she” for no reason.

Reilloc: Even though beliefs seem to be things that don’t require evidence.

Krin135: hell, I don’t even need evidence…I’d love to see the argument

Reilloc: Argument?

Reilloc: What’s the argument for the existence of a supreme being?

BPRAL22169: Asimov did that in one of his I, Robot stories.

Krin135: such as ‘how many angels can dance on the head of a pin’

BPRAL22169: (both examples that can be seen in full in St. Thomas Aquinas’ Summa Theologiae)

LVPPakaAspie: I require evidence for EVERYTHING that I believe. “My life is easier and more comfortable if I believe X” might be sufficient evidence.

Krin135: faith and belief both are unique to human experience in that they can be had *absent* any hard evidence

Reilloc: Right, Don…

BPRAL22169: I think one of the conscious lessons Heinlein taught was that your life will be more comfortable and convenient if you ACT as though you believed x, y, and z.

Dehede011: The Sufi as I understand them use the senses as a basis for believing in a Supreme Being.

Krin135: and even *contrary* to verifiable evidence…vice Kansas board of Education and ‘creationism’

Reilloc: Do robots have any senses, Ron?

LVPPakaAspie: For example, I really don’t have any “evidence” that keeping the temperature of my home around 70 F is a good thing. It makes my life easier and more comfortable though, so I do so.

Dehede011: I don’t know.

Reilloc: Does Don have senses?

Krin135: actually, Aspie, that *is* a form of evidence

Reilloc: He says that 70’s the way to go.

Dehede011: I think the present opinion is that we have not achieved AI

Reilloc: That’s too cold for me in the summer and sometimes too hot in the winter.

Reilloc: He says it’s his “belief” but it sounds more like a physical sensation.

Krin135: agreed, Reil…now I just need to figure out how to keep Ma Cherie comfortable outside that temp

LVPPakaAspie: What is the evidence that making my life easier and more comfortable is a good thing? I don’t have any, I just do it.

Dehede011: What was Heinlein’s preferred temp in TEFL? I’ve forgotten

Reilloc: Bill?

BPRAL22169: Yo?

Krin135: Aspie, if keeping your house at that temp makes you more comfortable, and hence more efficient

Reilloc: Room temp at Boondock?

Dehede011: On LL’s tramp ship.

Krin135: then you are providing *evidence* that *for YOU* in that environment

BPRAL22169: Oh, I think it was around 72, but there were times when he bumped it to 80 for comfort

Krin135: 70 is best

BPRAL22169: onboard Dora, I think.

DavidWrightSr: He mentioned a temperature in the story of the twins, but I forget what it was.

Reilloc: I like it really cold inside in the summer and hot inside in the winter.

Dehede011: But he went nude very often

BPRAL22169: I think he particularly warmed the delivery room in the tale of the Twins

DavidWrightSr: He dropped the temperature to encourage exercise.

BPRAL22169: Right.

Krin135: or just in a kilt…which is much the same

LVPPakaAspie: Why is my being more efficient a good thing? At some point, SOMETHING has to be accepted as a postulate with no evidence required. I still try to keep those postulates to a minimum.

Reilloc: So, as far as senses go, there’s no senses consensus among us, much less about whether robots could have them and, if so, how it might feel?

Krin135: because inefficiency in and of itself leads to an increase in chaos and hastens the heat death of the universe

Krin135: mmm…

Reilloc: We’ve got new people in the room.

Krin135: robots are currently being made to accept feedback and to have primitive reflexes

Reilloc: You guys who weren’t here the first hour, why no free-standing robots as in “mechanical men” in Heinlein?

Reilloc: Opinions?

Krin135: I suspect that the problem is that we have no mechanos which are sufficiently complex to go even as far as a lizard brain

Reilloc: Oversight?

Reilloc: Intentional?

LVPPakaAspie: And we get to “Why is an increase in chaos and hastening of the heat death of the universe any sort of bad thing?” No matter WHAT you can come up with, I can ask why it is a good or bad thing.

Reilloc: Just not enough time to cover all the scifi topics?

LVPPakaAspie: SOMETHING has to be accepted as a postulate at SOME point.

Krin135: I’m of the opinion that he saw what IA was doing with it…and figured that if RAH tried, there would be the inevitable comparisons

Krin135: and RAH wanted to keep to new ground as much as possible.

Reilloc: Really, Doc? I like that.

Dehede011: BTW, what was it that LL said about what to do when national IDs became a requirement.

Reilloc: It’s consistent with what I posted as a teaser for the chat.

Krin135: The other point is RAH repeatedly pointed out that he was writing *speculative fiction*

Dehede011: I just noticed that the US and Britain are wanting to issue the same ID cards.

Krin135: and at the time he was doing much of his formative writing, *everyone* except maybe Doc Smith was working with robots and IA of some sort

Reilloc: During the first hour, the mind-body conundrum was discussed in some detail.

BPRAL22169: Doc — this IA is Artificial Intelligence instead of Isaac Asimov?

Reilloc: Any possibility that RAH saw the problematic nature of mechanical characters and avoided them intentionally?

Krin135: good point Bill

Krin135: yes, AI and not IA

Reilloc: Not Iowa to the post office?

BPRAL22169: Touch of dyslexia there? Me, too.

Krin135: Dehede, have you also noted the rise of the Armageddonists?

BPRAL22169: I can’t recall Heinlein ever remarking on problematical nature of humaniform robots.

Krin135: eh, Reil…I don’t think that was the problem.

Dehede011: Yes, to some extent.

Reilloc: Watch out for the stobor.

LadyS122: but he had Friday display a distrust of their motives.

Krin135: but one of the things that almost everyone but IA missed, was the rapid advancement

Krin135: almost compound improvement

Krin135: in electronics

Reilloc: What’s Moore’s Law imply for AI?

LVPPakaAspie: Not familiar with Moore’s Law?

Reilloc: Yes, I am.

Krin135: mmm…comparing the most advanced multiprocessors widely available today with what we know of neuroanatomy

Krin135: I’d SWAG for at least another 100 years and at least *two* major breakthroughs in complexity

Reilloc: Doing that comparison, Doc, and knowing the Moore’s 18-month time frame, has anybody calculated the time by which processor speed will approximate neuroprocessor speed?

DavidWrightSr: I believe Moore’s law said that computers would double in power every 18 months.

Krin135: possibly with the use of superconducting semiconductors.

Krin135: I’ll be back in a bit…have to make a run for Ma cherie

Krin135: will leave the screen up and catch up when I get back

Reilloc: k

Dehede011: Tell her to hitchhike home??

Dehede011: LOL

Krin135: actually, it was that processor *complexity* would double around every 18 mos

Dehede011: I pulled that on a buddy of mine the other night — his reply wasn’t even polite or nothing.

Reilloc: Processor speed.
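The calculation Reilloc asks about can be sketched in a few lines. The throughput figures below are illustrative assumptions only (neither number appears in the chat); the 18-month doubling is Moore’s law as David states it here:

```python
import math

# Rough figures, assumed for illustration (not from the chat):
# a mid-2000s desktop CPU at ~1e10 operations/second, and a commonly
# cited ballpark of ~1e16 operations/second for the human brain.
cpu_ops = 1e10
brain_ops = 1e16

# Moore's law as stated in the chat: capability doubles every 18 months.
doubling_years = 1.5

doublings_needed = math.log2(brain_ops / cpu_ops)
years = doublings_needed * doubling_years

print(round(years, 1))  # → 29.9, i.e. about 30 years under these assumptions
```

Change either assumed figure by an order of magnitude and the answer moves by only about five years, which is why such estimates cluster within a few decades of each other.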

Reilloc: Guys.

Reilloc: It’s on the hour now.

DavidWrightSr: Break time?

Reilloc: Five minute break for leg stretching and such.

Reilloc: Keep talking, of course, if you like and I’ll be back in five.

BPRAL22169: Minsky estimates 20 years.

LVPPakaAspie: There has to be a limit. We may be close to that limit, or we may be far away from it, but speed of light will stop us if nothing else does.

BPRAL22169: But I reply: the estimate is based on false premises: Turing’s original 1950 speculations, which don’t seem to be backed up by neuroscience any longer.

BPRAL22169: In short, the development of AI may have nothing whatsoever to do with processor speed or complexity, or mass storage regimes.

Dehede011: Then in that case what might it depend on??

BPRAL22169: organization.

BPRAL22169: In 1950 the view of the brain was as of a more or less amorphous mass with some few areas of specialization. The view since then is one of fantastic degrees of organization.

BPRAL22169: And from my direct experience I know that some aspects of language modeling which are thought to require supercomputers can be modeled instead on a TRS-80

BPRAL22169: — if you have the right organization of the materials.

LVPPakaAspie: I agree that software will be FAR more important than hardware in developing a truly self aware AI.

Dehede011: Okay, so in essence you are saying that we may be waiting on the jellyware capable of doing a breakthrough in organization for the computer???

BPRAL22169: (and are willing to put up with 2 minute parse times per sentence — admittedly a significant limitation)

BPRAL22169: No, just a change in theory from Chomsky’s structuralism to something that is easier to handle — particularly something that invokes complexity theory.

BPRAL22169: The model I’m currently working with is based on a Sierpinski carpet.
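For readers who haven’t met the figure Bill names: a Sierpinski carpet is a fractal made by tiling a square three-by-three and deleting the center tile, recursively. The chat gives no detail of his language model, so this sketch only generates the carpet pattern itself, as a text grid:

```python
def carpet(n):
    """Order-n Sierpinski carpet as a list of strings ('#' = filled cell)."""
    if n == 0:
        return ["#"]
    prev = carpet(n - 1)
    size = len(prev)
    top = [row * 3 for row in prev]                 # three copies across
    mid = [row + " " * size + row for row in prev]  # center square removed
    return top + mid + top

for line in carpet(2):
    print(line)
```

Each recursion keeps 8 of the 9 sub-squares, so an order-n carpet has 8**n filled cells out of 9**n, which is the sort of self-similar, mostly-empty organization Bill seems to be gesturing at.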

LVPPakaAspie: A self aware AI that functions slowly will still be a self aware AI.

Reilloc: Interesting, Don.

Reilloc: I’ve never heard that.

Reilloc: Okay, guys.

Reilloc: This is the third and final hour of the monthly Heinlein Readers Group Thursday chat.

Reilloc: Here, traditionally, everybody lets down his or her hair and turns on the heat.

Reilloc: Not.

LVPPakaAspie: Possibly you have never heard of it because (a) this is the first time I have ever stated it that way and (b) I am a poker player, not a world famous scientist.

Reilloc: It’s just like the first two hours only two hours later.

BPRAL22169: LVPP– yes, in fact, there’s a lot of playing around with run-time frames in Greg Egan’s various upload stories.

Reilloc: So, let me kick off the last hour with the burning question…

Reilloc: Why did the stobor not cross the road?

Reilloc: Any takers?

Reilloc: Time’s up.

BPRAL22169: Because he didn’t want to get to the other side?

Reilloc: The answer is: because they can’t get to the other side.

BPRAL22169: Close, very close.

Reilloc: Bill, the corollary might be that they want to but can’t.

BPRAL22169: Impotent robots?

Reilloc: What’s interesting about a story about a guy who lives for 2,000 years?

Dehede011: That RAH was able to tell the story

Reilloc: What’s telling about a story about a guy who lives for 2,000 years?

BPRAL22169: Extensional definition: Point to Time Enough for Love.

Reilloc: Isn’t it about avoidance of death?

DavidWrightSr: What’s not interesting about a man who lives 2,000 years?

Reilloc: David, the 2,000 year old man wanted to stop living, as I recall.

DavidWrightSr: Unless he sits in a room and watches Reality TV the whole time 😉

BPRAL22169: J Neil Schulman raised George Bernard Shaw’s answer to that in Back to Methuselah: we finally have time to pay attention to what’s really important instead of merely urgent.

Reilloc: When the 2,000 year old man was prevented from taking his life, he found time for really important things over the span of a mere couple of years, didn’t he?

Reilloc: Now, robots.

Reilloc: That’s something else.

Dehede011: If Aubrey de Grey is right, how much will we have to extend life for people to have time for the “really important”?

Reilloc: Even if they have feelings and beliefs, can they ever get to the other side?

Dehede011: Or, like Bill with AI and Stephen Covey with his habits, will it require reorganization in folks’ habits of mind

Reilloc: And a hush fell over the room….

Reilloc: Back to the nuts and bolts of robots and Heinlein.

Dehede011: Bill can we make an argument that Heinlein was a man that learned to think about the “Really Important.”

Dehede011: ??

Dehede011: One of those things being the Stobor??

BPRAL22169: I think that describes it pretty well.

BPRAL22169: Or — try this on: he didn’t so much spend his own time thinking about what’s really important (he did, but that’s not the main thing) — it’s that he forces/induces us to do that thinking.

Reilloc: Really important, such as what?

Dehede011: I’ll agree to that. That may be the elusive intellectual discipline that he was teaching.

BPRAL22169: What’s really important to you?

Reilloc: What time is it?

Reilloc: What day is it?

BPRAL22169: That’s easy: be present.

Dehede011: Reilloc, not to play games but like thinking about “what is really important?”

Reilloc: What’s really important isn’t an abstract notion.

BPRAL22169: Now give me a hard one.

IWillFearNoTrout has entered the room.

IWillFearNoTrout: hi

BPRAL22169: Neat handle. Who is that masked man?

Reilloc: Greetings, Kilgore.

IWillFearNoTrout: sorry… forgot to change the screenname 😛

IWillFearNoTrout: this is jackie/starfall

BPRAL22169: Ha!

IWillFearNoTrout: this is my other screenname

BPRAL22169: Farmer fan as well?

Reilloc: I can read your font under this name a lot better.

IWillFearNoTrout: (not quite a masked man)

BPRAL22169: or Vonnegut?

BPRAL22169: OK: Who is that masked person?

IWillFearNoTrout: hehe

Reilloc: You got here just in time to tell us what’s “really important.”

IWillFearNoTrout: the name actually comes from a friend’s tendency to virtually attack me with an imaginary trout

Dehede011: But isn’t that first a question to be decided? What is really important?

IWillFearNoTrout: what’s really important about what?

DavidWrightSr: What’s really important isn’t an abstract notion? Explain Please.

Reilloc: I repeated what Bill and Ron said.

Reilloc: They said RAH knew how to make us think about what’s “really important.”

BPRAL22169: qualification: as opposed to telling us what was really important as a result of the thinking he did for himself.

Dehede011: Yes, and the 1st step is to decide for ourselves What is really important.

DavidWrightSr: What’s important varies with each person. He knew how to provoke each of us into examining what we think and feel.

Dehede011: Dead on, Bill

Reilloc: Half an hour left.

Reilloc: Was it really important that he not write about robots or a negligible omission?

BPRAL22169: I think this is one of the biggest mistakes made by the people who complain he went to lecturing instead of showing us what he wanted us to see — we follow his thinking as an example of how to go through the process,

BPRAL22169: not to come to the same conclusions.

BPRAL22169: Sorry, didn’t mean to step on your provocation

Reilloc: It’s fine, Bill.

Reilloc: I’d like to know what the RAH process was, though.

Reilloc: The scientific method?

BPRAL22169: He gives us lots of examples — the late books are all demonstrations

Dehede011: I seem to see a lot of General Semantics in his thinking.

DavidWrightSr: You should see the references that I have bookmarked so far.

BPRAL22169: I mean, one useful way of reading To Sail Beyond the Sunset is to follow another example of how Maureen interpreted her experience, different from the way Lazarus Long interpreted his experience of the same events

DavidWrightSr: to Korzybski that is

DavidWrightSr: I’m researching a paper on K’s influences.

Dehede011: I need to re-read Stephen Covey — I’m suddenly seeing parallels between RAH and SC

IWillFearNoTrout: i liked reading To Sail Beyond the Sunset right after reading Time Enough For Love. When I picked up TSBTS (the first time), I had no idea it was going to be Maureen’s perspective on what I had just read the week before

BPRAL22169: Cat is another one of those — waking up in a strange hotel room with a dead man in your bed is kind of a metaphor for the kind of waking up you do when you realize the world isn’t anything at all

BPRAL22169: like you thought it was ten minutes ago.

Dehede011: Yes

Dehede011: A moment of epiphany?

BPRAL22169: And you are rescued by these incredible grotesque individuals who want you to work on their agenda.

BPRAL22169: A particular kind of epiphany, perhaps

Dehede011: Yes

Dehede011: Wow, I am more convinced than ever.

LVPPakaAspie: Change of subject. LNC, maybe you should try to make it clear, both here and on the NG, just when the next chat group will meet, keeping time zones in mind.

Reilloc: It’ll be Saturday.

Reilloc: Some time in the afternoon.

Reilloc: What time’s traditional, David?

DavidWrightSr: 5:00 P.M.

DavidWrightSr: EDT

Reilloc: 5:00 EDT?

Reilloc: That’s when it’ll be, then.

Dehede011: Starting with TEFL IMHO RAH had played with all his ideas in detail and could put together the world he wanted so he could write about the really important things.

Reilloc: If that’s true, where are the robots?

Dehede011: In TEFL and thereafter they are way in the background, out of sight, maintaining the world for the flesh and bloods.

Reilloc: You think?

Reilloc: Just machines, eh?

Dehede011: An extension of Drafting Dan, etc in Door into Summer.

Reilloc: I suppose that’s possible; however, I give him more credit than that.

Dehede011: How so?

Reilloc: I think he considered the task and rejected it because of the philosophical implications.

Dehede011: ??

BPRAL22169: In the esoteric scheme of things, the robots are indeed in the background, out of sight, maintaining the world for the real people — they’re just not metal machines.

BPRAL22169: Tak?

BPRAL22169: Task?

Dehede011: Very insightful

BPRAL22169: I mean

BPRAL22169: It’s certainly true he backs away from that position quite frequently.

Reilloc: Look around this room and tell me who here is self aware and who’s not?

Reilloc: Who thinks and feels and who doesn’t.

DavidWrightSr: I don’t know about you cobber, but I know I am. O:-)

DavidWrightSr: I know where I came from, but where did all you zombies come from?

Reilloc: Now, it’s about 15 minutes before adjournment for the evening.

Reilloc: Final thoughts?

Reilloc: Comments?

Reilloc: Invectives?

DavidWrightSr: What you got planned for the next time?

Reilloc: Pejoratives?

Dehede011: Yes, this is the best session I’ve ever attended.

Dehede011: Comments, questions, snide remarks??

BPRAL22169: Your last remark caused me to think about “They” — the “automatics” that made up the world.

Reilloc: For June, David?

Reilloc: June is busting out all over.

Reilloc: I’m thinking of a nature theme for June.

Dehede011: Who are you addressing Bill?

BPRAL22169: LNC

Dehede011: thanx

BPRAL22169: Third Thursday is 17th

DavidWrightSr: There was mention of ‘automatics’ in BTH also.

BPRAL22169: Or am I thinking about Thanksgiving . . .

Reilloc: Naturally, I’m open to all suggestions, and Don made one the first hour, about topics.

Dehede011: I believe they were strongly implied in The Cat That Walks Through Walls.

Reilloc: Before I forget, thank you all for coming and your outstandingly informed and well-articulated participation.

Reilloc: It’s really easy to moderate one of these things when the participants really do participate like you guys do.

IWillFearNoTrout: and i apologize for not much participation

LVPPakaAspie has left the room.

Reilloc: You did, Jackie.

Reilloc: The order in which you read TEFL and “Sails” struck me as particularly significant.

Reilloc: Made me think about how much coincidence there is and how there’s nothing coincidental.

LVPPakaAspie has entered the room.

IWillFearNoTrout: i’ve loved those two books as a set ever since the first time i read them

LVPPakaAspie: Computer glitch there.

IWillFearNoTrout: and usually if i read one… i read the other right after it

IWillFearNoTrout: welcome back

Reilloc: Don, you just got back in time to move that we adjourn.

BPRAL22169: Which “two” books?

BPRAL22169: oh, TEFL and Sails

IWillFearNoTrout: yep

Dehede011: Every time I pick up TEFL I turn to the poem beginning the Happy Valley sequence.

Dehede011: I think that is the location. By now my copy falls open at the proper spot.

LVPPakaAspie: Not much I can do about that, since a motion to adjourn is always in order and cannot be debated. 🙂

Dehede011: But as the Sergeant at Arms will not throw us out of the room…….

Reilloc: There’ll be no throwing on my watch.

IWillFearNoTrout: that’s much appreciated; i’ve already been thrown once today ^_^

Reilloc: I will, though, request that the Honorable David Wright, Keeper of the Infernal Record, now close that record for this session of May 26, 2005.

DavidWrightSr: Log officially closed at 10:55 P.M. EDT

Reilloc: Feel free to mingle and discuss among yourselves for as long as there’s an AOL.

Reilloc: And thanks, again, all.

DavidWrightSr: Of course, that doesn’t mean that if you say something witty or profound or embarrassing, I won’t take it out O:-)

IWillFearNoTrout: hehehe

BPRAL22169: bye-bye, one and remainder…

BPRAL22169 has left the room.

Reilloc: Remember there’s a Saturday session, too.

LVPPakaAspie: At whatever time on Saturday…

Dehede011: Good night, Y’all.

DavidWrightSr: 5:00 P.M. EDT

IWillFearNoTrout: goodnight

Reilloc: Night, all, and I’ll see you next time…

Reilloc has left the room.

Dehede011 has left the room.

DavidWrightSr: I’m gone. Do svidaniya i spokoynoy nochi [Goodbye and good night]
End of Discussion