Volume 4: No. 32

Contents: Funding news; Politics and policy; Business news; Virtual reality agents; Job opportunities; Journals and e-journals; Software development; Opinion -- AI and OOP.

From John Reeves ([email protected])

Atlantic magazine (8/94) carries a very negative article about AI by one of the journalist-judges at last year's Loebner prize competition. (Of course, the competition really isn't about AI.) [Steven Salzberg ([email protected]), comp.ai, 8/10.] For a more positive article, see Michael Mauldin's "Chatterbots, TinyMuds and the Turing Test -- Entering the Loebner Prize Competition" in the AAAI-94 proceedings. [Tim Finin ([email protected]). David Joslin.]

Lifelike computer characters, conversational assistants, believable agents, etc., are the subject of LCC '94, Snowbird, UT, 10/4-7. Write to [email protected] for the announcement. [Gene Ball, 7/9/94. Tim Finin.]

A new WWW page for Interface Agent resources is http://www.cs.bham.ac.uk/~amw/agents/index.html. [Andy Wood ([email protected]), comp.ai, 6/29/94. David Joslin.]

"Facial information science" is the newly evolving study of facial animation and interpretation. Related professions include video-game programming, computer science, graphics and animation, makeup artistry, and psychology. NSF sponsored a workshop, and three conferences have been held in Japan. NTT is working on recognition of facial expressions, gestures, and lip speech. Keith Waters at Digital is attempting synthesis of accurate lip motion. Norman Badler of UPenn's Center for Human Modeling and Simulation wants to animate actors and historical figures. Dr. Akikazu Takeuchi of Sony has synthesized a face that models 16 muscles -- involved in 44 basic facial actions -- and can maintains eye contact with someone moving in the room. "What we want is not intelligence but humanity." [Andrew Pollack, NYT. SJM, 7/10/94, F1.]

Did you see the Virtual Jack humanoid CAD model in the 6/94 Discover, p. 66? It has mouse and spreadsheet control of 30 major body segments and 25 joints, plus indirectly controlled segments and joints. The software can report how much strength is available to each limb in any position. View cones show the mannequin's area of vision, and what it sees can be displayed in a separate window. Animation at 60 frames/sec is supported. Jack 5.8, developed at UPenn, is now available in the UK. The educational price is 1200 pounds sterling, a 90%+ discount. Goodwin Marcus Systems Ltd. ([email protected]), 44 (0)606 836093, 44 (0)606 835643 Fax. [[email protected], sci.virtual-worlds, 5/5/94. Anandeep Pannu.]
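The segments-and-joints model behind an articulated figure like Jack is a tree: each segment hangs off a joint, and rotating a joint moves everything below it. The two-segment "arm" below is my own planar toy example to show the recursion, not Jack's actual data model.

```python
# Minimal joint-hierarchy sketch: a tree of segments with one rotational
# joint at each segment's base, evaluated by recursive forward kinematics.
# The structure and numbers are illustrative assumptions only.
import math

class Segment:
    def __init__(self, name, length, children=None):
        self.name = name
        self.length = length          # toy segment length
        self.angle = 0.0              # joint angle at the segment's base, radians
        self.children = children or []

def endpoints(seg, origin=(0.0, 0.0), base_angle=0.0, out=None):
    """Accumulate each segment's world-space endpoint (planar 2-D for brevity)."""
    if out is None:
        out = {}
    a = base_angle + seg.angle        # joint rotations compose down the chain
    end = (origin[0] + seg.length * math.cos(a),
           origin[1] + seg.length * math.sin(a))
    out[seg.name] = end
    for child in seg.children:
        endpoints(child, end, a, out)
    return out

# Bend a two-segment arm's "elbow" 90 degrees and read off the hand position.
arm = Segment("upper_arm", 0.3, [Segment("forearm", 0.25)])
arm.children[0].angle = math.pi / 2
pos = endpoints(arm)
print(pos["forearm"])
```

Jack's 30 segments and 25 joints (plus strength reporting and view cones) layer much more on top, but the same parent-to-child transform composition is the core of any such mannequin.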

Rodney Brooks is guiding the COG program at MIT's Media Lab and AI Lab. COG is an upper-torso robot with vision, hearing, hand and finger control, and cognitive abilities. It's "production line" research, with a 3-year PERT chart covering 30 PhD or masters theses. Brooks has said that he might have only one 10-year project left in him, and that he wanted to build a human-like AI rather than a cockroach. [Hugo de Garis ([email protected]), comp.ai.alife, 8/1/94.]

NASA Ames Research Center, Numerical Aerodynamic Simulation Systems Div., Applied Research Branch, offers over 100 technical reports on scientific visualization, virtual reality, and parallel computing. Also computational fluid dynamics data sets. http://www.nas.nasa.gov/RNR/rnr.html. [Scout Report, 7/2/94.]