Volume 5: No. 04

Contents:
  Funding news
  Event horizons
  Discussion groups
  Investment
  Electronic commerce
  Job opportunities
  Job services
  Linguistics and word processors
  Visualization technology
  Software development
  Computists' news

Tecate is a virtual-reality WWW markup language for 3D viewing of scientific data, from the Project Sequoia 2000 group at the San Diego Supercomputer Center. 3D data objects (e.g., in GIS or informational "landscapes") act as "hot links" to additional information, which can be viewed with 3D visualization techniques. Ann Redelfs, SDSC External Relations, (619) 534-5032. [HPCwire, 1/3/95.]

The month-old "virtual museum" 3D modeling technology used in the "Star Trek: The Next Generation" CD-ROM has found a surprising use: Apple's QuickTime VR will help TV viewers retrace the killer's path in the O.J. Simpson trial. 400 photographs were used to build 3D models of 26 sites, permitting the same kind of roaming and zooming as if live cameras were present. [Rory J. O'Connor, SJM, 1/19/95, 1E.]

The archive of 3D objects, models, and related materials at anonymous FTP site avalon.chinalake.navy.mil is moving to avalon.vislab.navy.mil. [Francisco X. DeJesus, comp.graphics, 1/18/95. Chuck Morefield.]

A new stereo display system creates a 3-D effect from single-camera motion pictures. The image is offset right and left as it's presented alternately to your two eyes. The result appears flat, of course, until something moves. Then your brain picks up the segmentation cues and begins to perceive the scene in 3-D. [Suzanne Oliver, Forbes, 1/16/95, p. 94.] (It's not true depth, of course, but endoscopic surgeons find it very helpful.)
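The offset-and-alternate scheme is simple enough to sketch in a few lines. A minimal illustration in Python/NumPy, where the offset size, shift direction, and wrap-around behavior are assumptions for the sketch, not details from the Forbes item:

```python
import numpy as np

def pseudo_stereo_views(frame, offset=4):
    """Derive left- and right-eye views from one single-camera frame.

    The same image is shifted a few pixels in opposite directions;
    presented alternately to the two eyes, a static scene still looks
    flat, but motion supplies the segmentation cues the brain uses to
    build a depth impression.  (Illustrative sketch only.)
    """
    left = np.roll(frame, -offset, axis=1)   # view for the left eye
    right = np.roll(frame, offset, axis=1)   # view for the right eye
    return left, right
```

A real display would crop rather than wrap the shifted edges and interleave the two views at the field rate; `np.roll` just keeps the sketch self-contained.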

From the Computer Vision Home Page you can reach over 8,000 documents in just four link traversals -- all listed in a searchable index. Over 80 research groups are referenced directly, and an alternate access point is offered if you have a slow link. [Mark Maimone, c.i.www.misc, 1/13/95. Chuck Morefield.]

For visual edge detection, 10,000 instructions per second (e.g., from a 100 MIPS workstation) produce the coarse functional equivalent of 100 neurons in the optic nerve. Such a workstation might fully simulate an ant if we understood all the biological hacks and open-loop shortcuts. But note that insects seldom behave wisely and most die before reproducing. Human-level performance with $10K workstations may take us until 2030. "We can hack our way incrementally to human-level intelligence, because stupid old Darwinian evolution works that way, and it got there. And from time to time we can cheat, and copy an answer. That's why our robots had cameras even before they knew what to do with them. Nature had shown us that eyes were a good idea." [Hans Moravec ([email protected]), comp.ai, 11/17/94. Chuck Morefield.]
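To make the instruction counts concrete, here is a minimal edge-detection sketch using the generic Sobel operator (a textbook choice for illustration; Moravec's retina estimates are not tied to any particular operator). Each output pixel costs a few dozen multiply-adds, which is how millions of instructions per second get consumed by even coarse visual front-end processing:

```python
import numpy as np

def sobel_edges(img):
    """Gradient-magnitude edge map of a 2-D grayscale image."""
    # 3x3 Sobel kernels for horizontal and vertical gradients
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = (patch * kx).sum()     # horizontal gradient
            gy = (patch * ky).sum()     # vertical gradient
            out[i, j] = np.hypot(gx, gy)
    return out
```

Roughly 20 multiply-adds per pixel, times frame size, times frame rate, is the arithmetic behind MIPS-to-neuron comparisons like the one above.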

The Fox 2000 "electronic nose" from AlphaMOS (France) uses gas sensors made of surface acoustic-wave devices, conducting polymers, and metal-oxide semiconductors to recognize odors in 10-30 seconds. Neural Computer Sciences (Southampton) will be studying neural networks for faster recognition. [Computer News, 12/7/94.]
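Odor recognition from a sensor array is at heart a pattern-matching problem: each odor leaves a characteristic response vector across the sensors. A toy nearest-fingerprint sketch -- the odor names, response values, and matching rule are all illustrative assumptions, not AlphaMOS's or Neural Computer Sciences' methods:

```python
import numpy as np

# Hypothetical library of odor "fingerprints": one mean response
# vector per odor across a three-sensor array (values are made up).
ODOR_LIBRARY = {
    "ethanol": np.array([0.9, 0.2, 0.4]),
    "ammonia": np.array([0.1, 0.8, 0.3]),
}

def classify_odor(reading):
    """Return the library odor whose fingerprint is nearest the reading."""
    return min(ODOR_LIBRARY,
               key=lambda name: np.linalg.norm(reading - ODOR_LIBRARY[name]))
```

A neural network replaces the fixed nearest-distance rule with a learned mapping, which is presumably where the faster recognition the item mentions would come from.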