20010530

Quantum Topology

Roman Zapatrin, Quantum topologist
Roman Zapatrin | Starlab 

Roman R. Zapatrin is working on the "Quantum Topology" project. He has recently developed mathematical methods which take away the last pieces of ground under our feet: Einstein took away the predefined metric from spacetime, and Roman Zapatrin – with his physicist colleagues – is taking away spacetime itself.

Another possible application of quantum topological jumps, for which he has provided the theory, is to store information for quantum computers. He graduated from St. Petersburg State University as a pure mathematician. He does not respect any kind of scientific supervision, nor any academic degree; indeed, he wrote his university diploma thesis entirely on his own.

He is an accomplished composer and musician; he plays the balalaika, the domra, the mandola and the mandolon-cello. He enjoys unconventional and ‘uncivilized’ travelling - crossing snow passes in the Alps with a small folding bike, or skiing in the Russian backwoods. He claims just to be providing tools with which to wrestle with Nature's challenges.

For several years he worked on quantum logic, and managed to build the theory of automata simulating quantum systems; after that he began grappling with the quantization of spacetime. Still, for him, his main achievement is that he is happy with what he is doing. Roman Zapatrin believes that—theoretically at least—we shall be able to change spacetime, so that by a click we may change both the future and the past.

Quantum Topology 

Physical phenomena are supposed to require an arena in which they may occur. That stadium is spacetime. But in the quantum realm is there such an arena—that is to say, does the stadium exist before the game begins? Or does it emerge as we observe it? Can we change spacetime? May we alter the past without time travel?

It is by now generally accepted that, in the quantum realm, entities—minuscule particles—somehow come into existence at the instant that they are observed. In quantum topology such questions may be explored at different scales, ranging from the very small to the entire universe.

The laws of quantum mechanics rest on a basic assumption: that of a pre-existing structure. At the very small scale, however, all attempts to observe that assumed structure inevitably change the topology itself; the large amount of energy which has to be applied distorts the arena's very structure.

The topic of quantum topology spawned two projects at Starlab:

Project Aphrodite: Spacetime Foam

The beauteous Aphrodite, she of the wondrous form, took shape and emerged, fully made in her perfection, out of the foam. The notion of spacetime as foam dates from ideas put forward by John Wheeler of Princeton University in the 1950s and 1960s. The Aphrodite project aims to dive deep into the broth of geometrical fluctuations and give perfect shape to that which was formless.

This project explores the structure of spacetime at the Planckian scale. The Planck length is the smallest scale at which conventional notions of space are expected to apply: about a billionth of a billionth of a billionth of a millionth of a centimetre. It is at this scale that quantum phenomena and the arena—or the topology in which they occur—emerge as they are observed.
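
For reference, that figure corresponds to the standard textbook definition of the Planck length in terms of the reduced Planck constant, Newton's gravitational constant and the speed of light; this is general background rather than a result of the Aphrodite project itself:

\ell_P = \sqrt{\hbar G / c^{3}} \approx 1.6 \times 10^{-35}\ \mathrm{m} \approx 1.6 \times 10^{-33}\ \mathrm{cm}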

The task here is to provide a mathematical solution to this physical problem. There is no desire to give up Einsteinian relativity; it presents a very good working model in its domain of application. But at the sub-Planckian scale, Einstein's theory cannot even be tested. Because it is not testable there, the notion of pre-existing spacetime is swept away and may be replaced by an appropriate quantum observable: an entity whose values arise only at the moment it is measured.

Then care is taken to make this work compatible with existing working theories such as relativity, so that the beautiful Aphrodite may be safe wherever she roams.

Project Undo: Topology Leaps

Undo follows from the claims of quantum topology: 'Undo' involves the changing of spacetime.
Say, for instance, that an explosion has occurred; in principle it is possible that, by observation itself, the arena that is spacetime may be so altered that the explosion did not occur. In this sense it has been undone.

This is a quasi-undoing or altering, which occurs as a result of appropriate measurement. It would not be possible without quantum effects, and the goal of this project is to find appropriate measurements of spacetime which involve those effects. Quantum measurements are those which unavoidably affect that which is being measured. The point about this process is that it is the act of measurement itself which creates the stadium, and further measurements may create altered or different stadia. This is not the same as travelling back in time; what takes place is an alteration such that a previous setting is undone, in the sense that it did not exist.

Einstein claimed that past and future are embedded in a given, predefined, frozen spacetime. The Undo project melts it.

20010505

US National Labs Salishan Fellowship

Salishan | Algorithms, Architecture, Language

LANL | LLNL | LBNL | SNL
Los Alamos | Lawrence Livermore | Lawrence Berkeley | Sandia

Christopher Altman, US graduate student in applied physics, Kavli Institute of Nanoscience, Delft University of Technology. Research foci include high-performance computing (HTMT), solid-state superconducting nanoelectronics, macroscopic quantum coherence and computation, and quantum information processing in Josephson junction nanocircuits.

I was honored to participate in the US National Labs Conference on High Speed Computing, held April 23-27, 2001, at the Westin Salishan in Gleneden Beach, Oregon. Salishan, a picturesque, mist-covered mountain resort a half-mile's walk from the beach, has been the setting for the conference since its inception.

The conference was founded in 1980 as a means of bringing experts in computer architecture, languages, and algorithms together to improve communications, develop collaborations, solve problems of mutual interest, and provide effective leadership in the field of high speed computing. Attendance is by invitation only, and limited to about 170 of the best and brightest in the world.

The conference is sponsored by Lawrence Livermore, Los Alamos, and Sandia National Laboratories, and is co-sponsored by a number of private companies—this year's sponsors included Compaq, Cray Inc., Fujitsu, IBM, Intel, SAIC, SGI, StorageTek, and Sun Microsystems.

A highlight of the conference was the informal discussions held each evening in Salishan's Sunset Suite, a forum to exchange ideas, solve problems, and develop friendships. This year's talks profiled recent developments in nanotechnology, supercomputing, microelectromechanical systems, large-scale networks, memory architectures, data management, artificial intelligence, molecular electronics, and a number of other technologies that will significantly impact the future of information science and technology.

The meeting was a stimulating and challenging week of close interaction with many of the most creative minds in the field. I'd like to extend my gratitude to the many inspiring scientists with whom I had the opportunity to meet at Salishan—and to those who helped to make my attendance possible, including Doc Bedard, Horst Simon, David Kahaner, Brett Berlin, Will Stackhouse, Jim McGraw, Kathy Turnbeaugh, Dennis Bohnenkamp, and Lala Stone.

It was an honor to attend under the support of a Salishan fellowship, and to meet and discuss large-scale networks with H. Shrikumar and paintable computers with Bill Butera. I look forward to meeting both again on my next trip to the Media Lab.

Special thanks go to Horst Simon for our continuing discussions on high-performance computing and the HTMT architecture, and to Doc Bedard for his guidance and advice, and for answering my questions while exploring ideas on imaginative walks through the forested grounds of Salishan. My interest in Josephson junction RSFQ superconducting nanoelectronics owes in no small part to Bedard's Random Access talk and to the influence of our continued discussions throughout the week.

Notwithstanding revolutionary hardware breakthroughs, the next generation of high-performance computing systems will continue to rely on low-temperature superconducting nanoelectronics, and Moore's Law ensures that their dimensions will shrink rapidly. As we enter the era of quantum information processing, this is certain to be a productive and exciting area of research.



Proceedings

Application Requirements and Current System Architectures

An Overview of Nuclear Stockpile Stewardship
James Mercer-Smith, Los Alamos National Laboratory

Requirements for Large-Scale Massively Parallel Computing
Robert Weaver, Los Alamos National Laboratory

Sandia C-Plant Clusters
Art Hale, Sandia National Laboratories

LLNL ASCI Platforms
Mark Seager, Lawrence Livermore National Laboratory

Future System Architectures

HEC Architectures in the 21st Century: Drivers and Imperatives
Thomas Sterling, Caltech

High Performance and High Density Archives
Jim Hughes, StorageTek

Future Communications and Networking
Marc Beackon, Lucent Technologies

From Problem Definition to Problem Setup

An Introduction to the Challenges of Problem Setup
Robert Leland, Sandia National Laboratories

Responses to Analysis / CAD Integration Perplexities
Ted Blacker, Fluent, Inc.

Computational Problem Setup: An Industrial Perspective
Todd Michal, Boeing

Unstructured Meshing
Glen Hanson, Los Alamos National Laboratory

A Hierarchical Data Management System for Parallel Partitioning of Adaptive Communication
Joe Flaherty, Rensselaer Polytechnic Institute

CAD to Results: The Snowball Effect
David White, Carnegie Mellon University

From Problem Setup to Result Data

Performance Metrics: Out of the Dark Ages
David Bailey, Lawrence Berkeley National Laboratory

State of the Art in Programming Tools
John Levesque, Times N Systems

Addressing the Memory Bottleneck
Sally McKee, University of Utah

Random Access Talks

NASA's digital library initiative
Eugene Miya, NASA Ames Research Center

Evolutionary Hardware
Hugo de Garis, Starlab NV/SA

OSCAR and the Open Cluster Group

ASCI Setup
Sandia National Laboratories

Self-adapting software
Jim Hughes, INFOSEC

Josephson Junction RSFQ Superconducting nanoelectronics
Fernand Bedard, National Security Agency

ATIP Activities in East Asia
David Kahaner, Asian Technology Information Program

Alan Huang, Stanford University

Norm Whittaker

From Result Data to Insight

Is Visualization a Solved Problem?
Sam Uselton, Lawrence Livermore National Laboratory

Large Scale Scientific Data Management and Analysis
Alok Choudhary, Northwestern University

Can Data Mining Ever be a Gigabit Application? Lessons from DataSpace
Robert Grossman, University of Illinois at Chicago

Schooling in the Digital Age
Sara Armstrong, PhD, George Lucas Educational Foundation


MEMS: Microelectromechanical Systems


A Smaller Hammer
William S. Trimmer, Standard MEMS Inc.

MEMS Modeling: Pushing the Limits of Miniaturization
Robert Rudd, Lawrence Livermore National Laboratory

Artificial Brains and Self-Configuring Electronics

Artificial Brains: Today and Tomorrow
Hugo de Garis, Starlab NV/SA

Gate Array, Configure Thyself
Nick Macias, Cell Matrix Corp

An Approach to Designing Extremely Large, Extremely Parallel Systems
Lisa Durbeck, Cell Matrix Corp

Molecular Computing and Myriad Nets

Defect Tolerant Molecular Electronics Algorithms, Architectures, and Atoms
Philip Kuekes, Hewlett-Packard Laboratories

Myriad Nets: De-Layering to Scale Networks up to the Billions
H. Shrikumar, MIT Media Laboratory

Future Directions

Programming a Paintable Computer
Bill Butera, MIT Media Laboratory

The Future of High Performance Computing: Dynamic Translation and High Density Computing
Dave Taylor, Transmeta Corporation