It's not exactly Rocket Science

As soon as aliens were discovered within the solar system, scientists started to wonder how they had managed to cross the vast distances between the stars.

Of course there were various ways of traversing these distances, but each of them took enormous amounts of time. Had the aliens used conventional propulsion systems, or even systems that were conceivable and feasible for humans, the fastest method would have taken several decades to reach the Sol system even from the closest star, Proxima Centauri.

If they had discovered a way to travel at close to the speed of light, the journey would still have taken years to decades.

In the beginning, scientists were sceptical of such a thing as faster than light travel. While it was a good explanation of how the aliens had traversed the huge distances of space, theories were sorely lacking. There were a few ideas around, some decent, some wildly ridiculous. Wormholes were considered, but those were extremely unstable and unlikely to permit a transfer through them, even if they could be formed anywhere other than around extremely heavy black holes. Away from black holes they would require large amounts of exotic matter, which was far beyond human capabilities at the time or even in the foreseeable future.

Any civilization capable of using the proposed ideas would have to be much further along the Kardashev scale than humanity, which was not even close to being a Level 1 civilization. The Martian ruins showed nothing that proved the Ziggies to be that much more advanced. At most they were just past Level 1.

Eventually Burkhard Heim published his complete Unified Field Theory, called Heim Theory by most scientists, in 1986, after publishing pieces of it in the late 1970s. During the process of peer review the theory came to the attention of Richard Feynman. Until his death in November 1989, Feynman worked to remove several inaccuracies from the Heim Theory, publishing ‘An attempt to revise a Theory of Everything’ in August 1989.

Feynman’s paper on the Heim Theory was considered to be his last important work; in it he expanded the six-dimensional Heim Theory to eight dimensions. The Heim-Feynman Theory was well received in the peer review process and endorsed by a number of important physicists, such as Stephen Hawking.

The Heim-Feynman Theory eventually became the foundation for explaining how the discovered aliens had been able to cross the distances between the stars. During the early 1990s, work on the theory resulted in theoretical models for a contra-gravity system and an FTL engine, which of course was big news in 1994. The theory allowed for crossing over into another of the eight dimensions of Heim-Feynman space, where lightspeed could relatively easily be surpassed without breaking causality. However, the FTL system was limited by the gravitational fields of either a star or a sufficiently large planet. To enter Heim-Feynman space a spacecraft would have to go out into deep space past Jupiter or use vast amounts of energy that could destroy the FTL system.

The wreckage of the alien spacecraft discovered in Mexico and in Germany provided an early insight into the alien technology. In Germany, Walter Dröscher and Joachim Hauser, working for ESA, were able to identify a device that could possibly be a contra-gravity system, according to the existing theoretical models.

Theoretically it was possible to copy the system using existing technologies. From there on, it would be relatively simple to create a compatible FTL system.

In other areas the research into alien technologies proved to be much more fruitful.

Both the United States and the Soviet Union had returned artifacts from Mars, with the United States enjoying the advantage of earlier access and a prolonged monopoly on it, and the Soviets having sole access to the Veneran derelict. Later on the Europeans, Chinese, Japanese and Mexicans joined in with their access to the Quetzalcoatl wreckages discovered in their respective areas.

The first area that advanced significantly with the help of these artifacts was material science.

At first it was only the casings of various devices that proved to be valuable. Several highly advanced polymers were discovered, some of which showed properties of metals. Other materials were cermets that were much more resistant than those known at the time. Many of these materials were used in what were believed to be articles of everyday use, suggesting that the Ziggies had been able to produce them with little to no effort.

The interesting part was that many of these materials showed a microscopic layering of several tens of nanometers. Several engineers therefore suggested that the Ziggies had used a more developed version of the just emerging rapid prototyping technologies. That would imply there had to be technology and/or machines for this task. Proof of the existence of such machines was lacking at the time, as no one on Mars had yet encountered one.

For both the United States and the Soviet Union a Ziggy rapid prototyping machine rose high on the list of objects to be returned from Mars, even if no one actually had an idea what specifically to look for. For the United States such machines meant an additional way of improving national self-sufficiency, while the Soviet Union saw them as a way to increase industrial production and efficiency.

Superconducting materials were the next big area where alien technology advanced the field by several years, if not entire decades. Most of the samples from Mars and Venus contained superconducting cermets that kept their superconducting properties up to temperatures of 600 Kelvin. The superconductors obtained from the Quetzalcoatl wreckage were less effective compared to the ones from Mars, losing their superconducting properties at temperatures of about 400 Kelvin. This was still much more than humanity had achieved so far.

Chemical analysis of the materials was the least of the problems in trying to replicate them, as knowing their contents helped little in finding out how they were made. Another difficulty was the relatively high amount of rare earth metals needed for the high-end materials, which would make industrial scale production of these cermets extremely expensive.

First attempts to copy the materials and subsequently reduce the rare earth metal content yielded a superconductor that only lost its superconductive properties at 235 Kelvin, greatly reducing the required cooling and the bulk of the cooling equipment.

In 1994 the first application of the new superconductor was an MRI scanner from General Electric. The scanner used a liquid ammonia cooling system and was considerably less bulky than earlier systems.

Another commercial use of superconductors was also aimed at hospitals, as General Electric began to offer a superconducting electricity storage system as an emergency backup to bridge the time until a hospital's backup diesel generators could come online.

Of course other companies quickly followed with several more commercial applications for superconductors.

Research in nuclear fusion profited from the Ziggy and Quetzalcoatl superconductors, though the laboratories quickly moved to the room temperature variants once they became available, as those could be cooled with a simple and cheap water cooling cycle.

In general, fusion research concentrated on more advanced magnetic confinement methods, as the new superconductors made their use easier and increased the strength of the magnetic fields involved. In 1997 the National Spherical Torus Experiment in Princeton achieved a magnetic field of 40 Tesla, just after the European JET produced a world record of 16 MW of fusion power. In 1999 the NSTX achieved fusion for the first time and later broke JET's record with 25 MW. Yet the breakeven point was still far away.

Other fusion experiments in Europe and China produced advanced Z-Pinch fusion systems with superconducting capacitors that showed promise. At Imperial College in Great Britain a Z-Pinch system first produced fusion pulses once per second in 1998, and kept up that rate for a week in 1999. China was not far behind with its own Z-Pinch system.

Another more practical field of research was opened by the Europeans with the Drachenfels wreckage. It was made from alloys that were a little more advanced than known alloys, but well within the ability of the industry to produce. The heat shield, however, remained a mystery, as there was no visible part that could explain how the aerospacecraft dealt with the heat produced by reentry.

The recovered space suit of ‘Fafnir’ was more willing to let go of its secrets and was identified as a four-layered mechanical counter pressure suit. While two of the layers used well known synthetic fibers, including Kevlar and spandex, the remaining layers used previously unknown synthetic fibers that had shape-memory properties when placed under a low electrical current.

As mechanical counter pressure suits had already become more advanced since their first use in space by ESA, the new shape-memory fibers were great news for the scientists working on developing simpler versions of the existing suits.

Carbon was another material that came to the forefront of research with the discovery of the Ziggy carbon-based semiconductors. Created from multiple monoatomic layers of carbon in a diamond carrier, these microelectronics appeared, in theory, to be more powerful than conventional silicon semiconductors. In practice they could not be tested due to the lack of any information on the nature of the integrated circuits.

The diamond based carriers for the semiconductors were relatively easy to reverse engineer and resulted in a research boost for the chemical vapor deposition method of producing synthetic diamonds.

The monoatomic carbon layers, named graphene, were more of a problem in reverse engineering and production. Theoretically graphene had been known since 1962, but only now was there a need to actually create the material. Various methods were suggested, including the use of adhesive tape to peel single layers from graphite.

Graphene didn’t only have uses as a semiconductor, but also proved useful for other applications, such as filtering water. Eventually a graphene-based filtering system replaced the reverse osmosis filters of the NASA and Soviet Mars bases.

A more interesting discovery within several Ziggy and Veneran semiconductors was nanoscale electrodynamic ion traps. Such traps were already used in quantum computing research, making these integrated circuits small quantum computers. The highest number of ion traps counted within a single integrated circuit the size of an 80286 microprocessor was 143.

Sadly, weapons were in fact discovered on Mars, within the Veneran derelict and the known Quetzalcoatl wreckages.

The first advances in weapons technology came from Mars and several rifle-like objects recovered from Ziggy remains. These weapons turned out to be laser weapons, using highly advanced semiconductor laser systems to pump a more conventional lasing cavity. The optics were all made from synthetic sapphires, were far better than anything produced by mankind to that date and used a few tricks that had been previously unknown.

The first use of these optical systems was in the Soviet Polyus armed satellites, coupled with a conventional 1 MW carbon dioxide laser. An American laser system using a semiconductor laser was in development at the time and not expected to be delivered until 2003.

The Soviets also had access to a more exotic weapon system extracted from the Veneran derelict, as they had been able to identify a number of large and apparently powerful particle accelerators, combining a cyclotron with a linear accelerator. Weight constraints on the Soviet VEK spacecraft, however, kept them from simply returning an entire weapon to Earth. That did not keep them from taking one apart, transporting the individual parts to Earth for research and assembling a simplified version there to test the weapon in a controlled environment.

The Quetzalcoatl weapons recovered from the wreckage of the aerospacecraft on Earth finally gave pointers to different conventional weapons, or rather to how conventional weapons could be made more powerful. They were largely conventional projectile weapons, though several experts were surprised by the use of conventional brass propellant casings and the weapons' built-in brass catchers.

Each weapon was fitted with something that eventually came to be called a ‘booster stage’. Made up of several superconducting coils, fed by a superconducting battery and controlled by a simple computer, the booster formed a coil gun that gave the projectile an additional kick.

The first reverse-engineered prototypes were built by 1997 and tested with existing conventional weapons. Tests showed that even these prototypes were able to increase the energy of a projectile by fifteen percent. The more advanced Quetzalcoatl boosters were believed to be able to increase the projectile energy by fifty to sixty percent.

To counter better weapons, better defenses were needed. While this was simple enough to take care of on Earth, especially with the introduction of reactive armor, space posed other problems. Armor could not simply be made heavier and bulkier. Instead new ways of increasing protection had to be found.

The first way of increasing protection had been to introduce Whipple shielding to counter projectile fire, which as a side effect also increased protection against micrometeorites and micro debris. Multiple relatively thin layers of metal armor were spaced several centimeters apart, and high speed objects were allowed to hit the first few layers. The hypervelocity impacts vaporized the objects and allowed the next lower layer to absorb the energy over a larger area. Additional improvements were made by integrating Kevlar and other textile materials.

Lasers on the other hand proved to be a problem to defend against. It was the Soviets who eventually came up with a workable defense that could be integrated with their Whipple shield armor, by adding a bubble-wrap-like layer to the Kevlar layers, with the ‘bubbles’ filled by a gel of high heat capacity. If a laser was able to drill through a Whipple layer, it would hit the gel, which would at first absorb the heat before vaporizing and scattering the laser beam.

More exotic methods of defense were considered as well, as the new superconductors allowed for relatively light and powerful magnetic field coils that could protect a spacecraft from most ionizing radiation, much like the magnetic field of Earth does.

The Soviet Union made use of its more advanced knowledge of plasma physics and managed to use a magnetic field to shape a cold plasma, resulting in the absorption of some microwave and particle radiation.

Most of these newly developed technologies, however, would need several years to reach a stage where they could actually be used outside a laboratory.

From the beginning of research into the alien artifacts, universities and laboratories had needed to remain in contact with each other, sometimes even in permanent contact. One way to facilitate this contact was the use of computer networks between the individual locations, such as the American ARPANET, largely using electronic mail for faster transmission of documents and conversations, or the IRC protocol for more real-time conversations.

The early boom in the computer segment of consumer electronics, with systems like the Sinclair ZX Spectrum, the Commodore 64 or the Apple II, was considered to be the reason why computer networks and modems eventually made their way into everyday homes, as scientists and engineers working for the various research projects wanted to have the same amenities of instantaneous contact with others at home.

Home computers were widespread and led to offers of acoustic couplers, followed by dedicated modems, from the various telecommunication companies, allowing users to connect to ARPANET and eventually its civilian version, Usenet.

Eventually the home computer systems gave way to the Personal Computer which, developed and originally sold by IBM, could be extended with prefabricated modules, whereas the home computer could only be enhanced by more intrusive means, like soldering in additional components.

By 1991 Euronet, the European answer to the American ARPANET, went online after being developed at CERN using ARPANET protocols. Developed by a team around Tim Berners-Lee, Euronet included a number of servers that provided access to interlinked static content, using the Euronet Hypertext format, ENHT, to present the content.

The ability of ENHT to include external graphics led to a broad adoption of the Euronet Information Transfer Protocol, EITP, by ARPANET providers and eventually to simple interconnectivity between both networks.

The new iteration of network connectivity, with graphical and eventually audio capabilities, made new user interfaces necessary. Microsoft dominated the North American market with its MS-DOS and integrated these capabilities into its Windows NT operating system.

The European market was a more diverse affair compared to North America, as it was dominated by various implementations of Unix systems, followed by RISC OS and XTS-400. Due to the fragmented nature of European operating systems, they eventually converged on a common base, where programs and applications could run on all operating systems with little modification, such as applications that brought the newfound capabilities of Euronet to existing systems.

Eventually the Soviets opened up their own Soobshcha Computer Network to the West, though the differing natures of the two networks required special bridges to convert the protocols of one network to the other. Conveniently, the KGB controlled these bridges through the Soobshcha Network Authority.

The Soviets, with less developed computer technology, were very interested in samples and entire construction plans for advanced hardware architectures as well as software.

The ARM, a quite advanced 32-bit RISC architecture, was of special interest to the Soviet Union and led to the acquisition of samples, design documents and other materials needed to produce their own versions and eventually continue their development. Renamed the PK processor, the ARMv2 clone entered production in 1993, eventually developing into a largely independent infrastructure following RISC principles.

On the software side, the open source nature of GNU and the Minix kernel allowed the Soviets to simply copy and adapt them under the label EVM, though the sources of the software were not made available to the general public and included subsystems designed to allow the KGB to monitor computer systems during online activity if needed.

However, there were enough programmers and intellectuals able to write software that got around these KGB programs and established a form of shadow network within the Soobshcha Network.

By 1999 computers had penetrated many households around the world and 75 percent of households with computers had access to one of the national or the larger international computer networks.
