
Lokouten        Chapter 2       

 

Chapter 2
The Dumrian computers and the online meta-game.

 

 

On Earth, the ancestors of the computers were mechanical machines. Mechanical calculators could have appeared in the time of Christ, if scientific progress had continued in that era: the Antikythera machine was able to calculate the age of the moon with a system of gears and dials. But this technological progress was lost with the sacking of Rhodes by the Romans. We then had to wait for Pascal, in the 17th century, to invent a second mechanical calculator, based on the abacus and on the mechanical skills of watchmaking. In the 18th century, Vaucanson's looms were the first industrial automatons, developed on the principle of artistic automatons. However, the first mechanical machine that could actually claim to be an ancestor of the computers came in the 19th century, designed by Babbage, and provided with conditional jumps. But the mechanics of the time were unable to build it. This is how Lady Lovelace, the daughter of Lord Byron, became the first programmer, in the 1840s...

However, the first information processing machines did not appear until 1890, with the U.S. Census, using the first tabulating machines and punched cards, invented by Herman Hollerith. These machines were programmed in a peculiar way, a bit like our current spreadsheets, to compute totals using relay counters. In 1896 Hollerith founded the Tabulating Machine Company, which later became IBM. A whole industry flourished until the 1960s around mechanical calculators (cash registers) and tabulating machines. Some were sophisticated enough to make simple queries to databases, or could receive several connection plates (programs). However, we were still far from the flexibility of real computers.

But it was only in the 1940s that the first really usable machines appeared, in Germany and England, using vacuum tubes (triodes) or electromechanical relays. While Turing laid the foundations of software science, and illustrated it by breaking the secret Nazi «Enigma» code, Von Neumann described what would be the basic functioning of Earth's computers until the 2030s: an instruction counter, whose value points at one instruction in a program memory. When the instruction is executed, the counter is incremented, in order to point at the next instruction. This allows the instructions to be executed in series, as in the Vaucanson system. Each instruction manipulates the data that it reads or writes in a data memory. But some instructions, specific to computers, additionally allow conditional jumps in the program, by changing the instruction counter according to a result. These conditional jumps provide all the flexibility and adaptability of modern software, compared with the rigid course of Vaucanson's automatons.
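The fetch-increment-execute cycle described above can be sketched as a toy interpreter. The instruction names and the little program below are invented for illustration; only the mechanism (a counter stepping through memory, rewritten by conditional jumps) is taken from the text.

```python
# Toy sketch of the Von Neumann cycle: an instruction counter steps
# through a program memory, and a conditional jump rewrites that
# counter according to a result. Instruction names are hypothetical.

def run(program, data):
    counter = 0                       # the instruction counter
    while True:
        op, *args = program[counter]  # fetch the pointed-at instruction
        counter += 1                  # increment to point at the next one
        if op == "INC":               # data instruction: increment a cell
            data[args[0]] += 1
        elif op == "DEC":             # data instruction: decrement a cell
            data[args[0]] -= 1
        elif op == "JNZ":             # conditional jump: change the counter
            cell, target = args
            if data[cell] != 0:
                counter = target
        elif op == "HALT":
            return data

# Multiply by repeated addition: add 3 to the accumulator, five times.
program = [
    ("INC", "acc"), ("INC", "acc"), ("INC", "acc"),  # acc += 3
    ("DEC", "n"),
    ("JNZ", "n", 0),                  # loop back while n != 0
    ("HALT",),
]
result = run(program, {"acc": 0, "n": 5})
print(result["acc"])  # 15
```

Without the `JNZ` instruction the program could only run straight through, like Vaucanson's looms; the conditional jump is what turns the rigid sequence into a loop.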

These computers would have remained huge and expensive machines, thus rare and confined to military or scientific uses, if two inventions had not opened the way for their miniaturization: the transistor, which replaced the bulky vacuum tube with a tiny crystal, and the integrated circuit, which made it possible to manufacture and link tens of thousands of elementary circuits in a single operation, replacing tons of vacuum tubes with a tiny silicon chip easy to produce in series. The successive advances in the manufacture of integrated circuits led to a tremendous reduction in the size of computers, but mostly in their price, which allowed these machines to penetrate the whole of society, up to the world of entertainment: it was hippies in San Francisco, in the 1970s, who first had the idea of purchasing a second-hand «small» machine, for game purposes. Then the computers invaded businesses (communications, accounting, text processing, databases, computerized design), administrations (accounting, databases), factories (process control, automation), private dwellings (games, videos, mail, Internet) and even vehicles (planes, subways, cars).

As the advances in electronics also benefited the development of telecommunications, the idea naturally appeared of connecting all these machines together into a giant network, from the beginning of the ARPA net in the 1960s to the triumph of the Internet between 1995 and 2000. By the 2100s, this Internet had progressed much further, and its users could find, gathered in one machine the size of a notebook, all the functions related to telecommunications and data storage (mail, files, books, videos, music, phone, radio, television, live concerts, conferences, sales, banks...) and all the media which may be put into electronic form: books, images, videos, sounds, shapes (with kinds of printers able to sculpt 3D objects in colour) and even smells (through analyzers and synthesizers of smells, using on-demand synthesis of odorous chemicals). While in 2000 most web pages contained only images and text, one hundred years later many of them were videos, or virtual worlds.

For the second Internet revolution, in the 2000s, was the emergence of virtual worlds useable by everybody. By 2020, they had become indispensable for organizing business meetings or scientific conferences, as the virtual characters could meet from thousands of kilometres apart, and discuss as if they were actually facing each other. But the enormous empathic and poetic impact of total immersion in another world could not go unnoticed by the artists, and by all the designers of a better life, a heavenly life in the landscapes of our dreams. Thus the first beautiful virtual communities began to appear in 2007, and became numerous and varied as early as 2015. But it must be said that many users preferred to use the virtual worlds for much more prosaic purposes, such as satisfying their sexual fantasies, or playing online fighting games. Nevertheless, the visions and ideals born in virtual worlds had a growing impact on the so-called «real» life, starting from about 2020.

However, the user of these virtual worlds was still sitting before a screen, pushing buttons to walk or move his character. The means to really move and act in a virtual world was the exoskeleton (note 3), cheap enough to replace the game consoles in every household, allowing all the youngsters from 7 to 77 to meet and live their adventures in an incredible variety of extremely sophisticated virtual worlds, sometimes more complex and extended than the physical world. The first «dream machines» were just an arm, but they already allowed one to handle a tool or a weapon, or to feel a breast, which the software corrected to the ideal size and shape. Later appeared total immersion machines. A good exos (the fashionable abbreviation) looked like a foldable cabin that took little more space than a desk and its seat (which they often replaced), and that one could slip into as easily as into clothes, with electronic assistance. A curtain, and headphones for sound, allowed the user to be practically isolated from his room, and to dive into incredibly realistic adventures. The curtain was not indispensable, but it guaranteed the privacy of the user, so that he was not seen gesticulating in a weird way. Some of these machines, questioned by many, but still with a lot of adepts, were able to reproduce the sensations of tiredness or pain, in order to convey the stakes of the situations. Some amateurs pushed them to maximum strength, to the point of receiving bruises in fights. There were of course machines provided with sexual functions, which made it possible to meet safely, via the network, virtual lovers who were all physically perfect and free of any disease. Some said that the development of teledildonics had driven the advances in electronics and telecommunications in the 2020s. As a matter of fact, sex or fighting with exoskeletons required guaranteed, ultrafast Internet communications, which the satellite links and the HTTP protocol could not provide.

Such a mingling of the virtual and the physical was not without sometimes serious problems, and some users showed the «autism of the player», dropping interest in their physical relationships, or losing their bearings in the physical world. Others had more and more difficulty accepting the limits and prohibitions of the physical world, compared to some virtual worlds with no morals. Electronic games and virtual reality had become, for some, an outright drug, a new artificial paradise, to the point that laws were needed, prohibiting or regulating some practices, such as games with a real stake (for instance the loser owing money or a service to the winner).

However, a real and astounding computer delirium was the belief that our «avatar» (our appearance in the virtual world) could survive us after our death. Some even paid a high price to be «immortalized» on the network: unscrupulous web hosts proposed artificial intelligences, or «bots», capable of behaving like the deceased, manifesting his tastes, his opinions, his appearance, and everything that the materialists see as «their personality». As everything was distributed over the computer «cloud», once started these «electronic ghosts» (boghosts or zombies) were very difficult to stop. Thus, many continued to haunt the virtual places where they had lived, to such an extent that they had to be ejected. Of course these virtual characters had no consciousness, and it was easy to check this by putting them in unexpected situations that their model could not have imagined while living. But a crazy belief was spreading, about survival in the network, fuelled by enthusiastic books written by «witnesses» or by barmy «scientists». Some believers even organized themselves into a pseudo-religion advocating «the continuity of life in the virtual worlds after death.»

The electronic ghosts, on their side, still imperturbably exhibited the beatific smile and selfish ways of their models, without any change other than what the artificial intelligence designers put into them. There were cases of widows harassed by the electronic ghost of their deceased spouse, and in any case lively disputes over whether an «avatar» was a real human person or an artificial intelligence. The latter, more and more sophisticated, were more and more easily able to fool people by impersonating humans, and there were cases of people defrauded by ghosts, or who, through their sexual exoskeleton, happened to have sex with virtual characters who had been dead for years. These virtual sex relations brought some divorces for adultery, while «philosophers» claimed on the contrary that these sex relations «in the virtual» did not compromise the marital contract.

All this craziness terribly worried many people, and made many others laugh, who felt that virtual worlds were games, or at best communication means. The only real problem, for thinking people using the Internet or virtual worlds, was to get rid of all these «bots», ghosts or advertising creatures, which, without proper care, would invade all the virtual landscapes with their slogans and their sex fantasies. Some even multiplied like viruses, and it was extremely difficult to disinfect the computer «cloud». Among the solutions, one of the first was to place «sphinxes» at the entrance of virtual worlds: software randomly asking series of questions such as «which of these images is beautiful» (among a series of ten showing nine Picassos and one Botticelli), questions whose answers were obvious to any normal human being, but extremely difficult for a software to find. This process, although tedious, was sufficiently effective, except for one detail: it happened more and more often that some humans failed the anti-bot tests! It became a joke and even an insult. Other, firmer solutions were anti-bot software, comparable to the still-needed antivirus programs, responsible for disinfecting the machines themselves of any unwanted presence.
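The «sphinx» gatekeeper described above can be sketched in a few lines. The questions, answers and number of rounds below are invented for illustration; they stand in for the image tests of the text.

```python
import random

# Toy sketch of the «sphinx» at the entrance of a virtual world:
# random series of questions whose answers are obvious to a human
# but hard for a software. All questions here are hypothetical.

QUESTIONS = [
    ("which of these images is beautiful?", "the Botticelli"),
    ("which of these sounds is a laugh?", "the laugh"),
    ("which of these faces looks kind?", "the kind one"),
]

def sphinx(answer, rounds=3, rng=random):
    """Admit the visitor only if every randomly chosen question is answered."""
    asked = rng.sample(QUESTIONS, rounds)
    return all(answer(question) == expected for question, expected in asked)

human_answers = dict(QUESTIONS)
human = lambda q: human_answers[q]  # a human finds the answers obvious
bot = lambda q: "the Picasso"       # a bot guesses, and always misses
print(sphinx(human))  # True
print(sphinx(bot))    # False
```

The joke in the text, of humans failing the test, corresponds to a human whose answers diverge from the expected ones on even a single round.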

 

Of course puritan morons were all angry at the cyberworlds, and especially at cybersex, which they said was a perversion of human love, reduced to simple bodily functions without any emotion. Banned in some countries, criticized everywhere, their users considered retards, the exoskeletons however enjoyed an untouchable status, because they happened to be the only real solution to curb the overpopulation catastrophe of our modern world, and especially to stop the epidemics of sexually transmitted diseases. In any case, they were perfectly sufficient for those who think that love can be reduced to merely experiencing physical sensations, without giving a thought to the partner. These users even discarded virtual partners, always unpredictable; instead, they quickly developed artificial intelligences simulating human behaviour, but in a fully programmable and predictable way.

Facing «cybersex», new naturist movements advocated the «return to real life», contact with the sun, nature, and especially with others. But there were also, more and more, people who, in complete discretion, used the most advanced technologies for social, spiritual or poetic purposes. Some even saw in the sexual exoskeletons a way of enjoying a pure love and romance, rid of all those disgusting bodily details which kill eroticism. Distance also freed them of all the banalities and small daily conflicts which, over the years, erode and wear out the most intense love. Vast online communities devoted themselves to creating cyber paradises, where, after a day in a dull «reality», people gathered in the sun on the beach, in flower gardens or forests, to exchange pleasure and other joys of life, or to explore lush lands populated only with friends and nice people. Soon the sick, the disabled, or those of a shunned race came into virtual communities, where they were often among the most active and constructive. Here at least they were free to walk, to dance, to have a normal life, without having to face that terrible excluding look which nails disabled people to the ground. There even were Hindu, Chinese or Tibetan gurus saying that such a practice was interesting, and that it might even have a tantric meaning, if it was done with a sincere motivation of pursuing happiness and wisdom. Oh, the nice virtual prana we could breathe in some virtual worlds...

As early as 2010 appeared the idea that the virtual appearance of a person depicted much more accurately what he really was, deep in his heart. Much better, in any case, than this assembly of flesh which forms our physical bodies, which we did not choose, which we cannot modify, which we must endlessly maintain, clean, cure, fill with expensive materials and empty of disgusting substances, without even being able to prevent aging and decay. So, more and more people became beautiful, young and friendly, and enjoyed the primal happiness of appearing as such to the eyes of others. Some chose fantastic appearances, but the virtual worlds quickly saw a craze for beautiful creatures of human or similar shape, such as Elves or Fairies.

So people were building virtual houses, which had no toilets, but which were always full of light, beauty and flowers, in idyllic landscapes of a stunning creativity.

Even the death of a virtual friend strangely lost any sinister character, even if it clearly remained a painful loss, as the author of these lines personally experienced in 2008. I keep a moving memory of a beautiful young Elf dressed all in pink, merrily dancing like a pixie and exchanging jokes, two hours before she died... Died in the physical world, let us say it clearly; it was not a game, she knew and everyone knew that she was in the terminal stage of brain cancer. She was twenty...

 

 

 

As if the incredible miniaturization of integrated circuits during the 2000s was not enough, nanotechnologies allowed, starting in the 2020s, an even more incredible leap in miniaturization, with information stored at the scale of atoms. The principle, simple and known since the 1990s from the tunnel effect microscope, was to deposit, move or remove atoms in small groups, or even individually, on a surface, using a very fine point, actuated with extreme precision by piezoelectric crystals. It took many studies and many years to obtain circuits which could be used in practice, but governments and companies had such a passion for this utopia that they succeeded at last.

However, the mass printing techniques used for integrated circuits were not easily transferable to nanotechnology circuits. The very first nanotechnology circuit was the programmable millipede, invented by IBM in 2003. It combined a number of tunnel effect points, moving simultaneously over the same surface, each busy with a small portion of this surface. This system was initially developed to replace computer memories, but it had no immediate success, because of its complexity, and of other competing systems. However, the progress of these devices gradually brought a new concept into computer science: the writing system could write data, but also circuits, as had already been done since 1990 with ASICs. So the first computers to use these technologies could load programs, but also logic circuits for the acceleration of certain repetitive functions. Graphics accelerators were already doing this, but as separate integrated circuits. Now everything was assembled into a single package, and it was possible to load data, programs and circuits in a single operation. Moreover, all this was done with the same writing tools, left permanently in place.

This system soon came to challenge the original Von Neumann architecture, as the two previously separate manufacturing steps (building the circuit and installing the software) could now be completed at the same time and with the same device. Indeed, rather than fetching an instruction from memory, it was often more convenient to write a circuit which executed it directly. This questioning of the Von Neumann architecture had already been seen, especially with the neural circuits of the 1990s, but those circuits remained confined to specific applications. Soon the traditional central processing unit of the computers, even in personal computers, was confined to a mere coordination role between many software-circuits, each dedicated to a particular function: application, input-output, etc. One could buy already engraved circuits, with the programs and data for their function, or engrave them directly from downloadable files describing both the software and the circuitry. So the previously fully separate concepts of circuit, computer language and software completely broke down, and languages appeared which described simultaneously the circuits (as in ASICs) and the software that would run in them (as in the computers of the 1990s). These languages described functions, and the user no longer had to worry about whether they were engraved or programmed. We even got hardware circuits which could be erased and re-engraved, for bug fixes or changes, just as with software on a hard drive.

The hard drives of the 1990s began to decline with flash memories, and the many other integrated technologies that emerged since. However, all encountered an impassable limit: when circuits or memory cells are only a few atoms wide, their operation becomes erratic. The manipulation of atoms one by one pushed this limit back somewhat, but without suppressing it. The size of the atom was the ultimate limit of all miniaturization.

But the number of wires linking the integrated circuits was steadily increasing, to the point of becoming the main bottleneck, especially when the formats of the links had to be changed. An optical solution appeared, with a simple plastic light guide connecting the various circuits. Tiny laser diodes emitted light pulses, each on a different frequency. Thus each circuit could receive the relevant information, simply by adjusting its reception frequency, without worrying about the intermediate journey. In any case, chips had long been so fast that they were called relativistic (from Einsteinian Relativity): because of the limited propagation speed of the signals, any notion of simultaneity had disappeared. So each item had to manage a flow of information as and when it arrived.
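The shared light guide described above works like frequency-division multiplexing: every module broadcasts pulses tagged with its emission frequency, and each receiver keeps only the pulses on the frequency it is tuned to. A minimal sketch, with hypothetical frequencies and module names:

```python
# Toy sketch of the common light guide: a shared medium of
# (frequency, payload) pulses, filtered by reception frequency.
# Frequency values and payload names are invented for illustration.

guide = []  # the common light guide

def emit(frequency, payload):
    """A laser diode pulses its payload onto the guide at its frequency."""
    guide.append((frequency, payload))

class Module:
    def __init__(self, tuned_to):
        self.tuned_to = tuned_to
    def receive(self):
        # keep only the pulses on our reception frequency,
        # ignoring everything else travelling on the guide
        return [p for f, p in guide if f == self.tuned_to]

emit(430, "frame-1")   # video data on one frequency
emit(550, "sample-a")  # sound data on another
emit(430, "frame-2")

screen = Module(tuned_to=430)
speaker = Module(tuned_to=550)
print(screen.receive())   # ['frame-1', 'frame-2']
print(speaker.receive())  # ['sample-a']
```

Note that the receiver never knows, or cares, which path a pulse took: the filtering by frequency replaces any routing, which is the point the text makes about ignoring "the intermediate journey".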

Large desktop computers in the 1990 or 2000 style would have looked quite ridiculous in the early 22nd century. The largest of all personal computers were laptops, the size, at most, of a PC keyboard of the year 2000. Through the system of light guides, everybody could build his computer by adding small modules, the size of a coin, along the guide, and one could even make computers communicate by simply placing them side by side, their guides joining. So the computer was no longer a monolithic machine, but a variable set of functions one could add or borrow temporarily, on the fly, even for a few minutes.

The screens also made considerable progress. Instead of emitting light, on the model of the cathode ray tube, monitors now had the appearance of white sheets of paper, thin, rigid or flexible, where images formed from tiny droplets of dye, moving under the action of electric fields. The images could even remain visible once the screen was disconnected. These screens were also touch sensitive, through a simple process of scanning with a high frequency electrical potential, which the finger short-circuited: no need for a mouse, one could just click with the finger, or manipulate virtual tools.

Such a system allowed for keyboards with a variable or dynamic display: different sets of letters, menus, tool bars, etc. Thus the screen and the keyboard could be combined into a single entity, and very varied sizes were available. Some were very small, like cell phones or computer-cards (the size of a credit card, but used for various other functions, such as keys or passports). Others were like the laptop computers of the 2000s, but thinner. Like the latter, they were foldable, one side for the screen and the other side for the keyboard. But the keyboard part was able to adapt itself in real time to the work in progress, sometimes assuming the appearance of an alphabetic keyboard, sometimes of an icon bar, leaving the screen itself entirely available for viewing. At the extreme of miniaturization, screens took the shape of glasses, where the image was projected. But there was also the opposite trend: maxi-screens occupying the entire physical desktop: the software desktop of the 1990s PC became a physical desktop, where one could push documents far away, take notes in a corner, click and draw with one's finger, and even make a mess, like on a real desktop.

Another, quieter revolution in the software industry was the emergence of free software, initially considered a curiosity for computer enthusiasts. But when the main functions of a computer could be provided by free software, it became more and more important. Of course, dispersed fans could not compete in speed with teams of professional developers, and paid software long remained a step ahead of free software, particularly in new areas, or in games. This quality gap took a long time to close, as the amateurs had to learn, in the years 2010 to 2030, to organize themselves, to work in hierarchical teams, to accept methods, norms and standards, and mostly to heed the needs of the users, providing simple products understandable by all, instead of each developing in their own way or imposing their conception of the ideal user. Despite these obstacles, the gap between free and paid software was gradually reduced, so that in the late 21st century it was difficult to remain a software company, as it was impossible to make one's work profitable over enough time. Making paid software was only possible for new needs, or for areas involving a high intellectual or artistic content.

An advantage for amateur and free software was that it became easier and easier to develop complex applications. In the 1980s, one had to write programs line by line, and manipulate data byte by byte, with languages like BASIC, where major projects quickly became unmanageable. In the 1990s, a structured language like C avoided some mistakes, but others remained difficult to detect. In the 2000s appeared object programming, for example with the Java language, where one could write entire functions in a few words, letting the compiler create all the complex manipulations required by these functions. Later were introduced development assistants, which managed the variables and functions as a database, making it easy to create complex functions interacting with each other. These assistants also managed the development stages of a project, automatically performing all the tiresome routines of coordination and verification. At the time of the story, nobody had written a line of code for a long time; people manipulated flowcharts on the screen, not even those detailed flowcharts which were drawn in the time of Fortran, but synthetic charts representing the various functions of the application, and their interactions. To add a function, it was enough to add some arrows, and all the other functions were automatically updated, rewritten, coordinated and compiled, even in a case as complex as, for example, taking a program designed for a single user and making it multi-user, a thing which in 2000 would have required rewriting the whole program line by line. These programming support environments were expensive, but they allowed a single person to develop, in a few days, applications which in 2000 would have required teams of engineers. These systems certainly allowed professionals to develop their products much faster, and above all to improve and maintain them instantly, even while entrusting the work to someone who did not know the product.
But in a free environment where any previous creation could be taken up again and improved without regard for copyright, these benefits were much higher, and they allowed simple amateur clubs to create complex applications in a few days, just as professionals did, and with the same quality. This ensured the ultimate victory of free software, as unqualified amateurs or egotistic geeks no longer needed to learn and accept all the discipline and methods required for proper quality.

But there is no need for many versions of the same software. In the late 21st century, basic versions had long existed for all the possible functions of a computer, and improvements, free or not, were only about details, or entirely new functions. So the most basic software, word processing, spreadsheets, editors, music or video, had all been free for a long time.

 

At the time of the discovery of Dumria, many people still thought that this evolution of electronics on Earth had followed a logical path, the only one possible, for instance when switching from the triode to the transistor. What a mistake. Just as Dumrian life had developed hominisation on a different basis (reptiles instead of mammals), Dumrian electronics and computers had evolved in very different ways. Not only the techniques themselves, but also the use and purpose of the machines.

Logically, the Dumrians, fond of astronomy, developed the first mechanical calculators for this science as early as antiquity. Even before the Iron Age, calculation mills, of polished mahogany and bronze, powered by paddle wheels in mountain valleys, were, more than six thousand years ago, the first Dumrian machines for exploring the Universe. They strongly evoke Vaucanson's automata or Babbage's project. Some of these machines have been carefully preserved as they were, despite their size. But the discovery of electricity was, as on Earth, the true birth of computer science, thanks to the electromechanical relays and vacuum tubes, two very different technologies which emerged and evolved in parallel. Soon confronted with the need to miniaturize their machines, Dumrian engineers began to punch-cut the electrodes of their triodes; then they grouped sets of parts on a single plate, a few centimetres square, with dozens of cold cathode tubes enclosed in a single extra-flat box. At last they developed engraving techniques quite similar to those used on Earth for integrated circuits, but they used them for the miniaturization of triodes and relays! Not a single silicon transistor was ever manufactured on Dumria, even when the quantum telescopes made it possible to find the plans on other planets! Too complicated, the Dumrian engineers thought, facing all the necessary steps of purification and crystal growth, diffusion of impurities, etc. Instead, the basic build of a cold cathode triode involved only two quite common materials, a metal and an insulator, and they could be miniaturized and etched in series, just like transistors, and with the same advantages. So why develop a silicon-based electronics? Only plastic-based transistors were used, for computer screens.

The Dumrian technicians first deposited an insulating salt on a special metal substrate, and then a thin layer of metal. The connections and the electrodes were first etched in the thin metal layer. The cold cathode tips were fashioned from an acute angle, where a slight excess of metal dissolution formed a very sharp tip, in the direction of the crystal grain. Then a chemical attack dissolved large holes in the salt, leaving the insulated electrodes in a vacuum, supported only by pillars of the remaining salt. A fine tip electrode was a cold cathode, two lateral blocks gave a gate, while a square facing the tip was used as the anode. Their favourite component was a double triode, with one cathode, two deflection blocks as a grid, and two anodes side by side, to make differential amplifiers or logic circuits with two inverted outputs. Some more complicated circuits even contained full electron channels, to manipulate and steer a complex information flow, or to provide sophisticated functions with very few components. Curiously, on Earth, in the 1960s, the Tektronix Company had developed simple logic circuits, using gas tubes and cold cathodes, used to make line counters on oscilloscopes for television professionals. Other attempts were made to gather and miniaturize several cold cathode triodes into one enclosure. If the transistor had arrived later, these devices could have evolved toward computing, in the same way as on Dumria.

Other Dumrian engineers had developed, still with integrated circuit engraving techniques, extremely miniaturized electrostatic relays. In these devices, a mobile electrode is attracted or repelled by the electric field of a control electrode, so that it can touch or leave an exit contact. Early models, a few centimetres high, were directly derived from the electroscopes which, on Earth, were used to study static electricity before the discovery of electric currents. The idea of using such devices for calculation came early to the minds of the Dumrian engineers, who had already been building mechanical computers for several millennia. Some models were built, powered by high electrostatic voltages. Although they were cumbersome and highly sensitive to the weather, they kept some followers, who miniaturized them more and more, in order to obtain reasonable sizes and supply voltages. They soon packed them under vacuum, to be free of the weather effects. In fact, it was they who invented the methods for engraving and cutting circuits, which were used later for the triodes! And, just as with the integrated triodes, the smaller it is, the better it works, the faster it goes, and the less it consumes. And as the machines required for the two techniques were the same, they could engrave either field emission triodes or electrostatic relays on the same circuit.

So, at the time of this story, Dumrian electronic circuits matched the performance of the best transistorized integrated circuits, in speed as well as in power consumption, and even in integration density. Field emission triodes, miniaturized well below the micron scale, consume no more power than field effect transistors. And to get a strong signal, it was enough to connect a very large number of tips in parallel, in what is called a Spindt array, just as in a power MOSFET. As for the relays, they were so fast that they could be used for sound and even for video.

However, the Dumrians, happy with their techniques, and quite fond of nice big machinery, developed nanotechnologies and quantum electronics only slowly, and this was one of the few areas where they clearly lagged behind Earth. They developed few laptop computers or mobile phones because, they said, when we travel we travel, and we do not play two games simultaneously.

The Dumrian computers also remained confined to the Von Neumann model, because software and circuits remained fundamentally two different things, as on Earth in the era of the transistor. Nor did they develop hard drives or floppy disks, only electrostatic storage, which gave them full satisfaction despite, at first, a large volume (by the time of contact with Earth these memories were fairly miniaturized, and would advantageously have replaced the bulky hard drives of the 2000s). These memories were based on the ability of certain insulating materials to retain electrical charges. The first electrostatic memories were tested in computers made of electrostatic relays: a relay stayed open or closed depending on the charge stored on its control electrode. But this technique was quickly transferred to field emission triodes, as a negative charge stored on the grid could easily block them permanently.
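The Von Neumann cycle that both Earth's and Dumria's machines followed can be sketched as a toy interpreter: an instruction counter steps through program memory, and a conditional jump rewrites that counter. The instruction set below is invented purely for illustration, not a Dumrian or historical one:

```python
# Toy Von Neumann machine: an instruction counter (pc) points at one
# instruction in program memory; after execution it is incremented,
# unless a conditional jump rewrites it according to a result.

def run(program, data):
    pc = 0  # instruction counter
    while pc < len(program):
        op, *args = program[pc]
        pc += 1  # default: point at the next instruction
        if op == "set":          # data[name] = constant
            data[args[0]] = args[1]
        elif op == "add":        # data[dst] += data[src]
            data[args[0]] += data[args[1]]
        elif op == "jump_if":    # the conditional jump: the key flexibility
            if data[args[0]] != 0:
                pc = args[1]
    return data

# Sum 5 + 4 + ... + 1 with a loop -- something a rigid,
# Vaucanson-style sequence of instructions could never do.
program = [
    ("set", "total", 0),
    ("set", "n", 5),
    ("set", "minus1", -1),
    ("add", "total", "n"),       # pc = 3: loop body
    ("add", "n", "minus1"),
    ("jump_if", "n", 3),         # while n != 0, jump back to pc = 3
]
print(run(program, {})["total"])  # -> 15
```

Without the `jump_if` instruction the machine could only play through its program once, like a music-box drum; with it, the same few instructions can repeat, branch and adapt.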

The software industry also evolved very differently from that of Earth, given the total absence of commercial interests pushing developers in a single direction. No great systems «imposing» their use like religions, despite their disadvantages; no version changes every six months; no bugs that the publisher never corrects despite repeated complaints from the users. Dumrian software science first evolved as a flowering of different systems, where the notions of operating system, language and application were happily mixed together. But the need to communicate and share software quickly created a strong trend toward method and standardization. For the Dumrians, standardization was just another game, and the programmers gathered in a standardization committee. Without commercial powers to impose their systems, without intellectual property to prevent the spread of innovations, without competition to push things in wrong directions, what finally emerged was a combination of the best systems. Everyone was free to reuse a program which worked well, to improve it, or to include it in a larger whole, rather than each developer recreating from scratch a new application more or less imitating that of a competitor. From this simple base, computer enthusiasts developed all the variety and subtlety of useful software that you can imagine.

Since the Dumrians saw their computers as useful machines and means of communication, and given their sense of play, they quickly started very sophisticated computer games, online games, and of course virtual worlds. But as early as the first computer networks, an entirely original game system was developed: a common language allowing anybody to create his own universe, his game, his characters and his rules. The contrast was huge with Earth's games of the 2000s, fully closed and monolithic pieces of software which allowed the player no creativity. One module of this language modelled the terrain map, in plan or isometric view; another translated this map into a 3D landscape, using standard or special objects; another handled action, another generated situations... These various languages, and their corresponding editors, were at first separate creations, but they were harmonized and combined into one single system, very complex, but very flexible and easy to use. Thus, unlike Earth games, with their closed design protecting the commercial interests of their authors, Dumrian games were always open, allowing everyone to modify landscapes and scenarios to his liking, to create new worlds at will, and even to join completely different worlds together. The common language, carefully updated as technology progressed, allowed an incredible blossoming of many different games, all able to communicate with one another, forming one giant meta-game or metaverse, whether they were about exploration, social interaction or physics simulation.
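The modular, open principle described above can be sketched in a few lines. The Dumrian meta-game language is of course fictional; the world names, fields and the `join_worlds` helper below are invented solely to illustrate how separately authored worlds, sharing one common description format, can be combined at will:

```python
# Hypothetical sketch of a shared world-description format. Each module
# (map editor, 3D renderer, rules engine...) reads the parts it
# understands and ignores the rest, so independently created worlds
# stay editable, extensible, and joinable.

world_a = {
    "name": "Crystal Marshes",
    "map": {"tiles": [["water", "reed"], ["reed", "hill"]]},
    "rules": {"cooperation": True, "combat": False},
}

world_b = {
    "name": "Singing Dunes",
    "map": {"tiles": [["sand", "sand"], ["dune", "oasis"]]},
    "rules": {"cooperation": True, "music": True},
}

def join_worlds(a, b, gate_name):
    """Merge two independently created worlds into one meta-world,
    linked by a named gate -- trivial when the format is open."""
    return {
        "name": f"{a['name']} <-> {b['name']}",
        "regions": [a, b],
        "gates": [{"name": gate_name, "links": [a["name"], b["name"]]}],
    }

meta = join_worlds(world_a, world_b, "Rainbow Arch")
print(meta["name"])          # -> Crystal Marshes <-> Singing Dunes
print(len(meta["regions"]))  # -> 2
```

A closed, monolithic game keeps such data in a proprietary binary format precisely to prevent this kind of recombination; an open common language makes it the default.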

There was not even a clear boundary between dream and reality: many used the meta-game system to manage their business as if it were a game, or on the contrary behaved concretely as if the game were reality, dressing or building houses as in their game...

On the other hand, nowhere on Dumria would we find any of those ugly «punch-up games», or those dreadful war simulacra where one has to kill dozens of opponents in a row (and never be killed oneself!) As seen above, there were only cooperative games, where the fun is to recreate the life and evolution of an imaginary society... Many Dumrians invested most of themselves into these dream worlds, Mysteries, Utopias... The most subtle aspects of the Dumrian psyche could express themselves here, without any limit. The first Earthlings to be introduced into the meta-game were extremely surprised to discover a world of dreams and light, which curiously echoed their own dreams, far away from the dark and cynical ideologies fostered by Earth's games of the 1990s and 2000s...

Another very Dumrian difference was the total absence of a boundary between «play» and «work»: children's games could be found among the simulations, together with works of high science, without any hierarchy between the two.

Of course, not all Dumrians spent the same amount of time in the meta-game. Some used it only when needed; others spent most of their time in-world. But one could find neither those autistic geeks losing contact with the physical world, nor those stiff-minded fellows snubbing the virtual life of others.

Scenario, graphics, sounds, colours, realization: Richard Trigaux.
As every independent author, I need your support to be able to continue working on this site and to allow freedom of expression to exist on the net:
Legal notice and copyright: Unless otherwise noted (© sign in the navigation bar) or legal exception (pastiches, examples, quotes...), all the texts, graphics, characters, names, animations, sounds, melodies, programming, cursors and symbols of this site are copyright of their author and rights owner, Richard Trigaux. Please do not mirror this site, unless it disappears. Please do not copy the content of this site beyond private use, quotes, samples, or building a link. Benevolent links welcome. No commercial use. If you desire to make a serious commercial use, please contact me. Any use, modification or appropriation of elements of this site or of the presented worlds in a way depreciating my work, my philosophy or generally recognized moral rules may result in a lawsuit.