Computing

Successors of the Transistor
Current silicon-based transistors will be replaced by a better technology, allowing Moore's Law to continue.

Possible candidates are based on

Improved manufacturing
 * Multiple cores per chip
 * Combinations of silicon and germanium (SiGe) semiconductors
 * Tri-gate transistors
 * 3D computer chips
 * On-chip memory

New technologies
 * Silicon photonics
 * Molecular electronics or nanoelectronics (advanced chemistry/biochemistry)
 * Plasmonics (exploiting the wave effects of electrons)
 * Spintronics (including phase-change and magnetoresistive RAM)
 * Carbon nanotubes and graphene electronics, possibly replacing silicon; PRAM and MRAM
 * Holographic storage
 * Quantum computing
 * DNA computing

MEMS
Nano-mechanical devices are based on electromagnets and mechanics realised on a very small scale. Millipede, for example, stores data by punching tiny recesses into a thin plastic film. MEMS are also used for moving mirrors, as needed in optical switches, projectors, and camera stabilizers. Purely mechanical computers are possible too, but will probably never outperform electron-, plasmon-, photon-, or spin-based ones.

Photonics and plasmonics
One path is a transition from electronics to photonics. Using light instead of electricity for signalling has numerous advantages&mdash;for example in photonic clocking or for the Fast Fourier Transform. Metamaterials can provide lenses that allow working at sub-wavelength dimensions. Plasmonics might also be a candidate for this role instead of light, but it is at an early stage of development compared to photonics:
 * An article with an introduction about plasmonics.
 * The Silicon Solution - lasers on boards using current fabrication technology

Reconfigurable Chips
Reconfigurable computing will become the norm. General-purpose circuits that can be configured for digital signal processing, data encryption/decryption, or other hardware roles will be reconfigured on the fly to ensure that circuitry is not lying idle.
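
The core idea can be sketched with the basic building block of today's FPGAs: a lookup table whose truth table is its configuration. This is a minimal, illustrative Python model (the class and method names are invented for the sketch), showing how the same "hardware" takes on different roles by rewriting its configuration:

```python
class LUT2:
    """A 2-input configurable logic element: the 4-entry truth table
    IS the configuration, so one circuit can play many roles."""

    def __init__(self, table):
        # table[i] is the output for inputs (a, b) with i = (a << 1) | b
        self.table = list(table)

    def reconfigure(self, table):
        # "On the fly" role change: only the configuration memory is rewritten.
        self.table = list(table)

    def __call__(self, a, b):
        return self.table[(a << 1) | b]

lut = LUT2([0, 0, 0, 1])       # configured as AND
print(lut(1, 1))               # → 1
lut.reconfigure([0, 1, 1, 0])  # same element, now XOR
print(lut(1, 0))               # → 1
```

A real reconfigurable chip wires thousands of such elements together through a programmable interconnect, which is exactly what the crossbar in the next paragraph provides.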

A step in this direction will be the FPNI&mdash;a technology developed by HP&mdash;which combines CMOS gates with a programmable crossbar on top. The switches of the crossbar are based on new molecular electronics, and the crossbar provides a programmable interconnection of the gates, as in an FPGA. This technique might be introduced between 2010 and 2019 and then become standard.

Processors will also evolve toward asynchronous, clockless parallel processing. This advance in hardware architecture will lead to greater effective use of parallelism, but it needs very complex software-to-hardware compilers, which will be developed as well.

Stacking
Computing systems will gradually move to ultra-dense 3D designs, realized by through-chip interconnects: chips will be stacked on one another and connected vertically. This allows VLSI design algorithms to route more efficiently during chip design, so wiring is reduced and more transistors can be placed. Stacking memory and CPU also shortens the wires, which reduces latency and increases data throughput at the same time.

SOC Design
Another step in that direction is the system-on-package (SOP) approach, where chips, wires, antennas, switches, sensors, MEMS devices, etc. are embedded in thin layers of a single board. SOP boards easily combine digital, analog, and optical elements&mdash;built with a variety of technologies&mdash;while each element is specialized for a specific function. This will allow ultra-compact mobile devices. See also: System-on-a-chip.

Spin-based Transistors
Spin-based transistors will replace charge-based transistors. This will be made possible by the development of better shielding and error correction, and by advances in the use of nanotube-based transistors. Spin-based transistors will provide higher clock rates because electrons do not need to be pumped in and out of the transistor's barrier layer.

RAM
The limited abilities of today's computer memories will be looked back on fondly from a perspective where all fast-access memory is expected to have significant parallel searching and matching power, achieved by combining RAM with a SIMD engine. Such RAM can perform simple operations on large amounts of data by itself, without loading the data into the CPU, which makes it more efficient.
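
The searching-and-matching idea can be sketched in a few lines of Python (a toy model; the class and method names are invented for illustration). Conceptually, every memory cell compares itself to the query in parallel, and only the matching addresses leave the chip:

```python
class SmartRAM:
    """Toy model of memory with a built-in SIMD match engine."""

    def __init__(self, words):
        self.words = list(words)

    def match(self, value):
        """In-memory compare-all: conceptually all cells compare
        themselves to `value` at once; only hit addresses are
        returned, instead of streaming every word through the CPU."""
        return [addr for addr, w in enumerate(self.words) if w == value]

ram = SmartRAM([7, 3, 7, 1, 7])
print(ram.match(7))  # → [0, 2, 4]
```

The Python loop is of course sequential; in the envisioned hardware the comparison happens in every bank simultaneously, so the time to search does not grow with the amount of memory.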

Memory will also become either very fast&mdash;by basing it on spin-based transistors rather than on electric charge, as in cache memory&mdash;or non-volatile, as in magnetoresistive RAM (MRAM) and ferroelectric RAM (FeRAM) designs, or even both.

Such transistors use magnetic effects in materials at the nanoscale (~100 nm wide). In the future they could be scaled down to individual electrons.

For lower-speed applications, as in flash memory, phase-change RAM could be used instead of MRAM.

Bioengineering
Layers of bacteria and/or viruses incorporating genetically engineered plasmids will be grown onto substrates, with apoptosis being used to create the desired pattern and spacing. As a result, layers will be 'self-aligning'.

Initially this will be used for regular repeating patterns such as those used in batteries, but more complex designs will become possible over time&mdash;eventually even transistors and other electronic devices.

Lasers will also be used to fix computers.

Quantum and DNA Computing
Two computing technologies that might not be well suited for general-purpose use are quantum computing and DNA computing.

Quantum computing
Quantum computing will be useful for very special purposes like cryptography and search. Possible ways to realize such a machine include single registers based on an ion trap or quantum dots, or a complete quantum cellular automaton.

Quantum cellular automata perform logic operations starting with a pulsed field as input, which alters the orientation of adjacent cells to create a cascade effect across the array. The drawbacks of this construction are comparatively low speed, large dimensions, and high demands on manufacturing quality, so other designs might be used instead.

A 16-qubit computer has been announced and is claimed to give a quadratic speed-up on NP-complete problems (rather than an exponential one). A 1000-qubit computer is planned for 2008 by the same company.
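
For search, the best-known quadratic speed-up is Grover's algorithm: an unstructured search over N entries needs about N queries classically but only on the order of &radic;N quantum oracle queries (roughly &pi;/4 &middot; &radic;N). A small Python sketch of the query counts, with the function names invented for illustration:

```python
import math

def classical_queries(n):
    # Worst case for unstructured search: examine every entry.
    return n

def grover_queries(n):
    # Grover's algorithm needs about (pi/4) * sqrt(n) oracle queries.
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (1_000, 1_000_000):
    print(n, classical_queries(n), grover_queries(n))
# For a million entries: ~1,000,000 classical queries vs. 786 quantum ones.
```

Note that a quadratic speed-up shrinks the exponent's base, but NP-complete problems would still take exponential time overall.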

DNA
DNA computing will be very useful in bioengineering and for medical applications, but not in consumer devices. DNA computing will be helpful in diagnosing illnesses and in manipulating lifeforms like viruses or bacteria. This will allow better medicines to be provided and will improve genetic-manipulation processes (which can be very good or bad, depending on the use).

Software
Secure computation means calculating a result without revealing the input data, because all intermediate steps are (quantum-)cryptographically protected.
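
One classical flavor of this is secure multi-party computation via additive secret sharing. A minimal sketch in Python (the modulus and party count are arbitrary choices for the example): each input is split into random shares that individually reveal nothing, yet the shares can be summed to obtain the result.

```python
import random

PRIME = 2_147_483_647  # field modulus for the shares (example choice)

def share(secret, n=3):
    """Split a secret into n additive shares mod PRIME; any n-1
    shares are uniformly random and reveal nothing about the secret."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def secure_sum(secrets):
    """Each 'party' adds up one share of every input; no single
    party ever sees another party's raw secret."""
    all_shares = [share(s) for s in secrets]
    partial = [sum(col) % PRIME for col in zip(*all_shares)]
    return sum(partial) % PRIME  # combining partials reveals only the sum

print(secure_sum([10, 20, 12]))  # → 42
```

Real protocols add cryptographic communication between the parties; the sketch only shows why the intermediate values leak nothing about the inputs.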

Advances in software development will make today's high-level programming more and more productive. Software libraries will evolve much the way electronics has: integrating large numbers of parts into pre-built assemblies. Just as FPGAs are pre-built but flexible electronic parts sold with tools to customise them, software libraries will come with extensive tools to 'wire them up' in the desired configuration, mainly using graphical programming techniques. Libraries will become standardized for use in one's own software projects. Compilers based on genetic algorithms will also be able to optimize programs (if the libraries are open source), so they become smaller and faster.

Dynamic recompiling of software by a runtime will change datatypes if needed: e.g. when an 8-bit integer overflows&mdash;and the software did not expect that&mdash;the software rewrites itself to use a 16-bit integer, and so on. This allows soft-wired processors to use fewer registers by default, so other programs can use those registers. Programming languages will introduce the possibility of defining pre- and postconditions that the compiler and/or runtime has to check. Programming languages will integrate unit testing and will use (mathematical) theorems for automatic error checking.

Computing in science
According to the "Towards 2020 Science" report, "computing no longer merely helps scientists with their work. Instead, its concepts, tools and theorems have become integrated into the fabric of science itself". This marks a major trend toward the merging of the real and digital worlds. See also Simulation and Virtual reality.

Timeline

 * 2015 - nanoelectronics, according to International Technology Roadmap for Semiconductors (2005).

Links

 * Mid term (2020) future of computing, special issue by Nature (2020)