The first step in establishing a forecast is to look at the near future, i.e. the coming five years. Called "Target 2015", this section is an attempt to do just that: expound on the most basic components of the future so that we may delve deeper into the far future as we gain experience and wisdom.
The Standard Timeline
"A scientific revolution is just beginning. It has the potential to create an era of science-based innovation that could completely eclipse the last half century of technology-based innovation; and with it, a new wave of global social, technological and economic growth."
Scientific revolutions are rare, but history shows they occur when either a fundamentally important new ‘conceptual’ tool (e.g. calculus) or ‘technological’ tool (e.g. the telescope) is invented that leads to the creation of ‘new kinds’ of science. The coming big new tool is computation.
The dream of "invisibility cloaks" is at last becoming possible, and although the process will likely take another decade or two, steps have already been made in that direction. A group of graduate students at Duke University is already working on metamaterials: a special kind of material that causes light to bend in unusual ways, following the contours of the material's structure and emerging on the other side as if the object were not there. Provided the technology proves feasible, preliminary trials will be performed by the end of the decade, and the first economical cloak to be put into use (for hiding obtrusive structures from riverfront areas) will be constructed before 2020.
- Experience will show that the technique works best with flat surfaces. Curved surfaces lead to differences in the length of the light path, which are visible as artefacts. Accordingly, the material will first be adopted in buildings as a replacement for windows, giving people deep inside a large building the illusion of an office in a prime location with excellent views.
Programmable matter is any bulk substance whose physical properties can be adjusted in real time through the application of light, voltage, electric or magnetic fields, etc.
Primitive forms may allow only limited adjustment of one or two traits (e.g., the "photodarkening" or "photochromic" materials found in light-sensitive sunglasses), but there are theoretical forms which, using known principles of electronics, should be capable of emulating a broad range of naturally occurring materials, or of exhibiting unnatural properties which cannot be produced by other means.
Silicon transistors will eventually be replaced with a better technology.
Possible candidates are
- molecular electronics or nanoelectronics (advanced chemistry)
- graphene electronics (wave effects of electrons are used) 
- nano-mechanical computers (rods and gears)
- photonics (leveraging optics technologies used in communications already)
- magnetic chips 
Nano-mechanical computers are in a sense a return to the early history of computing (Babbage, etc.), but realised on a much smaller scale. Millipede, for example, stores data by punching tiny pits into a polymer film with an atomic-force probe, and refilling them to erase.
One path is a transition from electronics to photonics. Using light instead of electricity for signalling has numerous advantages, for example, in photonic clocking.
Plasmonics and silicon photonics (lasers on boards, built with current fabrication technology) are related directions. Magnetic chips may soon use nanomagnets (about 100 nm wide) assembled into arrays that can perform logic or store information. In the future they could be scaled down to individual electrons, operating as magnetic quantum cellular automata. A logic operation starts with a pulsed magnetic field applied to the input magnet, which alters the orientation of its magnetic field and creates a cascade effect across the array.
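The cascade logic described above can be illustrated with a toy model. This is a sketch under simplifying assumptions, not a physical simulation: each nanomagnet is reduced to a binary orientation (+1/-1), an antiferromagnetically coupled chain inverts the signal at each step, and a three-input junction settles into the majority orientation. The function names are illustrative, not from any real toolkit.

```python
# Toy model of magnetic quantum cellular automata (MQCA) logic.
# Each "magnet" is a binary orientation: +1 or -1.

def invert_chain(value, length):
    """Antiferromagnetic chain: each magnet anti-aligns with its
    neighbour, so a chain of odd length inverts the signal."""
    return value * (-1) ** length

def majority(a, b, c):
    """Majority junction: the central magnet aligns with the
    majority of its three input neighbours."""
    return 1 if a + b + c > 0 else -1

# Pinning one input of a majority junction turns it into AND or OR,
# which is how MQCA arrays are proposed to compute.
def AND(a, b):
    return majority(a, b, -1)

def OR(a, b):
    return majority(a, b, 1)

print(AND(1, 1), AND(1, -1))   # 1 -1
print(invert_chain(1, 3))      # -1 (odd-length chain inverts)
```

Pinning an input rather than adding a distinct gate type is the standard trick in majority-logic schemes: a single physical structure yields AND, OR, and (via chains) NOT.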
Other computing technologies, which may not be well suited for general-purpose use, are quantum computing and DNA computing.
Communications, together with computing, were part of the technological revolution of the late 20th century.
The number of mobile phones in the world is more than 1.5 billion (2005) and growing. It is becoming possible to use them on planes, on trains, in subways. Many cities (such as Philadelphia and San Francisco) are going forward with plans for seamless wireless coverage (and WiMax is making it even simpler/cheaper). Telecoms are doing trial runs of fiber to the home, promising gigabit broadband speeds.
The uptake of IPv6 will make it possible to greatly increase the number of addressable objects (not just PCs, but ever smaller objects, all wirelessly connected).
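The scale of that increase is easy to quantify with Python's standard `ipaddress` module; a minimal sketch:

```python
import ipaddress

# IPv4 offers 2**32 addresses; IPv6 expands this to 2**128,
# enough to give every small networked object its own address.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
ipv6_total = ipaddress.ip_network("::/0").num_addresses

print(ipv4_total)  # 4294967296

# Even a single site allocation (a /48 prefix) still contains
# 2**80 addresses, dwarfing the entire IPv4 Internet.
site = ipaddress.ip_network("2001:db8::/48")
print(site.num_addresses == 2 ** 80)  # True
```

The `2001:db8::/48` prefix is the documentation range reserved for examples; any real site prefix has the same 2**80 capacity.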
It all adds up to a future flooded with great, big wired and wireless pipelines of data coming at you from all directions in all locations at all times, a wonderland of entertainment and communications possibilities.
Universal digital ubiquity
All communications (and that means all) are gradually migrating to digital, usually IP-based, protocols. For example, it is already possible to distribute feature films in high-definition digital video to theatres over the Internet (including wireless connections). Digital transmission protocols and digital formats mean that, in principle, all content can be accessed via any digital device (barring DRM complications). Eventually, in the not so distant future (around 2015-2020), it will be possible for anyone to instantly access any piece of information from anywhere. The "rights" issues will be resolved one way (micropayments) or another (freedom of information).
The information and other content will become much more interactive (currently the field is called Web 2.0, etc.), with the user gaining the capabilities for:
- easy categorization and filtering (today: RSS, tags, Semantic web)
- editing and collaborative editing (today: wiki)
- automated processing and contextual actions (XML formats, Liquid Information project)
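The categorization-and-filtering capability in the list above already exists in embryonic form in RSS, where `<category>` elements act as tags. A minimal sketch using only Python's standard XML parser (the feed content here is invented for illustration):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 fragment; <category> elements serve as tags.
FEED = """<rss version="2.0"><channel>
  <title>Example feed</title>
  <item><title>Wiki editing tips</title><category>wiki</category></item>
  <item><title>Tagging basics</title><category>tags</category></item>
  <item><title>Collaborative writing</title><category>wiki</category></item>
</channel></rss>"""

def items_tagged(feed_xml, tag):
    """Return titles of feed items carrying the given category tag."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if tag in (c.text for c in item.findall("category"))]

print(items_tagged(FEED, "wiki"))
# ['Wiki editing tips', 'Collaborative writing']
```

Real aggregators layer persistence, deduplication and subscription management on top, but the core filtering step is just this kind of tag match.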
- Global broadband fibre based network
- Audio transmission at 2.4 kbit/s with quality equal to analogue telephony
- Multiple channels of >100 Gbit/s on a single fibre
- Tb/s on optical fibres over long distances
- Light detection sensitivity exceeding shot noise limit
- Network will still be a significant bottleneck for some services
- New mechanisms for communication discovered
- Laser interferometer detection of gravitational waves
All predictions assume sufficient research funding.
2011 (in 2 years):
- Develop multiple applications for skin, cartilage, bone, blood vessel, and some urological products (33)
- Develop insurance reimbursable regenerative therapies
- Establish standards for FDA regenerative medicine therapy product approvals
- Solve cell sourcing issues, giving researchers access to the materials they need to design new therapies
- Establish cost-effective means of production, paving the way for future products
- Establish specialized cell banks for tissue storage, allowing storage of viable “off the shelf” products
2016 (in 7 years) - effective regenerative medicine therapies will be available for patient care and industrial research and development purposes:
- Further understand stem cell and progenitor cell biology
- Engineer smart degradable biocompatible scaffolding
- Develop microfabrication and nanofabrication technologies to produce tissues with their own complete vascular circulation (34)
- Develop complex organ patches that could repair damaged pieces of the heart or other organs (35)
Energy is vital to powering our technological civilization. Furthermore, while in the past humans needed energy "sporadically", to power various machines or to process a product, today we need an uninterrupted supply of electricity just for our society to function.
In most developed countries blackouts essentially stop life even today. Elevators don't work, cities sink into darkness, shops close because they can't operate the checkouts or process credit card payments. Only automobile transport keeps working, along with those few islands of self-sufficiency that have their own generators, such as hospitals and some factories.
But even these need their fuel constantly replenished, and an interruption of supply would stop them after a few days.
In the future we can expect that some systems will be even more sensitive to power interruptions. Maglev trains, superstructures such as giant skyscrapers, floating and flying cities, and, of course, the global computing and communications infrastructure may need an energy supply that simply doesn't fail.
The energy systems of tomorrow will need to be more reliable and redundant. One approach to achieving this would be to continue to gradually improve all aspects of power generation and power transmission. Another approach is local generation. As technologies improve, local generation of electrical power becomes more economical. Small nuclear reactors could ensure decades of uninterrupted energy supply. Radioisotope thermal generators could provide power to small digital devices and vehicles, just as they power spacecraft today.
Some of the important future technologies for energy transmission and conversion are Hydrogen engines, Fuel cells and Superconductive power lines.
Another problem is the depletion of energy resources. We may have already passed the Hubbert peak, and it is now obvious that certain types of fuel will soon run out or stop being economical.
Renewable energy sources will increasingly be used, including solar, wind, wave, tidal and geothermal power, as well as possibly biodiesel and others. However, all these sources combined (except solar) do not provide nearly sufficient capacity to fulfill all the needs of our technological civilization. According to Ray Kurzweil's book The Singularity Is Near, only 0.3% of the Sun's nearly limitless energy is needed to power today's buildings. These sources will, of course, be used successfully, especially in more favourable areas, just as hydroelectric power is used today.
More promising is nuclear power: it is safe, relatively cheap, and uranium reserves are very large. Another source is coal, which can be burned relatively cleanly with modern technologies.
Eventually thermonuclear fusion will become viable. The construction of ITER will soon commence. Fusion, in its various forms, has vast potential, much bigger than other currently available sources of energy.
Some schemes call for the use of Helium-3 as fusion fuel; it is extremely rare on Earth but available on the Moon. It may become economically feasible to mine it there and use it for energy generation (or transport it to Earth).
One interesting approach to harvesting solar power is to construct power plants in space or on the Moon and beam the energy to Earth using microwaves. This will certainly be feasible, but it is unlikely to supplant fusion.
In the more remote future, human civilization will find ways to tap directly into the energy of the Sun and other stars, use black holes for power generation, and so on.
- Cost of energy will be higher
- Cities and accommodation designed to be more efficient
- Solar cells with efficiency > 30%
- Multi layer solar cells with efficiency > 50%
- Solar becomes competitive with grid electricity 
- Large area amorphous solar cells with efficiency > 20%
- Conversion/storage of solar energy as biochemical energy
- Common use of solar cells for residential power supply
- Space solar power stations
- Water decomposition by sunlight
In the public mind, nuclear power went from a supposedly clean and perfect energy source to the demon of nuclear war, Chernobyl and three-eyed fish. Despite the rational fear of nuclear power and the opposition of "environmental" groups, nuclear power is widely used and in many countries is the largest source of energy. Some countries, such as the USA, have stopped the construction of new reactors, while others have not.
But there are few rational reasons today for using nuclear power. Safe nuclear power is not feasible right now: pebble bed reactors, for example, can melt down, produce waste that cannot be safely managed, and still emit CO2 over their complete lifecycle.
In the 2000s small, low-maintenance nuclear reactors became possible. For example, Toshiba has developed the 4S reactor ("super-safe, small and simple"), a 10-megawatt unit that can be installed underground and provide uninterrupted energy for 30 years before refueling. The first such reactors are expected to be put into operation before 2010.
Solar power describes a number of methods of harnessing energy from the light of the sun. It has been present in many traditional building methods for centuries, but has become of increasing interest in developed countries as the environmental costs and limited supply of other power sources such as fossil fuels are realized. It is already in widespread use where other supplies of power are absent such as in remote locations and in space.
Two main approaches to using solar power are direct (solar or photovoltaic cells) and indirect (letting the sun heat air or a heat-transfer fluid and using turbines for generation).
The energy generated by solar power can be stored in batteries, or in some other form such as hydrogen, flywheels in vacuum, or superconductors. Another way is to export excess power to the power grid.
Photovoltaic cells are devices or banks of devices that use the photoelectric effect of semiconductors to generate electricity directly from sunlight. Because their manufacturing costs remained high during the twentieth century, their use was limited. The most important use to date has been to power orbiting satellites and other spacecraft. As manufacturing costs decreased in the last decade of the twentieth century, solar power became cost-effective for many remote low-power applications such as roadside emergency telephones, remote sensing, and limited "off grid" home power applications. Their cost has been declining steadily and is now within a factor of 2 of conventional power in many areas.
An important advantage of solar cells is their scalability: capital costs are directly proportional to the required capacity. Two problems with photovoltaic technology are that manufacturing is relatively dirty and that the cells have limited lifetimes.
Solar power plants generally use reflectors to concentrate sunlight onto a heat absorber. Heat-transfer fluid runs through it and is used to power a steam turbine. A potentially promising approach to solar power is the solar chimney (solar tower). In these power plants, air passing under a large (2-30 km wide) glasshouse is heated by the sun and channeled up a convection tower (up to 1 km high), where it drives turbines that generate electricity.
There have been some experiments to harness energy by absorbing sunlight in a chemical reaction in a way similar to photosynthesis without using living organisms, but no practical solar chemical process has yet emerged.
Deployment of solar power
Deployment of solar power depends largely upon local conditions and requirements. But as all industrialised nations share a need for electricity, it is clear that solar power will increasingly be used to supply cheap, reliable electricity.
In some areas of the U.S., for example, in Southern California, which has about 260 sunny days a year, solar electric systems are already competitive with utility systems. When systems are installed at $9/watt, and utility prices are at $0.095/kWh (the current base rate), solar energy brings 9% return on investment.
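The arithmetic behind such return-on-investment claims is worth making explicit. A minimal sketch of the simple (undiscounted) calculation; the annual yield per installed watt used here is an assumed illustrative figure, and the sketch ignores incentives, tiered rates, degradation and financing, which quoted ROI figures typically fold in:

```python
def simple_annual_return(install_cost_per_watt, price_per_kwh,
                         annual_kwh_per_watt):
    """Annual value of generated electricity per installed watt,
    as a fraction of the capital cost. Ignores maintenance,
    degradation, incentives and financing."""
    annual_value = annual_kwh_per_watt * price_per_kwh  # $/watt/year
    return annual_value / install_cost_per_watt

# Illustrative inputs (assumptions, not measurements):
# $9/W installed, $0.095/kWh utility rate, and about 1.8 kWh
# produced per year per installed watt (~20% capacity factor).
roi = simple_annual_return(9.0, 0.095, 1.8)
print(f"{roi:.1%}")  # 1.9%
```

That the bare calculation comes out well below a quoted 9% shows how much headline ROI figures depend on subsidies, net-metering rules and the rate tier being displaced.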
Several experimental photovoltaic (PV) power plants of 300-500 kW capacity are connected to electricity grids in Europe and the USA. Japan has 150 MWe installed. A large solar PV plant is planned for the island of Crete.
Australia's Solar Tower project aims to build a 1 km solar chimney with an estimated capacity of 200 MW. Construction is expected to start in 2006 and will cost $500-750 million. Several similar plants are planned in China.
Research continues into ways to make the actual solar collecting cells less expensive and more efficient. Other major research is investigating economic ways to store the energy which is collected from the sun's rays during the day.
First pioneered by a company in the middle of the Pacific Ocean, this strategy may become more and more effective in eliminating the carbon dioxide excess in the atmosphere. The basic tenets of this approach are as follows.
- Plants can act as a carbon dioxide sink, also known as simply a carbon sink, meaning that they can trap carbon atoms inside themselves. As a result of this activity, the carbon dioxide content in the atmosphere will decrease.
- The plants on land, such as grasses, ferns, and trees, have already covered as much of the land as humanity will allow. Therefore, even the growing of new forests will not help much to alleviate this problem, especially in the long term.
- The plants of the sea, such as plankton and algae, still have significant room to blossom. These can also act as a carbon sink, and because they are in the ocean, their growth will not significantly affect humanity, so this can be conducted effectively.
- The plankton have enough of every other resource to blossom, but cannot because the sea is critically short of one nutrient: iron. This can be replenished, and supplied in excess, by dumping iron filings (known as "seeds") into the ocean.
- When the iron enters the ocean, a blossoming of plankton will occur, leading to the consumption and capture of atmospheric carbon.
This strategy is currently under investigation; however, there is no guarantee that it will be free of side effects.
See Main Article: Scenario: Nomenclature.
The International Union for Pure and Applied Chemistry (IUPAC) will someday adopt a new system of chemical nomenclature that expands upon the one scientists currently use for naming hydrocarbons. This new system will emphasize reducing chemical compounds to only a few basic literal components, thereby removing the need to introduce new terms for the more complicated chemicals, while simultaneously making the drawing of structural formulas of those compounds significantly easier and more understandable for both novices and experts. The new notation could even be fed into computers to have them replicate the described chemicals, and it introduces a streamlined methodology that is far more intuitive and flexible than the current version.
See also SMILES (Simplified Molecular Input Line Entry System), a notation that can accommodate arbitrary compounds, including steroids and other complex ring-structured compounds.
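Part of what makes a linear notation like SMILES machine-friendly is that basic well-formedness can be checked mechanically. A toy sketch of two such checks, using only the standard library: every single-digit ring-closure label must appear an even number of times, and branch parentheses must balance. (A real cheminformatics parser does far more, including valence and aromaticity checks; this only illustrates the idea.)

```python
import re
from collections import Counter

def ring_closures_balanced(smiles):
    """Each single-digit ring-closure label must open and close,
    i.e. appear an even number of times. Bracket atoms like [NH4+]
    are stripped first so their digits are not miscounted; two-digit
    %NN closures are ignored in this toy version."""
    bare = re.sub(r"\[[^]]*\]", "", smiles)
    digits = re.findall(r"(?<!%)\d", bare)
    return all(n % 2 == 0 for n in Counter(digits).values())

def branches_balanced(smiles):
    """Branch parentheses must nest properly."""
    depth = 0
    for ch in smiles:
        depth += (ch == "(") - (ch == ")")
        if depth < 0:
            return False
    return depth == 0

print(ring_closures_balanced("c1ccccc1"))          # benzene -> True
print(branches_balanced("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin -> True
print(ring_closures_balanced("C1CC"))              # unclosed ring -> False
```

The point is structural: because the notation is a flat string with local pairing rules, validation and round-tripping through software are straightforward, which is exactly the property the proposed IUPAC revision would aim for.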
Knowledge management is an approach to solve a host of annoying problems caused by complexity of the world. Doug Engelbart defined it as "broad process of locating, organizing, transferring, and using the information and expertise within an organization". More simply, it's a method for gathering information and making it available to others.
Advanced, high-performance and accessible tools have become available with the advance of computers and communications.
We are going to see huge progress in this field, with significant developments happening constantly. There are no fundamental limits to how easy and efficient knowledge management can be, so improvements will continue at an amazing pace until we achieve near perfection. This final state is easy to describe: the most relevant, needed and useful information and knowledge at your fingertips.
Classic science fiction had difficulties adapting to the knowledge about the future that was suddenly available thanks to transhumanism in the late 20th century. PopSci asked Is Science Fiction About to Go Blind? Transhumanist authors responded to the challenge by adopting the new outlook on the future and embracing supertechnologies.
- Регистр н/ф идей (Registry of Science-Fiction Ideas) — a database of science-fiction ideas, professionally compiled by the inventor of TRIZ (in Russian)
- Inventions and Ideas from Science Fiction in the news — yesterday's sci-fi ideas being implemented today
- Ultimate Science Fiction Web Guide - a large database of sci-fi ideas and topics
- Science fiction themes at Wikipedia - just a list.
- The Sub-Genres of Science Fiction - A Map of Ideas - main topics in SF
- Applied Science Fiction - a list at FutureTAG wiki
- Transhumanism and Science Fiction - WTA leaflet
- Library at www.transhumanism-russia.ru (in Russian)
- A History of the History of the Future - Kuro5hin article
The defining characteristics of the Internet (as it developed between 1995 and 2005) are:
- online communications are part of the daily life of Internet users (including instant messaging)
- remote work is possible
- free access to information
- e-commerce (online purchases)
- Internet is fast and accessible (allowing multimedia content)
- mp3 paved the road to all-digital all-online content
- Open Source software plays an important role
- a large variety of free online services
- development of search engines
- cybercrimes (malware) are widespread
The web started out as mostly static documents (Web 1.0). In the 1990s it became more dynamic, making possible portals, e-commerce and the first online services (Web 1.5).
In the early 2000s, after a decade of history, a transition started from the "old Web" to wikipedia:Web 2.0. This means more active and interactive web applications, more advanced social networking tools and increasing connections between different kinds and sources of data.
The future ideal is the transparent integration of the desktop and the Internet, and transparent, device-independent ubiquitous computing.
wikipedia:Semantic Web is a more advanced direction related to the infrastructure of the Net, with a larger emphasis on machine understanding of pieces of data.
- The present future (kottke.org) - a discussion of the future Web (S/N ratio is rather low, though)
- Online access gradually becomes as vital as oxygen to modern people.
Open source is a model of development where the plans are openly available to others.
First created for software (where it became successful with Linux), the model is gradually expanding: to text (e.g. Wikipedia), but also to some hardware and even soft drinks.
It is being used to some extent among custom manufacturing enthusiasts. It will be used in the future in biotech and nanotech development, where exact descriptions of products/technologies are available by default, because everything is done in simulations and then automatically synthesized.
We don't have to remember what we can look up on the Internet. We can take thousands of photos easily, so a lot of our memories are "backed up" by pictures. We no longer just remember the trip to some place (as our parents did), we can have hundreds of images, documenting every significant aspect of it.
Projects are underway to make it possible to record audio, images and video and organize it all for easy retrieval.
Current hardware makes it possible to record and store a constant video feed of a human life. Steve Mann does this already.
The information can then be organized on a timeline, geographically (using GPS data), or by context (automatically determined from time and location, such as "in a restaurant" or "at work"), with additional information such as keywords, folders and comments.
Some information can be extracted from the images (using image processing) and audio (using speech recognition). Some rudimentary algorithms already exist for searching through audiovisual data.
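The organization scheme described above (timeline, GPS, derived context, keywords) can be sketched as a small data model. Everything here is an illustrative assumption, not any real product's schema: a capture record, a naive nearest-known-place context rule, and grouping by context.

```python
from dataclasses import dataclass, field
from datetime import datetime
from collections import defaultdict

@dataclass
class Capture:
    """One lifelog record: a photo, audio clip or video segment."""
    when: datetime
    lat: float
    lon: float
    keywords: list = field(default_factory=list)

def infer_context(capture, known_places):
    """Derive a context label like 'at work' from location.
    known_places maps (lat, lon) -> label; a real system would use
    richer signals (time of day, calendar, recognized scenes)."""
    for (plat, plon), label in known_places.items():
        if abs(capture.lat - plat) < 0.001 and abs(capture.lon - plon) < 0.001:
            return label
    return "out and about"

def group_by_context(captures, known_places):
    """Index captures by derived context for later retrieval."""
    groups = defaultdict(list)
    for c in captures:
        groups[infer_context(c, known_places)].append(c)
    return groups
```

Retrieval then becomes a query over these groups, and the same records can be re-indexed by timeline or keywords without re-capturing anything; that separation of raw capture from derived indexes is the key design choice.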
A number of companies are developing projects related to such digital memories.
The Safe & Cool project uses fabrics with active cooling and internal structure. This signifies a shift from simple materials (just natural or synthetic threads woven together) to complex systems with many layers, active elements, sensors, etc. It is the early beginning of smart matter.
- The Age of Spiritual Machines, Ray Kurzweil