Wonders of miniature technologies

Dr Mohammed Muniruzzaman

THE PACE WITH WHICH new technologies are being introduced is often breathtaking. What was wishful thinking, or even downright absurd, a few years back is now a reality. The technological march forward often resembles a science fiction film. It took, for instance, about half a century after the first radio station was established for radio to reach an audience of fifty million. Television reached the same number of viewers within fifteen years of the first commercial programmes. The first telegraph transmitted messages through copper wires at less than half a bit per second; today's fibre optic cables can do the same job at 10 billion bits per second. Personal computers (PCs) reached the fifty-million-user landmark less than twenty years after they were first marketed. Even more astounding is the fact that in less than five years the World Wide Web (WWW) had fifty million users. The number of Internet users now doubles almost every six months. Equally remarkable are achievements in the biological sciences. It took hundreds of scientists decades to decode the first genome. Now scientists can sequence millions of nucleotides in a matter of hours.

Technologies such as nanotechnology, biotechnology, computers and neural networks, sensors and robotics are expected to dominate research in the first quarter of the current century. In the race to control and eventually distribute these technologies, the United States, Japan and the European countries are again at the forefront.

Nanotechnology
In 1959, Professor Richard Feynman delivered one of his most famous lectures, entitled There's Plenty of Room at the Bottom, to a packed audience at Caltech. The brilliant Nobel Prize-winning physicist is more famous for his creation of quantum electrodynamics than for the new physics at the atomic or nanometre (one-billionth of a metre) scale of which he spoke. The lecture drew many sceptics at the time. However, Feynman believed strongly in the possibility of miniaturisation because biology is full of such examples. "The fact that enormous amounts of information can be carried in an exceedingly small space is well known to the biologists...and resolves the mystery how, in the tiniest cell, all of the information for the organisation of a complex creature such as ourselves can be stored." Furthermore, human cells are active manufacturers of various substances, storing and exchanging information at a minute scale. Forty years and hundreds of research papers later, what do we see today? Higher densities of information on computers, micro-electromechanical systems (MEMS), designer materials, quantum computers, molecular self-assembly and the biological techniques for controlling and manipulating matter at the atomic scale that Feynman envisaged are on the verge of fruition.

So what is nanotechnology? Definitions vary, but most agree it is 'the construction and utilisation of functional structures and materials with at least one characteristic dimension at the nanometre scale'. Simply put, it is the technique of nanoscale fabrication and the application of nanostructures in actual devices, together with the integration of those nanostructures into complex systems, particularly through molecular self-assembly. Until recently, adherents of nanotechnology could only boast of stacking the letters 'IBM' atom by atom. That is changing, however. Nanotech-aided inkjet printers, hand-held sensors that analyse blood samples instantaneously, and micropumps that deliver therapeutic drugs to specific sites and organs have already been developed. More spectacular still is the use of nanotech to create new pathways in the human nervous system to replace damaged nerves.

A team of experts at Caltech is leading research to induce suspended nanoscale columns of silicon into vibration at gigahertz frequencies, making them possible radio transmitters. Such devices are also expected to find applications in modulating and filtering signals. The speed and stability of these structures might even usher in a new kind of computer in which mechanical levers serve as processing or memory elements. Nanoscale silicon cantilevers are being used as probes in magnetic resonance microscopy, with the aim of achieving atomic-resolution NMR imaging. Other possible devices include high-sensitivity photo-detectors and charge detectors. MEMS are seen as the essential technology for exploring the solar system with ultra-miniaturised, football-sized spacecraft that will integrate micro-gyroscopes, micro-seismometers, micro-spectrometers and micro-propulsion engines. Other novel uses of the technology include acoustic emission sensing, biochemical sensing using molecular-recognition micro-cantilevers, and surgical and scientific micro-instruments. MEMS research and its spin-off devices are expected to lay the foundations of nanotechnology and atomic-scale assembly. A carbon nanotube transistor has been built by researchers at Delft University of Technology in the Netherlands, demonstrating room-temperature carbon-based electronics at the single-molecule scale. The researchers predict that many copies of their nanotube transistors may be integrated into a circuit using molecular self-assembly techniques. Nanostructured organic opto-electronic materials and devices have the potential to revolutionise telecommunications, information processing, displays and transportation, report researchers from the University of Washington.

The proponents of nanotech believe that the 'new technologies will change the world more than any other technological advance, including biotechnology'. According to a UNESCO-sponsored study in 1996, 'Nanotechnology will provide the foundation of all technologies in the new century', adding that its impact could exceed that of the Industrial Revolution 'by 2010 or 2020'. The study goes on to say, 'Nanotechnology is the logical consequence and ultimate destination of our quest for control and manipulation of matter.'

Critics of nanotechnology, however, point out that the vision, as painted by its adherents, sounds like the promises of early nuclear energy, when its advocates predicted unlimited sources of clean energy for all our needs. They point to the negative aspects of nanotechnology: "the core capability, self-replication, requires unmatched diligence to avoid hazards equal to or exceeding those associated with atomic energy. As uplifting as nanotechnology might be for humankind, if not controlled, it could be more devastating than a hundred Hiroshima bombs or a thousand Chernobyl meltdowns".

Biotechnology
The cloning of Dolly the sheep in February 1997 and the presentation of the map of the human genome in June 2000 possibly mark biotechnology's two most important achievements to date. At about the same time, another news item caught everybody's attention: that any living cell could be re-programmed to perform any function in the organism. As an indicator of such re-programming, many of us can recall a recent BBC news feature on a grotesque mouse with a life-size human ear grown on its back. The discovery of reverse DNA sequencing "not only made the cloning of sheep, cows and monkeys doable (and the cloning of a monkey made it scientifically hard to pretend that human beings could not be cloned), it means we can replicate tissues and organs from our own bodies for organ or bone marrow transplant". No sooner had one group of scientists reported transplanting a human chromosome into a rodent (1998-99) than Nature reported the isolation of 'memory genes' by another group and their transplantation into the DNA of rats, thereby enhancing the animals' ability to 'remember'. These events also ushered in the possibility of temperature-tolerant, disease-resistant plant species. The isolation of 'selective traits' and their transfer into separate species is a reality now. The 'switching on' and 'off' of various genes also raises the possibility of 'bringing back to life' lost species from within the species itself.

The media hype surrounding A- and H-bombs and anything nuclear has almost relegated the issue of biological warfare to the background. Robert Taylor, writing in New Scientist on 'bio-terrorism', cautioned "that the weaponisation of bacteria and viruses was not only likely but almost inevitable". On the downside of the Human Genome Project are efforts by medical researchers to develop ethnically targeted viruses. The World Medical Association cautioned against these 'ethno-bombs' and warned of the impending threat to human well-being. Of more immediate concern, however, are seed-targeted 'suicide sequences' in crops. The idea is to switch the 'suicide sequence' in the seed on or off, either directly with chemicals or atmospheric conditions, or through other remote chemically induced devices - sometimes, if necessary, after several generations of planting. The consequences of such devices are overwhelming: the production and non-production of crops would then be in the hands of their creators. These genetically modified seeds could also be made to 'like or dislike' certain chemicals produced by multinational companies. The US effort to wipe out narcotic crops in Colombia by aerial spraying of genetically modified fungi was one such attempt at controlling production. The campaign was unsuccessful, as the Colombian government did not agree to such experimentation. However, how long dollar-starved nations can resist million-dollar offers for such experimentation is another matter. The weaponisation of viruses is not confined to rich nations; it is 'the poor man's nuke'. The advantages of such weapons are obvious: they are easy to manufacture, store and use, and the perpetrator is difficult to trace. Bio-weapons need not be used on humans; they can be equally effective against livestock and grain.

Computers and neural networks

Intel guru Gordon Moore first observed that the number of transistors that can be accommodated on a chip increases exponentially with time. That exponential growth corresponds to a four-fold increase in the number of bits that can be stored on a memory chip roughly every three to four years. The empirical observation that 'the number of transistors on a single integrated circuit chip increases by a factor of four every three years' is generally known as Moore's First Law, and it has held for the thirty or so years since its inception. However, the scaling law is running into difficulties, as the cost of manufacturing chips is increasing faster than the market is expanding. In addition, chip manufacturers are unwilling to invest money in a technology they believe will bring marginal benefits to the industry or to themselves. This has been summed up by Moore himself in his Second Law: "The cost of fabrication facilities (fabs) for manufacturing integrated circuits has been increasing by a factor of two every three years." The other factor threatening Moore's Law is that the key device that has propelled the industry so far, the complementary metal oxide semiconductor (CMOS) field-effect transistor, has almost reached the limit of its performance. Even with improved CMOS design, by 2010 individual transistors in a circuit will be switched on or off by about eight electrons, compared with about a thousand electrons now. Any further improvement will leave barely one electron to switch a transistor. Long before such a stage is reached, scientists will have to contend with quantum mechanical effects. Getting past these is not an engineering problem but a physical limitation.
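The compounding behind the two laws is easy to check. The sketch below (illustrative only; the 1971 base figure of 2,300 transistors is an assumption, not from the article) applies a four-fold transistor increase and a two-fold fab-cost increase every three years:

```python
# Illustrative sketch of Moore's First and Second Laws as stated above.
# Assumption: a nominal base of 2,300 transistors (the first microprocessors);
# the article itself gives only the growth factors, not the base values.

def transistor_count(years, base=2300, factor=4, period=3):
    """Transistor count after `years`, quadrupling every `period` years."""
    return base * factor ** (years / period)

def fab_cost(years, base_cost=1.0, factor=2, period=3):
    """Relative fab cost after `years`, doubling every `period` years."""
    return base_cost * factor ** (years / period)

if __name__ == "__main__":
    for y in (0, 15, 30):
        print(f"year +{y:2d}: ~{transistor_count(y):>13,.0f} transistors, "
              f"fab cost x{fab_cost(y):,.0f}")
```

Over the thirty years the article mentions, the count grows by a factor of 4^10, about a million-fold, while fab cost rises about a thousand-fold; the gap between the two exponents is why the economics, not the physics, bites first.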

What is the maximum number of computations that can be performed by today's non-reversible silicon integrated circuit technology? This is an important question for computer technologists, who are constantly striving to improve performance. Richard Feynman again came to the rescue. Using thermodynamic considerations, he worked out the minimum amount of energy required to transport a bit irreversibly from device to device in a computational system. He showed that it is a function of the Boltzmann constant, the operating temperature of the system, the transmission distance, the operating frequency and the velocity of light. The Boltzmann constant and the velocity of light are of course constants, and the temperature and frequency of the operating system have only marginal influence; the transmission distance, however, plays an important role in all computations. For a maximum information transport distance of 50 nm (nanometres), Feynman's analysis shows that 10^18 bit transfers per second would require one watt of power. This is vastly more computational throughput than is available now. Reaching such heights will require a whole new technological paradigm. Physicists, mathematicians, chemists and computer scientists will have to work together in such areas as nanotechnology, nanofabrication, self-assembly and molecular electronics if they are to develop a new archetype of computer.
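The quoted figure can be sanity-checked with simple arithmetic (a back-of-envelope sketch, not Feynman's full derivation): one watt spread over 10^18 bit transfers per second gives the energy budget per bit, which can be compared with the thermodynamic floor for erasing one bit, k_B T ln 2, at room temperature:

```python
# Back-of-envelope check of the 1 W at 10^18 bits/s figure quoted above.
import math

power_w = 1.0                        # total power budget, watts
rate_bps = 1e18                      # bit transfers per second
energy_per_bit = power_w / rate_bps  # joules available per bit transfer

# Thermodynamic (Landauer) floor for erasing one bit at room temperature.
k_B = 1.380649e-23                   # Boltzmann constant, J/K
T = 300.0                            # operating temperature, kelvin
landauer = k_B * T * math.log(2)

print(f"energy per bit : {energy_per_bit:.1e} J")
print(f"Landauer floor : {landauer:.1e} J")
print(f"headroom factor: {energy_per_bit / landauer:.0f}")
```

The budget works out to 10^-18 joules per bit, a few hundred times above the ~3 x 10^-21 J thermodynamic floor, which is why Feynman's figure sits comfortably on the right side of the physics even though it is far beyond today's engineering.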

I have mentioned molecular electronics - an interdisciplinary field that unites computer science and molecular biology. Within organic molecular electronics, DNA computing is an emerging discipline. Scientists create fragments of DNA whose letters represent computer data and instructions; these are then used to solve various problems. It is also possible to use DNA to construct massive neural networks - computers modelled after the human nervous system - with a connectivity of one trillion synapses, roughly one-hundredth the capacity of a human brain. One idea put forward to achieve molecular computing is to use pairs of complementary DNA strands for parallel selective operations.
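The complementary-strand idea can be illustrated in miniature. The toy sketch below (hypothetical code, not a real DNA computer) uses Watson-Crick base pairing (A-T, C-G) as a selection rule: a strand is 'selected' from the pool when the pool contains the complement of a probe, mimicking the massively parallel hybridisation step a DNA computer relies on:

```python
# Toy sketch of DNA-style parallel selection via Watson-Crick complements.
# Hypothetical illustration only: real DNA computing performs this step
# chemically, on trillions of strands at once.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def complement(strand: str) -> str:
    """Watson-Crick complement, read in the conventional 5'->3' direction."""
    return strand.translate(COMPLEMENT)[::-1]

def select(pool, probes):
    """Keep every strand in `pool` that would hybridise with some probe."""
    targets = {complement(p) for p in probes}
    return [s for s in pool if s in targets]

pool = ["ACGTAC", "TTGGCA", "GATTAC"]   # candidate 'answers'
probes = ["GTACGT"]                      # complement of ACGTAC
print(select(pool, probes))              # -> ['ACGTAC']
```

In an actual DNA computer the pool holds every candidate solution simultaneously, so this single 'select' corresponds to testing them all in parallel, which is the source of the enormous connectivity figures quoted above.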

Sensors and robotics

Combinations of nanotechnology and biotechnology have led to the development of a complex and powerful class of devices under the general category of sensors. These are devices capable of detecting and transmitting signals: sound, smell, chemical composition and pressure changes. Sensors have already found application in agriculture. Fields are to be sprayed with biosensors capable of detecting and transmitting information on the chemical and physical condition of the soil; information picked up from these biosensors by low-flying aircraft or satellites could then be analysed for future land use. Biosensors have also found applications in defence establishments. Researchers have, for instance, developed genetically altered bacteria that glow when they feed on certain chemicals (trinitrotoluene, TNT) that ooze out of many land mines.

Scientists are also experimenting with the idea of inserting microprocessors and micro-cameras into live cockroaches with the objective of seeking out earthquake victims and inspecting nuclear power plants. Industry has long predicted that robots will take over most manufacturing tasks from manual labour. That has not quite happened yet, although many repetitive chores, particularly in industry, are now performed by robots. Linked with neural networks and biosensors, robots could be made to perform with a certain level of intelligence. Micro-robots that slip behind enemy lines can send back information on munitions, troops and vehicle movements. 'Army ants' - large numbers of identical intelligent robots capable of acting together or independently to take on a wide range of military chores - have also been developed. Re-supplying power to sensors and robots once it is exhausted, and their regular maintenance, remain major problems.

We have now truly entered the 'Time of Small Things'. During the second half of the last century our efforts at understanding small things were aimed primarily at the atom and atomic energy, biotechnology and genetic engineering. The challenges of nanotechnology, biotechnology, quantum computers and neural networks will keep scientists occupied for years to come.

Dr Mohammed Muniruzzaman is Professor, Department of Physics, Jahangirnagar University.

Source: The Daily Star, Dhaka, July 20, 2001
