Nanocomputers and Swarm Intelligence: Conclusion




Conclusion

Thanks to the ongoing progress of the last 50 years, an industry that started off producing electronic components has come to create information systems, and society has benefited from them. The transformation of applications has taken place step by step, marked by computer technology whose development was relatively stable and, at the same time, governed by Moore's Law.

With miniaturization, we are currently reaching the limits of these technologies. In the next 10 to 15 years, the era of molecular components will begin. Devices of this type will profoundly change the nature of our computers, and the trend will be pushed even further by developments in the field of silicon. Until now, progress was mainly made on centralized computing units that remained relatively homogenous (e.g. mainframe, PC, server). Analysis of the devices that will be created in the upcoming decade shows that progress will go hand in hand with a large diversity of capillary systems that co-operate with one another in the form of a swarm of intelligent objects.

Even supercomputers, the last bastions of centralized IT built around a monolithic intelligence, are moving towards systems sub-divided into clusters. The computer is transforming into a system. Such systems can no longer be reduced to one specialized machine; they are spread out in our environment and operate in a collective, co-operative mode. Radical changes in IT will lead to a transformation of the architecture of information systems, which will in turn mark the boundaries of future information technology. Information systems will no longer be centralized around a PC and its user, but will take the shape of a swarm of small communicating devices, each of which fulfils its assigned task. This is therefore a complete change, affecting not only the way such a system looks but also the way in which the user perceives it.

Nanocomputers and Swarm Intelligence, Jean-Baptiste Waldner. Copyright © 2008, ISTE Ltd.


Supercomputers have increased their performance by combining progress in the field of processors (silicon technology) with an increase in the number of processors that co-operate within one computer (system architecture). The first supercomputers were monoprocessor machines, roughly 10 times faster than the other computers of their time. In the 1970s, the majority of supercomputers adopted a vector processor, which decoded an instruction once and applied it to a whole series of operands. At the end of the 1980s, parallel systems were introduced, with thousands of processors used in the same computer. Even today, some parallel supercomputers use RISC microprocessors that were designed for serial computers. Recently, the notion of the supercomputer has given way to the concept of the supersystem, consisting of small distributed computers. RIKEN's MDGrape-3 is the first distributed supersystem of this kind, a cluster of 201 units that can be sub-divided into 40,330 processors. It is the first system to breach the PetaFLOPS limit (10^15 FLOPS).
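The difference between the scalar machines of the first generation and the vector processors of the 1970s can be illustrated in software. The sketch below uses NumPy arrays as a stand-in for hardware vector units; the function names are illustrative, not taken from any particular machine's instruction set.

```python
import numpy as np

# Scalar style: the "multiply" operation is issued once per operand,
# as on the early monoprocessor supercomputers.
def scalar_scale(values, factor):
    out = []
    for v in values:  # one operation issue per element
        out.append(v * factor)
    return out

# Vector style: the operation is decoded once and applied to a whole
# series of operands, as on the vector supercomputers of the 1970s.
def vector_scale(values, factor):
    return np.asarray(values) * factor  # single vectorized operation

data = [1.0, 2.0, 3.0, 4.0]
assert scalar_scale(data, 2.0) == [2.0, 4.0, 6.0, 8.0]
assert vector_scale(data, 2.0).tolist() == [2.0, 4.0, 6.0, 8.0]
```

Both functions compute the same result; the vector form simply amortizes the cost of decoding the instruction over the entire operand series, which is where the speed-up of vector machines came from.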

Over the course of an entire century, the notion of the automaton, then the calculator and later the computer, transformed itself into the universal model of the PC. At the beginning of the 21st century, a reverse trend set in. Computers are once again becoming specialized so as to carry out their tasks more efficiently: their performance is limited to a single task, which allows them to be integrated into the real world. Nanotechnologies allow for further miniaturization and thus contribute to this outcome.

This book would be rather superficial if it only mentioned the development of information systems in the next decade. Why is it important to present so many devices which are currently still emerging or are still being experimented on? We are not summarizing all possible scenarios of the future generation of IT for the sake of it. It is a way of becoming aware of the impact that changes in the field of computers will have on humans who install and manage them.


To ensure better integration of computers, large appliances will be replaced by specialized digital devices. This specialization and integration into the device's environment requires engineers with new skills: a new breed of computer scientists who are less focused on the creative development of programming code, since code will have become universal and written as exchangeable components, and more focused on the integration of the device into its environment. The earlier trend from specialized to universal is now starting to reverse. In the beginning, the aim was to integrate devices and applications into their environment. With the increasing use of computers, the IT professions became more and more abstract, focused on the processing of essentially logical applications. This is the paradigm of plug and play: when connected, systems configure themselves automatically and are immediately available for the tasks considered most important. Integrating devices directly was no longer considered a strategic activity for companies; when necessary, the work was sub-contracted. IT jobs disappeared, just as internal air-conditioning maintenance teams or car fleet maintenance departments did.

RFID and other mobile networks will bring about similar changes. The competencies of higher integration1 are often acquired late or even ignored by managing directors, which leads to a loss of the potential advantage a company would have over its competitors.

This is why companies need to be aware of this tendency and address it when training and hiring. Students and future computer scientists entering the job market are also paying close attention to these new requirements. During the next decade, professions that were entirely forgotten in the developments of the last 20 years will re-emerge.

However, the most difficult task falls to managing directors, directors of the IT department, CIOs (Chief Information Officers) and CTOs (Chief Technical Officers). A company's managing directors need to anticipate future developments and plan the technological infrastructure in order to gain a competitive advantage. Managing IT systems and ensuring that they work correctly has long been the priority of IT department directors, but this task has lost its importance. Running systems and supporting their users are no longer strategic elements of a company; these services are progressively out-sourced to shared service centers, i.e. specialized companies that provide customer services. The maintenance of existing applications is developing in the same direction: applicative and corrective maintenance are out-sourced as they no longer represent a strategic advantage.

1 Competencies of integration do not substitute for the integration services that ensure projects are implemented. However, they are indispensable for the internal definition of needs and for pilot projects. While production may be delegated, this should never be done in the development phase.

No matter how we look at it, until now directors of IT departments have had relatively little influence on the major strategic decisions made within a company. These decisions (e.g. ERP, CRM, e-Business) were mainly taken by the board of directors and carried out by the directors of the specific department concerned, who would then co-operate with the IT department.

With the emergence of diffuse IT, the role of technological managers has become far more strategic2. Given the number and diversity of omnipresent systems, ambient intelligence will produce a quantity of data without equivalent in history. All boards of directors agree that the secret of success lies within these enormous amounts of data.

The pharmaceutical industry, for example, produces gigantic amounts of data3 when a new medication undergoes preclinical and, later, clinical tests. Research and development of a molecule has until now been based on heuristic principles, leading to the commercialization of only 1 out of 5,000 tested molecules. It took 12 years for a molecule to go from laboratory testing to market launch, and some 20% of these companies' turnover is invested in R&D.

A new and far more rational approach to R&D is based on knowledge of the specific chemical reactions of the target organism, allowing the direct design of the desired molecule. This technique will have a strong impact on the technology of the next decade. The approach relies on tools that extract the required information from databases produced by very high-throughput systems, such as genomic applications4.

2 This cannot be compared to the web economy founded at the end of the 1990s, when the foundations of the profession were not yet fully developed.

3 New forms of embedded systems (e.g. labs-on-a-chip, microsensors that can be monitored from a distance) working inside the body, or online (medical imaging), generate the highest amounts of data in the industry.

4 Content-based research is used in very large, unstructured databases. These databases are currently growing so quickly that increases in the performance of traditional IT no longer ensure that the time needed to obtain results remains reasonable. In a reconfigurable parallel architecture, these databases are stored on a high number of hard disks that allow parallel access. The data is filtered at the output of each hard disk so that only a small amount of data enters the calculating process, which reduces the time needed to obtain a result. Content-based research can start with a simple filter before carrying out a complex and time-consuming algorithm. These filters are implemented in hardware in the form of reconfigurable components, which allows them to be adapted to the format of the data and to work in parallel in order to speed up the algorithms.
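The two-stage search described in footnote 4, a cheap filter applied as records stream off disk followed by a costly algorithm on the few survivors, can be sketched in software. The record format, tag filter and scoring function below are illustrative assumptions, not taken from the book or from any real reconfigurable architecture.

```python
# Hypothetical two-stage content-based search: a trivial filter discards
# most records before an expensive matching algorithm runs.

def cheap_filter(record, required_tag):
    # Stage 1: cheap membership test, analogous to the hardware filters
    # placed at the disk output in reconfigurable parallel architectures.
    return required_tag in record["tags"]

def expensive_score(record, query):
    # Stage 2: costly comparison, run only on the surviving records.
    return sum(1 for word in query.split() if word in record["text"])

def content_search(records, required_tag, query, top_n=3):
    candidates = [r for r in records if cheap_filter(r, required_tag)]
    candidates.sort(key=lambda r: expensive_score(r, query), reverse=True)
    return candidates[:top_n]

records = [
    {"id": 1, "tags": {"genomic"}, "text": "protein binding site analysis"},
    {"id": 2, "tags": {"imaging"}, "text": "medical imaging archive"},
    {"id": 3, "tags": {"genomic"}, "text": "target protein interaction"},
]
hits = content_search(records, "genomic", "protein target")
assert [r["id"] for r in hits] == [3, 1]
```

The design point is that the expensive scoring function never sees records rejected by the filter; in the hardware version described in the footnote, that rejection happens in parallel at each disk's output before any data reaches the processors.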


Microsystems based on nanodevices that collect data exactly where the medication should act, i.e. inside the organism, will lead to an exponential increase in the quantity of data.

All data is considered useful and worthy of systematic exploitation. Never before have IT managers had to process such amounts of information. A single piece of information can feed a specific structure and lead to the development of a large variety of devices.

Furthermore, computer scientists are working with programs of limited lifespan (around 12 years). The emergence of nanocomputing and ambient intelligence is predicted for the near future, so strategic decisions need to be based on upcoming technologies.

Developing a medication and bringing it to market requires 8 to 12 years: nearly an entire decade to conclude all phases of research and development, with R&D programs costing several hundred million euros. The process is mainly empirical, based on trial and error. The first step, physiopathology, defines the target that needs to be addressed in order to cure a disease. Once the target is identified, scientists move on to the design, synthesis and finally the testing of the molecule that will interact with the target. Metaphorically speaking, this can be compared to a key and lock: the lock is the target and the medication is the key. The process is subject to one overriding criterion: the target must be addressed effectively without creating negative side effects.

Choosing and validating the molecular target, for diagnostic and therapeutic purposes, could become much easier using cheminformatics5, which allows for rapid virtual screening of structures, predictive molecular toxicology, etc. Furthermore, the integration of bioinformatics and cheminformatics into the process of animal testing would be beneficial.

The deciphering of the human genome, and of the genomes of other organisms, provides scientists and medical researchers with a considerable quantity of data. High-speed IT systems are required to exploit this data for a more rational approach to medical innovation. Information systems with enormous databases and analysis tools at their disposal would allow the most promising molecules to be synthesized and tested in a realistic set-up.

Nearly all other sectors are experiencing the same revolution in the field of IT. Banks and retail, for example, use all kinds of new systems (e.g. chip cards, radio tags, online payment, mobile payment via cell phone, eye-tracking analysis, etc.) that create huge amounts of data about their clients, data which may give the company a competitive advantage. Filtering and processing this information is therefore of great importance. This is a new vision of IT, in which even data with no immediate use is produced.

5 Also known as chemoinformatics or chemical informatics.

Directors of IT departments therefore need to be able to predict future developments. They need to leave behind the well-known territory of monolithic IT and start exploring the areas of capillary technologies, inventing solutions for all sorts of communicating devices. Information will then be sent directly from the place where it is produced (radio tags passing through doorways, digital cardiac pacemakers, geo-localized cell phones, RFID chips, etc.) to its final destination (a screen embedded in a vehicle, an intelligent watch, etc.). These applications will be increasingly distributed and will introduce more and more hybrid functions.

New IT managing directors need to anticipate these developments and use them in their strategies. In other words, they need to create a roadmap that organizes the lifespan of applications in order to suit technological developments as well as the company’s human and financial resources. Projects need to be created with the aim of adding value to the company.

Managing directors dealing with the third generation of information systems do not only have to imagine efficient solutions for an era of IT that is currently changing, but also share their enthusiasm with users, heads of department, clients, suppliers and stakeholders6.

Until now, directors of IT departments were believed to have reached the summit of power and influence within a company. They use techniques that have proven successful in other sectors of the market; their skills lie in stabilizing the company, decreasing risks and adapting systems to requirements by using elementary components of programming code. This was the era of directors concerned with system operations and tactics. We are now entering a new era: that of directors who deal with information strategically. Digital information will be created by all kinds of devices, in quantities that are currently unimaginable. The new breed of IT directors will be characterized by their ability to think abstractly, to anticipate, and to take calculated risks.

6 There are still too many companies in which the board of directors does not take the challenges of technological projects into account. Directors of IT departments must help the managing board to appreciate technology and to predict future developments in the field, so that the board can recognize that the future lies within these technologies and make the necessary resources part of its strategy.