White paper: Safeguarding tomorrow's trustworthy ICT environment – the importance of research in ICT security
We are entering a Networked Society that generates enormous opportunities in business, communication, social interaction, personal entertainment, workforce mobility, sustainability, and general ease-of-life. While security and privacy on today's internet still present significant challenges, our current internet usage would not have been possible without great achievements in ICT security research that provided us with encrypted and authenticated connections, file system access control, firewalls, PKI, anti-virus programs and tamper-resistant hardware. Most of these advances have occurred only in the last decades, in spite of occasional attempts to prevent public research in some areas of security.
THE IMPORTANCE OF RESEARCH IN ICT SECURITY
Every day we benefit from the fruits of groundbreaking research in ICT security.
All mobile phone calls are encrypted, and the authenticity of internet services is
verifiable, to give just two examples. This research has 2,000-year-old roots, but
most public research is only 30 to 40 years old, and public research has not always
been a given. What is the status of ICT security research, and how do
we safeguard tomorrow’s trustworthy ICT environment?
Ericsson White Paper Uen 284 23-3243 | September 2014
Introduction
Primitive forms of secret writing can be traced some 3,500 years back in time. And by 2,000
years ago, people had already taken the first steps toward communication security, with the
development of various codes and ciphers to protect information carried by messengers against
enemy capture. The invention of the telegraph and, subsequently, the radio facilitated
communication but also dramatically increased the risk of eavesdropping. David Kahn, when
discussing how cryptography started to mature into a science during the World Wars, writes
that: “The direct cause of this development was the enormous increase in radio communication”
[1]. In general, most shifts in technology and services have raised new security concerns – for
example, as personal computers became affordable, interconnected and used for
e-commerce. Wireless and computing technologies and internet services are now ubiquitous,
making security everybody’s concern. And the evolution is not going to stop here.
Our society is rapidly evolving into a Networked Society, in which our daily lives involve and
depend on swift information sharing and processing. The pace of change is extraordinary. It took
100 years to connect 1 billion places but only 25 years to connect 5 billion people. By 2020, we
expect there to be more than 50 billion connected devices, including home automation, industrial
control systems, electricity grids and health applications, among many others.
Security research is now – and has been for at least two decades – an integral part of
the making of the technologies that are enabling the Networked Society. Today, it might be taken
for granted how security research enables this transition. However, this is a recent development,
as there were attempts at national security confinement of security research until just 30 years
ago. As recently as 25 years ago, when the world's most widespread security solution was
conceived for the GSM system, utilizing cryptography was still under strict export restrictions.
Also, it will soon become apparent that with everything becoming interconnected in a system-
of-systems, it is no longer viable to address security vertically (for example, per application, per
business), nor as “islands” (for example, per country).
This paper discusses the need to maintain and stimulate research in ICT security, following a
path of transparency and standardization, guided by multi-stakeholder and interdisciplinary
interests.
History and trends
To describe the broad topic of ICT security, we consider three subareas: security for networks,
platforms and services. These areas are partially interdependent and are all necessary to build
a trustworthy system.
NETWORK SECURITY
The origins of network security lie in information
security needs within military applications,
which, with a few exceptions, remained a
closed military research discipline well into the
20th century. The main take-off of public
industry and academic research occurred in
the mid-1970s with the Data Encryption
Standard [2], which became the first mainstream
encryption standard, and with the invention of
public key cryptography [3] [4]. This growth in
research can be attributed to new technology –
computers offering programmable, automated
security processing. More recently,
researchers have also explored using quantum
physics as a basis for information security.
Due to this military background, open
research on network security has at times been
endangered, but is today crucial. Since the late
1980s, public insight into security research – as
well as extensive analysis by academic and
industry experts – has been key in the adoption
of the results of new research as public
standards. Indeed, all development of mobile networks from 3G and beyond has followed this
public-driven principle. As an important part of the standardization process, several independent
academic teams performed security analysis of the 3G and 4G security algorithms.
But not only information is threatened; networks also need protection. Security problems first
occurred as “phreaking” – various attempts to make free phone calls – in the late 1960s. As
networking evolved to interconnect more than telephones, with IP technology evolution leading
to the global internet we have today, academic researchers’ initial assumptions of “friendliness”
eroded and the risk of attacks increased. One particular wake-up call was the Morris worm in
1988. Disabling a considerable part of the internet, this incident stimulated research and also
led to the formation of the first Computer Emergency Response Team.
With the enormous success of mobile telephony, mobility also became a consideration
for the internet, again changing the threat model. More recently, mobility in the form of bring your
own device has put focus on new security challenges. As private devices are brought into an
employer's network domain and connect to it from within, approaches that place network defense
at the perimeter are no longer effective.
Cloud computing closes the circle by accentuating many of the most classic research topics
in information, computer and network security: information security is necessary to protect data
during transfer and while stored in the cloud platform; security is needed for processing; and
access to cloud is intrinsically done over a network (often mobile), encompassing more or less
all network security aspects.
Figure 1: Three areas of ICT security: network security (green), platform security (red) and service security (blue)
PLATFORM SECURITY
Research into platform security was also primarily spawned from a military information security
angle: early computer hardware was expensive, and resources had to be utilized in an effective
manner. Information of multiple sensitivity levels (unrestricted, restricted, secret) needed to be
stored and processed on the same hardware, without air gap separation and by multiple users.
This multilevel security (access control) problem of operating systems from the early 1970s was
therefore one of the first platform security research topics, and the work has inspired today’s de
facto standards for secure operating systems, such as SE Linux and L4.
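To make the multilevel access control idea concrete, the following minimal sketch illustrates the classic "no read up, no write down" rules (as formalized in the Bell-LaPadula model of the early 1970s). The levels and function names are illustrative, not taken from any particular operating system.

```python
from enum import IntEnum

class Level(IntEnum):
    """Illustrative sensitivity levels, ordered from least to most sensitive."""
    UNRESTRICTED = 0
    RESTRICTED = 1
    SECRET = 2

def may_read(subject: Level, obj: Level) -> bool:
    # "No read up": a subject may only read objects at or below its own level.
    return subject >= obj

def may_write(subject: Level, obj: Level) -> bool:
    # "No write down": a subject may only write to objects at or above its own
    # level, so secret data cannot leak into less sensitive objects.
    return subject <= obj

if __name__ == "__main__":
    assert may_read(Level.SECRET, Level.RESTRICTED)         # read down: allowed
    assert not may_read(Level.RESTRICTED, Level.SECRET)     # read up: denied
    assert not may_write(Level.SECRET, Level.UNRESTRICTED)  # write down: denied
```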
Early research studied tamperproof security kernels monitoring the system at runtime. This
work has carried over into current topics, which include hypervisors that provide separation
between different guest systems running on the same hardware, as well as trusted computing
with special hardware modules that maintain a checksum of a system state. The explosion of
malware in the 1990s further accentuated the need for various forms of monitoring.
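The "checksum of a system state" maintained by such hardware modules is, conceptually, a running hash that is extended with a measurement of each loaded component, much as a TPM extends a platform configuration register. The sketch below illustrates only that extend operation; the component contents are illustrative.

```python
import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    """Fold one component's measurement into the running register:
    new_register = H(old_register || H(component))."""
    measurement = hashlib.sha256(component).digest()
    return hashlib.sha256(register + measurement).digest()

# Measure a hypothetical boot chain, component by component.
register = bytes(32)  # measurement registers typically start out all-zero
for blob in [b"bootloader image bytes", b"kernel image bytes", b"application bytes"]:
    register = extend(register, blob)

# The final value matches a stored reference only if every component was
# exactly the expected one, loaded in exactly the expected order.
print(register.hex())
```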
The invention of smart card-based secure processing platforms found use in GSM SIM cards,
as well as in banking and payment solutions in Europe. Since then, the technology has become
widely accepted for realizing highly secure cryptographic processing and key management.
An important research area is security assurance: how to establish confidence that a system
can withstand relevant threats. This area comprises threat analysis, the formalization of
requirements, and testing to ensure that requirements are sufficient and correctly implemented.
A number of national initiatives (for example, the Trusted Computer System Evaluation Criteria
in the US, commonly known as the "Orange Book") eventually led to the Common Criteria, the
international assurance standard. Methodologies for software security and formal verification have
also been extensively studied.
SERVICE SECURITY
In the past, services were bundled vertically with networks (for example, telephony), as was
security. With the decoupling of services and networks, however, services can no longer assume
specific vertical security attributes.
The first widespread services after telephony were other forms of communication services,
such as e-mail. The need for secure (encrypted and authenticated) electronic mail was put forward
as an important use case in the early research papers on public key cryptography in the late
1970s.
What really stimulated networked services was the creation of the World Wide Web in the late
1980s. This has enabled businesses to use the internet as a basis for e-commerce and other
sensitive applications, and thus demanded ways to add security. This led to the development of
protocols, such as the Transport Layer Security protocol in the mid-1990s as a means to secure
web access. The web protocols are now seen also as a platform for person-to-person
communication services with an expectation of user privacy similar to that of old-style telephony.
A fundamental need is to allow users and services to authenticate each other. Studies into public
key infrastructures and biometrics, among other topics, have advanced technical possibilities,
but practical solutions still often depend on password mechanisms with limited security and
usability.
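To illustrate how public key infrastructure is used to authenticate a service in practice today, the hedged sketch below opens a TLS connection with Python's standard ssl module and relies on it to validate the server certificate chain and host name; the host name is only an example.

```python
import socket
import ssl

host = "www.example.com"  # illustrative service name

# The default context loads the system's trusted root certificates and
# enables both certificate validation and host name checking.
context = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        # Reaching this point means the certificate chain was validated
        # against a trusted root and the name matched the certificate.
        print("protocol:", tls_sock.version())
        print("server certificate subject:", tls_sock.getpeercert()["subject"])
```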
Regarding content services over networks, unauthorized access to content soon became an
issue in cable and pay-TV, and simple analog scrambling techniques were introduced. This
developed into full-fledged, standardized digital rights management (DRM) solutions for copyright
enforcement.
Today, users of social networks are starting to face similar copyright issues as their data (photos,
contacts) is stolen and misused, and "personal DRM" is becoming an interesting topic. Indeed,
social networks and other new forms of networked services could be a use case for many research
results from the past that so far have not come into widespread use, such as secure multiparty
computation and electronic voting.
Where are we now?
Security can be applied proactively – either to deter or protect – or reactively – to defend, respond,
or recover – and this is true also for research in security. It is sometimes possible to retrofit
security, but critical security functions cannot be bolted on. Historically, a bias toward reactive
research can be seen, mainly due to a lack of – or gradual invalidation of – initial trust models.
Such trust model invalidation, in turn, is typically caused by one or more of the following factors:
> business and ecosystem change; for example, a “walled garden” business being opened up
to external parties
> technology proliferation; for example, a software defined radio becoming accessible to hackers
and providing them tools to attack radio access networks
> advances in security research itself; for example, an attack is found on a previously presumed
secure encryption algorithm.
Both in research and standardization, practices stimulating awareness and proactiveness are
today more common. Current research on new cryptographic algorithms continuously seeks
new, improved algorithms before existing ones have been weakened through attacks. For example,
within the Internet Engineering Task Force (IETF), all new standard contributions must now declare
security considerations and define a strong security solution. “Clean slate” approaches, aimed
at redefining an internet with built-in security, are interesting but problematic to adopt in practice
due to lack of backward compatibility. Even if we had the freedom of a complete redesign,
it seems unrealistic to expect that we would ever be able to predict all new threats or technology
shifts before they emerge, and reactive research approaches will continue to be common. Still,
it is possible to identify some research areas likely to be (or become) of specific relevance to a
Networked Society.
The research approach
AGENDA
We are currently in the middle of a dramatic revolution in technology and society that is bringing
us toward a Networked Society.
The Networked Society can act as a vehicle by which a person can stay in control of most
aspects of their daily life, and by which an enterprise can conduct most of its business. But this
potential can only be reached if security and privacy remain ensured. In fact, stakeholders should
not be content with just keeping the status quo. Investing proactive effort into some specific
research areas will lead to stronger
assurance and minimize risk of future
negative revolutions in security. Progress in
these areas is also very realistic if the right
approach is taken.
Multi-stakeholder research
Research cooperation between industry
and academia is already well established.
Approximately 240 security research
projects with a total budget of EUR 1.3
billion were launched within the Seventh
Framework Programme of the European
Union (EU FP7). In most of these, industrial
and academic interest and participation is
intertwined. The network assets forming the
spine of the Networked Society will be
critical for individuals, enterprises,
operators, governments and public safety
personnel alike. These stakeholders will
have many common security requirements
but there will also be significant differences,
both quantitatively and qualitatively [5].
Security cannot be put into a standalone “box.” Different stakeholders should join research forces,
try to understand each other’s requirements, and develop a security fabric for the Networked
Society, placing security functions where they are best suited, and meeting the various
organizational security policies, while keeping the common network manageable and useful.
Multidisciplinary approach
In the presence of strong technical security solutions, non-technical issues such as human
behavior and lack of training remain vulnerabilities, as shown by phishing and social engineering.
Security research has benefited from borrowing not only from technical sciences such as
mathematics, but also from behavioral sciences such as psychology and economics. There is much
more to be done in terms of cooperation, and the results of this new research can be expected
to produce user awareness and deeper understanding of security problems, as well as the
development of new technical tools.
Assurance
Having a thought-through security blueprint is the first step – it must also be securely implemented
and managed in order to provide assurance. For design assurance, current standards are often
perceived as cumbersome. For operational assurance, approaches are based on manual audits,
which are neither real time nor continuous. Yet on-demand cloud services need on-demand
assurance.
Figure 2: Keeping the Networked Society ecosystem secure from cyberthreats (drivers: entertainment, social, security and health, productivity and new revenues, sustainability and regulation; enablers: broadband ubiquity, declining cost of connectivity, openness and simplicity)
National regulations are being put in place [6] in the EU, focusing on energy and
transport, and countries such as India are also placing requirements on telecom operators. Since
the Networked Society is global in its nature, we should strive for a worldwide standard baseline,
while allowing countries and businesses to add their own requirements for national or mission
critical infrastructure aspects. Important steps have been and are being taken [7] [8], but we are
still quite far from this vision.
Software security
Our society’s dependence on software is increasing, and reports of serious software security
flaws [9] make headline news. All major ICT vendors are already taking this seriously. The issues
could be even better addressed through more research into how to better integrate advanced
software assurance methodology into the development life cycle processes of the ICT industry.
Energy-efficient security
A common misconception is that adding security is like throwing sand in the gears. The fact is
that encrypting a message drains the battery only by a tiny fraction compared with the radio
transmission. However, computation and connectivity are moving into ever smaller devices, where each
microjoule saved counts. Also, for sustainable growth, we need to study to what extent software
can replace hardware while keeping an acceptable level of security and performance.
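The comparison can be made concrete with a back-of-the-envelope calculation. The per-byte energy figures below are purely illustrative assumptions, not measurements; the point is only the relative magnitude of encryption versus radio transmission.

```python
# Back-of-the-envelope comparison with purely ILLUSTRATIVE per-byte energy
# figures (assumptions for this sketch, not measured values). The point is
# only that radio transmission tends to dominate the energy budget.
RADIO_NJ_PER_BYTE = 2000.0    # assumed energy to transmit one byte over radio
CRYPTO_NJ_PER_BYTE = 20.0     # assumed energy to encrypt one byte

message_bytes = 1024
radio_uj = message_bytes * RADIO_NJ_PER_BYTE / 1000.0
crypto_uj = message_bytes * CRYPTO_NJ_PER_BYTE / 1000.0

print(f"radio transmission: {radio_uj:.0f} microjoules")
print(f"encryption:         {crypto_uj:.0f} microjoules")
print(f"encryption adds {100 * crypto_uj / radio_uj:.0f}% on top of the radio cost")
```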
Formal methods
For high assurance levels, formal methods, including mathematical proofs and protocol
verification, will be important tools. There is a rich theory in this area, but the methods are still
cumbersome to apply. Industrial and academic research cooperation is underway, but mainly
for safety aspects such as industrial control and public transport. The time has come to extend
this to other sectors.
Privacy
This boils down to individuals being in charge of how they choose to interact with other entities
in the Networked Society, and it must be treated as a top concern. Research in this area is
already very active. Some of it has limited practical applicability because regulations make it hard
to deploy the related solutions, and some has so far lacked practical use cases, but this is likely
to change. For instance, secure multiparty
computation, which can be seen as providing “ways to securely interact with strangers,” will be
common in a Networked Society. Existing work should be revisited and extended.
The agenda must be seen against the background of experiences from the past; security research
needs processes that allow us to deal swiftly with new threats that are unimaginable today.
The research process
We have several times emphasized the importance of transparency. In a commercial ICT product
and service organization, there will inevitably exist information that for business reasons needs
to be kept confidential. However, to the extent possible, we believe transparency into the security
research process should be provided. We believe a model similar to that proposed in the 2010
book Strategic Learning by W. Pietersen [10] works well for security research.
An industrial research process is fed by
an inflow of information from an
organization’s business area (for example,
its own business unit or a customer) and
the relevant technology area (for example,
its own product development unit or an
external academic research institute).
These inputs take the form both of
experience from current real-world use,
which can include identified gaps in
existing products or standards, and of
early identification of new emerging threats.
In the first case, the input can be concrete,
in the form of spelled-out "trouble reports."
Here, day-to-day responsibility lies with a
product development organization, and
research resources are brought in only
to tackle more advanced problems. In the
latter case, structured analysis is often
required to be able to distill and predict potential threats based on a new business case or
emerging technology. Thus, what is needed is a mix between basic and applied research, with
the applied part sometimes bordering product development.
The outflow from the process consists of contributions in knowledge, both to the organization itself
and to the security technology area at large, in the form of new concepts, demos and prototypes,
and new or improved standards. The process between inflow and outflow is continuous and
consists of the following three phases:
UNDERSTAND AND INTERACT
What is specific to security research in general, and to the Networked Society in particular,
is the need for diverse and broad insight into a number of different stakeholder concerns and
technology areas. New threats can appear overnight, so keeping abreast of developments in
external research and the ever-changing threat landscape is also a challenge. For senior specialists
and experts in leading research organizations, journals, workshops and conferences are primary
channels for input.
RESEARCH AND INNOVATE
As mentioned, this will be a mix of basic and applied research. There will also be a mix of projects
planned and executed internally within the security research group, and projects formally run by
other parts of the organization to which security researchers contribute. Just as “security
considerations” is a mandatory part of all internet standards, all research projects executed in
the organization should consider whether security competence needs to be added to the project.
Parts of the research activities will be performed with external partners and competitors in
academia and industry.
Figure 3: A process for industrial ICT security research
REALIZE AND COMMUNICATE
As in any other field of industrial research, besides internal dissemination of research results to
product and business units, there will also be external dissemination in workshops and
conferences. Standardization is a well-established communication channel, and open source is
increasing in importance. Industrial organizations are also starting to lead methodological security
analysis work within open source communities [11]. An extremely worthwhile form of dissemination
is through demonstrations and prototypes. Here, a challenge lies in finding good ways to visualize
security features, for example, how to show in a meaningful way that data is encrypted or
authenticated.
Case studies
In this section we discuss the research process within selected concrete security research efforts
in the ICT industry. We have chosen examples that illustrate how shifts in technology, business
and threat models call for new research, and how business and technology considerations enter
into the above research process, leading to explicit design choices. The examples also illustrate
the need for researchers to possess broad technical knowledge beyond the security area as well,
in order to understand the specifics of the service to be secured. These examples are currently
at different levels of maturation.
SECURITY FOR CONVERSATIONAL IP MULTIMEDIA
With the 1999 release of the 3G UMTS mobile standard came an increased desire for IP-based
services, and an initiative to standardize IMS started in 3GPP. IMS was soon expanded to allow
for other, non-cellular access technologies, such as variants of Digital Subscriber Line and
wireless local area network (WLAN). Previous research and development had provided 3G with
a strong access security solution, including 128-bit encryption. But when allowing a mix of
heterogeneous accesses, it became difficult to know whether the security provided by the access
was sufficient. An IMS connection could be set up between a 3G subscriber (enjoying strong
protection) and a user at a public WLAN hotspot (with little or no security). The only viable option
was to place security at the IP layer or above.
The standard protocol stack for conversational multimedia at the time consisted of the Real-
time Transport Protocol (RTP), carried over User Datagram Protocol (UDP) and IP. There was no
complete security solution for RTP or UDP, so the only possible off-the-shelf solution would have
been IP security (IPsec). However, applying IPsec encryption would make it impossible to apply
header compression to RTP and UDP headers (encrypted data cannot be compressed), which
was highly desired for bandwidth efficiency. Thus, a new protocol was needed, applying security
at RTP level, leaving RTP and lower headers accessible for compression.
Due to the foreseen business case of providing IMS over basically any access, the IETF seemed
the most suitable standardization body. The effort to develop the Secure Real-time Transport
Protocol (SRTP) was initiated in late 2000. As it turned out, several actors in the IETF
had similar ideas, and the work became a joint effort.
It was rather obvious that state-of-the-art cryptography should be used, and the then newly
developed Advanced Encryption Standard (AES) became a natural core component. To safeguard
against possible advances in cryptanalysis of the AES, however, the algorithms in SRTP were
made "pluggable." Special consideration also had to be given to how to operate AES. In a wireless
setting, there will be transmission bit errors, which, when combined with an ill-chosen way to
use AES, could create an avalanche effect in bit errors. This could be disastrous, since while
speech encoders are designed to cope with a small number of bit errors, their performance
deteriorates in the presence of multiple errors.
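The mode eventually chosen as SRTP's default encryption transform, AES in counter mode, behaves as a stream cipher: a single flipped ciphertext bit flips only the corresponding plaintext bit instead of garbling a whole block. The sketch below demonstrates that property; it assumes the third-party cryptography package, and the key, counter block and payload are illustrative.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_ctr(key: bytes, counter_block: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt with AES in counter mode (same operation both ways)."""
    ctx = Cipher(algorithms.AES(key), modes.CTR(counter_block)).encryptor()
    return ctx.update(data) + ctx.finalize()

key = bytes(16)            # illustrative all-zero 128-bit key
counter_block = bytes(16)  # illustrative initial counter block
frame = b"example voice frame payload bits"

ciphertext = aes_ctr(key, counter_block, frame)

# Simulate a single radio transmission bit error in the ciphertext.
corrupted = bytearray(ciphertext)
corrupted[5] ^= 0x01
decoded = aes_ctr(key, counter_block, bytes(corrupted))

# Only the corresponding plaintext bit differs; the rest of the frame is
# intact, so the speech decoder can conceal the error.
diff = [i for i, (a, b) in enumerate(zip(frame, decoded)) if a != b]
print("bytes that differ after decryption:", diff)  # -> [5]
```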
The final consideration regarding security had to do with data authentication. While being a
highly desirable security service, data authentication has the drawback of adding overhead to
messages. From a general wireless standpoint, this would probably have been acceptable.
However, certain wireless radio technologies had optimized their radio bearers, which caused
problems. For example, in CDMA2000, the Enhanced Variable Rate Coder/Decoder/Selectable
Mode Vocoder codec data and Interim Standard 95 physical frames matched exactly in size.
Adding authentication data would thus cause “overflow” and speech frames would be dropped.
Thus, authentication was added, but made optional to use.
Any encryption protocol requires key management, but considering the diverse business cases
(for example, that network operators might like to use SIM cards for key management, whereas
enterprises might have other preferences), key management was left outside the scope, though
it was later specified in another IETF standard.
The specification of SRTP was finalized in 2004 [12]. Since then, SRTP has been adopted as
part of several other standards and can by now be considered a mature, commercially successful
research result.
GENERIC BOOTSTRAPPING ARCHITECTURE
3GPP mobile networks make use of a subscriber authentication mechanism known as
Authentication and Key Agreement (AKA). It is based on keys stored in the SIM card and in the
Home Subscriber Server (HSS) in the operator’s network. AKA is tightly integrated with the
cellular access technology, offering efficiency and security. As mobile internet started to take
off, the need to authenticate mobile subscribers toward internet services arose. It was considered
beneficial if an operator’s installed base of SIM cards could somehow be reused. In order to
allow internet services to be agnostic to specifics of the AKA protocol, a front-end to the HSS
was defined, called the Bootstrapping Server Function (BSF). When an internet service wishes
to authenticate the mobile subscriber, it contacts the BSF. The BSF takes care of the AKA-specific
exchanges with the phone/SIM and produces a key, which is transferred to the service. The
service can then use this key to authenticate the subscriber by using basically any authentication
protocol that uses shared keys, such as standard HTTP authentication. The solution became the
Generic Bootstrapping Architecture (GBA) [13].
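The following sketch captures the bootstrapping idea conceptually: the AKA run leaves the phone and the BSF with a shared master key, a service-specific key is derived from it and handed to the service, and the service then verifies an ordinary shared-key response. The key derivation, identifiers and message formats here are illustrative only and are not those specified in the GBA standard.

```python
import hashlib
import hmac

def derive_service_key(master_key: bytes, service_id: str) -> bytes:
    """Illustrative derivation binding the bootstrapped master key to one
    service, so that different services receive different keys."""
    return hmac.new(master_key, service_id.encode(), hashlib.sha256).digest()

# Shared between SIM/phone and BSF after the AKA run (value is illustrative).
master_key = bytes.fromhex("00112233445566778899aabbccddeeff")
service_id = "shop.example.com"  # the internet service (illustrative)

# BSF side: derive the service-specific key and hand it to the service.
key_for_service = derive_service_key(master_key, service_id)

# Phone side: derives the same key locally and answers a challenge with it.
challenge = b"nonce-from-service"
response = hmac.new(derive_service_key(master_key, service_id),
                    challenge, hashlib.sha256).digest()

# Service side: verifies the response with the key received from the BSF,
# without knowing anything about AKA or SIM cards.
expected = hmac.new(key_for_service, challenge, hashlib.sha256).digest()
print("subscriber authenticated:", hmac.compare_digest(response, expected))
```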
Even though GBA hides AKA-specific details from the service, the service still has to "understand"
some details of the GBA to communicate with the BSF. For this reason, further research looked
at how to integrate GBA with the OpenID Internet standard, enabling services to be fully agnostic
about GBA.
The 3GPP standard for GBA has been ready for some years and is used as a building block
of some other standards, including VoLTE, Open IPTV and a few others. Although smartphones
are used extensively to access internet services, GBA has not yet been widely adopted on the
internet at large.
RUNTIME MONITORING OF DEVICES
In the Networked Society, there will be billions of small, connected devices. Many of them will
provide security and safety-critical functionality for our personal life, such as personal health
monitoring and home and car security. Other devices will be deployed in critical infrastructure:
power grids, drinking water control, and traffic monitoring, to give a few examples. Exploited
security bugs in old operating systems present a tangible threat to both privacy and security: a
hacked health monitoring system might fail to raise an alarm for arrhythmia, and hacking of a
drinking water control system could send water unfit for consumption to an entire city. For cost
and sustainability reasons, many of the devices will have a long lifetime, so a major challenge
lies in maintenance and attack mitigations.
A possible solution is over-the-air upgrades, automatically distributed and installed. This is
technically feasible, but for power-constrained devices the amount of data to receive could drain
batteries fast. The alternative of relying on users for maintenance is also problematic. Device
owners may neither understand the importance of upgrading their devices nor have the technical
skills to do so.
One solution – applicable to more statically configured systems – would be to try to prevent
all unauthorized changes to the system memory during runtime. If a runtime monitor, separated
from the main operating system, is provided with detailed knowledge of the memory mapping
and memory content of the main operating system, it can possibly prevent types of attacks that
were not known at the time of the device deployment. To achieve this, a hypervisor would be
used; this is software that operates between the hardware and the main operating system. The
hypervisor works as a lower-layer operating system, scheduling the hardware resources (including
the processor) to the operating system and the monitor, providing controlled separation between
them. The main operating system should not be able to examine or modify the memory of the
monitor, but the monitor needs to be able to examine the main operating system memory.
To trust the hypervisor to provide separation, some form of verification of its functionality is
needed. Many of the existing hypervisors are very rich in functionality and are therefore hard to
completely verify. A small hypervisor that can be formally verified is needed. Previously verified
hypervisors include the seL4 hypervisor from Open Kernel Labs [14], but to provide proper
monitoring, a more fine-grained control over the hypervisor is needed. Recent academic research
[15] has provided the necessary foundation for a formally verified hypervisor with the desired
properties.
The monitor itself must rely on the hypervisor to provide “signals” (traps) when the main
operating system memory changes, for example, when system memory access rights change
from writable to executable. In such an event, the monitor needs to verify the code before allowing it to
become executable. Several techniques can be employed for this, including binary code analysis. For
certain parts of the memory, the monitor could prevent any change, and signal the hypervisor
to not allow write access for the main operating system.
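Conceptually, the monitor's decision on such a trap can be as simple as hashing the affected region and allowing the mapping change only if the hash matches known-good code. The sketch below illustrates that logic; a real monitor would run beside the hypervisor in a low-level language, and all names and values here are illustrative.

```python
import hashlib

# Hashes of code regions the device is allowed to execute (illustrative values;
# in practice provisioned at deployment or delivered with signed upgrades).
APPROVED_CODE_HASHES = {
    hashlib.sha256(b"known-good application code").hexdigest(),
}

def on_write_to_exec_trap(region: bytes) -> bool:
    """Called by the (hypothetical) hypervisor when the main operating system
    tries to turn a writable memory region into an executable one. Returns
    True to allow the change, False to keep the region non-executable."""
    digest = hashlib.sha256(region).hexdigest()
    if digest in APPROVED_CODE_HASHES:
        return True
    # Unknown code: deny execution; policy decides whether to also raise an
    # alarm or reset the device.
    return False

# Example: legitimate code is allowed, injected code is not.
print(on_write_to_exec_trap(b"known-good application code"))  # True
print(on_write_to_exec_trap(b"attacker shellcode"))           # False
```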
Many embedded devices will contain very few applications and a minimal operating system.
The applications will be statically installed before deployment and never change. In such a system,
the monitor could store signatures of parts of the memory to provide a more complete system
protection. A downside is that the system becomes more inflexible with respect to application
upgrades, since new signatures must be securely distributed to the monitor when upgrading.
The problem of attack prevention for embedded connected devices can thus be divided into
at least two parts – one to address the management (upgrading and patching the system), and
one in which the system is locked down and monitored by a hypervisor-separated runtime monitor.
These ideas are ongoing research topics, and the best way forward is still to be discovered.
Summary and conclusions
Today's internet ecosystems would not have been possible without great achievements in ICT security research and
insights into communication security that have ancient roots. Many of the most relevant advances in these areas have
occurred relatively recently, and the current openness into research that is so fundamental for trust and assurance needs
to be maintained.
As we enter the Networked Society, we should further improve multi-stakeholder approaches. Security research works
best when it is proactive, but the research process, in particular in a commercial ICT vendor, by necessity also needs to
be agile and able to adapt to sometimes unpredictable real-world experiences. The Networked Society is also likely to
become the first real practical implementation of certain research areas, such as formal methods and multiparty
computation, which have so far been seen as largely theoretical.
References
1. David Kahn, 1996, The Codebreakers, Scribner
2. NIST, 1977, FIPS 46
3. W. Diffie and M. Hellman, 1976, New Directions in Cryptography, IEEE Transactions on Information Theory 22 (6):
644-654
4. R. Rivest, A. Shamir, and L. Adleman, 1978, A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,
Communications of the ACM 21 (2): 120–126
5. FIRE EU FP7 project, 2013, Industry Sector Research Needs: Trustworthy ICT Research in Europe for ICT Security
Industry, ICT Security Users and Researchers,
http://www.trustworthyictonfire.com/images/documents/deliverables/Industry_Sector_Research_Needs_Report.pdf
6. Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical
infrastructures and the assessment of the need to improve their protection,
https://www.siseministeerium.ee/public/Direktiiv_Euroopa_esmat_htsate_infrastruktuuride_m_ramise_kohta.pdf
7. Common Criteria Recognition Agreement, https://www.commoncriteriaportal.org/ccra/ accessed September 2014
8. Security Assurance Methodology (SECAM) for 3GPP Nodes, January 2014,
http://www.3gpp.org/news-events/3gpp-news/1569-secam_for_3gpp_nodes
9. OpenSSL ‘Heartbleed’ vulnerability (CVE-2014-0160), April 2014, https://www.us-cert.gov/ncas/alerts/TA14-098A
10. W. Pietersen, 2010, Strategic Learning, Wiley
11. Openstack Security/Threat Analysis, https://wiki.openstack.org/wiki/Security/Threat_Analysis
12. M. Baugher et al., 2004, The Secure Real-time Transport Protocol (SRTP), RFC 3711, IETF,
http://www.ietf.org/rfc/rfc3711.txt
13. 3GPP TS 33.220, Generic Authentication Architecture (GAA); Generic Bootstrapping Architecture (GBA),
http://www.3gpp.org/DynaReport/33220.htm, accessed September 2014
14. Open Kernel Labs and NICTA to Deliver Verified Microkernel/Hypervisor Technology to Mobi, August 2009,
http://www.ok-labs.com/releases/release/open-kernel-labs-and-nicta-to-deliver-verified-microkernel-hypervisor-techn
15. Provably Secure Execution Platforms for Embedded Systems (PROSPER), http://prosper.sics.se/ accessed
September 2014
GLOSSARY
AES Advanced Encryption Standard
AKA Authentication and Key Agreement
BSF Bootstrapping Server Function
CDMA2000 Code Division Multiple Access family of 3G standards
DRM digital rights management
EU FP7 Seventh Framework Programme of the European Union
GBA Generic Bootstrapping Architecture
HSS Home Subscriber Server
IETF Internet Engineering Task Force
IPsec Internet Protocol security
RTP Real-time Transport Protocol
SRTP Secure Real-time Transport Protocol
UDP User Datagram Protocol
WLAN wireless local area network
© 2014 Ericsson AB – All rights reserved