
Special Issue on Contracts on Digital Goods and Services

Interoperability in the Digital Economy

  1. Prof. Dr. Wolfgang Kerber
  2. Prof. Dr. Heike Schweitzer

Abstract

Interoperability has become a buzzword in European policy debates on the future of the digital economy. In its Digital Agenda, the EU Commission has identified a lack of interoperability as one of the significant obstacles to a thriving digital economy. The EU Commission and a number of other actors have advocated far-reaching policies for ensuring the interoperability of digital goods, services, platforms and communication networks. In this paper, we present a systematic framework for discussing interoperability problems from an economic and legal perspective and apply it to several interoperability issues, such as standardization, interoperability regulation in the field of electronic communications, duties of dominant firms (including platforms) to ensure horizontal and vertical interoperability, and IP law exceptions in favor of interoperability. The complex trade-offs between the benefits and costs of a higher degree of interoperability suggest the need for a careful and separate analysis of each specific interoperability issue, caution regarding a (top-down) imposition of mandatory standards and interoperability obligations, and a greater focus on unilateral solutions to interoperability problems, such as adapters or converters. Within the framework of Art. 102 TFEU, EU competition law may be better advised to develop a workable test to address hurdles for unilateral interoperability solutions created by dominant firms than to continue focusing on the essential facilities doctrine to mandate interoperability.


1. Introduction*

Interoperability has become a buzzword in European policy debates on the future of the digital economy. In its Digital Agenda, the EU Commission has identified a lack of interoperability as one out of seven [1] “most significant obstacles” to the “virtuous cycle” of digitalization. [2] Effective interoperability between networks, devices, applications, data repositories and services has thus become a major goal of the European Digital Agenda, which aims to stimulate the emergence of “a truly digital society” and to boost innovation and European competitiveness. [3] Significant market players shall be led to pursue interoperability-friendly business policies. [4]

Indeed, in an interconnected economy, the interoperability of a broad variety of networks, devices and services will be key. [5] The expected benefits of the Internet of Things and Industry 4.0 hinge on the interoperability between networks, software and data. Yet, interoperability is a complex concept. Any interoperability policy that intervenes in the market-driven determination of the degree of interoperability will come at a cost. The resulting trade-offs must be taken into account.

In our paper, we shall offer a systematic framework for discussing interoperability and the EU’s interoperability policy, and we will analyze the existing legal framework on this basis. In chapter 2., we introduce the concept of interoperability, provide an overview of its benefits and costs and the ensuing trade-offs, and show that the market determination of interoperability can be subject to serious market failures where the degree of interoperability is determined unilaterally by a dominant firm, or where the market gravitates towards a uniform technical standard with natural monopoly characteristics. In the following chapters (3.-6.), we shall inquire how these insights translate into law and public policy. Both law and public policy have to consider that the need for interoperability may differ depending on the market setting, and that different paths towards interoperability exist, all of which have both advantages and costs. In certain settings public intervention may be justified; however, there should be a clear and strong reason for mandating and/or regulating interoperability.

Firstly, we shall look at standard-setting in this light, analyzing the different variants of standard-setting, with a focus on the EU Commission’s pro-collective standard-setting policy (3.). Electronic communications networks provide an example where mandated interoperability may be justified – based in particular on a public service rationale. This rationale cannot easily be extended to digital platforms, however (4.). Competition law should be cautious in imposing interoperability remedies, in particular when they are based on a vague and potentially over-broad “essential facilities”-doctrine (5.). Instead, law and policy should focus more on protecting market solutions to non-interoperability. On the side of IP law, both the Software Directive [6] and the Trade Secret Directive [7] provide for decompilation exceptions to promote unilateral efforts to ensure interoperability. Competition law may apply where dominant firms try to hamper competitors in their efforts to invent around interoperability impediments. Taken together, these two instruments may be a promising and innovation-friendly alternative to broad public interoperability mandates (6.). Chapter 7. will conclude.

2. Interoperability: Benefits, costs, trade-offs, and market failure

2.1. What is interoperability?

One of the difficulties of the interoperability discussion is the absence of a clear definition of interoperability. Broadly speaking, interoperability denotes the ability of a system, product or service to communicate and function with other (technically different) systems, products or services. Interoperability issues in the digital economy will typically relate to information exchange and data. In this context, Palfrey and Gasser, two leading figures of the interoperability debate, define interoperability as the "ability to transfer and render useful data and other information across systems, applications, or components". [8] The EU Software Copyright Directive [9] and the EU Draft Directive on Digital Goods and Services [10] contain similar, but more context-specific, definitions. Interoperability is thereby a sub-category of the broader, but also vaguer, concept of compatibility, namely the “ability of two or more systems or components to perform their required functions while sharing the same hardware or software environment”. [11] Since it is the communication and exchange between systems, products and services that is key in the digital economy, we shall focus on the concept of "interoperability". [12] The boundaries at which systems connect and exchange information are called "interfaces". [13] Often, interoperability will be based on access to a (technical) standard. [14]

Interoperability can be relevant on different layers; for example, syntactic/technical interoperability refers to the ability of systems to physically connect to each other and exchange data, whereas semantic interoperability refers to the ability of systems to understand the meaning of the information exchanged. [15] Particularly important is the distinction between horizontal and vertical interoperability. Horizontal interoperability denotes the interoperability of competing products, services or platforms. One example is the interconnection between communication networks. [16] Vertical interoperability refers to the interoperability of a product, service or platform with complementary products and services. The degree to which complementary products (e.g., digital goods such as music files or e-books) can be shared across different platforms, and complementary products of one platform can be accessed from rival platforms, is said to characterize the horizontal openness of a platform. The ability of independent firms to offer complementary products on a platform stands for its vertical openness. [17] Both horizontal and vertical interoperability can be a matter of degree. First, technically there can be a continuum between full and no interoperability (with different degrees of partial interoperability, e.g., with regard to the number of functionalities). For example, interoperability issues may arise between different versions of software (upward and/or downward compatibility). Second, achieving interoperability may come at a cost, e.g. the monetary cost of developing adapters and converters, and the inconvenience of applying them. Third, interoperability and openness can be symmetric or asymmetric, e.g., the products of platform A can be used on platform B, but not vice versa. There is, in other words, a wide continuum between no and full interoperability, with many different intermediate designs of partial interoperability between both extremes.
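
The distinction between the syntactic and the semantic layer can be illustrated with a deliberately simple, hypothetical sketch (the services, fields and values are invented for illustration): two services are syntactically interoperable because both exchange well-formed JSON over the same interface, yet interoperability fails at the semantic layer because they attach different meanings to the same field.

    import json

    # Hypothetical illustration: both services exchange the same JSON structure
    # (syntactic/technical interoperability), but they interpret the "price"
    # field differently (a failure of semantic interoperability).

    def export_offer_from_service_a() -> str:
        # Service A encodes prices in euro cents.
        return json.dumps({"product_id": "example-123", "price": 1999})

    def import_offer_into_service_b(message: str) -> float:
        # Service B assumes that prices are expressed in euros.
        offer = json.loads(message)              # parsing succeeds: the syntactic layer works
        return round(offer["price"] * 1.19, 2)   # gross price in euros - wrong by a factor of 100

    if __name__ == "__main__":
        print(import_offer_into_service_b(export_offer_from_service_a()))
        # prints 2378.81 instead of the intended 23.79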

The extent and specific design of the interoperability of a firm’s products, services, and platforms is influenced by both technological decisions and legal constructs. [18] It depends not only on (1) the technological decisions of the firm, but also on (2) its decisions (a) whether to allow interoperability through contractual arrangements with customers and suppliers, (b) whether to disclose the necessary interface information, and (c) whether to tolerate the unilateral development of adapters and converters by other firms. The different forms and degrees of interoperability indicate the complexity of the interoperability issue.

2.2. Benefits and Costs of Interoperability: An Overview

Even among the proponents of greater interoperability, there is a broad consensus that (1) interoperability is not an aim in itself, (2) there are both benefits and costs of interoperability, and (3) due to the ensuing trade-offs, the optimal degree and design of interoperability will be context-specific and will depend on the specific economic and technological conditions in a market.  [19] The following overview shall explain the potential benefits and costs of interoperability in a general way before we come to assess the Commission’s interoperability policy within the context of the existing legal framework.

Uniform standards allow for mass production and a smaller number of product variants. The resulting economies of scale and scope as well as network externalities can bring large cost advantages. Interoperability may also allow for a modularization of components, which can then be used in different (often customized) products. It can reduce costs for consumers (and increase their benefits) if they can more easily combine products from different firms and share them with other consumers across different devices or platforms. Moreover, it can reduce transaction costs through lower information costs about interoperability problems. Interoperability, especially through open standards and open platforms, can boost innovation with regard to complementary products and services – an effect that may be particularly important in the digital economy. Simultaneously, interoperability increases competition with regard to these complementary products and services, which may benefit consumers through lower prices. In addition, interoperability is a precondition for the interconnectedness and free flow of data that is crucial for a data-based economy, and therefore for data-driven innovation. Further advantages of more interoperability include greater choice for consumers, easier access to products and services, and more flexibility for both firms and consumers due to a lower degree of lock-in. [20]

However, more interoperability and the use of uniform standards may also increase costs and risks for both firms and consumers. Most importantly, they can lead to a greater degree of homogeneity. To the extent that uniform standards and interfaces are used, firms’ scope to develop their own specific products and services is limited, because they have to comply with these standards and interoperability requirements. This will limit the scope for innovation and therefore the extent to which specific consumer preferences can be fulfilled by way of product differentiation. [21] Although greater interoperability may lead to more innovation and competition with regard to complementary products, it can also lead to less innovation and competition with regard to the standards and interfaces themselves, which may have the characteristics of natural monopolies (with all their negative consequences). Furthermore, the openness of products and platforms for complementary products can lead to higher risks for consumers, if the complementary products offered by other firms are not monitored closely with a view to their interoperability, quality, and safety. Through a generally higher level of interconnectedness in a digital economy, more interoperability may lead to higher risks regarding reliability, security, and privacy. [22] Considering these (potentially large) costs of interoperability, the policy objective should not be full or maximum interoperability, but rather an optimal degree of interoperability that balances benefits and costs.

2.3. Interoperability and competition: When should we expect market failure?

First and foremost, it is part of the entrepreneurial freedom of firms to decide themselves on the extent of the interoperability of their products and services. Selling products that are interoperable with other products, or offering an open platform that allows for sharing products and services with other platforms, can increase the value for customers and therefore increase profits. In the same way, the use of standardized components in a production value chain can reduce production costs and therefore allow for lower prices. However, firms may want to develop more innovative products and services that require more specific components and services, and/or may believe that the specific quality and features of their service can only be assured if they control the entire value network (including complementary products and services) according to their own specific requirements. A large degree of interoperability and openness to complementary products whose quality and safety they cannot control may then endanger their business model. As a consequence, they may opt for a closed instead of an open system. A good example of such a business model is Apple: with the iOS operating system and the Apple App Store, it has established a closed system, which allows for far-reaching control of all apps that run on the iOS operating system.

For a better understanding, it is useful to introduce the concept of modularity with interfaces and combine it with the distinction between competition between systems and competition within systems. In the (old) example of the automobile industry, it is the car manufacturer who decides on the entire product, which consists of thousands of specific components in the value chain that have to fit and interoperate but are produced by many independent suppliers. In a modularized system, the car manufacturer (or system leader) decides on the interfaces that the component suppliers have to use in order to ensure the smooth interoperability of all car components. Within such a modular system, suppliers can compete and innovate with regard to these modularized components (competition within the system). However, only through competition among car manufacturers is the modular system with its specific interfaces itself subject to competition (competition between systems). Therefore, there are two levels of innovation: innovation within a system at the level of the components (but limited by the requirements of the interfaces); and innovation of the systems themselves (including the interfaces of such a modular system). [23]

On the market, firms compete with different business models and different degrees of interoperability. A number of customers may prefer products and platforms that offer a more closed system of complementary products and services (and which are therefore less interoperable with other systems), even if this may lead to the customers being locked-in to some extent. Other customers will value the flexibility and larger choice of more open systems, even if this is accompanied by higher risks in terms of reliability or safety, and perhaps less convenience. In the same way, the producers of components or complementary products (such as apps) can decide whether they want to develop and produce their products according to general standards or want to be part of a closed system with all its specific rules. Each option will have specific advantages and costs. Competition economists would claim that in markets with effective competition, firms have incentives to decide on the extent and design of interoperability that corresponds to the preferences of their consumers (and of their suppliers and app developers). Therefore, as long as there is effective competition, serious market failures with regard to the extent of interoperability cannot be expected. [24]

The situation is very different if competition does not work well or is even impossible, e.g., due to natural monopoly problems. Two different groups of cases can be distinguished:

Dominant firms: This refers to situations in which a dominant firm already exists that can unilaterally decide on the interoperability of its products. The famous Microsoft case decided by the CFI in 2007 [25] is an apt example. Due to its dominant position on the market for PC operating systems, Microsoft’s decisions regarding the interoperability between its PC operating system and work group server operating systems were not effectively controlled by competition. Similar settings may gain importance in the digital economy because of the strong role of platform markets (search engines, social media etc.) with their strong positive network effects and tipping tendencies towards quasi-monopolies. [26]

Standards as natural monopolies: In this second group of cases, there is no dominant firm at the beginning, but the economic advantages of a (technical) standard and therefore of interoperability are so large that competition between standards is not sustainable. Ultimately, only one single uniform (technical) standard will prevail (natural monopoly). Due to the economic advantages of (monopolistic) technical standards, their collective establishment within the framework of standard-setting organizations (SSOs) has been promoted by public policy for a long time. Important examples in the digital economy are telecommunication standards or the DVD standard, and there have been claims that a data-based economy (such as the Internet of Things) needs new technical standards for ensuring data communication in highly interconnected systems. [27]

While the two settings are different in many respects, law and policy have to address the following two problems in both scenarios:

(1) The situation of market dominance, whether existing at the outset or emerging at the end, raises the danger of monopoly pricing and creates potential foreclosure and/or leveraging options with regard to upstream/downstream markets and complementary products.

(2) There are serious concerns that the market may not be capable of identifying and implementing efficient technical standards in a competitive process. Fragmentation of standards, standard wars, and lock-in into inefficient or outdated standards may result.

In a comprehensive survey article about possible solutions to these interoperability problems through standard-setting, Farrell/Simcoe have distinguished four different "paths to compatibility". [28] (1) Firms compete in the market for setting their own standard as the single uniform standard, which can lead to "standard wars". (2) A dominant firm may have the power to impose a standard on the market. [29] (3) Firms may agree on a new single standard through negotiation, leading to the well-known solution of collective standard-setting (within standard-setting organizations). In the following chapter 3 we will discuss in more detail the problems of standard-setting and the advantages and drawbacks of these three solutions. (4) A very interesting fourth solution to the problem of setting a uniform single standard (with its natural monopoly problems) is the market-driven search for adapters and converters capable of either converting one format into another or at least ensuring that a product can be used on another platform (much like electricity adapters and converters). Where this solution works, it may render the establishment of a single standard unnecessary, because adapters and converters reduce network externalities and "lock-in" problems (by reducing switching costs and allowing more flexibility). Consequently, adapters may enable a sustainable coexistence of different standards, and even beneficial innovation competition between them. Chapter 6 will discuss this alternative path towards interoperability, its problems, and its policy implications.
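
To make the fourth path more concrete, the following minimal sketch shows what an adapter does in software terms; the two message formats, their fields and the function names are invented for illustration and not taken from any real platform. A third party maps one interface onto the other, so that content created for platform A can be processed by platform B without either side adopting a common standard.

    from dataclasses import dataclass

    # Hypothetical adapter/converter sketch: formats "A" and "B" and all field
    # names are invented. Instead of both platforms agreeing on a single
    # standard, an adapter translates between their interfaces.

    @dataclass
    class MessageA:            # format used by (hypothetical) platform A
        sender: str
        body: str

    @dataclass
    class MessageB:            # format expected by (hypothetical) platform B
        author: str
        text: str
        length: int

    def adapt_a_to_b(msg: MessageA) -> MessageB:
        """Convert a platform-A message so that platform B can process it."""
        return MessageB(author=msg.sender, text=msg.body, length=len(msg.body))

    if __name__ == "__main__":
        print(adapt_a_to_b(MessageA(sender="alice", body="hello from platform A")))

The point of the sketch is only that interoperability can be created unilaterally, at the cost of writing and maintaining the mapping; nothing in it requires cooperation by either platform beyond keeping its interface observable.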

3. Interoperability through standardization

Economically, non-interoperability does not necessarily constitute, or result in, a market failure; and interoperability can be achieved in different ways. The EU Commission, however, consistently highlights the importance of interoperability as a core element of its Digital Single Market Strategy. Among the different strategies to achieve interoperability in the ICT sector, collective standard-setting enjoys the Commission’s particular support:

“Standardisation has an essential role to play in increasing interoperability of new technologies within the Digital Single Market. It can help steer the development of new technologies such as 5G wireless communications, digitisation of manufacturing (Industry 4.0) and construction processes, data driven services, cloud services, cybersecurity, e-health, e-transport and mobile payments.”  [30]

Standardisation has accompanied and shaped the evolution of the ICT industry for some time.  [31] Apart from influential industry consortia,  [32] standard-setting organizations (SSOs) have been crucial in developing open standards. The mandated development of the GSM standard by ETSI and its subsequent market roll-out is frequently considered a particular success of European standardization policy.  [33]

The EU Commission's pro-collective standard-setting strategy can be a suitable way of solving standardization problems. However, both theoretical analysis and empirical studies indicate that, when comparing the different modes of standard-setting (competition for standards, decisions by dominant firms, collective standard-setting) and routes for solving interoperability problems (including the development of adapters), collective standard-setting will not always be optimal.

A comparative analysis of the benefits and costs of the different modes of standard-setting has to start with the following effects and problems, which affect the different forms of standard-setting in different ways:

Dynamic / path dependency effects: Interoperability standards are characterized by positive (direct and indirect) network effects: for each firm, the attraction of a given standard grows with the number of other firms and products using it. A “critical mass” of adopters is needed for the standard to survive in the marketplace. Frequently, first-mover advantages will exist, i.e. long-term competitive advantages of an early mover’s standard over those of later entrants. Where standard-setting has not (yet) become a collective endeavor, firms will therefore strive to attract as many adopters as possible as fast as they can – individually or within the framework of joint ventures and strategic alliances. Where one firm (or group of firms) succeeds, the said network effects, combined with first-mover advantages, may induce a “lock-in” of the market into the first successful standard. More efficient standards that are introduced later on may fail. [34] In economics, this phenomenon is well-known as the problem of an inefficient market selection of technologies through dynamic effects (or path dependency effects). [35] However, even where the successful standard was optimal at the time of its introduction, it may become inefficient over time. The lock-in effects can be an important barrier to the replacement of the old standard by newer ones. [36]

However, if none of the firms is capable of securing a large advantage early on, so-called "standard wars" may emerge, in which the competing firms use bundling, low pricing or preemptive strategies to fight rival standards and to achieve a "tipping" of the market in favor of their own standard. On the one hand, such competition may be advantageous because the market will not be locked into one standard early on. The extended period of competition between different standards can lead to the development of better standards. On the other hand, both the parallel experimentation with different standards and the uncertainty about the future standard can lead to wasteful investments and slow down innovation in the market for complementary products and services. [37]
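
The lock-in scenario described above can be illustrated with a deliberately stylized numerical sketch (all parameter values are invented and carry no empirical claim): adopters arrive one after another and choose the standard offering the higher payoff, defined as standalone quality plus a network benefit that grows with the installed base. A technically superior standard that enters only after an incumbent has built up an installed base then never attracts a single adopter; with a smaller installed-base advantage or a larger quality gap, the same sketch tips towards the later, better standard instead.

    # Stylized sketch of lock-in through network effects (all numbers are invented).
    QUALITY = {"A": 1.0, "B": 1.5}    # standard B is assumed to be technically superior
    NETWORK_WEIGHT = 0.1              # strength of the positive network effect
    B_ENTRY = 20                      # adopters arriving before B becomes available

    def simulate(num_adopters: int = 200) -> dict:
        installed = {"A": 0, "B": 0}
        for t in range(num_adopters):
            available = ["A"] if t < B_ENTRY else ["A", "B"]
            payoff = {s: QUALITY[s] + NETWORK_WEIGHT * installed[s] for s in available}
            installed[max(payoff, key=payoff.get)] += 1
        return installed

    if __name__ == "__main__":
        print(simulate())   # {'A': 200, 'B': 0}: the market stays locked into the older standard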

Incentive problems of individual firms: Since the firms that try to introduce a standard (or participate in a process of collective standard-setting) have different strengths and weaknesses with regard to their technological capabilities, their patent portfolios and/or their market positions, their incentives and strategies for choosing and introducing a particular standard can differ significantly. The private incentives for choosing a certain standard may not align with the social benefits. This is all the more true because a firm will usually not be able to internalize all the positive effects that a standard may have for other firms and consumers. The benefits of open interfaces, for example, will accrue to the many other firms that are thus enabled to develop complementary products or services, and, as a consequence, to consumers.  [38] Due to this incentive problem, the market may end up in an equilibrium with too many different standards and isolated proprietary solutions. Such an excessive fragmentation is an important concern in the ongoing debates about standards in the digital economy.  [39] Compared to such fragmentation, even the unilateral setting of a standard by a dominant firm may be preferable, as the dominant firm may be better able to internalize the benefits of such a standard and may therefore have greater incentives for choosing socially efficient standards. At the same time, the dominant firm may have socially inefficient incentives to stifle competition and innovation in markets for complementary products and services (ex-post competition) and to block innovation that may endanger its (long-term) market position.  [40]

Knowledge problems: The development of new technical standards is in itself an innovation process that often takes place in the context of a rapid Schumpeterian technological evolution with disruptive innovations and a high degree of uncertainty (as in the current digital revolution).  [41] Therefore, it is hard or even impossible to reliably predict what the optimal technical standards for the next five or ten years may be, inter alia with a view to facilitating follow-on innovation (of complementary products and services). Both the firms and the state (or regulators) face this knowledge problem. This is why a decentralized bottom-up process that also encompasses a process of parallel experimentation with different new standards may be advantageous for finding better standards, even if, due to a longer period of competition between standards, some of the static advantages of a single standard may be lost. Hence, there may be a Schumpeterian trade-off between the static benefits of a single standard and the dynamic benefits of experimenting with different standards for finding better solutions (competition as a discovery process).  [42] Another implication of the knowledge problem is that it is often not clear whether a single monopolistic standard is the most efficient solution with a view to a specific interoperability problem, or whether two or more different standards may coexist and compete with each other in the market. These knowledge problems have to be taken into account when assessing potential market failures and defining desirable policy solutions.

What conclusions can be drawn from a comparison between the three main ways of standard-setting, with a view to these problems and effects? The said problems – dynamic effects, the critical mass problem and the danger of lock-in into an inefficient standard – may argue against decentralized standard-setting: competition for the standard may turn out to be a lengthy and wasteful process, and result in an inefficient standard in the end. Where a dominant firm imposes a standard, this will come at the risk of distorted incentives for choosing standards that stifle ex-post competition and innovation.  [43] Moreover, the absence of experimentation with different standards may lead to a premature lock-in into an inefficient standard.

Against this backdrop, collective standard-setting in standard-setting organizations (SSOs) may seem to be the preferable solution. Participation in standard-setting organizations is usually voluntary. Apart from that, SSOs can be organized (and therefore also work) very differently. Regardless of the precise procedure, the agreed upon standards will be the result of a negotiation process in which technical experts will typically play a crucial role. This increases the chances of identifying a high-quality standard.  [44]

Yet, SSOs are affected by a number of problems themselves. Due to their specific patent portfolios or market positions, the participating firms will usually have different interests. The need for a consensus solution is no guarantee of finding the best standard. Negotiations can fail or suffer from lengthy delays. During the process, firms are free to exit, possibly trying to impose their own standard unilaterally in the market. [45] Where the search for a collective standard is successful and the standard is adopted by the market, monopoly problems may arise. In an effort to appropriate a significant part of the value of the standard, holders of standard-essential patents (SEPs) may engage – and have, in the past, engaged – in hold-up strategies.

In spite of these well-known problems and the cooperative nature of collective standard-setting, EU competition law has adopted a rather favorable stance towards collective standard-setting. According to the Commission’s Guidelines on the applicability of Art. 101 TFEU to horizontal co-operation agreements, [46] standardization agreements are usually considered to be pro-competitive, as they tend to promote the internal market, encourage the development of new and improved products or markets and ensure interoperability and compatibility to the benefit of consumers (para. 263). Therefore, where:

“participation in standard-setting is unrestricted and the procedure for adopting the standard in question is transparent, standardisation agreements which contain no obligation to comply with the standard and provide access to the standard on fair, reasonable and non-discriminatory terms will normally not restrict competition within the meaning of Article 101(1) [TFEU]”.

The Commission’s recently stressed determination to address the monopoly problem potentially associated with standard-essential patents may be considered a complement to this pro-collective standard-setting strategy. A review of FRAND [47] licensing policies for SEPs shall ensure fair and easy access to the standard [48] and contribute to lower royalty demands. [49] In a legal and economic environment where collective standard-setting is considered key, [50] the Commission wants to reduce the uncertainty that currently exists with regard to who the relevant community of SEP holders is and with regard to the cost of access to the cumulated intellectual property rights (IPRs) needed to implement the standard, and it strives to clarify the methodology applied to calculate the value of the licensing terms and the regime regarding the settlement of disputes. According to the Commission, a “fast, predictable, efficient and globally acceptable licensing approach, which ensures a fair return on investment for SEP holders and fair access to SEPs for all players is needed” (ICT Standardisation Priorities, p. 13). As of now, it is still unclear, however, which direction the Commission’s efforts will take. [51] In the past, the Commission has been willing to use competition law (namely Art. 102 TFEU) to act against exploitative licensing fees for SEPs following a patent ambush. [52] Both the Samsung and the Motorola cases have defined the preconditions under which a request of an SEP holder for an injunction may constitute an abuse of dominance. [53] Apart from these special settings, the framework within which SEP holders commit to license on FRAND terms has been defined (albeit not enforced) by the relevant SSOs. In the future, the EU Commission may consider linking the legal privilege for collective standard-setting in SSOs to the existence and active enforcement of a qualified FRAND policy.

But the EU’s policy with regard to collective standard-setting is not limited to privileging and supporting market-driven cooperative standard-setting endeavors as a “bottom-up” approach. Being concerned that, at least in the ICT sector, standardization is increasingly taking place outside of Europe, potentially undermining European competitiveness, [54] the Commission finds that it cannot be left to industry stakeholders to decide in which areas to develop standards, and at what speed. Rather, the Commission is determined to “define missing technological standards that are essential for supporting the digitisation of our industrial and services sectors” and to actively mandate European standardization bodies for a speedy delivery of standards [55] in order to “ensure that ICT-related standards are set in a way that is more responsive to policy needs” and sufficiently fast. [56] According to the recently published ICT Standardisation Priorities, [57] open European standards for 5G communications, [58] for the IoT, for cybersecurity, big data and cloud computing will be core. In various areas, the new digital economy requires an “open platform approach that supports multiple application domains and cuts across silos”. Open standards shall support the entire value chain and integrate multiple technologies (p. 7). In particular, the Commission is interested in such open platforms and standards in the areas of eHealth, transport systems, including automated vehicles, smart energy and advanced manufacturing (p. 10 et seq.). At the same time, the new standardization processes shall take into account the blurring of the boundaries between traditional sectors and industries, products and services. They shall consider safety needs, data exchange, and privacy concerns simultaneously (p. 3) – aspects that, today, are typically dealt with separately. From this perspective, the Commission’s pro-collective standard-setting approach is not limited to addressing market failures. Rather, what resonates in these communications and statements is that European standard-setting serves as a pro-active trade and industrial policy.

While collective standard-setting certainly is an important route towards interoperability, the mixed experiences do not allow for the conclusion that it is the optimal solution from an economic perspective. Both economic theory and empirical studies suggest that all paths towards interoperability have advantages as well as disadvantages. All strategies can work well under certain circumstances and suffer from serious problems under others. According to Farrell/Simcoe, it may be advisable to allow for the parallel pursuit of, and experimentation with, all four interoperability strategies, instead of heavily relying on just one of them. Even hybrid solutions may evolve in the market place over time.  [59] A cautious, market-friendly approach is all the more expedient in light of the technological revolution that we currently witness in the digital economy. The greater the knowledge problems, the more suitable a more decentralized “bottom-up” search for standards and other interoperability solutions may be. Adapters and converters may play an important role in such a discovery process (see chapter 6). The indubitable merits of a pro-active policy stance towards standardization in the digital economy notwithstanding, there is a risk that in a highly innovative and dynamic digital environment, such a push for speedy, top-down standardization may lock the European industry into premature standards.

4. Interoperability regulation in the field of electronic communications

In some areas, the EU has gone far beyond a voluntary pro-collective-standard-setting approach and has created a legal basis for mandating interoperability within the framework of a regulatory regime. The legal empowerment of national regulatory authorities (NRAs) to mandate access  [60] to or interconnection  [61] between physical electronic communication infrastructures  [62] – and in the future possibly to mandate interoperability even between number-independent interpersonal communications services  [63] – is arguably the best example.

From an economic perspective, such access/interconnection/interoperability requirements may have three different rationales. (1) Communication network operators may be dominant in a relevant market for the access of downstream competitors to the network (or to elements of that network) and may have incentives to act anti-competitively in this market, e.g., by not granting access to (unbundled) non-duplicable elements of their networks which are essential for competitors to offer telecommunication services themselves. Therefore, there may be inefficiently low vertical interoperability (see also section 5). (2) Horizontal interconnection obligations between communication network operators that ensure end-to-end connectivity across networks eliminate the danger that the market may "tip" towards the largest communication network due to network effects. Horizontal interconnection regulation will shift the network effects from the individual network to the level of all interconnected networks and can thereby prevent the emergence of dominant communication networks (with all their potentially problematic effects). (3) In contrast to the first two rationales, which relate to market failure problems with regard to competition, the goal of ensuring end-to-end interconnectivity in electronic communications may also be grounded in a public universal service policy. Such a policy is not based on a pure economic efficiency rationale, but relies heavily on a political decision in favor of society-wide end-to-end connectivity. Beyond distributional reasons, universal service in electronic communications provides for a communication infrastructure that is considered essential for the functioning of the economy, democracy, and the entire society.

The network access and interconnection regime for the electronic communication sector is currently  [64] set out in the Access Directive 2002/19/EC. According to this directive, NRAs may impose access obligations upon network operators based on different legal norms. Art. 5 of the Access Directive allows for the imposition of (vertical or horizontal) access, interconnection and interoperability requirements on electronic communication network operators irrespective of their market power, if necessary to ensure end-to-end connectivity. As the irrelevance of dominance shows, the goal of this norm is not to fight abuses of market power. Rather, it shall promote “efficiency, sustainable competition, and [...] the maximum benefit to end-users” – a justification which points both to the elimination of network effects as a factor of competition between electronic communication networks and to a universal service rationale. Yet, in practice, the German national equivalent to Art. 5 of the Access Directive – § 18(1) TKG – has been of limited relevance so far.  [65]

Art. 8(2) in conjunction with Art. 12(1) of the Access Directive 2002/19/EC has been significantly more relevant. Based on these provisions, NRAs may impose a range of access obligations upon network operators found to possess “significant market power” [66] in a market that the Commission has found to potentially be in need of regulation, and “where the regulatory authority considers that denial of access or unreasonable terms and conditions having a similar effect would hinder the emergence of a sustainable competitive market at the retail level, or would not be in the end-user’s interest”. The duties that may be imposed range from a duty to negotiate in good faith with undertakings requesting access (Art. 12(1) lit. b) and a duty “not to withdraw access to facilities already granted” (Art. 12(1) lit. c) to obligations “to give third parties access to specified network elements and/or facilities, including unbundled access to the local loop” (Art. 12(1) lit. a), to “grant open access to technical interfaces, protocols or other key technologies that are indispensable for the interoperability of services or virtual network elements” (Art. 12(1) lit. e), and “to provide specified services needed to ensure interoperability of end-to-end services to users, including facilities for intelligent network services or roaming on mobile networks” (Art. 12(1) lit. g). Again, the regulatory authority may impose access or interconnection duties to ensure either horizontal or vertical interoperability.

The linkage of these authorizations for intervention to a position of “significant market power”, which is generally understood to be equivalent to market dominance within the meaning of Art. 102 TFEU, suggests a competition law rationale. Where a dominant network operator refuses to grant access to (unbundled) elements of its network which are essential for competitors to offer telecommunication services themselves and which cannot be duplicated, the “essential facilities”-doctrine would suggest an abuse of dominance. It is much less obvious whether an obligation to ensure horizontal interoperability – i.e. interconnection between two networks that are each complete in themselves – could be imposed under competition law. So far, merely relying on network effects that work to a dominant firm’s benefit has not been considered an abuse. [67] Like Art. 5, Art. 8 with Art. 12 of the Access Directive may therefore be informed by the goal of preventing market tipping (see above) – a pro-competitive rationale, but one with no firm basis in competition law. Furthermore, the ex-ante regulatory remedy under Art. 8 with Art. 12 of the Access Directive is limited to electronic communications markets where dominance is particularly entrenched. [68]

Recent debates have revolved around a possible extension of the existing interoperability requirements for electronic communications network operators to (dominant or even non-dominant) number-independent interpersonal communications services providers (e.g. WhatsApp [69]) and to social media platforms (e.g. Facebook [70]). As is the case with electronic communications networks, interoperability between interpersonal communications services providers or social media providers would ensure end-to-end connectivity. In addition, an interoperability requirement would prevent a dominant platform in a market characterized by tipping tendencies from functioning as a “closed community”. Network effects so far working in favor of the dominant platform would then benefit all comparable platforms as well. [71]

It is arguably along this logic that Art. 59(2) lit. c of the Draft European Electronic Communications Code [72] now proposes to introduce a new legal basis for NRA intervention. According to this draft provision, NRAs shall be able to impose:

“in justified cases, obligations on providers of number-independent interpersonal communications services to make their services interoperable, namely where access to emergency services or end-to-end connectivity between end-users is endangered due to a lack of interoperability between interpersonal communications services”.  [73]

The extension of horizontal interoperability regulation from physical infrastructures to interpersonal communications services and digital platforms is, however, not at all obvious. The balance of interests differs significantly. Neither the goal of preventing market tipping nor the universal service rationale is relevant across the board when it comes to digital platforms. Universal service policies strive to ensure a basic service – but not end-to-end connectivity in every possible respect. Interventions into digital platform operators’ freedom to choose between closed and open systems lack justification where end users typically engage in multi-homing and thereby ensure de facto end-to-end connectivity themselves. Similarly, where multi-homing is common, tipping may not be an issue. Even where tipping may be a concern, the imposition of interoperability duties upon digital platforms may imply a significantly more interventionist regime than the interconnection requirement between physical networks. It is therefore important to clearly distinguish between network interconnection and platform interoperability.

Network interconnection is essentially limited to enabling the unhindered transmission of signals across well-defined technical interfaces. There is no need to regulate the resulting forms of communication or services. Physical network operators will normally not be responsible for regulating the content exchanged. Mandating horizontal interoperability between number-independent interpersonal communications services is an entirely different matter. The difficulty starts with determining what exactly interoperability shall mean. Interpersonal communications services operators may allow for the exchange of very different forms of data and content. In such a case, open interfaces may not be enough to ensure end-to-end connectivity. Along which parameters and according to what rules shall users of different services be able to communicate? Which functionalities must be available? Which formats and user interfaces shall be used? Which legal authority will a service provider have over “external” users’ speech? Likely, full horizontal interoperability can only be realized based on a high degree of standardization and/or horizontal cooperation between competitors. The degree of service differentiation will then suffer – a high price to be paid in an innovative, dynamic market setting with frequently changing business models and market boundaries. [74] A harmonization of contractual rules may even be required to make the regime manageable. [75] Such an interoperability regulation is likely to affect investment choices by the dominant service provider and its competitors in potentially complex ways.
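
A hypothetical sketch may illustrate the differentiation problem; the message fields and the notion of a "common schema" are invented for illustration only. Mandated interoperability presupposes agreement on some shared message format, and mapping a feature-rich internal message onto a minimal shared schema strips out exactly the features through which services differentiate themselves.

    # Hypothetical sketch (all field names invented): a feature-rich internal
    # message is reduced to a minimal schema that all services could plausibly
    # agree on, dropping the service-specific features in the process.

    rich_internal_message = {
        "sender": "alice",
        "text": "see you at 8",
        "reactions": ["thumbs_up"],     # service-specific feature
        "ephemeral_seconds": 60,        # service-specific feature
        "edit_history": [],             # service-specific feature
    }

    COMMON_SCHEMA_FIELDS = {"sender", "text"}   # assumed lowest common denominator

    def to_common_schema(message: dict) -> dict:
        """Keep only the fields covered by the assumed common interoperability schema."""
        return {key: value for key, value in message.items() if key in COMMON_SCHEMA_FIELDS}

    if __name__ == "__main__":
        print(to_common_schema(rich_internal_message))
        # {'sender': 'alice', 'text': 'see you at 8'} - the differentiating features are lost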

Given these concerns, a strong justification for mandating horizontal interoperability will be needed. The universal service logic that applies to the interconnection of physical electronic communications networks should not be easily extended to all types of communication services. A severe form of market failure and/or policy need should be clearly identified. Measures less intrusive than the imposition of interoperability must be unavailable. Frequently, a widespread practice of multi-homing or the availability of “adapters”, i.e. of instruments that allow users to overcome interoperability hurdles unilaterally, will provide for an acceptable level of connectivity.

These restrictions on any interoperability requirement are set out only incompletely in the new Art. 59(2) lit. c of the Draft European Electronic Communications Code. The breadth of regulatory necessities implicated by an extended interoperability policy for number-independent interpersonal communications services should caution against the introduction of such a provision, or at least against its future application by NRAs.

5. Horizontal and vertical interoperability in the case of dominant firms

In section 2.3, we showed that we cannot expect market forces to bring about efficient interoperability solutions in the presence of a dominant market player: the market outcome may not properly reflect the trade-offs between the advantages of more interoperability and the advantages of more differentiation that less interoperable and more “closed” systems may allow for. Some degree of market failure with regard to the optimal degrees of horizontal and vertical interoperability may emerge. We shall now inquire how competition law can address the resulting market failures. Is competition law – and in particular Art. 102 TFEU – available where competition fails to control a dominant digital platform’s unilateral “closed” business strategy – with regard to both horizontal and vertical interoperability? [76]

5.1. Horizontal interoperability

Horizontal interconnection / interoperability denotes the ability of horizontally competing networks, services or platforms to interact with one another (see above). As the example of electronic communications networks has shown, interconnection / interoperability requirements can prevent market tipping, since network effects will no longer work in favor of the strongest player alone, but will be market-wide. While this, together with a universal service rationale, has been a justification for the imposition of regulatory duties, the question is whether a refusal to interconnect with a horizontal competitor could qualify as an abuse under Art. 102 TFEU – and consequently justify the imposition of interoperability duties as a competition law remedy. There is, however, only one single precedent – a precedent from US antitrust law – for the imposition of such a duty to cooperate horizontally, namely the Aspen Skiing case. [77] In US law, the Aspen Skiing case is highly controversial [78] and known to lie at the “outer boundary” of antitrust liability. [79] Under EU competition law, the refusal to interconnect horizontally has not yet been found to constitute an abuse. The fact that a dominant firm benefits from network effects does not qualify as an abuse, nor does the risk of market tipping change this legal appraisal. From an economic perspective, a duty to interoperate at the horizontal level would risk replacing competition for innovation and differentiation with mere price competition between homogeneous products and services. It is not for competition law to impose such choices.

Instead of mandating horizontal cooperation, EU law has, in various contexts, opted for an alternative and significantly less intrusive instrument to increase competition: both Art. 20 of the General Data Protection Regulation 2016/679 [80] and Art. 16(4) lit. b of the Draft Directive on Digital Content [81] set out a duty to ensure data portability. [82] Data portability requires some degree of interoperability between different data formats, but does not presuppose full interoperability. While data portability – contrary to interoperability – does not overcome network effects that may work in favor of one particularly prominent platform, it may ease any data-induced lock-in effect. By increasing user mobility, data portability facilitates a coordinated move to superior alternatives. Market barriers to entry are not eliminated, but reduced.
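
In software terms, data portability can be pictured roughly as follows (a hypothetical sketch; the services, schemas and field names are invented): the incumbent service hands the user her data in a structured, machine-readable format, and a competing service maps that export onto its own internal model. Neither service has to interoperate with the other directly, which is precisely why portability is the less intrusive instrument.

    import json

    # Hypothetical data portability sketch (service names and schemas invented).
    def export_user_data(user_id: str) -> str:
        # What the (hypothetical) incumbent service might hand to the user.
        data = {"user_id": user_id,
                "contacts": ["bob", "carol"],
                "posts": [{"date": "2017-01-05", "text": "hello"}]}
        return json.dumps(data)

    def import_into_new_service(exported: str) -> dict:
        # The (hypothetical) competing service translates the export into its own model.
        data = json.loads(exported)
        return {"profile": data["user_id"],
                "friends": data["contacts"],
                "timeline": [post["text"] for post in data["posts"]]}

    if __name__ == "__main__":
        print(import_into_new_service(export_user_data("alice")))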

Similarly, the ability of dominant firms to enter into exclusivity agreements with customers will be subject to significant constraints under Art. 102 TFEU, as such agreements will impose additional switching costs upon customers and thereby reduce competition.

5.2. Vertical interoperability

A dominant firm (or platform) may also have incentives to foreclose competition on adjacent markets by hampering interoperability with third party complementary products and services.  [83] As users will frequently place a premium on interoperability, such conduct may have the potential to leverage market power from the platform market to neighboring markets. In order to protect competition and follow-on-innovation on such adjacent markets, mandating vertical interoperability may be economically justified.  [84] At the same time – as already discussed in the context of network interconnection regulation – mandating access to interfaces or platforms may negatively affect the innovation and investment incentives of the dominant firm at the platform/systems level. Also, there may be valid efficiency justifications for a closely controlled interface (or platform), such as, inter alia, quality, safety, and security concerns.  [85] The economically optimal degree of vertical interoperability will depend on the specific circumstances of a case.

This is the complex economic dilemma underlying the so-called “essential facilities” doctrine. On the basis of this doctrine – well established in EU competition law, but treated with much more skepticism in US antitrust law – a refusal to grant access to interface information was qualified as an abuse of dominance in the Microsoft case. [86] In 2004, the EU Commission ordered Microsoft to make available to its competitors on the work group server market interoperability information regarding the interface with Microsoft’s client PC operating system. Microsoft had freely provided this information to third parties for some time. After entering the work group server market itself, and having gained some experience with this product, it had ceased to do so in 1998, however. [87] Microsoft’s competitors on the work group server market tried to maintain some level of compatibility between their software and Microsoft’s client PC operating system based on re-engineering techniques. Yet, the degree of compatibility – and hence the quality and utility of their work group server software for users – was significantly reduced. In order to compete effectively in the work group server market, competitors needed full access to Microsoft’s interface specifications. According to the Commission, under these circumstances Microsoft’s refusal to disclose the relevant interface information to competitors constituted an abuse of Microsoft’s dominant position on the market for client PC operating systems. The protection of the relevant interface information by alleged IPRs did not justify Microsoft’s refusal to disclose, as this refusal significantly hampered follow-on innovation and competition on quality in the market for work group servers. Microsoft had limited the technical development to the prejudice of consumers. In 2007, the GC upheld the Commission’s decision. [88]

Much of the controversy that has followed this judgment has concerned its precedential value. [89] The ECJ’s broad interpretation of the criteria for finding an abuse of dominance under the so-called “essential facilities” doctrine has the potential to significantly overstretch the doctrine’s reach in future cases. AG Jacobs, by contrast, had famously called for a narrow construction of the “essential facilities” doctrine in an earlier case: [90]

“In the long term it is generally pro-competitive and in the interest of consumers to allow a company to retain for its own use facilities which it has developed for the purpose of its business. For example, if access to a production, purchasing or distribution facility were allowed too easily there would be no incentive for a competitor to develop competing facilities. Thus while competition was increased in the short term it would be reduced in the long term. Moreover, the incentive for a dominant undertaking to invest in efficient facilities would be reduced if its competitors were, upon request, able to share the benefits. Thus the mere fact that by retaining a facility for its own use a dominant undertaking retains an advantage over a competitor cannot justify requiring access to it.” (para. 57)

In fact, the Commission, in its Microsoft decision, had tried to take this concern into account. Addressing Microsoft’s argument that an obligation to disclose its allegedly IP-protected interface information would reduce its future incentives to innovate, the Commission had proposed an “incentives balance test”: a refusal to license should be justified if the resulting innovation incentives for the dominant firm would outweigh the loss of innovation by rival firms on the adjacent market. In the Microsoft case, however, the Commission had found the overall innovation activities in the industry to be larger with than without mandatory disclosure of the interface information. [91]

Economically, this balance test restates the difficult trade-off between the different innovation incentive effects of open versus closed interfaces.  [92] Nonetheless, the GC did not endorse this balance test. It is the task of the law to translate economic insights into the relevant trade-offs into legally manageable criteria that allow for a certain degree of predictability and legal certainty. In the absence of economic methods that allow for a reliable quantification of these innovation incentive effects, an incentives balance test cannot be expected to yield objective, predictable results.

Unfortunately, the Microsoft judgment does not offer an alternative test that would sensibly limit the application of the doctrine to other interoperability cases either. The lack of conceptual clarity regarding the “essential facilities” doctrine as it now stands complicates its transposition to the relatively new and not yet fully understood phenomenon of digital platforms. At first sight, strong concentration tendencies in platform markets might seem to justify a pro-active imposition of interoperability obligations – a measure of comparatively low intrusiveness, yet potentially effective in preventing a long-standing monopoly. Interestingly, the Commission’s 2005 Discussion Paper on Exclusionary Abuses suggested such a line of reasoning.  [93] While the Commission highlighted that there “is no general obligation even for dominant companies to ensure interoperability”, it proposed to assume an abuse wherever a dominant company withheld the relevant interface information in order to leverage market power from one market to another. Even if the relevant information were protected by a trade secret, it might “not be appropriate to apply to such refusals to supply information the same high standards for intervention” as have been established for refusals to provide access more generally.

This passage has not made its way into the Commission’s final Guidance Paper on exclusionary abuses, which was published in 2009.  [94] In substance, the proposed approach had downplayed the context-sensitivity of interoperability. Current discussions on the application of the “essential facilities” doctrine to digital platforms rather question its suitability in a context which differs significantly from the traditional setting of physical infrastructures.  [95]

An example of such caution in imposing interoperability remedies is the French Conseil de la Concurrence’s refusal to order Apple to license its Digital Rights Management (DRM) technology FairPlay to VirginMega, one of its competitors in the market for music download services.  [96] Apple tried to tie iPod users to its own music download service iTunes by using its proprietary FairPlay technology, refusing to support rival standards on its iPod, and refusing to license FairPlay to competitors in the music download services market. VirginMega’s request for a FairPlay license to expand its user base was denied. Yet, the Conseil de la Concurrence found that, irrespective of a possible position of dominance of Apple on the markets for portable music players and downloaded music, access to Apple’s DRM technology was not indispensable for operating a music download service. Among the core arguments was the possibility for users to create compatibility themselves, namely by converting the format of VirginMega’s downloaded music into Apple’s (“ripping”). The cost of doing so was negligible, and it was a commonly used method. Also, several alternative portable music players were available on the market, all of which were compatible with VirginMega’s DRM technology. Finally, the Conseil de la Concurrence was convinced by Apple’s argument that licensing FairPlay to VirginMega would have weakened its security system, contrary to its contractual commitments to the recording industry.

This case once again illustrates the potential complexities of imposing interoperability duties on digital platforms, not only in horizontal, but also in vertical settings. Interoperability that extends beyond a purely technical level may raise issues of contractual and non-contractual liability and of security, and may consequently go along with heightened monitoring requirements. There may be valid business reasons not to allow for interoperability with competing platforms, but to operate a closed community. Here, as in the case of horizontal interoperability, data portability may be a preferable instrument for promoting competition (see above).

6. Hurdles for unilateral interoperability solutions

6.1. Adapters and converters as unilateral interoperability solutions

A very important (and in the discussion so far underestimated) group of solutions to interoperability problems are unilateral solutions. Firms that want to link up to a “closed” system, or that want to enable their users to link up, can create and offer adapters or converters that achieve (full or limited) interoperability with the “closed” platform or system without that platform’s active cooperation. In effect – depending on the degree of their perfection – adapters or converters may be able to eliminate the “natural monopoly” situation of a single uniform standard (see above, 3.) and allow for the coexistence of and competition between different standards, thus reviving the market mechanism for finding optimal or replacing outdated standards. Irrespective of standardization, adapters and converters can solve many of the interoperability problems associated with the horizontal and vertical openness of platforms and other closed systems.  [97] Adapters and converters may also facilitate portability, thereby reducing switching costs and lock-in problems of consumers and firms, or help to solve aftermarket problems. The decentralized and bottom-up invention of adapters and converters can promote innovative solutions for a wide array of interoperability problems.
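
To make the mechanism concrete, the following minimal sketch shows a hypothetical converter that translates records between an open format and an invented proprietary format of a “closed” system, without any cooperation by that system. All format names and field layouts are assumptions made purely for illustration; no real platform is described.

```python
# Hypothetical converter between an open JSON contact record and an invented
# binary format of a "closed" platform. Format names and field layout are
# assumptions made up for illustration; this is not any real platform's format.
import json


def open_to_closed(record_json: str) -> bytes:
    """Convert an open JSON record into the fictitious closed binary layout."""
    record = json.loads(record_json)
    name = record["name"].encode("utf-8")
    # Fictitious closed layout: 1-byte name length, name bytes, 4-byte big-endian id.
    return bytes([len(name)]) + name + record["id"].to_bytes(4, "big")


def closed_to_open(blob: bytes) -> str:
    """Convert the fictitious closed binary layout back into the open JSON record."""
    name_len = blob[0]
    name = blob[1:1 + name_len].decode("utf-8")
    record_id = int.from_bytes(blob[1 + name_len:], "big")
    return json.dumps({"name": name, "id": record_id})


# Round trip: the converter restores interoperability (and portability)
# between the two formats without any change to the closed system itself.
original = '{"name": "Ada", "id": 7}'
assert json.loads(closed_to_open(open_to_closed(original))) == json.loads(original)
```

The practical difficulty, of course, lies not in writing such a converter but in learning enough about the closed interface to write it at all – which is precisely where the obstruction strategies discussed next come in.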

As adapters and converters may seriously challenge a firm’s business choice in favor of a “closed” system, such firms may have strong incentives to obstruct the well-functioning of such interoperability solutions, however, thereby re-establishing the users’ lock-in.  [98] Possible instruments of obstruction range from designing interfaces in ways that hamper unilateral interoperability solutions, to frequently changing interfaces, to pro-active blocking.  [99] Facebook, for example, has been said to actively block Google Chrome’s extension for exporting friends, thereby reinforcing the lock-in of Facebook users. In order to ward off decompilation efforts by competitors on the core market or neighboring markets, dominant companies may integrate so-called “obfuscators” into their software to complicate the attempt to access interface information.  [100] In a more recent proceeding against Google, the Commission is concerned that Google has contractually restricted software developers in the offering of tools that allow for a seamless transfer of search advertising campaigns across different search engines.  [101]
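
How fragile unilateral solutions can be in the face of such strategies may be illustrated with a deliberately artificial sketch: the closed platform renames a single field in its export format, and a third-party adapter written against the old interface stops working until it has been reverse-engineered anew. All functions and field names below are invented for illustration only.

```python
# Artificial example of interface-change obstruction: version 2 of the
# "closed" platform renames one field in its export, and a third-party
# adapter written against version 1 breaks. Everything here is hypothetical.

def platform_export_v1(user: dict) -> dict:
    return {"id": user["id"], "name": user["name"]}


def platform_export_v2(user: dict) -> dict:
    # Same data, but the key "id" has been renamed to "uid" without notice.
    return {"uid": user["id"], "name": user["name"]}


def third_party_adapter(exported: dict) -> tuple:
    """Adapter reverse-engineered against the v1 interface."""
    return (exported["id"], exported["name"])


user = {"id": 7, "name": "Ada"}
assert third_party_adapter(platform_export_v1(user)) == (7, "Ada")

try:
    third_party_adapter(platform_export_v2(user))
except KeyError:
    print("Adapter broken by the interface change; renewed reverse engineering needed.")
```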

Competition law may have to take a stance on such actions when it is a dominant firm that engages in such behavior. Such strategies can reduce competition and innovation in complementary products and services, and thereby reduce social welfare and harm consumers. Any competition law analysis will have to consider the potential costs of adapters and converters, however. While the protection of differentiation may not be a central concern in a dominance setting, even dominant firms may legitimately strive for a higher degree of quality and/or security by opting for a “closed” system. The invention of adapters and converters can also weaken the closed system operator’s innovation incentives and create free-rider problems. Competition law – as well as IP and trade secret law – may want to take account of the finding that in certain settings, certain strategies for defending the “closedness” of a platform or a system may be economically justified.

6.2. “Interoperability obstruction” as an abuse of dominance?

From a competition law perspective, adapters and converters, wherever they emerge in the presence of a dominant platform or system, seem to hold the promise of significantly reviving competition, and may be relevant in two important respects. Firstly, a non-interoperability policy of a dominant platform or system should not be considered abusive if sufficiently effective means are available to competitors or users to achieve interoperability themselves. In the Apple FairPlay case, the Conseil de la Concurrence considered in this spirit the possibility for users to convert the music files into a different format. Any means that enables competitors and/or users to solve the interoperability problem themselves should be considered before imposing access remedies.

Secondly, an obstruction of such market-driven interoperability solutions by the dominant player may constitute an abuse under Art. 102 TFEU. In the US, the 9th Circuit refused to qualify as unlawful monopolization MySpace’s decision to redesign its social media platform such that individual users were no longer able to link to content of competing social media platforms.  [102] While MySpace was the dominant social media platform in the US at the time, the 9th Circuit did not find either exclusionary conduct or causal antitrust injury. A refusal to deal claim failed because, according to the Court’s reasoning, there was no prior course of dealing between MySpace and its competitors; if at all, there had been a prior course of dealing between MySpace and its users. Moreover, the plaintiff had not shown that any prior course of dealing had been profitable to MySpace, such that its termination was contrary to MySpace’s interest. The fact that MySpace’s conduct prevented consumers from accessing competitors’ websites through MySpace did not suffice for finding an antitrust injury. It is unclear whether this case would have been decided similarly in the EU.

At the same time, it is notoriously difficult to deal with practices by which dominant firms frequently change the configuration of relevant interfaces and thereby frustrate attempts by competitors to access interface information by way of reverse engineering. While such changes may boil down to raising rivals’ costs strategies, they may also qualify as legitimate product innovation or security measures.  [103] In ambivalent cases, the outcome will frequently depend on the structure of the legal rule that is applied to such conduct: namely (1) on the division of the burden of proof; and (2) on whether the relevant conduct of a dominant firm should be subject to a proportionality principle. In the US, courts have proposed different tests under Sec. 2 Sherman Act. According to one line of cases, the implementation of a product change by a dominant firm will not be considered anti-competitive whenever the dominant firm can show some degree of innovation or product improvement.  [104] No balancing of the benefits of product improvement versus anti-competitive effects shall apply, as courts would be unable to administer such a balancing exercise.  [105] In United States v. Microsoft Corp., by contrast, the Court of Appeals for the District of Columbia Circuit proposed a somewhat different test: where likely anti-competitive effects are established, the analysis does not end once an “innovation” or “product improvement” defense is invoked. Rather, in reaction to an alleged pro-competitive justification, any plausible pro- and anti-competitive effects need to be analyzed within the framework of a “balancing enquiry”.  [106] US academics are divided along similar lines: some have argued for a strong presumption in favor of the legality of any type of product innovation,  [107] while others have supported the Microsoft balancing test.  [108]

Within the EU, no clear test for “interoperability obstruction” has evolved so far.  [109] In the European Microsoft case – which could have been considered a case of interoperability obstruction – the GC applied the ill-suited “essential facilities” doctrine instead (see above). A broader view of the European case law would suggest that the proportionality principle will play a significantly larger role in the EU than in the US. The challenge of structuring the balancing of anti- versus pro-competitive effects such that the result is a manageable and predictable test has not yet been met. Shifting the focus from access to interoperability information towards addressing potentially anti-competitive strategies of interoperability obstruction appears to be the next and much-needed step in developing a sound pro-interoperability strategy for the digital age.

6.3. Protection of decompilation in IP and trade secret law

Where the lack or inadequacy of (horizontal or vertical) interoperability between products and/or services is due to the non-availability of software interface information, the evolution of market-driven remedies may be promoted by efforts of market actors to decompile the relevant software. Decompilation denotes the process by which a machine-executable program is analyzed and translated back into source code.  [110] Where the software proprietor refuses to grant access to relevant interface information, third-party decompilation may allow for its extraction, and may thereby allow producers of complementary software or products to ensure or improve interoperability.  [111]
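
As a very rough illustration of the idea (an assumption-laden sketch, not a description of the techniques at issue in the cases discussed here), Python’s standard library module "dis" disassembles the compiled bytecode of a function; from such output, a third party could reconstruct functionally equivalent source code and, in particular, the calling convention – the kind of interface information at stake. The function "checksum" below is hypothetical.

```python
# Rough illustration only: Python's standard "dis" module disassembles the
# compiled bytecode of a function, which is analogous to the first step of
# decompiling a machine-executable program in order to recover interface
# information (here, the function's name, parameters and behaviour).
import dis


def checksum(values):
    """Hypothetical function whose compiled form we inspect."""
    total = 0
    for v in values:
        total = (total + v) % 256
    return total


# Prints the bytecode instructions; from output like this, a third party could
# reconstruct functionally equivalent source code and, in particular, the
# calling convention needed to interoperate with the function.
dis.dis(checksum)
```

Decompiling optimized machine code is, of course, far harder than reading interpreted bytecode, which is one reason why obfuscation and frequent interface changes can make it economically unattractive in practice.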

Decompilation involves the copying of the relevant software. Therefore, where the relevant software is protected by copyright, the prior approval of the right holder may be needed. While ideas and principles are not protected by copyright law, the process used to identify these ideas and principles may infringe copyrights. Art. 6 of the Software Copyright Directive  [112] provides for an exception, however, where decompilation is used with the aim of achieving interoperability. The precise structure of this exception is the result of a hard-fought battle between lobbying groups,  [113] which was ultimately won by the advocates of a rather restrictive exception. In order to rely on the exception, the person undertaking the decompilation must have a license or a right to use the program; decompilation must be “indispensable to obtain the information necessary to achieve the interoperability of an independently created computer program with other programs”; this indispensable information must not have been previously available to the person performing the decompilation; and decompilation must be confined to the parts of the original program which are necessary to achieve interoperability. Where these conditions are met, the Software Copyright Directive does not distinguish between horizontal and vertical interoperability: decompilation is then permissible in both cases. However, the ECJ, in its SAS judgment, has found that the information obtained by way of decompilation must not be used “for the development, production or marketing of a computer program substantially similar in its expression, or for any other act which infringes copyright.”  [114]

While some of the preconditions for the decompilation exception in Art. 6 of the Software Copyright Directive, like the indispensability criterion and the proportionality criterion, seem to be informed by the “essential facilities” doctrine, both the preconditions for the permission and the content of the permission differ substantially: Art. 6 does not presuppose market dominance. At the same time, Art. 6 does not burden the right holder with a duty to actively provide access or information, but is limited to a duty to tolerate decompilation.

The exception provided for in Art. 6 of the Software Copyright Directive has been extended to the unitary patent. According to Art. 27(k) of the Agreement on a Unified Patent Court, the rights conferred by European patents with unitary effect will not extend to the use of information obtained through the acts allowed under Articles 5 and 6 of the Software Copyright Directive, in particular by its provisions on decompilation and interoperability. Likewise, the acquisition of a trade secret is considered lawful when the trade secret is obtained by “observation, study, disassembly or testing of a product or object ... that is lawfully in the possession of the acquirer of the information”.  [115] However, the use of a trade secret shall be unlawful where it is carried out in breach of a confidentiality agreement or of a contractual duty to limit the use of the trade secret (Art. 4(3) lit. b and c of the Trade Secret Directive). The trade secret exception can therefore easily be overridden by the right holders’ licensing terms.  [116]

While these IP law exceptions seem to open a different, market-driven path towards interoperability, it is not completely clear how useful they are in practice in helping to overcome the hurdles erected by non-interoperability business strategies. Firstly, the exceptions obviously will not help where the relevant software is not available to other market actors, but runs only on servers of the software proprietor (so-called Application Service Providing – ASP). Secondly, software proprietors frequently engage in code obfuscation in order to hinder decompilation. Code obfuscation implies a deliberate modification of the relevant code meant to hamper its understanding.  [117] In principle, it will not completely preclude decompilation, but it can complicate decompilation significantly and make it economically unattractive for all practical purposes.  [118] Apart from obstructing competitors in their decompilation efforts, such a practice may also function as a prima facie legitimate security measure.  [119] It cannot easily be prohibited, therefore.  [120] Finally, the IP exceptions have been criticized for being too narrowly construed.  [121] Along these lines, the Commission Staff Working Paper on interoperability has discussed whether interoperability information should be protected by copyright at all.  [122] A number of scholars have argued in favor of a general permission of reverse engineering.  [123] The right holder’s legitimate interest in retaining a competitive lead would nonetheless be protected by the fact that re-engineering complex interface information is time-consuming and costly for competitors.  [124] The “interoperability exception” may continue to be limited to those cases where the competing software does not contain identical or very similar expression. Furthermore, more discussion will be needed on the limits of the interoperability permission in cases where the interoperability information reveals to a large extent the technology and functionality implemented by a device or a system beyond its interfaces, or allows access to such functionality.
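
To illustrate what code obfuscation means in practice, the following toy example (entirely hypothetical, and far simpler than the industrial-strength obfuscators referred to above) shows two functions with identical behaviour; the second deliberately hides the intent of the first behind meaningless names and indirect arithmetic, so that a decompiled or disassembled version of it would reveal little about the interface logic it implements.

```python
# Toy illustration of code obfuscation (hypothetical example): both functions
# behave identically, but the second hides the intent behind meaningless
# names and indirect arithmetic.

def is_compatible(version: int) -> bool:
    """Readable version: the (invented) protocol accepts versions 2 through 4."""
    return 2 <= version <= 4


def _x9(a: int) -> bool:
    # Obfuscated equivalent of is_compatible.
    return (a - 3) * (a - 3) <= 1


# Identical observable behaviour, very different readability of any
# decompiled or disassembled form.
assert all(is_compatible(v) == _x9(v) for v in range(10))
```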

Overall, a search for market-driven solutions to interoperability hurdles should certainly include a renewed discussion on the optimal construction of an IP exception for interoperability information. However, while a significantly broadened exception could be an element of a pro-interoperability policy, it will not provide a general solution. It would need to be flanked by a competition policy that would actively address the anti-competitive obstruction of market efforts to overcome interoperability hurdles.

7. Conclusions: Towards a prudent pro-interoperability policy in the digital economy

Interoperability features prominently in the rhetoric of the Commission’s Digital Agenda. [125] Yet, the Commission was right to drop the idea of imposing a general duty to license interoperability information in the ICT sector within the framework of an Interoperability Directive [126] as temporarily envisaged in 2013. [127]

Firstly, interoperability is not – or should not be – an end in itself; it is a means to a broader set of goals: to address market fragmentation; to avoid market tipping towards monopoly; to open downstream markets for competition where the upstream market is monopolized; to increase follow-on innovation irrespective of market power; or to address a perceived societal need for general interconnectedness and communication across competing networks. In each case, before taking action a clear and strong market failure or public service rationale should be identified.

Secondly, even if some sort of market failure has been identified, there is no general single best way towards achieving interoperability. The importance of interoperability, its optimal degree, and the optimal path will differ depending on the technological context and the market environment. Due to the complex trade-offs, interoperability issues and potential policy solutions must be analyzed with a view to the relevant sector and technology.

Both when applying competition law rules and when considering further-reaching public policy interventions, the existence of different paths to interoperability and the trade-offs inherent in each one of them should be kept in mind. In certain settings, mandated interoperability may still be justified. However, the “essential facilities” doctrine in its current form lacks clear boundaries when applied to interoperability problems. Before considering mandated interoperability, it must be established with some certainty that market solutions, ranging from competition for a standard to unilateral or collective standard-setting to adapter or converter solutions, will fail. The positive imposition of interoperability requirements must remain a measure of last resort. Although we cannot be sure that the market is always capable of finding the best or even satisfactory solutions for interoperability problems, competition in the market provides innovating firms with incentives to develop products and services with a degree of interoperability that matches the preferences of consumers. Business strategies that restrict interoperability may be justified by legitimate business concerns. In the midst of a disruptive technological and economic revolution like the digitization of the economy, uncertainty about the appropriate standards and other interoperability solutions calls for caution in imposing top-down public policy solutions. There is a real danger of regulatory failure, and the implementation of wrong solutions may distort and impede technological and economic progress.

There is, therefore, good cause to look carefully for prudent pro-interoperability policies. In view of the potential cost of mandated interoperability with regard to the path of innovation, a strict proportionality principle should apply. Before mandating access, policy makers, regulatory and competition authorities should strive to support decentralized bottom-up interoperability solutions wherever possible. The EU Commission has started to look for such strategies: user rights to portability of content and/or data may significantly reduce switching costs in a non-interoperable environment. Also, more attention should be given to defining the preconditions under which the pro-active unilateral obstruction of a decentralized search for adapters or converters by a dominant firm may constitute an abuse.

* Wolfgang Kerber is a professor of Economics at the Marburg Centre for Institutional Economics, School of Business & Economics, Philipps-University Marburg.

Heike Schweitzer is a professor of Law and Managing Director of the Institute for German and European Economic Law, Competition Law and Regulatory Law, Freie Universität Berlin.



[1] The other obstacles are: fragmented digital markets; rising cybercrime and risk of low trust in networks; lack of investment in networks; insufficient research and innovation efforts; lack of digital literacy and skills; and missed opportunities in addressing societal challenges – see EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 5-6.

[2] EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 3.

[3] See, for example, EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 14-15; EU Commission, A Digital Single Market Strategy for Europe, Brussels, 6.5.2015, COM(2015)192 fin.

[4] See EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 15: “The Commission will examine the feasibility of measures that could lead significant market players to license interoperability information while at the same time promoting innovation and competition”.

[5] For a broad account of the role of interoperability in the digital environment see Palfrey/Gasser, Interop, 2012.

[6] Directive 2009/24/EC on the legal protection of computer programs.

[7] Trade Secret Directive 2016/943 of 8 June 2016, OJ 2016 L 157/1.

[8] Palfrey/Gasser, Interop, 2012, p. 5, and the “Standard Glossary of Software Engineering Terminology” (IEEE 610) of the Institute of Electrical and Electronics Engineers: Interoperability is “[t]he ability of two or more systems or components to exchange information and to use the information that has been exchanged ...”.

[9] Directive 2009/24/EC on the legal protection of computer programs. See recital 10: "The function of a computer program is to communicate and work together with other components of a computer system and with users and, for this purpose, a logical and, where appropriate, physical interconnection and interaction is required to permit all elements of software and hardware to work with other software and hardware and with users in all the ways in which they are intended to function."

[10] According to Art. 2 No. 9 "interoperability means the ability of digital content to perform all its functionalities in interaction with a concrete digital environment".

[11] See the “Standard Glossary of Software Engineering Terminology” (IEEE 610) of the Institute of Electrical and Electronics Engineers.

[12] In this context, the relation between the concepts of compatibility and interoperability is often not clear, which also explains their inconsistent use in the literature.

[13] See the “Standard Glossary of Software Engineering Terminology” (IEEE 610) of the Institute of Electrical and Electronics Engineers, and Directive 2009/24/EC, recital 10: “The parts of the program which provide for such [see Footnote 2] interconnection and interaction between elements of software and hardware are generally known as ‘interfaces’.”

[14] A (technical) standard is a technical norm that is (or shall be) broadly used in the marketplace in order to ensure compatibility or interoperability – see OECD, Data-Driven Innovation. Big Data for Growth and Well-Being, 2015, pp. 110.

[15] With their suggestion of four different layers of interoperability (technological, data, human, institutional), Palfrey/Gasser (Interop, 2012, p. 6, 39-53) emphasize that interoperability should not only be seen as a primarily technical problem but should also encompass the level of humans and institutions.

[16] “Interconnection” means the physical and logical linking of public communication networks used by the same or a different undertaking in order to allow the users of one undertaking to communicate with users of the same or another undertaking, or to access services provided by another undertaking – see Art. 2 lit. (b) of the Directive 2002/19/EC of the European Parliament and of the Council of 7 March 2002 on access to, and interconnection of, electronic communication networks and associated facilities, OJ 2002 No. L 108/7 (“Access Directive”).

[17] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 36-37. Since in the digital economy complex interconnected value networks have emerged, the distinction between horizontal and vertical interoperability might not always be so clear anymore.

[18] From a business strategy perspective, see also Shapiro/Varian, Information rules, 1999, pp.193.

[19] See for overviews on benefits and costs of interoperability Choi/Whinston, Benefits and requirements for interoperability in the electronic marketplace, Technology in Society 22, p. 33; Gasser, Interoperability in the Digital Ecosystem, 2015, pp. 9-17; available at SSRN: http://ssrn.com/abstract=2639210 ; Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 36-38; more specifically with regard to standards LaRouche/Overwalle, Interoperability standards, patents and competition policy, TILEC Discussion paper, 2014, pp.15-18.

[20] See Gasser, Interoperability in the Digital Ecosystem, 2015, pp. 11-12 (available at SSRN: http://ssrn.com/abstract=2639210 ).

[21] See also Palfrey/Gasser, Interop, 2012, pp.106.

[22] See Gasser, Interoperability in the Digital Ecosystem, 2015, pp. 13-15 (available at SSRN: http://ssrn.com/abstract=2639210 ).

[23] For the advantages of modularized systems for innovation, see Baldwin/Clark, Design Rules. Vol. 1: The Power of Modularity, 2000, and MacKie-Mason/Netz, Manipulating interface standards as anticompetitive strategy, in: Greenstein/Stango, Standards and Public Policy, 2007, 281, who distinguish between systems and component competition.

[24] However, effective competition cannot guarantee that the market always finds the optimal interoperability solutions. Especially in oligopolistic settings there might be problems due to collusive behavior.

[25] CFI, Judgment of 17.9.2007, Case T-201/04 – Microsoft Corp.

[26] Haucap/Heimeshoff, Google, Facebook, Amazon, eBay: Is the Internet driving competition or market monopolization?, Int Econ Econ Policy 2014, 49, 50 et seqq. Evans suggests that tipping towards monopolies is usually prevented by the complexity of multi-platform markets: The Antitrust Economics of Multi-Sided Platform Markets, Yale Journal on Regulation, Vol. 20 (2003), 325, 350.

[27] For an overview about the economics of standards, see Tassey, Standardization in Technology-Based Markets, Research Policy 29, 2000, 587, and Blind, The economics of standards, 2004.

[28] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 38-47.

[29] Farrell/Simcoe use a broader notion of a "dominant player" who can impose standards. Besides a dominant firm it can also be a large customer or even the government. See Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 40-42.

[30] EU Commission, A Digital Single Market Strategy for Europe, Brussels, 6.5.2015, COM(2015)192 fin., p. 15.

[31] Biddle et al., The Expanding Role and Importance of Standards in the Information and Communications Technology Industry, 52 Jurimetrics 177 et seq. (2012).

[32] See Baron/Pohlmann, 9(4) Journal of Competition L&E 2013, 905 et seq.; Liu, International Standards in Flux: A Balkanized ICT Standard-setting Paradigm and its Implications for the WTO, Journal of International Economic Law 2014, 561, 568 et. seqq. For the relevance of standard setting by industry consortia see Van Eecke et al., EU Study on the Specific Policy Needs for ICT Standardization, July 2007, at 7, available at http://ec.europa.eu/enterprise/sectors/ict/files/full_report_en.pdf .

[33] For a closer analysis: Audrey Selian, 3G Mobile Licensing Policy: From GSM to IMT-2000 – A Comparative Analysis, available at https://www.itu.int/osg/spu/ni/3G/casestudies/GSM-FINAL.pdf .

[34] In the interoperability discussion, three different kinds of lock-in problems have to be distinguished. (1) Consumers can get “locked in”, because they buy a product or use a platform which requires them to buy complementary products and services (as in aftermarkets), or because the products they buy on platforms cannot be transferred to other platforms (e.g., music files or e-books). (2) Firms can also get “locked in” to a standard or a system, if they have to make a standard- or system-specific investment for using the standard/system for their products and services. The patent hold-up problem in regard to standard-essential patents (Rambus case) as well as transaction-specific investments of app developers for Apple or Android (or of component suppliers for car manufacturers) are well-known examples. (3) Here, however, we mean that an entire market might also be locked in to a standard or technology due to dynamic effects and path dependencies, which make it hard to replace the standard with a newer, more efficient one. For a sophisticated analysis of lock-in situations and strategies, see, from a business perspective, Shapiro/Varian, Information Rules, 1999, 103-171.

[35] See for these dynamic effects through network effects, first-mover advantages, path dependencies, and lock-in effects Katz/Shapiro, Network Externalities, Competition and Compatibility, American Economic Review 75, 1985, 424; David, Clio and the Economics of QWERTY, American Economic Review 78, 1988, 332; Arthur, Competing Technologies, Increasing Returns, and Lock-In by Historically Small Events, in: Arthur, Increasing Returns and Path Dependence in the Economy, 1994, 13; Shapiro/Varian, Information Rules, 1999, 173-225.

[36] In such a case of market failure different policy solutions can be considered for overcoming these lock-in effects, as, e.g., subsidies, public procurement or regulation.

[37] For the analysis of standard wars, see Besen/Farrell, Choosing How to Compete - Strategies and Tactics in Standardization, Journal of Economic Perspectives 8, 1994, 117; Shapiro/Varian, Information Rules, 1999, 261-296; Stango, The Economics of Standard Wars, Review of Network Economics 3(1), 2004, 1-19.

[38] For the problem of internalizing complementary externalities and thereby aligning private and social benefits of a standard, see Farrell/Weiser, Modularity, Vertical Integration, and Open Access Policies: Towards a Convergence of Antitrust and Regulation in the Internet Age, Harvard Journal of Law and Technology 17, 2003, 85.

[39] OECD, Data-Driven Innovation. Big Data for Growth and Well-Being, 2015, 192-194.

[40] See Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 44-45.

[41] See for the interrelationship between standardisation and innovation also LaRouche/Overwalle, Interoperability standards, patents and competition policy, TILEC Discussion paper, 2014, pp.17.

[42] See Hayek, Competition as a discovery procedure, in: Hayek, New Studies in Philosophy, Politics, Economics, and the History of Ideas, 1978, 179; for the advantages of parallel experimentation and diversity see Kerber, Competition, innovation, and maintaining diversity through competition law, in: Drexl/Kerber/Podszun, Competition Policy and the Economic Approach, 2011, 179.

[43] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 35.

[44] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 41.

[45] For the problems of collective standard-setting see Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 40-44. For empirical studies on SSOs see Chiao/Lerner/Tirole, The Rules of Standard Setting Organizations: An Empirical Analysis, The RAND Journal of Economics, 38(4), 2005, 905, and Rysman/Simcoe, Patents and the Performance of Voluntary Standard Setting Organizations, Management Science 54(11), 2009, 1920.

[46] OJ 2011 No. C 11/1.

[47] FRAND = fair, reasonable and non-discriminatory.

[48] EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 15, announcing a follow-up to the White Paper “Modernising ICT Standard Setting in the EU”, COM(2009) 324.

[49] EU Commission, A Digital Agenda for Europe, Brussels, 19.5.2010, COM(2010)245 fin., p. 15.

[50] EU Commission, A Digital Single Market Strategy for Europe, Brussels, 6.5.2015, COM(2015)192 fin., p. 15.

[51] The Commission Staff Working Document “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 15-16 has proposed a number of non-legislative measures, inter alia model licenses for interoperability information and guidelines for determining the value of interoperability information. The idea is to enhance transparency in the licensing market and minimize practical hurdles to licensing, in particular for SMEs.

[52] EU Commission, Decision of 9.12.2009, Case COMP/38.636 – Rambus (decision based on Art. 9 Reg. 1/03).

[53] ECJ, Judgment of 16.7.2015, Case C-170/13 – Huawei Technologies; EU Commission, Decision of 29.4.2014, Case AT.39939 – Samsung; Decision of 29.4.2014, Case AT.39985 – Motorola.

[54] EU Commission, A Digital Single Market Strategy for Europe, Brussels, 6.5.2015, COM(2015) 192 fin., p. 15.

[55] EU Commission, A Digital Single Market Strategy for Europe, Brussels, 6.5.2015, COM(2015) 192 fin., p. 15.

[56] EU Commission, ICT Standardisation Priorities for the Digital Single Market, Brussels, 19.4.2016, COM(2016) 176 fin., p. 2-3.

[57] EU Commission, ICT Standardisation Priorities for the Digital Single Market, Brussels, 19.4.2016, COM(2016) 176 fin.

[58] EU Commission, Communication “5G for Europe: An Action Plan”, Brussels, 14.9.2016, COM(2016) 588 fin.: A lack of coordination between national approaches would “create a significant risk of fragmentation and implementation of standards and would delay the creation of a critical mass for 5G-based innovation in the Digital Single Market” (p. 3). The EU Commission finds that “standards are of paramount importance to ensure the competitiveness and interoperability of global communication networks” (p. 7) and plans to “foster the emergence of global industry standards under EU leadership for key 5G technologies (radio access network, core network) and network architectures” (p. 7).

[59] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 48-50.

[60] "Access" means the making available of facilities and/or services, to another undertaking, under defined conditions, on either an exclusive or non-exclusive basis, for the purpose of providing electronic communications services, cf. Art. 2 lit. (a) of the Access Directive 2002/19/EC.

[61] "Interconnection" means the physical and logical linking of public communications networks used by the same or a different undertaking in order to allow the users of one undertaking to communicate with users of the same or another undertaking, or to access services provided by another undertaking, cf. Art. 2 lit. (b) of the Access Directive 2002/19/EC.

[62] "Electronic communications network" means transmission systems and, where applicable, switching or routing equipment and other resources which permit the conveyance of signals by wire, by radio, by optical or by other electromagnetic means, cf. Art. 2 lit. (a) of Directive 2002/21/EC.

[63] Proposal for a Directive establishing the European Electronic Communications Code, Brussels, 12.10.2016, COM(2016) 590 fin.

[64] In this context, see the proposed Directive establishing the European Electronic Communications Code, Brussels, 12.10.2016, COM(2016) 590 fin. which shall replace the existing legal framework for electronic communications.

[65] Neitzel/Hofmann, in: Spindler/Schuster, Recht der elektronischen Medien, § 18 TKG, para 1; Scherer, in: Arndt/Fetzer/Scherer/Graulich, TKG-Kommentar, § 18, para 2.

[66] Equivalent to the competition law concept of market dominance – see Commission guidelines on market analysis and the assessment of significant market power under the Community regulatory framework for electronic communications networks and services, OJ 2002 No. C 165/03, p. 14 et seq., Ricke, in: Spindler/Schuster, Recht der elektronischen Medien, § 3 TKG, para 6; Kohrenke/Ufer, in: Geppert/Schütz, Beck’scher TKG-Kommentar, § 3, para 10.

[67] For a case at the limit of antitrust law which may be considered to be a “horizontal interoperability” case see US Supreme Court, Aspen Skiing Co. v. Aspen Highlands Skiing Corp., 472 U.S. 585 (1985).

[68] The Commission applies a 3-criteria-test to identify potentially relevant markets. Ex ante-regulation shall be considered only for markets with: (1) high and non-transitory barriers to entry; (2) a market structure that does not tend towards effective competition within the relevant time horizon; (3) a market failure that cannot be adequately addressed by competition law alone. See Commission Recommendation 2014/710/EU of 9 October 2014 on relevant product and service markets within the electronic communications sector susceptible to ex ante regulation in accordance with 2002/21/EC, OJ 2014 No. L 295/79. The Commission Recommendation currently indicates four potentially relevant markets: (1) wholesale call termination on individual public telephone networks provided at a fixed location; (2) wholesale voice call termination on individual mobile networks; (3a) wholesale local access provided at a fixed location; (3b) wholesale central access provided at a fixed location for mass-market products; (4) wholesale high-quality access provided at a fixed location.

[69] Discussing this question: Inge Graef, Mandating portability and interoperability in online social networks, Telecommunications Policy 2015 (39/6), 502 et seq.

[70] See, for example, Ian Brown/Christopher Marsden, Regulating Code: Good Governance and Better Regulation in the Information Age, 2013, pp. 190-191 who have argued in favour of imposing interconnection requirements on social network providers.

[71] In favour of an interoperability requirement for these reasons: Graef/Valcke, Exploring new ways to ensure interoperability under the Digital Agenda, Info – the journal of policy, regulation and strategy for telecommunications, information and media 2014 (16/1), p. 7: “In early phases of market development, a duty to disclose interoperability information should only be mandated in very limited circumstances, since in this period competition between systems could be particularly beneficial for innovation. In later stages of market development, the need for mandated interoperability increases as the prevailing system continues to dominate the market.”

[72] Proposal for a Directive establishing the European Electronic Communications Code, Brussels, 12.10.2016, COM(2016)590 fin.

[73] Art. 59(2) lit. c of the Draft European Electronic Communications Code is further qualified in Art. 59(3). According to this provision, obligations under Art. 59(2) lit. c may only be imposed “(i) to the extent necessary to ensure interoperability of interpersonal communications services and may include obligations to the use and implementation of standards or specifications ...; (ii) where the Commission on the basis of a report that it had requested from BEREC, has found an appreciable threat to effective access to emergency services or to end-to-end connectivity between end-users within one or several Member States or throughout the European Union and has adopted implementing measures specifying the nature and scope of any obligations that may be imposed ...”.

[74] Arguably for this reason, a Commission Staff Working Paper that discussed the expedience of an “Interoperability Directive”, namely the imposition of an interoperability requirement not only upon electronic communications networks, but also on digital platforms and services, considered exceptions to compulsory licensing that should apply “where the interoperability information (like the description of a hardware interface) reveal to a large extent the technology and functionality implemented by a device or a system beyond its interfaces” – see Commission Staff Working Document, Analysis of measures that could lead significant market players in the ICT sector to license interoperability information, SWD(2013) 209 final, p. 12.

[75] For a discussion also see Inge Graef, Mandating portability and interoperability in online social networks, Telecommunications Policy 2015 (39/6), 502, 510 et seqq.

[76] In this article we will not discuss the difficulties in determining whether a firm and especially a (multi-sided) platform is dominant (including the difficulties of defining markets). These difficulties may further limit the capability of competition law to properly solve interoperability problems.

[77] US Supreme Court, Aspen Skiing Co. v. Aspen Highlands Skiing Corp., 472 U.S. 585 (1985).

[78] Critical with regard to Aspen Skiing: John E. Lopatka/William H. Page, Bargaining and Monopolization: In Search of the ‘Boundary of Section 2 Liability’ between Aspen and Trinko, Antitrust Law Journal 82 (2005), pp. 115 et seq.; Alan J. Meese, Property, Aspen, and Refusals to Deal, Antitrust Law Journal 73 (2005), pp. 81 et seq. In favour of a broader reading of Aspen Skiing: Marina Lao, Aspen Skiing and Trinko: Antitrust Intent and ‘Sacrifice’, Antitrust Law Journal 73 (2005), pp. 171 et seq.

[79] Verizon Communications v. Law Offices of Curtis V. Trinko, LLP, 540 U.S. 398 (2004).

[80] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, OJ 2016 L 119/1. See also: Article 29 Data Protection Working Party, Guidelines on the right to data portability, adopted on 13 December 2016, 16/EN WP 242.

[81] Draft Directive on certain aspects concerning contracts for the supply of digital content, 9 December 2015, COM(2015)634 fin.

[82] See Ruth Janal, Data Portability, p. 59 in this issue.

[83] John M. Newman: Anticompetitive Product Design in the New Economy, Florida State University Law Rev. 39 (2012), pp. 1, at 14.

[84] For a discussion of arguments in favour of mandating interoperability see Fleischer, Behinderungsmißbrauch durch Produktinnovation, 1997, § 3.

[85] See Fleischer, Behinderungsmißbrauch durch Produktinnovation, 1997, § 4.

[86] EU Commission, Decision of 21 April 2004, COMP/C-3/37.792 – Microsoft; CFI, Judgment of 17.9.2007, Case T-201/04 – Microsoft Corp.

[87] For a critical economic assessment of this conduct see Leveque, Innovation, Leveraging and Essential Facilities: Interoperability Licensing in the EU Microsoft Case, World Competition 28(1), 2005, 71, 82-85; see also Kühn / Van Reenen (2009), Interoperability and Market Foreclosure in the European Microsoft Case, in: Lyons, Cases in European Competition Policy: The Economic Analysis, 50.

[88] CFI, Judgment of 17.9.2007, Case T-201/04 – Microsoft Corp.

[89] See, for example, Daniel F. Spulber, Competition Policy and the Incentive to Innovate: The Dynamic Effects of Microsoft v. Commission, Yale Journal of Regulation 25 (2008), pp. 247, 272 et seq.

[90] Opinion of Advocate General Jacobs, Bronner, Case C-7/97, [1998] E.C.R. I-7794, at paras. 56-58.

[91] See EU Commission, Decision of 21 April 2004, COMP/C-3/37.792, at para. 783 – Microsoft.

[92] See Leveque, Innovation, Leveraging and Essential Facilities: Interoperability Licensing in the EU Microsoft Case, World Competition 28(1), 2005, 71, 75-78, who offers convincing arguments why from an economic perspective the incentives balance test is conceptually clearer than the new product test, and therefore might be preferable. For an analysis of the incentive balance test particularly from an innovation economics perspective see Vezzoso, The Incentives Balance Test in the EU Microsoft Case: A Pro-Innovation “Economics-Based” Approach? In: European Competition Law Review 27, 2006, 382, and Schmidt/Kerber, Microsoft, Refusal to License Intellectual Property Rights, and the Incentive Balance Test of the EU Commission, 2008 (available at SSRN: http://ssrn.com/abstract=1297939 ).

[93] DG Competition, Discussion Paper on the application of Article 82 of the Treaty to exclusionary abuses, Brussels, December 2005, paras. 241, 242.

[94] EU Commission, Guidance on the Commission’s enforcement priorities in applying Article 82 of the EC Treaty to abusive exclusionary conduct by dominant undertakings, OJ 2009 No. C 45/7.

[95] See, for example, Bundeskartellamt, Digitale Ökonomie – Internetplattformen zwischen Wettbewerbsrecht, Privatsphäre und Verbraucherschutz, 1. October 2015, p. 29.

[96] Conseil de la Concurrence, Décision No. 04-D-54 du 9 novembre 2004 relative à des pratiques mises en oeuvre par la société Apple Computer, Inc. dans les secteurs du téléchargement de musique sur Internet et des baladeurs numériques. See also: Graef/Valcke, p. 6.

[97] For the economic analysis of the role of adapters and converters in standardization contexts, see David/Bunn, The economics of gateway technologies and network evolution: Lesson from electricity supply history, Information economics and policy 3, 1988, 165, Farrell/Saloner, Converters, compatibility, and the control of interfaces, Journal of industrial economics 40, 1992, 9, Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 46-47, and Kölln, Strategien der Diffusion von Netzwerkgütern, 2013 (Dissertation), 123 et seq.

[98] Farrell/Simcoe, Four Paths to Compatibility, in: Peitz/Waldfogel, The Oxford Handbook of the Digital Economy, 2012, 34, 47; Shapiro/Varian, Information Rules, 1999, 281 et seq.

[99] See, for example, John M. Newman: Anticompetitive Product Design in the New Economy, Florida State University Law Rev. 39 (2012), pp. 1, 3, 15.

[100] Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 13-14, 19, pointing to the example of Microsoft’s Windows Server Protocols (WSPP): almost a decade of reverse engineering (through protocol analysis) and development by the free/open source Samba project did not yield a fully compatible implementation of the protocols. The licensing of the WSPP by Microsoft was ultimately necessary for achieving full interoperability.

[101] EU Commission, Press Release of 14 July 2016, IP/16/2532.

[102] LiveUniverse, Inc. v. MySpace, Inc.,  No. 07-56604, 2008 WL 5341843 (9th Cir. Dec. 22, 2008).

[103] See Holger Fleischer, Behinderungsmißbrauch durch Produktinnovation, 1997, § 4; John M. Newman: Anticompetitive Product Design in the New Economy, Florida State University Law Rev. 39 (2012), pp. 1, 2 et seq.

[104] See Berkey Photo, Inc. v. Eastman Kodak Co., 603 F. 2d (1979 U.S. App.), 286, 287; Allied Orthopedic Appliances Inc. v. Tyco Health Care Group L.P., 592 F.3d 991; 2010 U.S. App., 1000.

[105] Allied Orthopedic Appliances Inc. v. Tyco Health Care Group L.P., 592 F.3d 991; 2010 U.S. App., 1000: “There is no room in this analysis for balancing the benefits or worth of a product improvement against its anticompetitive effects. If a monopolist’s design change is an improvement, it is necessarily tolerated by antitrust laws. To weigh the benefits of an improved product design against the resulting injuries to competitors is not just unwise, it is unadministrable. There are no criteria that courts can use to calculate the right amount of innovation which would maximize social gains and minimize competitive injury.”

[106] Microsoft III, 253 F.3d 34, 47 ff. (D.C. Cir. 2001).

[107] George Sidak, Debunking Predatory Innovation, Columbia Law Review, Vol. 83 (1983), p. 1148. In the context of product switching in the pharma sector: Douglas H. Ginsburg/Koren W. Wong-Ervin/ Joshua D. Wright, Product Hopping and the Limits of Antitrust, CPI Antitrust Chronicle, Dec. 2015(1).

[108] John M. Newman: Anticompetitive Product Design in the New Economy, Florida State University Law Rev. 39 (2012), pp. 1 et seq. See also: William H. Page/ Seldon J. Childers, Antitrust, Innovation and Product Design in Platform Markets: Microsoft and Intel, in Antitrust Law Journal 78 (2012), pp. 363 et seq.

[109] For an analysis of the relevant case law see Holger Fleischer, Behinderungsmißbrauch durch Produktinnovation, 1997, § 6 II.

[110] Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 13.

[111] Wiebe, Interoperabilität von Software, Jipitec 2011, p. 89, 90.

[112] Council Directive 91/250/EEC of 14 May 1991 on the legal protection of computer programs. For the German implementation of Art. 6 see § 69e UrhG.

[113] Wiebe, Interoperabilität von Software, Jipitec 2011, p. 89.

[114] ECJ, 2 May 2012, C-406/10, at para. 60 – SAS.

[115] Art. 3(1) of the Trade Secret Directive 2016/943 of 8 June 2016, OJ 2016 L 157/1.

[116] Before the entry into force of the Trade Secret Directive, it was believed that trade secret protection cannot be invoked against the use of interoperability information obtained through lawful reverse engineering and decompilation – see Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 12.

[117] See Schweyer, Die rechtliche Bewertung des Reverse Engineering in Deutschland und den USA, 2012, S. 172 ff.

[118] Behera/Bhaskari, Procedia Computer Science 2015, 757, 758; Schweyer, Die rechtliche Bewertung des Reverse Engineering in Deutschland und den USA, (Diss. 2012), S. 177, 239.

[119] See Behera/Bhaskari, Procedia Computer Science 2015, 757, 758.

[120] See, however, Schweyer, Die rechtliche Bewertung des Reverse Engineering in Deutschland und den USA, (Diss. 2012), S. 239-240, arguing in favour of a prohibition of the circumvention of Art. 6 of the Software Directive.

[121] Wiebe, Interoperabilität von Software, Jipitec 2011, p. 89, 92. See also: Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 19. For a comparison with the US approach see John Abbot, Reverse Engineering Software: Copyright and Interoperability, Journal of Law and Information Science 14 (2003), pp. 7 et seq.

[122] Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 8-9, 11.

[123] Wiebe, Interoperabilität von Software, Jipitec 2011, p. 89, 95.

[124] Wiebe, Interoperabilität von Software, Jipitec 2011, p. 89, 92.

[125] Commission, Communication – A Digital Agenda for Europe, COM(2010)245 fin., p. 15: “Since not all pervasive technologies are based on standards, the benefits of interoperability risk being lost in such areas”.

[126] For this idea see Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin. See also Kroes, How to get more interoperability in Europe, 10 June 2010: “complex antitrust investigations followed by court proceedings are perhaps not the only way to increase interoperability”.

[127] The idea was dropped for various reasons. The Commission was in doubt whether Art. 114 would provide a sound legal basis. It was also unconvinced that such a regime would be in conformity with the principle of proportionality (Art. 5(4) TFEU). Finally, it was concerned with the need to establish new regulatory institutions in the Member States that would need “to carry out an ex ante analysis of the market for identifying players with significant market power. Moreover, there would be serious technical difficulties to define market power. The analogy with the Access Directive breaks down due to the lack of identifiable market bottleneck assets in software that are equivalent to telecommunications networks”. Commission Staff Working Document: “Analysis of measures that could lead significant market players in the ICT sector to license interoperability information”, Brussels, 6.6.2013, SWD(2013)209 fin., p. 14-15.

License

Any party may pass on this Work by electronic means and make it available for download under the terms and conditions of the Digital Peer Publishing License. The text of the license may be accessed and retrieved at http://www.dipp.nrw.de/lizenzen/dppl/dppl/DPPL_v2_en_06-2004.html.
