Interoperability And Open Standards:
The Key To True Openness And Innovation

Simone Aliprandi (a)

(a) Lead of the Copyleft-Italia.it Project,
member of Array (Arraylaw.eu) and
Ph.D. Candidate at Bicocca University of Milan

DOI: 10.5033/ifosslr.v3i1.53

 

Abstract
Most people agree that providing a shared set of standards produces a broad advantage for all actors involved in the ICT market. It is an advantage first of all for the operators active in that market (companies, developers, designers), but also for users of computer technologies, as well as for observers and scholars.

However, if on one hand the very concept of a standard appears quite intuitive and broadly known, on the other hand not many people are aware of the complex dynamics behind the standard definition process, particularly in today’s globalized and technology-savvy world. Even fewer people seem aware that, when a standard definition process is not carried out with true transparency and care, the procedure can even become counterproductive for innovation itself. Therefore, in recent years a new approach to the standard definition process has been emerging, with the aim of producing standards based on the broadest level of openness and interoperability: the so-called open standards.

This essay will start by addressing the broad concept of standards, with specific reference to the world of technology; it will then focus on the drafting process of standards, highlighting the major problems regarding its legal, economic and technological aspects. The final section will concentrate on the very concept of an open standard.

Keywords

Free Libre and Open Source Software; Innovation; Open standard; Open format; Standardization; Standard-setting process; Standard-setting organizations; Interoperability; Copyright; Royalty-free; Patent.

1. The crucial role of interoperability

Many sources consider interoperability one of the key features pertaining to freedom of information in a broader sense. Indeed, the lack of this “interoperable by default” feature threatens to undermine the whole FLOSS (Free Libre and Open Source Software) system.

According to its general definition, interoperability is the intentional design of a technology product or system that allows it to cooperate with other products or systems without restriction or difficulty, thus producing reliable outcomes and optimizing resources. The main goal of an interoperable system is to facilitate interaction between different software applications and to enable the sharing and re-use of information among non-homogeneous systems.

Based on this definition, and given the current evolution and state of the mass computer market, it is clear that interoperability plays a pivotal role in ensuring competition between all of the actors involved. Major computer companies with large market shares can easily control and limit the competitive power of their rivals through intentional product design, a conduct defined by competition law as “abuse of a dominant position”.

Let us consider the typical scenario of a global corporation that produces the most widely used operating system and that, taking advantage of the basic tools provided by industrial and intellectual property law (trade secrets, copyright, patents), effectively prevents other companies from accessing the data needed to develop applications fully compatible with that operating system. In this fashion, the corporation would also effectively appropriate the application market, given the competitive edge provided by the internal availability of that data. Similar practices should be (and, fortunately, actually are) monitored and properly sanctioned by the antitrust authorities.

The complexity and importance of these aspects within the current economy put the spotlight on the central role of interoperability. Indeed, during the last few years this issue has gained particular relevance in public opinion and in international policy bodies as well. As a result, today we have a more articulate and adequate definition of interoperability, promoted by a research study launched and concluded in 2004 by the IDABC (Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens) on behalf of the European Commission. The study focused on the implications of e-government and on the relationship between citizens and public administrations. In its final report, this research includes a detailed definition of the interoperability concept and a list of major objectives pertaining to the EU Member States. The document is titled “European Interoperability Framework for pan-European eGovernment Services”, also known by its acronym EIF. Paragraph 1.1.2 provides the following introduction to the actual definition of interoperability:

“Interoperability means the ability of information and communication technology (ICT) systems and of the business processes they support to exchange data and to enable the sharing of information and knowledge.”

Later on, the EIF document delves into the practical issues of this concept by detailing its three different levels, that is: organizational, semantic and technical interoperability:

  • Organizational interoperability. This aspect of interoperability is concerned with defining business goals, modelling business processes and bringing about the collaboration of administrations that wish to exchange information and may have different internal structures and processes. Moreover, organisational interoperability aims at addressing the requirements of the user community by making services available, easily identifiable, accessible and user-oriented.

  • Semantic interoperability. This aspect of interoperability is concerned with ensuring that the precise meaning of exchanged information is understandable by any other application that was not initially developed for this purpose. Semantic interoperability enables systems to combine received information with other information resources and to process it in a meaningful manner. Semantic interoperability is therefore a prerequisite for the front-end multilingual delivery of services to the user.

  • Technical interoperability. This aspect of interoperability covers the technical issues of linking computer systems and services. It includes key aspects, such as open interfaces, interconnection services, data integration and middleware, data presentation and exchange, and accessibility and security services.1
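
By way of illustration only (this sketch is mine, not part of the EIF document), the following minimal Python fragment shows the difference between the technical and the semantic level: two hypothetical systems exchange a record through an open, documented format (JSON), and also agree on a shared vocabulary so that the received data can be processed in a meaningful manner. All names used here are invented for the example.

import json

# Technical interoperability: both systems rely on an open, documented
# exchange format (here JSON), so either side can parse what the other sends.
def export_record(citizen_id: str, birth_date: str) -> str:
    """System A (hypothetical) serializes a record using the shared format."""
    return json.dumps({"citizen_id": citizen_id, "birth_date": birth_date})

# Semantic interoperability: both systems also agree on what the fields mean
# (a shared vocabulary), so the receiver can combine and process the data.
SHARED_VOCABULARY = {"citizen_id", "birth_date"}

def import_record(payload: str) -> dict:
    """System B (hypothetical), developed independently, consumes the record."""
    record = json.loads(payload)           # technical level: parsing succeeds
    unknown = set(record) - SHARED_VOCABULARY
    if unknown:                            # semantic level: meaning must match
        raise ValueError("fields outside the shared vocabulary: %s" % unknown)
    return record

if __name__ == "__main__":
    message = export_record("IT-0042", "1980-05-01")
    print(import_record(message))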

Here, it is useful to quote a tip from computer adviser Bob Sutor, who recommends avoiding any confusion with the “intraoperability” concept: a kind of fake interoperability in which a single product, standard or platform remains predominant over other comparable items.

“I think the word 'interoperability' is being similarly abused. When a single vendor or software provider makes it easier to connect primarily to his or her software, this is more properly called intraoperability. In the intraoperability situation, one product is somehow central and dominant, either by marketshare, attitude, or acquiescence. The connectivity is supported by protocols and data formats that favor the central software, and those are often prescribed by the provider. […] Compare this with real interoperability. In this situation, we use truly open standards that do not favor any one software provider. They work to allow two pieces of software to work together as they do any two others. Certainly one of the providers might have a superior market position, but it is not given or maintained by the asymmetrical intraoperable situation.”2

2. The “standard” concept

Within the context of interoperability, the concept of “standard” emerges as a common but often overlooked feature. Any generic dictionary could provide the following definitions for the term standard:

Something, such as a practice or a product, that is widely recognized or employed, especially because of its excellence.3

A pattern or model that is generally accepted.4

Both definitions make clear that this concept does not refer exclusively to the technology field, but more generally to manufacturing and industrial markets.

However, by limiting our analysis to the technology field, it becomes easy to understand how the standard concept pairs perfectly with the interoperability concept. Indeed, a broadly recognized standard, whose features are publicly available, fosters the development of adequate technology solutions along two directions: on one side, by accessing such information, designers and developers can avoid wasting resources and have more opportunities to see their products succeed in the market; on the other, since the products are designed on the basis of shared standards, final users have the assurance that such products will actually perform seamlessly together.

This approach is also confirmed by an interesting definition to be found in the online encyclopedia Webopedia:

“[A standard is] a definition or format that has been approved by a recognized standards organization or is accepted as a de facto standard by the industry. Standards exist for programming languages, operating systems, data formats, communications protocols, and electrical interface. From a user's standpoint, standards are extremely important in the computer industry because they allow the combination of products from different manufacturers to create a customized system. Without standards, only hardware and software from the same company could be used together. In addition, standard user interfaces can make it much easier to learn how to use new applications.”5
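
As a concrete (and purely illustrative) counterpart to this definition, the short Python fragment below writes a small table in CSV, a format documented in RFC 4180, and reads it back using the language’s standard library alone. Because both sides follow the same published rules, either role could just as well be played by a spreadsheet or database tool from any vendor.

import csv
import io

# A table serialized in a documented format (CSV) by one "product"...
rows = [["product", "price"], ["keyboard", "49.90"], ["monitor", "179.00"]]
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)
exported = buffer.getvalue()

# ...and read back by an independent implementation of the same format.
imported = list(csv.reader(io.StringIO(exported)))
assert imported == rows
print(imported)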

3. Differences between de jure and de facto standards

A traditional distinction in the field of standards identifies two major categories, sketched briefly below as a general introduction (a deeper analysis will follow in later sections of this essay): the de jure standard and the de facto standard.
A de jure standard results from a proper technical analysis and definition process, carried out by the appropriate organizations and based on a formal definition and description drafted in a specific document. The organizations in charge of these tasks are accordingly known as “standard-setting organizations” (or, more generically, “standardization bodies”).

These norms are drafted through a complex mechanism that includes consultation and analysis stages carried out by the regulatory body, along with experts in the specific industrial sector and the so-called stakeholders, that is, all actors potentially interested in the emerging standard.
Obviously, a norm becomes more authoritative the larger the number of stakeholders involved in the definition process and the greater the transparency and precision of the final standard description. The following sections will further investigate the dynamics of the regulatory process.

It should be noted here, however, that not every reference model rises to de jure standard status. In fact, some models are commonly considered ‘standard’ simply because of their widespread dissemination, even though they have never been recognized as such by the appropriate organizations through a regular standardization process. In these instances, we have a de facto standard.6

Therefore, we should focus our attention on the generic definitions just mentioned, in which a major role is played by an unspoken “agreement”. Indeed, every definition implies a unifying element: a technical model comes to be considered a standard by virtue of a general agreement, that is, on the basis of a more or less explicit acceptance.

In this context, it is worth taking into account another definition for the term ‘standard’, included in the “Frequently asked questions” section of the International Organization for Standardization (ISO) website:

“[A standard is] a documented agreement containing technical specifications or other precise criteria to be used consistently as rules, guidelines, or definitions of characteristics to ensure that materials, products, processes and services are fit for their purpose.”7

For comparison purposes, here is the definition included in the document drafted by ISO/IEC and called 'Rules for the structure and drafting of International Standards':

“[A standard is] a document, established by consensus and approved by a recognized body, that provides, for common and repeated use, rules, guidelines or characteristics for activities or their results, aimed at the achievement of the optimum degree of order in a given context (note: Standards should be based on the consolidated results of science, technology and experience, and aimed at the promotion of optimum community benefits).”8

4. The standardization process

As mentioned earlier, the path leading to an actual standard definition is called a standardization (or regulatory) process: it involves several stages, relies on the conventional features defining the standard itself and is carried out by specialized bodies whose authority and credibility are widely recognized.

Defining a standard differs from the creation of a typical legal norm. The standard refers essentially to the idea of a “norm” intended as a “type” or “model” to which operators in a given market should adhere in order to be part of the “game”, at the risk of being excluded from the game itself (or at least of participating on harder terms). In other words, in the common meaning (the legal norm) the founding idea is that of a social group whose members are all bound to abide by certain rules and where any violation leads to a judicial sanction. In the other instance (the standard definition), there is instead a reference model defined by conventional dynamics, and any interested party (the market operators) can choose whether or not to accept it, keeping in mind, though, that the choice of non-acceptance will lead to serious difficulties in their market operations.
The regulatory process, further detailed below, is one of the key points of today’s innovation in a world increasingly permeated by technology. Accordingly, it is a very sensitive and complex step, involving legal, economic, political and ethical issues that reach well beyond technology itself and require a multi-faceted approach.

4.1. Major principles of the standard-setting activity

The standard-setting activity is based on a few general principles whose compliance is essential to provide reliability and authority to the final standard. Those principles are:

  • A consensual agreement, that is, reaching the maximum consensus possible between all parties involved in the regulatory process. This is a pillar of the credibility of the entire process and of the overall stability of the standard;

  • A democratic procedure, given that a process based on democratic mechanisms ensures that “all parties are represented in each stage […] and that they can contribute on an equal footing to the consensual approval of a project”;

  • Transparency, given the importance of ensuring that throughout the standard-setting process all involved parties have “the right, and the duty as well, to understand the ‘rules of the game’, that is, the regulations governing the activities of the various committees and working groups and their areas of expertise, along with full access to any documentation detailing the very regulatory process in progress”.9

Of course, these are mostly ideal principles that “should” give shape to the standard-setting process. In fact, we shall see that not all standard-setting organizations follow these principles in a consistent and regular way.

4.2. The stages of a standard-setting process

Each standardization body issues its own norms, adopts its own procedures and follows its own practices for the standardization process. However, in almost any standardization process there is bound to be a shared paradigm upon which our analysis can rely. According to the model proposed by ISO,10 each process develops along three general stages:

  • The need for a standard is usually expressed by an industry sector, which communicates this need to a national member body. The latter proposes the new work item to ISO as a whole. Once the need for an International Standard has been recognized and formally agreed, the first phase involves definition of the technical scope of the future standard. This phase is usually carried out in working groups, which comprise technical experts from countries interested in the subject matter.

  • Once agreement has been reached on which technical aspects are to be covered in the standard, a second phase is entered, during which countries negotiate the detailed specifications within the standard. This is the consensus-building phase.

  • The final phase comprises the formal approval of the resulting draft International Standard (the acceptance criteria stipulate approval by two-thirds of the ISO members that have participated actively in the standards development process, and approval by 75% of all members that vote), following which the agreed text is published as an ISO International Standard.

According to another source,11 the definition of an International Standard can also be described as following these steps:

  • Proposal and evaluation of the need for the standard itself;

  • Putting together a first draft;

  • Consensus seeking based on that draft;

  • A broader inquiry stage, with the project being disseminated outside the circle of interested parties to gather comments, suggestions, criticism or support;

  • Approval of the final draft by the standard-setting organization;

  • Publication of the official standard;

  • A possible revision stage, in case of specific requests or needs emerging after its publication.

It is clear that this second scheme is just a more detailed version of the previous procedure.
In most cases, the drafting of the standard’s technical language is managed by internal committees and working groups, including experts representing all interested economic and social stakeholders (producers, suppliers, customers, final users, distributors, research centers, consumers, public administration officials, etc.). The standardization body therefore plays mostly a coordinating role and makes its own organizational structure available.

Finally, it should be noted that there is an increasing trend whereby international standardization bodies adopt a standard already formalized by other regulatory bodies: in these instances, we have a so-called “second-degree standardization”.

This trend is particularly evident in complex application areas (such as, indeed, the ICT sector), where the standardization process requires long and articulated technical evaluations and can achieve a better outcome if managed by a specialized body, which can then submit the standard at an advanced stage for final revision and ratification.

4.3. Standard publication and usage

The standardization process produces a final text or hypertext document, including all the information necessary to follow and reproduce the model described there – the so-called standard specifications. Therefore, companies interested in developing a product according to that standard must have full access to those specifications.

With few exceptions, as explained below, at this stage the major standardization bodies treat the documentation produced as content covered by industrial property law (trade secrets, copyright). As a consequence, those standardization bodies usually do not distribute their documentation free of charge (except in a few particular instances) and, in order to access it, interested operators must pay a royalty and acquire the necessary permission.

Relying on such rights, the standardization body can even decide to regulate access to and use (and, indirectly, the implementation) of that standard by the customer. It is important to point out, however, that these considerations essentially pertain to access to the standard documentation, rather than to the subsequent stage of its implementation. In fact, in addition to the legal protection covering access to the standard documentation, there could also be industrial property rights (that is, patents) on the technical solutions included and described in the standard itself. Therefore, whoever acquires such documentation could still be prevented from adopting and implementing the standard, unless they pay a further royalty to the possible patent holders.12

This is a crucial distinction for fully understanding the legal intricacies related to regulatory and development issues within the technology sector. On the other hand, as we will soon illustrate, intellectual property management (i.e., the licensing of patents) is indeed one of the most sensitive matters in the field of standardization.

Finally, we should keep in mind that most of the revenue for the standard-setting organizations – besides membership fees from associated or affiliated parties – comes from distributing the documentation related to those standards and from licensing the standard itself to be implemented by entities (companies and other professional operators) not actively involved in the standardization process itself.

5. The ICT sector: between de facto standards and network externalities

As mentioned earlier, the issues around interoperability and shared standards have particular relevance in the ICT sector. In this context, Massimiliano Granieri effectively points out that “the proliferation of rights and stakeholders involved in the standard definition of a certain product grew very large in the information and communication industry, featuring complex assets and system assets where interoperability becomes the basic condition for the existence of the market”.13
Therefore, the direct link between this picture and the strong presence of network externalities makes the ICT sector more inclined than other fields to the affirmation of de facto standards and not-so-virtuous market dynamics, where the winner is not the best but rather the strongest and most determined actor.14

As illustrated in the previous sections, in the past there have been some instances of successful de facto standards, that is, reference models able to impose themselves and take root thanks to smart market strategies rather than to an actual test of their features. It is safe to say that, empirically, the winner has not always been the most effective and innovative standard.

Indeed, the most emblematic and often mentioned case of such a situation comes directly from the technology world (specifically, home video-recording formats): the VHS format, proposed in 1976 by JVC, defeated its direct competitor, the Betamax format, developed in 1975 by Sony. A short summary of this story helps us better understand the market dynamics behind similar practices:

“When home VCRs started to become popular in the UK, the main issue was one of availability and price. VHS machines were available through the high street rental chains such as Radio Rentals and DER (most of whom were owned by Ferguson Electronics, who were part-owned by JVC, the inventors of VHS), while Beta was seen as the more upmarket choice for people who wanted quality and were prepared to pay for it. By 1980, out of an estimated 100,000 homes with VCRs, 70% were rented, and the presence of three (the third being Video 2000) competing formats meant that renting was an even more attractive choice, since a small fortune (about £2000 or $3900 in today's prices) could be spent on a system which may become obsolete. By the time Betamax machines became easier to rent, VHS had already claimed 70% of the market.”15

These strategic mechanisms supporting the affirmation of a de facto standard in the market have been studied by economic theorists, particularly in the context of the so-called network economies mentioned earlier in this article.

6. Major issues facing the standardization process

The following paragraphs will try to focus the reader’s attention on the major issues highlighted by the scientific literature (particularly in the legal and economic fields) regarding the standardization process. Rather than a complete discussion, we will provide a general framework and some ‘food for thought’, referring to other, more specialized sources for a deeper assessment.

6.1. Standard and technology innovation

The framework outlined so far seems to suggest that standardization is inherently virtuous and desirable. As a consequence, we might easily conclude that the existence of pre-defined reference standards is always beneficial to technology development. However, the most careful observers underline that a much more complicated issue is at stake here.

When establishing a specific standard, even under the most transparent and shared procedures, we try to crystallize a specific reference model dictating the future development of a certain technology. At the same time, however, we are fully aware that any development of technology relies on fast and steady evolution – knowing all too well that eventually any effort to crystallize will be overwhelmed by the current of this flooding river. In other words, a specific standard only illustrates the state of the art and the techniques pertinent to the moment of the standard setting, or just a bit beyond that.

Therefore, the standardization process should take these dynamics into account and maintain a fluid perspective in order to encourage rather than stifle innovation. The stakeholders involved in the standardization process should adopt a medium-to-long-range viewpoint, so that the standard becomes a pillar and a foundation providing actual support to future technology solutions. This is the reason why, in most cases, a technology model is recognized as a standard only once it is relatively solid and well known.

Mario Calderini points out a crucial issue when explaining that the standardization process involves the implicit co-existence of two opposing forces, which must be kept in a tightly controlled balance if the aim is truly to gain more neutrality and technological innovation:

On one hand, we have a typical problem related to the standardization process: ensuring that the convergence procedures come to fruition with efficient outcomes (selecting the best technology available) as soon as possible. On the other hand, we have to ensure a virtuous coexistence between platform openness and interoperability and the need to define a competitive context able to foster innovation.16

As a consequence of this situation, another risk emerges: a badly structured standardization system could lead to a deadlock and hardening of the market, where the replacement of an obsolete standard with a more modern one could be stifled for purely strategic reasons. Once a standard has taken root (that is, it is widely adopted by companies and broadly used by consumers), it generates a natural inertia which makes it particularly difficult to replace with a new, innovative and technologically superior standard.

Addressing a key point within this framework, Andrea Giannaccari effectively underlines that “positive network features could become high entry barriers – wisely modeled by lock-in strategies – with the not-so-remote risk that such practice could even lead to an oligopolistic tunnel, thus putting out of play or delaying the entry of superior technologies”.17

6.2. Regulatory activities and intellectual property management

The growing need for a standardization approach in today’s ICT field, aimed more and more at technological convergence and integration, seriously calls into question some of the basic paradigms of intellectual property. This is because the very development of a standard definition rests on an apparent contradiction: when a company gets involved in a standardization process, it is required to play in the open and must share with the other stakeholders its own know-how about the very technology being considered for standardization. Obviously, in a broader sense, this know-how includes not only the business secrets typical of any technology design and development activity, but also (and foremost) industrial property rights such as patents and copyright.

The technical term for this coming into the open is “IPR (Intellectual Property Rights) disclosure”, and it represents one of the key points on the roadmap to standard definition. Indeed, the holders of industrial property rights should truly embrace a collaborative and transparent approach, by openly declaring their property rights on the technical solutions being considered for the standardization process and by agreeing not to make strategic use of these tools of legal protection. In fact, we could picture an instance (actually, not so rare) where one of the companies involved in the standardization process hides from the other stakeholders its own patent on part of the very technology that is becoming a standard. That company could even decide to disclose its exclusive rights only after the standard has been formalized and published, thus demanding a royalty fee or even threatening legal action against the other parties. Such behavior would be unfair from an ethical and competitive standpoint in the first place, but also quite dangerous for the entire standardization system, which could easily be stifled and miss its essential goal of establishing a virtuous platform aimed at innovation and interoperability. This is the main reason behind the transparent and consistent intellectual property policies adopted by the major standard-setting organizations.18

In addition, we must consider that, as mentioned earlier, even after a standard has been formalized it might still contain technical solutions covered by some sort of property rights. Therefore, it is crucial to avoid the risk that the subsequent adoption of standards by operators foreign to the standardization process could become a “trap”19 with heavy consequences from a legal point of view. Indeed, the discovery of a patent later on, in the development stage of a product or application, puts the developer in an extremely weak contractual position.
According to some authors, the instrumental use of intellectual property rights gains primary importance for the functioning of standards and, if not properly monitored, could even turn into some sort of “pathology” capable of degrading the entire standardization system.20

6.3. Standardization and competition issues

It is common knowledge that antitrust bodies ensure fair market competition by closely monitoring the organizations in which companies draft agreements on market development, exchange information and establish timetables, procedures and prices for their products and services. According to some sources, standard-setting organizations could, by their very nature, interfere with these competitive dynamics.

Within the European context, the most relevant provision is Article 101 of the Treaty on the Functioning of the European Union,21 which specifically addresses such agreement practices among companies (so-called “cartels”). The first two paragraphs appear quite strict in prohibiting such practices, and many provisions seem to address standard-setting organizations directly.

1. The following shall be prohibited as incompatible with the internal market: all agreements between undertakings, decisions by associations of undertakings and concerted practices which may affect trade between Member States and which have as their object or effect the prevention, restriction or distortion of competition within the internal market, and in particular those which:

(a) directly or indirectly fix purchase or selling prices or any other trading conditions;

(b) limit or control production, markets, technical development, or investment;

(c) share markets or sources of supply;

(d) apply dissimilar conditions to equivalent transactions with other trading parties, thereby placing them at a competitive disadvantage;

(e) make the conclusion of contracts subject to acceptance by the other parties of supplementary obligations which, by their nature or according to commercial usage, have no connection with the subject of such contracts.

2. Any agreements or decisions prohibited pursuant to this Article shall be automatically void.

However, this prohibition becomes less stringent in the third paragraph, whose provisions seem to protect precisely the existence of shared agreements among companies aimed at defining standards.

3. The provisions of paragraph 1 may, however, be declared inapplicable in the case of:

- any agreement or category of agreements between undertakings,

- any decision or category of decisions by associations of undertakings,

- any concerted practice or category of concerted practices,

which contributes to improving the production or distribution of goods or to promoting technical or economic progress, while allowing consumers a fair share of the resulting benefit [omissis].22

In other words, the EU can consider legitimate, on a case-by-case basis, business agreements that do not pose any danger to the balance of competition and, therefore, exempt them from the prohibition clauses listed in the first paragraph of Article 101 (ex 81). In fact, the European Commission regularly provides instructions about the application of the Article 101 provisions in order to help companies choose and abide by agreements compatible with the competition rules.23

Beyond this general framework concerning the scope of application of Article 101 (ex 81), we should also take into account more specific and complex problems, particularly those issues that involve antitrust principles and the strategic use of industrial property rights within the standardization process itself (including, for example, the issue technically defined as “patent pooling”24) mentioned above.

According to some important authors, there are meaningful points of contact and contrast among the standardization process, industrial property rights and fair competition regulations. Indeed, a right over a technology does not concern only the possibility of creating and marketing that technology, but also provides control over competition in the market for related products that are based on or will use such technology.

In addition, we should keep in mind the differences existing between the US and European models; as a consequence, the legal and economic approach to the friction between standardization and competition law differs as well.

7. Open standards

Based on the general picture illustrated above, in recent years the technology world at large (manufacturers, user communities, scholars and observers) has been witnessing a lively debate on the need to come up with standards capable of ensuring the greatest possible transparency throughout their adoption process and of allowing free access to the related documentation, in order to maximize the scope and range of interoperability. To better illustrate this state of affairs, we highlight below some definitions drafted by authoritative sources.

7.1. The Open Standard definition by Bruce Perens

As a renowned representative of the FLOSS community and the author of several popular essays on the issue, Bruce Perens has been quick to provide a clear and exhaustive definition of open standards. On his personal website,25 Perens lists six essential requirements for the establishment of an open standard:

  • Availability: Open Standards are available for all to read and implement;

  • Maximize end-user choice: Open Standards create a fair, competitive market for implementations of the standard. They do not lock the customer in to a particular vendor or group;

  • No royalty: Open Standards are free for all to implement, with no royalty or fee. Certification of compliance by the standards organization may involve a fee;

  • No discrimination: Open Standards and the organizations that administer them do not favor one implementor over another for any reason other than the technical standards compliance of a vendor's implementation. Certification organizations must provide a path for low and zero-cost implementations to be validated, but may also provide enhanced certification services;

  • Extension or subset: implementations of Open Standards may be extended, or offered in subset form. However, certification organizations may decline to certify subset implementations, and may place requirements upon extensions;

  • Predatory practices: Open Standards may employ license terms that protect against subversion of the standard by embrace-and-extend tactics. The licenses attached to the standard may require the publication of reference information for extensions, and a license for all others to create, distribute, and sell software that is compatible with the extensions. An Open Standard may not otherwise prohibit extensions.

This definition has been applied in several frameworks, including a research study carried out in 2007 by the UNDP (United Nations Development Programme) covering interoperability in e-government, under the title “New Guidelines on e-Government Interoperability Developed by Governments for Governments”. However, the document produced by this research notes the lack of general consensus on the requirements described by Perens, which some sources consider too restrictive. The controversial part is essentially the “no royalty” requirement: to some, it seems excessive to impose a royalty-free model, given that a fee, even one based on reasonable and non-discriminatory conditions, could actually represent a major incentive for the development and management of the standard.

7.2. The Open Standard definition by the ITU-T

This latter position has been embraced, among others, by the Telecommunication Standardization Sector (ITU-T), the organization coordinating standards for telecommunications on behalf of the International Telecommunication Union (ITU), based in Geneva, Switzerland. The ITU-T has taken a different stance on the Open Standard concept, providing first a broad encyclopedia-style definition and then listing a series of requirements. The following is the basic definition available on the ITU website:

“Open Standards are standards made available to the general public and are developed (or approved) and maintained via a collaborative and consensus driven process. Open Standards facilitate interoperability and data exchange among different products or services and are intended for widespread adoption.”26

The same page includes a list of the requirements proposed by ITU, with the notice that those are not obligations but rather some general and illustrative conditions:

  • Collaborative process: voluntary and market-driven development (or approval) following a transparent consensus-driven process that is reasonably open to all interested parties.

  • Reasonably balanced: ensures that the process is not dominated by any one interest group.

  • Due process: includes consideration of and response to comments by interested parties.

  • Intellectual property rights (IPRs): IPRs essential to implement the standard to be licensed to all applicants on a worldwide, non-discriminatory basis, either (1) for free and under other reasonable terms and conditions or (2) on reasonable terms and conditions (which may include monetary compensation). Negotiations are left to the parties concerned and are performed outside the SDO.27

  • Quality and level of detail: sufficient to permit the development of a variety of competing implementations of interoperable products or services. Standardized interfaces are not hidden or controlled other than by the SDO promulgating the standard.

  • Publicly available: easily available for implementation and use, at a reasonable price. Publication of the text of a standard by others is permitted only with the prior approval of the SDO.

  • On-going support: maintained and supported over a long period of time.

7.3. The Open Standard definition by the IDABC

Finally, here is a less articulated, but concise and clear, definition which the main institutional bodies today consider the most reliable. This definition is included in the above-mentioned European Interoperability Framework (EIF) and has been adopted by several standardization organizations and public institutions, particularly in their regulations and recommendations related to e-government.
According to the definition drafted by the IDABC,28 a standard is considered “open” when:

  • The standard is adopted and will be maintained by a not-for-profit organisation, and its ongoing development occurs on the basis of an open decision-making procedure available to all interested parties (consensus or majority decision etc.).

  • The standard has been published and the standard specification document is available either freely or at a nominal charge. It must be permissible to all to copy, distribute and use it for no fee or at a nominal fee.

  • The intellectual property - i.e. patents possibly present - of (parts of) the standard is made irrevocably available on a royalty-free basis.

  • There are no constraints on the re-use of the standard.

8. Classification criteria of Open Standards

The arrival of the “open standard” as a new entry expands the framework of standard categories beyond the two macro-categories mentioned earlier. In order to better illustrate this expansion of the standard concept and the related classification, we will quote directly the opinion of Alfonso Fuggetta, an Italian professor and scholar of technology and innovation.29

According to Fuggetta, a new classification based on the degree of openness of a standard includes five levels:30

  • level 0: undisclosed/proprietary;31

  • level 1: disclosed. The standard is owned by a company and is made “available” in some form to other companies and users. The owner controls the evolution of the standard;

  • level 2: concerted. There is a consultation, but the admission to the consultation process and the management of the process itself is controlled by the company or by the association of companies that emits the standard;

  • level 3: open concerted. There is an open participation process through which the standard is defined and managed;

  • level 4: open de jure. Standards are owned and managed by official international and national standard-setting organizations.

In turn, this classification leads to the following four kinds of standards:

  • proprietary standards, further differentiated in non-disclosed and disclosed proprietary standards;

  • concerted standard;

  • concerted open standard;

  • de jure open standard.

Prof. Fuggetta is careful to point out that only the last two categories can rightfully be considered “open standards” and that, despite the lack of universal consensus on the interpretation of this term, a true open standard should also be royalty-free.32

9. The web as an interoperable technology and the role of the W3C

Let us imagine for a moment stripping today’s World Wide Web of its interoperability features. The Web as we know it would probably cease to exist, or at least it would not have been able to reach its current level of evolution. It is true that general agreement on de jure standards for the Web was reached only relatively recently, particularly with the broad diffusion of the standards developed by the World Wide Web Consortium (W3C). Nevertheless, since its inception the whole Internet has taken off and grown exponentially thanks to widely shared protocols and standards. This trend enabled the penetration and worldwide success of the Internet in a way unsurpassed by any other technology model yet.

The Internet, and more specifically the Web, as a successful instance of an interoperable technology has received attention in much research and analysis. As an example of this literature, it will suffice here to quote briefly from the document “The Internet Standards Process”, drafted by Scott O. Bradner at Harvard University:

“The Internet, a loosely-organized international collaboration of autonomous, interconnected networks, supports host-to-host communication through voluntary adherence to open protocols and procedures defined by Internet Standards. There are also many isolated interconnected networks, which are not connected to the global Internet but use the Internet Standards.”33
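
To make the idea of “voluntary adherence to open protocols” tangible, here is a minimal sketch of my own (not taken from the quoted document): because HTTP is an openly specified protocol, a client written with nothing but a language’s standard library can talk to any conforming server, regardless of who built it. The URL used below is just an example.

from urllib.request import Request, urlopen

def fetch_headers(url: str) -> dict:
    """Send a standard HTTP GET request and return the response headers."""
    request = Request(url, headers={"User-Agent": "open-standards-demo/0.1"})
    with urlopen(request, timeout=10) as response:
        # Any server implementing the open HTTP standard answers with a
        # status line and headers structured in the same, documented way.
        return dict(response.getheaders())

if __name__ == "__main__":
    for name, value in fetch_headers("http://www.w3.org/").items():
        print(name + ": " + value)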

The W3C is an international community where member organizations, a full-time staff and the public work together to develop Web standards. W3C’s mission is “to lead the World Wide Web to its full potential by developing protocols and guidelines that ensure the long-term growth of the Web”.34 Founded in 1994 and still led by Sir Tim Berners-Lee, the W3C comprises more than 350 members, including tech and telecom companies, non-profit organizations and research institutions, both private and public. The W3C official website lists its main objectives and strategic principles in seven items:35

  • Universal access to web resources;36

  • Research and development to build the so-called Semantic Web;37

  • Promotion of a Web of Trust, an environment based on reciprocal collaboration, trust, privacy and responsibility;38

  • Promotion of interoperability and open standards;39

  • Fostering the evolution of the Web in step with the continuous development of technology;40

  • Decentralizing the architecture and organization of the Web itself;

  • Developing “cool” multimedia, that is, a Web ever closer to user needs and able to provide richer interaction, also for entertainment purposes.41

10. The OASIS approach to the standardization activity

Launched in 1993, the non-profit consortium Organization for the Advancement of Structured Information Standards (better known by its acronym OASIS) is mostly focused on promoting research and formalization of open standards in the world of ICT. According to the organization website, its mission is to lead “development, convergence and adoption of open standards for the global Information Society”.42

Founded under the name “SGML Open”, the consortium was intended as a community of vendors and users devoted to developing guidelines for interoperability among products that support the Standard Generalized Markup Language (SGML). In 1998, however, the organization changed its name to “OASIS Open” to reflect an expanded scope of technical work and the increasing attention paid by the ICT sector to technologies based on the Extensible Markup Language (XML) and to open standards in a broader sense. Today, OASIS has more than 5,000 participants representing over 600 organizations and individual members in 100 countries, with headquarters in the USA and major operating offices in Europe and Asia.
As highlighted on its website, the consortium structure features several interesting aspects that outline the basic philosophy behind its internal balance of power and its actual decision-making procedures. The OASIS approach is particularly oriented toward transparency, democracy and openness. Quoting directly from its “About” web page:

“OASIS is distinguished by its transparent governance and operating procedures. Members themselves set the OASIS technical agenda, using a lightweight process expressly designed to promote industry consensus and unite disparate efforts. Completed work is ratified by open ballot. Governance is accountable and unrestricted. Officers of both the OASIS Board of Directors and Technical Advisory Board are chosen by democratic election to serve two-year terms. Consortium leadership is based on individual merit and is not tied to financial contribution, corporate standing, or special appointment.”

Another meaningful aspect of the consortium’s modus operandi concerns the management of industrial property rights, an issue handled with great attention and an innovative strategy. In this regard, here is the answer to a specific question included in the FAQ section of its website:

“Most OASIS specifications are provided to the public on a royalty-free basis. The OASIS IPR Policy states that contributors of externally developed technical work must identify all IP claims (patents, trademarks, etc.) associated with that work, and must agree to grant use of this technology under reasonable and non-discriminatory (RAND) or royalty-free (RF) terms for purposes of implementing the OASIS specification.”43

Finally, OASIS embraces a pioneering approach to the whole standardization issue, encouraging the spread of positive mechanisms and trends. Indeed, it is not by chance that one of the most renowned standards – the Open Document Format, effectively recognized as an open standard – has undergone a standardization process carried out by the OASIS consortium.
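
As a closing illustration (a sketch under my own assumptions, not taken from the OASIS materials quoted above), the fragment below reads the text of an OpenDocument file using only generic tools: since the OASIS specification defines an ODF document as a ZIP package whose main content lives in an XML stream called content.xml, no vendor-specific software is needed. The file name is hypothetical.

import zipfile
import xml.etree.ElementTree as ET

# Namespace defined by the ODF specification for text content.
ODF_TEXT_NS = "urn:oasis:names:tc:opendocument:xmlns:text:1.0"

def extract_paragraphs(path):
    """Return the plain text of every paragraph in an ODF text document."""
    with zipfile.ZipFile(path) as package:
        content = package.read("content.xml")      # main content stream
    root = ET.fromstring(content)
    paragraphs = root.iter("{%s}p" % ODF_TEXT_NS)  # <text:p> elements
    return ["".join(p.itertext()) for p in paragraphs]

if __name__ == "__main__":
    # "example.odt" is a placeholder path for any ODF text document.
    for line in extract_paragraphs("example.odt"):
        print(line)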

Conclusion

This general overview of interoperability (and its relationship with standardization) highlights the importance of taking its many aspects into consideration in order to develop truly open and innovative technologies.

In fact, advancing FLOSS within an ecosystem polluted by market strategies still rigidly imposed by dominant players or de facto monopolies threatens to undermine much of the effort currently underway and to leave everything at a very abstract level.

On the other hand, applying open standards in a broader context also means building a transparent and fair foundation for effective and distributed technological innovation.

 

Licence and Attribution

This paper was published in the International Free and Open Source Software Law Review, Volume 3, Issue 1 (September 2011). It originally appeared online at http://www.ifosslr.org.

This article should be cited as follows:

Simone Aliprandi (2011) 'Interoperability and open standards: the key to true openness and innovation', International Free and Open Source Software Law Review, 3(1), pp 5-24
DOI: 10.5033/ifosslr.v3i1.53

Copyright © 2011 Simone Aliprandi

This article is licensed under a Creative Commons Attribution-NoDerivs 2.0 England and Wales licence, available at http://creativecommons.org/licenses/by-nd/2.0/uk/

As a special exception, the author expressly permits faithful translations of the entire document into any language, provided that the resulting translation (which may include an attribution to the translator) is shared alike. This paragraph is part of the paper, and must be included when copying or translating the paper.


About the author

Simone Aliprandi is an Italian attorney-at-law and researcher who is constantly engaged in writing, teaching and consulting in the field of copyright and ICT law. He has an additional degree in Public Administration Science and is finishing a Ph.D. program in Information Society at the Bicocca University of Milan. He founded and still coordinates the Copyleft-Italia.it project and has published several books devoted to open culture and copyleft. He also collaborates as a legal consultant with Array (http://www.arraylaw.eu). More information about his activities: http://www.aliprandi.org.

1 http://ec.europa.eu/idabc/en/document/3473.html

2 Sutor, Bob (2006) 'Interoperability vs. intraoperability: your open choice', http://www.sutor.com/newsite/blog-open/?p=1260

3 http://www.thefreedictionary.com/standard

4 http://dictionary.cambridge.org/dictionary/british/standard_2

5 http://www.webopedia.com/TERM/S/standard.html

6 Here is the “de facto standard” entry in the Webopedia Computer Dictionary: A format, language, or protocol that has become a standard not because it has been approved by a standards organization but because it is widely used and recognized by the industry as being standard (available at: http://www.webopedia.com/TERM/D/de_facto_standard.html)

7 http://www.iso.org/iso/support/faqs/faqs_standards.htm

8 http://www.iec.ch/tiss/iec/Directives-Part2-Ed5.pdf

9 These three principles are listed in UNI (eds.) (2006) 'Le regole del gioco', UNI (p. 22), available online at: http://www.uni.com/uni/controller/it/chi_siamo/regole_gioco.htm

10 http://www.iso.org/iso/standards_development/processes_and_procedures/how_are_standards_developed.htm

11 UNI (eds.) (2006) 'Le regole del gioco', UNI (p.108), available online at: http://www.uni.com/uni/controller/it/chi_siamo/regole_gioco.htm

12 According to some sources, this behaviour is a threat to the entire standardization process. For example, see this detailed report about the Rambus case provided by Carlo Piana: http://www.piana.eu/rambus_ce.

13 Calderini, M.; Giannaccari, A.; Granieri, M. (2005), 'Standard, proprietà intellettuale e logica antitrust nell'industria dell'informazione', Il Mulino (p. 34)

14 The de facto standard solution relies on market dynamics, self-regulation effects and operator support. The actual selection of a de facto standard, which is not necessarily the best available, is exclusively or mostly due to the power exerted by the specific actors (as in the case of the Microsoft operating system); (ibidem, p. 45-46)

15 http://en.wikipedia.org/wiki/Videotape_format_war

16 Calderini, M.; Giannaccari, A.; Granieri, M. (2005), 'Standard, proprietà intellettuale e logica antitrust nell'industria dell'informazione', Il Mulino (p. 17).

17 ibidem (p. 91)

18 While encouraging a simpler innovation process, the strategy of including technologies covered by intellectual property rights in a certain standard could also promote a specific agenda by the same property rights holders.  […] The behaviour of standard-setting organizations becomes crucial in dealing with the various issues related to intellectual property rights. (ibidem, p. 100)

19 Some sources define these instances as “patent ambushes”. For more details on this issue, see: Hueschelrath (2008) 'Patent Ambushes in Standards Setting Organizations. Implications for Antitrust Policy and the Design of IP Rules', AEA, available at http://www.aea-eu.net/2008Tokyo/DOCUMENTS/Publication/Abstract/HUSCHELRATH.pdf. See also: Farrell, Hayes, Shapiro, Sullivan (2007) 'Standard Setting, Patents, and Hold-Up', 74 Antitrust Law Journal No. 3; or just the Wikipedia entry: http://en.wikipedia.org/wiki/Patent_ambush

20 See this excerpt from the Rambus case mentioned earlier: Ghosts haunt the standardization process. They go by several names and come in different forms: “standards abuse”, “standards hijacking”, “patent ambush”, “royalty ambush”, “patent trolling”. The standardization world has never been so much under fire. Some companies try to bend the standardization process to fit their own selfish interest, without any regard for the common weal. Some others just sit and wait until some of their patent claims are “necessarily infringed” by a standard, the industry is locked in, and then pass the hat to collect the high toll that standard-abiding companies are forced to pay, in spite of the licensing rules of the standard setting bodies (SSB) that would require Reasonable And Non Discriminatory conditions (RAND) as a prerequisite for inclusion of any patented contribution into the standard. Others do the same, but in addition they actively seek to seed the standards with their own patented technology. Piana, Carlo (2009), 'Rambus and patents in standards', available at http://www.piana.eu/rambus_ce

21 The full text of the treaty is available at:  http://eur-lex.europa.eu/en/treaties/index.htm

22 For more details, see Commission Regulation (EU) No 330/2010 of 20 April 2010 on the application of Article 101, paragraph 3, of the Treaty to categories of vertical agreements and concerted practices.

23 See for example the 'Guidelines on horizontal cooperation agreements' available at http://europa.eu/legislation_summaries/competition/firms/l26062_en.htm

24 In patent law, a patent pool is a consortium of at least two companies agreeing to cross-license patents relating to a particular technology (http://en.wikipedia.org/wiki/Patent_pool)

25 http://perens.com/OpenStandards/Definition.html

26 http://www.itu.int/ITU-T/othergroups/ipr-adhoc/openstandards.html

27 Standard Developing Organization

28 This EIF definition is based on the Italian version available here: http://www.uni.com/uni/controller/it/comunicare/articoli/2007_1/odf_26300.htm

29 See the research paper 'Open standard, Open Formats, and Open Source', co-authored by Davide Cerri and available at http://www.davidecerri.org/sites/default/files/art-openness-jss07.pdf. The same ideas are also proposed in a post on Fuggetta's personal blog: http://www.alfonsofuggetta.org/?p=539.

30 Actually Fuggetta does not include the level 0, added here for clarity and completeness purposes.

31 This refers to a case when the standard specifications are not publicly available and the standard itself is being owned by an organization imposing its property rights.

32 Another interesting classification is presented by Dolmans, Maurits (2010) 'A Tale of Two Tragedies – A plea for open standards, and some comments on the RAND report', IFOSS L. Rev., 2(2), pp 115-138, DOI: 10.5033/ifosslr.v2i2.46

33 Bradner, S.O., 'The Internet Standards Process' (par. 1.1), available at: http://www.ietf.org/rfc/rfc2026.txt

34 http://www.w3.org/Consortium/mission

35 http://www.w3.org/Consortium/Points/

36 The W3C defines the Web as the universe of network-accessible information (available through your computer, phone, television, or networked refrigerator...). Today this universe benefits society by enabling new forms of human communication and opportunities to share knowledge. One of W3C's primary goals is to make these benefits available to all people, whatever their hardware, software, network infrastructure, native language, culture, geographical location, or physical or mental ability. W3C's Internationalization Activity, Device Independence Activity, Voice Browser Activity, and Web Accessibility Initiative all illustrate our commitment to universal access; (ibidem)

37 People currently share their knowledge on the Web in language intended for other people. On the Semantic Web ("semantic" means "having to do with meaning"), we will be able to express ourselves in terms that our computers can interpret and exchange. By doing so, we will enable them to solve problems that we find tedious, to help us find quickly what we're looking for: medical information, a movie review, a book purchase order, etc. The W3C languages RDF, XML, XML Schema, and XML signatures are the building blocks of the Semantic Web; (ibidem)

38 The Web is a collaborative medium, not read-only like a magazine. In fact, the first Web browser was also an editor, though most people today think of browsing as primarily viewing, not interacting. To promote a more collaborative environment, we must build a "Web of Trust" that offers confidentiality, instills confidence, and makes it possible for people to take responsibility for (or be accountable for) what they publish on the Web. These goals drive much of W3C's work around XML signatures, annotation mechanisms, group authoring, versioning, etc.; (ibidem)

39 Twenty years ago, people bought software that only worked with other software from the same vendor. Today, people have more freedom to choose, and they rightly expect software components to be interchangeable. They also expect to be able to view Web content with their preferred software (graphical desktop browser, speech synthesizer, braille display, car phone...). W3C, a vendor-neutral organization, promotes interoperability by designing and promoting open (non-proprietary) computer languages and protocols that avoid the market fragmentation of the past. This is achieved through industry consensus and encouraging an open forum for discussion; (ibidem)

40 W3C aims for technical excellence but is well aware that what we know and need today may be insufficient to solve tomorrow's problems. We therefore strive to build a Web that can easily evolve into an even better Web, without disrupting what already works. The principles of simplicity, modularity, compatibility, and extensibility guide all of our designs; (ibidem)

41 Who wouldn't like more interactivity and richer media on the Web, including resizable images, quality sound, video, 3D effects, and animation? W3C's consensus process does not limit content provider creativity or mean boring browsing. Through its membership, W3C listens to end-users and works toward providing a solid framework for the development of the Cooler Web through languages such as the Scalable Vector Graphics (SVG) language and the Synchronized Multimedia Integration Language (SMIL); (ibidem)

42 According to the presentation page at http://www.oasis-open.org/who: OASIS is a not-for-profit consortium that drives the development, convergence and adoption of open standards for the global information society. The consortium produces more Web services standards than any other organization along with standards for security, e-business, and standardization efforts in the public sector and for application-specific markets

43 From the OASIS 'Frequently Asked Questions' page at http://www.oasis-open.org/who/faqs.php.
For further details about the IPR policies promoted by OASIS, see this specific section of their website:
http://www.oasis-open.org/who/intellectualproperty.shtml