Linux Foundation is Eating the World

C. Bradford Biddle a

(a) Faculty Fellow, Center for Law, Science and Innovation, Arizona State University, and Principal, Biddle Law PC, Portland, OR USA

DOI: 10.5033/jolts.v11i1.137

Abstract

Marc Andreessen’s 2011 article Why Software is Eating the World suggested that developments in the information and communications technology (ICT) industry are now transforming industries far beyond ICT. By facilitating interoperability, private sector-led technology consortia contributed to these developments, and they will continue to play a critical role as interoperability requirements grow in complexity. Consortia themselves have evolved over time, and continue to change. While historically many consortia focused on hardware interoperability, open source software is increasingly part of how interoperability occurs, and today’s consortia reflect this. The extraordinary growth and rapidly expanding roles of the software-centric Linux Foundation are striking evidence of this new reality. This story holds important lessons for European stakeholders. Within this changed technology standardization landscape there are opportunities for European leadership.

Keywords

Law; information technology; Free and Open Source Software; standards; consortia; interoperability

 

I. Introduction

An explanation of the title of this article can serve to explain its goals. Linux Foundation is a non-profit organization based in the USA that hosts development of the open source operating system software called Linux. Less obviously, Linux Foundation also hosts at least 155 other collaborative software and specification development projects, and is growing at an extraordinary pace. The title also references a well-known 2011 article by venture capitalist Marc Andreessen called Why Software is Eating the World. In this article Andreessen argued that “six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.” (Andreessen, 2011). He describes a software revolution transforming a broad range of industries: entertainment, retailing, manufacturing, health care, education – even industries like oil and gas and national defense.

This paper argues that private sector-led collaborative technology development organizations called “consortia” have been a fundamental part of advancing this revolution. It further argues that they are now also being transformed by it, and that the remarkable growth of Linux Foundation is evidence of this transformation. Ultimately this is not a paper about the Linux Foundation, however. Rather, the paper explores more generally how the technology standardization process is changing, and surveys what these changes may mean in particular for various European stakeholders.

Consortia such as the USB Implementers Forum, the Wi-Fi Alliance and the Bluetooth SIG, and countless other similar organizations have long played a critical role in developing interoperability standards in the information and communications technology (ICT) industry – arguably a more important role than formal international standards development organizations like the International Telecommunication Union (ITU) or the International Organization for Standardization (known as ISO). Over the past several decades the ICT industry developed and honed a model for the formation and operation of collaborative groups that enabled ICT product interoperability in a diverse array of technology areas. Typically the core deliverables of these organizations have been technical specifications – textual documents that describe how to build interoperable systems – which in many cases serve as standards in particular industry segments.

This specification-focused model is increasingly challenged by open source software projects, such as those hosted by Linux Foundation, that facilitate interoperability through shared software code rather than shared technical specifications. This challenge has forced traditional consortia to adapt, with software now playing a more important role in many organizations. At the same time, both traditional consortia and open source software projects seem to be recognizing some limitations of software as a path to sustained industry interoperability (in part due to the risk of software projects “forking” into incompatible paths), and appear to be seeking some synthesis of the specification-oriented and the software-oriented models.

We are in the midst of an era of significant change around how technology consortia organize themselves and the types of deliverables they produce. And, because consortia play a critical role in the global ICT standardization process, and because the ICT standardization process has implications far beyond just the ICT industry, this change is consequential.

Historically technology consortia have been largely a United States-based phenomenon. Of the many hundreds – perhaps thousands – of private sector-led technology consortia that have been formed over the past decades, only a small percentage have been based in Europe. Europe has certainly played a leading role in global standardization – in addition to being the home of formal standards organizations like ITU and ISO, European-based organizations like 3GPP, ETSI, ECMA and others are major forces in ICT standardization – but European consortia in the style of USB, Bluetooth, Wi-Fi and the like are the exception, not the rule.

The European Commission recently issued a call for a study “to reinforce the EU’s competitiveness in digital technologies,” based on an “observation that European industry needs to come to agreements on functions and interfaces for those platforms, reference architectures and interaction protocols that have the potential to create markets and market opportunities leading to ecosystems and standards” (European Commission, 2018). Fundamentally, this is what specification-oriented technology consortia have done for decades, and what collaborative open source software projects like those hosted by Linux Foundation are increasingly doing. The EC study proposal suggests that European policymakers are rethinking the role that European industry and European institutions play in this process. Given the transformational effect technology is having across many different kinds of industries, and given the changing environment of private sector-driven collaborative interoperability efforts, this reevaluation is both important and timely.

The paper proceeds in three main sections. The first part explains the critical role played by technology consortia. The second part identifies how consortia are changing. The final section offers some observations for European stakeholders to consider in light of this changing environment.

II. The Importance of Consortia

The technology consortia that are the focus of this paper occupy a space between, on the one side, unilateral actions of a single company, and, on the other side, formal standardization efforts. This paper adopts the taxonomy of “single company” efforts, “consortia,” and “formal standards development organizations” set forth in Biddle (2017); the emphasis here is on consortia, under that framework. Accordingly, this paper focuses on structured, private sector-led collaborative efforts that produce technical specifications, software code, reference designs, or other similar deliverables that enable interoperability between third party products or services, or that otherwise support such efforts by providing compliance testing services, marketing support, or similar support services.  Technology consortia are the organizations that are formed – with varying structures and degrees of formality – to advance these efforts.  Formal standards development organizations – i.e., those organizations vested with some direct or indirect governmental authority to create “standards,” including all ANSI-accredited standards developers in the U.S. – are definitionally not consortia. In practice, the boundaries between these various kinds of activities can be blurry, however: some large consortia look much like formal SDOs, and some smaller or more specialized SDOs look much like private sector-driven consortia.

Examples of well-known technology consortia include:

Beyond these large, relatively well-known organizations, countless other industry-led collaborative efforts engage in similar activities, with both large and small impacts. The PCI-SIG’s PCIe specification is used in virtually every high-end computing device. The HDMI Forum defines the ubiquitous HDMI video connector. MIPI Alliance produces interface specifications for mobile device hardware that are embodied in literally every modern smartphone on the planet. The OPC Foundation’s OPC UA specification and associated software code are ubiquitous in the industrial automation industry and are shaping the factories of the future in profound ways. Khronos Group provides the foundational specifications and code that enable cross-platform virtual and augmented reality. OpenStack Foundation’s open source code has transformed the data center. Groups fill narrower niches too: the QSFP-DD MSA Group, for example, is a contractual arrangement between various stakeholder companies that was formed to create a specification for a new optical hardware connector used in telecom routing devices. While no one knows the exact number of these kinds of groups, it is fair to state that they number in the many hundreds, and perhaps thousands.1

Consortia have played a critical role for the information and communications technology industry. Perhaps more than any other industry, ICT faces deep and complex needs for interoperability between third party products and services. ICT products are built from tens of thousands of components sourced from an equally large number of vendors. These products leverage complex firmware, operating system and user-level software stacks, and they communicate over a variety of local and wide-area wired and wireless networks that ultimately span the globe. They are employed in a near-infinite variety of use cases. The associated requirements of coordination between a vast array of different actors are staggering. Consortia are a principal tool used by the ICT industry to create and manage interoperability in this extraordinarily complex context.

In addition to facilitating sophisticated supply chains and product use cases, interoperability in the ICT industry also results in positive network externalities that are likely impossible to measure in the aggregate but that are undoubtedly immense in effect.  One recent study (Huawei 2018) suggested that “intelligent connectivity,” which it described as “more integrated connections between all things, machines, and people in industrial settings,” would produce US$23 trillion in new economic potential by 2025. Consortia are a key forum in which these kinds of connections are defined and developed.

Consortia are arguably more important for ICT interoperability than formal standards organizations. Formal standardization is a long, slow process that is most effective when marketplace consensus has already been established. Consortia are sometimes the battlefield where this kind of consensus is fought for and won. Groups rise and fall depending on the level of marketplace support they garner. The Blu-ray Disc Association competed with the DVD Forum’s HD DVD specification, until the market tipped to Blu-ray. The Wireless Power Consortium competes with the AirFuel Alliance, each offering a different vision for wireless charging (AirFuel itself having absorbed the Power Matters Alliance). Leading industrial automation groups recently negotiated a compromise that blends their various visions. These examples demonstrate that consortia allow a type of market dynamism that is muted at the formal standards level. Frequently the technologies that are brought to the formal standardization process are those that have been developed and achieved marketplace acceptance via consortia.

Consortia also can complement formal standardization processes in other ways. Formal standards development organizations typically produce one deliverable: technical specifications intended as standards, embodied in descriptive textual documents. For example, the IEEE produced the 802.11 wireless communications standards. The Wi-Fi Alliance gave this technology a consumer-friendly name – Wi-Fi – and marketed it so that consumers understood the value of buying a product that had Wi-Fi functionality. Wi-Fi Alliance also did the critical work of developing a testing process that ensured that Wi-Fi products actually interoperated with each other in real-world implementations, and developed a sophisticated logo licensing model that enabled consumers and supply chain participants to accurately communicate that their products worked as advertised. The importance of the marketing and compliance testing activities, beyond the bare publication of the standard, cannot be overstated. Other consortia play this sort of complementary role to particular formal standards; for example, the HomeGrid Forum supported the ITU’s G.hn standard in this manner.

Quantitative assessment by researchers evaluating the role of consortia in technology standardization has been relatively rare. Biddle et al. (2010) identified 251 standards in a then-current laptop computer and found that 44% of these were developed by consortia (along with 36% developed by formal standards setting organizations and 20% promulgated for industry adoption by single companies). Armstrong et al. (2014), in a paper focused on estimating patent royalties, identified key standardized technologies in a smartphone. Drawing from the Armstrong et al. (2014) work, and categorizing each standards developer as either a consortium or a formal standards development organization, the breakdown would be:

Consortia (11 organizations): SD Card Association (memory); WiFi Alliance (802.11 compliance); Bluetooth SIG (Bluetooth); NFC Forum (near-field wireless); MIPI Alliance (camera, display, more); Open Mobile Alliance (MMS); Open Handset Alliance (Android OS); Internet Engineering Task Force (various internet protocol interfaces); World Wide Web Consortium (various web interfaces); UPNP Forum (local networking); Digital Living Network Alliance (content)

Formal SDO (5 organizations): 3GPP (Wireless WAN, wireless protocols); IEEE (802.11); ITU-T (Wireless WAN, imaging); ISO/IEC JTC1 (audio, imaging, more); JEDEC (memory)

 

Similarly, using the same criteria, one can categorize the thirty-six organizations identified by Baron & Spulber (2018) in their sample of leading ICT standards setting organizations as follows:

 

Consortia (22 organizations): Accelera, CEA, DMTF, DVB, HomePlug, HomePNA, IETF, IMTC, IrDA, MEF, OGC, OMA, Open Group, OSGi, PCCA (WTA), PCI-SIG, PICMG, SDR Forum, VESA, VITA, WIFI

Formal SDO (14 organizations): 3GPP, ANSI, ASTM, ATIS, BioAPI (ISO/IEC JTC1), CEN, ECMA, ETSI, IEEE, ISO, ITU, JEDEC, OASIS, TIA

This data provides some empirical support for the argument that consortia play a fundamental role in technology standardization, exceeding even the role played by the better-known formal standards organizations. Formal SDOs created only about a third of the standards in a 2010-era laptop, while consortia created nearly half. In connection with identified smartphone standards, consortia were named as standards developers twice as often as formal SDOs. About 60% of the standards setting organizations selected by scholars studying a collection of important ICT standards are consortia, and only about 40% are formal SDOs.
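
For readers who wish to double-check the shares cited above, the short sketch below simply recomputes them from the counts reported in the text and tables (Biddle et al., 2010; Armstrong et al., 2014; Baron & Spulber, 2018); it introduces no new data and is purely illustrative.

    # Laptop standards breakdown reported in Biddle et al. (2010)
    laptop_total = 251
    laptop_shares = {"consortia": 0.44, "formal SDOs": 0.36, "single companies": 0.20}
    for origin, share in laptop_shares.items():
        print(f"Laptop: roughly {round(share * laptop_total)} of {laptop_total} standards from {origin}")

    # Smartphone standards developers categorized from Armstrong et al. (2014)
    smartphone_consortia, smartphone_sdos = 11, 5
    print(f"Smartphone: consortia named {smartphone_consortia / smartphone_sdos:.1f}x as often as formal SDOs")

    # Baron & Spulber (2018) sample of leading standards setting organizations
    sample = {"consortia": 22, "formal SDOs": 14}
    total = sum(sample.values())
    for kind, count in sample.items():
        print(f"Baron & Spulber sample: {kind} are {count / total:.0%} of {total} organizations")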

Qualitatively, the critical role played by technology consortia in facilitating interoperability for the ICT industry seems indisputable. Examples like USB and Bluetooth – along with hundreds of lesser-known organizations – show how consortia create important standards-based market ecosystems. As suggested above, consortia also play a unique pre-standardization role, enabling market participants to group themselves around potentially competing technologies, collaboratively developing technologies that may eventually become formal standards. And, as discussed using the example of the Wi-Fi Alliance, consortia sometimes complement formal standards by providing necessary compliance testing and marketing functions that formal standards bodies do not provide. Consortia are a fundamental part of the story of why, in Marc Andreessen’s words, information technology finally “works and can be widely delivered at global scale.”

III. How Consortia are Changing

Consortia are changing in two important ways. First, new structural models, embodying new ways of forming and governing consortia, have emerged and have rapidly altered the consortia landscape. Second, the nature of what consortia produce is changing. Software increasingly plays a primary role. Linux Foundation is a leading example that illustrates both of these major changes.

A. Structural and Organizational Changes: Continuous Evolution

This section of the paper describes how the structural model for consortia has evolved over time to address a complex set of legal and practical concerns, such as managing liability risks for participants, mitigating the risks of antitrust or competition law claims, negotiating difficult intellectual property and governance/decision-making questions, reducing taxes, managing finances, and addressing a broad range of operational matters. This evolution is discussed in some depth, accompanied by a detailed analysis of several examples, in part to frame later discussion about how European stakeholders might address these same issues. A reader who is less interested in the minutiae of how U.S.-based consortia have organized themselves could reasonably skip this section, taking away the general point that structural considerations are generally driven by these kinds of important underlying concerns.

ICT consortia date back to at least the 1980s. Cargill (1989) discussed the emergence of the Corporation for Open Systems (COS), the Manufacturing Automation Protocol (MAP), and other consortia (using the term “consortia” in essentially the same way as the term is being used in this paper). A 1989 magazine article described a “proliferation of computer industry standards groups, which makes rabbits look positively abstemious by comparison.” (CBR Staff Writer, 1989).  Mähönen (1999) described an environment similar to that described in this paper:

Especially in the field of telecommunications, standardization used to be the province of international organizations such as ITU (International Telecommunications Union), ISO and IEC (International Electrotechnical Committee). Now, the activities in telecommunications, information technology and multimedia are also addressed by a multitude of other players in the field. The standardization organizations can now be categorized into two main groups: formal (de jure) and informal consortia (de facto, grey or ad hoc groups). The formal standardization processes are handled by traditional standards development organizations (ISO, ITU etc.), scientific or professional societies, trade associations or industrial standard organizations that can have a liaison with formal official bodies. Informal standards, in contrast, are produced by market forces (de facto) or by specific groups or consortia working independently.

By 2000, a dominant structural model for consortia had solidly emerged. The USB promoters group, which had been formed and distributed its initial 1.0 specification in 1996 under a purely contractual arrangement between stakeholder companies, coordinated the formation of the USB Implementers Forum Inc. as a mutual benefit (i.e. trade association-style, as opposed to a public charity) non-profit corporation under applicable U.S. state law (Oregon). Bluetooth SIG and PCI SIG, both of which had begun initially under similar contractual-based models, followed suit and developed formal incorporated structures in that same year. USB-IF and PCI-SIG successfully established themselves as tax exempt non-profit corporations recognized by the U.S. federal tax authority, the Internal Revenue Service (IRS). (Interestingly, Bluetooth SIG fought and lost a battle with the IRS over Bluetooth’s tax exempt status.2)

Many other organizations formed following this same basic template: incorporation as a mutual benefit non-profit corporation under applicable U.S. state law (with some slight variations of corporate form based on particular state law requirements), and then operation as a tax exempt entity under a provision targeted at “business leagues” and other trade association-style enterprises. This provision, Section 501(c)(6) of Title 26 of the U.S. Code, generally enabled the organizations to avoid paying federal income tax, and often to avoid most state and local taxes as well. Selecting from hundreds of examples, some organizations that follow this model include Avnu Alliance, the Broadband Forum, CCIX, the DASH Industry Forum, the Ethernet Alliance, FIDO Alliance, GENIVI Alliance, HDBaseT Alliance, ID Federation, JEDEC, Kantara Initiative, LoRa Alliance, MEMS Industry Group, NVM Express, Open19 Foundation, PICMG, RISC-V Foundation, SATA-IO, Thread Group, Universal Stylus Initiative, VESA, WiFi Alliance, XBRL and Zigbee (alas, we found no “Y” example of a 501(c)(6) org, defeating our attempted A-to-Z list – but we’ll mention the Yocto Project as an example of a different model below).

These organizations are typically funded by membership dues, sometimes supplemented by revenue from other sources, such as a compliance testing program that requires the payment of fees. Often there are tiers of membership, and members at the higher tiers frequently have strong influence over organization governance. For example, in the LoRa Alliance only “sponsor members,” who pay a US$50,000 membership fee, are eligible to nominate and vote for the organization’s corporate directors.

Prior to the emergence of this incorporated model for consortia, many groups had formed under multi-party contractual arrangements. One model, popularized by USB prior to formation of the USB Implementers Forum, and followed by many others, was to identify “promoter” companies that led an organization, “contributor” companies that provided substantive inputs but lacked final decision-making power, and “adopter” companies that implemented the group’s deliverables, with the companies executing either a Promoter Agreement, Contributor Agreement or Adopter Agreement as applicable. While still used occasionally, this model became disfavored for three primary reasons. First, under applicable U.S. law the participants in these groups faced the risk of “joint and several liability” for the actions of other participants – i.e., one deep-pocketed participant could be held liable for the actions of a third party group member. Second, the lack of a distinct legal entity created various practical problems: the groups could not open a bank account, or enter into contracts, or apply for and own trademarks or other intellectual property – all of which proved to be important activities for many organizations. Third, these direct agreements to cooperate, made between parties that often were otherwise competitors, raised questions about potential antitrust (competition law) liability.

The incorporated organization model addresses all of these problems. Under applicable law, absent extraordinary circumstances, members are insulated from liability for the activities of the corporate entity. Further, as an independent legal entity the incorporated body can manage funds and hold intellectual property, and enter into contracts. As an independent non-profit entity the organization also offers its members a stronger narrative around competition law questions, as well as the possibility of taking advantage of certain liability safe harbors that applicable U.S. law offered to more traditional standards organizations.3

The incorporated model has some drawbacks as well. Forming new organizations can be contentious and time-consuming, as each new organization requires a new negotiation of governance details and intellectual property licensing arrangements. In the context of this type of organization, these negotiations can be a proxy for broader competitive dynamics in an industry segment. An influential party may wish to exclude a competitor from a leadership position, for example, or may want to ensure that the head of a technical committee is an ally. A decision between a royalty free (RF) or a fair, reasonable and non-discriminatory but potentially royalty-bearing (FRAND) intellectual property rule can fundamentally alter business models in a particular industry segment.  Given these high stakes, among parties with potentially very different business interests, it is unsurprising that formation negotiations sometimes prove difficult; but still, the frequent contentiousness and delays frustrate the affected parties. Further, once formed, groups require operational support and compliance with various tax and legal requirements that can prove burdensome. These difficulties have left industry participants hungering for an easier-to-implement model.  

The IEEE Industry Standards and Technology Organization (ISTO) was an early attempt at a more efficient model. Founded in 1999 as a 501(c)(6) non-profit corporation fully independent from the IEEE, ISTO focused primarily on solving the operational support and compliance issues faced by independent ICT organizations. ISTO offered groups a corporate umbrella under which they could organize largely autonomous projects, with operational support from ISTO staff. When forming, however, each project still faced difficult negotiations over project governance and intellectual property, as each project developed its own unique charter documents. The ISTO model also raised new questions around liability and legal autonomy. For example, a large ISTO project with significant financial resources might worry that a different ISTO project could create a legal liability for ISTO that could drain the first project’s resources. A few ISTO projects ultimately formed separate corporate legal entities to address this risk – a step which undermined some of the ostensible simplification benefits offered by ISTO. Ultimately, however, ISTO appears to have established an important niche in the ICT industry: ISTO reports that it has supported over 50 groups in its 20-year history, and it currently lists 17 active groups on its website. Most of these appear to be simply projects of ISTO, rather than separately incorporated entities, but apart from this structural difference these projects look and act like other independent technology consortia.

The Joint Development Foundation (JDF) is a more recent example of an attempt to improve the process of organizing consortia. Founded in 2015, JDF is also a non-profit corporation with tax exempt status under Section 501(c)(6). JDF’s goal was to provide groups what it called a “consortium in a box.” Like ISTO, JDF provided sub-contracted operational support to groups (as and if desired by the groups), but JDF’s focus was on simplifying the legal details associated with group formation, largely by providing a set of menu options of well-defined legal terms. JDF described its value as follows:

By using established Joint Development Foundation legal agreements, groups can establish projects quickly and with minimal legal expense. By operating under the Joint Development Foundation’s legal umbrella, Projects can enjoy the benefits of the Joint Development Foundation’s existing legal agreements, choice of intellectual property policies, non-profit status, and corporate structure. This enables Projects to more easily establish themselves, collect funds, issue press releases in the Project’s name, develop liaison relationships, and hold copyrights, all without negotiating custom agreements and new corporate organizations.

JDF also developed an innovative legal structure for its groups, conceived by JDF founder David Rudin. It formed a subsidiary legal entity, called Joint Development Foundation Projects LLC, as a single member limited liability company (LLC) under the state law of the U.S. state of Delaware. In turn, JDFP LLC was empowered, under an applicable provision of Delaware law, to create “series LLCs” – essentially simple-to-form subsidiary entities of JDFP LLC. Each JDF project was assigned its own series LLC. For example, the large JDF project known publicly as the Alliance for Open Media is technically the “Joint Development Foundation Projects LLC Alliance for Open Media Series.” The entity then filed to do business under the trade name “Alliance for Open Media.” This model enabled each project to have its own distinct legal entity for purposes of contracting and as a wall insulating against liability: theoretically, liability created by one series entity cannot affect another series LLC, the parent LLC, or the ultimate parent corporation. For tax purposes, however, the LLCs are considered “disregarded entities” by the U.S. federal tax authority, because they each have a single legal member, the parent JDF corporation. Thus for tax purposes they all are treated as JDF – that is, as part of a tax exempt 501(c)(6) organization – and they generally are not obligated to pay federal taxes.

Between 2015 and 2018 JDF launched four public projects. In 2018 JDF and Linux Foundation announced a plan to “bring the Joint Development Foundation into the Linux Foundation family.” Currently JDF and the JDF projects are identified as Linux Foundation projects on the LF website.

Linux Foundation itself was founded in 2000, with an initial focus specifically on the Linux operating system. By the early 2010s it had established a program it called “Linux Foundation Collaborative Projects,” under which it hosted other projects. In August of 2013 it hosted nine such projects, including Tizen, the Xen Project, the Yocto Project, and others. Today Linux Foundation hosts at least 156 projects. Most Linux Foundation projects are different in two important ways from the vast majority of the consortia discussed in this paper thus far: (a) the LF projects primarily produce open source software code rather than technical specifications, and (b) the governance and intellectual property models follow open source community norms, which differ from the norms of traditional specification development groups. These issues will be discussed further below. For now, focusing on organizational structure, the key point is that for much of the history of Linux Foundation the Foundation followed the ISTO model with its projects: that is, most were simply projects of LF, without any formal separate legal identity – although a few were separately incorporated as 501(c)(6) non-profit corporations. Beginning in 2017, however, many LF projects reference the series LLC structure; for example, the FD.IO Project is “FD.IO Project a Series of LF Projects, LLC.” Accordingly, it appears that, in addition to directly incorporating JDF and its projects as of 2018, LF has also emulated JDF’s structural model, presumably to achieve similar goals: insulating against liability, and enabling groups to independently hold funds and intellectual property and to contract directly with third parties. Plus, like ISTO, LF offers its projects a sophisticated set of services, ranging from website design and hosting to event planning to finance, operations and human resources support, to compliance program development and implementation, and more.

All of this discussion is intended not as a comprehensive explanation of how consortia are structured – in fact, exploration of some important and interesting variations, such as the approach followed by various content protection groups like Digital Content Protection LLC, the group that created the HDCP specification, is omitted here – but rather to illustrate the point that the models for how technology consortia are formed and structured have followed a complex evolutionary path, reflecting an array of legal, tax and operational factors, and that these models continue to change. Some fundamental structural innovations have appeared just in the past several years.

B. Changes in Deliverables: Software Becomes Increasingly Important

The discussion in this section of the paper is intended to illustrate that software has increasingly become a key tool for creating interoperability in the ICT industry. Further, the development methodologies, intellectual property models, and ultimately the culture of open source software appear increasingly important in the development of interoperability solutions.

Organizations like USB, PCI-SIG and other traditional consortia primarily produce technical specifications. These are textual documents that describe how to build interoperable products. Engineers read these documents, and build products accordingly. Consortia frequently offer additional supportive services, such as “plugfests” (informal forums where engineers can test their products with others), more formal compliance testing services, or marketing services to promote the value of interoperable products, but fundamentally the core deliverables of many technology consortia are technical specifications.

Historically these specifications typically described implementations in hardware. Building hardware requires careful long-term planning, and once commitments to particular technical paths are made they are difficult or costly to reverse. Accordingly, organizations focused on creating technical specifications defining interoperability for hardware products typically emphasize specification quality and organizational consensus over development speed. Development methodologies typically follow disciplined systems engineering approaches.

Open source projects primarily produce software code. Examples of hugely successful open source software projects include the Linux operating system, the Apache web server, the Firefox web browser, the MySQL database and countless other widely-deployed components and applications. Open source software is, by definition, licensed under an open source license. Open source licenses grant licensees broad rights to re-use code. Most modern open source licenses include express royalty-free patent licenses applicable to contributed code, and many open source community members argue that such licenses are implied even when they are not explicit.  Further, at the risk of severe over-simplification, open source projects generally adopt a governance/decision-making model that relies more on meritocracy than hierarchy, and that permits open participation in a project.
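
As a minimal illustration of how these license terms travel with the code itself, many open source projects – including the Linux kernel and numerous Linux Foundation-hosted projects – mark each source file with a machine-readable SPDX license identifier. The file and function below are hypothetical, and Apache-2.0 is chosen simply as an example of a modern license with an express royalty-free patent grant.

    # SPDX-License-Identifier: Apache-2.0
    # Copyright (c) 2019 Example Contributor
    #
    # Hypothetical module contributed to an open source project. The SPDX tag
    # above declares, in machine-readable form, that this file may be re-used
    # under the Apache License 2.0, which includes an express royalty-free
    # patent license covering contributions.

    def greet(name: str) -> str:
        """Trivial stand-in for any contributed functionality."""
        return f"Hello, {name}!"

    if __name__ == "__main__":
        print(greet("interoperable world"))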

As “software eats the world,” some interoperability problems that used to be solved in hardware are increasingly solved in software. A leading example is the rise of software-defined networking (SDN) and network function virtualization (NFV) in telecommunications. Functions that historically were performed by hardware-based switches, controllers and data plane infrastructure are now performed by open source code produced by groups like OpenDaylight, OpenSwitch, and FD.IO.

Traditionally consortia created technical specifications and then users took these specifications and developed their own implementations (e.g., built their own hardware devices). Software presents the opportunity for collaborators to simply develop a shared implementation, making that implementation available to all potential implementers as open source software code.

Software has also become more important as ICT and other industries focus on broader systems-level interoperability, which is another underlying component of the Marc Andreessen vision. As a prescient U.S. Department of Defense report stated the issue: “System interoperability is what makes heterogeneous systems of systems a reality. All of these systems are composed of hardware and software. Hardware is not easily changed. Furthermore, fielded hardware systems often cannot be wholly replaced. Therefore, as a practical matter, interoperability is more easily achieved through software…” (Hamilton & Murtagh, 2000). This shift towards software also enables new opportunities, as stated by Carney et al. (2005): “The potential rate of change for software components vastly exceeds that for hardware components. This flexibility is a direct result of software’s malleability; software is easier and cheaper to change, and it requires no retooling of production machinery.”

An increased focus on software-driven systems-level interoperability may also explain the emergence of “reference architectures” as a deliverable from consortia. For example, the OpenFog Consortium recently produced a reference architecture document designed to address “the need for an interoperable end-to-end data connectivity solution along the cloud-to-things continuum.” Much of this envisioned architecture relies on software to enable interoperable connections between system-level components.

Even organizations that have historically focused on more traditional specifications increasingly are recognizing the importance of software. The Internet Engineering Task Force (IETF) was a pioneer in this area, as their specification development process has long required concrete examples of “running code.” More recently, organizations like the Broadband Forum, MIPI Alliance and many other consortia have implemented policies and practices aimed at incorporating software into the specification development and implementation process.

One challenge associated with the use of open source software code as a tool for interoperability is that it is effective at the moment in time when the relevant industry stakeholders have agreed to implement the shared code, but interoperability is potentially inhibited if any party diverges from that code. This challenge is exacerbated by the fact that the ability to diverge is an express feature of the open source license model. By definition, open source code is licensed in a manner that permits parties to make changes to the code. When one party unilaterally changes code that underpins an interoperable system, however, interoperability can break.

Consortia have been developing solutions to address this problem. Some organizations have been developing software code first, closely followed by a specification that defines a canonical implementation of that code. IoTivity and the Open Connectivity Foundation follow this model, with the IoTivity project developing code and the OCF creating a specification. The AllSeen Alliance, home of the AllJoyn framework, had previously followed this model as well, tying certain trademark and patent licenses to use of the canonical specified version of the code in an effort to incentivize ongoing compliance with a standardized implementation.

The collision between the traditional hardware-focused specification development process and a software-centric approach to interoperability has resulted in a clash of development methodologies, intellectual property models, and ultimately of cultures. Traditional consortia often apply a disciplined, systems engineering style approach to development. Open source projects sometimes embody a development methodology caricatured as “move fast and break things,” but perhaps more fairly characterized by the IETF slogan “rough consensus and running code.” Further, the FRAND intellectual property model used by some consortia can clash with the common expectation of the open source community that deliverables will be implementable on a royalty free basis.

Some leading consortia increasingly appear to be adopting more software-like methodologies, even when creating traditional specifications. The Khronos Group, for example, makes its specifications available and manages inputs via GitHub, the popular code-hosting platform. The World Wide Web Consortium similarly uses GitHub as a development forum, and increasingly its specifications themselves blur the line between traditional textual specifications and software code.

As “software eats the world,” one part of the world that it appears to be eating is traditional consortia. Increasingly consortia produce software code deliverables, and software-oriented deliverables such as reference architectures. Further, the development methodologies, intellectual property models, and ultimately the culture of open source appear increasingly important in the development of interoperability solutions in the ICT industry.

C. The Remarkable Growth of Linux Foundation Illustrates These Structural and Substantive Changes

Section III of this paper has made two main arguments: (1) the structures of consortia have evolved, and continue to evolve, to address a complex set of legal, tax and operational issues, and (2) open source software has increasingly become a key tool for ICT interoperability. The extraordinary growth of Linux Foundation serves as evidence in support of both of these points.

In 2013 Linux Foundation hosted 10 projects, including Linux itself. As of early 2019 it hosts 156. In comparison, ISTO has supported about 50 projects in its 20-year history, and currently supports 17. VTM Group, a leading provider of support services to independent contractual and incorporated consortia that was founded in the late 1990s, lists 87 past and current clients on its website. LF’s portfolio of more than 150 projects is thus striking.

By 2013 LF’s revenues were already on a steep upward curve that appears to have begun in about 2010. 2013 revenues were over US$23 million. Four years later revenue had nearly quadrupled, to US$81 million. In 2013 LF reported 39 employees; in 2017 it reported 178. This author maintains a database of 132 U.S.-based consortia and standards setting organizations that are tax exempt under Section 501(c)(6) of the U.S. tax code, and compiles information reported on each organization’s IRS tax forms. No other organization comes even remotely close to Linux Foundation’s growth rate, whether measured in absolute dollars, percentage growth, or growth in human resources. LF’s growth is extraordinary.

From a structural perspective, Linux Foundation appears to be delivering the set of legal, tax and operational solutions that meets the ICT industry’s demands. While it is perhaps too early to state definitively, the ‘Series LLC’ legal model pioneered by JDF and adopted by LF appears to be an evolutionary legal innovation related to consortia that is sticking. More substantively, it appears that Linux Foundation’s roots as a software development organization are right for an era when software is increasingly the tool of choice for driving interoperability. At the same time, Linux Foundation appears to be driving a synthesis of the software- and specification-oriented models. Back in 2013, LF described two criteria that its collaborative projects must meet: “the use of open source governance best practices including license and contribution agreement choices in keeping with the ideals of Linux” and “the project must have the potential to fuel innovation in an industry through collaborative software development.” Contrast that to the description of a recently-announced “umbrella project,” LF Edge, which combines several LF projects to target a sophisticated vision for systems-level interoperability leveraging both specifications (standards) and code (the LF website notes “the ultimate output being working code”):

LF Edge will create a common framework for hardware and software standards and best practices critical to sustaining current and future generations of IoT and edge devices. We are fostering collaboration and innovation across the multiple industries including industrial manufacturing, cities and government, energy, transportation, retail, home and building automation, automotive, logistics and health care ‒ all of which stand to be transformed by edge computing.

This paper has argued that interoperability is a fundamental ingredient to ICT industry success, and that consortia play a critical role in facilitating interoperability. It has suggested that consortia have evolved to address a complex array of legal, tax and operational issues, and are in the midst of a particularly acute moment of evolution as “software eats the world” and software increasingly becomes a key tool for interoperability. While Linux Foundation is far from the only player in the game, it has ridden these evolutionary trends in a remarkable, unique way. Any stakeholders considering the future of ICT standardization must consider the example of Linux Foundation and its increasingly outsized status as an industry leader.

IV. Europe in a New Era for Consortia

So, what does this all mean for Europe? One answer conceivably could be: nothing. That is, European companies and other European stakeholders are already deeply involved in the consortia and related processes that are described in this paper. Representatives from European companies participate in the leadership of nearly every major consortium. Some of the U.S.-based consortia described in this paper are primarily led by European interests, such as the LoRa Alliance and the OPC Foundation. Further, many consortia have deep, complementary relationships with European-based organizations like ISO, IEC and the ITU. One conclusion might be that the status quo is working for European stakeholders.  

At least for the European Commission, however, doing nothing is not the plan. In several EC communications (European Commission, 2016a, 2016b), and in the study proposal referenced earlier, the EC has stated a clear goal to “accelerate the development of common standards and interoperable solutions” as part of its Digital Single Market strategy.

This paper concludes by drawing from the points made earlier to offer several observations that may be relevant to the EC and other European stakeholders as they pursue this goal. It highlights some particular areas where the EC may be able to make unique contributions in this evolving environment of interoperability needs.

Some observations and recommendations:

1. Recognize the limitations of formal standards development organizations. Thirty years of experience shows that formal standards development organizations have not been able to comprehensively meet the interoperability requirements of the ICT industry. Consortia have grown increasingly important over time. This trend is unlikely to reverse. Europe has strong formal standards organizations, and a desire to rely on these organizations to effectuate a standards strategy would be understandable. Experience shows it will not work.

2. Investigate why few private sector-led consortia have formed in Europe. Relatively few consortia have formed in Europe. Some potential explanations include:

3. Understand the critical role of open source software – including its royalty free license models and its culture. Software is increasingly fundamental to the creation of interoperable systems. Open source software development methodologies are different from traditional consortia models, and certainly very different from the standards development methodologies practiced by the formal European standards organizations. Open source developers will not see a need to change their processes to accommodate historic standardization approaches. Further, they will see little value in trying to impose FRAND licensing models in an environment where royalty free models have been demonstrably effective. If European stakeholders want to benefit from the capabilities of open source software, they must meet the open source community on the community’s own terms.

4. Selectively embrace the role of ‘convener’ to break logjams. Consortia have been successful in part because they compete with each other. This process has been effective, and policymakers and regulators should not be quick to intervene in this competition. At times, however, consortia with competing visions can deadlock. For example, while it is likely too soon to tell if there is a deadlock, the OpenFog Consortium and its allies, and the various constituent groups of the LF Edge program, have developed competing reference implementations for edge computing. The EC and particularly its partner institutions may be uniquely positioned to convene stakeholders and facilitate discussions that would not happen – or may only happen slowly – if left to actors driven exclusively by pecuniary motives. Convening stakeholders, but letting market forces ultimately drive decisions, could be a helpful role.

5. Selectively embrace the role of ‘convener’ to identify cross-industry systems architecture needs. The EC documents referenced above rightly emphasize the increasing importance of “reference architectures,” particularly in the context of the broad systems-level changes implicated by the Marc Andreessen article. The EC and its partners may similarly be uniquely positioned to convene cross-industry stakeholders to address systems-level requirements in a manner that may be difficult for private sector actors. Consideration of the global context of many industries will be critical here: narrowly focusing only on European interests may inhibit long term success.

6. Continue to focus on testbeds and related compliance services.  As described in the referenced communications, the EC has made considerable investments in “testbeds” that enable various parties to inexpensively test the practical interoperability of their products and services in real-world scenarios. This is a smart approach. Practical, working-level interoperability is the ultimate goal of a standardization process, but the difficult work of accomplishing this is often under-resourced.  It is particularly challenging for small and medium-sized enterprises (SMEs). Offering testbed services as a public resource is a clever and unique solution to an important problem. The EC could conceivably build global standardization leadership off of these resources alone. Adding related services, such as formal compliance testing, potentially coupled with compliance logo licensing, is also worthy of consideration.

V. Conclusion

As Marc Andreessen suggested, developments in ICT are now poised to transform industries far beyond ICT. Consortia are a large part of what brought us to this point, and they will continue to play a critical role as interoperability requirements grow in complexity. Consortia themselves have evolved over time, and are continuing to change. Open source software is increasingly part of how interoperability happens, and today’s consortia reflect this. The emergence and rapid growth of the software-centric Linux Foundation is striking evidence of this new reality. This story holds important lessons for European stakeholders. Within this changed landscape there are opportunities for Europe to play a unique, globally-leading role.

About the author

C. Bradford Biddle is a Faculty Fellow with the Center for Law, Science and Innovation at the Sandra Day O'Connor College of Law, Arizona State University, and the founder of Biddle Law PC, a boutique law firm based in Portland, Oregon that specializes in providing legal support for standards-setting organizations, open source foundations and other technology consortia. He thanks the organizers and attendees at the 24th EURAS Annual Standardisation Conference, held in Rome, Italy in June 2019, for their support and insightful comments about an earlier version of this article.  

References

Armstrong, A., Mueller, J. J., & Syrett, T. D. (2014, May 29). The Smartphone Royalty Stack: Surveying Royalty Demands for the Components Within Modern Smartphones. Retrieved from https://ssrn.com/abstract=2443848 or https://dx.doi.org/10.2139/ssrn.2443848

Andreessen, M. (2011, August 20). Why Software Is Eating The World. The Wall Street Journal. Retrieved from https://www.wsj.com/articles/SB10001424053111903480904576512250915629460

Baron, J. & Spulber, D. F. (2018, February 2). Technology Standards and Standard Setting Organizations: Introduction to the Searle Center Database. Northwestern Law & Econ Research Paper No. 17-16. Retrieved from https://ssrn.com/abstract=3073165 or https://dx.doi.org/10.2139/ssrn.3073165

Biddle, C. (2017). No Standard for Standards: Understanding the ICT Standards-Development Ecosystem. In J. L. Contreras (Ed.), The Cambridge Handbook of Technical Standardization Law: Competition, Antitrust, and Patents (pp. 17-28). Cambridge: Cambridge University Press.

Biddle, C., White, A., & Woods, S. (2010). How many standards in a laptop? (And other empirical questions). In 2010 ITU-T Kaleidoscope: Beyond the Internet?: Innovations for Future Networks and Services (pp. 123-30). Pune: ITU. Retrieved from https://ieeexplore.ieee.org/document/5682128

Bluetooth SIG, Inc. v. United States. (2010, July 8). Law.com. Retrieved from https://www.law.com/almID/1202463375482/?slreturn=20190309032730

Cargill, C. F. (1989). Information Technology Standardization: Theory, Process, and Organizations. Newton, MA: Digital Press.

Carney, D. J., Fisher, D., Morris, E. J. & Place P. R. (2005). Some Current Approaches to Interoperability. Retrieved from https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=7511

CBR Staff Writer. (1989, August 21). Corporation for Open Systems Fires Half Its Staff. Computer Business Review. Retrieved from https://www.cbronline.com/news/corporation_for_open_systems_fires_half_its_staff/

European Commission. (2016a). ICT Standardisation Priorities for the Digital Single Market (COM(2016) 176 final). Brussels: The European Economic and Social Committee and the Committee of the Regions. Retrieved from https://ec.europa.eu/digital-single-market/en/news/communication-ict-standardisation-priorities-digital-single-market

European Commission. (2016b). Digitising European Industry Reaping the Full Benefits of a Digital Single Market (COM(2016) 180 final). Brussels: The European Economic and Social Committee and the Committee of the Regions. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52016DC0180

European Commission. (2018). Study on Technological and Economic Analysis of Industry Agreements in Current and Future Digital Value Chains (SMART 2018/0003).

Hamilton Jr., J. A. & Murtagh, J. L. (2000). Enabling Interoperability Via Software Architecture. Retrieved from https://apps.dtic.mil/dtic/tr/fulltext/u2/a458021.pdf

Huawei. (2018). Tap Into New Growth With Intelligent Connectivity: Mapping your transformation into a digital economy with GCI 2018. Retrieved from https://www.huawei.com/minisite/gci/assets/files/gci_2018_whitepaper_en.pdf?v=20180914

Mähönen, P. (1999). The Standardization Process in IT – Too Slow or Too Fast? In K. Jakobs (Ed.), Information Technology Standards and Standardization: A Global Perspective (pp. 35-47). Hershey, PA: IGI Global.

NV. (2014, July 14). In Praise of the Humble USB. The Economist. Retrieved from https://www.economist.com/babbage/2014/07/14/in-praise-of-the-humble-usb  

 

 

Licence and Attribution

This paper was published in the Journal of Open Law, Technology, & Society, Volume 11, Issue 1 (2019). It originally appeared online at http://www.jolts.world

This article should be cited as follows:

Biddle, C. Bradford (2019) Linux Foundation is Eating the World, Journal of Open Law, Technology, & Society, 11(1), pp. 57–74
DOI: 10.5033/jolts.v11i1.137

Copyright © 2019 C. Bradford Biddle.

This article is licensed under a Creative Commons Attribution 4.0 licence (CC-BY 4.0), available at https://creativecommons.org/licenses/by/4.0/

 
 

1The website consortiuminfo.org lists over 1000 organizations, but some are formal standards development organizations rather than consortia. This author maintains a database that includes additional consortia that are not listed at consortiuminfo.org. Others likely exist that are included in neither database. This author’s best guess is that with modest effort one could specifically identify about 1000 past and current ICT consortia.

2The case is nicely summarized in Bluetooth SIG, Inc. v. United States (2010). To this author’s surprise, the case appears to have had little impact on the tax treatment of other U.S. technology consortia.  

3The National Cooperative Research and Production Act of 1993 provides certain antitrust liability protections to joint ventures, and the Standards Development Organization Advancement Act of 2004 extended the provisions of the NCRPA to standards development organizations. The precise application of these statutes to consortia raises complex questions that won’t be addressed here. As a practical matter, whether as a result of these statutes or otherwise, consortia generally seem to have avoided significant antitrust scrutiny.