Notes from the European Commission's Workshop on
"Research & Technology Development and Spectrum Policy"

in Brussels on 23 November 2004, organised by Information Society Directorate General Units D1 ("Communication and Network Technologies") and B4 ("Spectrum Policy"). Written contributions and slide presentations from the workshop will be posted soon. The workshop agenda is already online.

Dr. Rosalie Zobel opened the discussion. She is one of the directors of INFSO, responsible for IT in health, transport, the environment and embedded systems (e.g. avionics). "If interference did not exist there would be no reason to undertake spectrum management," she noted. A corollary is that changes in ways of dealing with interference change the options for managing spectrum. The key question now is how to eliminate the "bottleneck" in spectrum access. The EC will ask member states to consider new flexible spectrum policies, "so you can expect a higher political attention to spectrum matters over the coming months... A very wide range of stakeholders is required to generate momentum for change." Spectrum liberalisation was left out of the original Lisbon goals, but will be taken up in the midterm review at the end of 2005.

She was followed by two keynote speakers:

Thomas Hazlett criticized both traditional regulatory approaches and the "new new thinking" (i.e. open spectrum; "new new thinking" is not to be confused with the "new thinking" of Ronald Coase - which holds that spectrum management should be based on property rights and market forces). Confusion over whether spectrum is scarce and/or abundant, property and/or non-property, is "schizophrenic and contagious," typical of current regulatory thinking around the world. But he asserted that the "tremendous success" of wi-fi is not a rationale for discarding the old framework of spectrum management and replacing it with unlicensed commons. He showed a table with estimates of the revenue generated by equipment sales and bandwidth exploitation in various licensed and unlicensed bands. About 3 percent of European national income is now due to wireless technology, and mobile telephony's revenues far exceed all other services. The "marginal social value" of adding frequencies to existing allocations is huge, he said, and usually not considered in regulators' estimates of spectrum's economic value. He cited the example of land mobile business radio where a nationwide license for 10 MHz in the US would probably be priced at about $5 billion, while its "marginal social value" probably exceeds $50 billion.

David Cleevely, founder of England's telecom consultancy Analysys and member of the UK Spectrum Management Advisory Group, was the other keynoter. He pointed out that in radio, "economies of density are more important than economies of scale." Over the next 25 years, "cognitive radio" (transceivers aware of their location and with a database of frequency assignments, which can detect channel vacancies and nearby signals and respond appropriately) will completely change the way we think about and regulate the radio spectrum. If you can tolerate latency, mesh networks are profoundly efficient in terms of both bandwidth and battery use. He pointed out that the optimum configuration for many communication systems is a local wireless access zone connected by wire to another wireless access zone, so it is important to look at the economics of the whole system, not just the wireless segment. He took issue with Hazlett's comments on "marginal social value," citing a study for the UK's Radiocommunications Agency which found that giving a network more spectrum initially reduces total equipment costs by a large amount, but after a certain point, the value added by more spectrum is almost nil. Conclusion: giving a network more bandwidth than it needs is a waste of spectrum. The trick is knowing ahead of time how much spectrum it needs.
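Cleevely's description of cognitive radio - a transceiver that knows its location, consults a database of frequency assignments, senses nearby signals, and picks a vacant channel - can be sketched in a few lines. This is a minimal illustration, not any real system: the regions, channel numbers, assignment table and sensing function are all invented for the example.

```python
# Minimal sketch of cognitive-radio channel selection as described above:
# the radio knows its location, consults a database of licensed frequency
# assignments, senses for nearby signals, and picks a vacant channel.
# All region names, channels and assignments below are hypothetical.

ASSIGNMENTS = {  # per-region licensed channels (centre frequency in MHz)
    "region_a": {470, 478, 486},
    "region_b": {470, 494},
}
CANDIDATE_CHANNELS = {470, 478, 486, 494, 502}

def sense_occupied(channel):
    """Stand-in for real spectrum sensing; pretend 478 MHz is in use nearby."""
    return channel == 478

def pick_vacant_channel(location):
    """Return the lowest channel that is neither licensed here nor sensed busy."""
    licensed = ASSIGNMENTS.get(location, set())
    for ch in sorted(CANDIDATE_CHANNELS - licensed):
        if not sense_occupied(ch):
            return ch
    return None  # nothing free: back off rather than interfere

print(pick_vacant_channel("region_a"))  # 494: not licensed here, not sensed busy
```

In a real cognitive radio the sensing step and the assignment database would of course be far more elaborate; the point of the sketch is only the decision loop: location, database lookup, sensing, then transmit or defer.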

The first panel, on "Future Spectrum Needs," opened with Pierre de Vries, Microsoft's Chief of Technology Incubation. He focussed on mesh networks and the wide deployment of license-exempt devices. He has also been looking at why broadband access is good. He cited Keith Hampton and Barry Wellman's "Netville Wired Neighborhood Study" (2000), which found that "access to a high-speed local network encourages greater community involvement, expands and strengthens local relationships with neighbors and family, and helps maintain ties with friends and relatives living farther away." The economic benefit to businesses using wireless networks is about $100 per employee per month, compared to $400/year for home use (no source cited for these figures). De Vries pointed out that licensed nets usually have a different business model than unlicensed: you pay for service on licensed nets, while for unlicensed, you pay for equipment and get service for free. Microsoft is looking at self-managing meshes and at improving receiver designs to get more signal range without more emitted power. De Vries sees a big future for "tribal meshes" - e.g. groups of teenagers always in contact with each other as they individually move around.

Ruprecht Niepold, head of INFSO's Spectrum Policy Unit, suggested that regulators might take population density into account when developing band plans and allocation tables.

Jim Connolly, chairman of CEPT's working group on frequency management and senior spectrum manager at Ireland's ComReg, sees "converging wireless access platforms" as a race between RTD and spectrum policy. A common European table of allocations has been under development for 10 years already and is to be implemented by 2008. He noted that ultrawideband is a very disruptive technology and he showed an interesting chart from CEPT ECC PT6's draft report on interference protection criteria for UWB below 10.6 GHz. How can regulators react more quickly to technological change? Develop closer links with RTD. Develop early warning systems (the CEPT-ETSI relationship is a good example). One way to introduce more flexibility into regulation is with global interference protection criteria or spectrum "masks" for specific bands rather than for specific services, so that a device could use a band for any purpose so long as it doesn't violate the criteria. Another way to keep up with technology change is by issuing shorter licenses, but that must be balanced against investment costs and returns. Flexibility is good - but don't forget the benefits of consistency and regional harmonization.
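Connolly's idea of band-specific interference protection criteria, or spectrum "masks," amounts to a simple compliance test: a device may use a band for any purpose so long as its emissions stay under the mask. The sketch below illustrates the mechanics only; the mask breakpoints and measurements are invented, not taken from any CEPT report.

```python
# Sketch of a band-specific spectrum "mask" check: any use of the band is
# permitted so long as measured power spectral density stays under the mask.
# Breakpoints and measurements below are hypothetical.

MASK = [  # (frequency offset from band edge in MHz, max PSD in dBm/MHz)
    (0.0, -40.0),   # tight limit at the band edge to protect neighbours
    (1.0, -20.0),
    (5.0, 0.0),     # full power allowed mid-band
]

def mask_limit(offset_mhz):
    """Linearly interpolate the mask limit between breakpoints."""
    if offset_mhz <= MASK[0][0]:
        return MASK[0][1]
    for (f0, p0), (f1, p1) in zip(MASK, MASK[1:]):
        if offset_mhz <= f1:
            return p0 + (p1 - p0) * (offset_mhz - f0) / (f1 - f0)
    return MASK[-1][1]

def compliant(measurements):
    """measurements: list of (offset_mhz, psd_dbm_per_mhz) pairs."""
    return all(psd <= mask_limit(f) for f, psd in measurements)

print(compliant([(0.5, -35.0), (3.0, -10.0)]))  # True: both under the mask
print(compliant([(0.5, -20.0), (3.0, -10.0)]))  # False: -20 exceeds -30 limit
```

The regulatory appeal is that the test is service-neutral: the regulator defines the mask once per band, and any technology that passes it is admissible.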

Kevin C. Kahn, Director of Intel's Communications Technology Lab, then spoke on "Technological Convergence meets Sector-Based Regulation." Users care about applications, not about channels, and they assume that all prices trend toward zero. The implication of computing power always getting cheaper is to push more system features from hardware into software, and that has great consequences for standards. Fully "software defined radio" is not yet cost effective, but it probably will be in 20 years. So during the next 20 years, more and more firms will try to create software defined radios. License exemption makes sense in short-range and low-density deployments, but in high-density or long-range situations, licensing allows for more sophisticated and efficient management. The FCC's rulemaking for UWB took 4 years, which was quite fast for a regulator, but during that time Intel's designs for UWB hardware went through 2.5 generations. This illustrates that high-tech firms operate at a very different pace than regulators. His recommendation on flexibility is to avoid requiring that specific channelizations be used within bands.

Edoardo Marelli, chief of the European Space Agency's frequency management office and, since 2002, head of the international Space Frequency Coordination Group, spoke next. His topic was "Are the new regulatory trends in spectrum management good for the satellite, scientific and the technology research worlds?" His colleagues are in a major fight against "paper satellites" - systems registered at the ITU but not under serious development. These have claimed all frequencies in all bands at all orbital positions, which makes it difficult to assess the actual degree of congestion and real future needs. He noted that there is no possibility to regulate the density of unlicensed devices once they are authorized. That implies extra care must be taken in authorization: "We cannot have a trial-and-error approach because there are no corrective actions." Software defined radio will make it very difficult for any regulator to verify that SDR emissions are kept within limits - the device is open to uncontrolled user modifications. License-exempt SDRs would be particularly risky. Satellite radionavigation and passive sensing would be the most vulnerable to unauthorised SDR modifications. A problem that most space systems face is that national regulations tend to ignore the needs of transnational systems. No unilateral regulatory moves, please!

The second panel, on "Trends in Spectrum Management," began with Richard Engelman, chief engineer in the FCC's International Bureau. He gave an overview of trends in US spectrum policy. One of the recommendations in the National Telecommunications and Information Administration's spectrum policy report was for a band to be designated as a technology test-bed, for experimental use only. He predicted that we will see more license-exempt devices, but the problem is that there aren't many un-allocated bands. There is an "ongoing tension between demands for unlicensed and for flexibly licensed spectrum," as well as different ideas on how to introduce flexibility.

Hugh Milloy heads the spectrum management policy group at the Australian Communications Authority. They have issued licenses covering about 2.7 GHz of spectrum. They rely primarily on auctions of 15-year licenses in pre-cleared spectrum. They also charge an "apparatus fee" according to a formula based on geography, band congestion, power and bandwidth. The income from apparatus licenses is about 4 times the cost of regulation. "Spectrum trading is not going as well as we would like," he said. Most trades are between companies in the same group, rather than between different groups. They are discussing a proposal to "sublet" management responsibilities to the private sector: nongovernmental "band managers" would be responsible for granting access, setting fees and resolving interference complaints. Australia's Defence Department is now charged for its spectrum use, and they are moving toward a system in which Defence must get licenses for its systems.
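Milloy did not give the ACA's actual formula, but purely as an illustration of what pricing by geography, band congestion, power and bandwidth could look like, a fee calculation of that general shape might be sketched as follows. Every coefficient here is invented; only the structure (fee scaling with those four factors) comes from the talk.

```python
# Hypothetical apparatus-fee calculation in the spirit described above:
# the fee scales with bandwidth, transmit power, the population density of
# the licence area (geography), and how congested the band is.
# The base fee and all weights are invented for illustration.

def apparatus_fee(bandwidth_mhz, power_watts, density_factor, congestion_factor):
    base = 100.0  # notional base fee in AUD
    return (base * bandwidth_mhz
                 * (1 + power_watts / 100.0)   # higher power, higher fee
                 * density_factor              # e.g. 3.0 for a metro area
                 * congestion_factor)          # e.g. 1.5 for a crowded band

# e.g. a 2 MHz, 50 W link in a dense, congested metropolitan band:
print(round(apparatus_fee(2.0, 50.0, density_factor=3.0, congestion_factor=1.5)))
```

The design point such a formula encodes is administrative pricing as a proxy for opportunity cost: spectrum that is wide, powerful, urban and congested is priced highest, nudging users toward cheaper alternatives.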

Gerry Oberst, a telecom lawyer from the firm of Hogan & Hartson, spoke next on "Managing Interference in a Smarter World." Who is getting smarter? he asked ironically. The key question for regulators of spectrum trading is how to set interference thresholds. In his view, interference studies always require an impartial referee to assess whether the claimed interference risks are real.

After the lunch break, there was a panel on the "Contribution of EU Research." Per Dofnas, Ericsson's Director of Technical Regulations, and Nicolas Demassieux, Director of Motorola's European Communications Research Lab, reported on their EC-funded work, while Henrik Abramowicz, Director of "Beyond 3G and IP Networks" for Ericsson Research, spoke on "Prospects for an 'Always Best-Connected Network.'" The 22-million-euro Ambient Networks project started a year ago and will continue for 2 more years. Coordinated by Ericsson with 41 partners, it is part of the EU's 6th Framework Programme. The "Wireless World Initiative," established in April 2002, is another one of their research activities. It is a cooperation between vendors, operators and regulators, with about 100 partners, including Siemens, Motorola, Ericsson and Nokia. (Pekka Ojanen added his thoughts on the relationship between research and national regulation, but unfortunately I didn't take any notes.)

During the question period, someone in the audience from Motorola prompted Ruprecht Niepold to remark that the EC is paying for a study predicting and quantifying future spectrum demands which is due in February-March 2005. A public workshop on 11 January 2005 will discuss 3 possible spectrum-demand scenarios. Then a German expert on RF compatibility asked about the EC's desire to see many more experimental licenses issued. He rarely sees any applications for experimental licenses, he said.

The Workshop's final panel tackled the questions "What are the challenges for radio regulators and researchers in the next 5-10 years? What way forward for Europe?" Gérard Pogorel, from France's Ecole Nationale Superieure des Telecommunications, explored the recurring themes of flexibility, harmonization and innovation. There are two ways of achieving flexibility, he said: one is by fostering market mechanisms - but do we mean making use of existing property rights, or coming up with new ones? And how can property rights be kept flexible? The second way to be flexible is with unlicensed spectrum commons. But are commons suitable only for niche markets and early-stage technologies, or can they work for any service at any stage? As for harmonization, should each EU member be responsible for its own process of harmonization, or should this be set at the regional level? As for innovation - what is the most dynamic "regulatory mix" to match the rate of technology innovation? Mike Goddard, the new head of the EU's Radio Spectrum Policy Group, reported that the reconfigured RSPG met last Friday to adopt an opinion on spectrum trading rights. Critical of past harmonization efforts, he said, "Looking to the future, the question we have to ask ourselves is, what are the absolute minimum constraints that must be put on spectrum use?" Kevin Kahn of Intel added that simple "chaotic" systems tend to beat elaborately planned systems in the marketplace. So rather than look for the perfect regulatory solution, try to let the market decide. Tom Hazlett also emphasized the need to rely on market solutions, but David Cleevely disputed that granting licensees full control over their spectrum would promote innovation and yield social benefits as effectively as other models.

Closing the Workshop, Francisco de Albuquerque, head of INFSO's Unit for Communication and Networking Technology (D1) pointed out that the "efficient use of spectrum" is a 6th Framework IST Thematic Priority. The updated work programme for 2005-6 was approved last month: 138 million euros is allocated for research on wireless and mobile systems beyond 3G, while the total IST programme is over 1.1 billion euros.

Ruprecht Niepold summed up: there is almost unanimous agreement that spectrum reform is needed and the time is now. It is no longer enough to say we need a mix of approaches, "we have to start hitting nails on the head" - applying specific approaches to specific services. He was impressed by statements during the workshop that apparently different approaches to spectrum management - unlicensed commons vs. licenses as property - actually have elements in common. What Europe needs is "harmonization of flexibility." But how much flexibility do we need? Referring to the similar but differing charts from Hazlett and Cleevely showing spectrum valuations, he asked: what is the formula for deciding how much spectrum is appropriate for a given technology? And what are the proper roles of the national regulators and the EC? "Apparently there is a market for this type of workshop," so they will hold them from time to time, perhaps with more specific topics.

---Robert Horvitz

Open Spectrum International's homepage