Editor's note: A wide range of broadband access network infrastructure considerations are being driven by a mix of emerging technologies, competition, consumer and enterprise usage patterns, infrastructure costs, service pricing constraints, 4G security
concerns and regulation -- to name a few. As a result, carrier interest in driving more intelligence to the edge of the network is growing in order to make sure networks aren't overloaded when trying to meet new user needs. Tom Nolle, president of CIMI Corp., looks at the forces changing how broadband services are created and maintained.
When broadband access to the Internet was first introduced, it seemed a simple and profitable extension to dial-up Internet. But broadband service characteristics, regulations, competition and technology choices have proven dynamic and tightly interdependent. Now new and even more dramatic pressures are being generated in each of these areas, and there is every reason to believe they will transform the way broadband services for consumers and businesses are created and sustained.
The Internet has created a mass data market, one whose total market value is far larger than that of the enterprise market that dominated data services in the past. The Internet has also transformed the enterprise view of the network from something that supports inter-company connectivity to something that provides a company with access to customers, computing and storage resources, applications and content. The minute-by-minute capacity requirements of enterprise branch networking were relatively easy to estimate, and so networks were based on pipes of fixed sizes and fixed access commitments.
Can access speed and size make peace with broadband service economics?
Modern enterprise and consumer networking is all about "headroom": the goal is to provide a pipe large enough to serve everything the user might be interested in buying, thereby maximizing average revenue per user (ARPU).
The question has become, "How high can you go?" Online service and content providers like Google and Salesforce.com would like to see access bandwidth rise sharply to encourage service consumption. But operators have long known that user willingness to pay for bandwidth isn't proportional to speed.
Some studies have shown that enterprises and consumers are willing to pay about 50% more to double their access capacity the first time, but for the next doubling, they are willing to pay only 10% more, and beyond that, less than 5%. In addition, consumers tend to cluster in the lowest-priced service tiers, with less than 10% taking the highest available rates even in favorable markets. That makes it difficult to justify extensive upgrades in access infrastructure to increase access speed.
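The diminishing returns described above can be made concrete with a toy calculation. The base speed and monthly price below are hypothetical illustrations (they do not come from the cited studies); only the successive premiums of roughly 50%, 10% and 5% per doubling follow the figures in the text. The sketch shows why revenue per Mbps collapses as speed tiers climb:

```python
# Toy illustration of the diminishing willingness-to-pay curve:
# each doubling of access speed earns a smaller price premium.
# Base speed and price are hypothetical, not from the cited studies.

base_speed_mbps = 10
base_price = 40.0  # hypothetical monthly price

# Price premiums per successive doubling, per the studies cited above.
premiums = [0.50, 0.10, 0.05]

speed, price = base_speed_mbps, base_price
print(f"{speed:>3} Mbps: ${price:6.2f}  (${price / speed:.2f}/Mbps)")
for premium in premiums:
    speed *= 2
    price *= 1 + premium
    print(f"{speed:>3} Mbps: ${price:6.2f}  (${price / speed:.2f}/Mbps)")
```

Under these assumptions, quadrupling capacity twice over raises revenue by only about 73% while revenue per Mbps falls by roughly four-fifths, which is why the upgrade math is so hard to justify on speed alone.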
More than any other market force, competition has been the most consistent driver of access infrastructure upgrades. Where multi-mode competition exists (cable and telco, fixed wireless and wireline), operators have found it easiest to promote services by selling access speed. In the U.S., for example, cable companies transitioning to DOCSIS 3.0 technology can offer speeds from 50 to nearly 500 Mbps, and 100 Mbps cable broadband is already available. This is forcing telecom service providers to consider fiber to the home (FTTH) and VDSL technology to improve their older DSL access speeds.
Regulators consider variety of broadband access policies
Regulatory policy has joined competition in many areas to influence broadband access decisions—sometimes with radical results. In many parts of Europe, in Australia and in the U.S., regulators are concerned about the natural market's ability to provide quality broadband services and have opened inquiries or proposed projects to augment market behavior.
One idea that has gained appeal in many areas is the participation of communities, states or even countries (Australia, for example) in a not-for-profit access consortium that would then make the connections available to users. Google's much-publicized broadband initiative is a program to encourage municipal broadband tests and trials.
Traffic management concerns hover over access to certain applications
A related issue that crops up in broadband access networking is traffic management. Most consumer broadband is heavily oversubscribed, meaning that the capacity of the aggregation portion of the access network is less than the sum of the user access connections. Congestion will occur in periods of heavy use, and that could impact applications like video and voice over IP (VoIP) that demand stable Quality of Service (QoS). In addition, 4G wireless backhaul using generic mobile infrastructure virtually demands some control over QoS.
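The oversubscription arithmetic above can be sketched in a few lines. All of the numbers here (subscriber count, access rate, aggregation-link capacity) are hypothetical values chosen for illustration:

```python
# Minimal sketch of access-network oversubscription: the shared
# aggregation link is sized well below the sum of the access rates sold.
# All figures are hypothetical illustrations.

subscribers = 500
access_rate_mbps = 20             # per-subscriber access speed sold
aggregation_capacity_mbps = 1000  # shared uplink from the access node

sold_capacity_mbps = subscribers * access_rate_mbps
oversubscription_ratio = sold_capacity_mbps / aggregation_capacity_mbps
fair_share_mbps = aggregation_capacity_mbps / subscribers

print(f"Sold capacity:          {sold_capacity_mbps} Mbps")
print(f"Oversubscription ratio: {oversubscription_ratio:.0f}:1")
print(f"Fair share, full load:  {fair_share_mbps} Mbps/user")
```

In this sketch each user is sold 20 Mbps, but if everyone transmits at once each gets only 2 Mbps, which is adequate for bursty web traffic but not for sustained video or jitter-sensitive VoIP; that gap is exactly where traffic management and QoS controls come into play.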
Regulators are concerned that traffic management could be used to poison the performance of some applications because they compete with the provider's own services or because the provider doesn't consider them "good" applications. Net neutrality regulation is under debate in many markets, and in the meantime, operators need very flexible traffic management capabilities so they can comply with whatever restrictions emerge.
Mobile security could become valuable wireless offering for carriers
Security is perhaps the most significant issue facing broadband access network infrastructure planners, and mobile services are a major driver. Mobile security is a potentially valuable service for operators to sell, in part because mobile devices typically have fewer on-device security tools than PCs. The GPS capability increasingly built into mobile phones could also present a personal stalking risk to users if hacked, and mobile infrastructure components themselves could be attacked.
4G center stage in access planning for new user behavior
The transition to 4G wireless is a major factor in access infrastructure planning for many operators, and femtocells are particularly important. Mobile technology, spurred by smartphones, eBook readers and tablets, is expected to create a whole new set of user behaviors and an associated set of service opportunities. These new services are most likely to be used in "hospitality" locations, where femtocells could provide a quick 4G footprint and at the same time offload the heaviest content traffic from normal 4G cells. This will mean delivering 4G backhaul over traditional access, and that is likely to create both security and QoS issues for access connections.
Combined forces point to need for intelligent edge technology
All of this is combining to create a significant interest in "intelligent edge" technology in the access and even metro networks. Not only can these devices be used to add packet inspection, policy management, and security to services close to the point of user connection, they can also steer traffic to offload Internet content from premium mobile elements and even perform some of the special functions of the evolved packet core. The trend to utilize general-purpose edge devices with augmented application hosting capabilities for these missions is driven by a demand for lower capex and by the dynamic nature of the current market.
That dynamism is likely to continue and even to accelerate. Regulatory and competitive pressures have yet to peak in any of the major market areas, and the highly dynamic mobile market is clearly going to drive not only a direct change in mobile-to-metro traffic planning, but also changes in user behavior and user services. These will then create new demands on access bandwidth at all levels. In all, the current decade may be even more tumultuous than the last.
About the author: Tom Nolle is president of CIMI Corporation, a strategic consulting firm specializing in telecommunications and data communications since 1982. He is the publisher of Netwatcher, a journal addressing advanced telecommunications strategy issues. Check out his SearchTelecom.com networking blog Uncommon Wisdom.
This was first published in February 2010