With the broad adoption of personal computing, we have witnessed more than a quarter century of staggering incremental improvement in data processing power. These benefits have not only touched the traditional desktop. Smaller form factors such as laptops, netbooks, and now tablets and smart phones are reaping the benefits of ever increasing clock speeds complemented by multiple core processors. In parallel, memory has become faster and cheaper.

A case in point is Apple’s iPad. Launched a year ago, the original iPad had a 1 GHz single core processor. A mere year later, Apple last week announced iPad 2, which boasts a dual core processor along with a ninefold increase in graphics processing capability. All of this at the same price, yet in a form factor 1/3 thinner than its still novel predecessor. And a year later, Apple no longer owns the entire tablet market. Familiar names such as HP, LG, Motorola, RIM and Samsung are offering tablets with impressive specifications, all supported by powerful dual core processors.

The increase in processing speed, memory capacity and other performance related specifications aligns with Intel co-founder Gordon E. Moore’s law, which essentially asserts that processing and memory performance per unit cost improves exponentially, doubling over the course of roughly two years. In addition, while battery technology is comparatively slow in its evolution, we’ve seen enormous improvements in the power efficiency of microprocessors and RAM, allowing for device portability. Deloitte predicts that smart phones and tablets will outsell all other computer categories combined in 2011. Device portability is now an expectation of the consumer, and increasingly the enterprise as well.
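Moore’s observation reduces to simple compounding arithmetic. The sketch below assumes, for illustration only, that performance per unit cost doubles every two years:

```python
# Back-of-the-envelope Moore's law projection. The two-year doubling
# period is an illustrative assumption, not a precise industry figure.
def moores_law_multiplier(years, doubling_period=2.0):
    """Projected performance-per-cost multiplier after `years` years."""
    return 2 ** (years / doubling_period)

# A decade of compounding at this rate yields roughly a 32x improvement.
print(round(moores_law_multiplier(10)))  # prints 32
```

Under that assumption, a decade of compounding yields roughly a 32-fold improvement in performance per dollar, which is broadly consistent with the generational leaps described above.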

With all this horsepower in the hands of the user, why is cloud computing so compelling? While the three previous installments in this series touched on cloud computing benefits such as real time collaboration, ubiquitous access to applications and user files on any device, perhaps the most compelling attraction is the exceptionally low cost of entry. Cloud computing user devices need be nothing more than a hardware platform functioning as an ultra thin client. Equally attractive, cloud computing is client platform agnostic – both from a hardware and operating system perspective.

For example, a user at head office on the east coast creates a spreadsheet in the cloud using her office notebook running Windows. At lunch, she reviews the spreadsheet on her Xoom tablet and makes a few changes before discussing it with her colleague out on the west coast. Later, now at home, that same user accesses the spreadsheet on her brand new MacBook Pro running OS X. As she makes some final changes, her colleague on the west coast has the spreadsheet loaded on his office desktop running Windows. Through real time collaboration he adds the remaining numbers, and the spreadsheet is ready for review by the CEO. The CEO is on an ecotour in Central America but is able to stop in a small village where there’s an Internet cafe. On an old PC running Windows 98 and with dial-up Internet access, the CEO pulls up the spreadsheet, reviews it, adds some comments and returns to his adventure.

Combining portability with a more ‘traditional’ user interface, a low cost netbook makes a very good platform for cloud based office productivity applications such as spreadsheet and document preparation. Even presentations are simple to prepare using cloud based applications.

Impact on the Network Operator

As the chart below depicts, cloud computing transfers virtually all of the burden away from the consumer and into the hands of the host (most often a webco), along with the network operator/carrier.

Cost Distribution of Cloud Computing

Clearly the end user enjoys very low fixed and variable costs. With service delivery via the Internet, virtually any device with a standards compliant browser can be used. In addition, cloud oriented ‘apps’ for smart phones and tablets continue to emerge – almost on a daily basis.

The aggregate cost burden for cloud computing service delivery (both capital and operational) is largely absorbed by the host webco and/or the network operator. With that in mind, cost mitigation and monetization strategies are being investigated by webcos and network operators alike.


For network operators, an opportunity to repatriate some lost revenue from over-the-top users is one possibility. Many cloud computing webcos see benefit in dispersing their hardware assets beyond their own data centres. In the trend towards network edge oriented service delivery, installing an instance of the webco’s cloud services in a network operator’s facilities is becoming a compelling idea. This approach increases redundancy and geographic diversity for the webco, but it also disperses the global cost burden.

In turn, the network operator benefits from revenue sharing, or some other revenue generating mechanism. Co-branding, along with other enhanced marketing opportunities also become possible under such collaboration.

As the industry has learned in the past decade, however, it is essential that the user experience of the cloud service not be compromised in attempts to build walled gardens, or through attempts to offer reverse over-the-top services in competition with the webco itself. Users are sophisticated and know they have a choice. Importantly, users typically associate cloud computing value with the webco as opposed to the network operator. The enormous success of the smart phone ‘app’ stores offered by Apple, Google, RIM and others demonstrates that network operators are in fact cognizant of where their value is and, equally important, where it isn’t. With that in mind, a great opportunity for network operator/webco collaboration awaits.

As a wholesale network operator in Canada, WireIE is capable of hosting Cloud services as a complement to our Transparent Ethernet Solutions.

Cloud applications are wide and varied. Household names such as Facebook and Twitter are cloud based, as are content management systems such as WordPress. Netflix, another household name, streams video to millions of viewers from its servers based in the cloud. At the other end of the spectrum are advanced IT oriented cloud services such as Cisco’s OverDrive network virtualization services. OverDrive virtualizes routing, switching, security and access control in the cloud.

The general consensus is that MSN’s Hotmail was the original cloud computing service – although it wasn’t regarded as such when it launched in July 1996. Google raised the bar in terms of capability by introducing their Docs & Spreadsheets (now simply called Google Docs) cloud service. Taking direct aim at Microsoft’s hold on the Office Suite space, Google Docs offered less functionality – the thinking being that a simplified feature set is actually an advantage for the vast majority of users. Studies have shown that 80 percent of the traditional desktop application user community uses only 20 percent of the available features. The busyness of the user interface becomes an impediment for these users. Offsetting the “dumbed-down” feature set is the ability to:

  • Collaborate on a file with other people on a real time basis regardless of where the participants are located.
  • Access the documents from any browser on any OS from anywhere there is Internet connectivity.
  • Use Google Docs at no charge.
  • Know that you will always be using the latest, most secure version of the application.
  • Know that user file backup practices offered by Google are going to be more reliable and secure than those followed in many homes and businesses.

Microsoft’s Office 365 offers tight integration between its desktop software model and its cloud services – essentially the best of both worlds – a richer feature set combined with the benefits of working in the cloud.

Dropbox and Carbonite, on the other hand, offer a more basic service by providing automatic, unattended synchronization and backup of user files to the cloud. Encryption options are available, as are file sharing options with Dropbox.

The following video from the Pentasoft Channel describes the philosophy of cloud computing by concentrating on the three pillars of:

  • Virtualization
  • Utility Computing – Distributed Server Capacity
  • Software as a Service (SaaS)

Many consumer oriented cloud services predate Google Docs. Photo storage and sharing sites such as Flickr and Picasa have been around for years now. Even processor-intensive applications such as Photoshop have a cloud based repository and editing environment. Video editing, arguably one of the most bandwidth-demanding, processor intensive applications, is available in the cloud from the likes of YouTube Video Editor and Kaltura.

As the World Wide Web rapidly evolves to HTML5, many resources currently found in a client operating system are being moved out to the cloud. A simple example is cloud based fonts. Prior to HTML5, a web designer was limited to the fonts residing in the site visitor’s operating system. Among many other things, HTML5 allows new font sets to be loaded from the cloud. In fact, as we move to HTML5, the very tools used to develop websites are moving to the cloud.

An intriguing concept is Google Cloud Print. As a companion to Google Chrome, Cloud Print places printer drivers and security credentials in the cloud. Printers are then mapped to the appropriate cloud profile. Not only does this enable printing from virtually any computer anywhere, it also has the potential to redefine the way we use legacy services such as facsimile and the postal service.

In April 2010, the volcano beneath Iceland’s Eyjafjallajökull ice cap erupted, causing days of flight cancellations and delays for both passengers and air cargo. Some of the affected cargo was trans-Atlantic mail. Had we evolved to a cloud print world, much of the mail would have been unaffected because it would have printed locally – be it at a postal centre, or at the actual addressee’s home or office.

The world of Cloud Computing is advancing rapidly. Derrick Harris of Gigaom recently assembled a list of 8 cloud companies he feels we should be watching in 2011. Just click here to read his analysis.

In our final installment we’ll take a look at the bandwidth implications as a result of the boom in cloud computing.

This post is the first in a series on Cloud Computing from the point of view of the network operator. We’ll provide an overview of the current cloudscape including the prominent players and their services. The series will wrap up with a discussion on bandwidth considerations for the network operator.


Desktop computing supported by the Local Area Network (LAN) has served business very well over the past couple of decades. The evolution of Ethernet has seen LAN speed and performance increase exponentially, while the adoption of IP has allowed us to internetwork LANs. This fundamental infrastructure hastened adoption of the World Wide Web – giving the human race access to infinite sources of knowledge, information, entertainment and social interaction. While it may seem hard to improve on such an incredible series of events, two related developments have exposed some constraints, and with them, opportunities.

The first is the progression in portability and mobility of end user devices. Laptop computers have become lighter and smaller while also becoming more computationally powerful and battery efficient. Today, the laptop shares the human productivity stage with smart phones and tablets. Clearly, device portability has become a fundamental user expectation.

The second is the profound evolution of cellular-based wireless network technology. First generation AMPS networks were launched as LANs were in their infancy. We now live in a 3G+ packet switched wireless world where data speeds on these networks rival landline data services of not too many years ago.

It is the portability of end user devices, combined with the performance and ubiquity of data networks, that has fueled the adoption of Cloud Computing. From a business perspective, the key driver for many webcos (such as Google and Amazon) is to enhance their core offering through value added services in the cloud.

Today, operators in many parts of the world are supporting multi-generation wireless networks. As deployment of next generation network access technologies gains momentum, many operators will endure a transition period in which legacy services based in the time domain must coexist with pure IP oriented packet data services.

With this in mind, network backhaul planning and design have become much more strategic than in the recent past. So much so that three out of four mobile operators are choosing backhaul network solutions independent of other network infrastructure.

This brief video, with Heavy Reading’s Patrick Donegan and Shailesh Shukla of Cisco, speaks to the trend and provides insight into the benefits of evaluating backhaul solutions on their own merits. The conversation is particularly germane as network operators continue the move towards fixed-mobile convergence.

WireIE’s Carrier Grade Network Extension solutions use the latest in Ethernet Radio technology. The solution’s inherent support for IP traffic is augmented by an array of configurable options in support of legacy TDM requirements.

Mary Meeker, often dubbed ‘Queen of the Internet’, presented a number of insights on trends in media, social networking, and mobile broadband usage at this week’s Web 2.0 Summit in San Francisco.

From the rapid redefinition of how media is delivered and consumed, to the impact social networking is having on our society, Meeker provides a concise yet comprehensive view into our rapidly changing world. Mobile networks, and the implicit requirement for broadband access and backhaul, are essential ingredients in support of virtually all the trends Meeker discusses. In fact, mobile in and of itself is characterized as “ramping up faster than any new, new thing”.

Backed up by an ample helping of salient market data, Meeker discusses:

  • Globality
  • Mobile
  • Social Ecosystems
  • Advertising
  • Commerce
  • Media
  • Company Leadership Evolution
  • Steve Jobs
  • Ferocious Pace of Change in Tech

Meeker’s presentation is provided here courtesy of O’Reilly Media.

The digitization of US over-the-air television channels has afforded a twofold opportunity to expand broadband access in that country.

The most publicized was the reassignment of 108 MHz of spectrum at the ‘top’ of the former UHF analogue television band in what broadly became known as the 2008 700 MHz auction. The auction resulted in the coveted “B” (12 MHz) and “C” (22 MHz) blocks going to AT&T and Verizon respectively. The band will primarily be used for LTE, as well as 3G overflow in the shorter term.

The other wireless broadband access opportunity resulting from the digitization of broadcast television is arguably more abstract. The FCC recently ruled in favour of the somewhat contentious concept of using the remaining 228 MHz of UHF, and 42 MHz of VHF, digital television broadcast spectrum on a shared basis with a broadband access technology commonly referred to as White Spaces.

Also touted as “Wi-Fi on steroids”, White Spaces will be an unlicensed radio access technology in which wireless broadband sessions occur on digital television guard band frequencies. Unassigned digital television channels can also be used by White Spaces technology yielding throughputs of up to 11 Mbps per device.

The philosophy behind how White Spaces devices manage RF spectrum is what the FCC refers to as “opportunistic”, under a mechanism known as Spectrum Sensing Cognitive Radio (IEEE 802.22). The protocol also allows for database querying in addition to RF sensing, giving television broadcasters a second layer of protection from potential interference. This works by reconciling the location of the White Spaces device (based on its GPS coordinates) with the digital television transmitter/antenna values in the database. RF interference threshold parameters dictate whether the White Spaces device can use the channel.
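The database reconciliation step can be sketched in a few lines. This is an illustrative simplification, not the FCC’s actual mechanism: the record layout, the transmitter entry, and the protected-contour radius are all hypothetical, and a real device would also factor in RF sensing results and per-channel interference thresholds.

```python
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points via the haversine formula."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def channel_available(device_lat, device_lon, channel, transmitters):
    """A channel is usable only if the device sits outside the protected
    contour of every registered DTV transmitter on that channel."""
    for tx in transmitters:
        if tx["channel"] == channel:
            d = distance_km(device_lat, device_lon, tx["lat"], tx["lon"])
            if d < tx["protected_radius_km"]:
                return False  # inside a protected contour: defer to the broadcaster
    return True

# Hypothetical database entry: a DTV transmitter on channel 31 near Toronto.
db = [{"channel": 31, "lat": 43.64, "lon": -79.39, "protected_radius_km": 100.0}]
print(channel_available(43.7, -79.4, 31, db))   # device inside the contour: False
print(channel_available(49.9, -97.1, 31, db))   # far outside the contour: True
```

The appeal of the approach is that the same query, repeated as the device moves, continually re-optimizes spectrum use against the current RF environment.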

This highly dynamic, in-the-moment approach to RF spectrum assignment is intriguing in that spectrum utilization is continually optimized based on many key parameters in the RF environment. In fact, this could be a bellwether for current generation radio access technologies where demand for spectrum is outstripping available bandwidth.

While the 2008 spectrum auction has given the successful bidders access to the favourable propagation of 700 MHz, White Spaces is permitted to operate across the entirety of both digital television bands (174-216 MHz VHF and 470-698 MHz UHF). In light of the propagation characteristics unique to each of these bands, and the comparatively ad hoc nature of the White Spaces approach to spectrum utilization, radio access can be exposed to a number of propagation anomalies that will temporarily enhance or degrade performance. Fortunately, the Spectrum Sensing Cognitive Radio protocol should mitigate the degradations and exploit the enhancements in signal propagation.

Network Extensions an Essential Piece of the Puzzle

As jurisdictions outside the US complete their transition to digital television, White Spaces will become broadly available as a new and innovative broadband wireless access technology. At that time, many regions will also be repurposing retired analogue television spectrum for 4G broadband wireless services. With these broadband access technologies based on IP, Ethernet Radio is an obvious choice for backhauling these dense access networks. WireIE is a leader in Network Extensions based on Ethernet Radio. We encourage you to visit here for more information.

By George Kaichis

Looking for ways to maintain profitability as an operator on your way to 4G/LTE? WireIE’s George Kaichis (Director, Radio Network Services) has some tips to help your company get there.

In order to meet the projected spike in demand for data services and rising quality expectations, operators will ultimately need to migrate their networks and businesses to 4G/LTE. However, WireIE recognizes that most operators will not have the capital available to upgrade their networks, and therefore suggests the following to ease the transition for operators:

  • Outsource non-core activities, particularly around the deployment and operation of your networks
  • Deploy a hosted 3G network
  • Sell operator-owned towers to tower companies and lease back space for your equipment
  • Sell microwave assets to wholesale backhaul providers and lease them back
  • Preserve roaming revenue through RF Optimization, site audits and KPI monitoring in order to maximize network capacity and performance

George recently wrote an article that was published in Cancion Magazine, a quarterly journal issued by CANTO (The Caribbean Association of National Telecommunication Organizations). The article expands upon these different measures that we believe will help operators handle the ever increasing consumer demand for higher speed data services while also maintaining profitability.

WireIE President & CEO Rob Barlow today formally commented to the federal government on its consultation entitled: “Improving Canada’s Digital Advantage, Strategies for Sustainable Prosperity.”

Removing Broadband Deployment Obstacles in Rural Canada

I appreciate the opportunity to share my views in response to the Government of Canada’s Consultation Paper on a Digital Economy Strategy.

As President & CEO of a Canadian based global broadband wireless company, I am keenly aware of the benefits modernized ICT infrastructure brings to an economy. Economic and social development varies from one economy to another, but in every case, significant, measurable increases in GDP are realized when access to broadband is made universal.

Much of my company’s work is in the developing world where access to broadband is extremely limited if available at all. By providing universal broadband access to education, health, industry, business and individual citizens, societies have been transformed in very dramatic ways. Creative minds are unleashed and given access to develop new products and services, not only for their local economy, but often for the world at large.

As a proud Canadian, I am profoundly disappointed that rural Canada is now lagging behind much of the developing world in terms of broadband access. Recognizing the enormity of our nation’s geography, along with the reality that we are one of the most urbanized countries on the planet, it is somewhat understandable that little attention has been paid to rural broadband access up to this point.

A look at the devil in the details reveals further concerns. For example, there is no clear, consistent delineation between urban and rural broadband service offerings. My office, for example, is located in the “technology centre” of York Region, mere kilometres from the City of Toronto boundary. Within five kilometres of my office, broadband service availability becomes very sporadic, even nonexistent in certain peripheral areas. Many businesses and residences encircling our country’s largest city have no access to broadband.

The Government’s digital economy initiative is a vital element in Canada reclaiming its prominence as a global telecommunications leader. The government’s paper on the matter does a good job of capturing the challenges, along with the necessity to address them. With that in mind, I offer the following comments.

I believe serious consideration should be given to defining broadband access as an essential service – much in the way access to electricity and traditional telephony services have been regarded for several decades now. I say this fully recognizing that political and economic realities of today are very different from the days when universal telephone service was being deployed in rural Canada.

It is my belief that one of the reasons our country has fallen so far behind is due to the lack of genuine competition in the telecommunications sector. With that in mind, and factoring in the significant capital infusion required to provide such universality, a structure based on private / public partnerships should be seriously considered.

I also recognize that our deregulated, competitive telecommunications environment necessitates that capital is allocated for broadband expansion based on Return on Investment per project. Understandably, areas with low population densities produce poor and very often negative ROIs.

The digital economy, however, is a broad, complex, multilayered concept as the Government’s paper describes so well. While the delivery systems (i.e.: telecommunications infrastructure) may yield poor or negative ROIs in many areas of the country, the creation of content, new products and services as a result of universal broadband have the potential to generate enormous wealth in the longer term. Put another way, universal broadband provides a consistent foundation from which immeasurable wealth can be generated over and above network operator revenue. This modernized infrastructure has the added benefit of providing remote and rural government offices and facilities with broadband, allowing for operational cost reductions, along with greater opportunity to offer services in more areas at a consistent level of quality and overall user experience.

A likely reciprocal result of this creation of wealth would be made-in-Canada innovation in the telecommunications sector itself. Our once global reputation as an innovator in telecommunications would be reestablished, but this time it would be substantially reinforced by services afforded by universal broadband access to the Internet and World Wide Web.

Realizing the longer term return on such a scenario, it is essential to incent telecommunications providers to expand where shorter-term ROIs are unattractive – even when augmented by public funds. Tax breaks are an obvious option but other incentive-oriented mechanisms should also be explored. For example, an easing, or where practical, elimination of radio frequency license fees in rural areas would aid in the provision of both broadband backhaul and access. Another deterrent for network operators in rural areas – both from a cost and logistical perspective – relates to inflexibility in accessing rights-of-way. Rigidity around collocation of multi-operator telecommunications facilities is another impediment. I believe that by clearing these obstacles, significant progress can be made in delivering universal broadband in rural Canada.

We deservingly pride ourselves on being a well educated society. Creation of wealth and the sharing of knowledge need not be confined to parts of our country where broadband is available. Our rural areas are bursting with clever, creative, educated people driven by an entrepreneurial spirit. Other rural residents long to learn and have access to the same infinitely rich resources enjoyed by their urban counterparts who take broadband access for granted.

I thank you for considering my comments on this extremely important matter and look forward to a bright future where every Canadian has the choice to participate in the Digital Economy.

Robert Barlow

President & CEO

WireIE Holdings International Inc.

As many are aware, a war has been raging between Apple and Adobe over Apple’s decision to abandon Adobe Flash in their iPhone, iPod Touch and iPad product lines. Instead, the always forward-looking Steve Jobs, CEO of Apple, wants to accelerate the development and ultimate adoption of the integrated video capabilities planned for HTML5.

While Jobs’s vision is laudable, WireIE also believes there’s a pragmatic side to the debate. At this point, selection of the codec for HTML5 video has not been finalized. While many agree the ubiquitous H.264 codec would be a great asset in the HTML5 multimedia suite, there are potential complications with licensing. Others fear that alternatives such as the open source Ogg Theora fall slightly short of H.264 in video quality and compression optimization. Regardless, these factors, along with spotty browser support, mean HTML5 and its integrated multimedia support is by no means finalized.

In the interest of being pragmatic, WireIE encourages the industry to give Adobe’s Flash Player 10.1 for mobile a fair hearing. In the video below, Adobe’s Adrian Ludwig demonstrates online gaming along with some beautiful multimedia interactivity with National Geographic’s web site on an HTC Nexus One running Android 2.1 and Flash 10.1.

The move by network operators to usage based pricing models will undoubtedly alter the behaviour of many heavy data users. But what could also result from these new pricing models is application developers coding their apps with much greater sensitivity to the fact that unlimited data pricing is fast becoming a thing of the past.

CNET News columnist Marguerite Reardon has assembled an assessment of how this pricing model change will inspire application developers to be more efficient. To quote Dave Grannan, CEO at Vlingo, “It’s a pretty basic economic principle. When there’s a perception of unlimited use, people use the resource in less than efficient ways.”
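The economics Grannan alludes to are easy to sketch. Under a hypothetical tiered plan (all figures here are illustrative, not any carrier’s actual rates), every gigabyte an inefficient app consumes beyond the monthly allowance has a direct, visible cost to the user:

```python
# Hypothetical tiered data plan: a flat fee covers an included allowance,
# with a per-GB overage charge beyond it. All figures are illustrative.
def monthly_bill(usage_gb, base_fee=30.0, included_gb=5.0, overage_per_gb=10.0):
    """Return the month's charge for `usage_gb` gigabytes of mobile data."""
    overage_gb = max(0.0, usage_gb - included_gb)
    return base_fee + overage_gb * overage_per_gb

print(monthly_bill(3.0))  # within the allowance: 30.0
print(monthly_bill(8.0))  # 3 GB over the allowance: 60.0
```

Under flat-rate pricing both users above would pay the same. Under the tiered model, an app that needlessly pushes its users over the allowance translates directly into larger bills, which is precisely the incentive for developers to economize on data.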

Reardon’s assessment doesn’t ignore the enormous growth of video services, user created video and the huge impact they have on wireless network bandwidth. To read her column, simply click on the logo below.