Tuesday, January 26, 2010

Location Based Services Part II: LBS Network Architectures

In the previous post, LBS Part I, we discussed the different location technologies and compared them across various parameters, along with their advantages and disadvantages. Today we will see how these positioning technologies integrate with the network architecture in the different wireless standards (3GPP, 3GPP2, OMA, WiMAX, LTE). We will start by categorizing location services by their usage as follows:



The above four categories can be implemented in practice through the way the MS communicates with the location server over the network. With wireless operators seeing significant value in LBS and its potential to deliver a solid ROI, the operator's engineering team must select one of two possible deployment methods: Control Plane or User Plane mode. Each has its own advantages and disadvantages. The control-plane approach, while highly reliable, secure, and appropriate for emergency services, is costly and, in many cases, overkill for commercial location-based services. In both 3GPP and 3GPP2, an IP-based approach known as "user-plane" allows network operators to launch LBS without costly upgrades to their existing SS7 network and mobile switching elements.



Let us consider an example LBS implementation architecture in both modes.


1. Control Plane Architecture



The Control plane architecture consists of following core entities:



  • PDE/SMLC: Position Determination Entity/Serving Mobile Location Center - The PDE facilitates determination of the geographical position of a target MS. Input to the PDE when requesting a position is a set of parameters such as PQoS (Position Quality of Service: accuracy, yield, latency) requirements and information about the current radio environment of the Mobile Station (MS).

  • MPC/GMLC: Mobile Positioning Center/Gateway Mobile Location Center - The MPC serves as the point of interface to the wireless network for the position determination network. It retrieves, forwards, stores, and controls position information within the position network, selects the PDE to use for position determination, and forwards the position estimate to the requesting entity or stores it for subsequent retrieval.

  • LCS Client: The LCS client is a logical entity that requests the LCS server to provide information on one or more target MSs. Being a logical entity, the LCS client can reside within a PLMN, outside the PLMNs, or even in the UE.

  • Geoserver, LBS applications, SCP (Service Control Point), and content providers


In this configuration, the MPC/GMLC effectively serves as the intermediary and gateway between the applications, which run in the Web services space, and the PDE/SMLC, which runs in the signaling space. It acts as a holding agent for subscriber location information, working with the MSC<->VLR<->HLR chain, and facilitates push and pull transactions. A "push" transaction might be an application that locates a subscriber and delivers a message, perhaps about a sale at a nearby store, while a "pull" transaction would consist of the subscriber invoking a service, such as "find my nearest ATM". Service set-up and communication are performed via the traditional signaling network. The MPC/GMLC is also the place to perform general administration functions, such as authentication/security, privacy, billing, provisioning, and so on. Let us consider an example of a position request flow between the different entities. This shows a network-initiated location request from the LCS client in the C-plane LBS architecture.



These types of requests, initiated from the network side, are mostly for network performance measurements, emergency services, or push services querying the MS location.


2. User Plane Architecture

The User Plane architecture consists of the following core entities:

  • PS: Position Server - The PS provides geographic position information of a target MS to requesting entities. It serves as the point of interface to the LCS server functionality in the wireless packet data network, and performs functions such as accepting and responding to requests for the location estimate of a target MS, authentication, service authorization, privacy control, billing, and allocation of PDE resources for positioning.

  • PDE: Position Determination Entity (as in the Control Plane architecture)


3GPP2 U-Plane Architecture


The User Plane architecture is similar to the Control Plane but does not include the full functionality of the MPC/GMLC. Instead, it allows the handset to invoke services directly with the trusted location applications via TCP/IP, leaving out traditional SS7 messaging altogether. A scaled-down version of the MPC/GMLC handles authentication and security for the user-plane implementation. This method is focused on pull transactions, where the subscriber invokes a location-sensitive service; however, push transactions are also possible and are supported through the limited MPC/GMLC function. Let us consider an example of a position request flow between the different entities. This shows a handset-initiated location request from an LCS client residing in the MS in the U-plane LBS architecture (the U-plane LBS location processing request procedure).


These requests are initiated from the mobile station, mostly for location-based searches (restaurants, navigation) or for pull services querying the position server.


3. OMA (Open Mobile Alliance) U-Plane Architecture

The Open Mobile Alliance (OMA) is a mobile communications industry forum created to bring open standards, platform independence, and global interoperability to the LBS market. More than 360 companies are represented in OMA, including MNOs, wireless vendors, mobile device manufacturers, content and service providers, and other suppliers. The OMA User Plane consists of the following entities and protocols:




  • MLP: Mobile Location Protocol: MLP is a protocol for querying the position of a mobile station, used between a location server and a location services client (a request sketch follows this list)

  • RLP: Roaming Location Protocol: RLP is a protocol used between location servers while the UE is roaming

  • PCP: Privacy Checking Protocol: PCP is a protocol between a location server and a privacy checking entity
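
To make MLP concrete, here is a minimal sketch of an LCS client posting an MLP Standard Location Immediate Request (SLIR) to a location server. MLP is XML carried over HTTP; the elements below follow the general shape of OMA MLP 3.x, but the endpoint URL, client credentials, and MSISDN are placeholders, and field details may differ between MLP versions.

    # Minimal sketch: an LCS client sending an MLP Standard Location
    # Immediate Request (SLIR) over HTTP. Endpoint, credentials, and
    # MSISDN are placeholders; element names follow the general shape
    # of OMA MLP 3.x and may vary by version.
    import urllib.request

    MLP_SLIR = """<?xml version="1.0"?>
    <svc_init ver="3.0.0">
      <hdr ver="3.0.0">
        <client><id>lcs_client_id</id><pwd>secret</pwd></client>
      </hdr>
      <slir ver="3.0.0">
        <msids><msid type="MSISDN">15551234567</msid></msids>
        <eqop>
          <resp_req type="LOW_DELAY"/>
          <hor_acc>1000</hor_acc>  <!-- requested accuracy, metres -->
        </eqop>
      </slir>
    </svc_init>"""

    req = urllib.request.Request(
        "http://location-server.example.com/mlp",  # hypothetical endpoint
        data=MLP_SLIR.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))  # svc_result carrying a <pos>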


SUPL (Secure User Plane Location): SUPL was developed by the Open Mobile Alliance. It is a separate network layer that performs many LBS functions that would otherwise be handled within the C-Plane, and is designed to work with existing mobile Internet systems. With SUPL, MNOs can validate the potential of the LBS market with a relatively small budget and few risks. SUPL utilizes existing standards to transfer assistance data and positioning data over a user-plane bearer. It is an alternative and complementary solution to the existing 3GPP and 3GPP2 control plane architectures, supports handset-based and handset-assisted positioning technologies, and is data-bearer independent. The SUPL architecture is composed of two basic elements: a SUPL Enabled Terminal (SET) and a SUPL Location Platform (SLP).



  • SUPL Enabled Terminal (SET): The SET is a mobile device, such as a phone or PDA, which has been configured to support SUPL transactions.

  • SUPL Location Platform (SLP): The SLP is a server or network equipment stack that handles tasks associated with user authentication, location requests, location-based application downloads, charging, and roaming.



The SLP consists of the following functional entities:



  • SUPL Location Center (SLC) coordinates the operation of SUPL in the network and manages SPCs.

  • SUPL Positioning Center (SPC) provides positioning assistance data to the SET and calculates the SET position.


The core strength of SUPL is the utilization, wherever possible, of existing protocols, IP connections, and data bearers (GSM, GPRS, CDMA, EDGE, or WCDMA). SUPL supports the C-Plane protocols developed for the exchange of location data between a mobile device and a wireless network, including RRLP (3GPP Radio Resource LCS Protocol) and TIA-801 (Telecommunications Industry Association 801-A, Position Determination Service for cdma2000). SUPL also supports MLP (Mobile Location Protocol) and ULP (UserPlane Location Protocol): MLP is used to exchange LBS data between elements such as an SLP and a GMLC, or between two SLPs, while ULP is used to exchange LBS data between an SLP and a SET. Let us consider an example of a position request flow between the different entities: a SET-initiated location request in the OMA-SUPL U-plane LBS architecture, sketched below.
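
For orientation, here is a sketch of the message sequence in a SET-initiated SUPL session, following OMA SUPL 1.0. Real implementations encode these ULP messages in ASN.1 over a secure TCP connection; the listing below only illustrates the ordering and rough contents of the steps.

    # Sketch of a SET-initiated SUPL 1.0 session as a message sequence.
    # Real SUPL carries ASN.1-encoded ULP messages over secure TCP; this
    # only shows the ordering and rough payloads of the steps.
    SET_INITIATED_FLOW = [
        ("SET -> SLP",  "SUPL START",    "SET capabilities, serving-cell location id"),
        ("SLP -> SET",  "SUPL RESPONSE", "chosen positioning method, session id"),
        ("SET -> SLP",  "SUPL POS INIT", "assistance data request, optional position"),
        ("SET <-> SLP", "SUPL POS",      "RRLP / TIA-801 positioning payload exchange"),
        ("SLP -> SET",  "SUPL END",      "final position estimate, session teardown"),
    ]

    for direction, message, payload in SET_INITIATED_FLOW:
        print(f"{direction:11s} {message:13s} {payload}")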


SUPL vs. C-Plane

Two functional entities must be added to the C-Plane network in order to support location services: a Serving Mobile Location Center (SMLC), which controls the coordination and scheduling of the resources required to locate the mobile device; and a Gateway Mobile Location Center (GMLC), which controls the delivery of position data, user authorization, charging, and more. Although simple enough in concept, the actual integration of SMLCs and GMLCs into the Control Plane requires multi-vendor, multi-platform upgrades, as well as modifications to the interfaces between the various network elements. LBS through SUPL is much less cumbersome. The SLP takes on most of the tasks that would normally be assigned to the SMLC and GMLC, drastically reducing interaction with Control Plane elements. SUPL supports the same protocols for location data that were developed for the C-Plane, which means little or no modification of C-Plane interfaces is required. Because SUPL is implemented as a separate network layer, MNOs have the choice of installing and maintaining their own SLPs or outsourcing LBS to a Location Services Provider.


4. LBS Architecture in WiMAX

The WiMAX network architecture for LBS is based on the basic network reference model (NRM) specified by the WiMAX Forum. The model divides the network architecture between two separate business entities: Network Access Providers (NAPs), which provide radio access and infrastructure, and Network Service Providers (NSPs), which provide IP connectivity with subscription and service delivery functions. The NAP is typically deployed as one or more access service networks (ASNs), and the NSP as one or more connectivity service networks (CSNs). The NAP interfaces with the MS on one side and the CSN on the other. Below is the location request initiation from an application located either in the device or in the network.


This is basically an MS-managed location service. The MS receives location requests from applications, takes the necessary measurements, determines its location, and provides it to the requesting applications through upper-layer messaging. The location calculations at the MS are aided by the geolocation parameters of the serving and neighboring base stations, broadcast periodically by the serving BS using the layer-2 LBS-ADV message defined in IEEE 802.16-2009. The LBS-ADV message delivers the XYZ coordinates, i.e. the absolute and relative positions of the serving and neighboring BSs, allowing the MS to perform triangulation or trilateration techniques (such as E-OTD or RSSI-based ranging), further aided by GPS, to locate itself accurately (a small trilateration sketch follows below). In this framework, no major LBS-specific functional support is required in either the ASN or the CSN, whereas a network-managed location service requires a few functional entities to be added to the network: Location Requester, Location Server (LS), Location Controller, and Location Agent. The WiMAX network architecture for LBS is also designed to accommodate user-plane, control-plane, and mixed-plane location approaches. The big advantage of user-plane location is that the LS can reach the MS directly, and signaling is minimized across the various reference points. However, for this to happen, the MS needs to have obtained an IP address and be fully registered with the LS, and application-layer support is required in the MS. In contrast, for control-plane location, the LS does not communicate directly with the MS, so there is no hard requirement for the MS to have obtained an IP address; in other words, the control-plane approach relies more on the L2 connectivity of the MS. However, signaling costs are generally higher in control-plane location, as the signaling has to traverse multiple reference points before measurements can be obtained. The mixed-plane method is simply the LS invoking both control-plane and user-plane measurements at the same time; the LS can then compute a hybrid location solution by combining the measurements for a much more accurate fix. This approach is also fully supported in the WiMAX network. The trade-off is that this method costs considerably more in terms of latency and associated signaling, but it translates into much better accuracy for the MS location indoors, where an insufficient number of GPS satellites may be visible.
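
To make the trilateration step concrete, here is a minimal sketch of how an MS could estimate its 2-D position from the broadcast coordinates of three base stations plus range estimates to each (for example from timing or RSSI measurements). All coordinates and ranges are invented values; a real implementation would weight measurements and handle noise.

    # Minimal 2-D trilateration sketch: solve for (x, y) given three
    # base station positions and estimated distances. Subtracting the
    # first circle equation from the other two linearizes the problem
    # into two linear equations, solved here by Cramer's rule.

    def trilaterate(bs, d):
        (x1, y1), (x2, y2), (x3, y3) = bs
        d1, d2, d3 = d
        # From (xi - x)^2 + (yi - y)^2 = di^2, for i = 1..3
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
        b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
        b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a11 * a22 - a12 * a21
        return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

    # Invented BS coordinates (metres) and range estimates
    bs = [(0.0, 0.0), (2000.0, 0.0), (0.0, 2000.0)]
    d = [1414.2, 1414.2, 1414.2]      # MS roughly at (1000, 1000)
    print(trilaterate(bs, d))         # -> approximately (1000.0, 1000.0)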


5. LBS in LTE

LTE generally supports the same types of positioning methods (Cell ID, A-GPS, mobile scan report-based, and hybrid) as WiMAX. LTE offers both user and control plane delivery of GPS assistance data; WiMAX chose to provide only user plane delivery, the rationale being that the rapid IP session setup with the LS offered by WiMAX minimizes the need for a control plane solution. In WiMAX, authorization and authentication for the LBS service is provided by the AAA, whereas in LTE the Gateway Mobile Location Center (GMLC) provides the equivalent functionality. The LTE Location Services specification is being developed under the current work plan and is targeted for 3GPP Release 9.


This sums up the Location Based Services architecture covering the 3GPP, 3GPP2, WiMAX, LTE, and OMA standards. In the next part we shall cover the use cases and business models, with current and future trends for LBS. - Neil Shah


References:


3GPP TS 23.271, "Functional Stage 2 Description of Location Services (LCS)"; http://www.3gpp.org/

Open Mobile Alliance, "Secure User Plane Location V2.0 Enabler Release Package"; http://member.openmobilealliance.org/

Etemad, K., Venkatachalam, M., Ballantyne, W., & Chen, B. (2009). "Location Services in WiMAX Networks", IEEE Communications Magazine.

WiMAX Forum, "Protocols and Procedures for Location Based Services", v1.0.0, May 2009.

OMA (2007). Enabler Release Definition for Secure User Plane Location (SUPL). Open Mobile Alliance.

Faggion, N., Leroy, S., & Bazin, C. (2007). Alcatel Location-Based Services Solution. France.

3GPP2 (2000). Location-Based Services System (LBSS): Stage 1 Description. 3GPP2 S.R0019.

3GPP (2006). 3GPP TS 23.271 V7.4.0, Technical Specification Group Services and System Aspects; Functional stage 2 description of Location Services (LCS) (Release 7).


2.6 GHz Spectrum & the Next Generation Mobile Broadband Networks

As the much-awaited 2010 Mobile World Congress kicks off on 15th February in Barcelona, Spain, the primary focus will be on discussing and showcasing the future of the mobile broadband industry, with cutting-edge products and technologies highlighting m-commerce, m-marketing, m-advertising, broadband deployment, and initiatives broadening the mobile ecosystem.


“There is clear evidence that the volume of data flowing over mobile networks is growing rapidly and is being accelerated by the popularity of smart phones and the growth in music and video downloads,” said Tom Phillips, Chief Regulatory Affairs Officer at the GSMA.


With this unique new view of the mobile landscape unfolding, one of the primary drivers will be the utilization of an ever-scarce resource: spectrum. The licensing of the 2.6 GHz spectrum will be vital in satisfying the demand for greater Mobile Broadband capacity and in launching next-generation networks such as LTE, which will start to be deployed commercially around the world this year.


In my previous article covering the WiMAX business model, I highlighted the importance of spectrum and its contribution to a wireless operator's cost model. Building on that, we now know that the licensing of the 2.6 GHz band will be critical to unlocking the benefits of global economies of scale in the Mobile Broadband market. The outcome of the 2.6 GHz allocation will have far-reaching consequences for how the adoption dynamics of WiMAX and 3GPP technologies (such as HSPA and, in future, LTE) play out in each region, since 2.6 GHz is the first arena where the two camps will battle each other in the same area of spectrum.



So let's jump into discussing and analyzing the 2.6 GHz band: its importance, what's in store, and its implications for the future of mobile broadband. This analysis extends the scope of the report on the 2.6 GHz band recently released by the GSMA & GVP. That report may be biased towards LTE, but let's draw some real pointers from it and analyze.


Digging into the 2.6 GHz band...


The 2.6 GHz band (2500-2690 MHz), sometimes also referred to as the 2.5 GHz band, was allocated by the World Radiocommunication Conference (WRC) in 2000 for terrestrial mobile communications services. It is often referred to as the "IMT-2000 expansion band" (now) or the "3G expansion band" (earlier), and at 190 MHz it is substantially wide. The band has been allocated on a primary basis in all three ITU regions for terrestrial mobile communications, in contrast to the smaller mobile allocation at 3.5 GHz (3.4-4.2 GHz). Why?


Note: ITU Regions: Region 1 comprises Europe, Africa, the Middle East west of the Persian Gulf including Iraq, the former Soviet Union and Mongolia; Region 2 covers the Americas, Greenland and some of the eastern Pacific Islands; Region 3 contains most of non-former-Soviet-Union Asia, east of and including Iran, and most of Oceania.


The WRC imposed stringent power limits on satellite systems with limited geographic footprints operating in the 2.6 GHz band, shifting the importance of satellite systems more into the 3.5 GHz bands. In addition, WRC-07 decided against a global identification for IMT (including WiMAX) in any part of the satellite C band (3.4-4.2 GHz), with the exception of the mobile service allocation in 3.4-3.5 GHz, making that band less globally harmonized for IMT. Hence the 2.6 GHz band is now in a unique position to be exploited as a common band for commercial terrestrial mobile broadband access services on a global basis. The key question for this 190 MHz of spectrum is how it should be divided for allocation: paired or unpaired, suiting the corresponding FDD and TDD modes of operation? The International Telecommunication Union (ITU) presents three possible options:



Option I: A mix of FDD (paired) and TDD (unpaired) spectrum, with a plan that avoids interference problems between the two different modes of operation


Option II: No unpaired spectrum is included in this plan, which leaves the second member of each pair undetermined


Option III: A flexible plan regarding the amount of spectrum allocated to either the paired (FDD) or unpaired (TDD) modes of operation


The adoption of the above plans differs from region to region, country to country, and market to market, depending on the technology standard to be deployed: HSPA/HSPA+, LTE, or WiMAX. A channel width of 20 MHz is recommended for the most efficient use of current technology capabilities, for FDD (2x20 MHz) as well as TDD (a 20 MHz block is sufficient). Licensing should be based on a structure of 5 MHz channel blocks to allow support for 5, 10, 15, or 20 MHz channels, depending on spectrum availability and each market's competitive situation. Future technology evolution (4G) will most likely be based on combining multiple channels, with 20 MHz being an ideal building block. (A small sketch of the Option 1 arrangement follows.)
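
To visualize how the 190 MHz carves up under Option 1, here is a small sketch using the widely cited European arrangement of 2x70 MHz paired (FDD) plus a 50 MHz unpaired (TDD) block in the middle; the exact sub-band boundaries below are an assumption based on that plan and may differ in other countries.

    # Sketch of the ITU Option 1 layout for 2500-2690 MHz, assuming the
    # common European arrangement: 2x70 MHz paired (FDD) + 50 MHz
    # unpaired (TDD), licensed in 5 MHz blocks.
    BAND_PLAN = [
        ("FDD uplink",   2500, 2570),
        ("TDD",          2570, 2620),
        ("FDD downlink", 2620, 2690),
    ]
    BLOCK = 5  # MHz licensing granularity

    for name, lo, hi in BAND_PLAN:
        width = hi - lo
        print(f"{name:12s} {lo}-{hi} MHz: {width} MHz = {width // BLOCK} x {BLOCK} MHz blocks")

    # A 20 MHz carrier is four contiguous 5 MHz blocks; an FDD operator
    # needs four blocks in each sub-band for a 2x20 MHz assignment.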


The ITU Option 1 band plan is well suited to meeting this goal, enabling technology neutrality and competitive "4G" wireless equipment choices for both FDD and TDD operation (including both LTE and WiMAX). There is widespread agreement at national levels, as well as at the European Union and its Commission, on adopting the Option 1 band plan. Recent licensing carries a bias toward Option 1, with slight differences related to country-specific situations. More auctions are expected in Europe as well as in major emerging markets such as Brazil and South Africa. Substantial 2.6 GHz spectrum is licensed in the United States, although allocation and utilization are less than ideal for unique, non-reproducible historical reasons that predate the allocation of this band to mobile communications.


The ITU Option 2 band plan does not accommodate demand for unpaired spectrum and therefore violates the principle of technology neutrality (to the disadvantage of WiMAX).


The ITU Option 3 plan is likely to lead to multiple different national band plans and other challenges, such as regulatory hurdles coupled with interference management, and costs and availability of equipment that must be customized to the different national band plans. It increases the need for guard bands and could drive up costs for spectrum owners, since they would need to negotiate with each other to ensure efficient coexistence and sacrifice spectrum for use as guard bands.


2.6 GHz Spectrum implications:


Socio - Economic Implications



  • Expanding wireless mobile broadband to developed as well as developing nations at affordable price points



  • A standardized spectrum band and allocation plan allowing global harmonization will help drive economies of scale, pushing costs down



  • Standardization also enables easy and ready accessibility of the common services across many geographies.

  • Economic reuse and sharing of the mobile operators' existing physical and operational infrastructure, reducing CAPEX (deployment costs)



  • Proper spectrum standardization and band plan options enable technology (FDD/TDD) and service neutrality, facilitating innovation and healthy competition between equipment, device, application, and service vendors, to the benefit of customers



  • Widespread mobile broadband deployment and growth have potential benefits (employment, GDP) for developed economies, and even more so for emerging economies.


Technological Implications



  • LTE and WiMAX can exploit 20 MHz of contiguous spectrum to deliver their highest spectral efficiency and throughput. The 2.6 GHz band makes such allocations possible, enabling operators to run high-speed LTE/WiMAX services at optimum performance.

  • The 2.6 GHz frequencies have relatively short propagation ranges and inferior in-building penetration compared to lower frequencies, which makes the band less suitable for rural areas (though beamforming can partly compensate for this).

  • On the other hand, the short propagation range and the large amounts of bandwidth (190 MHz) available in this band make it ideal for operators seeking to offer high network capacity and improve the speeds of mobile data transmission they can deliver to users in urban and suburban areas.

  • Looking ahead, the shorter 2.6 GHz wavelengths can achieve greater performance improvements through increased use of smart antenna techniques such as MIMO and beamforming than is possible at lower frequencies. Thus, the gap between environments where 2.6 GHz can be used economically and efficiently and those where frequencies below 1 GHz are better suited may narrow somewhat in favor of 2.6 GHz.

  • The 2.6 GHz spectrum is the ideal complement to the 700 MHz spectrum, also known as ‘digital dividend’, and will enable the most cost-effective nationwide coverage of Mobile Broadband across both rural and urban environments. Also, LTE is likely to be of interest in other bands (e.g. 1800 MHz in Finland and Hong Kong).

  • Though I mentioned at the start that 2.6 GHz was seen as the "3G expansion band", the ITU has since changed its designation to an IMT band (for all mobile applications), positioning it strongly for the growth of 4G technologies (LTE-Advanced and 802.16m).


2.6 GHz Adoption Facts:



  • Recent licensing of this band in Hong Kong, Norway, Finland, and Sweden, for example, has highlighted that there is more demand for paired (FDD) than unpaired (TDD) spectrum, and that the ITU's recommended Option 1 plan is the best structure to stimulate market growth in a technology-neutral and competitive environment.



  • In the United States band plan, incumbents have the flexibility to deploy Time Division Duplex (TDD) or Frequency Division Duplex (FDD) anywhere in the band (Option 3). Here the major spectrum owners are Sprint and Clearwire, deploying TDD WiMAX; this will be followed by future LTE rollouts by Verizon Wireless, possibly in the 700 MHz band, while AT&T is currently focusing on HSPA/HSPA+ networks to match WiMAX speeds.



  • Governments in most Western European countries as well as in Brazil, Chile, Colombia, and South Africa are planning to award 2.6 GHz frequencies within the next two years.


Summarizing the benefits, implications, facts, and mobile broadband trends: 2.6 GHz spectrum ownership and band allocation can shape the business models for the next-generation technologies. It will be a significant part of developing a wireless ecosystem that offers high-speed mobile broadband solutions that are easy to access, seamless across geographies, and affordable!! - Neil Shah


References:

Unstrung.com report: 2.6 GHz Spectrum Key for LTE
Maravedis-bwa.com: Europe Prepares for 2.6 GHz Spectrum Feeding Frenzy
Five bidders take 2.6 GHz WiMAX spectrum in Norway
GSMA & GVP report on "The 2.6 GHz Spectrum Band"
WiMAX Forum 2.5 GHz Spectrum Manager
Light Reading: GSMA Wants More LTE Spectrum
3G Americas: LTE Global Deployments
WiMAXVision.com: 4G Battle Looms in Europe at 2.6 GHz


AT&T upgrading to HSPA+ but will it ensure reliability??

Stephen Lawson of IDG News Service recently explained in his article why AT&T needs to spend $5 billion on its wireless network. I agree with him, as AT&T has to catch up with the coverage offered by Verizon Wireless.


Though AT&T boasts of the fastest 3G network, and it may well be, customer satisfaction and connection reliability, especially in urban areas, are the two main factors that might blur AT&T's image. And with the inclusion of bandwidth-hungry smartphone users (primarily iPhone) in its portfolio, loading its network and squeezing the backhaul, the situation might get out of control for AT&T unless it starts acting on it. Apart from loading, the other important factor, which I mentioned earlier, is coverage, which affects reliability.


Issue 1: 3G Speed & Reliability Tests


AT&T's 3G network is based on HSPA (High-Speed Packet Access), with an upgrade underway to HSPA 7.2, designed to deliver as much as 7.2 Mbps. Verizon uses EV-DO (Evolution-Data Optimized), which that carrier says offers as much as 1.4 Mbps in real-world performance. The speed of the network for individual subscribers depends on a variety of factors, but what matters here is reliability along with speed. The PC World test, conducted by Novarum last year, found mixed results for network speeds among AT&T, Verizon, and Sprint, but showed AT&T in last place for reliability in all 13 cities tested.



The analysis above sheds light on the "reliability" score, which depicts the percentage of tests in which the service maintained an uninterrupted connection at a reasonable speed (faster than dial-up) for Verizon, Sprint, and AT&T in 13 different cities.


Issue 2: CAPEX on wireless infrastructure


Recent reports from TownHall Investment Research depict that AT&T lags key competitors Verizon and Sprint in CAPEX on its wireless infrastructure. AT&T's capital expenditures on its wireless network from 2006 through September 2009 totaled about $21.6 billion, compared with $25.4 billion for Verizon and $16 billion for Sprint (including Sprint's investments in WiMAX operator Clearwire). Over that time, Verizon has spent far more per subscriber: $353, compared with $308 for AT&T. Even Sprint has outspent AT&T per subscriber, laying out $310 for network capital expenditure. That investment shortfall has been a major cause of AT&T's poor network performance, as reflected in tests by Consumer Reports and PC World. The other issue is that AT&T invests more in its wired infrastructure than in its wireless network, even though the wireless business contributes the majority of the carrier's profit: AT&T gets 57 percent of its operating income from wireless and only 35 percent from wired services, yet wireless receives only 34 percent of capital expenditures, with the wired network taking up 65 percent of that spending. (A quick sanity check on the per-subscriber figures follows.)
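
As a quick sanity check on those per-subscriber figures, dividing each carrier's cumulative wireless CAPEX by its per-subscriber spend gives the implied subscriber base behind each number. This is rough arithmetic only; the report's exact subscriber counts and averaging method are not given here.

    # Rough arithmetic: implied subscriber counts behind the quoted
    # per-subscriber CAPEX figures (2006 through September 2009).
    capex_total = {"AT&T": 21.6e9, "Verizon": 25.4e9, "Sprint": 16.0e9}
    capex_per_sub = {"AT&T": 308, "Verizon": 353, "Sprint": 310}

    for carrier in capex_total:
        subs = capex_total[carrier] / capex_per_sub[carrier]
        print(f"{carrier:8s} implies roughly {subs / 1e6:.0f} M subscribers")
    # AT&T ~70M, Verizon ~72M, Sprint ~52M -- in line with the carriers'
    # subscriber bases over that period.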


Issue 3: Backhaul Capacity


Along with investing in HSPA 7.2 upgrades at the base stations, AT&T needs to remove the backhaul bottlenecks to accommodate high-speed data in the core. Backhaul limiting the achievable speeds is the primary concern; as I mentioned in my previous post, operators must choose the right backhaul solution considering CAPEX/OPEX. The $5 billion investment gap could expand to $7 billion because of the need for new backhaul capacity to link AT&T's wireless network into the wired Internet.


Issue 4: Old Infrastructure


Another looming problem for AT&T is that its E911 emergency calling system, which works on its older GSM (Global System for Mobile communications) network, hasn't been adapted to use 3G and is unlikely to make the migration soon. That means AT&T will have to maintain that old network for the foreseeable future, including possibly more capital investment for more power-efficient GSM equipment.


Solutions:


Hot on the heels of T-Mobile USA's announcement that it had upgraded its 3G footprint to HSPA 7.2, AT&T Mobility said it had upgraded its own 3G cell sites across the country with HSPA 7.2 software. However, AT&T clarified that it is still working to deploy increased backhaul capacity to the sites, a job that will continue into 2011. With this, the customer experience will definitely get a boost, with more consistent data sessions. Beyond base station upgrades and increased backhaul capacity, AT&T needs to add more base stations, especially in the urban areas where user confidence is shaky, and expand its coverage. AT&T has already taken some smart steps by moving 3G service to its longer-range 850 MHz radio band in the San Francisco area, which seems to have helped coverage there, and the company will probably take that strategy nationwide, testing coverage in specific areas and "surgically" increasing capacity. So the ball is in AT&T's court, and they have to act, spend, and expand!!


- Neil Shah


References:

Analyst: AT&T Needs to Spend US$5B to Catch Up, by Stephen Lawson, IDG News Service
A Day in the Life of 3G, by Mark Sullivan, PC World
AT&T Plans to Double 3G Speeds, by Ian Paul, PC World
AT&T Upgrades Cell Sites to HSPA 7.2 Software, by Phil Goldstein, FierceWireless.com



NOKIA: NOt a King In America

NOKIA, as we all know (though perhaps not the common man in North America!), is the world's undisputed leader in mobile device technologies, with particularly strong dominance in the European, Asian, and Latin American markets.


But NOt a King In America!! Why?


To shed more light on this conundrum, let's analyze the growth of the mobile devices industry and the contribution of the industry leader over the last four years, 2006-2009. This has been a significant phase of big changes in the mobile devices industry (the advent and growth of smartphones, with new players entering with smart devices and hurting the incumbents' sales... yes, smartly enough).



(Note: I have approximated Q4'2009 data from the previous industry and Nokia's sales data, as Nokia has still not released its Q4'09 results)


The above graph depicts Nokia's consistent market share worldwide, but with a dip in 2009, coupled with the industry's de-growth in late 2008 and early 2009 as the recession slowed the growth prospects of the mobile devices industry.


There are more causes behind this dip in performance from the world leader. Let's see where it has lost its market share; there is no prize for guessing, but let us confirm it...



It's crystal clear from the above that NOKIA dominates everywhere except North America. Nokia's contribution to sales in North America has decreased dramatically, from a whopping 35% market share (the leader) in 2002 to a below-par 8% in 2009.


What have been the main reasons that, in a developed market like North America, Nokia is not a king? Why isn't it able to leverage the brand equity it has earned all over the world here in the American market? Is it the product portfolio, the operator partnerships, the go-to-market sales strategy, the right attitude, or a failure to understand North American consumers?


Issue 1: Birth of Smartphones  and Apple iPhone …


The smartphone growth driven by the BlackBerrys and Palms of the world in the mid-2000s suddenly accelerated with Apple's first step into the mobile phone industry with the iPhone, leveraging its iPod success. This certainly made it difficult for Nokia, as Apple, with its good core hardware design and mind-blowing touchscreen interface, coupled with applications and the App Store, revolutionized the smartphone market. The smartphone was now smarter and available to the ordinary consumer (unlike the BlackBerry, which had mostly enterprise users), creating a breathtaking user experience. Nokia had no answer to the iPhone's success, which slowly competed with its high-end product offerings and captured the growing smartphone category along with the incumbent BlackBerry.


If we look at a snapshot of the year-on-year growth of global smartphone market share, the situation is evident: Nokia is losing worldwide share in the smartphone category, down from 51% to 40%. We all know this is the must-have category in any product portfolio, with 15% year-on-year industry growth. Dominance in this category will also dictate who will be the global leader in the mobile industry in the coming years, as the advanced wireless networks (HSPA+, WiMAX, LTE) are designed for such data-hungry devices.



Issue 2: Product Mix


Continuing the discussion on smartphones: though Nokia holds a 46% global share with its Symbian OS and 40% of global smartphone handset sales, Symbian has a far clunkier user interface and lacks the power of applications that the iPhone ecosystem has in store for users. Nokia should come up with new smartphone devices for enterprise as well as ordinary consumers; the likes of the Nokia E71x, N97, and N900 should enhance Nokia's product portfolio in North America. Two of the top three operators in the USA run on CDMA technology (Verizon Wireless and Sprint-Nextel), yet Nokia has never focused on CDMA handsets in its portfolio, especially in North America; CDMA-based handsets comprise only 14% of Nokia's entire mobile handset sales. This explains the need to include more CDMA-driven handsets in its portfolio.


Issue 3: Mobile Operator Partnerships & Go-to-Market Sales Strategy


I believe this is the key to Nokia gaining market share in North America. "Subsidy" to the end user has been the primary factor driving handset sales here, and mobile subscribers are habituated to it. But almost 70% of Nokia's worldwide sales are based on a "direct buy" go-to-market strategy: if a subscriber needs a handset, he pays around $100-$300 for a multimedia phone, or even $400-$500 for a high-end smartphone, and in many of the bigger Asian markets like India and China, Nokia sells direct without any subsidy to the subscriber. In the North American market this sales strategy won't drive sales; only stronger partnerships with the tier-one mobile operators will boost Nokia's handset sales. Nokia has to be flexible on its price points in North America and develop relationships with these operators. The recent success of the Nokia E71 series with AT&T is the biggest reason Nokia kept its 8% market share almost flat compared to 2008. Nokia should also come up with a CDMA/EV-DO-based smartphone, especially to tackle the Korean handset companies (Samsung and LG), who have captured Nokia's market share over these years and topped the market share charts in North America. I can perfectly relate to this situation: I have always been a hardcore Nokia fan, and back in India I always bought expensive Nokia N-series phones costing between INR 15,000 and INR 25,000 ($300 to $500); but after coming to the USA and being bitten by the subsidy bug, I was lured into choosing an iPhone 3G ($100) offered via AT&T in lieu of buying an unlocked Nokia 5800 for $400!!


Issue 4: Understanding American Consumers


Along with the subsidy game and meeting consumers at the right price points, Nokia failed early on to understand the tastes of American consumers, instead mass-producing devices for the global market to save on production costs. It lacked flip phones, smartphones with QWERTY keyboards to catch the "texting" trend, and touchscreens coupled with great social networking, navigation, and utility applications that create ease of use and a great user experience.


Road to the Throne !!


Summing all this up, Nokia has a lot of issues on its plate, but at the same time it possesses the technical and business capabilities, brand equity, and capital to make amends. There have already been a lot of strategic improvements from Nokia's end.


Nokia is revamping its North American operations to collaborate more closely with the major American operators. AT&T this year will begin billing customers who use Nokia services branded as Ovi. Those customers will no longer receive a second bill from Nokia. And in Canada, the network leader, Rogers Communications, is making it easier to access Ovi Maps and N-Gage game services on two Nokia models.


There are huge plans to revitalize Nokia's Ovi app store, and the acquisition of Navteq will help it leverage GPS significantly. Nokia's recent move to offer a free, lifetime turn-by-turn navigation tool built into its smartphones will revamp its location-based services roadmap. It is available for 74 countries in 46 languages, with traffic information for about 10 countries and detailed maps for more than 180 countries to start with.



Another big step Nokia has taken is in its R&D spending, which has risen from 5% in 2007 to 12.5% in 2009 and should lay a strong roadmap for new product lines and a diverse product portfolio. Nokia has worked on its products' form factors, touchscreen capabilities, inclusion of the latest social networking and utility apps, an intuitive mobile web experience, and QWERTY keyboard capabilities with its Nokia N900 and N97 editions. No North American operator has yet announced newer commitments with Nokia, but we should expect them soon.



Nokia is also planning to make Symbian an open-source platform, enabling a future of broader capabilities and deeper reach in the North American market. Developers will thus have access to every single line of code, in other words to an open-source operating system, and we should see many more APIs and the induction of more and more functionality into programs.



To round up, it's a long way for NOKIA to regain the North American throne, and it's going to be challenging and worth watching. At the same time, all eyes are now fixed on the fiercest battle between smartphone OSes in the smartphone wars: Symbian vs. BlackBerry vs. Apple OS X vs. Android vs. Palm OS vs. Windows Mobile.


- Neil Shah


References:

Nokia Quarterly and Annual Reports 2006/2007/2008/2009
Yankee Research: The Battle for Smartphone OS  Supremacy


Tuesday, January 12, 2010

Location Based Services Part I: Technologies in Wireless Networks

Wireless carriers and their partners are developing a host of new products, services, and business models based on data services to grow both average revenue per user (ARPU) and subscriber numbers. The main focus of the developer and user community is on real-world mobile web services in an emerging mobile application field, namely Location Based Services (LBS), which provide information specific to a location and hence are a key part of this portfolio.

Definition: a service provided to a subscriber based on the current geographic location of the MS. Location-based services (LBS) provide service providers with the means to deliver personalized services to their subscribers.

LBS reflects the convergence of multiple technologies:

Internet - Geographic Information System - Mobile devices




Localization

Localization is based on the analysis, evaluation, and interpretation of appropriate input parameters, most of which relate to physical characteristics measurable in a direct or indirect way.

From a physical localization point of view, there are three principal techniques to be distinguished:

1. Signal Strength & Network parameters:

The most basic wireless location technology is given by the radio network setup itself. Each base station is assigned a unique identification number, the Cell ID. The Cell ID is received by all mobile phones in its coverage area, so the position of a target can be derived from the coordinates of the base station. Signal strength could be used to narrow down the target position, but wave propagation is highly affected by several factors, especially in urban areas, so signal strength is altered and does not provide a reliable parameter for localization on its own. Cell ID accuracy can be further enhanced by including a measure of Timing Advance (TA) in GSM/GPRS networks or Round Trip Time (RTT) in UMTS networks. TA and RTT use time offset information sent from the Base Transceiver Station (BTS) to adjust a mobile handset's relative transmit time so that its signal arrives at the BTS correctly aligned. These measurements can be used to determine the distance from the Mobile Station (MS/UE) to the BTS, further reducing the position error (see the sketch below).
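
As a rough illustration of the TA idea: in GSM, timing advance is reported in steps of one bit period (about 3.69 µs), and since the delay is round-trip, each step corresponds to roughly 550 m of distance to the BTS. A minimal sketch:

    # Sketch: estimate the MS-BTS distance ring from a GSM Timing
    # Advance value. One TA step = one GSM bit period (~3.69 us) of
    # round-trip delay, so distance per step = c * Tb / 2 ~= 553.5 m.
    C = 299_792_458        # speed of light, m/s
    TB = 48 / 13 * 1e-6    # GSM bit period, seconds (~3.6923 us)

    def ta_to_distance_m(ta: int) -> float:
        """Upper-bound distance implied by a timing advance value (0..63)."""
        return ta * C * TB / 2

    for ta in (0, 1, 10, 63):
        print(f"TA={ta:2d} -> up to ~{ta_to_distance_m(ta) / 1000:.2f} km")
    # TA=63 -> ~34.9 km, the classic GSM cell-radius limit.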

2. Triangulation/Trilateration:

In trigonometry and elementary geometry, triangulation is the process of finding the distance to a point by calculating the length of one side of a triangle formed by that point and two other reference points, given measurements of the sides and angles of the triangle. Such trigonometric methods are used for position determination, and can be distinguished as follows (a small angle-based sketch follows this list):

  • Distance-based (tri-)lateration (example: the Global Positioning System, GPS). For distance-based lateration, the position of an object is computed by measuring its distance from multiple reference points

  • Angle- or direction-based (tri-)angulation (examples: AOA - Angle of Arrival, TOA - Time of Arrival, AFLT - Advanced Forward Link Trilateration, E-OTD - Enhanced Observed Time Difference)
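
To make the angle-based case concrete, here is a minimal AOA-style triangulation sketch: two base stations at known positions each measure a bearing to the mobile, and the position is the intersection of the two bearing lines. All coordinates and angles are invented values.

    # Minimal AOA triangulation sketch: intersect two bearing lines
    # from base stations at known positions. Bearings are angles from
    # the +x axis, in degrees; the values are invented.
    import math

    def triangulate(p1, theta1_deg, p2, theta2_deg):
        (x1, y1), (x2, y2) = p1, p2
        t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
        # Line i: (x, y) = (xi + r*cos(ti), yi + r*sin(ti)); solve for r1.
        denom = math.cos(t1) * math.sin(t2) - math.sin(t1) * math.cos(t2)
        r1 = ((x2 - x1) * math.sin(t2) - (y2 - y1) * math.cos(t2)) / denom
        return (x1 + r1 * math.cos(t1), y1 + r1 * math.sin(t1))

    # BS1 at the origin sees the MS at 45 degrees; BS2 at (2000, 0) at 135.
    print(triangulate((0, 0), 45, (2000, 0), 135))  # -> (1000.0, 1000.0)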


3. Proximity:

Proximity is based on determining the position of an object relative to a nearby, well-known reference place. The fundamental sub-methods include (a fingerprint-matching sketch follows this list):

  • Monitoring and matching the terminal's measurements against a database of fingerprinted locations (with RSSIs from different base stations)

  • Monitoring of (WLAN radio) access points: here it is evaluated whether a terminal is in the range of one or several APs.
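
A minimal sketch of the first sub-method, RSSI fingerprint matching: a database maps known locations to the signal strengths observed there, and a terminal's live measurements are matched to the nearest fingerprint in signal space. All names and dBm values are invented.

    # Minimal RSSI fingerprinting sketch: match a live measurement
    # vector against a database of location -> RSSI fingerprints using
    # squared Euclidean distance in signal space. Values are invented.
    FINGERPRINTS = {
        "food court":  {"BS1": -60, "BS2": -75, "BS3": -82},
        "parking lot": {"BS1": -80, "BS2": -62, "BS3": -70},
        "main gate":   {"BS1": -72, "BS2": -70, "BS3": -58},
    }

    def match(measured):
        def dist(fp):
            return sum((fp[bs] - measured.get(bs, -100)) ** 2 for bs in fp)
        return min(FINGERPRINTS, key=lambda loc: dist(FINGERPRINTS[loc]))

    print(match({"BS1": -61, "BS2": -74, "BS3": -80}))  # -> "food court"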


Localization Categories

Different localization principles may be applied to gain position information about an object that is to be tracked. Four different categories can be distinguished:

  • Network-based: All necessary measurements are performed by the network (by one or several base stations). The measurements are usually sent to a common location centre in the core network, which performs the final computation of the terminals' positions.

  • Terminal-based: In the terminal-based localization approach, the terminal itself performs the position determination. The obvious disadvantage is increased terminal complexity; the increased demands on computation power and equipment suggest that this method is only partly applicable to legacy terminals.

  • Network-assisted: Similar to terminal-based positioning, network-assisted positioning implies that the final calculation of the terminal's position is performed by the terminal. The difference is that assistance data is sent by the network, either on request or in a push manner.

  • Terminal-assisted: This too is a hybrid of the above methods. The terminal measures reference signals from surrounding base stations and provides feedback reports to the network; the final position computation takes place in a central location centre within the network.


Accuracy can be increased further through the adoption of hybrid techniques that combine these approaches.



Types of Positioning Techniques:

So we can classify the existing positioning techniques currently deployed by wireless operators around the world as follows:



Monday, January 4, 2010

Future of Multicast & Broadcast Services in Advanced Wireless Networks

With the evolution of the smartphone and the exponentially growing market for high-speed multimedia services, the network needs to be smarter to deliver an exhilarating user experience. With the transformation of the mobile devices industry, led by the advent of successful smartphones such as the BlackBerry and Apple iPhone, users have become more data-hungry and demand more interactive services, loading the mobile network operator's network. Multicast and Broadcast Services (MBS) is a solution that will not only cater to this need efficiently but also attract a large subscriber base. MBS offers real-time streaming services, audio/video on demand, multiplayer online gaming, localized services, news, advertisements, and stock updates, bringing the most anticipated services to your fingertips.

MBS in 3rd and 4th generation wireless systems requires efficient network resource utilization in the access and core networks, along with scalable and reliable service platforms. It should also incorporate mobility aspects to continuously deliver multimedia information over an efficient air interface.

The major MBS technologies used in various 3G/4G deployment models are MediaFLO by Qualcomm, DVB-H (Digital Video Broadcasting - Handheld) by the DVB Project, MBMS by 3GPP, and BCMCS by 3GPP2. These technologies have garnered much attention for the revenues they can bring to terminal suppliers, network equipment suppliers, mobile operators, broadcast operators, service providers, and even governments.

Multicast Broadcast Services Technologies

The main analysis, considering the old as well as the newly evolving use cases and the MBS technologies supporting these different services, can be mapped as follows:

Considering the above use-cases we can draw insights:

1. Selection: The selection of a particular MBS technology by the mobile network operator should be based on the following criteria:

2. Cost: For heavy-duty broadcast applications, the resources required would be greater in a 3G network than in a broadcast-only network such as MediaFLO or DVB-H, and hence so would the cost. For light applications and highly interactive applications, MBMS or BCMCS would be the ideal choice, saving resources by multicasting to the subscribed group of users instead of broadcasting to every user in the network. Also, thanks to the availability of an uplink channel, highly interactive applications can easily be supported on the mobile terminal, providing a better user experience. From a unicasting perspective, multicast brings considerable resource savings in the core and radio access networks: the number of radio bearers is smaller than the number of users, whereas in unicast transmission the number of bearers equals the number of users (see the sketch below).
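
A toy illustration of that last point, with invented numbers: for a popular stream, unicast needs one radio bearer per viewer per cell, while multicast needs at most one bearer per cell that has at least one viewer.

    # Toy comparison of radio bearers needed to stream one channel:
    # unicast = one bearer per viewer; multicast = one bearer per cell
    # with at least one viewer. Viewer counts per cell are invented.
    viewers_per_cell = [120, 35, 0, 8, 64, 0, 211]

    unicast_bearers = sum(viewers_per_cell)
    multicast_bearers = sum(1 for v in viewers_per_cell if v > 0)

    print(f"unicast:   {unicast_bearers} bearers")    # 438
    print(f"multicast: {multicast_bearers} bearers")  # 5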

3. Reach: MediaFLO and DVB-H have a larger cell size and hence a larger footprint, which requires fewer base stations to cover groups of subscribers. On the other hand, given the existing vast coverage of the 2G/3G cellular network, those base stations can easily be upgraded with MBMS/BCMCS capabilities, achieving comparatively greater reach despite the smaller individual footprint.

4. Interactivity: Broadcast-only networks are limited by the lack of a return channel and hence offer no interactivity; however, interactivity can be implemented by using the network operator's feedback channel.

5. Mobile Terminal: In the current scenario, for specific applications such as live TV, broadcast-only technologies like MediaFLO or DVB-H might prove more efficient, but the downside is that corresponding handsets must be available to receive such broadcasts, which is an additional cost for the MNOs.

6. Business Implications: Broadcast and multicast are complementary technologies: broadcast can be used to stimulate users to subscribe to services, while multicast caters to specific end users who eventually subscribe to the specific services that generate revenues for the operators.

7. Mobile Trend: There is a significant growing trend towards a large number of interactive applications with the advent of the mobile web, driven by the availability of smartphones with larger form factors and advanced capabilities. So most next-generation networks will be equipped with cutting-edge, resource-efficient technologies supporting heavy-duty streaming while at the same time supporting a higher level of interactivity and a richer user experience, with better continuous data connectivity and seamless mobility.

Thus, with a strong MBS technology selection by the MNO and a lucrative business model, a smart telecom value chain with higher-order benefits is possible.

Tuesday, November 3, 2009

Farming Servers? Make bucks out of the Clouds!!

Have you heard about cloud computing? What is it? Let's check it out... If you have a bunch or a grid of servers up 24x7, with many idle cycles, warming your desk's coffee mug all night long, then you have a business opportunity to harness all that excess compute power. These resources can be leased out to work on big, complex problems, to software developers, or to web applications like Facebook with huge amounts of data flowing back and forth. The business model drawn from this concept is called "utility computing": paying for what you use on shared servers, just as you pay for a public utility (such as electricity, gas, and so on).

Cloud computing is essentially on-demand resource provisioning, categorized under "Infrastructure as a Service" (IaaS). For example, The New York Times processed terabytes of archival data using hundreds of Amazon EC2 cloud instances within 36 hours; had it not used EC2, processing the data would have taken days or months.
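
As a small taste of what on-demand provisioning looks like in practice, here is a sketch of renting EC2 capacity with the Python boto library (boto 2.x style). The AMI id, key pair, and instance type are placeholders, and AWS credentials are assumed to come from the environment.

    # Sketch: renting on-demand compute from Amazon EC2 via boto
    # (boto 2.x style). AMI id, key pair, and instance type are
    # placeholders; credentials are read from the environment.
    import boto.ec2

    conn = boto.ec2.connect_to_region("us-east-1")

    # Start a small fleet of worker instances, paying by the hour...
    reservation = conn.run_instances(
        "ami-12345678",            # hypothetical machine image
        min_count=1, max_count=10,
        instance_type="m1.small",
        key_name="my-keypair",
    )
    for inst in reservation.instances:
        print(inst.id, inst.state)

    # ...and terminate them when the batch job is done, stopping the meter.
    conn.terminate_instances([i.id for i in reservation.instances])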

Top "foggers" in the cloud computing domain:

Amazon Web Services
Google App Engine
Microsoft Azure
Force.com
IBM
HP
Sun Microsystems
EMC/Mozy
Citrix, and many more in the cloud ecosystem

The most visible early adoption of cloud computing was in small and medium businesses - and startups - where the advantages of insignificant up-front capital expenditure, manageable costs, simple management, and rapid scaling are both compelling and clear. Mid-tier enterprises and SaaS (Software as a Service) providers have also followed SMBs in utilizing cloud services in significant numbers.

Cloud computing is now seen not just as "utility computing"; it has extended its portfolio of uses to Internet application hosting, databases, remote storage, on-demand storage, disaster recovery, application testing and development, batch computing jobs, and billing and log processing applications.

With the growing flair of cloud computing, and with intense competition between the major players as the industry structure transforms from concentrated to fragmented, foggers like Amazon are building large server farms to provide the services mentioned above (alongside e-commerce), putting everything up and harvesting in a big way to scale capacity to peaks in demand.

But none of this is foggy to the Verizons and AT&Ts of the telecom world: with the huge data centers and infrastructure they already have, all they need to do is start off with a row full of racks and "scale" upward. They can squeeze more money out of existing hardware and infrastructure while (potentially) being able to drop prices to become more attractive to SMBs.

As Narinder Singh, co-founder of a growing SaaS products and services provider, said: "Today's economic climate will force enterprises to pick technology winners and losers for their environment in order to cut costs, be more efficient and deliver business-relevant innovation. Cloud computing makes this seemingly impossible task a possibility – much more so than with traditional software. This is why we believe cloud computing will be counter cyclical, with SaaS and Platform as a Service (PaaS) investment accelerating, and traditional software spending declining."

- Neil Shah

Note: Please feel free to comment on your views on Cloud computing's present and future. Collective learning is appreciated.
