Mobile WiMAX faces future setbacks
Mobile WiMAX may become a spent technology even before it gains any commercial traction, a market research group warned.
According to Frost & Sullivan, unless spectrum auctions and commercial Mobile WiMAX rollouts compliant with Wave 2 Phase 2 certification gather momentum before the end of 2008, the market scope for the broadband wireless technology "will be insignificant."
The researchers added that the technology faces a number of challenges that are likely to make it unfeasible as a mobile "access" technology.
However, they tempered the bleak analysis by noting that the huge investment that has gone into Mobile WiMAX may not have been for nought. The group believed that the work carried out on Mobile WiMAX has the potential to spur new ventures, which could potentially lead Mobile WiMAX to merge with 3G LTE.
"Recent events have been unfavorable toward Mobile WiMAX," said Luke Thomas, program manager, Frost & Sullivan. He added that, "Sprint-Nextel Corp. recently announced a delay to the commercial rollout of its Mobile WiMAX service, Xohm, and has now stated that the first commercial service of Xohm will be in Baltimore in September 2008 and Washington DC and Chicago by Q4 2008, provided that the new WiMAX venture 'ClearWire' deal closes by Q4 2008."
Recent trends
Thomas said any operator looking at Mobile WiMAX has to consider the current environment where 97 percent of laptops are shipped with Wi-Fi technology.
3G LTE is expected to be a fully ratified standard by Q4 08 or 1H 09, with deployments slated for 2H 09 or Q1 10, offering peak data rates of up to 170Mbit/s.
He noted that the number of dual-mode Wi-Fi/cellular handsets is currently on the rise, with newer models emerging at lower costs and with better battery life. He stressed that Alcatel-Lucent, Ericsson, NEC Electronics, NextWave Wireless, Nokia, Nokia Siemens Networks and Sony Ericsson recently encouraged all interested parties to join an initiative to keep royalty levels for essential LTE patents in mobile devices below 10 percent of the retail price.
What seems unresolved
"It is still unclear if members of the WiMAX Forum have reached an agreement pertaining to the IP rights they possess for Mobile WiMAX. Hence, prominent members of the Forum formed the Open Patent Alliance to address this issue," said Thomas.
He added that 2009 would be the year when operators begin to realize that Mobile WiMAX can no longer be considered a feasible mobile broadband access technology. "In terms of indoor wireless broadband, Wi-Fi fits well in this space, and with the emergence of 802.11n, which includes MIMO, throughputs would be far better than what Mobile WiMAX can deliver," he noted.
With respect to outdoor mobile broadband environments, he said, users would expect Mobile WiMAX to hand off seamlessly to cellular networks in the absence of WiMAX reception. "In reality, this is not possible, as this technology is not backward compatible with existing cellular technologies," he stressed.
At a recent WiMAX Forum workshop in Dubai, participants accepted that Mobile WiMAX is not optimized to simultaneously handle both data and voice applications as efficiently as high-speed packet access (HSPA) or 3G LTE. It is therefore unclear whether the initial client devices for Mobile WiMAX (ultramobile PCs or tablet devices) will meet with any degree of consumer receptiveness.
"While the Nokia N810 tablet will retail at $440 for Xohm users later this year, it is still ambiguous if consumers will want one mobile device for voice, based on cellular technology and another for personal broadband based on Mobile WiMAX," said Thomas, adding that, "This is relevant, considering that HSPA coupled with Wi-Fi can do both in a single mobile device."
- John Walko
EE Times Europe
Wednesday, June 25, 2008
New iPhone carries $173 tag, reveals virtual teardown
Apple Inc.'s second-generation iPhone, announced this month, is expected to carry an initial hardware BOM and manufacturing cost of $173, according to a "virtual teardown" conducted by iSuppli Corp. The virtual teardown offers preliminary analysis and estimates of the iPhone's content, suppliers and cost structure.
"At a hardware BOM and manufacturing cost of $173, the new iPhone is significantly less expensive to produce than the first-generation product, despite major improvements in the product's functionality and unique usability, due to the addition of 3G communications," said Jagdish Rebello, director and principal analyst for iSuppli. "The original 8Gbyte iPhone carried a cost of $226 after component price reductions, giving the new product a 23 percent hardware cost reduction due to component price declines."
iSuppli's preliminary virtual teardown estimate covers the 8Gbyte 3G iPhone's hardware costs only. It doesn't include other costs, such as software development, shipping and distribution, packaging, and the miscellaneous accessories included with each phone.
New business model
With the second-generation iPhone, Apple is making a significant departure in its pricing strategy.
"The original 2G phone was sold at an unsubsidized price of $499," Rebello noted. "However, at a retail price of $199 for the low-end 8Gbyte version of the new 3G model, wireless communications service carriers will be selling the product at a subsidized rate, using a common business model for the mobile handset market."
Based on iSuppli estimates, the subsidy paid by wireless carriers to Apple will be about $300 per iPhone. "This means that with subsidies from carriers, Apple will be selling the 8Gbyte version of the second-generation iPhone to carriers at an effective price of about $499 per unit, the same as the original product," Rebello explained.
For the first version of the iPhone, Apple received a portion of the wireless carriers' revenue from service subscriptions. For the second-generation version, Apple is not getting any service revenue, making it more imperative that the company turn a profit on the actual hardware through carrier subsidies.
"Hardware is vital to Apple profits, valuation and revenue in the consumer electronics and wireless communications realms," Rebello said. "In fact, two-thirds of Apple's revenue from the iPod still is derived from hardware, while only one third is from the iTunes service and accessories. The second-generation iPhone is no exception."
iSuppli observed that Apple's iPod and iPhone products typically are priced about 50 percent more than their BOM and manufacturing costs. With the new iPhone sold at $199 and the estimated subsidy of $300, Apple will achieve an even higher BOM/manufacturing margin, noted iSuppli.
Future costs
Like all electronic products, the 3G iPhone will see its BOM cost decrease over time as component prices decline. The BOM/manufacturing cost of the second-generation iPhone will decrease to $148 in 2009, down about 14 percent from $173 in 2008, according to data from iSuppli's Mobile Handset Cost Model.
"If the 3G iPhone design is unchanged, the cost will decline to $126 in 2012," said Tina Teng, wireless communications analyst at iSuppli.
Nokia buys 52% Symbian shares for $410M
Nokia Corp. has announced a cash offer to acquire all of the shares of Symbian Ltd that Nokia does not already own, at a price of about $5.67 per share, or approximately $410 million for the 52 percent of Symbian it does not hold (valuing the whole company at roughly $790 million).
The Finnish handset maker has received irrevocable undertakings from Sony Ericsson Mobile Communications AB, Telefonaktiebolaget LM Ericsson, Panasonic Mobile Communications Co. Ltd and Siemens International Holding BV to accept the offer, representing approximately 91 percent of the Symbian shares subject to the offer. Nokia also expects Samsung Electronics Co. Ltd. to accept the offer.
Symbian is the software company that develops and licenses Symbian OS, the open OS for mobile devices. User interfaces designed for Symbian OS include S60 from Nokia; MOAP(S), used on NTT DoCoMo's 3G network; and UIQ, designed by UIQ Technology, a joint venture between Motorola and Sony Ericsson.
The acquisition is a fundamental step in the establishment of the Symbian Foundation, announced by Nokia together with AT&T, LG Electronics, Motorola, NTT DoCoMo, Samsung, Sony Ericsson, STMicroelectronics, Texas Instruments and Vodafone.
"This is a significant milestone in our software strategy," said Olli-Pekka Kallasvuo, CEO of Nokia. "Symbian is already the leading open platform for mobile devices. Through this acquisition and the establishment of the Symbian Foundation, it will undisputedly be the most attractive platform for mobile innovation. This will drive the development of new and compelling, web-enabled applications to delight a new generation of consumers."
"Ten years ago, Symbian was established by far sighted players to offer an advanced open OS and software skills to the whole mobile industry", said Nigel Clifford, CEO of Symbian. "Our vision is to become the most widely used software platform on the planet and indeed today Symbian OS leads its market by any measure. Today's announcement is a bold new step to achieve that vision by embracing a complete and proven platform, offered in an open way, designed to stimulate innovation, which is at the heart of everything we do."
Nokia expects the acquisition to be completed during Q4, subject to regulatory approval and customary closing conditions. After the closing, all Symbian employees will become Nokia employees.
Coming soon: Open mobile platform
Nokia Corp.'s announcement that it would pay about $410 million for the 52 percent of Symbian Ltd it does not already own has sealed the deal to create the foundation that will develop an open, free-for-use platform for mobile phones.
Epoch making. Unprecedented. Setting mobile software free. These were just some of the typical superlatives used by the senior executives from Symbian, Nokia, Sony-Ericsson, Motorola, Texas Instruments, STMicroelectronics, Vodafone and others at the London conference where Nokia disclosed its planned buyout.
Well, it is certainly a bold and innovative move by Nokia, and one that will be hugely significant for the mobile software sector. "Typically selfless and self-interested," commented David Levin, a former CEO of Symbian and now CEO of United Business Media, the company that owns TechInsights, publisher of EE Times.
What a pity, then, that the assembled executives from nine of the companies involved in this shrewd, logical, grand and ambitious scheme could not, under questioning, bring themselves to admit that it is also a defensive response to efforts by Microsoft, Google (with its delayed Android project) and, to a lesser extent, Apple with the iPhone, as well as a slew of other open source platforms such as the growing LiMo Foundation, to muscle in on the mobile phone OS sector.
Uniting Symbian OS with the S60 platform, Motorola's UIQ and DoCoMo's MOAP is likely to take some time (two years to full completion was the target mentioned) and to tax the "hugely experienced and talented" engineers Nokia executive VP Kai Oistamo said he is looking forward to welcoming onto Nokia's payroll.
Elements of the platform being created will be available to the developer community as soon as the Symbian Foundation is up and running, expected later this year or in early 2009, once Nokia has completed the purchase of the shares owned by Ericsson, Sony Ericsson, Matsushita, Siemens and Samsung in Symbian Ltd. Over the next two years, then, we will see the integration of the three user interface systems into one, with a release of code sometime in 2011.
The Foundation, backed by five of the top mobile phone OEMs, three major operators and two chipmakers, would be "open to all comers," commented Oistamo. It does not make non-fragmentation a condition of membership and licensing, but it is more than likely that there will be a compliance brand and a cold-shoulder approach to any party that attempts to fragment the system.
More direct input from chipmakers and network operators, as well as closer integration of the OS and user interface, should make the whole platform more stable and attractive to operators and the millions of application developers. Operators such as Vodafone have long bemoaned the fact that they have to accommodate too many platforms.
One analyst called the move "the ultimate manifestation of the 'boy scout effect,' where Nokia believes that a greater opportunity for all will result in more profits for Nokia itself than going it alone." We could not agree more with the assessment of Richard Windsor, of Nomura Securities.
Only time will tell whether this is a masterstroke that leads to total dominance or the straw that breaks the camel's back.
There are two big risks here. So far, Symbian has been a tightly controlled ecosystem in which fragmentation has not been allowed to happen. That control point now seems in danger of being breached. This, combined with standardization by consensus, could give proprietary systems such as Windows Mobile and Apple's iPhone platform an advantage in time-to-market and nimbleness.
- John Walko
EE Times Europe
Monday, June 23, 2008
mimoOn taps Nokia wireless experts
Software-defined radio (SDR) solutions provider mimoOn GmbH has added key wireless and software engineers from Nokia's research locations in Bochum, Germany and Helsinki, Finland. The team includes members with backgrounds and focus in software-defined radio, baseband design, software architectures, and UMTS/Long Term Evolution (LTE) PHY and protocol layer implementation for wireless handsets and base stations.
"The mobile wireless world is transitioning to fourth-generation standards in order to support high bandwidth applications such as mobile video, multimedia, Internet browsing, and interactive gaming. mimoOn is well positioned with its 4G LTE solutions which are finding market traction throughout the ecosystem, from macrocell, picocell, femtocell and test equipment manufacturers to semiconductor providers. These Nokia engineers will help propel the company securely into its next growth phase," commented Thomas Kaiser, CEO of mimoOn.
mimoOn's solutions are optimized for low memory footprint, as well as for processing power and power consumption. The PHY, MAC, RLC, PDCP, and RRC components are all fully featured, based on the Release 8 baseline specifications from 3GPP. In addition, mimoOn's solution is entirely software defined, so it is easy to modify or customize even after being deployed—an important feature as the standards for LTE will continue to be revised well into 2009.
mimoOn will demonstrate its LTE SDR solutions at the NGMN Industry Conference in Frankfurt from June 25-27.
Wireless sensor nets hunt for full 'killer' apps
Wireless sensor networks (WSNs) are not being fully utilized in commercial markets due to the lack of a 'killer application' to drive interest, a Plextek-led report for U.K. communications industry regulator Ofcom said.
The 10-month study was commissioned to examine technology developments in WSNs, along with market growth scenarios and what would be the spectrum implications.
Plextek worked with the University of St. Andrews and TWI Ltd, a Cambridge-based independent research and technology organization, on the research, which found that it is the traditional sensing applications that are currently commercially exploiting the advantages of WSNs.
The popularity of WSNs
The report suggested that WSNs may become more widely deployed over the next three to five years, with systems continuing to adopt existing license-free bands including 13.56MHz, 433MHz, 868MHz and 2.4GHz. The main issue for WSNs will be band crowding, especially at 2.4GHz with its increasing Wi-Fi use.
Ofcom commissioned the study as part of its "Tomorrow's Wireless World" R&D program into the future of communications technology.
"Our research produced some very interesting conclusions," said Steve Methley, senior consultant, Plextek. "The lack of a killer application may be due to limiting factors such as current cost of wireless nodes and lack of understanding or experience by end users, especially regarding 'real-world' reliability," he added.
Methley noted that there is also a need for further improvements in batteries and energy scavenging technologies.
Suggested strategies
"One possible movement toward a killer application is to let major systems integrators get involved," said Methley. "Such players will increasingly come on board when there is a need to take a professional approach to defining, installing and maintaining substantial wireless sensor networks," he added.
The study suggested that while existing unlicensed spectrum can adequately support WSNs, "a dramatic increase in use could prove problematic."
"Typical radio protocols such as the popular 802.15.4 standard are designed to be polite and to check for clear channels before transmitting," it noted. This may become a problem when bands become crowded. The standard suffers because of its politeness in the face of increasing Wi-Fi usage, particularly in the streaming applications. This may lead to the appearance that WSNs are unreliable, an important issue as the perception of unreliability is one of the key barriers identified for WSN adoption.
- John Walko
EE Times Europe
Freescale channels embedded
It takes a strong ego to climb aboard a large, well-established, well-liked company that's in distress and think you can turn it around. Ninety days into his tenure, Rich Beyer, the no-nonsense, straight-shooting chairman and CEO of Freescale Semiconductor Inc., still thinks he has what it takes. Beyer used the recent Freescale Technology Forum to lay out just how he plans to do it.
While it was no surprise to hear the new boss will continue to emphasize automotive, a sector in which Freescale has always done exceptionally well, he also took great pains to hammer home the centrality of embedded.
"Freescale, at its heart, is an embedded-processor company," Beyer said. "We serve those applications with additional functionality, such as analog, sensors, RF and software. That strength in processors is the essence of where our company is leading to," he added.
Underscoring that message, Freescale used the four-day conference to introduce the QorIQ, a multicore processor designed to enable advanced network processing, or what Freescale marketers call "the Net Effect."
The prospect for innovation
"The processor's flagship device will have eight cores, although it will take some time to develop the tools to support that level of functionality," said Lisa Su, chief technology officer, Freescale.
Beyer and his team of executives and technologists spoke as one at the forum on how the company's strengths and customer relationships will form the path to a turnaround. Along with embedded, automotive will continue to be an essential element, especially as hybrid vehicles raise the electronics bill of materials. Also cited as crucial to Freescale's future are wireless, analog and sensors, combined with a new emphasis on the fast-turnaround consumer market in general and the "green," health and network-processing areas in particular.
Learning from experience
While the new QorIQ processor signifies the top end of the embedded spectrum, the company is no less passionate about the lower-end, microcontroller segment. "We have a God-given right to be a leader in microcontrollers," said Henri Richard, senior VP and chief sales and marketing officer, referring to Freescale's long history in the space. Richard noted, however, that through its own missteps the company eventually ceded ground in the 8bit arena and made some poor distribution-channel decisions.
In an interview during the opening day of the forum, Beyer had no illusions about the task ahead of him. "Clearly, we have had a series of challenges over the past several years and have shared the pain with former parent Motorola," he said.
But the new boss believed that his experience as CEO of Intersil Corp. and Elantec Semiconductor leaves him well equipped for the road ahead. Asked about his knowledge of the market and the product families, he affirmed that he has dealt with such issues before, so that experience will come in handy. He described Freescale as in good, if not great, shape, adding, "This is a company that I want to be successful."
What must be done
Three months on the job have led Beyer to conclude that for Freescale to be successful, it must focus on its strengths and change its business practices and models in some areas right down to their roots.
"We do not have deeply in our gene pool the DNA for consumer products," he said. "We need to invest in markets that will see return in 18 to 24 months, but we also need to invest with more stable returns," he added. The marriage of the company's i.MX processing platform with the recently acquired SigmaTel Inc.'s low-cost analog/mixed-signal expertise is a move in that direction.
While SigmaTel hit it big with its iPod design win, Beyer is not betting the bank on a repeat. "We're not depending on hits, since you need an awful number of strikeouts to get those home runs," he said. "But we're in enough applications that we'll have many singles," he added.
Besides directing his business team to come up with a three-level product-development plan that calls for near-, mid- and long-term revenue returns, Beyer is also looking to improve customer execution, in part by not overcommitting on projects. "We try to do way too many things: It leads to failure," he said.
The combination of a push deeper into embedded and a strong integration story puts the company right up against Texas Instruments Inc. (TI). "TI has recently started to talk about itself in the context of embedded processors, but they're not really an embedded-processor company," said Beyer. "That's never been a central market for TI," he added.
What others think
TI begs to differ. When asked about Beyer's comment, Mark Dennison, VP, strategic marketing, TI, argued that the company has been shipping processors into embedded applications such as base stations, VoIP equipment and software-defined radio. TI also has a strong microcontroller line, exemplified by the MSP430.
"We've shipped a few billion ARM cores, and in February we launched the 3500 series," Dennison said. "I'm a little confused as to where Freescale is coming from," he added.
Jeff Bier, president of research and analysis firm BDTI, agreed. "To say TI is not embedded is ridiculous," he said. To Will Strauss, president of Forward Concepts, it's all semantics, given the wide range of "embedded processor" definitions. "Put a Pentium in anything besides a PC, and it's embedded," he stressed.
Dennison also commented on Freescale's claimed integration and analog advantage, pointing to TI's power management, amplifier, RF and converter lineup. "We can integrate all those technologies," he said, "whether it is on stacked dice, multichip modules or package-on-package."
- Patrick Mannion
EE Times
ABI: Wireless HDTV to hit 1M installations in 2012
After wireless phones, wireless Internet and wireless home networks, the next attraction coming to the living room is wireless HDTV. At present, the market is still in its incubation stage, with fewer than 100,000 devices expected to ship this year.
According to a study from ABI Research, optimistic forecasts point to 2012 as the earliest year for having 1 million wireless HDTV installations worldwide.
Meanwhile, a "battle of technologies" is being fought. There are three contending systems, loosely characterized as 5GHz, 60GHz and UWB.
"5GHz technology is better understood and more proven," says principal analyst Steve Wilson, "but achieving the required data rates requires new approaches and more complex solutions. UWB technology has bandwidth advantages at in-room distances but drops rapidly at greater ranges. 60GHz allows high data rates, but so far only one company is even close to a viable solution."
Small numbers of 5GHz and UWB devices are currently shipping; demo products of 60GHz systems are expected early next year.
"Over the next two to three years, we're going to see one or two of these wireless HDTV approaches emerge as the primary ones," says Wilson.
Who would want wireless HDTV and why? Wireless will simplify some installations and allow more flexibility in positioning TVs. There are both commercial applications—digital signage, for example—and domestic applications such as wall-mounting a flat-screen HDTV. "The initial sweet spot in the market is where wired installation would be difficult or complicated," says Wilson.
All the wireless HDTV silicon vendors are venture-backed startups, and most established wireless vendors are waiting to see how the market evolves. Product manufacturers are moving forward with different strategies: some, like Westinghouse and Belkin, are initially targeting commercial and custom installers, where there is clear value-add. In contrast, TV manufacturers such as Sharp and Hitachi are targeting buyers of their latest technology, offering design-oriented products with a wireless connectivity option.
Monday, June 16, 2008
LTE subscribers to boom to 32M in 2013
There will be over 32 million subscribers using Long Term Evolution (LTE) networks by 2013, just three years after it is expected to go commercial, forecasts ABI Research. Three of the largest mobile operators—China Mobile, Vodafone and Verizon Wireless—have announced plans to adopt LTE.
Asia Pacific will account for the largest regional share. "ABI Research anticipates about 12 million Asia-Pacific LTE network subscribers in 2013," says senior analyst Nadine Manjaro. "The remainder will be split about 60-40 percent between Western Europe and North America," implying roughly 12 million subscribers in Western Europe and 8 million in North America.
Moreover, LTE commitments from NTT DoCoMo and KDDI in Japan are said to further boost adoption.
The Chinese government's long delay in issuing 3G licenses may also become a factor driving LTE in that country.
"It wouldn't surprise me to see some operators skip over 3G and go straight to LTE," says Manjaro. "Although China's own TD-SCDMA 3G technology will be deployed on a small scale during the [Beijing] Olympics, I can't see operators spending billions to implement that or any other 3G technology if they will just have to upgrade within a year or two."
Since LTE deployment involves new hardware and software, several industry sectors stand to benefit. Before 2010, it will be vendors of the test equipment used to ensure network interoperability and performance. Next will come vendors of the required network infrastructure equipment itself. Finally, it will be device manufacturers.
Because LTE is primarily about data, not voice, its first phase will see devices such as USB dongles for PCs: ABI estimates 53 million to ship by 2013. Because LTE will compete directly with cable and DSL services, in-home modems will also see volume shipments, as will mobile Internet devices and ultramobile PCs. Manjaro calls the device market "a huge opportunity."
Enhance VoIP telephony with HD Voice
By Daniel Hartnett
Infineon Technologies AG
Do you remember hearing FM radio for the first time, or listening to your first CD after years of scratched vinyl? That's the experience high-definition (HD) sound brings to a telephone. As VoIP becomes commoditized, the focus of system developers and service providers shifts from providing VoIP to providing higher-quality VoIP.
Taking advantage of the strong marketing behind HDTV, HD-sound is now the accepted brand name for wideband voice. It allows service providers to offer pristine audio quality over their IP phone-enabled home gateways. Traditional "narrowband" telephony was a compromise between speech intelligibility and data rates, providing an acoustic bandwidth of 300Hz to 3.4kHz. In contrast, HD-sound uses wideband technologies to offer a transmission range of 50Hz to 7kHz or beyond.
The result is significantly increased intelligibility and a much more natural sound not only for voice conversation, but also for a range of other audio applications, such as MP3 and Internet radio. This article attempts to address the hurdles associated with delivering HD performance in telephony, and explore its market potential.
Wideband telephony
"Wideband" telephony specifies a transmission range of 150Hz to 6.3kHz. While this is not CD bandwidth (20Hz up to 20kHz), the increased bandwidth compared to narrowband offers significantly improved intelligibility.
Wideband telephony was standardized for ISDN with the G.722 codec about 20 years ago, but never really enjoyed wide deployment. G.722 did, however, make its way into journalism, where wideband G.722 links are often used for voice transmission from remote locations as an alternative to the poor quality of standard telephone lines.
As IP phones already have powerful signal processing capabilities for narrowband speech compression algorithms, wideband codecs can easily be handled by the voice engines within IP phones. If the ADCs and DACs support a 16kHz sampling rate, wideband telephony on an IP phone comes with relatively low additional overhead. Another factor driving the development of wideband telephony is the new DECT standard CAT-iq, which also specifies G.722 as the required codec for HD Voice.
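The 16kHz figure follows from the Nyquist criterion: the sampling rate must be at least twice the highest audio frequency to be carried. A minimal sketch, using the bandwidth figures quoted above:

```python
# Why a 16kHz sampling rate enables wideband voice: by the Nyquist
# criterion, a sampling rate fs can only represent audio up to fs/2.
# The bandwidth figures are the ones quoted in the article.

def max_audio_frequency(fs_hz: int) -> float:
    """Highest representable audio frequency at sampling rate fs."""
    return fs_hz / 2

NARROWBAND_FS = 8_000   # classic telephony sampling rate
WIDEBAND_FS = 16_000    # sampling rate assumed for G.722-style wideband

print(max_audio_frequency(NARROWBAND_FS))  # 4000.0 Hz: fits 300Hz-3.4kHz
print(max_audio_frequency(WIDEBAND_FS))    # 8000.0 Hz: fits 50Hz-7kHz
```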
PC soundcards support 8-, 16-, 32-, 44.1- and 48kHz sampling rates, and generally have the necessary processing power for wideband codecs. PC-based softphone applications like Skype already have a huge footprint in the market.
Most enterprise IP phones, like Siemens' OpenStage series, already support wideband. The enterprise market for wideband is an excellent proof of concept, as it is much easier there to control the hardware and software running on the endpoints. The deployment of HD voice in the residential space is much more difficult. Wideband requires that both parties in a call have wideband-capable hardware and that the phones immediately shift up to the best codec available.
In the past, VoIP had to contend with a less-than-solid reputation. Since its early days, when only brave pioneers would attempt a connection over the Internet, broadband users have been quick to take up the offerings of new players in the voice service provider market. The traditional trade-off was quality against price.
Today, VoIP quality has improved beyond recognition and is easily comparable to that of POTS services. As ample bandwidth and processing power in customer premises equipment become the norm, the possibility of using more bandwidth for vastly improved voice quality is very real and imminent. This is where providers can differentiate their services.
HD VoIP
VoIP is not just VoIP: HD Sound makes it marketable above and beyond price. A POTS phone call is thin and almost monotone in comparison to a well-implemented HD Sound call, which sounds "warmer" because all the nuances of the voice are captured. Mistaking "s" for "f" becomes a thing of the past. The possibilities this brings are manifold, but so are the hurdles associated with bringing it to a wide audience.
To optimize their wideband implementations, it is vital that phone manufacturers (fixed and cordless) adhere to some important rules: the electro-acoustic components, especially the handset receiver and the hands-free loudspeaker, have to be able to reproduce the whole wideband frequency range with low distortion and high fidelity in their respective mountings.
This poses huge challenges to the device designers, especially for devices with a small form factor like cordless or mobile phones. First-class voice quality does come at a price, but one assumes that a mass market for the application will regulate this.
On the speakerphone side, the following is key. It is advisable to encase the speakerphone in order to avoid echo within the housing and to emphasize the lower frequencies like a home Hi-Fi speaker, which is also completely sealed.
In any VoIP phone (narrowband or wideband), delay is the most difficult hurdle to overcome in the quest for full-duplex performance. The human ear is insensitive to the echo that immediately follows the spoken word; otherwise you would always hear a strong echo inside any given room.
But the higher the delay between one's speech and its echo, the more sensitive the ear becomes. That's why you always hear echo in a church. In a standard IP network packet delays of more than 100ms are possible—that's one BIG church.
For this reason, additional effort has to be spent on reducing echo. The echo canceller inside a phone behaves like the human ear: it estimates the echo path, calculates a replica of the echo and subtracts it from the microphone signal. This can be a difficult job, as it must work in any environment in which a phone can reside.
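One common way to implement this estimate-and-subtract loop is a normalized least-mean-squares (NLMS) adaptive filter. The sketch below is a toy illustration of that approach; the filter length, step size and simulated echo path are arbitrary values for demonstration, not any vendor's implementation.

```python
import numpy as np

# Toy acoustic echo canceller using an NLMS adaptive filter: estimate
# the echo path, synthesize a replica of the echo and subtract it from
# the microphone signal.

rng = np.random.default_rng(0)
fs = 16_000                        # wideband sampling rate
far_end = rng.standard_normal(fs)  # 1s of far-end (loudspeaker) signal

# Unknown room echo path the canceller must learn (short decaying FIR).
true_path = np.array([0.6, 0.3, 0.15, 0.05])
mic = np.convolve(far_end, true_path)[: len(far_end)]  # echo at the mic

taps, mu, eps = 16, 0.5, 1e-6
w = np.zeros(taps)                 # adaptive estimate of the echo path
x_buf = np.zeros(taps)             # most recent far-end samples

for n in range(len(far_end)):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = far_end[n]
    echo_estimate = w @ x_buf
    err = mic[n] - echo_estimate                   # residual sent onward
    w += mu * err * x_buf / (x_buf @ x_buf + eps)  # NLMS update

print("learned path:", np.round(w[:4], 3))  # approaches true_path
```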
Added markets
HD Voice opens a myriad of possibilities for system vendors and service providers to access new markets.
Interactive voice response: Can you imagine trying to book a flight with the aid of a call service using pre-recorded voice samples? Hardly. Today's voice-activated services mainly serve to drive people mad, unable to understand even the slightest deviation from the trained version of a word.
With wideband, the nuances in the human voice can be captured more easily, making voice-activated services a viable market with huge potential. Not only could we upgrade our broadband or phone services without actually speaking to anybody, but booking a flight, a hotel or a train would all become real possibilities.
Speech recognition systems will also benefit from the increased bandwidth and provide a better recognition rate, especially because sibilants can be recognized much more reliably. (A sibilant is the "s" sound we make when we talk; in a narrowband call, "f" is often mistaken for "s.")
A text-to-speech (TTS) system converts normal language text into speech (using synthesized speech). The quality of a speech synthesizer is judged by its similarity to the human voice, and by its ability to be understood. An intelligible text-to-speech program allows people with visual impairments or reading disabilities to listen to written works on a telephone or PC.
Automatic translation: Voice samples are translated to text in real time.
Automotive speech recognition: Uses voice to command various functions in a car (wipers, radio, windows and so on, though not to drive it!).
Speech biometric recognition: Speaker-dependent authentication. Possible applications include workplaces or anywhere that requires some form of identification.
Dictation
Hands-free computing: Speech recognition for commands on a PC for disabled users.
Home automation: Uses voice to command things we usually need a switch for, e.g. closing the shutters, turning off the lights, turning on the heating.
Medical transcription: The practice of modern medicine dictates that physicians spend more time serving patient needs than creating documents in order to make financial ends meet. More modern methods of document creation are being implemented through computers and the Internet. Voice recognition (VR) is one of these new-age technologies: with the power to write up to 200 words per minute with 99 percent accuracy, voice recognition has freed physicians from the shackles of traditional transcription services.
Web Radio on a cordless VoIP phone: The bandwidth provided by today's broadband connections is more than adequate to bring wideband down to the residential end user. To this end, the DECT Forum has initiated CAT-iq (Cordless Advanced Technology - Internet and Quality), a new cordless phone standard to tap the potential of wideband in VoIP endpoints.
Several steps are envisaged:
HD Voice in Cordless Phones: Vendors are striving to bring new products to the market that support HD voice. As discussed earlier this means upgrading the phones to include improved microphones and speakerphones to get the most out of the wideband codec.
Conferencing in Wideband quality: With improved hardware, it will be possible to add new features like 3 party conferencing in pristine quality, bringing a whole new experience to the user.
Web Radio: As part of rolling out new services, future CAT-iq products will support features like news tickers and, more notably, Web radio in HD quality. This promises to be the killer application for VoIP in the home, marrying the power of the Internet with HD audio quality. Now Irish expatriates in Australia can listen to Radio Cork, and Chinese in Munich can listen to Shanghai FM, without firing up the PC down in the basement.
Streaming audio content: CAT-iq will enable cordless equipment vendors and service providers to enter markets previously the domain of the hi-fi specialists. Audio speakers containing a DECT receiver would be the perfect solution for distributing audio content around the home, even between different floors. The DECT air interface is not only stable but also offers power consumption well suited to this application.
Tuesday, June 10, 2008
Intel's Atom climbs higher ground
There is no place like Taipei for mobile mania, the city where mopeds are the vehicles of choice. So, it is no surprise Intel's Atom and its rivals from Nvidia, Via and others gathered at Computex Taipei last week to pop a few wheelies about new kinds of mobile product concepts.
These days, Intel loves to egg on this mobile mania with its 2W+ Atom. But make no mistake about its agenda.
The world's biggest and most narrowly focused semiconductor maker craves growth. Desktops have peaked, servers are humming along at a moderate pace, and only notebooks are really growing at a lively pace.
So the x86 giant wants to generate a little excitement about new categories of products its marketing managers dream up in their spare time. These days, Intel is generating names faster than it can come up with rational definitions: net-tops, net-books, ultramobile PCs, mobile Internet devices.
Next innovations
Taipei has always been high on such visions from the smoke-and-mirrors department in Santa Clara. When I first traveled to Computex in 1990, it was the season of the palmtop PC: little clamshell devices with chiclet keyboards, black-and-white LCDs and dumbed-down versions of Windows.
We were so excited about palmtop PCs. Every self-respecting ODM in Taiwan had a prototype palmtop at its booth that year, and every Computex participant wanted to be the first to have one. Within a year, the whole category was dead.
At best, the devices slipped into your pocket with all the grace of a grapefruit. They were nearly as useful.
Scroll ahead nearly 20 years and see how little we have learned. The Taiwan industry is still gullible for any new system concept that promises something better than a single-digit profit.
But that is not the mobile Internet device. Nor is it the net-top, net-book or next-generation mobo-mumbo-jumbo. "These are systems that will bring the next billion users to the Internet because they will be cheap," said Intel. Cheap, certainly: less than $300, probably less than $200.
The only expensive item is the Intel processor. Everything else can be a commodity. Such is the vision of mobile computing from Santa Clara.
At present, everyone is drinking the Kool-Aid. Seeing its future in this zero-billion-dollar market, Nvidia has rolled out Tegra, a smaller, lower-power alternative to Atom that is just as potent and as expensive. Via has a Nano CPU that was awarded a Best of Computex prize as the CPU from the Taiwan homeland. Even the normally sober Broadcom Corp. came to Taipei talking nonsense about media codecs for mobile Internet devices and ultramobile PCs.
The problem with many of these devices is not that they lack a powerful enough CPU or video decoder. The problem is that there are no good display and input technologies that can be easily tucked into a pocket, then rolled out to let human eyes and fingers do real work or have fun.
Engineers have no power to redesign pockets, fingers or eyes. But sometimes, swept up in gadget lust, they forget these truths.
I'll make one exception. It's possible some net-tops and net-books may actually be new versions of entry-level desktop and notebook PCs.
Craig Mathias, one of my favorite wireless analysts, raves about his Asus Eee PC. Mathias said he can't wait to get his fingers on an MSI Wind because it is small and inexpensive like the Eee PC, but has a bigger 80Gbyte hard drive and a 10-inch display.
Such systems are not new product concepts, but stepwise extensions of old ones. They will not open up new markets but create new niches in existing ones.
Heading for the future
The future of the mobile market lies in the smart phone. Apple has shown with the iPhone how to create a useful and pocketable device for Web access and telephony.
The first time I saw Andy Bechtolsheim carrying one at a conference he was so excited about it he nearly jumped out of his signature Silicon Valley sandals. "Finally, someone has found a way to put the Internet experience in your pocket," he told me.
The iPhone doesn't need an x86 chip, much as that must frustrate Paul Otellini. Based on trends my colleagues at Portelligent Inc. have seen in their teardowns, it may not even need a separate applications processor in the near future.
A simple cellular baseband with an extra ARM core or two will probably do quite nicely for these systems. Last-generation hardware is just fine.
But these mobile systems of the future will need plenty of creative software to make use of new input technologies such as multitouch displays. That's something Intel and Taiwan generally put at the end of the product-creation cycle, as icing on the hardware cake.
That's why the mobile future is coming not from Santa Clara but from Cupertino. Even more than 18 months after this future was shown to the world, Intel and Taiwan have still not quite figured out how to replicate any piece of it except the mobile mania.
- Rick Merritt
EE Times
Compal favors Wavesat's 4G broadband chipset
Wavesat has announced that Compal Communications Inc. has selected it to co-develop products for Compal's Mobile WiMAX product line, starting with a USB dongle. The collaboration between the two companies will leverage Wavesat's multimode Odyssey 8500 4G broadband chipset, providing access to multiple broadband wireless technologies, including WiMAX Wave 2, XG-PHS and Wi-Fi, with seamless migration to LTE.
The Odyssey 8500 chipset from Wavesat employs a 4G multicore architecture incorporating multiple ultralow-power DSPs. The SoC, manufactured using advanced embedded DRAM technology, requires no external memory, saving customers valuable real estate, cost and power in very small form-factor portable and mobile applications such as wireless USB dongles, mobile handsets and other consumer electronics devices, says Wavesat.
"We are pleased to be working with Wavesat to deliver WiMAX Wave 2 solutions with advanced functionality and low power consumption", said Marketson Ma, chief product manager for Compal Communications. "Wavesat's multimode architecture meets our goals for being the leader in providing innovative 4G broadband wireless terminal solutions to the market".
"Compal Communications is a company that can turn innovations into wireless reality very rapidly and we are extremely happy to have been chosen as a Mobile WiMAX technology partner and chipset supplier," said Raj Singh, president and CEO for Wavesat. "Our Odyssey family of products offers unique multimode capabilities which can easily be adapted to support today's broadband wireless technologies, and offer a seamless migration path to future 4G standards."
Huawei helps launch HK's first CDMA2000 1xEV-DO Rev. A network
Huawei Technologies Co. Ltd has been selected by Hong Kong telecom provider PCCW to help deploy the city's first CDMA2000 1X/1xEV-DO Rev. A network. The service will allow travelers in the city to roam on PCCW's network and experience seamless CDMA roaming and mobile data services.
Under the terms of the agreement, Huawei will provide its fourth-generation CDMA base transceiver station (BTS) solution to help PCCW construct a CDMA2000 1xEV-DO Rev. A network. Huawei's fourth-generation BTS features multi-standard convergence, high integration, all-IP architecture, energy conservation and environmental protection, and fully meets PCCW's business requirements for long-term development, as it reduces the operator's operational costs and ensures a smooth evolution beyond 3G.
"Operators nowadays are increasingly looking for long-term partners who are able to help them build their future-oriented CDMA networks," said Zhao Ming, VP of wireless product line, Huawei. "Huawei has a great deal of experience in developing CDMA technologies and is committed to providing mobile operators with customized solutions under the concept 'Green, Convergence, Broadband and Evolution'."
As of March 2008, Huawei had won 84 EV-DO commercial contracts and served CDMA operators such as China Unicom, Leap in North America, Reliance and TATA in India, and Telkom in Indonesia.
IBM splashes water on hot 3D chips
IBM's Zurich Research Laboratory recently demonstrated 3D chip stacks that are cooled with water. The company expects to commercialize such stacks for its multicore servers as early as 2013.
IBM plans to stack memory chips between processor cores to multiply interconnections by 100 times while reducing their feature size tenfold. To cool the stack at a rate of 180W per layer, water flows down 50µm channels between the stacked chips.
"Electrical interconnects are in a wiring crisis. The wiring does not scale the way transistors do it because the wire width is shrinking while their length is not," said Thomas Brunschwiler, researcher, IBM Zurich. "Our solution is to go to 3D to stack multicore dice and have the interconnections go in between them vertically, which can decrease their length by up to 1,000 times," he added.
IBM's paper on the approach, "Forced convective interlayer cooling in vertically integrated packages," received a Best Paper award at the IEEE ITherm conference, held recently in Florida. This marked the third consecutive year that IBM's Advanced Thermal Packaging team has won the award. The Zurich group claimed to be fixated on water cooling because it is up to 4,000 times more effective than air at removing heat from electronics.
Early this year, the same group described the water cooling method for IBM's Hydro-Cluster supercomputer. For the Hydro-Cluster Power 575, the group replaced heat sinks with water-filled copper plates above each core. The team predicted high-end IBM multicore computers would migrate from the copper-plate water-cooling method to the 3D chip-stack in five to 10 years.
The 3D water-cooled chip stacks will interleave processor cores and memory chips so that interconnects run vertically chip-to-chip through copper vias that are surrounded by silicon oxide. Thin-film soldering (using electroplating) enables the separate dice to be electrically bonded to the layers above and below them, with the insulating layers of silicon oxide separating the flowing water from the copper vias.
The power density increases dramatically in such 3D chip stacks, since enough heat gets trapped between layers to melt the cores. To solve the problem, IBM etched a liquid aqueduct into the silicon oxide on the back of each die. This creates a water-filled separating cavity with 10,000 pillars, each housing a copper via surrounded by silicon oxide. The cooling technique runs water through the aqueduct between each layer in the chip stack, enabling IBM to channel heat away from 3D multichip stacks of nearly any scale.
"The technology forces water between the layers in the chip stack, picking up the heat right at its source," said Brunschwiler. "We found that to create an efficient heat remover, we had to use a structure with very little resistance to the fluid flow. We found that round pillars aligned in the flow direction and put under pressure gave the best convective heat transfer," he added.
IBM packages the chip stacks in sealed pressurized silicon housing with an inlet reservoir on one side and an outlet reservoir on the other. The only way water can get from the inlet side of the silicon box to its outlet side is by going through the silicon oxide layers separating the layers of the 3D stack. Cool water enters a 3D chip stack and exits heated. The protected copper vias connect the chips vertically. After being forced through the layers between the chips in a stack, the heated water could be fed to the hot tap of the customer's plumbing, turning a data center's wasted heat into a means for reducing the data center's carbon footprint, according to IBM.
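A back-of-envelope check shows the water flow such a scheme implies. The 180W-per-layer figure comes from the article; the allowed coolant temperature rise and the eight-layer stack are our assumptions for illustration.

```python
# Back-of-envelope check on interlayer water cooling. The 180W-per-layer
# figure comes from the article; the inlet-to-outlet temperature rise is
# an assumed value for illustration.

HEAT_PER_LAYER_W = 180.0   # heat removed per stacked layer (article)
CP_WATER = 4186.0          # specific heat of water, J/(kg*K)
DELTA_T = 20.0             # assumed coolant temperature rise, K

# Steady state: Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)
m_dot = HEAT_PER_LAYER_W / (CP_WATER * DELTA_T)   # kg/s per layer
print(f"flow per layer: {m_dot * 1e3:.2f} g/s "
      f"({m_dot * 60:.3f} liters/min)")           # ~2.15 g/s per layer

# An 8-layer stack (cores interleaved with memory) scales linearly.
print(f"8-layer stack: {8 * m_dot * 60:.2f} liters/min")
```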
Meanwhile, the team plans to optimize the cooling structures for smaller chip dimensions, more interconnects and sophisticated heat transfer structures. In particular, the lab is experimenting with ways of adding extra cooling to the designated hotspots on cores.
Eventually, IBM envisions a hierarchy of cooling structures similar to those in the brain, which branch out to cover a large surface area combined with many interconnections among layers.
Friday, June 6, 2008
Targeting Niche Audiences - AOL's New Branding Strategy
By Scott Buresh
AOL, once considered a pioneer in Internet technology, has fallen on hard times over the years, unable to devise an effective branding strategy. A failed merger with Time Warner, a lack of focus on search while Google built an empire (the AOL search engine eventually began serving up Google results on its portal site) and a declining dial-up business have all contributed to the ongoing difficulties of AOL and its search engine.
However, AOL seems to have a new branding strategy in mind for the AOL search engine, which would revamp its services and target specific niches. And while many "analysts" claim that it is already a failure before the results are in, it is too soon to tell how this will affect AOL and the search engine that bears its name. Personally, I think it's a smart play for the company - and something that bears watching. If the branding strategy is successful, another huge company may want to follow AOL's example.
You see, AOL understands that the AOL search engine and its other services are not a brand beloved by many. The AOL search engine and AOL itself are seen as somewhat ancient, old school, 56k, etc. Nightmare stories about its online services are not in short supply. I haven't done any specific studies on this, but in my circle of friends and business acquaintances, people consider an AOL subscriber a little behind the times.
The point is (in my opinion) that the "AOL brand" itself has decreasing value and may actually have negative value if the specific sites that it owns or has recently purchased are brought in under an umbrella branding strategy. These sites include those catering toward everything from country music fans to moms sharing photos to guys trying to pick up women. In some cases, the niche sites do not even display their affiliation with AOL or its search engine (or if they do, it is not featured very prominently).
The logic behind this branding strategy is clear. First, the AOL search engine and portal weren't attracting new visitors. Second, the AOL search engine and brand are not particularly hip or fresh. Third, and probably most important, specific portal sites attract specific types of users, who are usually highly targeted, promising more ad revenue (in theory).
Basically, the AOL portal has stopped trying to be all things to all people. Google is able to pull off the "all things to all people" approach primarily because it doesn't have issues with a branding strategy yet - in fact, the new vertical searches that it adds under the Google "branding umbrella" are augmented by implied hipness and coolness. However, as AOL has discovered, hipness usually has a shelf life. If people began to see Google as the huge corporation that it is now, rather than the uber-cool underdog, the company may not be able to keep this record up. There have already been some cracks in its veneer, although by and large, the Google brand is still very positive and powerful.
There is another company, much bigger than AOL, that suffers from many of the same problems (and in some cases worse ones) but still wants to take on Google head to head. I refer, of course, to Microsoft.
In terms of a brand, Microsoft is almost universally disliked. The monopoly issue may be one thing. The fact that it is seen as 'old school' may be another. Gates and Ballmer don't exactly have reputations as "nice guys," as Sergey and Larry do (the fact that it seems natural to refer to the former two by their last names and the latter two by their first may help illustrate this point). And the list goes on.
The bottom line is that I have a hard time seeing MSN.com gaining the kind of traction that Google has, simply because the brand is less than sexy. This means, of course, that any additional vertical search options that MSN adds to its site are bound to be appreciated only by the dwindling few who already swear by the portal.
AOL has decided that its branding strategy for the AOL search engine and niche sites is not nearly as important as the amount of traffic and ad revenue that the site commands. This is not uncommon in the publishing industry, where many different publications on many different topics may be owned by one large (but largely silent) entity. Many of these offline publications have moved online and are beginning to monetize their diverse base of websites. AOL seems to have a similar model and branding strategy in mind for the AOL search engine and other niche sites.
If it works for AOL and its search engine, it could be the best possible branding strategy for Microsoft to follow. Lord knows Microsoft has the money. The company has already bought the ad networks that can service sites under its own new branding strategy. But if pride dictates that it keep everything under the MSN name or add a huge "brought to you by Microsoft" banner across the top of any popular online property that it decides to buy, MSN is, in my opinion, shooting itself in the foot.
I never said it was fair, but your brand and branding strategy can either be an asset, neutral, or a detriment. Microsoft has to realize that most people consider its brand to be in the neutral to detrimental range and that most people consider Google to be in the neutral to asset range (and that's probably being charitable). Microsoft should not try to compete with Google head to head without considering the disparities in the conceptions of their respective brands.